Why can Zope not handle large files
From: "Christopher N. Deckard" <cnd@ecn.purdue.edu> Date: Fri, 18 May 2001 16:09:01 -0500 To: zope@zope.org Subject: [Zope] why can zope not handle large files

For a project we are working on, we have an export of an MS Access database as csv (comma-separated) files. The csv file is about 37MB, with rows of 18 numbers separated by commas. The file has been uploaded into Zope, and we are then trying to insert each row into a PostgreSQL database using either the PoPy or PsycoPG database adapter. Basically, Zope never comes back from trying to do this. We have also tested against Oracle with the same results. It has been run on an Athlon 600 and a Sun E450. Zope just takes up more and more memory, eventually filling up swap and never completing the transaction. If we split the 37MB file up into small 3-4MB chunks, everything works fine; it just can't handle the larger file. To me, 37MB doesn't seem like much. If it were .5GB, I could understand Zope having problems. How can I get this to work?

I am not on the zope list (though I'll probably subscribe in a moment), so please CC me on a reply.

Thanks, -Chris

_______________________________________________ Zope maillist - Zope@zope.org http://lists.zope.org/mailman/listinfo/zope ** No cross posts or HTML encoding! ** (Related lists - http://lists.zope.org/mailman/listinfo/zope-announce http://lists.zope.org/mailman/listinfo/zope-dev )

Check the archives - someone else was having this problem too. Zope can handle large files (I've been testing it out with huge image files); it was something to do with moving large chunks of data to/from the SQL database.
Sounds like the problem is with inserting many rows into the database in one transaction. Breaking up the inserts into many transactions should help. How exactly do you put the csv file into Zope?

Eric Balasbas
Senior Developer
eric@virtosi.com
http://www.virtosi.com/
Virtosi Ltd. Design -- Branding -- Zope
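The advice above - commit every so many rows instead of once at the end - can be sketched in a few lines of Python. This is only an illustration: the stdlib sqlite3 module stands in for the PoPy/PsycoPG adapters (all of them speak the Python DB-API), and the `measurements` table with three columns is made up for the demo.

```python
# A rough sketch of breaking one huge transaction into many small ones, so
# no single transaction has to hold all ~37 MB of inserts. sqlite3 stands
# in for the PoPy/PsycoPG adapters; the table name and three-column shape
# are hypothetical.
import sqlite3

BATCH_SIZE = 1000  # rows per transaction; tune for your database

def load_rows(conn, rows):
    cur = conn.cursor()
    pending = 0
    for row in rows:
        cur.execute("INSERT INTO measurements VALUES (?, ?, ?)", row)
        pending += 1
        if pending >= BATCH_SIZE:
            conn.commit()  # close this transaction and start a fresh one
            pending = 0
    conn.commit()          # flush the final partial batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE measurements (a INTEGER, b INTEGER, c INTEGER)")
load_rows(conn, ((i, 2 * i, 3 * i) for i in range(2500)))
print(conn.execute("SELECT COUNT(*) FROM measurements").fetchone()[0])  # 2500
```

With PsycoPG the placeholder style would be `%s` rather than `?`, but the commit-per-batch pattern is the same.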
How do I make "one transaction" become "many transactions"? The file is uploaded when creating a new "File" object.

-Chris

Eric Balasbas wrote:
Sounds like the problem is with inserting many rows into the database in one transaction. Breaking up the inserts into many transactions should help. How exactly do you put the csv file into Zope?
Eric Balasbas Senior Developer eric@virtosi.com
http://www.virtosi.com/ Virtosi Ltd. Design -- Branding -- Zope
-- -------------------------------------------------------------------- Christopher N. Deckard | Lead Web Systems Developer cnd@ecn.purdue.edu | Engineering Computer Network http://www.ecn.purdue.edu/ | Purdue University ---- zlib.decompress('x\234K\316Kq((-J)M\325KM)\005\000)"\005w') ---
I tested uploading files of up to 100 MB (images, Word documents etc.) without problems, and they are also served properly by ZServer. So handling the file upload/download itself doesn't seem to be a problem for Zope. But if you try to parse a 37 MB document all at once, that might be a bit difficult. How exactly are you doing it? There are lots of ways to handle this; e.g. from Python you could read the file row by row and commit batches of 10 or so to the database. Remember that Zope is not designed for parsing files of that size - usually, there is no way a DTML page could be that big.

Joachim
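The row-by-row idea above can be sketched as follows. Again, this is only an illustration under stated assumptions: an in-memory StringIO stands in for the uploaded file, sqlite3 for the real database adapter, and the `data` table with three columns is hypothetical (the real rows would have 18).

```python
# A minimal sketch of streaming the csv one line at a time instead of
# parsing all 37 MB at once, committing in small batches. StringIO stands
# in for the uploaded file and sqlite3 for the real adapter; the "data"
# table is hypothetical.
import csv
import io
import sqlite3

def import_csv(conn, csv_file, batch_size=500):
    """Stream rows from an open csv file; one transaction per batch."""
    cur = conn.cursor()
    count = 0
    for row in csv.reader(csv_file):  # reads one line at a time
        cur.execute("INSERT INTO data VALUES (?, ?, ?)", row)
        count += 1
        if count % batch_size == 0:
            conn.commit()  # keep each transaction (and memory use) small
    conn.commit()
    return count

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (a, b, c)")
fake_csv = io.StringIO("\n".join("%d,%d,%d" % (n, n, n) for n in range(1200)))
total = import_csv(conn, fake_csv)
print(total)  # 1200
```

The key point is that the reader never holds more than one row in memory, so the size of the file no longer matters.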
participants (4)
- Christopher N. Deckard
- Eric Balasbas
- Joachim Werner
- marc lindahl