Check the archives - someone else was having this problem too. Zope can handle large files (I've been testing it with huge image files); the problem was something to do with moving large chunks of data to/from the SQL database.
From: "Christopher N. Deckard" <cnd@ecn.purdue.edu>
Date: Fri, 18 May 2001 16:09:01 -0500
To: zope@zope.org
Subject: [Zope] why can zope not handle large files
For a project we are working on, we have an export of an MS Access database as CSV (comma-separated) files. The CSV file is about 37MB, with rows of 18 numbers separated by commas. The file has been uploaded into Zope, and we are trying to insert each row into a PostgreSQL database using either the PoPy or PsycoPG database adapter. Basically, Zope never comes back from trying to do this. We have also tested against Oracle with the same results. It has been run on an Athlon 600 and a Sun E450; Zope just takes up more and more memory, eventually filling up swap, and never completes the transaction. If we split the 37MB file into smaller 3-4MB chunks, everything works fine, but Zope just can't handle the larger file. To me, 37MB doesn't seem like much. If it were .5GB, I could understand Zope having problems. How can I get this to work?
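The "split into 3-4MB chunks" observation above points at the usual workaround: instead of inserting the whole file inside one transaction, insert in fixed-size batches and commit after each batch, so memory stays bounded. A minimal sketch follows; it uses Python's DB-API `executemany` and, for illustration only, an in-memory SQLite database standing in for PostgreSQL - the table name `data`, the column count, and the `batch_size` value are all made-up for the example.

```python
import csv
import io
import sqlite3

def insert_in_batches(conn, sql, rows, batch_size=1000):
    """Insert rows in fixed-size batches, committing after each batch,
    so memory use stays bounded regardless of the input file's size."""
    cur = conn.cursor()
    batch = []
    total = 0
    for row in rows:
        batch.append(row)
        if len(batch) >= batch_size:
            cur.executemany(sql, batch)
            conn.commit()          # release the transaction's memory
            total += len(batch)
            batch = []
    if batch:                      # flush the final partial batch
        cur.executemany(sql, batch)
        conn.commit()
        total += len(batch)
    return total

# Demo: hypothetical 3-column table; SQLite stands in for PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE data (a INTEGER, b INTEGER, c INTEGER)")
csv_text = "\n".join("1,2,3" for _ in range(2500))
rows = csv.reader(io.StringIO(csv_text))   # iterator: rows read lazily
count = insert_in_batches(conn, "INSERT INTO data VALUES (?, ?, ?)", rows)
print(count)  # 2500
```

Note that `csv.reader` is consumed lazily, so the full 37MB file is never held in memory at once; with PoPy or PsycoPG the same pattern would apply, with `%s` placeholders instead of `?`.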
I am not on the zope list (though I'll probably subscribe here in a moment) so please CC me on a reply.
Thanks, -Chris
_______________________________________________
Zope maillist - Zope@zope.org
http://lists.zope.org/mailman/listinfo/zope
** No cross posts or HTML encoding! **
(Related lists -
http://lists.zope.org/mailman/listinfo/zope-announce
http://lists.zope.org/mailman/listinfo/zope-dev )