[Zope] why can zope not handle large files

Christopher N. Deckard cnd@ecn.purdue.edu
Fri, 18 May 2001 16:09:01 -0500


For a project we are working on, we have an export of an MS Access
database as CSV (comma-separated) files.  The CSV file is about 37MB,
with rows of 18 numbers separated by commas.  The file has been
uploaded into Zope, and we are trying to insert each row into a
PostgreSQL database using either the PoPy or PsycoPG database
adapter.  Basically, Zope never comes back from this: it just takes
up more and more memory, eventually filling swap, and never completes
the transaction.  We have also tested against Oracle with the same
results, and we have run it on both an Athlon 600 and a Sun E450.  If
we split the 37MB file up into small 3-4MB chunks, everything works
fine, but Zope just can't handle the larger file in one go.  To me,
37MB doesn't seem like much.  If it were .5GB, I could understand
Zope having problems.  How can I get this to work?
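For reference, here is a minimal sketch of the batched approach I
mean, done outside Zope and talking straight to PostgreSQL.  It
assumes psycopg2's standard DB-API interface; the table name
"measurements", its 18 numeric columns, the batch size, and the DSN
are all placeholders, not our actual schema.

    # Stream a large CSV into PostgreSQL in fixed-size batches so the
    # whole 37MB file never has to sit in memory at once.
    import csv
    import psycopg2

    BATCH_SIZE = 10000  # rows inserted and committed per transaction

    def load_csv(path, dsn):
        conn = psycopg2.connect(dsn)
        cur = conn.cursor()
        # 18 numeric columns -> 18 parameter placeholders
        placeholders = ", ".join(["%s"] * 18)
        sql = "INSERT INTO measurements VALUES (" + placeholders + ")"
        batch = []
        with open(path, "r", newline="") as f:
            for row in csv.reader(f):
                if not row:
                    continue
                # columns are assumed numeric, so convert up front
                batch.append([float(value) for value in row])
                if len(batch) >= BATCH_SIZE:
                    cur.executemany(sql, batch)
                    conn.commit()  # flush the batch, free the memory
                    batch = []
            if batch:
                cur.executemany(sql, batch)
                conn.commit()
        cur.close()
        conn.close()

    if __name__ == "__main__":
        load_csv("export.csv", "dbname=mydb user=me")

The point is that each batch is committed and discarded before the
next one is read, which is essentially what splitting the file into
3-4MB pieces does by hand.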

I am not on the zope list (though I'll probably subscribe in a
moment), so please CC me on any reply.

Thanks,
-Chris