Inserting large amounts of data (Was: Re: [Zope] Postgres adapters)

Christopher N. Deckard cnd@ecn.purdue.edu
Mon, 21 May 2001 15:03:49 -0500


Jim Penny wrote:
> 
> On Mon, May 21, 2001 at 10:42:53AM -0500, Christopher N. Deckard wrote:
> > I'm trying to take a "File" object in ZODB, iterate over the
> > entire thing (about 300,000 lines), and insert each row into a
> > Postgres database.  The file is about 37MB of comma separated
> > numbers.  I guess Zope treats the whole operation as one
> > transaction and never really comes back from attempting to
> > insert every row.  Do you have an idea of how to split this up
> > into multiple transactions, or do a commit of some sort, so
> > that Zope doesn't croak?
> >
> > Thanks,
> > -Chris
> 
> Has it been exported to filespace?  Is it a Unix-like system?
> Are you comfortable writing edit scripts?  If yes to all of
> these, then I would recommend using psql to inject it into the
> database.  If not, I would drop it to the filesystem and then
> use psql ;-).
> 
> Jim Penny
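
For concreteness, Jim's psql route might look something like the
sketch below, once the file has been dropped to disk.  The database
name, table name, and path are placeholders, and the COPY syntax is
the PostgreSQL 7.1 form:

    #!/usr/bin/env python
    # Sketch: stream a CSV file into Postgres through psql's COPY.
    # DBNAME, TABLE, and CSVFILE are placeholders, not real values.
    import os

    DBNAME  = 'mydb'
    TABLE   = 'mytable'
    CSVFILE = '/tmp/upload.csv'

    # COPY ... FROM STDIN reads rows from the pipe; USING DELIMITERS
    # is how PostgreSQL 7.1 spells "the fields are comma separated".
    cmd = 'psql -d %s -c "COPY %s FROM STDIN USING DELIMITERS \',\'"' \
          % (DBNAME, TABLE)
    pipe = os.popen(cmd, 'w')

    f = open(CSVFILE)
    while 1:
        chunk = f.read(65536)   # copy in 64k chunks to keep memory flat
        if not chunk:
            break
        pipe.write(chunk)
    f.close()

    status = pipe.close()       # None means psql exited cleanly
    if status:
        print 'psql exited with status', status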

Jim,
What do you mean by exported to the filespace?  

I'm running Zope 2.3.2 on Red Hat Linux 7.1, with Postgres 7.1.1
that I compiled myself.  I want all transactions (inserts, updates,
even table creations) to be handled through Zope; I don't want to
give users shell access on the system.  I could do the inserts I
need over XML-RPC, but I want this to be as simple as uploading a
CSV (comma separated) file, specifying the table, and having every
row in the file inserted.  Again, everything must be done through
Zope, and nobody gets a shell on the machine.
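
One way to get the multiple-transaction behavior I'm after might be
an External Method that talks to Postgres over its own DB-API
connection and commits in batches.  A rough sketch, assuming the
psycopg adapter, a File object reachable by id, and a target table
whose columns match the CSV; the connect string and names below are
all placeholders, not tested code:

    # Sketch of an External Method: batched CSV inserts.  Because
    # it opens its own connection and commits itself, it bypasses
    # Zope's per-request transaction on purpose; that is what
    # keeps any one Postgres transaction small.
    import string
    import psycopg

    BATCH = 1000   # rows per Postgres transaction

    def load_csv(self, file_id, table):
        """Insert every row of the File object `file_id` into
        `table`, committing every BATCH rows."""
        data = str(getattr(self, file_id))    # a File's str() is its data
        conn = psycopg.connect('dbname=mydb') # placeholder connect string
        curs = conn.cursor()
        n = 0
        for line in string.split(data, '\n'):
            line = string.strip(line)
            if not line:
                continue                      # skip blank lines
            values = string.split(line, ',')
            marks = string.join(['%s'] * len(values), ', ')
            curs.execute('INSERT INTO %s VALUES (%s)' % (table, marks),
                         tuple(values))
            n = n + 1
            if n % BATCH == 0:
                conn.commit()                 # end one small transaction
        conn.commit()                         # flush the last partial batch
        conn.close()
        return '%d rows inserted into %s' % (n, table)

For 300,000 rows, per-row INSERTs will still be slow; if speed ends
up mattering more than staying inside DB-API, COPY (as in the sketch
above) is the faster path.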

-Chris