Here is the situation. I have a 9.5 MB file object containing text data that I want to convert into the properties of DTML Documents. I have a script that performs the conversion and creates the DTML Documents. The script runs just fine (by hitting the "Try" tab for the Python Script) when the number of records is low (e.g., fewer than 10). However, when I invoke the script on the large file, I get a Site Error. Even cutting the file down to several hundred records gives me the same error.

The workaround would be to split the data into several hundred files and run the script on each individually. Clearly, however, that is labor intensive. There has to be another way.

Incidentally, I did not implement this as a relational database routine because there are only 13,116 records (yes, only), and it is much easier to set up editing and such via Zope than to incur the extra overhead of a relational database such as MySQL or Postgres.

I'm running Zope 2.4.3 on Linux (Debian Potato, using Python 1.5). Disk space isn't a problem; however, there is only 256 MB of RAM.

Any help would be greatly appreciated.

Ron
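[Editor's note: the post does not show the conversion script or the file format, so here is a minimal sketch of what the parsing half might look like. The tab-delimited "id<TAB>body" layout is an assumption, and the Zope call in the trailing comment is only illustrative.]

```python
def parse_records(text):
    # Split a flat text export into (doc_id, body) pairs.
    # ASSUMPTION: one tab-delimited record per line, "id<TAB>body";
    # the real file format is not shown in the post, so adjust to taste.
    records = []
    for line in text.splitlines():
        if not line.strip():
            continue  # skip blank lines
        doc_id, body = line.split('\t', 1)
        records.append((doc_id, body))
    return records

# In a Zope 2 Python Script, each record would then typically become a
# DTML Document via something like:
#   context.manage_addProduct['OFSP'].manage_addDTMLDocument(doc_id, '', body)
```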
complaw@hal-pc.org writes:
... creating a large number of objects in a single transaction causes a Site Error ...

Which Site Error do you get?
Incidentally, I'm running Zope 2-4-3 on Linux (Debian, Potato using python 1.5). Disk space isn't a problem. However there is only 256 MB of RAM.

Everything you change in one transaction must live in RAM (and swap space). Although 256 MB should be enough to create 13 k DTML Documents, maybe something strange is causing excessive memory consumption.
You could try using subtransactions (see OFS.Image.File for an example), or commit your transaction after every x documents:

  t = get_transaction()
  t.commit()
  t.begin()

You need an External Method (or other code unhindered by Zope's security subsystem) for this.

Dieter
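[Editor's note: the commit-every-x pattern Dieter describes can be sketched generically as below. The names create_in_batches, create, and commit are placeholders; in a Zope 2 External Method, commit would wrap get_transaction().commit() (and begin()), and create would make one DTML Document.]

```python
def create_in_batches(records, create, commit, batch_size=500):
    # Create one object per record, committing every batch_size records
    # so the whole job never has to fit in a single huge transaction.
    # In Zope 2, commit would be a wrapper around get_transaction().commit().
    for i, record in enumerate(records):
        create(record)
        if (i + 1) % batch_size == 0:
            commit()
    commit()  # commit whatever is left in the final partial batch
```

With 13,116 records and batch_size=500, this commits 27 times instead of once, keeping the transaction's in-memory footprint bounded.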
participants (2):
- complaw@hal-pc.org
- Dieter Maurer