Here is the situation. I have a 9.5 MB file object containing text data that I want to convert into properties for DTML Documents. I have a Python Script that performs the conversion and creates the DTML Documents. The script runs just fine (by hitting the "Try" tab for the Python Script) when the number of records is low (e.g., fewer than 10). However, when I invoke the script on the large file, I get a Site Error. Even cutting the file down to several hundred records gives me the same error.

The workaround would be to split the data into several hundred files and then run the script on each of them individually. Clearly, however, that is labor intensive. There has to be another way.

Incidentally, I did not try to implement this as a relational database routine because there are only 13,116 records (yes, only), and it is much easier to set up editing and such via Zope than to incur the extra overhead of a relational database such as MySQL or PostgreSQL.

I'm running Zope 2.4.3 on Linux (Debian Potato) with Python 1.5. Disk space isn't a problem, but there is only 256 MB of RAM.

Any help would be greatly appreciated.

Ron
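For what it's worth, the splitting step of that workaround could be sketched roughly like this (a hypothetical illustration only, not my actual script; the batch size and the one-record-per-line assumption are made up):

```python
# Hypothetical sketch: split a large list of records into smaller batches
# so the conversion script can be run on each batch separately.
# Assumes one record per line; BATCH_SIZE is illustrative.

BATCH_SIZE = 500

def split_into_batches(records, batch_size=BATCH_SIZE):
    """Return a list of batches, each holding at most batch_size records."""
    batches = []
    for start in range(0, len(records), batch_size):
        batches.append(records[start:start + batch_size])
    return batches

# Stand-in for the real 13,116-record file:
records = ["record %d" % i for i in range(13116)]
batches = split_into_batches(records)
print(len(batches))  # 27 batches of up to 500 records each
```

Each batch could then be written to its own file (or fed to the script one at a time), which is exactly the labor-intensive part I'd like to avoid.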