We're handling something similar: most of our files are in the 60-80 MB range, in formats ranging from plain text to MS Word to PDF. We're just getting into the testing phase now, so I can't say yet whether performance will be noticeably different under load, but it's food for thought at least.

We're storing the files on the filesystem with an ExtFile-type product and using TextIndexNG2 to index the full text of the documents into Zope. The theory is that we minimize ZODB bloat by storing only the data needed for searches in the ZODB and keeping the actual file contents out on the filesystem.

Someone with a better working knowledge of Zope might be able to give you a theoretical guesstimate of the performance difference, if any, between storing/serving large objects internally vs. externally.

HTH,
Lee
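The pattern Lee describes (file body on disk, only the searchable text in the database) can be sketched in plain Python. This is a toy illustration of the idea, not the ExtFile or TextIndexNG2 API; the names `SimpleIndex`, `add_document`, and `store_and_index` are made up for the example.

```python
import os
import re
import tempfile

class SimpleIndex:
    """A toy inverted index: word -> set of document paths."""
    def __init__(self):
        self._index = {}

    def add_document(self, path, text):
        # Only the tokens go into the index; the file body stays on disk.
        for word in set(re.findall(r"\w+", text.lower())):
            self._index.setdefault(word, set()).add(path)

    def search(self, word):
        return sorted(self._index.get(word.lower(), set()))

def store_and_index(index, directory, name, data):
    """Write the payload to the filesystem and index only its text.

    In the real setup, ExtFile plays the role of the filesystem store
    and TextIndexNG2 plays the role of the index; the ZODB then holds
    index data rather than 60-80 MB blobs.
    """
    path = os.path.join(directory, name)
    with open(path, "w") as f:
        f.write(data)
    index.add_document(path, data)
    return path

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        idx = SimpleIndex()
        p = store_and_index(idx, d, "doc1.txt", "large report about Zope performance")
        print(idx.search("performance"))  # the stored path comes back from the index
```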
From: "Sebastian Krollmann" <sebastian.krollmann@gmx.net>
To: <zope@zope.org>
Subject: [Zope] searching and serving large textfiles ~120 Mb
Date: Fri, 5 Dec 2003 10:31:08 +0100
Hi zopistas,
I need to access large text files (~120 MB) from Zope. I know about Python's large-file support and that it is better to keep large files out of the ZODB. I need to do a full-text search on these files, which reside in a folder hierarchy on the server, show their content around the location of the found string, and allow scrolling through a file's source from Zope.
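One hedged approach to the requirements above: search the file with `mmap`, so the ~120 MB body is never fully read into memory, and return a context window around each hit, from which a view could render the snippet and seed scrolling. The function name and parameters here are illustrative, not part of any Zope product.

```python
import mmap

def find_with_context(path, needle, context=40, max_hits=10):
    """Return (offset, snippet) pairs for occurrences of needle.

    The file is memory-mapped read-only, so the OS pages in only the
    regions actually touched by the search; the byte offsets can later
    be used to seek into the file for scrolling.
    """
    hits = []
    with open(path, "rb") as f:
        with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
            pos = mm.find(needle)
            while pos != -1 and len(hits) < max_hits:
                start = max(0, pos - context)
                end = min(len(mm), pos + len(needle) + context)
                # Decode leniently; real code would use the file's known encoding.
                hits.append((pos, mm[start:end].decode("latin-1")))
                pos = mm.find(needle, pos + 1)
    return hits
```

For the "scrolling" part, the returned offset can be fed back into a seek-and-read of a fixed-size window, rather than ever loading the whole file.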
Has anybody done something similar with files this large and would be willing to share their experiences? Are there any dos and don'ts, or best ways to do it?
Thanks for your answers,
SK