We want to use our website (Zope) to gather large files from our users. These files can be rather large (10-70 MB). From what I have read, it would probably not be a good idea to import these files into the Zope database. Does anybody have any other ideas for how to move these files onto our server?
Hi!

Your question actually has two parts:

a) Does the ZODB work with files that large?

Answers:
- Standard file storage: yes, if you use it on a system where the 2 GB file-size limit does not apply, e.g. Linux with kernel 2.4 on Intel.
- BerkeleyDB storage: yes, certainly. The 2 GB limit applies there too, but not "that early", as the data is split across several files.

There are other possibilities, like uploading directly to the file system, as Thomas pointed out. All the code can be found in LocalFS, as it also has upload capabilities built in. If you prefer a file-by-file representation in Zope (LocalFS just "mounts" a filesystem tree), ExtFile/ExtImage might serve you better.

b) Can I upload files of that size via the browser?

Yes, but there might be problems. We tested uploads to the ZODB and got them working up to at least 100 MB per file, but that was over a 10 Mbit Ethernet line. Over the web, there might be timeout problems etc. More reliable alternatives include FTP upload, or maybe WebDAV (which seems to be quite slow, but has been very reliable for us on the Windows platform).

Cheers,
Joachim
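For the upload-directly-to-the-filesystem approach, something along these lines could work — a minimal sketch, not code from LocalFS or ExtFile. It assumes the upload arrives as a file-like object (as Zope hands you from a form's file field via REQUEST); the function name, destination directory, and chunk size are illustrative choices. The point is to stream the data in chunks rather than read a 70 MB file into memory (or the ZODB) in one go:

```python
import os

CHUNK_SIZE = 64 * 1024  # read 64 KB at a time instead of the whole file

def save_upload(fileobj, dest_dir, filename):
    """Stream an uploaded file-like object to disk in chunks.

    `fileobj` is any object with a read() method (e.g. the value of a
    <input type="file"> field in a Zope REQUEST).  The basename is taken
    from `filename` so a client-supplied path cannot escape dest_dir.
    """
    path = os.path.join(dest_dir, os.path.basename(filename))
    out = open(path, 'wb')
    try:
        while True:
            chunk = fileobj.read(CHUNK_SIZE)
            if not chunk:
                break
            out.write(chunk)
    finally:
        out.close()
    return path
```

A Zope External Method could wrap this and pull the file object out of REQUEST.form; the file then lives on disk, and LocalFS or ExtFile can expose it inside Zope without bloating the ZODB.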