[Zope-dev] Downloading large files

John Eikenberry jae@kavi.com
Wed, 29 Mar 2000 13:05:23 -0800 (PST)


It is Zope 2.1.4, compiled from source on a BSDI/3.1 machine (don't
ask). 

We are using Apache with PCGI, and we thought of this as the source of the
problem. So we opened up a port for direct access to ZServer and used this
for the downloading. But it still had the problem. (So, no, PCGI isn't the
problem).

My current theory is that as the linked list of Pdata objects gets
activated to access the content of the large file, the nodes don't get
deactivated fast enough and all stay in memory. If I could somehow
manually deactivate them, I could get them out of memory.
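The idea would look something like the sketch below. A stand-in Chunk
class plays the role of ZODB's Pdata node (a chunk of file data plus a
link to the next chunk); the real fix would walk the actual Pdata chain
and call `_p_deactivate()` on each node right after writing it, so only
one chunk is live at a time instead of the whole 50+ meg file:

```python
import io

class Chunk:
    """Stand-in for ZODB's Pdata node: a slice of the file plus a
    link to the next chunk (None at the end of the chain)."""
    def __init__(self, data, next=None):
        self.data = data
        self.next = next
        self.deactivated = False

    def _p_deactivate(self):
        # In real ZODB this turns the object back into a ghost,
        # releasing its in-memory state; here we just record it.
        self.deactivated = True

def stream_chunks(head, out):
    """Walk the chunk chain, writing each piece and deactivating it
    immediately so only one chunk stays resident at a time."""
    node = head
    while node is not None:
        out.write(node.data)
        nxt = node.next       # grab the link before deactivating
        node._p_deactivate()
        node = nxt

# Stream a three-chunk "file" into a buffer.
head = Chunk(b"aaa", Chunk(b"bbb", Chunk(b"ccc")))
buf = io.BytesIO()
stream_chunks(head, buf)
```

(Names and structure here are illustrative; whether ZServer's response
object can be fed incrementally like this is exactly the open question.)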

This, of course, is just the latest in a long stream of theories that
went nowhere... so hopefully you'll tell me I just need to upgrade my
Zope installation. :)

Thanks for the help.

On Wed, 29 Mar 2000, Michel Pelletier wrote:

> John Eikenberry wrote:
> > 
> > Here's the deal. We have a download area on one of our Zope sites. It
> > contains a few fairly large files (between 45-60Meg).
> > 
> > When downloading these files, the Zope process grows by the amount of the
> > file. Ending up at 50+ meg in active memory. The server we run on has a
> > per process memory limitation of ~70Meg (this cannot be raised or
> > avoided). Thus, if 2 people try to download files at the same time... Zope
> > crashes.
> 
> Are you using Apache/PCGI?  I'm pretty sure that we fixed this behavior
> in Zope itself a while back, but I think that PCGI queues the whole
> request before sending it back.  I could be wrong.  What ver of Zope are
> you using?
> 
> -Michel
> 

---

John Eikenberry
[jae@kavi.com - http://zhar.net/] 
______________________________________________________________
"A society that will trade a little liberty for a little order
 will deserve neither and lose both."
                                         --B. Franklin