[Zope-dev] database memory concerns

Joerg Wittenberger <Joerg.Wittenberger@pobox.com>
Thu, 5 Aug 1999 23:22:23 +0200 (CEST)


Hello folks,

For evaluation purposes I got a Zope installation working about two
weeks ago.  I haven't done much with it yet: just thrown in about 1 MB
worth of data (my old home page and some docs) and installed a
Squishdot site for testing.

This blew the database up to approximately 29 MB!!!  Now I wanted to
move most of it, actually one folder, into the new version.  Because I
had already hit too many minor problems, I decided not to experiment
with copying the old database over the new one.  Instead I exported my
data (still not much more than 1 MB worth).  This grew it to 39.95 MB!
After reimporting the whole thing into the new database, the latter
weighs in at 14 MB; well, at least a bit was saved.

Hm, now I'm a bit worried about what's going on.  I'm considering Zope
because it might be a good choice for a sizeable collection of
documents.  But those are approximately 2 GB of text files spread over
something like 200K files.  Given the above experience, I have to
expect about 60 GB of disk usage.  True?
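
(For clarity, here is the rough extrapolation behind that 60 GB figure
as a little Python sketch; the assumption that the storage overhead
scales linearly with the amount of data is of course just a guess on
my part.)

  # Back-of-envelope extrapolation; assumes the storage overhead I
  # observed scales linearly with the amount of source data.
  source_data_mb = 1.0      # roughly 1 MB of documents imported so far
  database_mb = 29.0        # resulting database size after the import
  overhead = database_mb / source_data_mb      # about 29x

  planned_data_gb = 2.0     # ~2 GB of text spread over ~200K files
  expected_disk_gb = planned_data_gb * overhead
  print("expected disk usage: roughly %.0f GB" % expected_disk_gb)  # ~58 GB

That is where my rough 60 GB figure comes from.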

BTW: what actually gets stored in all that extra space?

I should mention that I decided to use the Gadfly database, because I
did not find any hints in the docs about why I should use one or the
other, only a short note which seems to indicate that other database
adapters might be broken with the new version (V2 of Zope).

Moreover, I'm a bit concerned about scalability.  I could not find any
note at all on that topic.  How much can I expect Zope to handle?  I
would have approximately 100 users accessing, searching, and extending
the above-mentioned 2 GB of data.  The required solution should at
least give some indication that, after 10 years and another 20 GB,
it will still be working.

thanks a lot

/Jerry