[Zope-dev] what you see and what dont is all dom

Damian Morton morton@dennisinter.com
Fri, 17 Sep 1999 10:54:42 -0400


----- Original Message ----- 
From: Richard Wackerbarth <rkw@dataplex.net>
To: <morton@dennisinter.com>
Cc: <zope-dev@zope.org>
Sent: Friday, September 17, 1999 6:10 AM
Subject: Re: [Zope-dev] what you see and what dont is all dom


> On Thu, 16 Sep 1999, Damian Morton wrote:
> > > > If you could decompose a website into a kind of feedforward
> > > > dataflow machine and add some intelligent caching, you would have a very
> > > > efficient dynamic website.
> 
> > By rendering and caching each document element separately as the source
> > data changes, you get the best of both worlds - dynamic and static. The
> > alternative is to cache each page and annotate a timeout, or some kind of
> > trigger to cause a cache invalidation, and in my opinion, this could get
> > really messy. Of course, many DOM structures have no inputs and depend on
> > nothing, i.e. they are static - whole trees of these might be kept in a
> > cached, compact state - e.g. pick a compression algorithm that decompresses
> > blazingly fast. If you wanted to get really tricky, you could use some kind
> > of hierarchical caching, e.g. over time: html -> compressed html ->
> > compressed, filed html
> 
> The problem with this is that it only works if the DOM is slowly changing WRT
> the queries. If you have a lot of the tree which is changed more often than it
> is referenced, you end up either spending a lot of resources rendering things
> that are never used, or your scheme reverts to the "render on demand" scheme
> that we already have. In that case, the overhead of sorting out the cache
> validity may be greater than the savings which you get by caching. It all
> depends on the characteristics of the content.
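
Just to make concrete what I had in mind with rendering and caching each
element as its inputs change: something along these lines (purely a sketch,
the class and method names are made up, this is not working Zope code). Each
element of the document tree knows which nodes it reads from and which nodes
depend on it, so a change to the source data only marks the affected subtree
for re-rendering:

# Sketch only: each document element caches its rendered HTML and is marked
# dirty when any of its inputs change.  Leaf nodes wrap the raw source data;
# composite nodes take other CachedNodes as inputs.
class CachedNode:
    def __init__(self, render_func, inputs=None):
        self.render_func = render_func   # builds HTML from the inputs'
                                         # render() output
        self.inputs = inputs or []       # other CachedNodes we read from
        self.dependents = []             # nodes whose output uses ours
        self.cached_html = None
        self.dirty = 1
        for node in self.inputs:
            node.dependents.append(self)

    def invalidate(self):
        # Called when the underlying data changes; propagate forward so
        # everything downstream of the change is marked for re-rendering.
        if not self.dirty:
            self.dirty = 1
            for node in self.dependents:
                node.invalidate()

    def render(self):
        # Re-render only if something upstream actually changed; otherwise
        # serve the cached (possibly compressed) HTML.
        if self.dirty:
            self.cached_html = self.render_func(self.inputs)
            self.dirty = 0
        return self.cached_html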

I guess my experience is with sites that tend to get up to a million hits a
day. In that case, the DOM changes slowly with respect to the queries.
However, even in a lower-demand situation, one in which the changes outstrip
the demand, the dataflow scheme still has advantages. You are still only
rendering that which needs to be rendered, which is generally a good thing.
You can also get more sophisticated about your dataflow, mixing a feedforward
scheme with a demand-driven one based on usage statistics. If a given tree is
demanded more frequently than it changes, then it is rendered as its inputs
change. If the tree is changed more frequently than it is demanded, then it
should be rendered on demand.

In this way the scheme is adaptive with respect to both demand and design. I
do think, however, that in most websites the parts that change as frequently
as (or more frequently than) they are demanded will be small in number and
scope. About the only things I can think of that might fall into this
category are time-based elements and non-deterministic elements.
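
To sketch that adaptive part, building on the CachedNode above (again, names
invented, just to show the shape of the idea): keep a count of how often a
node is demanded versus how often it changes, and let the comparison decide
whether a change triggers an immediate re-render or just marks the node
dirty. This is also where the overhead Richard worries about gets bounded -
trees that change faster than they are demanded never pay the eager
rendering cost.

# Sketch only: a node that chooses between feedforward (render on change)
# and demand-driven (render on request) behavior from its own usage counts.
class AdaptiveNode(CachedNode):
    def __init__(self, render_func, inputs=None):
        CachedNode.__init__(self, render_func, inputs)
        self.demand_count = 0
        self.change_count = 0

    def invalidate(self):
        self.change_count = self.change_count + 1
        CachedNode.invalidate(self)
        if self.demand_count >= self.change_count:
            # Demanded at least as often as it changes: pay the rendering
            # cost now, so the next request is a pure cache hit.
            CachedNode.render(self)
        # Otherwise leave it dirty; it gets rendered when (and if) somebody
        # actually asks for it.

    def render(self):
        self.demand_count = self.demand_count + 1
        return CachedNode.render(self)

In practice you would probably want the counts to decay over time (or be
rates over a recent window), so a burst of edits or a burst of hits does not
lock a node into the wrong mode forever, but the basic decision is just that
comparison.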