[ZODB-Dev] zeo2a1 performance cold spots
Jeremy Hylton
jeremy@alum.mit.edu
Fri, 28 Jun 2002 14:23:47 -0400
> Ah, no, I was right the first time. FORCE_PRODUCT_RELOAD was causing
> excess ZEO traffic, but the traffic was still slower than it should be.
> As far as I can tell this is due to ZEO not turning on TCP_NODELAY,
> which adds a little latency to every request.
I'm not particularly expert in TCP buffering or the Nagle algorithm, but it
doesn't seem like it would add latency to every request. It should only add
latency to the small messages needed for synchronous requests. Unfortunately,
the zeoLoad message is probably the most common message, and it is small, and
it is synchronous.
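For reference, the change being proposed amounts to setting one socket
option. Here is a rough, illustrative sketch (the host and port are made up,
not taken from the actual ZEO code):

    import socket

    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect(('localhost', 8100))   # hypothetical ZEO server address
    # Disable the Nagle algorithm so small writes go out immediately
    # instead of waiting to be coalesced with later data.
    s.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)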
However, I'm not particularly thrilled with the idea of disabling the Nagle
algorithm. It seems to reduce latency in your benchmark, but it may also
greatly increase the number of packets sent. I would rather understand what
about ZEO has changed to cause the effect to be visible.
> I am hesitant to commit this because it looks like ZEO has always worked
> this way, and I am surprised no one else has seen this effect before.
> Particularly on Windows. Any thoughts?
I know I made one small change at the SMAC layer. When a message is being
output, there are now two different strings appended to the list of pending
messages. The first string is the length of the second string. I made the
change to accommodate large messages, where the old code created a single
string containing both elements and had to copy the large message to
accomplish that. It seemed more efficient to simply append two different
strings. Perhaps the code that actually sends the data is waiting after the
first small string is written.
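In other words, the new message_output() does roughly the following (the
attribute name and length-prefix format here are illustrative, not the exact
SMAC source):

    import struct

    def message_output(self, message):
        # Old approach: build one string, copying the (possibly large) message:
        #   self.__output.append(struct.pack(">i", len(message)) + message)
        # New approach: append the length prefix and the message separately,
        # so the large message string is never copied.
        self.__output.append(struct.pack(">i", len(message)))
        self.__output.append(message)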
There may be other small changes of this sort that have a significant
effect. I can't think of anything off the top of my head.
If you want to do an experiment, I would try changing the handle_write()
method in SMAC. If the output list has more than one element, do a
string.join to create a single buffer before calling send().
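Something along these lines (a sketch of the experiment, not the real
handle_write(); the pending-output attribute name is assumed):

    import string

    def handle_write(self):
        output = self.__output
        if len(output) > 1:
            # Coalesce the length prefix and message body into one buffer
            # so they go out in a single send() call.
            output[:] = [string.join(output, "")]
        if output:
            n = self.send(output[0])
            output[0] = output[0][n:]
            if not output[0]:
                del output[0]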
Jeremy