Thank you, Andreas!
Squid + Zope is used on several large production sites and my experience has been very good, although you need some expertise: Squid has a lot of configuration options, and it takes some time to get it tuned the right way.
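For readers who have not set this up before, the usual arrangement is Squid running as a reverse proxy ("accelerator") in front of Zope. A minimal sketch, assuming Squid 2.6+ syntax, Zope listening on 127.0.0.1:8080, and www.example.com as a placeholder hostname:

```
# squid.conf (accelerator mode) -- hostnames and ports are placeholders
http_port 80 accel defaultsite=www.example.com

# Forward cache misses to the Zope instance on the same box
cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=zope

acl zope_sites dstdomain www.example.com
cache_peer_access zope allow zope_sites
http_access allow zope_sites
```

Older Squid 2.5 installs use the `httpd_accel_host`/`httpd_accel_port` directives instead, so check which version you are running before copying any example.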
I know that it is a popular way of acceleration. One thing I just noticed is that the problem may come from the fact that, on my machine, a single Python process has to deal with all the requests. Is there an easy way to give a site multiple processes to serve the same site? The machine certainly has spare capacity. (I started another site on the same box and it's fast.) Is this where I need ZEO, or can this be configured for a single instance of Zope? Also, where can I find some good documentation on how to run Squid for Zope and how to tune it? There are loads of references on zope.org; which ones helped you?
Depends on the load and the hardware of the machine. In general it is possible. For a site with a lot of traffic I would use a dedicated Squid machine. At the very least, put the Squid cache and the Zope instance on different disks to spread the I/O.
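On the multiple-processes question: yes, ZEO is the standard way to run several Zope client processes against one shared ZODB, even on a single box. A minimal sketch, assuming default port 8100 and an instance-relative Data.fs path (adjust both for your install):

```
# zeo.conf -- the storage server that owns Data.fs
<zeo>
  address 8100
</zeo>
<filestorage 1>
  path var/Data.fs
</filestorage>
```

```
# zope.conf (in each client instance) -- replace the default
# <filestorage> section of <zodb_db main> with a <zeoclient>
<zodb_db main>
  mount-point /
  <zeoclient>
    server localhost:8100
    storage 1
  </zeoclient>
</zodb_db>
```

Each ZEO client is its own Python process with its own GIL, so two or more clients behind Squid (or another load balancer) let the box use more than one CPU for the same site.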
The load on the machine is not the problem at this stage. I have a good RAID, and I/O does not seem to be the bottleneck. Cheers, Marc