The question: Over the past few months, we have accumulated around 2 GB of articles, and the trend is expected to continue. Can Zope and ZCatalog handle this kind of load?
I don't see why not! I'd imagine it depends on how much full-text each indexed object carries and how many objects need full-text searching. For plain object indexes, ZCatalog is your best bet; how it holds up with large amounts of free text, I'm not quite sure.
Will the searching capabilities be shot in the ass if I store the articles as external files? (Does ZCatalog even know about those things?)
No, ZCatalog does not know about external files (out of the box).
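To make filesystem files searchable you'd have to feed their text into an index yourself (in Zope terms, wrap each file in an object and call `catalog.catalog_object(obj, uid)` on it). Here's a minimal, Zope-free sketch of that idea: a crude inverted index over files on disk, just to show the shape of the work involved. Function names (`build_index`, `search`) are made up for illustration, and a real TextIndex does far more (stemming, stopwords, ranking).

```python
import os
import re
from collections import defaultdict

def build_index(root):
    """Crude word -> set(paths) index over files under root.
    A stand-in for what a full-text index does; with real ZCatalog
    you'd instead catalog each wrapped file object."""
    index = defaultdict(set)
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, errors="ignore") as f:
                    text = f.read()
            except OSError:
                continue  # unreadable file: skip it
            for word in re.findall(r"[a-z0-9]+", text.lower()):
                index[word].add(path)
    return index

def search(index, query):
    """AND-search: return the paths containing every query word."""
    words = re.findall(r"[a-z0-9]+", query.lower())
    if not words:
        return set()
    result = index[words[0]].copy()
    for w in words[1:]:
        result &= index[w]
    return result
```

The point of the sketch: indexing cost scales with the total text you feed in, which is exactly why 2 GB of full-text is the thing to benchmark before committing to ZCatalog.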
OR, should I stick with the articles in the filesystem with ht://dig like we've got now, and just build the interface to ht://dig in Zope?
We use Ultraseek (http://ultraseek.com/products/ultraseek/ultratop.htm) at work, which is available for the low, low price of $1000 for x files indexed, and there is a Zope Ultraseek DA (haven't used it yet)!

I've also run across a potentially really kick-ass indexer called UdmSearch (http://mysearch.udm.net/). I haven't used it, but it has a really nice cover: it's actively developed, it searches all kinds of sources (news, FTP, HTTP, filesystem, databases!!), and it's OSS. <- I will be using this in the next 3 months.

ht://dig, I believe, is in PERL (yuk!) <- resentful over doing a PERL project right now.

~runyaga
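If you do stick with ht://dig, the Zope side of the interface can be thin: ht://dig's `htsearch` is a CGI, so an External Method can just build a query URL and fetch the results. A rough sketch, assuming your `htsearch` lives at the URL below and that you've checked the parameter names (`words`, `format`) against your own ht://dig configuration:

```python
from urllib.parse import urlencode
from urllib.request import urlopen

# Assumption: adjust to wherever your htsearch CGI is installed.
HTSEARCH_URL = "http://localhost/cgi-bin/htsearch"

def htdig_query_url(words, fmt="short"):
    """Build an htsearch query URL; verify the parameter names
    against your ht://dig install before relying on them."""
    return HTSEARCH_URL + "?" + urlencode({"words": words, "format": fmt})

def htdig_search(words):
    """Fetch the raw HTML result page from htsearch. A Zope External
    Method could return this (or a parsed version) to DTML."""
    with urlopen(htdig_query_url(words)) as resp:
        return resp.read().decode("utf-8", "replace")
```

That keeps the articles on the filesystem and lets ht://dig do the heavy lifting, with Zope only presenting the results.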