Hi, I'm trying to use ZCatalog's excellent indexing capability as a front end for data mining. We have 80k records, which I'm reading via a ZSQL method (which also uses a custom base class for the records). For each record I then pull in data from three other tables, and finally I add that virtual object to the catalog. All of this is done via a Python Script.

Here's the kicker: Zope takes forever to do this, and it tends to freeze the Zope instance while it runs; in fact, I have yet to successfully complete a batch larger than 100 records. My suspicion is that Zope is doing all of this in a single transaction: instead of committing after each object, it waits until the whole 80k records have been worked on and then commits the whole bag. Needless to say, this isn't optimal.

Is there a way I can force the transaction to commit after each record is done?

Here's my code (the getXXX calls are ZSQL methods; container is a ZCatalog object):

-----------------------------------------------------
if user_id:
    users = context.getUsers(user_id=user_id)
else:
    users = context.getUsers()

for user in users:
    features_read = context.getReadFeatures(user_id=user.user_id)
    user.addFeatures(features_read)

    interests = context.getInterests(user_id=user.user_id)
    user.addInterests(interests)

    mailings = context.getMailings(user_id=user.user_id)
    user.addMailings(mailings)

    applications = context.getApplications(user_id=user.user_id)
    user.addApplications(applications)

    uid = '%s' % user.user_id
    container.catalog_object(user, uid)
------------------------------------------------------
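For what it's worth, here is a sketch of the batching pattern I'm after, written as plain Python so the shape is clear. The `commit` argument is a stub; in Zope I assume the real call would be something like `get_transaction().commit()` (or `transaction.commit()` in newer versions), and `handle_record` stands in for the loop body above. The batch size of 100 is just a guess.

```python
def index_in_batches(records, handle_record, commit, batch_size=100):
    """Call handle_record on each record, committing every batch_size items.

    Committing in small batches flushes work incrementally instead of
    building up one monolithic 80k-record transaction.
    """
    pending = 0
    for record in records:
        handle_record(record)
        pending += 1
        if pending >= batch_size:
            commit()   # flush this batch before moving on
            pending = 0
    if pending:
        commit()       # commit any trailing partial batch
```

With 80k records and a batch size of 100, this would issue 800 commits rather than one huge one, which is roughly the behaviour I'm hoping to get out of Zope.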