Hello AM,

the problem is to decide whether a robot or a browser is downloading the page. If the robot is very simple, or its user very inexperienced, it is possible to detect the robot simply by the user agent. But most robots will disguise themselves as Internet Explorer.

Look in your logfile. If you see robots, then you can do something about them. But be very careful if you don't want to exclude even the search engine robots.

The best thing to do is to set up a robots.txt for your domain and (maybe in your standard_html_header) detect the user agent and serve different content. Don't just send nothing or an error message. That would tip off the robot's author and make him disguise it as Internet Explorer.

If you are really sure it will work with your audience, you could try to protect your picture links with JavaScript.

Ulli

--
Get more qualified targeted traffic. http://www.PublisherSEO.com/1
World Wide Web Publisher, Ulrich Wisser, Odensvag 13, S-14571 Norsborg
http://www.publisher.de  Tel: +46-8-53460905  Fax: +46-8-534 609 06
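PS: The user-agent check I describe could be sketched like this (a minimal Python sketch; the bot signature list and the content strings are only illustrative assumptions, not a complete list of robots):

```python
# Hypothetical sketch: decide whether a request looks like a robot from its
# User-Agent header, and pick which content to serve. Disguised robots that
# claim to be Internet Explorer will NOT be caught by this check.

KNOWN_BOTS = ("googlebot", "slurp", "bingbot", "wget", "curl", "python-requests")

def looks_like_robot(user_agent: str) -> bool:
    """Return True if the User-Agent matches a known robot signature."""
    ua = user_agent.lower()
    # An empty User-Agent is a strong hint of a very simple robot.
    if not ua:
        return True
    return any(bot in ua for bot in KNOWN_BOTS)

def choose_content(user_agent: str) -> str:
    """Serve reduced content to robots instead of nothing or an error,
    so the robot author has less reason to disguise it as a browser."""
    if looks_like_robot(user_agent):
        return "lightweight page without picture links"
    return "full page"
```

Note that this only keeps honest robots out; search engine robots identify themselves truthfully, so make sure your robots.txt, not this check, is what steers them.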