Hi all,

Does anyone know how Google Sitemaps interacts with the robots.txt file? Specifically, when a site is being spidered, does the data in the sitemap override the robots.txt file? We have an automatically generated sitemap which includes some pages that *are* disallowed in our robots.txt file. I'm trying to work out whether we're effectively overriding our own robots.txt as far as Google (or any other search engine that can use our map) is concerned.

Cheers,
Simon

--
Simon Marshall
Web Development Officer
Aberystwyth University
E-mail: [log in to unmask]
Tel: (01970) 62 2459
Fax: (01970) 62 1554
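
PS: for anyone curious about the scale of the overlap I described, here is a rough sketch of how one could audit which sitemap URLs a robots.txt would disallow, using Python's standard-library urllib.robotparser. The URLs and rules below are made-up placeholders, not our actual site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules (not our real file).
robots_txt = """User-agent: *
Disallow: /private/
"""

# Hypothetical URLs taken from an auto-generated sitemap.
sitemap_urls = [
    "https://www.example.ac.uk/index.html",
    "https://www.example.ac.uk/private/draft.html",
]

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# List the sitemap URLs that robots.txt disallows for all user agents.
blocked = [url for url in sitemap_urls if not rp.can_fetch("*", url)]
print(blocked)
```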