The Sitemap Protocol allows site owners to inform search engines about URLs on a website that are available for crawling. In its simplest form, a Sitemap is an XML file that lists the URLs for a site. Many site owners do not build their Sitemaps by spidering the site's links, but by running scripts over their web root directory structures. When a Sitemap is generated this way, an attacker may be able to use it to enumerate all files and directories under the web server root, including content that was never intended to be linked publicly.
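A minimal sketch of the enumeration step: parsing a retrieved sitemap.xml and listing every `<loc>` entry. The sitemap content and the `example.com` URLs below are hypothetical, standing in for a directory-generated Sitemap that happens to expose sensitive paths.

```python
import xml.etree.ElementTree as ET

# Hypothetical sitemap.xml content, as an attacker might fetch it from
# http://example.com/sitemap.xml. A script-generated Sitemap can include
# files that were never linked from any page.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/index.html</loc></url>
  <url><loc>http://example.com/admin/backup.zip</loc></url>
  <url><loc>http://example.com/old/config.bak</loc></url>
</urlset>"""

def extract_urls(sitemap_xml):
    """Return every <loc> entry from a Sitemap document."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    root = ET.fromstring(sitemap_xml)
    return [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

for url in extract_urls(SITEMAP):
    print(url)
```

Here the second and third entries point at backup and configuration files, exactly the kind of material a review of the Sitemap should catch.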
Solution
Site owners should be wary of automatically generated sitemap.xml files, and administrators should review the contents of their sitemap.xml file for sensitive material before publishing it.