Search engines and robots.txt:
Since the birth of internet search engines, the robots.txt file has been how webmasters let search engines like Google know which content should be crawled and indexed. However, as part of Google Sitemaps, later renamed the XML Sitemaps Protocol, this usage was expanded with Sitemaps Autodiscovery. It is now possible for webmasters to point search engines directly to the website's XML sitemap. The moment a search engine finds your website and its robots.txt file, it also knows where to find your XML sitemap.
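Autodiscovery works because the Sitemap directive is simple to pick out of a robots.txt file. A minimal sketch of what a crawler does when it reads the file (the robots.txt body below is a made-up example):

```python
# Hypothetical robots.txt content used for illustration.
robots_txt = """\
User-agent: *
Disallow: /private/
Sitemap: http://www.example.com/sitemap.xml
"""

def find_sitemaps(text):
    """Return the URLs listed on 'Sitemap:' lines (case-insensitive)."""
    sitemaps = []
    for line in text.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap":
            sitemaps.append(value.strip())
    return sitemaps

print(find_sitemaps(robots_txt))  # → ['http://www.example.com/sitemap.xml']
```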
Submit your XML sitemaps:
In the beginning, submitting XML sitemaps required creating and verifying a Google Webmaster Tools account, and sitemap files had to be submitted manually. Now, instead of submitting XML sitemaps to every search engine individually, you can be done with all of them in seconds.
To add XML sitemaps autodiscovery to a robots.txt file, add the fully qualified path of the XML sitemap file like this: Sitemap: http://www.example.com/sitemap.xml.
Complete robots.txt example for XML sitemaps autodiscovery
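A minimal robots.txt that allows all crawlers and announces the sitemap location might look like this (the URL is a placeholder; substitute your own sitemap address):

```text
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

The Sitemap directive is independent of any User-agent section, so it can appear anywhere in the file.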
If you have created a sitemap index file, you can also reference that:
Sitemap: http://www.example.com/sitemap-index.xml
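For reference, a sitemap index file is a small XML document that lists your individual sitemap files. A minimal sketch, using placeholder URLs, looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-2.xml</loc>
  </sitemap>
</sitemapindex>
```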
Manual XML sitemap submission
There can be good reasons to submit your sitemaps manually the first time, e.g. to get acquainted with the different search engines and the webmaster tools they offer:
* Google Webmaster Tools
* Live Webmaster Tools (MSN)
* Yahoo SiteExplorer
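Beyond the web interfaces above, some search engines have historically accepted sitemap submissions through a simple HTTP "ping" URL that carries the URL-encoded sitemap address. A hedged sketch of building such a ping URL (the endpoint shown is Google's historical one and may change or be retired; treat it as illustrative):

```python
from urllib.parse import quote

# Historical ping endpoints; these are assumptions for illustration only.
PING_ENDPOINTS = [
    "http://www.google.com/ping?sitemap=",
]

def build_ping_urls(sitemap_url):
    """URL-encode the sitemap address and append it to each ping endpoint."""
    encoded = quote(sitemap_url, safe="")
    return [endpoint + encoded for endpoint in PING_ENDPOINTS]

for url in build_ping_urls("http://www.example.com/sitemap.xml"):
    print(url)
```

Fetching each resulting URL (with any HTTP client) notifies the engine that the sitemap exists or has been updated.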
Advanced sitemap management and submission
In the beginning, no search engine supported cross-submitting multiple websites in one XML sitemap file. However, most now support new ways of managing sitemaps across multiple sites. The requirement is that you verify ownership of all the websites involved:
* Sitemaps protocol: Cross-submit and manage sitemaps using robots.txt.
* Google: More website verification methods than the sitemaps protocol defines.
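Under the robots.txt cross-submission scheme, a sitemap hosted on one site may list URLs for another, as long as the other site's robots.txt points to that sitemap, which proves ownership. For example, the robots.txt of www.example.com could reference a sitemap hosted on a separate (hypothetical) host:

```text
User-agent: *
Disallow:

Sitemap: http://sitemaps.example.org/example-com-sitemap.xml
```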
Read more articles at http://www.xml-sitemaps-generator.com