Confusion With Site Maps And Robots.txt And Warnings In G-Wmt
Posted 28 December 2013 - 08:18 AM
I am running a large retail website with a lot of products and product/category pages. I have a search function and a search page at mysite.com/search, and I have just started using my robots.txt file to stop Google from crawling anything under /search, /admin, etc. My CMS creates an XML sitemap that is auto-updated every time a page is added or removed.

But now I am getting warnings in Google Webmaster Tools saying "Sitemap contains urls which are blocked by robots.txt", and it has identified these pages. I cannot tell my CMS to leave these pages out of the sitemap.xml, even though Google recommends keeping pages blocked by robots.txt out of the sitemap. So I'm between a rock and a hard place.

If I just let the list of these warnings from WMT build up, is it going to affect the rankings of any other pages? I don't want my admin and search pages to be indexed anyway, just my other 1,999 pages. Looking forward to your help, thanks.
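For reference, the blocking rules described above would look something like this in robots.txt (a sketch using the mysite.com domain and paths from the post; adjust to your actual site):

```text
# Block crawling of internal search results and the admin area
User-agent: *
Disallow: /search
Disallow: /admin

# Location of the auto-generated sitemap
Sitemap: http://mysite.com/sitemap.xml
```

Note that Disallow matches by prefix, so /search also covers URLs like /search?q=shoes.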
Posted 28 December 2013 - 08:30 AM Best Answer
Change your CMS, or remove the sitemap completely.
If I just let the list of these warnings from WMT build up, is it going to affect the rankings of any other pages?
Not even slightly
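If neither changing the CMS nor dropping the sitemap is an option, a third route is to post-process the generated sitemap and strip out the blocked URLs before serving it. This is a minimal sketch, not part of the original answer, assuming a standard sitemaps.org-format XML file and prefix-style robots.txt rules like Disallow: /search:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def strip_blocked_urls(sitemap_xml: str, blocked_prefixes: list) -> str:
    """Return sitemap XML with <url> entries whose <loc> starts with a
    blocked prefix removed, leaving all other entries untouched."""
    # Keep the default sitemap namespace unprefixed in the output
    ET.register_namespace("", SITEMAP_NS)
    root = ET.fromstring(sitemap_xml)
    for url in list(root.findall(f"{{{SITEMAP_NS}}}url")):
        loc = url.findtext(f"{{{SITEMAP_NS}}}loc", default="")
        if any(loc.startswith(prefix) for prefix in blocked_prefixes):
            root.remove(url)
    return ET.tostring(root, encoding="unicode")
```

You could run this after each CMS sitemap regeneration, passing full-URL prefixes matching your Disallow rules, e.g. strip_blocked_urls(xml_text, ["http://mysite.com/search", "http://mysite.com/admin"]).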
Posted 28 December 2013 - 08:32 AM
Wow that was fast, thanks so much. Liking this forum already!