I have always been told by this forum that creating and serving different content for the end user vs. what is provided to the search engine crawler was a big no-no and considered black hat.
However, I have come across this source code on a friend's website and so am a little confused.
<!-- IMPORTANT! LOOKING FOR THIS SITE'S SEO CONTENT?
This site's SEO content, such as meta tags and headers, is not here.
This is because search engines, like Google, actually crawl the site's homepage via http://wwwxxxxx.co.uk/?_escaped_fragment_=
Internal pages, like “About”, also have their own special search engine versions, for example: http://www.xxxxx.co.uk/?_escaped_fragment_=about%2Fc1enr
If you're looking for this site's SEO content, that's where you can view it.
Want more information about Ajax page crawling? Read Google's explanation here: http://bit.ly/ajaxcrawling -->
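For anyone else trying to make sense of those URLs: this looks like Google's (now-deprecated) AJAX crawling scheme, where a hashbang (`#!`) URL is mapped to an `?_escaped_fragment_=` URL that the crawler requests instead. A minimal sketch of that mapping, assuming the scheme works as Google documented it (the URL below is a placeholder, not my friend's actual site):

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Map a hashbang (#!) URL to the _escaped_fragment_ form a crawler
    would request under Google's (deprecated) AJAX crawling scheme."""
    base, sep, fragment = url.partition("#!")
    if not sep:
        return url  # no hashbang: the crawler requests the URL unchanged
    joiner = "&" if "?" in base else "?"
    # Special characters in the fragment are percent-encoded (e.g. "/" -> %2F)
    return f"{base}{joiner}_escaped_fragment_={quote(fragment, safe='')}"

print(escaped_fragment_url("http://www.example.co.uk/#!about/c1enr"))
# -> http://www.example.co.uk/?_escaped_fragment_=about%2Fc1enr
```

That `%2F` in the example matches the `about%2Fc1enr` URL in the comment above, which is what makes me think this is that scheme rather than hand-rolled cloaking.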
Has this SEO ethos changed, and should we all now be using stealth mechanisms to serve different content to the search engines vs. the end user?
'Special search engine versions' sounds like a complete ethos and paradigm shift to me, not to mention dodgy as hell!
Your advice is appreciated.