Posted 12 March 2010 - 05:44 AM
I have heard about a technique where a robots meta tag with "noindex, follow" is used on certain pages, such as HTML sitemaps, to keep the page out of the index while still passing PageRank on to the pages it links to. I don't quite understand the value of this technique. Has anybody had any experience with it?
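For reference, the tag I'm asking about is a single line placed in the page's head; something like this:

```html
<!-- In the page's <head>: ask search engines to keep this page
     out of their index, but still crawl and follow its links -->
<meta name="robots" content="noindex, follow">
```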
Posted 12 March 2010 - 01:56 PM
Some people want to hide their linking strategies (even internally) from simple searches to limit the amount of competitive intelligence they divulge. For example, many black hat SEOs operate what are called "blog farms" -- large groups of autogenerated blog sites. The linking pages on these domains might be kept invisible so that people evaluating those sites don't see the whole picture.
Some SEOs may feel they are preventing an "unimportant" page from appearing in SERPs. However, an HTML sitemap is usually a very important page. Unfortunately, too many SEOs bought into the nonsense theories behind PageRank sculpting and didn't stop to evaluate user behavior.
Some people might noindex a page either for aesthetic reasons or because they are uncertain about its future. I've seen some designers take a page out of the index when they deprecate it; deindexing is often a step prior to deletion when a page has many inbound links.
In short, there could be any number of reasons why people do this. None of them are universally applicable or valuable. You have to decide whether to do this on a case-by-case basis.
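If you're auditing a site case by case, the first step is simply finding out which pages carry a robots meta directive. Here's a minimal sketch using only the Python standard library; the class and function names are my own, not from any particular SEO tool:

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the directives from any <meta name="robots"> tags on a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots":
            # Split e.g. "noindex, follow" into normalized tokens
            content = attrs.get("content", "")
            self.directives.extend(
                token.strip().lower()
                for token in content.split(",")
                if token.strip()
            )


def robots_directives(html):
    """Return the list of robots meta directives found in an HTML document."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.directives
```

You could feed this the HTML of each page on your sitemap and flag anything marked "noindex" for a closer look.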
Posted 12 March 2010 - 05:05 PM