"google Likes Sites That Have More Pages"
Posted 08 January 2007 - 11:01 AM
This prevents sites with tons of pages and good PR from having an unduly strong showing in the SERPs, and gives alternative sites more of a chance. It also tends to do bad things to many spammers.
Therefore, a site with lots of pages will no longer have as large an influence as it used to - an admission, incidentally, that it once did, and that Google had to try to correct for it.
It's still a good idea to have enough content to be useful to a visitor and to be considered significant to a search engine. It's not strictly necessary, as has been mentioned earlier in this thread (there are many PDFs that do very well, and they only count as a single page), but all things being equal, it makes sense that more *good* pages are better than fewer.
Personally, I consider 5 pages to be the minimum I'll work with when a client wants a site translated into another language, for example (this is roughly the equivalent of a new site). Sure, you could do only one page and probably get ranked somewhere, but 5 is much more comfortable for me. Anything less and, IMO, you are being ranked on your links alone rather than your content and links, and that's a position I try never to be in.
Posted 08 January 2007 - 02:57 PM
The size of a site is only indirectly related to this - in terms of internal links and opportunities to use keywords. But:
1. An orphaned or poorly linked-to page in a great site with lots of pages will not do well. This is common in large, multi-department sites.
2. A well linked-to page in an otherwise forgettable or very small site can do very well indeed. This happens all the time with blogs.
Therefore the size of the site is not the point - the links and content are.
Now, when you control a site, obviously it's far easier to control the linking and content structure, and thereby affect the ranking of a page. The more pages you have, the more opportunities you have to do this. But it's not about Google "liking" a big site - it really doesn't care how big or small it is - it only cares about links and page content.
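To make points 1 and 2 above concrete, here's a toy sketch of a PageRank-style calculation (my own illustration, not Google's actual code or weights - the URLs and all the numbers are made up). The link graph decides the score, not the page count:

```python
# Toy PageRank sketch (hypothetical illustration, not Google's algorithm):
# a well linked-to page on a tiny site can outscore an orphaned page
# on a much larger site.

def pagerank(links, iterations=50, d=0.85):
    """Iteratively compute PageRank over a dict of page -> outbound links."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - d) / len(pages) for p in pages}
        for page, outs in links.items():
            if not outs:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += d * rank[page] / len(pages)
            else:
                for out in outs:
                    new_rank[out] += d * rank[page] / len(outs)
        rank = new_rank
    return rank

# Big site: several pages, but nothing links *to* "big/orphan".
# Small site: only three pages, but they all point at "small/star".
links = {
    "big/home":   ["big/a", "big/b", "big/c"],
    "big/a":      ["big/home"],
    "big/b":      ["big/home"],
    "big/c":      ["big/home"],
    "big/orphan": ["big/home"],           # no inbound links at all
    "small/home": ["small/star"],
    "small/a":    ["small/star", "small/home"],
    "small/star": ["small/home"],
}
ranks = pagerank(links)
print(sorted(ranks.items(), key=lambda kv: -kv[1]))
# "small/star" scores well; "big/orphan" sits near the bottom despite
# belonging to the larger site.
```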
It may sound like I'm splitting hairs here, but it's important to understand the difference. Saying that Google likes big sites implies the following:
1. I can auto-generate a million pages and my site will suddenly rank better - FALSE
2. I can have a huge site with great content but not bother to link to the pages and I will still rank well because the site is big - FALSE
3. I can have a huge site with serious redirection, DNS, robots.txt issues and all will be fine - FALSE
4. My small but good site can't compete with a large but bad site - FALSE
5. If I have 50 pages and my competitor has 51, he will beat me - FALSE
I'm sure there are many other examples. Saying Google likes big sites is looking at a symptom and calling it a cause.
Example Claim: Google also likes sites in the English language.
My proof? Most of the highest ranking sites are in English.
Of course this is nonsense. But indirectly, since currently most authoritative sites are in the English language and they tend to only link to other English language sites, there is an indirect result of English sites being better able to get links and therefore PR.
On the flip side, you could point out that since the competition is far less, that non-English sites are liked better by Google, since you can rank well much more easily.
In reality, talking about Google "liking" or "disliking" English sites is totally misleading, since you would be looking at the wrong issue. The same applies to the size of a site - it's an indirect indication of content and opportunities for links, but is not a reason to "like" or "dislike" a site on that basis.
A good theory is supposed to help you make decisions and predictions. "Big site" vs "smaller site" does not do this. "Good link structure" vs "bad link structure" is much closer, and "good links and content" vs "bad links and content" is the closest. It implies that in order to do better in the rankings, you need to add more good content and get good links - a much more plausible theory.
If it doesn't explain previous results AND allow you to predict future behaviour, it's an incomplete or bad theory - basic scientific principle.
Posted 10 January 2007 - 03:52 PM
I agree with what most people are saying in this thread, but I think there are some algorithms Google uses that are domain-specific and aren't just specific to pages or groups of pages (sites). For instance, Google is a registrar and has access to whois records, which it can use for monitoring expired domains and applying the aging delay. There are probably other filters applied at the domain level which affect how much Google 'likes' or 'dislikes' a domain and, therefore, a site.
Posted 10 January 2007 - 04:24 PM
It isn't unreasonable to suggest that Google likes big sites, because if you actually go and look at some of the most competitive keywords and then look at the sites that are dominating, you may notice that the sites doing well are fairly large. Look a little further into it and you'd also notice that most offer a combination of unique and original content too. Where sites contain stuff that can be found elsewhere, you may also see that a lot of them offer added value by way of additional tools, or reviews, or comparisons and the like.
Here are 3 queries plucked from the hole in my head.
Holiday in Ibiza
Is there a single small site in any of those SERPs? At a glance, no. The reason being that for most queries there are a zillion and one other related topics and items. It follows that anyone who is seriously tackling any of these markets will be looking to build a kick-ass resource. A consequence of this is that you end up building lots of content and adding lots of value. It's this content and value that helps build your trust and authority, not the random creation of lots of same ol' same ol' pages.
So, um... yeah, in summation: I agree G prolly does like big sites, but not just because they are big - rather because they are broad, have depth, and are repeatedly visited by their users.
Posted 11 January 2007 - 03:08 AM
I would agree with that, as far as it goes. Obviously a site that is considered to be spamming would harm the rankings of individual pages within it, even if a few of those pages happen to be "clean", for example. Other examples include the aging delay, etc.
But looking at the domain in this way is still part of the process of evaluating the *page* (and its relationships to other pages) that Google is looking at in response to a query. It doesn't care about the other pages in the site that are not potential responses to the query at that point.
For example, when many pages from a single domain (or related domains) are retrieved as a response to a query, Google runs a filter to only show a couple of them, rather than allowing a single site to have the top 10 positions just because it has 10 really well optimized pages on the subject at hand.
That filter is obviously using the common domain as something it looks at, but only in relation to the pages it's considering. The fact that the site in question has 5 or 5,000 other pages that are not potential results does not come into the picture. In short, the size of the site isn't looked at.
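For anyone curious what that filter looks like in principle, here's a minimal sketch of the behaviour described above (an assumption on my part, not Google's implementation - the cap of 2 per domain and the URLs are made up). Note it caps results per domain in rank order, and never needs to know how many other pages each site has:

```python
# Minimal host-crowding sketch (assumed behaviour, not Google's code):
# keep at most `per_domain` results from any one domain, in rank order.

from urllib.parse import urlparse

def crowd_filter(ranked_urls, per_domain=2):
    """Drop results beyond the per-domain cap, preserving rank order."""
    seen = {}
    filtered = []
    for url in ranked_urls:
        domain = urlparse(url).netloc
        seen[domain] = seen.get(domain, 0) + 1
        if seen[domain] <= per_domain:
            filtered.append(url)
    return filtered

results = [
    "http://bigsite.com/page1",
    "http://bigsite.com/page2",
    "http://smallsite.com/page1",
    "http://bigsite.com/page3",   # third hit from the same domain: dropped
    "http://othersite.com/page1",
]
print(crowd_filter(results))
# ['http://bigsite.com/page1', 'http://bigsite.com/page2',
#  'http://smallsite.com/page1', 'http://othersite.com/page1']
```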