
URL Structures: Deep Pages = Weaker Ranking?


28 replies to this topic

#16 Alan Perkins

Alan Perkins

    Token male admin

  • Admin
  • 1,642 posts
  • Location:UK

Posted 21 November 2011 - 03:54 PM

Hi kgerson

There is some truth in the "flat site architecture" theories, and some fallacy. It's definitely true that you shouldn't introduce artificial layers of depth (e.g. "roster" when talking about sports injuries might be an example of this).

If you've been doing a lot of reading around then no doubt you've come across the concept of a "crawl budget", and this is something I do subscribe to. In a nutshell, if your site is a Pagerank 1 and has a million similar pages, it's unlikely that many of them will be crawled and indexed - but if it's a Pagerank 10 and has a million unique pages, then it's likely that most of them will be crawled and indexed. So, with your information architecture, what you're looking to do is make the most of your crawl budget to get as many of your unique pages crawled and indexed as possible; and to present those pages in a way that maximises their chances of ranking for your target keywords, by placing pages that are targeting more competitive keywords more strongly in the information architecture than pages that are targeting very low competition, long tail phrases.

Artificial layers of depth consume crawl budget needlessly. But true layers of depth help you to make more effective use of your "link juice" to target head and tail terms appropriately.
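Alan's crawl-budget point can be sketched as a toy model. Everything here is invented for illustration (the linear budget, the shallow-first ordering, the numbers); no search engine publishes its actual crawl policy:

```python
# Toy model of the "crawl budget" idea: a hypothetical engine allocates
# fetches in proportion to site authority and crawls shallower pages
# (fewer clicks from home) before deeper ones. Purely illustrative.

def pages_crawled(pages, authority, budget_per_point=100):
    """pages: list of (url, depth) tuples; authority: a 1-10 score."""
    budget = authority * budget_per_point
    # Shallow pages get crawled first, so they spend the budget first.
    by_depth = sorted(pages, key=lambda p: p[1])
    return [url for url, _ in by_depth[:budget]]

# A 1,000-page site whose pages sit at depths 0 through 4.
site = [("/post-%d" % i, i % 5) for i in range(1000)]

low = pages_crawled(site, authority=1)    # only 100 pages reached
high = pages_crawled(site, authority=10)  # all 1,000 pages reached
```

Under this (assumed) model, artificial depth hurts twice: it pushes unique pages further down the shallow-first queue, and it spends budget on the filler layers themselves.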

#17 newhat

newhat

    HR 2

  • Members
  • PipPip
  • 22 posts

Posted 21 November 2011 - 04:25 PM

QUOTE(Alan Perkins @ Nov 21 2011, 03:54 PM)
If you've been doing a lot of reading around then no doubt you've come across the concept of a "crawl budget", and this is something I do subscribe to. In a nutshell, if your site is a Pagerank 1 and has a million similar pages, it's unlikely that many of them will be crawled and indexed - but if it's a Pagerank 10 and has a million unique pages, then it's likely that most of them will be crawled and indexed. So, with your information architecture, what you're looking to do is make the most of your crawl budget to get as many of your unique pages crawled and indexed as possible; and to present those pages in a way that maximises their chances of ranking for your target keywords, by placing pages that are targeting more competitive keywords more strongly in the information architecture than pages that are targeting very low competition, long tail phrases.

Artificial layers of depth consume crawl budget needlessly. But true layers of depth help you to make more effective use of your "link juice" to target head and tail terms appropriately.

I appreciate the comment but, quite honestly, what does any of this say? If you've got an Amazon.com-sized budget, all your pages will be crawled. If you're in the middle of the pack, you hope that a couple of your pages will be indexed higher and focus on those.

This is not always simple. In the past, I was able to use URLs of the form domain.com/cat/subcat/postname/

I ended up shortening the URL to get rid of the subcat because it made the URL longer and frequently duplicated the postname, e.g.:

allsports.tld/baseball/mlb/redsox/red-sox-sign-albert-pujols/

That created a bit of a problem since you'll see the URL length will duplicate. However, the division of the actual directories in the breadcrumbs remained the same. So what is a site supposed to do to reduce the clicks and focus on that page? Should it move to a flatter hierarchy, or move the red sox down to the baseball sublevel for the sake of giving them more presence?

Now here's the quandary. What about a site that is just about baseball? Their URLs are just:

mlb/redsox/postname/

Do they automatically rank better than you do, assuming speed and backlinks are of the same number and quality? I think you understand where I'm going with this...



#18 Alan Perkins

Alan Perkins

    Token male admin

  • Admin
  • 1,642 posts
  • Location:UK

Posted 21 November 2011 - 04:44 PM

QUOTE
That created a bit of a problem since you'll see the URL length will duplicate.


Sorry, I don't follow what you mean by this.

Try not to think in terms of URL. If you have a 1 million page website, the URLs could be as simple as

www.mysite.com/000000, www.mysite.com/000001 ... www.mysite.com/999998, www.mysite.com/999999



#19 newhat

newhat

    HR 2

  • Members
  • PipPip
  • 22 posts

Posted 21 November 2011 - 05:19 PM

Mistype - I meant that the duplication will appear in the URL, e.g. baseball/red-sox/red-sox-win-the-world-series

Virtually all of the headlines in red-sox will have the words "red sox" in them, e.g. "red sox acquire albert pujols", "red sox buy the yankees", etc., instead of the engine crawling all the way to the end of the URL and focusing on the rest, e.g. "albert pujols".
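One way to tackle that slug redundancy (a hypothetical helper, not something either poster describes actually running) is to strip the section name from the front of the post slug when building the URL:

```python
# Hypothetical helper: drop a leading section name from a post slug so
# /baseball/red-sox/red-sox-win-the-world-series/ becomes
# /baseball/red-sox/win-the-world-series/. The headline itself keeps
# "red sox"; only the URL loses the repetition.

def build_url(category, team, slug):
    prefix = team + "-"
    if slug.startswith(prefix):
        slug = slug[len(prefix):]
    return "/%s/%s/%s/" % (category, team, slug)

print(build_url("baseball", "red-sox", "red-sox-win-the-world-series"))
# /baseball/red-sox/win-the-world-series/
```

Slugs that don't start with the team name pass through unchanged, so `build_url("baseball", "yankees", "sox-beat-yankees")` keeps its full slug.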

#20 Alan Perkins

Alan Perkins

    Token male admin

  • Admin
  • 1,642 posts
  • Location:UK

Posted 21 November 2011 - 05:44 PM

QUOTE
Virtually all of the headlines in red-sox will have the word "red sox" in them


Only because you put them there ...

#21 newhat

newhat

    HR 2

  • Members
  • PipPip
  • 22 posts

Posted 21 November 2011 - 05:55 PM

QUOTE(Alan Perkins @ Nov 21 2011, 05:44 PM)
Only because you put them there ...

Because they make sense in the headline - hence I avoid repeating them in the URL. For example, what am I supposed to write as an article headline if it's all about what's happening with the Red Sox and I need to get that message clear in the headline? It will create natural redundancy and also a long URL. Every team's URL will look like this:

team/team-does-this, team/team-does-that

Given that SEs supposedly take into account mostly a certain number of characters from the left, you'd best use that space wisely instead of piling up duplicate keywords in the URL if they can be avoided. Hence the shorter URL will work just as effectively, and you won't be penalized for keyword stuffing either (directly or indirectly).

Anyway, the issue was URL structures and depth. I'm struggling with that same issue on my site and, to some extent, I'm told that there is some validity to the depth theory. If I showed you some sites that have blown by my traffic thanks to Panda, you'd be disgusted. It is what it is, and we have to figure out how all these mechanics actually work at the end of the day.

#22 Michael Martinez

Michael Martinez

    HR 10

  • Active Members
  • PipPipPipPipPipPipPipPipPipPip
  • 5,065 posts
  • Location:Georgia

Posted 21 November 2011 - 06:08 PM

QUOTE(kgerson @ Nov 21 2011, 12:20 PM)
It's not crap at all. You can tell me all day about MUST categorizing, but SEO experts (everyone is, lol) have repeatedly told me that the lower down the level is, i.e. the closer to the root/home, the better results you'll get. Hence you might make decisions you don't want to make, but they might be better if your main money is coming from those categories.


It's crap, it's crap, it's crap. It DOES NOT MATTER where the content is placed in the chain of links from your home page. It never has mattered, it never will matter.

People who don't know what they are talking about have endlessly repeated this nonsense without any clue.

Web pages perform well because of the value they provide and the value that is attached to them by other pages, and where they are in the URL structure or the site navigation is IRRELEVANT.

#23 newhat

newhat

    HR 2

  • Members
  • PipPip
  • 22 posts

Posted 21 November 2011 - 06:36 PM

QUOTE(Michael Martinez @ Nov 21 2011, 06:08 PM)
It's crap, it's crap, it's crap. It DOES NOT MATTER where the content is placed in the chain of links from your home page. It never has mattered, it never will matter. People who don't know what they are talking about have endlessly repeated this nonsense without any clue.

Web pages perform well because of the value they provide and the value that is attached to them by other pages, and where they are in the URL structure or the site navigation is IRRELEVANT.

Many of the things I'm repeating are provided courtesy of Google's bionic webmasters (granted, I think some of them have little clue despite being honored by Google). If web pages perform well because of the value they provide and the value attached to them, then I'll show you some sites that have suddenly been beating mine in overall traffic and you can explain to me what "value" is.

I don't know you yet, but I assume you have put a great deal of your time into studying this phenomenon known as SEO. I guess part of the issue is defining what "site architecture" means, since one might be able to get to a page from several locations. For example, what if you had a site map of links at the bottom of your pages, as I've seen on several sites, which makes the first 40 or so pages all reachable in the same number of clicks? For example, using my guide, what if I put my favorite baseball teams in the footer so that a link to each team page is off the home page? Would all those pages rank higher even if the breadcrumb said differently?

footer
------
redsox, whitesox, royals, angels, dodgers, padres, rangers, astros, marlins, orioles, devil rays

breadcrumb on the page:

home >> baseball >> AL >> Red Sox

So which route does the search engine prioritize? Is it following the direct route off the home page (one click) or is it following the breadcrumb? What if there was no breadcrumb at all? Would the teams in the footer rank above the other teams? You and I know the structure, but would anyone else?
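If one assumes a crawler measures depth as the shortest click path from the home page (which is exactly the assumption under debate in this thread), the footer link would make the team page one click deep regardless of what the breadcrumb says. A minimal sketch with a made-up link graph:

```python
from collections import deque

# Minimal sketch: depth as the shortest click path from the home page,
# computed by BFS. The link graph is invented; whether engines actually
# use shortest-path depth this way is the open question in the thread.

links = {
    "/": ["/baseball/", "/redsox/"],   # footer links the team page directly
    "/baseball/": ["/baseball/al/"],
    "/baseball/al/": ["/redsox/"],     # breadcrumb route: three clicks
    "/redsox/": [],
}

def click_depth(start="/"):
    depth = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in depth:
                depth[nxt] = depth[page] + 1
                queue.append(nxt)
    return depth

print(click_depth()["/redsox/"])  # 1, thanks to the footer link
```

Under this assumption the footer wins: the breadcrumb route still exists, but BFS records only the shortest path, so /redsox/ sits at depth 1 while /baseball/al/ sits at depth 2.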

Here's another iteration:

baseball.domain.com/redsox/page.html

versus

www.domain.com/baseball/redsox/page.html

===

There are several differences I've noticed and I'm curious to hear your opinion. Glad to share mine.

#24 Alan Perkins

Alan Perkins

    Token male admin

  • Admin
  • 1,642 posts
  • Location:UK

Posted 21 November 2011 - 07:14 PM

QUOTE(newhat)
Given that SEs supposedly take into account mostly a certain number of characters from the left, you'd best use that space wisely instead of piling up duplicate keywords in the URL if they can be avoided.


No. You're losing the plot of this thread here. Go back to the start - depth is not about how long the URL is, it's about the number of clicks from the home page.

QUOTE
It is what it is and we have to figure out how all these mechanics actually work at the end of the day.


True! But I think you're focusing on the wrong things.

QUOTE(Michael Martinez)
Web pages perform well because of the value they provide and the value that is attached to them by other pages, and where they are in the URL structure or the site navigation is IRRELEVANT.


No that's wrong. If you bury content deep in your IA it won't perform as well as if you place the same content on your home page (on the understanding that most if not all of the best links to your site are to the home page, which is normally the case).






#25 Michael Martinez

Michael Martinez

    HR 10

  • Active Members
  • PipPipPipPipPipPipPipPipPipPip
  • 5,065 posts
  • Location:Georgia

Posted 21 November 2011 - 08:13 PM

The Bionic posters and I go way back. Some of them give good advice and some of them give not-so-good advice.

At the end of the day, you pick a Website structure that you can live with and you go with that until it stops working. I wouldn't focus too much on "SEO advice". I can blindly hand out SEO advice all day. Some of it might even be relevant and useful to someone.

As Jill often says, test everything (but understand the risks and potential rewards).

QUOTE
If you bury content deep in your IA it won't perform as well as if you place the same content on your home page...


No, THAT's wrong.

I've been burying content deep in my IA for years and it has performed as well as root pages.

#26 Alan Perkins

Alan Perkins

    Token male admin

  • Admin
  • 1,642 posts
  • Location:UK

Posted 22 November 2011 - 05:11 AM

QUOTE(Michael Martinez)
I've been burying content deep in my IA for years and it has performed as well as root pages.


That's possible in some circumstances, e.g. if you have a high authority site and/or lots of inbound links to deep pages, especially if the site is a high authority on some fairly low competition/long tail terms. But it's not something that holds up in practice generally and nor, theoretically and logically, should it.

#27 Michael Martinez

Michael Martinez

    HR 10

  • Active Members
  • PipPipPipPipPipPipPipPipPipPip
  • 5,065 posts
  • Location:Georgia

Posted 22 November 2011 - 04:56 PM

QUOTE(Alan Perkins @ Nov 22 2011, 02:11 AM)
That's possible in some circumstances, e.g. if you have a high authority site and/or lots of inbound links to deep pages, especially if the site is a high authority on some fairly low competition/long tail terms. But it's not something that holds up in practice generally and nor, theoretically and logically, should it.


It holds up fine in practice across site after site, Alan. The idea that deeply buried content cannot perform as well as a root domain is an SEO myth. It's not about authority, it's about the frame of reference that the search engine has to work with. That frame of reference can be changed for any query.

#28 Alan Perkins

Alan Perkins

    Token male admin

  • Admin
  • 1,642 posts
  • Location:UK

Posted 22 November 2011 - 05:41 PM

OK, well, I disagree, Michael, based on my own extensive experience. But if you're happy doing it your way, I see no reason to try to persuade you otherwise ...

#29 newhat

newhat

    HR 2

  • Members
  • PipPip
  • 22 posts

Posted 22 November 2011 - 08:09 PM

QUOTE(Michael Martinez @ Nov 21 2011, 08:13 PM)
At the end of the day, you pick a Website structure that you can live with and you go with that until it stops working. I wouldn't focus too much on "SEO advice". I can blindly hand out SEO advice all day. Some of it might even be relevant and useful to someone.

Respectfully, I couldn't disagree more with this statement. If anything, doing this will cause a site owner to pay some developer or "SEO Consultant" far more money than necessary. And yes, if you have a huge corporate budget and plenty of content to play with, you can do anything and continue to milk the cow and show results.

The fact is that certain choices are better than others. You may not want to spill these "secrets", but they are out on the web en masse for anyone who wants to look. Right now I'm dealing with a problem that concerns the above. Making changes to the structure of your website can be like making changes to the foundation of a 10-story building five years later. Perhaps it is best to hear some good SEO advice all day before making major site architecture decisions. In my case, what I had done could not have been foreseen, but it is what it is. Planning in the early stages is extremely important, and there are some rules that don't change much later.



