
Site/linking Structure For Browsing Routes



#1 ArthurS

    HR 1

  • Members
  • 8 posts
  • Location:US

Posted 28 May 2013 - 11:37 AM

Hello All!

 

I've been pondering on this topic myself and searching the web for a while, but haven't come up with a definite solution. Here is the situation in short:

I'm working on a site that routes point A to point B. There are literally thousands of such points. The site includes browsing functionality, but since there is a nearly countless number of combinations of A and B, I will be limiting the As and Bs to the more popular ones.

 

Here come the two questions:

1. The easy one: How many combinations of A and B are practical to include in the browsing functionality so as not to 'overwhelm' users and/or search engines? If, say, I take only the 10 most popular points from each region, I still end up with well over a million pages, which is probably beyond practical. Or not? Does it matter at all how many pages there will be?

 

2. The tough one: how do I structure the links for browsing? Naturally I'm not going to just lay out a million links divided across however many pages - that would be impractical for both human users and search engines. For a human user, I figure the browsing logic will probably make sense: select origin region -> select country of origin -> select municipality -> select point of origin, THEN: -> select destination region -> select destination country -> select destination municipality -> select destination point. However, I have serious doubts that search engines are going to want to dig that deep, considering there is very little valuable information (besides lists of countries, municipalities, etc.) until you finally get to the resulting page - which is where the interesting stuff is supposed to be. So, the question is: what linking structure will be practical for human users AND search engines (and if there is such a way, how to tell search engines not to bother too much indexing the interim pages and just continue to the last page)?

 

Sorry if this sounds confusing; I'm ready to provide examples and further explanations if necessary. I would be very interested to hear comments and suggestions.



#2 chrishirst


    A not so moderate moderator.

  • Moderator
  • 6,946 posts
  • Location:Blackpool UK

Posted 29 May 2013 - 08:38 AM

Does it matter at all how many pages there will be?

 

No, provided that only ONE URI can be used to access the document.

 

 

how to tell search engines not to bother too much indexing interim pages and just continue to the last page).

Search engines DO NOT follow a particular "route" through a site structure to a URL. If there is a link to that URL, they go DIRECTLY to it.
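
To make that concrete: a crawler works from a list of URLs it has already discovered, fetches each one, and adds every href it finds to that list. A rough Python sketch of the idea (the library choices and structure here are illustrative assumptions, not something prescribed in this thread):

    # Rough sketch of a crawl frontier: fetch known URLs, harvest hrefs,
    # queue them directly. There is no notion of "clicking through" a
    # navigation path. (requests/BeautifulSoup are my own library choices.)
    from collections import deque
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    def crawl(seed_urls, max_pages=100):
        frontier = deque(seed_urls)   # every discovered URL is an entry point
        seen = set(seed_urls)
        while frontier and len(seen) <= max_pages:
            url = frontier.popleft()
            html = requests.get(url, timeout=10).text
            for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
                absolute = urljoin(url, a["href"])
                if absolute not in seen:       # a deep URL linked from here is now
                    seen.add(absolute)         # one "hop" away, no matter how deep
                    frontier.append(absolute)  # it sits in the visual navigation
        return seen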



#3 ArthurS

    HR 1

  • Members
  • 8 posts
  • Location:US

Posted 29 May 2013 - 09:36 AM


 

No, provided that only ONE URI can be used to access the document.

 

 

Search engines DO NOT follow a particular "route" through a site structure to a URL. If there is a link to that URL, they go DIRECTLY to it.

 

Chrishirst, thanks for your input!

 

Yes, each URI is unique. So I suppose that won't be a problem then.

 

With the second part I'm still unclear. Let me give an example to clarify:

The final URI that I'd hope to get indexed will have a form of:

?regionFrom=2&ctryFrom=50&areaFrom=123123&poiFrom=123131313213&regionTo=5&ctryTo=33&areaTo=321321&poiTo=34546135465.

It is 8 clicks away from the browse page, and in order to get there a spider would have to go through:

?regionFrom=2

?regionFrom=2&ctryFrom=50

?regionFrom=2&ctryFrom=50&areaFrom=123123

?regionFrom=2&ctryFrom=50&areaFrom=123123&poiFrom=123131313213

?regionFrom=2&ctryFrom=50&areaFrom=123123&poiFrom=123131313213&regionTo=5

?regionFrom=2&ctryFrom=50&areaFrom=123123&poiFrom=123131313213&regionTo=5&ctryTo=33

?regionFrom=2&ctryFrom=50&areaFrom=123123&poiFrom=123131313213&regionTo=5&ctryTo=33&areaTo=321321

All one would see on the way are lists of links to regions, countries, cities and POIs, which probably isn't of much value. What I doubt is that a spider will bother going through 7-8 pages full of links and not much other content, especially on a newer site with little reputation...

 

Perhaps what I should also explain is why I wanted to have the browsing functionality, in case it wasn't clear from the original post:

 

It is mostly to give something to the search engines, because for human users the easiest and fastest way to find the needed information is to enter values into the FROM and TO fields and press 'Enter'. So the plan was to create a bunch of links to pages with routing results, which would be indexed and serve as doorway pages to the site (how much is 'a bunch' was the first part of the question). Now that I think about it, perhaps in this context I should emphasize the first part of the question more than the second - would it maybe be better to link to the 10-20 most popular/most recent search results rather than trying to index 100,000 pages that may be interesting but take almost 10 clicks to get to?



#4 chrishirst


    A not so moderate moderator.

  • Moderator
  • 6,946 posts
  • Location:Blackpool UK

Posted 29 May 2013 - 03:50 PM

Maybe but search engines DO NOT start "at the home page" and traverse ALL the way down.

 

If there is a link to a URL that is ten layers down, they WILL simply go to that URL. They don't 'click links'; they read source code and add the href attribute value to their "crawl list".

 

BUT with URLs like that I wouldn't give much chance of ANY URLs that are NOT linked directly from a 'static' document URL getting indexed. AND they are going to create HUGE amounts of duplicate content, because there is MORE THAN A SINGLE URI pointing to the SAME content.



#5 ArthurS

    HR 1

  • Members
  • 8 posts
  • Location:US

Posted 30 May 2013 - 02:17 AM

Hi and thanks again

 

"Maybe but search engines DO NOT start "at the home page" and traverse ALL the way down."

Maybe, but with my current linking structure (the effectiveness of which I seriously doubt) they can't get to the final page

?regionFrom=2&ctryFrom=50&areaFrom=123123&poiFrom=123131313213&regionTo=5&ctryTo=33&areaTo=321321

without first visiting

?regionFrom=2&ctryFrom=50&areaFrom=123123&poiFrom=123131313213&regionTo=5&ctryTo=33

because this is the only page that has a link to the final page.

 

If there is a link to a URL that is ten layers down, they WILL simply go to that URL. They don't 'click links'; they read source code and add the href attribute value to their "crawl list".

That's clear, but as I explained above, they (the spiders) still need to read the content of the page higher in the hierarchy in order to get the URL of the final (results) page. There is no direct link to the last page from anywhere else besides the previous page.

 

BUT with URLs like that I wouldn't give much chance of ANY URLs that are NOT linked directly from a 'static' document URL getting indexed. AND they are going to create HUGE amounts of duplicate content, because there is MORE THAN A SINGLE URI pointing to the SAME content.

 

This I didn't get at all. Although the interim pages may be the result of a database query, they are pretty much static - their content doesn't change much (at least until new continents or countries emerge).

The content of each page is unique; there is absolutely zero duplicate content. Example: when selecting the point of origin you choose the Americas - you get North, Central and South America with lists of the respective countries. This is just interim information in the context of the site, but it is unique and is not repeated on any other URI... well, except when making the selection for the destination region - if it happens to be within the same region, you will get the same list of countries, and the only difference will be the breadcrumbs in the header. Is this what you mean by duplicate content?



#6 chrishirst


    A not so moderate moderator.

  • Moderator
  • 6,946 posts
  • Location:Blackpool UK

Posted 30 May 2013 - 11:55 AM

because this is the only page that has a link to the final page.

 

 

Then YOU need to change that.

 

 

Content of each page is unique, there is absolutely zero duplicate content.

 

If more than ONE URL will show the same content THERE IS.

 

 

"duplicate content" IS NOT just about using the same content on more 'pages'. THE BIGGEST cause of duplication is content on one site, that is accessible via MORE THAN ONE URL.

 

For example

 

site.tld

www.site.tld

site.tld/index.php

www.site.tld/index.php

 

That's FOUR copies BEFORE URL parameters are put into the mix.
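
As an illustration of collapsing those four variants onto one URL (the hostnames are the placeholders from the post; in practice this is usually handled with 301 redirects or a rel="canonical" tag rather than application code), here is a minimal Python sketch:

    # Minimal sketch: normalise the four variants to a single canonical URL.
    # The rules below (drop "www.", treat /index.php as the root) are
    # illustrative assumptions, not a prescription.
    from urllib.parse import urlsplit, urlunsplit

    def canonicalize(url):
        scheme, host, path, query, _ = urlsplit(url)
        if host.startswith("www."):
            host = host[len("www."):]          # pick ONE hostname and stick to it
        if path in ("", "/index.php"):
            path = "/"                         # the default document is the root
        return urlunsplit((scheme or "http", host, path, query, ""))

    for u in ("http://site.tld", "http://www.site.tld",
              "http://site.tld/index.php", "http://www.site.tld/index.php"):
        print(canonicalize(u))                 # all four print http://site.tld/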



#7 ArthurS

    HR 1

  • Members
  • 8 posts
  • Location:US

Posted 31 May 2013 - 10:38 AM

Hi again!

 

Thank you for trying to help, but unfortunately this is mostly general advice. Although not an SEO expert, I have been building sites for the past 10 years, so I am familiar with the basics. The question is of a more practical and specific nature: I have a website with several million unique content pages, but the content is dynamic and is generated in response to user input, so it's not visible to search engines. In order to get some search engine exposure I would like to create a certain number of static links to these pages and thus make them crawlable.

So, I need to a) determine how many such links it is practical to have and b) decide how to link to them. I suppose the two questions are inter-related, because the optimal linking structure will depend on the number of links, i.e. if I had only 10 links I could link to them straight from the home page as 'most popular' or 'most recent', but if I have 100,000 links I need to organize them into categories. So, simply put: how many pages would be optimal (for a new site without much reputation), and if it's a large number of links, what linking structure would make sense?



#8 torka


    Vintage Babe

  • Moderator
  • 4,622 posts
  • Location:Triangle area, NC, USA, Earth (usually)

Posted 31 May 2013 - 11:49 AM

Perhaps the confusion is arising because you're asking questions to which there is no answer (or at least, not the kind of answer you seem to be looking for).

 

You say you want to know how many pages are "optimal" -- and I assume by that you mean "optimal for the search engines", because your users will likely input their location and destination instead of browsing a list of hundreds or thousands of pages. (At least, I would.)

 

And the answer to the question of how many is "optimal" is: it depends.

 

It depends on who your competition is, and how good a job they're doing of optimizing. It depends on what locations are included in your database and how popular they are. It depends on what the specific search algorithm is that day, and what it will be next week and next year and beyond that (which is to say, not only does "it depend" but it will change over time, often without warning). It depends on how good you actually are at SEO. It depends on a whole lot of factors, most of which are not only out of our control, but are things about which we do not (and cannot) have any idea.

 

And, of course, all the search engines tell you to not create content specifically for them, but to create content for your users.

 

So, in fact, YOU are the only person who can answer the question of how many pages are "optimal" for your site. Ponder these points:

 

How many such pages do you have the time and inclination to set up? (This is really the key. You need to make sure you have enough time to do some real marketing for your site and that you're not just spending all your time setting up and maintaining these pages.)

 

How many really popular starting points and ending points are there? (Seriously, if it's a combination that will only be searched for once or twice a year, is it really going to be worth it for you to have taken the time to create a browsable page for that combo? Focus on the ones that will potentially bring you enough volume to generate a positive ROI for the time you spend.)

 

How often do you think your human visitors are actually going to use one of these pages? (Relates to the answer above. It's likely the majority of time these pages get used, it will be because somebody found the page in the search results. People who don't land on one of these pages right off the bat are much more likely to use the data input option. They're used to this from things like Google Maps, so it's not a big shock to the system.)

 

We can tell you how to pull this off from a technical standpoint. We can give you advice on how to structure your HTML site map page for usability, once you know how many "pre-built" pages you want to let your visitors use for browsing.

 

But when it comes to deciding how many pages you should create to start with? That's up to you.

 

My :02:

 

--Torka :propeller:



#9 Jill


    Recovering SEO

  • Admin
  • 32,963 posts

Posted 31 May 2013 - 12:12 PM

 

But if I have 100,000 links I need to organize them into categories.

 

 

 

Bingo! That's exactly what you need to do. Once you do that, you'll know how many categories you have and whether or not you also  need subcategories. With 100,000 product pages, it's likely you'll have quite a few categories and subcategories to deal with.

Don't take this task lightly. Seriously spend a lot of time to determine what would be top-level categories and what should be sub-categories and create them accordingly. 

Be sure to add the top level categories to your global navigation. And within each subcategory, be sure to have a sub-menu which links to all the other subcategories within that main category.
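
A small sketch of that structure, with made-up region/country names standing in for whatever the real top-level categories and subcategories turn out to be:

    # Toy category tree: the top-level keys go into the global navigation on
    # every page, and a subcategory page gets a sub-menu linking to its
    # sibling subcategories. All names here are invented placeholders.
    CATEGORIES = {
        "europe":   ["france", "germany", "spain"],
        "americas": ["usa", "canada", "brazil"],
        "asia":     ["japan", "china", "india"],
    }

    def nav_for(parent, current_sub):
        global_nav = [f"/routes/{c}/" for c in CATEGORIES]
        sub_menu = [f"/routes/{parent}/{s}/" for s in CATEGORIES[parent]
                    if s != current_sub]
        return global_nav, sub_menu

    print(nav_for("europe", "france"))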



#10 ArthurS

    HR 1

  • Members
  • 8 posts
  • Location:US

Posted 31 May 2013 - 04:59 PM

 

Bingo! That's exactly what you need to do. Once you do that, you'll know how many categories you have and whether or not you also  need subcategories. With 100,000 product pages, it's likely you'll have quite a few categories and subcategories to deal with.

Don't take this task lightly. Seriously spend a lot of time to determine what would be top-level categories and what should be sub-categories and create them accordingly. 

Be sure to add the top level categories to your global navigation. And within each subcategory, be sure to have a sub-menu which links to all the other subcategories within that main category.

 

Hi Jill! First off, thanks for the great informational resource and secondly thanks for your input!

 

As a matter of fact, I think what I'm doing now by narrowing down the world to a specific city can be considered categorizing in a way. The thing is, since I have to do this twice before getting to the final result (first for the point of dispatch and second for the destination point), I'm afraid the linking becomes a bit too deep for search engines to bother crawling. I had the site up for about a month and had close to 90,000 pages indexed, but none of the indexed pages seemed to have reached the final level - all I had indexed were lists of countries and cities, which isn't particularly interesting. What I did then was add direct links to routes between the world's largest cities at the top regional level. This did help get these route pages indexed, but they do not appear to be ranking well, although competition seems fairly low. So I suspect Google may be assigning them very little weight because there are simply too many such links (I took the 15 largest cities of the region and routed them to the 15 largest cities across the world's regions, resulting in about 225 route links on the page). Sounds like too many links per page, doesn't it?



#11 ArthurS

    HR 1

  • Members
  • 8 posts
  • Location:US

Posted 31 May 2013 - 05:39 PM

Torka, thank you, really appreciate your feedback and opinion!

 

I am being completely straightforward about this - the browsing functionality is 99-100% for the search engines, to create as many doorway pages to the site as practically possible. For a human user the search functionality is, without a doubt, a lot more convenient, and that's what I expect them to use when they get to the site. By the same token, what browsing yields in the end is exactly the same as what searching yields, so my conscience is clear here - I'm not trying to fool anyone, I am only trying to make search engines aware of the same things a human user is able to see.

 

Very good point on thinking about time and other resource constraints when determining the number of pages to index. The thing is, the results are calculated; I don't have to create and optimize each result page individually. Instead, I have a routing algorithm and a result output template, which lays out the route in a way that is at least search engine friendly and accentuates what is deemed to be the key information. When the routing algorithm is updated (which happens regularly), or when I update the layout template, all results are affected automatically, so I don't have to adjust each page individually. In this way, how many static links I want to create is really a matter of strategy rather than time and resources.
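
For what it's worth, a toy sketch of that "one template, many result pages" setup (the field names and route data below are invented for illustration, not taken from the actual site):

    # One template drives every generated result page, so changing the template
    # (or the routing algorithm that produces the data) updates them all at once.
    ROUTE_PAGE = ("<title>Route from {origin} to {destination}</title>"
                  "<h1>{origin} to {destination}</h1>"
                  "<p>Distance: {distance_km} km. "
                  "Estimated travel time: {hours} h.</p>")

    def render_route_page(route):
        return ROUTE_PAGE.format(**route)

    # Example with invented numbers:
    print(render_route_page({"origin": "Shelby", "destination": "Columbus",
                             "distance_km": 112, "hours": 2}))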

 

Speaking of strategy, I don't really want to get into the crazy race over the most competitive keywords, which can potentially generate substantial traffic. I believe that makes traffic too sensitive to even minor changes in search algorithms, which, as you correctly noticed, occur on a continuous basis. So, I was thinking that instead of trying to generate 100,000 visits from one page, I would try to generate 1 visit each on 100,000 pages... or 1,000,000 pages. Do you think this approach makes sense?

 

By the way, to show what I think I'd like to achieve as the end result, let me refer to a particular example. So as not to violate any forum guidelines, I will not post links, but do a search, e.g. for "distance from Shelby to Columbus", and look at the top results. OK, Columbus is a large city, but Shelby is a rather small town. How is this done? How many static links would these sites have to have indexed in order for the combination Shelby - Columbus to come up in the results? Is this just a coincidence, or is there some technical implementation I'm not aware of which allows dynamic results to get indexed?



#12 chrishirst


    A not so moderate moderator.

  • Moderator
  • 6,946 posts
  • Location:Blackpool UK

Posted 01 June 2013 - 08:20 AM

Please, please, stop this thinking that search engines "browse" navigational structures starting from the "home page" (root URL).

 

Crawling starts at ANY URL that they find a link to, and the links in that document will ALL become "entry URLs".

 

Think of your navigation as a matrix of interconnections, NOT a straight line or point-to-point hierarchies. Search crawlers/indexers have no need to go from A to Z via B, C, D ... W, X, Y in sequence, because if point A links to point Z they have found Z in one hop, and if point A links to point M which in turn links to point Z it has only been two 'hops'. EVERY URL on a website is an entry URL and as such is the first 'hop'. The idea that everything HAS to start at the root URL is a very, very "old-fashioned" idea, which dates back to BEFORE search engines ever appeared and hasn't been the case for almost twenty years.
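
A quick sketch of the "hops, not hierarchy" point: treat the links as a graph and count hops from any entry URL with a breadth-first search (the link graph below is made up):

    # If A links straight to Z, Z is one hop from A no matter how "deep" it
    # sits in the visual navigation. The link graph here is invented.
    from collections import deque

    LINKS = {"A": ["B", "M", "Z"], "B": ["C"], "M": ["Z"], "C": [], "Z": []}

    def hops_from(entry):
        dist, queue = {entry: 0}, deque([entry])
        while queue:
            page = queue.popleft()
            for nxt in LINKS.get(page, []):
                if nxt not in dist:
                    dist[nxt] = dist[page] + 1
                    queue.append(nxt)
        return dist

    print(hops_from("A"))   # {'A': 0, 'B': 1, 'M': 1, 'Z': 1, 'C': 2}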



#13 ArthurS

    HR 1

  • Members
  • 8 posts
  • Location:US

Posted 03 June 2013 - 10:17 AM

Please, please, stop this thinking that search engines "browse" navigational structures starting from the "home page" (root URL).

 

Chris, thank you. I don't know which part of my ramble left the impression that I think search engines crawl starting from the root. I don't. No worries :)

 

Thanks to everyone for their input. Having talked it out, heard opinions and given it additional thought, I think I have an idea of how I want to proceed. It will be a combination of things - I will most likely lower the total number of static links, maybe shorten and rewrite the URLs, and optimize the result pages to emphasize the most important data. Thanks again!



#14 Jill


    Recovering SEO

  • Admin
  • 32,963 posts

Posted 03 June 2013 - 10:19 AM

 

It will be a combination of things - I will most likely lower the total number of static links, maybe shorten and rewrite the URLs, and optimize the result pages to emphasize the most important data. Thanks again!

 

 

 

How did you come up with those things based on what we posted here? I don't see how any of that is going to solve your problem.



#15 ArthurS

    HR 1

  • Members
  • 8 posts
  • Location:US

Posted 03 June 2013 - 12:24 PM

Jill, it is not that anyone gave step-by-step instructions; that's not what I was looking for anyway. That's why I said 'having talked it out, given it more thought'. I have collected bits and pieces here, taken into consideration what people have said (e.g. about the number of links, categorizing, etc.) and have determined what I'm going to try to do.





