
What Internal Link Structure Is Friendliest For The Engines?


6 replies to this topic

#1 chamuel (HR 1, Members, 3 posts)

Posted 09 April 2009 - 10:01 PM

I have a new website with 500 pages of unrelated products.

Up till now, I have been getting customers 100% through PPC advertising. So I have not bothered to “index” or “link” those 500 pages together in any way that would help the spiders index all the pages.

I think up till now, they have only been indexing the HOME page (index.html).

Now -- with the help of you guys -- that is about to change!

I just need some genius’s advice about the best SIMPLE “index” or “link” system I should use, to help the spiders find all the pages (not just the home page).

I have created a sitemap.xml file and uploaded it to the site.
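Here is roughly what my sitemap.xml looks like (the domain and file names are made up for this post), following the sitemaps.org protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; repeat for all 500 product pages -->
  <url>
    <loc>http://www.example.com/index.html</loc>
    <lastmod>2009-04-09</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/product-001.html</loc>
    <lastmod>2009-04-09</lastmod>
  </url>
</urlset>
```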

But I have not created ANY links on any of the pages. I guess that means the spiders will come to my home page, ONLY index that page, and then leave.

I do NOT want visitors to any one of these 500 pages (including the home page), to visit any of the OTHER pages in my site (the products are unrelated).

Does all this make sense?

So . . . should I just put a small link on the bottom of the HOME page that says something like “map” then hyperlink it to the sitemap.xml file? Will THAT be enough to keep the spiders happy?

Do I ALSO need to put a similar link (to sitemap.xml) on each of the 500 PAGES?

All this stuff is strictly for the benefit of the search spiders.

(Heck, I don’t know if all the engines even USE sitemaps! . . . and I want to keep them ALL happily indexing my entire site.)

Can anyone tell me how to set up my links to “connect” all these pages . . . for the spiders (not just for the engines that use sitemaps).

Thanks


#2 donp (HR 4, Active Members, 149 posts, Location: N Georgia Mountains)

Posted 10 April 2009 - 07:31 AM

Ask for a site review...that will most likely get some better input.


#3 Jill (Recovering SEO, Admin, 32,982 posts)

Posted 10 April 2009 - 08:22 AM

QUOTE
I do NOT want visitors to any one of these 500 pages (including the home page), to visit any of the OTHER pages in my site (the products are unrelated).

Does all this make sense?


Nope. I don't get it at all.

Why not just have 500 separate sites then?

#4 Randy (Convert Me!, Moderator, 17,540 posts)

Posted 10 April 2009 - 08:27 AM

No site review available donp. Those are reserved for more established, active HRF members and chamuel is new here.

On to the questions!

QUOTE
All this stuff is strictly for the benefit of the search spiders.


That's a bad idea from the get-go. Any time you do something strictly for the benefit of the search engines and not users, you're in the danger zone.

QUOTE
But I have not created ANY links on any of the pages. I guess that means the spiders will come to my home page, ONLY index that page, and then leave.

I do NOT want visitors to any one of these 500 pages (including the home page), to visit any of the OTHER pages in my site (the products are unrelated).

Does all this make sense?


No it doesn't.

QUOTE
Can anyone tell me how to set up my links to “connect” all these pages . . for the spiders


You can include a reference to your XML Sitemap in your robots.txt file. But you cannot connect the pages with links, because apparently you don't want any. Note that inclusion in an XML Sitemap does not guarantee those pages will be either spidered or indexed.
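For what it's worth, the robots.txt reference is a single line, something like this (swap in your own domain):

```
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```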

You could also work at getting links from external sites to each of the 500 pages. That would give you a better chance of each page getting indexed.

But you cannot have it both ways. If you want internal linkage to help with indexing, then you need to provide real links. Something for visitors, not just for the spiders.

Frankly, I recommend you reconsider linking the pages together into a site. Because right now it's not a site, just a collection of pages.
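If you do reconsider, even a plain HTML site map page linked from the home page would do the job. A rough sketch (file names invented for illustration):

```html
<!-- sitemap.html: a human-readable site map page, linked to from the home page -->
<ul>
  <li><a href="product-001.html">Product One</a></li>
  <li><a href="product-002.html">Product Two</a></li>
  <!-- one link per product page; group into categories if you can -->
</ul>
```

Real links like these serve both visitors and spiders, which is the point.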


#5 chamuel (HR 1, Members, 3 posts)

Posted 10 April 2009 - 09:01 AM

QUOTE(Jill @ Apr 10 2009, 09:22 AM)
Nope. I don't get it at all.

Why not just have 500 separate sites then?



You mean, get 500 separate domains (each domain with an "optimized" index.html page emphasizing the keywords for that specific product)? Then submit each domain separately to the search engines?

(Please forgive my naive question . . . this is new to me . . . I don't have any SEO experience!)

I appreciate your guidance.

#6 chamuel (HR 1, Members, 3 posts)

Posted 10 April 2009 - 09:06 AM

Thanks so much for your reply (to this obvious "new" guy to SEO).

OK. I grasp your points . . . very reasonable!

What if all the pages had 95% identical content (wording) . . . with each page only containing 5% content that was unique to the specific product?

And what if I created SEPARATE domains for each product . . . with just 1 page per domain (the index.html page). And then submitted each of the 500 domains to the engines.

Would they be able to tell the content was 95% the same between all my domains? Or would they even care?

I'd like to keep things simple . . . but still keep the engines happy!

(And also . . . keep the members of this terrific forum happy, too!)




#7 Jill (Recovering SEO, Admin, 32,982 posts)

Posted 10 April 2009 - 09:55 AM

QUOTE
What if all the pages had 95% identical content (wording) . . . with each page only containing 5% content that was unique to the specific product?


Then there's no reason for them to exist, and you should just have one domain and one page.



