
Difference In The Way You Link Your Website Homepage


6 replies to this topic

#1 dominicsp


    Dominic P

  • Active Members
  • 137 posts
  • Location:India

Posted 16 February 2009 - 08:52 AM

Hi All,

I received this input from an SEO company saying that my website has some issues:

1. XML sitemap is not available. Try to prioritize the pages that you must target for the search engines and provide them in sitemap files. In future you can delete the unnecessary pages from the domain. In the long run this will have a positive impact on your site. [Dominic] Is this important?
2. Your site has a canonical issue. Don't make half of your links go to http://www.mywebsite.com/ and the other half go to http://mywebsite.com/
3. Your site doesn't have a robots file. Please add a robots.txt file to instruct the search engines to avoid accessing your unnecessary pages. [Dominic] Is this important even if I do not have any pages to exclude?

Can anyone help me understand these points and whether there is actually an issue in the first place?

Thanks in advance

Regards,
Dominic

#2 Yoshimi


    HR 3

  • Active Members
  • 56 posts

Posted 16 February 2009 - 08:57 AM

All of the things this company has informed you of are important. I would recommend that you read the Google webmaster guidelines and sign up for Google Webmaster Tools, which will give you more information, including how to set these things up.

The things they have pointed out all help the search engines to see your site in the way you want them to. The canonicalisation issue may mean that links into your site are diluted, so instead of all of your links pointing to one page they are split between two.

Whether the info they have provided means they are a good SEO company is a different matter, and if you are considering employing them on the basis of this email you should do further research.

#3 Randy


    Convert Me!

  • Moderator
  • 17,540 posts

Posted 16 February 2009 - 09:05 AM

1. No, it's not important, as long as your site is already spider friendly and the pages/links are accessible. XML Sitemaps are only used for URL discovery. Thus if your pages are already being found and indexed, an XML Sitemap isn't going to help. FWIW, there's not a single site of mine that has an XML Sitemap. I don't need 'em since my sites are built to be spider friendly. So for me the XML Sitemap would be a time-wasting duplication.

If there's something that keeps the spiders from being able to spider some of your pages, an XML Sitemap can help the engines in the discovery process. Otherwise it's a useless exercise.
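
For reference, an XML Sitemap is just a file in your site root that lists the URLs you want the engines to know about, in the sitemaps.org format. A minimal sketch (the URLs and values below are only placeholders) looks like this:

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <!-- One <url> entry per page you want the engines to discover -->
    <url>
      <loc>http://www.mywebsite.com/</loc>
      <changefreq>weekly</changefreq>
      <priority>1.0</priority>
    </url>
    <url>
      <loc>http://www.mywebsite.com/about.html</loc>
      <priority>0.5</priority>
    </url>
  </urlset>

You'd save that as sitemap.xml and submit it through Google Webmaster Tools.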

2. This can be an important one sometimes. By having your pages available at both the www and non-www address you can effectively end up splitting your PageRank/Link Popularity. It is better to concentrate this into a single address. Additionally, if you have an SSL/Secure side of your site you'll want to be very careful with canonical issues. Your SSL certificate will be valid for only one url address. Either the www address or without the www. Having an SSL cert that says one address and giving users access to the other is going to produce an error for them, which of course can drastically affect Trust and thus Sales.

There are several ways one can cure this issue. If you're on a Unix/Linux system with Apache the easiest is usually a small .htaccess redirect from the unwanted url to the right one.
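
As a rough sketch, assuming Apache with mod_rewrite available and that you want everything consolidated on the www hostname (the domain below is just a placeholder), the .htaccess rules look something like this:

  RewriteEngine On
  # Permanently (301) redirect any request that arrives without the www to the www version
  RewriteCond %{HTTP_HOST} ^mywebsite\.com$ [NC]
  RewriteRule ^(.*)$ http://www.mywebsite.com/$1 [R=301,L]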

3. A complete non-issue if you don't have pages you want to exclude or don't have other reasons to need a robots.txt file. robots.txt is an exclusionary process. Thus if there is nothing to exclude there doesn't need to be a robots.txt. No robots.txt simply tells the bots to spider everything they can find.
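
If you ever do need one, robots.txt is just a plain text file in the site root. A hypothetical example that lets every bot in but keeps them out of an /admin/ directory:

  User-agent: *
  Disallow: /admin/

Leaving the Disallow line empty (Disallow:) means nothing is excluded at all.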

#4 dominicsp


    Dominic P

  • Active Members
  • 137 posts
  • Location:India

Posted 16 February 2009 - 02:05 PM

QUOTE(Randy @ Feb 16 2009, 10:05 AM)
2. This can be an important one sometimes. By having your pages available at both the www and non-www address you can effectively end up splitting your PageRank/Link Popularity. It is better to concentrate this into a single address. Additionally, if you have an SSL/Secure side of your site you'll want to be very careful with canonical issues. Your SSL certificate will be valid for only one url address. Either the www address or without the www. Having an SSL cert that says one address and giving users access to the other is going to produce an error for them, which of course can drastically affect Trust and thus Sales.

There are several ways one can cure this issue. If you're on a Unix/Linux system with Apache the easiest is usually a small .htaccess redirect from the unwanted url to the right one.


How do I figure out if I have this issue? And how do I resolve it on an IIS server?

#5 Randy


    Convert Me!

  • Moderator
  • 17,540 posts

Posted 16 February 2009 - 04:21 PM

Finding out if the potential is there is easy. Fire up your browser and see if you can pull up the same page using both the www and non-www version of the address.

Finding out if it's a real issue is a bit more difficult. About the best you can do is conduct a site: yourdomain.com type of search on a couple of the search engines. Note there's no www in the query. It should pull up both address versions of your pages if the search engines haven't already merged them. So look in the results to see if you spot both the www and non-www addresses showing up.

As far as fixing it...

With Google you can simply log into their Webmaster Tools thingee, verify your site and set which version you want to be the hostname. If you need to fix it at the server level for all of the search engines and all of your visitors you'll need some Administrator privileges in IIS. Ian has an IIS 301 tutorial on his site that covers the most common situations.
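
For example, on IIS 7 with Microsoft's URL Rewrite module installed, a web.config rule along these lines will 301 the non-www hostname over to the www one (the domain is a placeholder; on IIS 6 it's usually done through the site properties or a third-party rewrite filter instead):

  <configuration>
    <system.webServer>
      <rewrite>
        <rules>
          <!-- Permanently redirect requests for mywebsite.com to www.mywebsite.com -->
          <rule name="Redirect non-www to www" stopProcessing="true">
            <match url="(.*)" />
            <conditions>
              <add input="{HTTP_HOST}" pattern="^mywebsite\.com$" />
            </conditions>
            <action type="Redirect" url="http://www.mywebsite.com/{R:1}" redirectType="Permanent" />
          </rule>
        </rules>
      </rewrite>
    </system.webServer>
  </configuration>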

#6 Jill


    Recovering SEO

  • Admin
  • 32,967 posts

Posted 16 February 2009 - 05:04 PM

Actually, it's pretty simple to see if it's an issue with Google. View the cache of both the www and the non-www versions. If they both say "This is Google's cache of www.example.com" (rather than "This is Google's cache of example.com") then Google sees them as the same.

Almost always they see them as the same these days when I check. It doesn't appear to be the issue it once was.

#7 dominicsp


    Dominic P

  • Active Members
  • 137 posts
  • Location:India

Posted 16 February 2009 - 09:57 PM

Thanks Jill and Randy,

Google shows www.example.com in both searches, so I guess I'm OK for now. I have also used Google Webmaster Tools and set the preferred domain to www.example.com.

Thanks
Dominic



