
High Rankings Question Of The Week


13 replies to this topic

#1 Jill

Jill

    Recovering SEO

  • Admin
  • 32,940 posts

Posted 28 August 2013 - 10:43 AM

What technical issues do you see on websites that are potentially hurting their SEO performance?

View the full article

#2 RyanWatson

RyanWatson

    HR 1

  • Members
  • Pip
  • 3 posts

Posted 07 September 2013 - 07:13 AM

The article is worth reading. I am glad to see all the technical reasons hurting SEO performance laid out. A couple of issues I feel should also be on the list:

 

1. The www vs. non-www URL difference: there should be a 301 redirect from one version of the URL to the other.

2. A sitemap and a robots.txt file should be on the server so that search engines can read the site better.

 

There are many great points already; these are the only two I can add from my side.
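For reference, the 301 redirect in point 1 is usually a couple of lines of server configuration. A minimal sketch for Apache, assuming mod_rewrite is enabled and using example.com as a placeholder domain:

```
# Hypothetical .htaccess sketch: permanently redirect the www
# hostname to the bare (non-www) hostname. "example.com" is a
# placeholder; swap in the real domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```

The same effect can be achieved the other way around (non-www to www); what matters is picking one canonical form and redirecting the other to it.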


  • SidV likes this

#3 Jill

Jill

    Recovering SEO

  • Admin
  • 32,940 posts

Posted 07 September 2013 - 12:56 PM

I beg to differ.

 

These days Google almost always knows the difference between your www version and your non-www version. And a sitemap doesn't provide any advantage that I know of. I do agree that a robots.txt file is worth having, but mostly to prevent 404 errors (if you don't actually need to exclude any files).
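A robots.txt file of the kind being described here can be trivially small. A sketch of a permissive one, which blocks nothing and exists mainly so that crawlers requesting /robots.txt get a valid response rather than a 404:

```
# Minimal robots.txt sketch. It must live at the site root
# (e.g. http://example.com/robots.txt). An empty Disallow
# means all crawlers may fetch everything.
User-agent: *
Disallow:
```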



#4 RyanWatson

RyanWatson

    HR 1

  • Members
  • Pip
  • 3 posts

Posted 09 September 2013 - 03:16 AM

With all due respect,

 

I do not want to take a single risk with SEO, as things are getting difficult with the increasing penalties from Google Panda and Penguin. I want to do everything possible that might help my website earn a good reputation in the eyes of Google. Check out these links, which still show the importance of an XML sitemap and of www vs. non-www URLs, and which are from Google support:

 

https://support.goog...wer/44231?hl=en

https://support.goog...er/156184?hl=en



#5 Jill

Jill

    Recovering SEO

  • Admin
  • 32,940 posts

Posted 09 September 2013 - 07:26 AM

There are just so many other things that actually have an effect. 



#6 chrishirst

chrishirst

    A not so moderate moderator.

  • Moderator
  • 6,872 posts
  • Location:Blackpool UK

Posted 09 September 2013 - 11:45 AM

I do not want to take a single risk with SEO, as things are getting difficult with the increasing penalties from Google Panda and Penguin.

 

Then don't do anything that is designed to 'trick' Google's algorithms. But not having an XML sitemap and/or not setting a 'preferred' URL form isn't going to make ANY difference at all to "reputation".



#7 RyanWatson

RyanWatson

    HR 1

  • Members
  • Pip
  • 3 posts

Posted 16 September 2013 - 01:17 AM

 

Then don't do anything that is designed to 'trick' Google's algorithms.

 

I totally agree with you on this, but I include a sitemap for better readability by the search engine crawler. An XML sitemap covers pages the crawler might otherwise miss, and it also shows the crawler when new pages have been created.
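For anyone unfamiliar with the format under discussion, an XML sitemap is just a list of URLs with optional metadata. A minimal sketch, with example.com as a placeholder domain and made-up dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap sketch; URLs and dates are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/</loc>
    <lastmod>2013-09-16</lastmod>
  </url>
  <url>
    <loc>http://example.com/new-page.html</loc>
    <lastmod>2013-09-15</lastmod>
  </url>
</urlset>
```

The optional per-URL fields (such as priority) are exactly the site-owner-supplied page-value hints that come up later in this thread.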



#8 Jill

Jill

    Recovering SEO

  • Admin
  • 32,940 posts

Posted 16 September 2013 - 07:47 AM

Except that the crawler doesn't need to be shown that, as it can easily find the new pages by doing its thing--crawling. 



#9 torka

torka

    Vintage Babe

  • Moderator
  • 4,604 posts
  • Location:Triangle area, NC, USA, Earth (usually)

Posted 16 September 2013 - 09:09 AM

FWIW, I've never created an XML sitemap for any website I've worked on, and all my new pages get crawled just fine. It's wonderful what one can accomplish with just a dab of well-thought-out internal linking. :)

 

--Torka :propeller:


  • SidV likes this

#10 OldWelshGuy

OldWelshGuy

    Work is Fun

  • Moderator
  • 4,713 posts
  • Location:Neath, South Wales, UK

Posted 16 September 2013 - 06:54 PM

I have seen loads of sites crash and burn after adding an XML sitemap. Maybe it is because, pre-sitemap, Google had its own idea of page value; then a (badly valued) sitemap was uploaded telling it that pretty much all pages are equal, and BOOM! A spanner is thrown into the works and rankings tumble as Google takes the site owner's word for page values.
  • SidV likes this

#11 runningfast

runningfast

    HR 1

  • Members
  • Pip
  • 3 posts

Posted 27 September 2013 - 06:55 PM

I have seen loads of sites crash and burn after adding an XML sitemap. Maybe it is because, pre-sitemap, Google had its own idea of page value; then a (badly valued) sitemap was uploaded telling it that pretty much all pages are equal, and BOOM! A spanner is thrown into the works and rankings tumble as Google takes the site owner's word for page values.

 

So are you saying it's a good idea for me to leave it off? As in not create one? 

 

Many of the high-ranking sites I compete against don't have any sitemap that is visible to the public.



#12 chrishirst

chrishirst

    A not so moderate moderator.

  • Moderator
  • 6,872 posts
  • Location:Blackpool UK

Posted 28 September 2013 - 10:36 AM

Many of the high-ranking sites I compete against don't have any sitemap that is visible to the public.

 

And what does that tell you?



#13 runningfast

runningfast

    HR 1

  • Members
  • Pip
  • 3 posts

Posted 28 September 2013 - 01:03 PM

I guess I answered my own question :) 



#14 chrishirst

chrishirst

    A not so moderate moderator.

  • Moderator
  • 6,872 posts
  • Location:Blackpool UK

Posted 29 September 2013 - 06:55 AM

:goodjob:  :)





