
Access Denied - 2508 Errors - 403 Response Code In Webmaster Tools


8 replies to this topic

#1 sourabhrana

    HR 1

  • Members
  • 4 posts
  • Location:new delhi, india

Posted 14 May 2013 - 09:52 AM

 
On 9th May we received two messages in Google Webmaster Tools.

The first one is:

Increase in authorization permission errors
May 9, 2013

Google detected a significant increase in the number of URLs we were blocked from crawling due to authorization permission errors.

 

The second one is:

 

Big traffic change for top URL

May 9, 2013

Search results clicks for ........ have decreased significantly.

The number of clicks that your site receives from Google can change from day to day for a variety of factors, including automatic algorithm updates. However, if you have recently made significant changes to the content or configuration of your site, this change may be an indication that there are problems. 

 

 

After this, all my top URL web pages were deindexed in Google, and my website is coming up without the www version. For example, if I search for www.domainname.co.uk, the first result appears without www, as domainname.co.uk.

 

Not a single keyword is ranking in Google UK for my top pages. I have attached a snapshot of the GWT crawl error report to this post.

 

I built links only through white-hat SEO methods.

 

The only change I made before this update was creating a customized 404 error page and redirecting all the broken pages to it. Is this the reason for the problem? Previously, all the broken links on my site were 301 redirected to the home page; I have now removed that redirect to the home page.
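
A rough way to check what a removed page actually returns (a minimal sketch using Python's standard library; the host and path below are placeholders, not my real URLs): if broken URLs answer with a 301/302 to the error page or the home page instead of a real 404 status, Google tends to treat them as soft errors rather than missing pages.

import http.client

# Placeholder host and path -- substitute a URL that should no longer exist.
conn = http.client.HTTPConnection("www.mysitename.co.uk")
conn.request("GET", "/some-removed-page.aspx")
resp = conn.getresponse()

# A deleted page should answer with 404 itself. A 301/302 pointing at the
# custom error page (or the home page) means Google never sees a real 404.
print(resp.status, resp.reason)
print("Location:", resp.getheader("Location"))
conn.close()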

 

 

My eight months of hard work is going to waste after this update. Kindly tell me how to solve this. Is the problem the crawl errors, or some other factor? Will Google reindex my www domain and all the top flight pages?

 

 



#2 torka

    Vintage Babe

  • Moderator
  • 4,618 posts
  • Location:Triangle area, NC, USA, Earth (usually)

Posted 14 May 2013 - 10:45 AM

You need to figure out why access has been denied to Googlebot for all those pages. Did someone make a badly formed update to your robots.txt file, for instance, that is preventing Googlebot from accessing your pages? Has anyone recently added some sort of script that requires a response before a visitor can access the page (maybe a pop-up window or something like that) and forgotten to exclude search engine spiders? I suspect that if you can find and correct the "authorization permission" issue, the rest may fall into place.

 

--Torka :propeller:


Edited by torka, 14 May 2013 - 10:46 AM.
edited for clarity


#3 Jill

    Recovering SEO

  • Admin
  • 32,961 posts

Posted 14 May 2013 - 11:16 AM

The only change I made before this update was creating a customized 404 error page and redirecting all the broken pages to it.

Obviously you must have done something wrong with that change. Carefully review what you did and you'll probably be able to figure out the mistake.



#4 sourabhrana

    HR 1

  • Members
  • 4 posts
  • Location:new delhi, india

Posted 15 May 2013 - 04:48 AM

I agree, and I am reviewing it, but I am not able to figure out why Google sent me the first message, "Increase in authorization permission errors".

 

We blocked only Result.aspx and Contect.aspx, which are dynamically generated pages on my site. They are generated dynamically from our flight pages, which is why we blocked them. Could this be the reason for the loss of traffic and rankings in Google?

 

 

My robotx.txt file

User-agent: *
Disallow: /Result.aspx
Disallow: /Contect.aspx
Allow: /

Sitemap: http://www.mysitename.co.uk/sitemap.xml
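
For what it's worth, here is a quick sanity check of those rules using Python's standard library (a sketch only; the URLs are placeholders standing in for the real site). As far as I understand, a Disallow line shows up in GWT as "blocked by robots.txt" rather than as a 403 Access Denied error, so these two rules by themselves shouldn't explain the errors.

import urllib.robotparser

# The rules above, pasted in verbatim.
rules = """User-agent: *
Disallow: /Result.aspx
Disallow: /Contect.aspx
Allow: /
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Only the two .aspx URLs should come back False; everything else stays crawlable.
print(rp.can_fetch("Googlebot", "http://www.mysitename.co.uk/"))             # True
print(rp.can_fetch("Googlebot", "http://www.mysitename.co.uk/Result.aspx"))  # False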


#5 sourabhrana

    HR 1

  • Members
  • 4 posts
  • Location:new delhi, india

Posted 16 May 2013 - 03:19 AM

Hello Fellow members,

 

Since 9th May I have been getting these error messages, and the crawl errors are increasing daily. Google is not able to crawl my URLs; it is getting a 403 response code and reporting "Access Denied" errors in GWT. All of my indexed pages have been de-indexed.

 

Why am I receiving these errors? My website is working fine, so why is Google not able to crawl my pages? Please tell me what the issue is; I need to resolve it ASAP.

 

 

A crawl errors snapshot is attached for reference.



#6 Alan Perkins

    Token male admin

  • Admin
  • 1,642 posts
  • Location:UK

Posted 16 May 2013 - 07:34 AM

Take a look at an example of a URL that WMT is saying is giving a 403.

 

Is it a URL that you would actually want in the Google index?

If not, use robots.txt to prevent Googlebot accessing the URL and receiving a 403.

 

If it is a URL that you would want in the index, check that you can access the URL yourself, when not logged in to your website.

 

If you can access the URL yourself, try accessing it as Googlebot. In Webmaster Tools, under the "Health" menu, select "Fetch as Google". Put the URL in, press "Fetch", wait, and you'll see what Google sees when it tries to access that URL.
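
If you want a rough local approximation of that check, here is a sketch (placeholder URL; real Googlebot requests also come from Google's own IP addresses, so a server that verifies the IP may still behave differently). It requests the page while sending Googlebot's user-agent string and prints the status code:

import urllib.request
import urllib.error

url = "http://www.mysitename.co.uk/some-page.aspx"   # placeholder URL
googlebot_ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

req = urllib.request.Request(url, headers={"User-Agent": googlebot_ua})
try:
    resp = urllib.request.urlopen(req)
    print(resp.status, resp.reason)    # 200: the bot user-agent is served normally
except urllib.error.HTTPError as err:
    print(err.code, err.reason)        # 403 here suggests user-agent based blocking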


  • Jill likes this

#7 torka

    Vintage Babe

  • Moderator
  • 4,618 posts
  • Location:Triangle area, NC, USA, Earth (usually)

Posted 16 May 2013 - 09:24 AM

We had two threads going related to this one issue: one started May 14 and another started May 16. For the purposes of clarity, I've merged the two into a single thread.

 

--Torka :propeller:


  • sourabhrana likes this

#8 sourabhrana

    HR 1

  • Members
  • 4 posts
  • Location:new delhi, india

Posted 16 May 2013 - 10:16 AM

Thank you very much for this reply...

"Take a look at an example of a URL that WMT is saying is giving a 403. Is it a URL that you would actually want in the Google index?"

Yes, it is.

"If not, use robots.txt to prevent Googlebot accessing the URL and receiving a 403."

I didn't block any URL from Google.

"If it is a URL that you would want in the index, check that you can access the URL yourself, when not logged in to your website."

Yes, I can access the URL at any time.

"If you can access the URL yourself, try accessing it as Googlebot. In Webmaster Tools, under the 'Health' menu, select 'Fetch as Google'. Put the URL in, press 'Fetch', wait, and you'll see what Google sees when it tries to access that URL."

I did this, but Google showed an error and would not fetch the page.

The problem is that Google is refusing to crawl my site. It says: "Googlebot couldn't crawl your URL because your server either requires login to access the page, or is blocking Googlebot from accessing your site."
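
One way to narrow down which of those two causes applies (a rough sketch with a placeholder URL, not my actual site code) is to request the same page once with a normal browser user-agent and once with Googlebot's, and compare the responses:

import urllib.request
import urllib.error

URL = "http://www.mysitename.co.uk/"   # placeholder URL
AGENTS = {
    "browser":   "Mozilla/5.0 (Windows NT 6.1; rv:21.0) Gecko/20100101 Firefox/21.0",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for name, ua in AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        resp = urllib.request.urlopen(req)
        print(name, "->", resp.status)
    except urllib.error.HTTPError as err:
        # A 403 only for the Googlebot string points at a firewall or security
        # rule blocking bots; a 401 or a WWW-Authenticate header points at a login wall.
        print(name, "->", err.code, "| WWW-Authenticate:", err.headers.get("WWW-Authenticate"))

If only the Googlebot request comes back as a 403, the block is user-agent based (for example a firewall or security module), and the fix is in the server configuration rather than in robots.txt.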

 

I am not able to find a solution and am very stressed about this because I worked very hard on my site. Before this error all my keywords were ranking, and now the pages are deindexed from Google. :( :( Feeling like crying. :(

 

 

 



#9 Alan Perkins

    Token male admin

  • Admin
  • 1,642 posts
  • Location:UK

Posted 16 May 2013 - 10:31 AM

 

 

My robotx.txt file

User-agent: *
Disallow: /Result.aspx
Disallow: /Contect.aspx
Allow: /

Sitemap: http://www.mysitename.co.uk/sitemap.xml

 

 

First of all, the file is called robots.txt, not robotx.txt, so make sure you've named it right.

 

Next, you probably meant "Contact.aspx", not "Contect.aspx", so make sure that's correct.

 

Finally, delete the allow line. You don't need it.

 

So your file should be called robots.txt, it should be placed in the root folder, and it should contain the following:

User-agent: *
Disallow: /Result.aspx
Disallow: /Contact.aspx

Sitemap: http://www.mysitename.co.uk/sitemap.xml




