
Should I Block This Page In Robots.txt


  • This topic is locked
5 replies to this topic

#1 Grumpy


    HR 1

  • Members
  • 5 posts
  • Location:Midwest

Posted 02 July 2015 - 12:46 AM

I used the Facebook API to pull the feed and update a website. The client can log in to Facebook and post all they want, and it shows up under a "Current News" page on the real website. Should I tell Google not to crawl this page in robots.txt because of duplicate content issues?
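For reference, blocking a single page via robots.txt would look something like this (the /current-news/ path is hypothetical; substitute whatever URL the page actually lives at):

```
User-agent: *
Disallow: /current-news/
```

Note that robots.txt only prevents crawling, not indexing: a blocked URL can still appear in search results if other pages link to it.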



#2 chrishirst


    A not so moderate moderator.

  • Moderator
  • 7,718 posts
  • Location:Blackpool UK

Posted 02 July 2015 - 06:35 AM

Depends on which URL you would prefer searchers to find.



#3 Grumpy


    HR 1

  • Members
  • 5 posts
  • Location:Midwest

Posted 02 July 2015 - 07:43 AM

Depends on which URL you would prefer searchers to find.

Well, I can't put a robots.txt file on Facebook. Does Facebook have the option to lock out search engines? Again, your answer doesn't help me much. Which version is Google likely to consider duplicate content? If Google decides the content on my website is a duplicate of Facebook, will it negatively affect any other page on that same site?

 

I'm looking for general advice on this issue. 



#4 Jill


    Recovering SEO

  • Admin
  • 33,244 posts

Posted 02 July 2015 - 07:50 AM

 If Google decides the content on my website is a duplicate of Facebook, will it negatively affect any other page on that same site?

It shouldn't be a problem.



#5 Michael Martinez


    HR 10

  • Active Members
  • 5,325 posts
  • Location:Georgia

Posted 02 July 2015 - 10:58 AM

I would not block the page.
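If the goal is purely to keep the page out of Google's index, a noindex robots meta tag is generally a better fit than a robots.txt block, since Google has to be able to crawl the page to see the tag. A sketch (placed in the page's head):

```html
<!-- Ask search engines not to index this page, while still allowing them to crawl it -->
<meta name="robots" content="noindex">
```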



#6 chrishirst


    A not so moderate moderator.

  • Moderator
  • 7,718 posts
  • Location:Blackpool UK

Posted 02 July 2015 - 01:27 PM


 

 

 Which version is Google likely to consider duplicate content?

No idea.

 If Google decides the content on my website is a duplicate of Facebook, will it negatively affect any other page on that same site?

Probably not.

 Well, I can't put a robots.txt file on Facebook. Does Facebook have the option to lock out search engines? Again, your answer doesn't help me much.

plus

 I'm looking for general advice on this issue.

That means you really want specific advice, then.

And you would understand my answer if you understood that "duplicate" content at that kind of level, i.e. practically none at all, probably won't even get filtered as 'duplicated'.

You really need to get a sense of proportion. Google deals with tens of millions of pages every single day of the week; two pages on completely different sites showing closely similar content that are actually related to each other ain't even going to raise a 'blip' on their 'radar'.

Now IF you were showing those same "Facebook" posts on 200 separate URLs on different domains, that might just be a problem. Your kind of "duplication" is probably happening quite by accident a thousand times a day with zero problems for any of the URLs involved.





