Dynamic Page In Robots.txt
Posted 01 July 2004 - 10:45 AM
We have a specific dynamic page (well, just the "?id=23" variant of the page) which is indexed in Google but needs to be excluded. The other id=XX pages are all of my client's press releases; one press release needs to be pulled, and it's already indexed.
I am working with the client to clean up the error the page is returning. It currently returns a 500 server error rather than a 404 for that particular item. In the meantime, I would like to have them add that particular release to their robots.txt file. Is there a way to list a dynamic page so that *ALL* of their press releases don't get de-indexed, just this one that is no longer there? I don't see anything about robots parsing dynamic page links with that level of specificity.
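For what it's worth, a robots.txt Disallow rule matches by URL prefix, so a single query string can be targeted without touching the other releases. A sketch, assuming the releases live at a path like /press.php (the path and parameter name here are hypothetical placeholders for the client's actual URL):

```
User-agent: Googlebot
Disallow: /press.php?id=23
```

One caveat: because matching is prefix-based, this rule would also block any longer id that starts with 23 (e.g. id=230), if such releases exist.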
Ideas? It is urgent that this page be removed from Google. The snippet that shows up in the description is a problem. Clicking on the cache link isn't showing anything (which is good), but the page and description have got to go.
Of course, this item is listed as #2 under a particular search phrase! (And we still don't want it!)
Posted 01 July 2004 - 12:05 PM
Thanks for the moral support (at least, I can feel that you're out there!)
Posted 01 July 2004 - 04:46 PM
Generally, Google will keep a 404 in the index for a while, presumably to give you a chance to fix it. On the other hand, if you give them different content for the page, the changes will be reflected in the next update (after which, you can then remove the page and wait for it to be dropped).
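The 500-vs-404 distinction above is the crux: a 500 reads as a temporary server failure, which gives the crawler a reason to keep retrying, while a 404 says the page is gone. A minimal sketch of the server-side fix, assuming a hypothetical handler that maps a release id to the status line the page should send (the function name and ids are illustrative, not the client's actual code):

```python
# IDs of press releases that have been pulled (hypothetical data)
REMOVED_IDS = {"23"}

def press_release_status(release_id):
    """Return the HTTP status line the press-release page should send."""
    if release_id in REMOVED_IDS:
        # Tell crawlers the page is gone, rather than erroring with a 500
        return "404 Not Found"
    return "200 OK"
```

A check like this at the top of the page script is usually enough to stop the lookup failure from bubbling up as a 500.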
Google also has a "Remove URL" form, but I've never used it, so don't know how long it takes.