On April 19, 2014, GWMT started reporting thousands of 404 errors for a client site, and the count now stands at roughly 16,000.
Many of these errors are legitimate pages/files from a very old version of the site, dating back 5-8 years. Obviously the pages and files were not 301'd at the time. For at least some of the links we tested, the old URLs still appear on other sites, which is clearly how Google found them. (But why report them only now?)
Redirecting such a large number of old links one by one isn't practical, and Google itself says, "Generally, 404s don't harm your site's performance in search...."
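At this scale, one option is to export the crawl errors from GWMT as a CSV and group the 404 URLs by path prefix, so a handful of pattern-based 301 rules could cover whole sections of the old site instead of one redirect per URL. A minimal Python sketch; the filename and the "URL" column header are assumptions about the export format:

```python
import csv
from collections import Counter
from urllib.parse import urlparse

# Hypothetical filename; GWMT's "Download this table" export is a CSV,
# but the exact column header may differ ("URL" is an assumption here).
with open("crawl_errors.csv", newline="") as f:
    urls = [row["URL"] for row in csv.DictReader(f)]

# Count 404s by their first path segment to spot old site sections
# that a single pattern-based 301 rule could cover.
prefixes = Counter(
    "/" + urlparse(u).path.lstrip("/").split("/", 1)[0]
    for u in urls
)

for prefix, count in prefixes.most_common(20):
    print(f"{count:6d}  {prefix}")
```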
There is also the "Priority" column, about which the GWMT blog says:
One thing we’re really excited about in this new version of the Crawl errors feature is that you can really focus on fixing what’s most important first. We’ve ranked the errors so that those at the top of the priority list will be ones where there’s something you can do, whether that’s fixing broken links on your own site, fixing bugs in your server software, updating your Sitemaps to prune dead URLs, or adding a 301 redirect to get users to the “real” page. We determine this based on a multitude of factors, including whether or not you included the URL in a Sitemap, how many places it’s linked from (and if any of those are also on your site), and whether the URL has gotten any traffic recently from search.
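Google doesn't publish the exact ranking formula, but the factors named in that quote (Sitemap membership, link counts, recent search traffic) could be approximated independently to build your own triage list. A hypothetical scoring sketch with made-up weights; this is not Google's actual algorithm:

```python
from dataclasses import dataclass

@dataclass
class ErrorUrl:
    url: str
    in_sitemap: bool         # did we list it in a Sitemap?
    internal_links: int      # links from our own pages (fixable by us)
    external_links: int      # links from other sites (e.g. a backlink report)
    recent_search_hits: int  # recent visits from search results

def priority(e: ErrorUrl) -> float:
    # Hypothetical weights mirroring the factors Google describes;
    # Google's real formula is not public.
    return (
        5.0 * e.in_sitemap
        + 2.0 * e.internal_links
        + 1.0 * e.external_links
        + 3.0 * e.recent_search_hits
    )

errors = [
    ErrorUrl("/old/widgets.html", True, 3, 12, 40),
    ErrorUrl("/old/press-1999.html", False, 0, 1, 0),
]
for e in sorted(errors, key=priority, reverse=True):
    print(f"{priority(e):7.1f}  {e.url}")
```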