You Can Now Officially Tell Google When You Change Domains!
Posted 10 June 2009 - 04:23 PM
This should make things go a lot more smoothly and quickly when you have to change your website's domain to something else.
You have to have both domains registered with Google's Webmaster Tools, and that's where you'll find the change of address feature.
Posted 11 June 2009 - 04:09 AM
Of course I'm very happy they finally did.
Thanks for the update Jill!
Posted 11 June 2009 - 12:38 PM
I hope Microsoft, Ask, and Yahoo! implement something similar.
Posted 12 June 2009 - 04:53 AM
Well, to have them registered and verified in GWMT you still have to OWN the domain and have either the HTML file or a page with the META code hosted on the domain... and if you still own it and can verify it, then surely you can just do a 301 redirect.
So what does this facility do that a 301 redirect doesn't?
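For anyone wondering what that 301 looks like in practice, it's usually a one-liner. Here's a minimal sketch assuming Apache with mod_alias enabled, with example.com standing in for the new domain:

```apache
# .htaccess (or vhost config) at the OLD domain's web root.
# Redirect with status 301 sends every request to the same
# path on the new domain, preserving the rest of the URL.
Redirect 301 / http://www.example.com/
```

Other servers have their own equivalents, but the effect is the same: a permanent redirect the engines can follow.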
Posted 12 June 2009 - 09:14 AM
There are too many sites out there these days that get hacked. Seriously, there are. Little old me has been called in to help clean up such messes no less than 10 times in the last 2 months, and I don't advertise or perform this service for hire. It's just people finding me by word of mouth and me being nice enough to help out friends and friends of friends.
My worry is what happens if the hackers change their plan of attack, and instead of actively hacking a site they start verifying these sites in GWMT and use this new method to point the authority of these domains to another domain that they either own or are being paid to help improve.
If they set up a 301 redirect it would be immediately evident to even a novice. Even more evident than their stealthy JS payloads. But the GWMT "solution" would be completely transparent to most. They wouldn't even know to look for it. And I don't see any mention of any sort of appeals process to get one's own domain back. Or any mention of verification requiring something more than before, say an email confirmation process that requires the email address actually be tied to the domain.
I don't see this new process as being nearly as safe as they make it sound. In fact it opens a new hole for people to worry about that wasn't there before. And something that is going to hurt you for at least six months instead of just a few days if you have the bad luck to have it happen to you. And the only way to combat it is something you need to do before any trouble finds you, meaning you really need to register and verify all of your domains via GWMT before anything bad happens.
Is this maybe the goal? To get more people to sign their sites up for GWMT?
Posted 12 June 2009 - 09:19 AM
Posted 12 June 2009 - 09:24 AM
What would help is some way to challenge the validity of the GWMT verification that would get rid of the 180-day minimum cap. That would still be worse than it is now, and since 301s do the same thing, it makes all of this sort of questionable to begin with. But at least if there were an appeals process one could complete in a timely fashion (days, not weeks or months) it wouldn't be quite as bad as things now stand.
Frankly though, I doubt we'll hear much hubbub about this possibility. Most are simply going to think they've been penalized by Google, or not have enough knowledge to know to look for that GWMT verification file or meta info.
On the flip side, I guess it does open up a whole new business service one could charge for if they knew what to look for and how to fix it. Assuming there is some way to fix it. Which there isn't right now.
Posted 12 June 2009 - 09:27 AM
This could turn into a serious mess, and as you say, people will be scratching their heads wondering how the hell someone else's site is appearing when they click on their own SERPs listing.
Posted 12 June 2009 - 03:12 PM
In my (perhaps rather outlandish, but quite possible) example the original site would no longer rank for its phrases. However, since all of its authority and links were being stealthily transferred to a different domain, unknown to them, it's entirely possible the other domain would start ranking for at least some of the original site's phrases.
To the original site it would simply look like they'd been penalized. First they would lose their rankings. Then their site would disappear from the index altogether. Bam! It's gotta be a penalty and nobody without FTP access would be able to diagnose the real problem if the hacker used one of those specially named uploaded files for verification.
FTR the only reason I thought of it is my recent past experience with hacked domains. That and the fact that I spend most of my day, every day, trying to think like the other guy. In this case I'm just thinking of how I'd make hay with this new capability if I were a hacker.
Thank your lucky stars I'm not.
Posted 15 June 2009 - 04:03 AM
Thank your lucky stars I'm not.
Why do they do what they do? If you're that clever, go and do something useful and legal. Hey, make yourself a millionaire, instead of writing a virus that could have you looking at 20+ years in jail, not to mention the hassle and upset you cause with your stupid little viruses.
If they want to impress people, try writing something that will revolutionise the world, that everyone will want to buy and that helps them become the next Bill Gates. That'll impress me!
Yup, I'm sure many people are glad you've not succumbed to the 'Dark Side'.
Posted 15 June 2009 - 04:34 AM
Excluding the Webmaster Tools thing...
Is there a quick check one could do on a site to see if it's been hacked?
If so, where do you look for dodgy code: just the index page, or are all pages prone to attack?
I'm sure it would be helpful to many forum members.
Posted 15 June 2009 - 08:16 AM
If you already use this method for validating your domain via GWMT, look for a file that doesn't match your special Google number.
The other way is to check every single page that makes up your website and see if a file exists that you didn't create, or if any of your pages have the Google META tag with a special code in it that isn't yours (if you use that method).
For large sites this could be a nightmare. I guess if you know server-side code, you could write a script to read the source of every file in your website and look for dodgy google.html files or META tags.
Edited by 1dmf, 15 June 2009 - 11:13 AM.
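Along those lines, here's a minimal sketch of such a script in Python. The googleXXXX.html file-name pattern and the google-site-verification META name follow Google's verification scheme, but the scan_site function name and the known_codes list are my own invention for illustration:

```python
import os
import re

# META tag Google uses for verification; the file-name pattern
# covers the hex-named googleXXXX.html verification files.
META_RE = re.compile(r'name=["\']google-site-verification["\']', re.IGNORECASE)
FILE_RE = re.compile(r'^google[0-9a-f]+\.html$', re.IGNORECASE)

def scan_site(root, known_codes):
    """Walk the web root and report verification files or META tags
    that don't match the codes you know you created yourself."""
    suspects = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if FILE_RE.match(name) and name not in known_codes:
                # A verification file you didn't upload.
                suspects.append(path)
                continue
            if name.lower().endswith(('.html', '.htm', '.php')):
                try:
                    text = open(path, encoding='utf-8', errors='ignore').read()
                except OSError:
                    continue
                if META_RE.search(text) and not any(c in text for c in known_codes):
                    # A verification META tag with a code that isn't yours.
                    suspects.append(path)
    return suspects
```

Run it against a local copy of the site; anything it flags is worth a hard look.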
Posted 15 June 2009 - 11:04 AM
There are other ways to do it, but it's not something most people are going to have the ability to do. For one site that stays pretty static (or in this case, where the pages that build the static-looking pages themselves stay static) I set up a little cron job that shoots me off an email if/when the modified date on certain files changes.
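The heart of that kind of cron job is trivial to script. A sketch of the check itself (changed_since is a made-up name; the real job would pipe a non-empty result to mail):

```python
import os

def changed_since(paths, last_check):
    """Return the files whose modification time is newer than
    last_check (a Unix timestamp). A cron job would call this
    and email you the list if it comes back non-empty."""
    return [p for p in paths if os.path.getmtime(p) > last_check]
```

Store the timestamp of the last run somewhere, compare on each run, and you get your alert email only when something actually changed.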
Or you could have a packet monitor set up on your PC, such as Wireshark, to see where your surfing is actually attempting to retrieve stuff from. Or if your anti-virus is really, really good and gives you alerts or saves stuff to a log file you can review, you'd see it there too. Assuming, of course, it recognized the virus as a virus.
Another way to do it, if you have them available to you, would be to review the FTP Logs once per week or so to see if any IP numbers you don't recognize have FTP'd into the site.
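If the logs are plain text, even a crude grep-style pass works. A sketch in the same vein (unknown_ips and the trusted set are illustrative, and real FTP log formats vary from server to server):

```python
import re

# Dotted-quad matcher; deliberately loose, since log layouts differ.
IP_RE = re.compile(r'\b\d{1,3}(?:\.\d{1,3}){3}\b')

def unknown_ips(log_lines, trusted):
    """Collect every IP address that appears in the log lines
    and isn't in the trusted set."""
    seen = set()
    for line in log_lines:
        for ip in IP_RE.findall(line):
            if ip not in trusted:
                seen.add(ip)
    return sorted(seen)
```

Feed it a week of log lines and your own IPs as the trusted set, and anything it returns is an address that FTP'd in that you didn't.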
In short, there's no single, easy way to find these things. But with the way it's being used now to deliver a virus payload, using Chrome will give you several days' advance notice that something's up before the site's rankings actually start to get hurt. Chrome seems to get those updates immediately, whereas they don't show up in the SERPs for several days.