
Small Business Recently Had Ranking Ruined by URL Injection, Please Help


15 replies to this topic

#1 coresash

coresash

    HR 1

  • Members
  • Pip
  • 6 posts

Posted 17 April 2014 - 03:04 AM

Hi all,

Put simply, it's taken years of hard work to get my business to the level we're at now and I really hope one of you guys can help me out, or at least provide some solid advice. I apologize if any of my terminology is off, I'm a layman.

The root problem:

Recently, my website was subject to a URL injection attack. I was made aware of this only when I saw a "Hacking suspected" label beneath my organic search result and checked my Webmaster tools. Google had taken manual action against my website due to the attack, killing our rankings. For anyone unfamiliar with URL injection, they used a vulnerability in an old version of WordPress to add several new directories to my FTP, making spammy websites appear in impressions for users who were searching for phrases such as "How much of X drug can I take". The websites appear as if they were under my domain until clicked, at which point they redirect to the spammy drug sites.

An example of the links (screenshot)


Steps taken thus far:

Requested removal of any identified rogue URLs through Webmaster Tools, removed all directories related to the spammy links from my FTP, replaced wp-includes and wp-admin, updated WordPress, updated all plugins, and installed the Sucuri plugin, whose scan came out clean. Once I'd done all this I checked a few URLs from each of the directories (e.g. "love", "pepper", "toe") and they came up as 404s. I then requested removal of the manual action taken by Google, and within a day or two they had lifted the penalties. It was not long before my ranking appeared to return to normal.

So what's the problem?

The problem is that I'm not sure whether or not the issue is over. Firstly, I received a message shortly after the attack, which I didn't find out about for a couple of days, telling me about a large increase in 404 errors.

My ranking has since plummeted again. I assumed this was due to the crawl errors which resulted from the URL injection - the spam links now returned a 404, which Google associates with my domain even though the pages were never really mine. The only two people who answered this question on Google's help forums seemed to think that the 404 error resulted from Google "Making sure the URL still doesn't exist". I don't agree with this and I'd like to find out what you guys think about it. One thing that really worries me is the impressions over this period.

Impressions before, during and after the URL injection (screenshot)

FYI - I noticed the issue on the 2nd of April. I applied all the fixes on the 3rd of April, after which point everything seemed fine - later, however, there was another spike, which on the surface of things seems to coincide with the second drop in ranking around the last few days. What could these impressions mean? I was under the impression that if Google lifted the manual action, that the issue must have been fixed.

So, to sum up, I need help. I'm hoping you guys can be the ones to give some good advice on fixing this problem.

Thanks.



#2 chrishirst

chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,378 posts
  • Location:Blackpool UK

Posted 17 April 2014 - 04:33 AM

 

add several new directories to my FTP

??????????

FTP is a communication protocol, not something that can be "added to" or "removed from".

 

 

The only two people who answered this question on Google's help forums seemed to think that the 404 error resulted from Google "Making sure the URL still doesn't exist". I don't agree with this and I'd like to find out what you guys think about it.

And why do you disagree with this when it is 100% accurate? The HTTP 404 response only occurs when a user agent requests a URL that is "Not Found" on the server.

 

 

making spammy websites appear in impressions for users who were searching for phrases such as "How much of X drug can I take". The websites appear as if they were under my domain until clicked, at which point they redirect to the spammy drug sites.

The "impressions" are mostly faked, and the implanted URLs are there for the scammers to redirect mugs from UCE mail 'shots' to their "affiliate" URLs, or for "buy traffic" mugs who pay real money for fake traffic.

 

The WordPress attack vector they probably used is one that exploits weak permissions in the older 'default' templates, allowing the attacker to place symlinks that can then be used to gain access to control-panel functions via Perl scripts.



#3 coresash

coresash

    HR 1

  • Members
  • Pip
  • 6 posts

Posted 17 April 2014 - 04:50 AM

??????????

FTP is a communication protocol, not something that can be "added to" or "removed from".

 

Sorry but it's really, really obvious that I'm talking about where the files for my website are stored.

 

And why do you disagree with this when it is 100% accurate? The HTTP 404 response only occurs when a user agent requests a URL that is "Not Found" on the server.

 

 

I realize that. However, it's not accurate to say that Google is crawling this to make sure the URL still doesn't exist. If Google picks up a crawl error, they don't know whether or not it's "supposed" to be like that at all. They just think that I have a poorly maintained website. This could then affect rankings.

 

Would really appreciate an answer to some of the questions in my OP, particularly the second spike in impressions (not sure where you got the idea they are faked).



#4 chrishirst

chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,378 posts
  • Location:Blackpool UK

Posted 17 April 2014 - 08:32 AM

No, Google do not care how well you maintain your website, or even if you maintain it at all. All they know is that when their 'bot tried to crawl a particular URL it returned a 404 response code, and 404 responses are a fact of life.

 

NOW, when they have sent you a message that suggests your site has been hacked and contains 'malicious' URLs, they WILL place a warning message on ALL results for your website AND ARE going to send a bot on a MORE frequent basis to see IF those URLs still exist on the site or if they have been removed. When it HAS been determined that they no longer exist, their "This site may be malicious" warning on ANY result URLs that point to your website can be removed and searchers can once again click through to your URLs WITHOUT the interstitial warning 'page'.

 

 

The malicious URLs may not PHYSICALLY exist on your site, but one of the exploits was/is to add several lines of .htaccess code that redirected referral traffic from Search ONLY and YOU are none the wiser because you do not go to your site via Google results.



#5 Michael Martinez

Michael Martinez

    HR 10

  • Active Members
  • PipPipPipPipPipPipPipPipPipPip
  • 5,233 posts
  • Location:Georgia

Posted 17 April 2014 - 08:39 AM

There may be spammy links pointing to the directories that the hacker created.  That kind of activity often accompanies URL injection attacks.

 

When I have fixed hacked Websites I have also implemented 301 redirects for the directories that were created.  You can send the crawlers offsite or to your root URL or to your sitemap.  But a high number of 404s is not supposed to impact your rankings.  I implemented the redirects for two reasons: first, one attack was so pernicious that the directories kept coming back; second, I don't like seeing tons of manageable errors in my server logs.
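A sketch of that redirect idea as a mod_alias rule appended to .htaccess. The directory names are the examples from this thread, and the target (the root URL) is one of the options Michael lists; adjust both for your own site:

```shell
# Append a RedirectMatch rule that 301s requests for the hacker-created
# directories to the site root.  "love", "pepper" and "toe" are the
# directory names mentioned in this thread -- substitute your own.
cat >> .htaccess <<'EOF'
# redirect hacked directories to the root URL
RedirectMatch 301 ^/(love|pepper|toe)(/.*)?$ /
EOF
```

Some people prefer returning 410 Gone instead of 301, to tell crawlers explicitly that the URLs were removed on purpose; either keeps the crawl-error report quiet.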

 

I think you should look for suspicious links pointing to your site.  If you find them, you should disavow them, explaining that you think they may be associated with the hack.

 

As for any message Google sent you about a large number of 404s, they are merely being helpful, letting you know that they suddenly see a huge change in your site's crawlability and indexability; that is in case you broke your navigation.


  • coresash likes this

#6 coresash

coresash

    HR 1

  • Members
  • Pip
  • 6 posts

Posted 17 April 2014 - 09:03 AM

one attack was so pernicious that the directories kept coming back

 

 

Ouch. At least with me the directories have stayed deleted. I did find this strange code, which has since been removed, right at the start of my .htaccess file:


<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (google|yahoo|aol|bing|crawl|aspseek|icio|bot|spider|nutch|slurp|seznam) [OR]
RewriteCond %{HTTP_REFERER} (google|aol|yahoo|msn|search|bing)
RewriteCond %{REQUEST_URI} /$ [OR]
RewriteCond %{REQUEST_FILENAME} (html|php|htm)$ [NC]
RewriteCond %{REQUEST_FILENAME} !common.php
RewriteCond /var/sites/c/coresashwindows.co.uk/public_html/common.php -f
RewriteRule ^.*$ /common.php [L]
</IfModule>

The common.php file was not part of the install; I checked the files on the FTP against the older, clean backups.
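Since the injected block keys on search-engine user agents, a rough sweep for leftovers of this particular family of injections might look like the following. `has_cloaking` is a hypothetical helper, and the pattern only matches rewrite rules of the kind shown above:

```shell
# has_cloaking: succeed when a file contains a rewrite condition that
# keys on search-engine user agents -- the signature of the injected
# .htaccess block shown above.
has_cloaking() {
  grep -qE 'HTTP_USER_AGENT.*(google|bing|yahoo|slurp)' "$1"
}

# Sweep the document root (assumed to be the current directory) for the
# dropped file and for suspicious .htaccess rules.
find . -name common.php -print
find . -name .htaccess | while read -r f; do
  has_cloaking "$f" && echo "suspicious rewrite rules in: $f"
done
```

This is only a spot check, not a scanner; it would miss obfuscated or differently written variants, which is why comparing against clean backups (as above) is still the stronger test.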

 


 

I think you should look for suspicious links pointing to your site.  If you find them, you should disavow them, explaining that you think they may be associated with the hack.

 

Thanks for the advice, although the only associated website I've seen so far is my own - if you type site:coresashwindows.co.uk into Google you'll see what I mean. All the links in the results appear to come from my domain.

 

Do you have any idea what the second spike in impressions could be due to, as shown in the graph in the op?



#7 chrishirst

chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,378 posts
  • Location:Blackpool UK

Posted 17 April 2014 - 11:49 AM

That "strange code" is the attack code, and you can see how long it has been around:

 

http://blog.sucuri.n...cks-part-1.html


  • coresash likes this

#8 Michael Martinez

Michael Martinez

    HR 10

  • Active Members
  • PipPipPipPipPipPipPipPipPipPip
  • 5,233 posts
  • Location:Georgia

Posted 18 April 2014 - 08:33 AM

The "site:" query doesn't show you what is linking to your site but rather which pages from your site the search engine is showing in its index.

 

You'll need to look at the backlinks reported for your site in Webmaster Tools.

 

It's unusual for a hack to get into your .htaccess file.  You can add the following code to the file to prevent that from happening again:

<Files ".htaccess">
Order Allow,Deny
Deny from all
</Files>

They would have to log into your Web hosting account to alter the .htaccess after you add that.  I trust you changed the password for the Web hosting control panel though.  If you haven't yet done that, you should.



#9 chrishirst

chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,378 posts
  • Location:Blackpool UK

Posted 18 April 2014 - 10:45 AM

 

It's unusual for a hack to get into your .htaccess file

Actually ... it isn't. I have had to 'clean' several sites for people whose websites were hit with this kind of attack.

 

 

You can add the following code to the file to prevent that from happening again:

 

No it won't; that code only prevents access via an HTTP request, and these kinds of attacks do not involve HTTP or WebDAV access via a direct http/https URL. The attackers plant upload scripts that are used to overwrite the .htaccess file.

 

Setting 644 or 444 permissions on .htaccess, index.html, index.php etc. can stop them overwriting the files.
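As shell commands, that lock-down might look like this; it is a sketch run from the document root, and whether 444 or 644 fits depends on how you edit the file:

```shell
# Lock down a commonly attacked file (per the post: 644 or 444).
# 444 = read-only for everyone, so an implanted script running as the
# web-server user cannot overwrite it without first changing the mode.
touch .htaccess            # no-op if the file already exists
chmod 444 .htaccess
ls -l .htaccess            # should show -r--r--r--
```

The same `chmod 444` (or 644) applies to index.html, index.php and the other files chrishirst names; remember to relax the mode temporarily when you legitimately need to edit them.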

 

And once your site has been hit, they will keep coming back if you don't lock the files down and close whatever exploit they used. There is also a flaw in the cPanel "legacy" FileManager that can also be used to execute scripts as the user account. And it is STILL there, even in the current versions.

 

http://www.juniper.n.../vuln34142.html



#10 Michael Martinez

Michael Martinez

    HR 10

  • Active Members
  • PipPipPipPipPipPipPipPipPipPip
  • 5,233 posts
  • Location:Georgia

Posted 18 April 2014 - 10:16 PM

Actually ... it isn't. I have had to 'clean' several sites for people whose websites were hit with this kind of attack.

It appears you and I have different types of hacks in mind. The hacks I have been dealing with mostly come from remote process calls (HTTP/HTTPS) so ...
 

No it won't; that code only prevents access via an HTTP request, and these kinds of attacks do not involve HTTP or WebDAV access via a direct http/https URL. The attackers plant upload scripts that are used to overwrite the .htaccess file.

Like I said, it will prevent that kind of hack.
 

Setting 644 or 444 permissions on .htaccess, index.html, index.php etc. can stop them overwriting the files.

Except for when they have backdoors, in which case changing permissions makes no difference because they can use exec code to change the permissions (such as when they crack passwords, create their own logins, and use vulnerabilities in blogs and forums to create admin accounts for those platforms that can then change the permissions on files, allow uploads, etc.).

So let's come to some agreement here: There are MANY ways to hack sites and no single sure-fire way to prevent the hacks.

The more software you install on your site the more potential vulnerabilities you have to guard against. You need to keep good, clean backups of your uploaded files and any databases you maintain for your sites and you need to look at the installed files and the data in the databases on a periodic basis to make sure that they haven't been modified.
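One low-tech way to do that periodic check is a checksum baseline, recorded once from a known-clean state and re-verified from cron. This is a sketch on a scratch file rather than a real document root:

```shell
# Demo of checksum-based tamper detection on a scratch file:
# baseline it, verify, simulate tampering, verify again.
mkdir -p integrity_demo && cd integrity_demo
echo 'clean content' > sample.php
sha256sum sample.php > baseline.sha256      # record once, after cleanup
sha256sum -c --quiet baseline.sha256 && echo "sample.php: unchanged"
echo 'injected payload' >> sample.php       # simulate an attacker's edit
sha256sum -c --quiet baseline.sha256 || echo "sample.php: MODIFIED"
```

On a real site you would baseline every uploaded file (e.g. `find . -type f -exec sha256sum {} + > baseline.sha256`), keep the baseline somewhere the web server cannot write, and alert on any verification failure.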

Finally, most of the advice people share about passwords is useless. The degree of complexity that goes into a password does not determine how secure it is. If you're using passwords based on personal information then someone who has your personal information may be able to guess your passwords; otherwise they have to test every possible combination in what is called a "Brute Force Dictionary Attack".

The best protection against a BFDA is to use LONG passwords (so "1hsX$#3" is easier to crack than "1234567890abc").
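The arithmetic behind that claim, assuming the attacker searches the full printable-ASCII space (roughly 94 symbols): a length-L password needs up to 94^L guesses, so length dominates apparent complexity. A quick check:

```shell
# Brute-force search-space sizes over ~94 printable ASCII characters.
# A 13-character password dwarfs a 7-character one, however "complex"
# the short one looks.  (awk handles numbers too big for shell integers.)
awk 'BEGIN {
  printf "7 chars:  %.3g guesses\n", 94^7    # about 6.5e13
  printf "13 chars: %.3g guesses\n", 94^13   # about 4.5e25
}'
```

This is the search-space size only; real attackers try dictionaries and common patterns first, which is why "1234567890abc" is still a poor choice despite its length.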

Edited by Michael Martinez, 18 April 2014 - 10:16 PM.


#11 chrishirst

chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,378 posts
  • Location:Blackpool UK

Posted 19 April 2014 - 07:03 AM

Changing permissions on the commonly attacked files does actually stop these kinds of "drive by" injection attacks, because the implanted scripts need 'elevated' permission to change permissions on the files they want to attack. The result is a "chicken and egg" paradox for the script, so it 'dies' with its work incomplete. It's just a shame that the same cannot be said of the "crackers" who launch these attacks!

 

 

The best protection against a BFDA is to use LONG passwords (so "1hsX$#3" is easier to crack than "1234567890abc").

Absolutely!

 

The passwords I have for "mission critical" logins, such as server 'root', WHM/cPanel, banking etc. are ten to fifteen characters long, generated by a LibreOffice macro that I wrote, so no one else can figure out the hashing algorithm without having access to my computer. I just wish that I could convince the hosting clients that converting their eight-character (WHM-generated) account login to letters and numbers is not really best practice for 'security'. Yes, it may be easy to remember, but there is a whole world of cyber-malcontents who know exactly how cPanel creates user names from domain names, so they already know your password before you even change it.



#12 coresash

coresash

    HR 1

  • Members
  • Pip
  • 6 posts

Posted 22 April 2014 - 03:42 AM

Hey guys,

 

Thanks so much for all your help. After the bank holiday, I took another look at the SEO and it seems to be totally back on track. The insight into the code was great, chrishirst. Michael, thanks for the tip to deny the HTTP request attacks. Now I'm thinking about what I need to do to stop this from ever happening again - is there any code from my FTP that I can post here for you guys, so you can verify that it's clean? I'll start with the .htaccess:

# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>

# END WordPress

# validate X-UA-Compatible meta tag
Header set X-UA-Compatible "IE=Edge,chrome=1"

# hide config.php
<Files wp-config.php>  
   order allow,deny  
   deny from all  
</Files>  

# protect individual files
<Files .htaccess>  
   order allow,deny  
   deny from all  
</Files>  

# protect from sql injection
Options +FollowSymLinks
RewriteEngine On
RewriteCond %{QUERY_STRING} (\<|%3C).*script.*(\>|%3E) [NC,OR]
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
RewriteRule ^(.*)$ index.php [F,L]

# protect directory browsing
Options All -Indexes


# BEGIN GZIP
<ifmodule mod_deflate.c>
AddOutputFilterByType DEFLATE text/text text/html text/plain text/xml text/css application/x-javascript application/javascript
</ifmodule>
# END GZIP

# compress text, html, javascript, css, xml:
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript

# Or, compress certain file types by extension:
<files *.html>
SetOutputFilter DEFLATE
</files>


#Expire Header
<FilesMatch "\.(ico|jpg|jpeg|png|gif|js|css|swf)$">
ExpiresDefault "access plus 2 hours"
</FilesMatch>

# Turn on Expires and set default to 0
ExpiresActive On
ExpiresDefault A0
 
# Set up caching on media files for 1 year (forever?)
<filesMatch "\.(flv|ico|pdf|avi|mov|ppt|doc|mp3|wmv|wav)$">
ExpiresDefault A29030400 
Header append Cache-Control "public"
</filesMatch>
 
# Set up caching on media files for 1 week
<filesMatch "\.(gif|jpg|jpeg|png|swf)$">
ExpiresDefault A604800
Header append Cache-Control "public"
</filesMatch>
 
# Set up 2 Hour caching on commonly updated files
<filesMatch "\.(xml|txt|html|js|css)$">
ExpiresDefault A604800
Header append Cache-Control "proxy-revalidate"
</filesMatch>
 
# Force no caching for dynamic files
<filesMatch "\.(php|cgi|pl|htm)$">
ExpiresActive Off
Header set Cache-Control "private, no-cache, no-store, proxy-revalidate, no-transform"
Header set Pragma "no-cache"
</filesMatch>

#ETag
FileETag none

# Enable compression: mod_deflate configuration for Apache 2.x
<IfModule mod_deflate.c>
SetOutputFilter DEFLATE
# Dont compress
SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary
SetEnvIfNoCase Request_URI \.(?:exe|t?gz|zip|bz2|sit|rar)$ no-gzip dont-vary
#Dealing with proxy servers
<IfModule mod_headers.c>
Header append Vary User-Agent
</IfModule>
</IfModule>


#Deny access from IPs attack
order allow,deny
deny from 78.149.123.255
deny from 82.45.152.152
deny from 79.135.120.106
deny from 109.230.251.120
deny from 46.205.96.168
deny from 109.145.194.4
deny from 195.33.27.190
deny from 109.157.227.138
deny from 71.40.108.83
deny from 91.224.160.25
allow from all

Anything look suspicious to you?

 

Thanks



#13 coresash

coresash

    HR 1

  • Members
  • Pip
  • 6 posts

Posted 22 April 2014 - 05:13 AM

Update:

<Files ".htaccess">
Order Allow,Deny
Deny from all
</Files>

This had already been added to my .htaccess. I guess this narrows down the possibilities, and chrishirst's suggestion of someone simply overwriting the .htaccess seems likely to be correct. What do you recommend I do from here on out?

 

 

 

 

#14 chrishirst

chrishirst

    A not so moderate moderator.

  • Moderator
  • 7,378 posts
  • Location:Blackpool UK

Posted 22 April 2014 - 06:28 AM

Change your hosting account passwords and block the possible exploits.

Change the WordPress admin user name to something other than 'admin' and/or disable or remove the 'admin' user, set a user as "Administrator/Super User" first.

#15 coresash

coresash

    HR 1

  • Members
  • Pip
  • 6 posts

Posted 22 April 2014 - 06:31 AM

Change your hosting account passwords and block the possible exploits.

Change the WordPress admin user name to something other than 'admin' and/or disable or remove the 'admin' user, set a user as "Administrator/Super User" first.

 

Hey. This was all done on the first day, other than the Administrator/Super User bit. I did post thanking you and Michael for your help with this, but the post didn't seem to show up. I wanted to post my .htaccess file, which might have made the mods deny the post - is it safe to post the .htaccess? I wanted you to take a look since you seem familiar with this issue.





