Need to Stop Bots from Killing my Webserver

Need to Stop Bots from Killing my Webserver - .htaccess files are extremely useful for users who do not have root permissions, or for users who simply aren't comfortable making changes in their web server's configuration file. Debugging an .htaccess file that isn't working is not always easy; however, by reading through the discussion below about PHP, .htaccess, and robots.txt, along with the common .htaccess problems and troubleshooting tips, you'll have a better grasp of what you may have to modify to get your .htaccess file running smoothly.

Problem:


I am having EXTREME bot problems on some of the websites within my hosting account. The bots utilize over 98% of my CPU resources and 99% of my bandwidth for my entire hosting account. These bots are generating over 1 GB of traffic per hour for my sites. The real human traffic for all of these sites is less than 100 MB / month.



I have done extensive research on both robots.txt and .htaccess files to block these bots, but all methods failed.



I have also put code in the robots.txt files to block access to the scripts directories, but these bots (Google, MS Bing, and Yahoo) ignore the rules and run the scripts anyway.



I do not want to completely block the Google, MS Bing, and Yahoo bots, but I do want to limit their crawl rate. Also, adding a Crawl-delay statement in the robots.txt file does not slow down the bots. My current robots.txt and .htaccess code for all sites is stated below.



I have set up both Microsoft and Google webmaster tools to slow the crawl rate down to the absolute minimum, but they are still hitting these sites at a rate of 10 hits / second.



In addition, every time I upload a file that causes an error, the entire VPS webserver goes down within seconds, such that I cannot even access the site to correct the issue due to the onslaught of hits by these bots.



What can I do to stop the onslaught of traffic to my websites?



I have asked my web hosting company (site5.com) about this issue many times over the past few months, and they cannot help me with this problem.



What I really need is to prevent the Bots from running the rss2html.php script. I tried both sessions and cookies and both failed.



robots.txt



User-agent: Mediapartners-Google
Disallow:
User-agent: Googlebot
Disallow:
User-agent: Adsbot-Google
Disallow:
User-agent: Googlebot-Image
Disallow:
User-agent: Googlebot-Mobile
Disallow:
User-agent: MSNBot
Disallow:
User-agent: bingbot
Disallow:
User-agent: Slurp
Disallow:
User-Agent: Yahoo! Slurp
Disallow:
# Directories
User-agent: *
Disallow: /
Disallow: /cgi-bin/
Disallow: /ads/
Disallow: /assets/
Disallow: /cgi-bin/
Disallow: /phone/
Disallow: /scripts/
# Files
Disallow: /ads/random_ads.php
Disallow: /scripts/rss2html.php
Disallow: /scripts/search_terms.php
Disallow: /scripts/template.html
Disallow: /scripts/template_mobile.html


.htaccess



ErrorDocument 400 http://english-1329329990.spampoison.com
ErrorDocument 401 http://english-1329329990.spampoison.com
ErrorDocument 403 http://english-1329329990.spampoison.com
ErrorDocument 404 /index.php
SetEnvIfNoCase User-Agent "^Yandex*" bad_bot
SetEnvIfNoCase User-Agent "^baidu*" bad_bot
Order Deny,Allow
Deny from env=bad_bot
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} bot* [OR]
RewriteCond %{HTTP_USER_AGENT} *bot
RewriteRule ^.*$ http://english-1329329990.spampoison.com [R,L]
RewriteCond %{QUERY_STRING} mosConfig_[a-zA-Z_]{1,21}(=|%3D) [OR]
# Block out any script trying to base64_encode crap to send via URL
RewriteCond %{QUERY_STRING} base64_encode.*\(.*\) [OR]
# Block out any script that includes a <script> tag in URL
RewriteCond %{QUERY_STRING} (<|%3C).*script.*(>|%3E) [NC,OR]
# Block out any script trying to set a PHP GLOBALS variable via URL
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|%[0-9A-Z]{0,2}) [OR]
# Block out any script trying to modify a _REQUEST variable via URL
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|%[0-9A-Z]{0,2})
# Send all blocked request to homepage with 403 Forbidden error!
RewriteRule ^(.*)$ index.php [F,L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} !^/index.php
RewriteCond %{REQUEST_URI} (/|\.php|\.html|\.htm|\.feed|\.pdf|\.raw|/[^.]*)$ [NC]
RewriteRule (.*) index.php
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization},L]
# Don't show directory listings for directories that do not contain an index file (index.php, default.asp etc.)
Options -Indexes
<Files http://english-1329329990.spampoison.com>
order allow,deny
allow from all
</Files>
deny from 108.
deny from 123.
deny from 180.
deny from 100.43.83.132


UPDATE TO SHOW ADDED USER AGENT BOT CHECK CODE



<?php
function botcheck(){
  $spiders = array(
    array('AdsBot-Google','google.com'),
    array('Googlebot','google.com'),
    array('Googlebot-Image','google.com'),
    array('Googlebot-Mobile','google.com'),
    array('Mediapartners','google.com'),
    array('Mediapartners-Google','google.com'),
    array('msnbot','search.msn.com'),
    array('bingbot','bing.com'),
    array('Slurp','help.yahoo.com'),
    array('Yahoo! Slurp','help.yahoo.com')
  );
  $useragent = strtolower($_SERVER['HTTP_USER_AGENT']);
  foreach($spiders as $bot) {
    if(preg_match("/$bot[0]/i",$useragent)){
      $ipaddress = $_SERVER['REMOTE_ADDR'];
      $hostname = gethostbyaddr($ipaddress);
      $iphostname = gethostbyname($hostname);
      if (preg_match("/$bot[1]/i",$hostname) && $ipaddress == $iphostname){return true;}
    }
  }
  return false;
}

if(botcheck() == false) {
  // not a verified bot; continue serving the page
}
?>


I also added the following to the top of the rss2html.php script:



// Checks to see if this script was called by the main site pages, (i.e. index.php or mobile.php) and if not, then sends to main page
session_start();
if(isset($_SESSION['views']))$_SESSION['views'] = $_SESSION['views']+ 1; else $_SESSION['views'] = 1;
if($_SESSION['views'] > 1) header("Location: http://website.com/index.php");

Solution:

You could set your script to throw a 404 error based on the user agent string provided by the bots - they'll quickly get the hint and leave you alone.



if(isset($_SERVER['HTTP_USER_AGENT'])) {
    $agent = $_SERVER['HTTP_USER_AGENT'];
}

if(preg_match('/^Googlebot/i', $agent)) {
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: http://www.google.com/");
    exit;
}



Pick through your logs and reject Bingbot, etc. in a similar manner - it won't stop the requests, but might save some bandwidth - give Googlebot a taste of its own medicine - Mwhahahahaha!



Updated



Looking at your code, I think your problem is here:



if (preg_match("/$bot[1]/i",$hostname) && $ipaddress == $iphostname)


If they are malicious bots, they could be coming from anywhere; take that $ipaddress clause out and throw a 301 or 404 response at them.
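For illustration, here is a sketch of that simplification; the function name, spider list, and the 404 handling are my own, not from the original script:

```php
<?php
// Sketch of the simplified check: trust the reverse-DNS hostname match and
// drop the forward-lookup ($ipaddress == $iphostname) clause.
function is_fake_bot($useragent, $hostname, $spiders) {
    foreach ($spiders as $bot) {
        if (stripos($useragent, $bot[0]) !== false) {
            // UA claims to be a known bot: real ones resolve to the bot's domain.
            return stripos($hostname, $bot[1]) === false;
        }
    }
    return false; // not claiming to be any known bot
}

$spiders = array(
    array('Googlebot', 'google.com'),
    array('bingbot', 'bing.com'),
);

// Only enforce under a web server; REMOTE_ADDR is absent on the command line.
if (isset($_SERVER['REMOTE_ADDR'])) {
    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    $hostname = gethostbyaddr($_SERVER['REMOTE_ADDR']);
    if (is_fake_bot($ua, $hostname, $spiders)) {
        header("HTTP/1.1 404 Not Found");
        exit;
    }
}
```

A genuine crawler still passes, because its reverse DNS resolves to the search engine's domain; an impostor claiming a bot user agent from an arbitrary host gets the 404.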



Thinking right up by the side of the box




  1. Googlebot never accepts cookies, so it can't store them. In fact, if you require cookies for all users, that's probably going to keep the bot from accessing your page.

  2. Googlebot doesn't understand forms or JavaScript, so you could dynamically generate your links or have the users click a button to reach your code (with a suitable token attached).



    <a href="#" onclick="document.location='rss2html.php?validated=29e0-27fa12-fca4-cae3';">Rss2html.php</a>




    • rss2html.php?validated=29e0-27fa12-fca4-cae3 - human

    • rss2html.php - bot
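A sketch of how rss2html.php could enforce that token; the function name and constant are mine, and the token value is the illustrative one from the example link above:

```php
<?php
// Hypothetical gate for rss2html.php: the link is generated client-side by
// JavaScript, so a crawler that doesn't execute scripts never sends the token.
define('EXPECTED_TOKEN', '29e0-27fa12-fca4-cae3');

function request_is_validated(array $get) {
    return isset($get['validated']) && $get['validated'] === EXPECTED_TOKEN;
}

// Only enforce under a web server; REQUEST_METHOD is absent on the CLI.
if (isset($_SERVER['REQUEST_METHOD']) && !request_is_validated($_GET)) {
    header("HTTP/1.1 403 Forbidden");
    exit;
}
```

In practice you would rotate the token (e.g. derive it per session) rather than hard-code it, or a scraper could simply replay the URL.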




If rss2html.php is not being used directly by the client (that is, if it's always PHP using it rather than it being a link or form target), then forget trying to block bots. All you really have to do is define a constant or something in the main page, then include the other script. In the other script, check whether the constant is defined, and spit out a 403 error or a blank page or whatever if it's not defined.



Now, in order for this to work, you'll have to use include rather than file_get_contents, as the latter will either just read in the file (if you're using a local path) or run it in a whole other process (if you're using a URL). But it's the method that stuff like Joomla! uses to prevent a script from being included directly. And use a file path rather than a URL, so that the PHP code isn't already parsed before you try to run it.
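A minimal sketch of that guard pattern; the constant name _SITE_EXEC is illustrative (Joomla uses _JEXEC the same way):

```php
<?php
// --- in index.php, before the include ---
define('_SITE_EXEC', 1);
// include __DIR__ . '/scripts/rss2html.php';   // file path, not a URL

// --- at the top of scripts/rss2html.php ---
if (!defined('_SITE_EXEC')) {
    header("HTTP/1.1 403 Forbidden");
    exit;   // direct requests to the script get a 403 and nothing else
}
```

A bot hitting /scripts/rss2html.php directly never has the constant defined, so it gets a 403 before any feed processing runs.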



Even better would be to move rss2html.php out from under the document root, but some hosts make that difficult to do. Whether that's an option depends on your server/host's setup.



I've solved the same issue with the script available at http://perishablepress.com/blackhole-bad-bots/. With this blackhole approach I collected a list of malicious IPs, and then denied them using .htaccess. (That part is not mandatory, since the script itself does the banning, but I needed to reduce the server load by avoiding PHP parsing for known unwanted IPs.) Within three days my traffic came down from 5 GB per day to 300 MB, which was quite expected.
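The resulting deny list in .htaccess might look like this, in the same style as the deny rules earlier in this post (the addresses below are placeholders, not real offenders):

```apache
# Deny IPs collected by the blackhole trap before PHP is ever invoked
order allow,deny
allow from all
deny from 192.0.2.10
deny from 198.51.100.25
deny from 203.0.113.0/24
```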



Also check this page for a full list of .htaccess rules to block many known junk bots: http://www.askapache.com/htaccess/blocking-bad-bots-and-scrapers-with-htaccess.html



Additionally, if you would like to do some further testing, give the htaccess tester tool a try. It allows you to specify a certain URL as well as the rules you would like to include and then shows which rules were tested, which ones met the criteria, and which ones were executed.
