I took the 1st one this morning and the 2nd one a few hours later. It was filling up my VPS's numtcpsock limit, which slowed down my VPS dramatically. Any tips or suggestions? Is there a way to lower the number of numtcpsocks?
early morning
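For context, my understanding is that numtcpsock is the OpenVZ bean counter for open TCP sockets, and that the held, limit and failcnt columns in /proc/user_beancounters show how close the container is to the cap. This is roughly how I have been checking it (a sketch that assumes an OpenVZ-based VPS with netstat installed):

    # Show the numtcpsock row: held = sockets in use, limit = the cap,
    # failcnt = how many times the limit has already been hit
    grep numtcpsock /proc/user_beancounters

    # Rough view of which remote hosts are holding TCP connections open
    netstat -nt | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn | head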
Implemented WordPress a little while ago via cPanel's Fantastico widget -- vanilla implementation.
Just about every day, I get spam comments in the blog's Inbox for moderation.
Was wondering if folks had general tips on how to prevent or minimize this sort of nuisance and make the blog less bot-accessible, and/or where I might read up on ways to do so.
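One thing I am considering, if I understand the technique correctly, is rejecting comment POSTs that arrive with an empty referer, since crude spam bots often post straight to wp-comments-post.php; something along these lines in .htaccess (untested on my setup, and it assumes WordPress sits in the web root):

    RewriteEngine On
    # Forbid direct POSTs to the comment handler that carry no referer
    RewriteCond %{REQUEST_METHOD} POST
    RewriteCond %{REQUEST_URI} wp-comments-post\.php [NC]
    RewriteCond %{HTTP_REFERER} ^$
    RewriteRule .* - [F,L]

Beyond that I assume a captcha or the Akismet plugin is the usual route, but I'd like to hear what works for others.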
Bots with IPs in the 74.6.*.* range are endlessly crawling my forums, never seeming to finish their task and using up huge amounts of bandwidth. Ideally I would like to turn them away at the gate while still allowing everyone else to view the site.
If I block the IPs via the control panel they still visit; presumably they just get error pages, but that still uses up lots of bandwidth.
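If it matters, my understanding is that a control-panel block still lets every request reach Apache, whereas dropping the range at the firewall discards the packets before any page (even an error page) is generated. Something like this is what I have in mind, assuming I really do want to shut out the whole range:

    # Drop web traffic from 74.6.0.0/16 before it ever reaches Apache
    # (illustrative only -- 74.6.*.* is reported to belong to a search-engine
    # crawler, so a Crawl-delay in robots.txt may be the gentler fix)
    iptables -I INPUT -s 74.6.0.0/16 -p tcp -m multiport --dports 80,443 -j DROP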
Looking through my logs I found something that bothers me: there are bots that keep requesting pages like /admin or /secure on my website to find vulnerabilities. Each one makes about 5-6 requests for nonexistent pages every second until it reaches the end of its dictionary (the pages are even sorted in alphabetical order).
Is there some way to have my Apache server block these bots when they make X attempts to reach pages that do not exist within a short amount of time? A bit like how iptables can reject connections when someone fails to log in too many times.
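From what I have read, stock Apache cannot count failures across requests by itself, and the usual suggestion is fail2ban watching the access log and banning an IP after too many 404s. A sketch of what I think that looks like (the filter name apache-404scan is just something I made up, and the log path would need to match the actual setup):

    # /etc/fail2ban/filter.d/apache-404scan.conf
    [Definition]
    failregex = ^<HOST> .* "(GET|POST|HEAD) .*" 404
    ignoreregex =

    # /etc/fail2ban/jail.local
    [apache-404scan]
    enabled  = true
    port     = http,https
    filter   = apache-404scan
    logpath  = /var/log/apache2/access.log
    findtime = 60
    maxretry = 20
    bantime  = 3600

With those numbers an IP that produces 20 or more 404s within a minute would be banned for an hour, which is roughly the behaviour I am after.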
I have a dedicated RHEL server with cPanel, and my server load spikes by about +0.4 (out of 2.0) for about 30 minutes every 4-6 hours or so. My regular server load is 0.01, because there is barely any traffic on the server yet, but looking at my top processes in WHM, I can see that the processes spiking the server load when it is high are something like:
...something along these lines. And a lot of times there are 10-20 of these sshd processes at one time.
My server is managed, and my dedicated server engineer said it was probably a bot trying passwords. He took one of the IPs, said it was from Taiwan, and blocked it in iptables.
However, this is still happening constantly with different IPs. Is there a way to prevent this from happening? I'm the only person (besides my host) who should be able to log in to my server using SSH... however, I don't have a static IP and I work from multiple locations, so only allowing certain IPs won't work for me.
First off, is this normal? Or am I being attacked or what? What can I do to remedy this? It seems the bots haven't successfully logged in, but they are spiking my server load which is NOT what I want.
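For what it's worth, the advice I keep running into is to stop relying on passwords entirely and to let something like fail2ban (or cPanel's cPHulk brute-force protection in WHM) ban repeat offenders. On the sshd side it would be roughly this, assuming key authentication is already set up and tested:

    # /etc/ssh/sshd_config -- restart sshd after editing
    PasswordAuthentication no   # key-only logins make dictionary attacks pointless
    PermitRootLogin no          # log in as a normal user, then su/sudo
    MaxAuthTries 3
    # Port 2222                 # optional: moving off 22 cuts most of the scan noise

Does that sound like the right direction, or is there something better for a cPanel box?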
I am having a problem blocking bots using .htaccess. I think I have tried every possible syntax variant, yet all the bots that I am blocking get an HTTP 200 response instead of 403 (I can verify this in the access log).
I am using Apache 2.4 running on Ubuntu 14.04.2 with Plesk 12.0.18.
My AllowOverride is set to allow the use of .htaccess files, so the .htaccess file does get loaded: when I make an error in the .htaccess syntax I can see the error in the error log and the web pages don't load. Besides, I have some "Deny from [IP address]" directives in the .htaccess, and I can see that these IPs get an HTTP 403 response when they access my site.
I spent hours trying different variants of the .htaccess syntax (see below) and none of them seems to work...
variant 0:
    SetEnvIfNoCase User-Agent LivelapBot bad_bot
    SetEnvIfNoCase User-Agent TurnitinBot bad_bot
    Order allow,deny
    Allow from all
    Deny from env=bad_bot
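One thing I am wondering about: Order/Allow/Deny are the old 2.2-style directives, which on Apache 2.4 only work through mod_access_compat, so maybe I should be expressing this with the 2.4-native Require syntax instead, something like the following (I have not confirmed whether Plesk enables mod_access_compat by default):

    SetEnvIfNoCase User-Agent "LivelapBot" bad_bot
    SetEnvIfNoCase User-Agent "TurnitinBot" bad_bot
    <RequireAll>
        Require all granted
        Require not env bad_bot
    </RequireAll>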
If I know the IP range that I want to block, the best option is to block it with iptables. This works well when you want to block entire countries. But what happens when you want to block specific IPs rather than ranges? Is iptables still more effective than "Deny from [IP]" in .htaccess? I read that you don't want iptables to grow too big, as it slows performance, but I guess it is still more effective than having a big .htaccess?
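To make the comparison concrete, this is what I mean by the two options for a single address (203.0.113.45 is just a placeholder):

    # Firewall: the packet is dropped before Apache ever sees the request
    iptables -I INPUT -s 203.0.113.45 -j DROP

    # .htaccess: Apache still accepts the connection, then answers 403
    Deny from 203.0.113.45

I have also seen ipset recommended once the list gets long, since plain iptables rules are checked one by one, but I have not tried it.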
When it comes to blocking spam bots or referrers, robots.txt is just a suggestion to bots; when I looked at my traffic logs I noticed that most bots don't even look at the robots.txt file. As far as I understand, the only option here is to use .htaccess.
1. I am currently using this in my .htaccess:

    SetEnvIfNoCase User-Agent *ahrefsbot* bad_bot=yes
    SetEnvIfNoCase Referer fbdownloader.com spammer=yes
    ...
    SetEnvIfNoCase Referer social-buttons.com spammer=yes
    Order allow,deny
    Allow from all
    Deny from env=spammer
    Deny from env=bad_bot
2. Apparently, there is another approach, as per below:

    # Deny domain access to spammers
    RewriteEngine on
    RewriteBase /
    RewriteCond %{HTTP_USER_AGENT} queryseeker [OR]
    RewriteCond %{HTTP_REFERER} ^(www.)?.*(-|.)?adult(-|.).*$ [OR]
    ...
    RewriteCond %{HTTP_REFERER} ^(www.)?.*(-|.)?sex(-|.).*$
    RewriteRule .* - [F,L]
Which approach is better, #1 or #2? Any better alternative?
Finally, somebody suggested that you need to have both (as per example below). Is it true?