Tuesday, November 29, 2022

How to find DDOS Attack on cPanel

If the load on your WHM/cPanel VPS or server suddenly increases far above normal, it could be a DDOS attack. This article covers the different methods of checking logs, why you should check them, and how to protect yourself from further incidents.

Why Check the Domlogs

If you suspect that traffic-based attacks are being carried out against your sites, you’ll need to check the domain access logs, or domlogs. Doing so requires root SSH access. We will discuss the most common types of abuse that may occur, why they occur, and how to identify them via the domlogs. If you can identify the abuse that is occurring, you can then put measures in place to counter the attack. We will also discuss implementing different countermeasures against these attacks.

The different types of abuse and common resource consumers to look for include:

  • wp-login bruteforcing
  • xmlrpc abuse
  • ‘bad’ bot traffic
  • dos/ddos
  • admin-ajax/wp heartbeat frequency (not a malicious abuse, but can definitely be overactive and impact server resources)
  • wp-cron (not exactly an abuse, but can definitely be configured to use less resources)
  • calls made to trigger spam scripts
  • other malicious requests
  • vulnerability scans run against your site(s)
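For the wp-cron item above, the usual way to reduce its overhead is to stop WordPress from firing wp-cron.php on every page view and trigger it from a real cron job instead. The domain and 15-minute interval below are placeholders; adjust them for your site:

```
# In wp-config.php:
define('DISABLE_WP_CRON', true);

# In the cPanel user's crontab (crontab -e), trigger it on a schedule instead:
*/15 * * * * wget -q -O - https://example.com/wp-cron.php?doing_wp_cron >/dev/null 2>&1
```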

Some reasons to check the domain access logs for abuses and RAM usage include the following:

  1. Out of Memory errors have been occurring on your server, or the resource usage has increased drastically (RAM, bandwidth, CPU)
  2. You have noticed a lot of emails coming from your sites’ scripts
  3. You are seeing a lot of spam being sent from the server
  4. You are seeing a large, unexpected increase in traffic to the site
  5. You suspect a hack (website compromise)
  6. Apache connections are hanging due to MaxRequestWorkers having been exceeded

Apache Domain Access Logs Format

On a cPanel server, the LogFormat is defined in /usr/local/apache/conf/httpd.conf using the Combined Log Format with Vhost and Port, as follows:

LogFormat "%v:%p %h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combinedvhost

Where each of these variables are defined as follows:

  • %v – Vhost/Domain
  • %p – The canonical port of the server serving the request
  • %h – Remote hostname of the client making the request; logs the IP address if HostnameLookups is set to Off, which is the default
  • %l – Identity of the client as determined by identd on the client's machine (a hyphen indicates that the information was not available)
  • %u – Userid of the person requesting the document, as determined by HTTP authentication
  • %t – The time that the request was received (can be configured as desired)
  • \"%r\" – The client’s request
  • %>s – The Apache status code returned
  • %b – The size of the response in bytes, excluding the size of the headers
  • \"%{Referer}i\" – The referer
  • \"%{User-Agent}i\" – The user-agent
  • combinedvhost – Nickname associating this LogFormat with the name “Combined Log Format with Vhost and Port”
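To make the awk commands later in this article concrete, here is a fabricated request line in the combined format (the per-domain domlogs omit the %v:%p prefix, so the client IP is the first field) and the fields those commands extract (IP, method, and path):

```shell
# Fabricated domlog line (combined format, no vhost prefix):
line='203.0.113.45 - - [29/Nov/2022:10:15:32 +0000] "POST /xmlrpc.php HTTP/1.1" 200 413 "-" "Mozilla/5.0"'

# $1 = client IP, $6 = quoted request method, $7 = requested path:
echo "$line" | awk '{print $1, $6, $7}'
# → 203.0.113.45 "POST /xmlrpc.php
```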

Tracing the Logs and Deeper Troubleshooting

To find out which IPs are responsible, do the following.

Option 1: If you know which domain is being attacked, SSH to your server and issue the following command, replacing “DOMAIN” with your domain name. On cPanel/WHM, if the domain is not the primary domain, its log file is normally named after the corresponding subdomain of the primary domain.

awk '{print $1}' /usr/local/apache/domlogs/DOMAIN | sort | uniq -c | sort -n

Option 2: If you don’t know which domain is being attacked, SSH to your server and issue the following command. Option 1 is preferable, especially if your server is very busy and hosts many domains, since processing the combined log file can take quite some time. You can also run “top -c” to see which processes are consuming the most resources. Note that the main access_log uses the combinedvhost format shown above, so the client IP is the second field:

awk '{print $2}' /usr/local/apache/logs/access_log | sort | uniq -c | sort -n

Both options print each IP and its number of connections, sorted in ascending order so the busiest IPs appear last. For example:

.....
.....
.....
.....
17843 56.xx.xx.xxx
19234 66.xxx.xx.xxx
234578 156.xx.xx.xx

WordPress Login Bruteforcing

In the output above we can see an abnormally high number of connections from those IPs. You can block such IPs in a firewall such as ConfigServer Firewall (CSF), fail2ban, or Shorewall.
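As a sketch of the blocking step with CSF: the IPs below are placeholders from the documentation range, and the loop only prints the deny commands so you can review them first; drop the echo to actually apply the permanent blocks (csf -d IP "comment"):

```shell
# Hypothetical attacker IPs taken from output like the example above.
for ip in 203.0.113.45 203.0.113.46; do
    echo csf -d "$ip" "excessive connections in domlogs"
done
```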

The command I use to identify this attack for the current day on a cPanel CentOS server running Apache is the following:

grep -s $(date +"%d/%b/%Y:")  /usr/local/apache/domlogs/* | grep wp-login.php |  awk {'print $1,$6,$7'} | sort | uniq -c | sort -n
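If the output shows heavy wp-login.php bruteforcing, one common countermeasure is to deny the file to everyone except the IPs that legitimately need to log in, via .htaccess, in the same style as the xmlrpc.php rules below. The 198.51.100.10 address is a placeholder for your own management IP:

```apache
<Files wp-login.php>
  ErrorDocument 403 default
  order deny,allow
  allow from 198.51.100.10
  deny from all
</Files>
```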

XMLRPC Attacks

To stop/prevent XMLRPC attacks, you need to protect the xmlrpc.php file. There are a few considerations with this, though. First of all, some plugins rely heavily on XMLRPC; one such popular plugin is Jetpack. So, first you need to decide whether you use XMLRPC on your site. If you do not, great! Then it is really easy. Just block access to xmlrpc.php via .htaccess:

<Files xmlrpc.php>
  ErrorDocument 403 default 
  order deny,allow
  deny from all
</Files>

If you do use Jetpack, you could block access conditionally depending on whether the IP is Jetpack’s IP or not:

<Files xmlrpc.php> 
  ErrorDocument 403 default 
  order deny,allow
  allow from 185.64.140.0/22
  allow from 2a04:fa80::/29
  allow from 2620:115:C000::/44
  allow from 76.74.255.0/25
  allow from 76.74.248.128/25
  allow from 198.181.116.0/22
  allow from 192.0.64.0/18
  allow from 64.34.206.0/24
  deny from all
</Files>

If you choose to use .htaccess rules to block this attack, you will still find that automated attacks continue for some time afterwards, filling the logs with “client denied by server configuration” errors. You can limit the work Apache does filtering and blocking these requests by using the LF_APACHE_403 setting in the CSF/LFD firewall to block an IP after it exceeds a specified number of 403 responses within the configured time frame. This limit should be set high, and you should account for legitimate bots in your firewall’s whitelist and ignore files if you choose to use this option.
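In CSF this is configured in /etc/csf/csf.conf; the values below are illustrative, not recommendations, and lfd must be restarted after editing (systemctl restart lfd):

```
LF_APACHE_403 = "200"        # 403 errors allowed per interval before the IP is blocked ("0" disables)
LF_APACHE_403_PERM = "1"     # "1" blocks permanently; a number of seconds makes the block temporary
```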

To identify this type of attack in the domain access logs, you simply need to look for POST requests to xmlrpc.php file within the suspected time frame and sort the data in a readable format. I use the following command to identify whether any XMLRPC attack has occurred for the current day in a cPanel/CentOS server running Apache:

grep -s $(date +"%d/%b/%Y:")  /usr/local/apache/domlogs/* | grep xmlrpc |  awk {'print $1,$6,$7'} | sort | uniq -c | sort -n

Bot Traffic

There are two types of bots as far as most admins are concerned: bad bots and good bots. Good bots, such as legitimate search engine crawlers, respect your robots.txt file. To limit how often a bot crawls your site, you can define a crawl delay in your robots.txt file for all bots. The following contents of the robots.txt file would ask all bots to wait 3 seconds between requests:

User-agent: * 
Crawl-delay: 3

Unfortunately, not all bots obey these robots.txt file requests. These are often termed ‘bad’ bots. When you are being bombarded by ‘bad’ bot traffic, you can block those ‘bad’ bots via .htaccess. The following is an .htaccess rule that blocks bots that many have deemed ‘bad’:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^.*(Ahrefs|MJ12bot|Seznam|Baiduspider|Yandex|SemrushBot|DotBot|spbot).*$ [NC]
RewriteRule .* - [F,L]

To find how often a bot is hitting your sites for the current day, you can use the following command:

grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/*  | grep "bot\|spider\|crawl" | awk {'print $6,$14'} | sort | uniq -c | sort -n | tail -25

If you have a lot of sites on your server, and a lot of bot traffic, you may want to start by editing the .htaccess or robots.txt file for those with the most bot traffic first so that you have the most impact on resources with the least amount of effort. You can use the following command to search for common bots and see which sites are being hit by them the most (if you are seeing a lot of bots not listed in this command from the previous command to find bot traffic, then be sure to add them to the search command below):

grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/* | grep -i "BLEXBot\|Ahrefs\|MJ12bot\|SemrushBot\|Baiduspider\|Yandex\|Seznam\|DotBot\|spbot\|GrapeshotCrawler\|NetSeer"| awk {'print $1'} | cut -d: -f1| sort | uniq -c | sort -n

Some malicious bots spoof the Googlebot user-agent. To determine whether these requests were genuinely from Googlebot, I needed to get a list of the IPs making them and then run a WHOIS on each. The following command will generate a list of IPs in order of frequency:

grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/* | grep bot  | grep -i googlebot | awk {'print $1'} | cut -d: -f2 | sort | uniq -c |sort -n
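A reverse DNS check is usually quicker than WHOIS: genuine Googlebot IPs resolve to hostnames under googlebot.com or google.com, and the forward lookup of that hostname returns the same IP. A minimal sketch of the hostname check (the helper name and sample PTR values are made up for the demo):

```shell
# Classify a PTR hostname as genuine Googlebot or a spoofed user-agent.
is_googlebot_ptr() {
    case "$1" in
        *.googlebot.com|*.googlebot.com.|*.google.com|*.google.com.) echo genuine ;;
        *) echo spoofed ;;
    esac
}

is_googlebot_ptr "crawl-66-249-66-1.googlebot.com"   # → genuine
is_googlebot_ptr "198-51-100-7.cheap-vps.example"    # → spoofed

# In practice, feed it the PTR from a lookup, e.g.:
# is_googlebot_ptr "$(host "$ip" | awk '{print $NF}')"
```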

Spam Scripts

If an LFD alert has already identified the spam script (here “sotpie”, the script name taken from such an alert), the following prints out the site, the external IP making the request, and the script being requested, in order of increasing number of requests per IP/site:

grep "sotpie" /usr/local/apache/domlogs/username/* | grep POST | awk {'print $1,$7'} | sort | uniq -c | sort -n

Let’s say that you do not have an LFD alert to tell you where the script is located on the server. You could instead search the Exim mainlog for the working directories of any scripts sending email. You could use this:

grep "cwd="  /var/log/exim_mainlog | awk -F"cwd=" '{print $2}' | awk '{print $1}' | sort | uniq -c | sort -n

Or, you could just search for POST requests for the current day via the domlogs, and then exclude any common POST requests:

grep -s $(date +"%d/%b/%Y:")  /usr/local/apache/domlogs/* | grep POST | grep -vi "wp-cron\|admin-ajax\|xmlrpc\|wp-config" | awk {'print $1,$7'} | sort | uniq -c | sort -n

You will likely have to filter the output further, but you will see what types of POST requests are occurring on your sites and be able to determine whether any scripts are being called heavily, indicating the spam script source.

There are many ways to protect against this type of abuse. It really just comes down to securing your sites thoroughly so that no malicious scripts can be uploaded and so that no legitimate functionality can be abused. An example of a legitimate script being abused would be a contact form being used to send spam. Just checking the Exim mainlog for mail-sending scripts and their frequency and POST requests logged in the domlogs could confirm that this is occurring.

grep -s $(date +"%d/%b/%Y:") /usr/local/apache/domlogs/username/domain.tld* | grep POST |awk {'print $1,$6,$7'} | sort | uniq -c | sort -n

Keep trying!

DevNinja
System & Network Administrator Ninja