Server Monitoring Scripts and Commands

Script to delete a line from a file if it contains a particular pattern

sed -i '/pattern/d' filename
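If you want to keep a copy of the file before editing it in place, GNU sed can write a backup alongside the change; the .bak suffix below is just an example:

sed -i.bak '/pattern/d' filename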

For example, to remove a particular injected string from every .php, .html and .htm file under /home:

find /home/ \( -name "*.php" -o -name "*.html" -o -iname "*.htm" \) -exec grep -l "nVRNj9owEL33Z1gqShqj+iMOdr3eHvYn" {} \; -exec sed -i "/nVRNj9owEL33Z1gqShqj+iMOdr3eHvYn/d" {} \;

To find the IP addresses with the most HTTP connections in TIME_WAIT state

netstat -pan | sort -k 5 | grep TIME_WAIT | awk '{print $5}' | sed -e 's/:.*//g' | sort | uniq -c | sort -k 1 -nr | head -n 20
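A similar pipeline, not part of the original list but useful as a sketch, counts established connections per remote IP instead of those in TIME_WAIT:

netstat -pan | grep ESTABLISHED | awk '{print $5}' | sed -e 's/:.*//g' | sort | uniq -c | sort -k 1 -nr | head -n 20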

To check for a DDoS attack

netstat -an | grep :80

netstat -plan | grep :80 | awk '{print $5}' | cut -d: -f 1 | sort | uniq -c | sort -nk 1

netstat -plan | grep :25 | awk '{print $5}' | cut -d: -f 1 | sort | uniq -c | sort -nk 1

watch -n 5 'w; ls -alS /usr/local/apache/domlogs/ '
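Once an offending IP has been identified, the usual next step (not in the original list; 1.2.3.4 is a placeholder) is to block it at the firewall:

iptables -I INPUT -s 1.2.3.4 -j DROP

or, on servers running the CSF firewall:

csf -d 1.2.3.4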

EXIM: top IPs with rejected RCPT commands

tail -3000 /var/log/exim_mainlog | grep 'rejected RCPT' | awk '{print $4}' | awk -F\[ '{print $2}' | awk -F\] '{print $1}' | sort | uniq -c | sort -k 1 -nr | head -n 5
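As a rough sketch of my own, the mail queue listing can also be summarised by sender address to see which senders dominate the queue:

exim -bp | grep -Eo "<[^>]*>" | sort | uniq -c | sort -nr | head -n 10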

Script to find which processes are consuming too many resources on a server (this maps running processes to their home directories)

cd /proc && ls -l */cwd|grep /home|sed -e \ "s#.*/home\(.*\)#/home\1#"|sort

Find out a spammer's home directory on a cPanel server

grep cwd /var/log/exim_mainlog|grep -v spool
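To summarise those cwd entries and see which directory is sending the most mail, the command can be extended roughly like this (a sketch, not from the original list):

grep cwd /var/log/exim_mainlog | grep -v spool | awk -F'cwd=' '{print $2}' | awk '{print $1}' | sort | uniq -c | sort -nr | head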

POP attack (top connecting hosts in the mail log)

tail -1000 /var/log/maillog | grep host= | cut -d= -f2| cut -d" " -f1|sort -n |uniq -c |sort -n

To display the last few lines of different logs in a single window

watch -n 5 'tail -10 /var/log/secure ; echo "==============================" ; cat /proc/loadavg ; echo "==============================" ; tail -10 /var/log/messages'

To clear zombie processes by killing their parent processes

for i in `ps ax | awk '$3 ~ "Z" {print $1}'`; do kill -9 $(cat /proc/${i}/status | grep PPid | awk '{print $2}'); done

To kill processes stuck in uninterruptible sleep (D state) by killing their parents

for i in `ps ax | awk '$3 ~ "D" {print $1}'`; do kill -9 $(cat /proc/${i}/status | grep PPid | awk '{print $2}'); done
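Before killing anything, it is worth listing the zombie or stuck processes together with their parents; this listing is my own addition based on standard ps output fields:

ps -eo pid,ppid,stat,comm | awk '$3 ~ /^Z|^D/'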

To delete frozen mails

exim -bp | awk '$6~"frozen" {print $3 }' | xargs exim -Mrm
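An equivalent using exiqgrep, which ships with Exim and can select frozen message IDs directly, should work as well:

exiqgrep -z -i | xargs exim -Mrm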

The following command deletes queued mails that contain "yahoo.co.in"

grep -rl yahoo.co.in /var/spool/exim/input/ | xargs rm
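A cleaner variant (my own suggestion) lets Exim remove the messages by ID, matching on the recipient address instead of grepping the spool files:

exiqgrep -i -r yahoo.co.in | xargs exim -Mrm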

This will show the domains that have more than 100 mails in the queue

exim -bp | exiqsumm | awk '{if ($1 > 100) print $0}' | sort -n

For taking backups of accounts, copy the usernames to a file named hi (cPanel server)

for i in `cat hi` ; do /scripts/pkgacct $i ;  done
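One way to populate the hi file with every account on the server, assuming the standard cPanel layout where /var/cpanel/users holds one file per account, is:

ls /var/cpanel/users > hi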

For enabling SpamAssassin (cPanel server)

for i in `cat hi` ; do touch  /home/$i/.spamassassinenable ;  done

For enabling Spam Box (cPanel server)

for i in `cat hi` ; do touch  /home/$i/.spamassassinboxenable  ;  done

For printing the first column of a file named test

awk '{print $1}' test

Script to find the disk usage of a reseller's accounts and sub-accounts (cPanel server)

for i in `grep Reseller_Name /etc/trueuserowners | cut -d: -f 1` ; do du -s /home/$i ; done | awk '{print $1}' > df ; total=0 ; for i in `cat df` ; do total=$[total+i] ; done ; echo $total
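A slightly simpler sketch of the same idea, with the summing done in awk (Reseller_Name is still a placeholder for the reseller's username):

for i in `grep Reseller_Name /etc/trueuserowners | cut -d: -f1` ; do du -s /home/$i ; done | awk '{sum += $1} END {print sum " KB total"}'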

Script to restart Apache if it is not running
===============
#!/bin/sh

if ps auxc | grep httpd > /dev/null ; then
    exit 0
else
    echo "HTTP service crash"
    /etc/init.d/httpd stop
    sleep 3
    /etc/init.d/httpd start
    echo "httpd restarted on server." | mail -s "httpd (`uname -n`) restarted @ `date`" test@gmail.com
fi

===============
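To make the check automatic, the script can be run from cron; the path /root/httpd_check.sh below is only an assumed example:

*/5 * * * * /root/httpd_check.sh > /dev/null 2>&1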
Script to check whether the PHP function file_get_contents works

<?php

$x=”google.com”;

$cd=file_get_contents($x);

echo $cd;

?>

 



The following script crops domlog files when their size reaches 100 MB.

#!/bin/bash

for domain in /usr/local/apache/domlogs/*;
do
    # Find domains with a log file larger than 100 MB
    if [ 100000 -lt `du $domain 2> /dev/null | awk {'print $1'}` ]
    then
    {
        echo $domain;
        echo "splitting the file into sizes of 50 MB each...";
        /usr/local/cpanel/bin/cpuwatch 5.0 split -b 50m $domain ${domain};
        echo "Appending the last two sets of files generated";
        for i in `ls ${domain}a* | tail -n 2`;
        do
            tail -n 2 $i; echo $i;
            cat $i >> ${domain}.test;
        done
    }
        cp -f ${domain}.test ${domain};
        rm -f ${domain}a* ${domain}.test;
        echo "done";
    fi
done
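The script can be scheduled to run regularly, for example nightly from cron (the path /root/croplogs.sh is an assumption):

0 2 * * * /root/croplogs.sh > /dev/null 2>&1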



Script to delete iframe entries from an account's files

find /home/test/public_html/ \( -name "*.php" -o -name "*.html" -o -iname "*.htm" \) -exec grep -l "pattern" {} \; -exec sed -i "/pattern/d" {} \;

 



Deleting particular lines from a file

# delete the last 10 lines of a file

sed -e :a -e '$d;N;2,10ba' -e 'P;D' fileName.txt   # method 1

sed -n -e :a -e '1,10!{P;N;D;};N;ba' fileName.txt   # method 2

The commands above only print the result. The following edits the file in place:

sed -i".bak" -e :a -e '$d;N;2,10ba' -e 'P;D' fileName.txt

Here a backup file named fileName.txt.bak will be created.
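On systems with GNU coreutils there is also a simpler way to drop the last 10 lines, offered here only as an alternative sketch:

head -n -10 fileName.txt > fileName.txt.tmp && mv fileName.txt.tmp fileName.txt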

 

Script to take a backup of a single account on a daily basis

/scripts/pkgacct user;

mv /home/cpmove-user.tar.gz /home/user/public_html/backup/cpmove-user_$(date +%Y%m%d).tar.gz;

chmod -R 644 /home/user/public_html/backup/*;

chown -R user:user /home/user/public_html/backup/*;

The output should be as follows

cpmove-user_20091026.tar.gz
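Wrapped as a small script so it can be scheduled from cron, the same steps might look as follows; the username and the backup path are assumptions and should be adjusted per account:

#!/bin/sh
# Daily cPanel account backup - adjust USER and the backup path as needed
USER=user
/scripts/pkgacct $USER
mv /home/cpmove-$USER.tar.gz /home/$USER/public_html/backup/cpmove-${USER}_$(date +%Y%m%d).tar.gz
chmod 644 /home/$USER/public_html/backup/*
chown $USER:$USER /home/$USER/public_html/backup/*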

 



Check server status remotely with a Perl script

http://www.macosxhints.com/article.php?story=20060221135557761

http://www.macosxhints.com/dlfiles/is_tcp_port_listening_pl.txt (the script itself)

 



Shell script to monitor services such as web/HTTP, SSH and the mail server

http://bash.cyberciti.biz/monitoring/monitor-unix-linux-network-services/

 



PHP script to monitor service status locally or remotely

http://www.developertutorials.com/tutorials/php/port-scanning-and-service-status-checking-in-php-870/

<?php

function check_port($port) {

$conn = @fsockopen(“127.0.0.1?, $port, $errno, $errstr, 0.2);

if ($conn) {

fclose($conn);

return true;

}

}

function server_report() {

$report = array();

$svcs = array(’21’=>’FTP’,

’22’=>’SSH’,

’25’=>’SMTP’,

’80’=>’HTTP’,

‘110’=>’POP3?,

‘143’=>’IMAP’,

‘3306’=>’MySQL’);

foreach ($svcs as $port=>$service) {

$report[$service] = check_port($port);

}

return $report;

}

$report = server_report();

?>

 

