It’s important to ensure a system is kept virus-free, whether it’s a server, workstation, or personal desktop computer. This tends to be easier with Windows, as we’re all familiar with the mountain of free and paid anti-virus programs available. You simply install one and it usually scans and monitors your system on its own. There are some options like this for Linux, but generally, as with any Linux system, you get many more configuration options to have it run how you want it to.
We covered how to install ClamAV on CentOS 7 a few months back. Please see “Installing ClamAV Anti-Virus on CentOS 7” for more information. This quick article will be focused on the actual scanning.
Our servers run many different software applications serving many different users. These users have permissions to upload and install PHP-based software in their cPanel account as they see fit. Unfortunately, many of them do not keep software up to date, do not use secure passwords, use unmaintained poorly-coded plugins, etc. This ends up making them vulnerable to malicious activity – the most common being email spam without the person’s knowledge.
Considering we want to scan for and remove any issues as quickly as possible, but do not want to scan hundreds of gigs of data every night, we came up with the simple little script below.
for folder in /home/*/public_html; do
    find "$folder" -type f -newerct '2 days ago' | grep "\.php$" >> /tmp/filelist.txt
    find "$folder" -type f -newerct '2 days ago' -name ".*" | grep "\.php$" >> /tmp/filelist.txt
done
clamscan -i -f /tmp/filelist.txt --move=/quarantine
rm -f /tmp/filelist.txt
ClamAV is capable of scanning a provided list of files rather than an entire directory. We use the find command to list all files changed in the past 48 hours, keeping only those ending in .php (for this particular scan). Additionally, a second find command picks up hidden files, since that’s a common method spammers use to hide their code from view. The list is generated in the /tmp directory, passed to ClamAV for scanning, then removed at the end. The entire thing is scheduled to run every 6 hours through cron, with output piped to a log file. Of course, we also routinely run full system scans of all files, just not nearly as often.
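For reference, a crontab entry for that every-6-hours schedule might look something like the following. The script path and log location here are just placeholders for illustration; use whatever fits your setup:

# Run the quick ClamAV scan of recently changed files every 6 hours,
# appending all output (including errors) to a log file.
0 */6 * * * /root/scripts/clamscan-recent.sh >> /var/log/clamscan-recent.log 2>&1

The 2>&1 at the end sends error output to the same log, so failed runs don’t disappear silently.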
We found this works very well for us; hopefully it helps you too.