As a follow-up to our syslog server documentation, we also wanted to document how to enable encryption on the syslog stream, since private information, including credentials, can be passed from client to server in the logs. In this document, we will be using self-signed certificates, including a self-generated CA certificate.
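To give a sense of where the setup ends up, here is a minimal sketch of the client-side rsyslog TLS configuration. The certificate paths, the server hostname (central.example.com), and the port are placeholders for your own environment, and this assumes the rsyslog-gnutls module is installed.

```conf
# /etc/rsyslog.conf fragment on the client -- paths and hostname are placeholders
# Use the GnuTLS network stream driver for TLS
$DefaultNetstreamDriver gtls

# Our self-generated CA plus this machine's certificate and key
$DefaultNetstreamDriverCAFile   /etc/rsyslog.d/keys/ca.pem
$DefaultNetstreamDriverCertFile /etc/rsyslog.d/keys/client-cert.pem
$DefaultNetstreamDriverKeyFile  /etc/rsyslog.d/keys/client-key.pem

$ActionSendStreamDriverMode 1               # require TLS for this action
$ActionSendStreamDriverAuthMode x509/name   # verify the server's certificate name
$ActionSendStreamDriverPermittedPeer central.example.com

# @@ selects TCP, which the TLS transport requires
*.* @@central.example.com:6514
```

The server side loads the same driver with its own certificate and listens with `imtcp`; the full walkthrough covers generating the CA and signing both certificates.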
The number of servers in our farm is continuing to grow, and it is becoming more and more difficult to monitor them all as closely as we would like. We decided that it's time to set up a centralized location for log files to keep a closer eye on everything and allow us to easily develop our own reports and triggers against the logs. For this, we will be using rsyslog with a third-party program, LogAnalyzer. For the purposes of this document, we will assume that you already have a MySQL database configured and running on a separate server.
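The core of this setup is rsyslog writing received messages into MySQL through its `ommysql` output module. A minimal sketch of the central server's configuration might look like the following; the database host, schema name, user, and password are placeholders, and this assumes the rsyslog-mysql package (which ships a `createDB.sql` schema script) is installed.

```conf
# /etc/rsyslog.conf fragment on the central log server -- credentials are placeholders
$ModLoad ommysql        # MySQL output module from the rsyslog-mysql package

# Write every received message into the Syslog database
# (schema created by the createDB.sql script shipped with rsyslog-mysql)
*.* :ommysql:db.example.com,Syslog,rsyslog,secretpassword
```

LogAnalyzer then points at the same database and reads from the `SystemEvents` table that the default schema creates.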
As the battle against spam continues on the mail relay that serves our shared hosting services, we started working on a new way to stop spam from being sent. We have some customers who end up with compromised sites over and over again. These customers often don't care that their site is infected and get irritated when we suspend their accounts. Some of them have asked us to simply remove email permissions from their accounts.
We have been fighting a lot of spam recently on our web hosting service. We decided the best route was to set up a mail gateway on a separate server and run spam scans on all outgoing mail with SpamAssassin, discarding the junk. This helps prevent our servers from appearing on blacklists and helps keep customers happy. This tutorial walks through the process we used to set up our mail gateway. We are running CentOS 6.6 x64 with Postfix and SpamAssassin.
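The usual way to wire SpamAssassin into Postfix is a content filter defined in master.cf: incoming SMTP traffic is tagged for the filter, piped through `spamc`, and re-injected with `sendmail`. A minimal sketch, assuming `spamd` is running and a dedicated `spamd` user exists (both are assumptions for illustration):

```conf
# /etc/postfix/master.cf fragment -- tag mail arriving over SMTP for the filter
smtp      inet  n       -       n       -       -       smtpd
  -o content_filter=spamassassin

# Hand each message to spamc, then re-inject the scanned result
spamassassin unix -     n       n       -       -       pipe
  user=spamd argv=/usr/bin/spamc -f -e
  /usr/sbin/sendmail -oi -f ${sender} ${recipient}
```

The full tutorial covers tuning SpamAssassin's score threshold and deciding what to do with messages it flags.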
One of the most important aspects of system administration these days is monitoring the traffic on your server. Many hosting providers impose limits on how much you can transfer per month. If you go over these limits, it’s not really a problem from the data center’s point of view, but you will usually end up getting slapped with a fee per GB you transfer in excess of your limits. This can add up quickly if you have a busy site.
We use materialized views in Oracle to copy data from our production database to our data warehouse for Cognos reporting. These materialized views all refresh overnight. This works great, but over time, some of the tables have become quite large. One troublesome table has over 12 million rows and was taking an hour to refresh each night.
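One common fix for slow full refreshes, sketched below under assumptions about our setup, is switching the view to a fast (incremental) refresh backed by a materialized view log on the master table. The table name `big_table` and the database link `prod_link` are placeholders.

```sql
-- On the production (master) database: record row-level changes so the
-- warehouse can pull only the deltas instead of recopying 12M+ rows.
CREATE MATERIALIZED VIEW LOG ON big_table WITH PRIMARY KEY;

-- On the warehouse: a fast-refreshable copy over a database link
CREATE MATERIALIZED VIEW big_table_mv
  REFRESH FAST ON DEMAND
  AS SELECT * FROM big_table@prod_link;

-- The nightly job then applies only the changes captured in the log
BEGIN
  DBMS_MVIEW.REFRESH('BIG_TABLE_MV', method => 'F');
END;
/
```

Fast refresh has restrictions (the query must be fast-refresh eligible; `DBMS_MVIEW.EXPLAIN_MVIEW` reports why a view is not), so whether this applies depends on the view's defining query.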