June 2010 Archives

Monitoring security

Security camera

Is the security for real?

Do you keep the security level high enough?

The question seems simple, but if the answer is yes, the next question will be "What is high enough?"

The security camera in the photo is a mock-up: an imitation, used to deter possible criminals. It doesn't watch anything, even though it's installed. Are your security systems for real, is the monitoring you perform for real? The answer is not obvious, even if you have installed network monitoring software and know exactly what to monitor.

Nowadays, it's not enough to monitor the servers alone, whatever services you watch on them. The fact that a service is available and replies with the expected data doesn't mean it's in a good state.

For example, if a piece of installed software is an old version and can be compromised, security is weak. So monitoring the vital systems themselves is not enough.

Hidden flaws of security

The problem of using insecure software isn't limited to insecure, out-of-date functions. At times a less secure configuration can be a possible cause of system malfunction or failure. How do you determine that there are flaws in the configuration?

Neither aspect of the problem has a fully automated solution. Some software projects have a mailing list or other means to notify users of out-of-date components or security threats. Most, however, do not. The only way to stay abreast of events is to follow the news on security-related forums and software sites and react immediately to every new threat published.

As for studying log files, it's relatively easy to detect how many times a given string occurs in a given log file (for example, the ssh log registers all login attempts, so if anything strange happens, it's better to be notified as soon as possible).
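As a sketch of that kind of log check, here is a minimal Python example that counts failed SSH login attempts per source IP. The message format assumed is the typical OpenSSH "Failed password" line from an auth log such as /var/log/auth.log; adjust the pattern and path for your system.

```python
import re

# Typical OpenSSH failure message; "invalid user" appears for unknown accounts.
FAILED = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")

def count_failed_logins(lines):
    """Return a dict mapping source IP -> number of failed attempts."""
    counts = {}
    for line in lines:
        m = FAILED.search(line)
        if m:
            ip = m.group(2)
            counts[ip] = counts.get(ip, 0) + 1
    return counts

# A few sample log lines instead of reading /var/log/auth.log:
sample = [
    "Jun  1 10:00:01 host sshd[123]: Failed password for root from 203.0.113.5 port 4242 ssh2",
    "Jun  1 10:00:03 host sshd[123]: Failed password for invalid user admin from 203.0.113.5 port 4243 ssh2",
    "Jun  1 10:00:07 host sshd[124]: Accepted password for alice from 198.51.100.7 port 50000 ssh2",
]
print(count_failed_logins(sample))  # {'203.0.113.5': 2}
```

A monitor could run such a check on a schedule and raise an alert when any single IP crosses a threshold.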

Also, it is good advice to install intrusion detection software such as Snort, update its rules on a regular basis and use its notification features to measure the security risk level and/or raise alerts.

Monitoring software can be the dashboard of your entire security setup: it's relatively easy to report all the important news to a single command centre and raise a relevant alert condition when necessary.

Security isn't a one-time set of actions. It's a philosophy, a discipline and everyday, routine work: researching the security world and being alerted before possible flaws are exploited.

Keep it simple, sage

Jigsaw keyhole

Reinventing the wheel

Simpler solutions aren't always obvious. When looking for a way to monitor a parameter, one is always tempted to re-invent the wheel, e.g. to create a custom script every time the existing monitor types don't cover the case.

That's definitely not the best idea. First of all, it takes time. Second, it might be an inefficient solution. Third, it might be non-portable: if you wish to create the same monitor type for another host, you might have to write a similar script from scratch.

When it comes to monitoring the simplest parameters of the server, such as CPU load, memory usage and so on, there is no need to create a complex scheme of running remote scripts/applications and reporting data back to the monitor.

The magic abbreviation is SNMP, Simple Network Management Protocol. Let's explain briefly how it can be used to monitor a number of system-level parameters of a server. A Linux-powered server is assumed, although in this case most other operating systems can be monitored through the same facility.

SNMP

The SNMP daemon isn't running by default; refer to longer how-tos such as Monitoring server performance for more details on installing the daemon.

Take care when setting up the daemon. It supports several protocol versions; if you don't wish to handle the security-related setup of SNMP version 3, you may use v1 or v2c, but keep in mind that their security level is basic. If you don't restrict, by other means, who is granted access to the daemon (restrict it to localhost if you monitor from the same server), you are virtually handing all the important data to whoever wishes to gain unauthorized access to the server.
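For instance, with Net-SNMP's snmpd a minimal read-only v2c setup restricted to localhost might look like the sketch below; the community name "monitoring" is a placeholder of our own, not anything the system provides.

```
# /etc/snmp/snmpd.conf (sketch; Net-SNMP assumed)

# Listen on the loopback interface only
agentAddress udp:127.0.0.1:161

# Read-only community, accepted from localhost only
rocommunity monitoring 127.0.0.1
```

Restart the daemon after editing the file so the new access rules take effect.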

Try to restrict access to read-only; there is hardly ever a need to grant write access to monitoring software.

It's easy to find the OIDs (object identifiers) of the data you wish to monitor; e.g., to allow viewing general system information such as RAM usage, grant access to the .1.3.6.1.4.1 hierarchy.

Most of the data you could use are numeric; thus, the SNMP-based monitors available in IP Host Network Monitor software can be used to create very precise monitors that reflect the level of resource usage without creating server-based scripts and communicating with them.

Write access

Note that certain variables (OIDs) can be writable, thus allowing you to control, to some extent, the device your monitoring software is connected to via SNMP.

Note also that SNMP is supported by many devices, such as routers, and it can be used, say, to programmatically restrict or even close access to them, or to set usage limits based upon the parameters being monitored. For example, you can restrict or limit transfer speed for ethernet cards if the amount of data transferred crosses a limit. However, it is strongly advised that network monitoring software never modify the settings of any device it monitors.

Data cobweb: let computers talk to computers

Matrix screen

Inhuman interface

Monitoring software often deals with human-readable data. Whether you are monitoring a Web site's availability or its search results output, you have to parse the data and analyse it. Although this may seem to fit all needs, it doesn't.

To begin with, Web sites are most often created for human beings: content is generated to suit human visitors, to make them comfortable, to present data in the manner most useful for human readers.

Also, these pages may be quite complex, and their creation involves a number of services, such as database engines. When a monitor watches regular Web pages, the site engine performs quite a lot of useless work. To satisfy a monitor, a simpler and much less resource-hungry approach can be used.

Raw and pure data

I suppose you've got the idea: let's generate a simpler, shorter Web page prepared for monitoring software, for computer processing. First, we won't use as many resources; second, we can arrange the data in a manner that makes its analysis much more efficient.

The implementation may differ. Myself, I use a single string that contains a number of characters, one per service I plan to watch. Processing them becomes a simple task; and if such a 'state summary' isn't regenerated for every request, it saves further time and resources.
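A minimal sketch of such a state summary might look like this in Python; the service list, their order and the U/D/W character vocabulary are this example's own convention, not a standard.

```python
# One character per watched service: 'U' up, 'D' down, 'W' warning.
# The order of SERVICES fixes the position of each character.
SERVICES = ["web", "db", "mail", "dns"]

def encode_states(states):
    """states: dict service -> 'U'/'D'/'W'; returns the summary string."""
    return "".join(states.get(name, "?") for name in SERVICES)

def decode_states(summary):
    """Turn a summary string back into a service -> state dict."""
    return dict(zip(SERVICES, summary))

s = encode_states({"web": "U", "db": "U", "mail": "W", "dns": "U"})
print(s)                 # UUWU
print(decode_states(s))  # {'web': 'U', 'db': 'U', 'mail': 'W', 'dns': 'U'}
```

The server publishes the four-character string on a static page; the monitor fetches it, decodes it and alerts on any character other than 'U'.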

Now imagine monitoring other services. Opening ports for them, or creating tunnels and secured connections able to traverse firewalls and/or other restrictions, can be unsafe and tedious. Instead, we can use the same approach: generate state reports as simple files that can be accessed via another protocol, such as HTTP, and that contain enough data to notify the monitoring software of all important state changes.

Additional security can be achieved if the data sent are encrypted, and/or if a special data representation is used, obscuring the actual data received. If this report page is cached and only regenerated at intervals, it saves server resources as well.

You make all the data read-only, thus preventing any possible unauthorized data access; the monitoring software reads only what you choose to expose, not the intrinsics of your site.

You will only have to parse "usual", human-readable pages when you monitor someone else's sites.

Common traits in network monitoring and DoS

The Siege of Antioch

Pros and cons of monitoring

The disadvantage of not monitoring network resources is obvious: in case of any connectivity or functionality problem you are not warned, and a number of unpleasant consequences may follow.

All right, so we accept that we need monitoring. The next question is: can monitoring itself harm your resources (your site, for example) in any way? The not-so-obvious answer is yes, it can.

Let's imagine we have a site and let's analyze who accesses it and how the site can react to this.

Bots, your zealous readers

Who mostly reads your site? Unless yours is a popular blog with thousands of readers, the answer is: bots. Search spiders, RSS bots and whatever else. Human readers can at times be in a severe minority.

Spiders can be a nuisance, especially new, mostly misbehaving ones. They do not always obey robots.txt rules and thus can exhaust your bandwidth, overload the server and cause many other malfunctions. However, they often supply an identifier (the 'User-Agent' header) and can be told apart from other readers, especially if the crawlers' IP addresses are known.

Faceless monitoring

Here we come to a very interesting conclusion: if a monitoring process doesn't behave like a browser, or at least like a well-known bot (if it sends no HTTP headers and connects to the server periodically, at known time intervals), it can be viewed as harmful.

Anonymous spiders, those harvesting email addresses and other information, are seldom written in a well-behaved manner. They do not respect robots.txt (of course), and they have a habit of overwhelming the server with requests.

In other words, if your monitoring process doesn't introduce itself with a User-Agent, accesses the site too often and doesn't throttle its data transfer speed, it can be viewed as a data harvester or even an attacker. A common DoS technique is exhausting bandwidth, when a number of processes request large pages many times in quick succession.

So, if you plan to monitor HTTP/HTTPS resources, make sure that
— your monitoring software supplies HTTP headers, providing a unique id;
— your monitoring software throttles the connection speed and does not exhaust too much of the channel;
— your monitoring software adds small random intervals to its polling schedule.
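The three points above can be sketched in Python with the standard library alone; the User-Agent string and URLs are placeholders, and the base interval and jitter values are arbitrary examples, not recommendations.

```python
import random
import urllib.request

# A placeholder identifier; real monitors should use their own name
# and a URL explaining what the bot is.
USER_AGENT = "MyMonitor/1.0 (+http://example.com/monitor-info)"

def build_request(url):
    """A request that identifies the monitor instead of arriving anonymously."""
    return urllib.request.Request(url, headers={"User-Agent": USER_AGENT})

def next_poll_delay(base_seconds=300, jitter_seconds=30):
    """Base interval plus a small random offset, so polls are not
    exactly periodic and look less like an automated probe."""
    return base_seconds + random.uniform(0, jitter_seconds)

req = build_request("http://example.com/status")
print(req.get_header("User-agent"))  # MyMonitor/1.0 (+http://example.com/monitor-info)
print(next_poll_delay())             # e.g. somewhere between 300 and 330
```

Throttling the transfer speed itself would mean reading the response body in small chunks with pauses in between; that part is left out of this sketch.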

About this Archive

This page is an archive of entries from June 2010 listed from newest to oldest.

April 2010 is the previous archive.

Find recent content on the main index or look in the archives to find all content.