Monitoring software often deals with human-readable data. Whether you are monitoring a Web site's availability or its search results output, you have to fetch the data and parse it. Although this approach may seem to fit all needs, it doesn't.
To begin with, Web sites are most often created for human beings. That is, content is generated to suit human visitors: to make them comfortable and to present data in the manner most useful to human readers.
Also, these pages may be quite complex, and generating them involves a number of services, such as database engines. When a monitor watches regular Web pages, the site engine performs a lot of work that is wasted on it. To satisfy a monitor, a simpler and far less resource-intensive approach can be used.
Raw and pure data
I suppose you see the idea: let's generate a simpler, shorter Web page prepared specifically for monitoring software, for computer processing. First, we won't consume as many resources; second, we can arrange the data in a manner that makes its analysis much more efficient.
The implementation may vary. Personally, I use a single string containing one character for each service I plan to watch. Processing it becomes a trivial task; and if such a 'state summary' isn't regenerated for every request, it saves even more time and resources.
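A minimal sketch of this 'state summary' idea, under assumed conventions not spelled out in the article: the page body is a single string such as "UUDU", one character per watched service, with 'U' meaning up and 'D' meaning down.

```python
# Hypothetical service list; order matches the characters in the summary.
SERVICES = ["web", "db", "mail", "queue"]

def parse_summary(summary: str) -> dict:
    """Map each character of the summary string to its service,
    True if the service reported itself as up ('U')."""
    return {name: ch == "U" for name, ch in zip(SERVICES, summary)}

status = parse_summary("UUDU")
# status -> {"web": True, "db": True, "mail": False, "queue": True}
```

The monitor only has to compare a handful of characters per poll, instead of parsing a full HTML page.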
Now imagine monitoring other services. Opening dedicated ports for them, or creating tunnels and secured connections able to traverse firewalls and other restrictions, can be both unsafe and tedious. Instead, we can use the same approach: generate state reports as simple files that can be accessed over another protocol, such as HTTP, and that contain enough data to notify the monitoring software of all important state changes.
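This could look roughly as follows: the monitored host dumps its summary into a static file that its existing Web server can serve with no dynamic work at all, and the monitor polls it over plain HTTP. The file path and URL here are illustrative assumptions.

```python
import urllib.request

def write_report(path: str, summary: str) -> None:
    """On the monitored host: write the state summary into a static
    file inside the Web server's document root."""
    with open(path, "w") as f:
        f.write(summary)

def fetch_report(url: str) -> str:
    """On the monitoring host: read the summary over plain HTTP,
    e.g. fetch_report("http://example.com/status.txt")."""
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode().strip()
```

No new ports, tunnels, or firewall exceptions are needed: the report rides on HTTP, which is almost always already open.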
Additional security can be achieved if the data sent are encrypted, and/or if a special data representation is used that obscures the actual data. And if this encrypted report page is cached and regenerated only at intervals, it saves resources as well.
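One possible way to protect the report (an illustration, not the article's own scheme) is to append an HMAC computed with a shared secret. The content stays opaque to a casual observer unless they know the format, and the monitor can verify the report is genuine and untampered; the secret and field separator are assumptions.

```python
import hashlib
import hmac

SECRET = b"shared-secret"  # hypothetical key known to both sides

def sign_summary(summary: str) -> str:
    """Append an HMAC-SHA256 tag to the state summary."""
    mac = hmac.new(SECRET, summary.encode(), hashlib.sha256).hexdigest()
    return f"{summary}:{mac}"

def verify_summary(report: str):
    """Return the summary if the tag checks out, None otherwise."""
    summary, _, mac = report.rpartition(":")
    expected = hmac.new(SECRET, summary.encode(), hashlib.sha256).hexdigest()
    return summary if hmac.compare_digest(mac, expected) else None
```

A forged or corrupted report fails verification, so the monitor can distinguish "service is down" from "someone is tampering with the report".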
You also make all the data read-only, preventing any unauthorized data access while still allowing the monitoring software to read the internals of your site.
You will only have to parse "usual", human-readable pages when you monitor someone else's sites.