SECURITY SHELF

Clearing the path for logging 

Security logs are not a new requirement. The Trusted Computer System Evaluation Criteria (commonly referred to as the Orange Book) was published in 1985 and explained, “A trusted system must be able to record the occurrences of security-relevant events in an audit log.” Thirty years later, many products still do not log security-related events.

Back in 1985, it made sense to suggest that administrators regularly review logs. However, the volume of logs produced today makes doing so virtually impossible. As a result, a growing category of products exists to collect, search, and analyze logs. The newest products apply Big Data techniques to cope with high log volumes.

From a security perspective, three significant challenges exist: obtaining good data in formats that can be programmatically processed, identifying anomalous events and behaviour patterns indicative of a security problem, and quickly taking action.

Simply parsing logs from different products and normalizing them into a standard format for analysis is a tremendous challenge. As Karen Kent and Murugiah Souppaya explained almost ten years ago in NIST Special Publication 800-92:

“Many of the log source types use different formats for their logs, such as comma-separated or tab-separated text files, databases, syslog, Simple Network Management Protocol (SNMP), Extensible Markup Language (XML), and binary files. Some logs are designed for humans to read, while others are not; some logs use standard formats, while others use proprietary formats. Some logs are created not for local storage in a file, but for transmission to another system for processing; a common example of this is SNMP traps. For some output formats, particularly text files, there are many possibilities for the sequence of the values in each log entry and the delimiters between the values (e.g., comma-separated values, tab delimited values, XML).”

While several standards have been proposed over the past three decades, very little progress has been made, as vendors have yet to reach consensus and adopt one.

From a technical perspective, this lack of progress is difficult to explain. Authentication and authorization attempts can be described in terms of a source, destination, subject, object, and result. Every access to a file, printer, or other device can be described in terms of read, write, create, delete, or rename. Virtually every firewall reports on source, destination, and disposition. Anti-virus products know which user (subject) is accessing which file (object) when a malware signature is detected. Data Loss Prevention products have similar information. Operating systems know every process that is started, which process started it, the associated user ID, and what resources it accesses.
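To illustrate, the sketch below maps those common elements onto a single normalized record. The field names, the schema, and the sample comma-separated firewall line are assumptions for illustration, not any vendor’s actual format:

```python
# A minimal sketch of a normalized event schema built from the fields
# most products already have: source, destination, subject, object,
# action, and result. All names here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class SecurityEvent:
    source: str       # originating host or IP
    destination: str  # target host or service
    subject: str      # user or process performing the action
    obj: str          # resource acted upon ("object" is a Python builtin)
    action: str       # e.g. read, write, create, delete, rename, connect
    result: str       # e.g. allow, deny, success, failure

def parse_firewall_csv(line: str) -> SecurityEvent:
    """Map one hypothetical comma-separated firewall record onto the schema."""
    src, dst, disposition = line.strip().split(",")
    # A firewall rarely knows the user, so the subject is left unknown.
    return SecurityEvent(source=src, destination=dst, subject="unknown",
                         obj=dst, action="connect", result=disposition)

print(parse_firewall_csv("10.0.0.5,203.0.113.7,deny"))
```

The point is not the specific fields, but that one small record type can carry events from firewalls, anti-virus, DLP, and operating systems alike.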

In a world where sophisticated protocols and data structures are created and refined on a regular basis, there is no technical reason that every basic action a user takes, from logging on to their PC at the start of the workday to logging off at the end, isn’t recorded in a standard format and transmitted to a central logging repository. The industry simply needs to come together and agree on a format, and customers must then demand that vendors comply.
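As a rough illustration of how little plumbing this requires with today’s tooling, the following sketch serializes a logon event as JSON and ships it over standard syslog using only the Python standard library. The field names follow the sketch above, and localhost stands in for the central collector:

```python
# A hedged sketch of shipping a normalized event to a central repository.
# The event is serialized as JSON and sent over standard syslog (UDP 514).
# The field names and collector address are assumptions for illustration.
import json
import logging
import logging.handlers

logger = logging.getLogger("security")
logger.setLevel(logging.INFO)
# "localhost" stands in for the organization's central log collector.
logger.addHandler(logging.handlers.SysLogHandler(address=("localhost", 514)))

event = {"subject": "alice", "action": "logon",
         "object": "workstation-42", "result": "success"}
logger.info(json.dumps(event))
```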

Identifying security incidents from log files has often been described as searching for the proverbial needle in a haystack. However, today’s reality more closely resembles searching for a needle in a needle-stack. Dropped packets at the perimeter firewall are mostly background noise, and counting them usually identifies only the least sophisticated threats. Targeted attacks quietly slip through porous perimeters and do their damage from the inside; to internal systems, they appear to be internal users.

“When you look at the long string of breaches over the past year they have one common element, existing accounts were used to gain access to sensitive data and systems beyond the normal scope of activity for that user,” explained Eric Ouellet, vice president of strategy at Bay Dynamics. “This has become the preferred method for attackers because they know organizations have yet to invest in tools that provide visibility into the actual behaviours of users. Focusing on users instead of individual events aligns with the way organizations naturally operate and provides significant leverage in investigations. Each user generates thousands of events every day. Correlating events to users and then identifying when they deviate from the norm effectively reduces the scope of work from hundreds of thousands of events to a few hundred or thousand users. You can build a team to review a few thousand users, but you can not build a team to review hundreds of thousands of events.”
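A minimal sketch of that user-centric rollup might look like the following: events are reduced to per-user daily counts, and a user is flagged when today’s volume deviates sharply from their own baseline. The simple mean and standard-deviation model, the threshold, and the sample data are illustrative assumptions, not a production recipe:

```python
# A toy baseline-and-deviation check over per-user daily event counts.
# Real products model far more than raw volume, but the reduction from
# hundreds of thousands of events to a short list of users is the same.
from statistics import mean, stdev

def flag_anomalous_users(daily_counts, threshold=3.0):
    """daily_counts maps user -> list of events per day; last entry is today."""
    flagged = []
    for user, counts in daily_counts.items():
        baseline_days, today = counts[:-1], counts[-1]
        if len(baseline_days) < 2:
            continue  # not enough history to establish a baseline
        baseline, spread = mean(baseline_days), stdev(baseline_days)
        if spread and (today - baseline) / spread > threshold:
            flagged.append(user)
    return flagged

per_user_counts = {
    "alice":   [120, 130, 110, 125, 118],  # steady day-to-day activity
    "mallory": [115, 122, 119, 117, 960],  # sudden spike in event volume
}
print(flag_anomalous_users(per_user_counts))  # ['mallory']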

Then there is the challenge of doing something about it. A security analytics system should be capable of automatically baselining behaviours and prioritizing incidents. It should triage events into different remediation paths, such as automated emergency action, initiating investigations, and assigning just-in-time training. Some major security breaches last year could have been averted if the associated alarms had been correctly prioritized and immediately acted upon.
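As a toy example of that triage, the sketch below routes a prioritized incident score to one of the three remediation paths named above; the score bands are arbitrary assumptions for illustration:

```python
# Route a prioritized incident to a remediation path. The 0-to-1 score
# and the band boundaries are illustrative assumptions.
def triage(incident_score: float) -> str:
    if incident_score >= 0.9:
        return "automated emergency action"  # e.g. disable account, isolate host
    if incident_score >= 0.6:
        return "initiate investigation"      # queue for a security analyst
    return "assign just-in-time training"    # likely a policy slip, not an attack

for score in (0.95, 0.7, 0.2):
    print(score, "->", triage(score))
```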

If a PC is suddenly being used to access thousands of documents in the middle of the night, or a server establishes a connection and starts exfiltrating data, it may be too late by the time a security analyst receives an email report and acts on it. Cybersecurity systems require far more automation, and security teams need better analysis tools. Accomplishing both requires good data in a standard format.

Have a security question that you’d like answered in a future column? Email eric.jacksch@iticonline.ca.
