Feature

IT security analytics: the before, during and after

The scope of IT security analytics is broad. In an ideal world, threat intelligence, provided in advance, would prevent IT security incidents from occurring in the first place.

However, complete mitigation will never be possible and incidents are inevitable, often with associated data breaches.


Post-event clear-up requires intelligence gathering, too. The quicker that can be done, the better the chance of finding the smoking gun.

The net result of trying to speed up incident response is an increasing capability to use intelligence while an event is occurring. As one supplier, Cisco’s Sourcefire, puts it: the need for security intelligence is “before, during and after” an incident.

In the past, there have been distinct products in each area, but the boundaries between them are blurring as suppliers extend their reach, in some cases competing with each other where they previously did not, but also co-operating to share intelligence. The more timely that intelligence can be gathered, the more likely it is that it will be put to use for proactive defence, rather than post-event clear-up. This is the area of real-time security analytics.

Blacklists and whitelists

First, let’s look at the before. Threat intelligence is the lifeblood of the IT security industry. It includes blacklists of common spam emails, malware signatures and dodgy URLs, as well as whitelists of known good stuff (applications you want your users to run or websites you are happy for them to visit). All this is still a key part of protecting IT users and relies on the vast threat intelligence-gathering networks that sit at the core of most IT security companies. Examples include Cisco’s Advanced Malware Protection (from its Sourcefire acquisition, now integrated across the Cisco security portfolio); the Symantec Protection Network; McAfee’s Global Threat Intelligence; and Trend Micro’s Smart Protection Network.
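
To make the before concrete, here is a minimal Python sketch of how a gateway might consult such lists. The feed contents, the example hash and the verdict policy are illustrative assumptions, not any particular supplier’s implementation.

    # Blacklisting blocks what is known to be bad; whitelisting only
    # permits what is known to be good. Both lists here are hypothetical.
    BLACKLISTED_HOSTS = {"badsite.example.com", "malware-host.example.net"}
    WHITELISTED_HASHES = {"9f2c1ab4"}  # stand-in digests of approved binaries

    def url_verdict(host: str) -> str:
        """Blacklisting: block known-bad hosts, allow everything else."""
        return "block" if host in BLACKLISTED_HOSTS else "allow"

    def app_verdict(digest: str) -> str:
        """Whitelisting inverts the logic: only known-good binaries run."""
        return "run" if digest in WHITELISTED_HASHES else "deny"

    print(url_verdict("badsite.example.com"))  # -> block
    print(app_verdict("9f2c1ab4"))             # -> run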

All IT security suppliers have access to such resources at some level. Part of the power of these networks is that they are kept up to date by gathering intelligence from, and sharing it with, huge customer bases. But many now accept that intelligence gathered in advance is never going to stop the most insidious threats. However good such networks are, unwanted security breaches will still occur.

So let’s now look at what may need to be done after: the worst-case scenario, when an event has occurred and systems and/or data have been compromised. The requirement now is to understand the extent of the damage. This is the world of IT forensics: preparing reports for internal investigations, responding to regulators and, in some cases, communicating with crime investigators. Examples of relevant incidents include the discovery of unknown malware (which may or may not have been egressing data), evidence of hacking and, in some cases, the suspicious behaviour of employees.

Clues to what has happened

Well-established suppliers of forensics include Guidance Software, Access Data, Stroz Friedberg and Dell Forensics. In 2013, Guidance released a new version of its EnCase product, called EnCase Analytics. Many of the clues to what has happened lie on the servers, storage systems and end-user devices, so although EnCase Analytics is a network-based tool, these endpoints are its focus. The volumes of data involved can be huge and, as Guidance puts it, this is where “big data meets digital investigations”.


To compile its reports, EnCase Analytics needs kernel-level access across multiple operating systems to inspect registries, system data, memory, hidden data, and so on. Network and security appliance log files are also of use. To access information from these, Guidance can take feeds from SIEM (security information and event management) tools, which are discussed below. Guidance has hundreds of enterprise customers that use its tools. One of the benefits is being able to offer ready-customised reports for specific regulatory regimes, such as PCI DSS, the UK Data Protection Act and the mooted EU Data Protection Law.

Access Data’s Cyber Intelligence and Response Technology (CIRT) provides host and network forensics, as well as analysis of the trickier-to-address volatile memory, processing data collected from all these areas to provide comprehensive insight into incidents. With some new capabilities, Access Data is repackaging this as a platform it calls Insight, to provide continuous automated incident resolution (CAIR).

New capabilities

The new capabilities include improved malware analysis (what might this software have done already, and what could it do in the future?), more automated responses (freeing staff to focus on exceptions) and real-time alerts. This all goes well beyond historical forensics, moving Access Data from after to during, and even giving it some before capability. Like Guidance and other suppliers, Access Data relies on SIEM suppliers for some of its intelligence.

In the past, SIEM has also typically been an after technology. Most SIEM suppliers come from a log management background, which is the collection and storage of data from network and security system log files for later analysis.
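
As a simple illustration of those log-management roots, the sketch below parses a syslog-style line into structured fields and files it away for later analysis. The log format and the in-memory store are assumptions for the purpose of illustration; real products support hundreds of device-specific formats.

    import re

    # Parse "timestamp host app: message" lines into named fields.
    LOG_PATTERN = re.compile(r"(?P<ts>\S+ \S+) (?P<host>\S+) (?P<app>\w+): (?P<msg>.*)")

    log_store = []  # stand-in for an indexed log archive

    def ingest(line: str) -> None:
        """Turn one raw log line into a structured record and archive it."""
        match = LOG_PATTERN.match(line)
        if match:
            log_store.append(match.groupdict())

    ingest("2014-03-28 10:02:17 fw01 firewall: DENY tcp 198.51.100.9 -> 10.0.0.5:445")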

Many of the major IT security suppliers have entered the SIEM market via acquisitions: HP of ArcSight in 2010, IBM of Q1 Labs in 2011, McAfee of NitroSecurity in 2011, EMC’s RSA of NetWitness in 2011 and KEYW’s Hexis of Sensage in 2012. Other suppliers include LogRhythm, Red Lambda and Trustwave. Splunk is often included in the list of SIEM suppliers, but its focus is even broader, using IT operational intelligence to provide commercial as well as security insight.

As with forensics, the volumes of data are so big that SIEM is increasingly referred to as a “big data problem”. It fits the definition well, if you go by the five Vs of big data: volume, variety, velocity, value and veracity. There is certainly lots of data involved (volume) and it comes from a range of sources (variety), often being enriched with data from other sources (for example, user and device information, content classification data and threat intelligence networks). However, it is the increasing capability to use SIEM data in real time that ticks the velocity box, and this is turning SIEM into a during technology. Anything that minimises the impact of security incidents clearly has value, and veracity comes from the truth exposed through deep insight.
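
The velocity point is easiest to see in miniature: the sketch below enriches an event as it arrives, with user information and a threat intelligence lookup, rather than leaving it in a log file for later analysis. The event fields, directory and threat feed are hypothetical.

    THREAT_FEED = {"203.0.113.50"}                    # known-bad IPs from an intelligence network
    USER_DIRECTORY = {"jsmith": {"dept": "finance"}}  # enrichment source (variety)

    def enrich_and_flag(event: dict) -> dict:
        """Add context to a raw event and attach a simple risk flag."""
        event["user_info"] = USER_DIRECTORY.get(event.get("user"), {})
        event["known_bad_source"] = event.get("src_ip") in THREAT_FEED
        return event

    alert = enrich_and_flag({"user": "jsmith", "src_ip": "203.0.113.50", "action": "file_download"})
    if alert["known_bad_source"]:
        print("real-time alert:", alert)  # acted on during, not after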

Plenty of measures

To use intelligence from a range of sources in real time to identify and mitigate threats as they occur is the holy grail of IT security. Of course, there are plenty of measures that can be taken: running suspicious files in sandboxes (witness the rapid growth of supplier FireEye); only allowing known good files to run (for example, with whitelisting technology from Bit9, another supplier that has upped the ante for the during with its recent merger with Carbon Black); blocking access to dangerous areas of the web, a constantly moving target (URL filtering from Websense, Proofpoint and others); or judicious checking of content in use (content inspection and redaction from Clearswift and others in the data loss prevention (DLP) sector).

More extensive protection

These are all point products that help towards the broader aspiration of real-time mitigation. Supplementing these with analytics across a wide range of sources during an attack provides more extensive protection. Examples, illustrated in the sketch after this list, include:

  • Identifying unusual traffic between servers, which can be a characteristic of undetected malware searching data stores;
  • Matching data egress from a device with access records from a suspicious IP address, user or location;
  • Preventing non-compliant movement of data (which may be simply down to an employee being ignorant of the rules);
  • Linking IT security events with physical security systems (for example, maintenance of plant infrastructure restricted to certain employees known to be on the premises);
  • Identifying unusual access routes (for example, some databases are normally accessed only via certain applications and not directly by users).
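
The toy sketch below shows the flavour of the first two examples: flagging server-to-server flows outside a learned baseline, and tying a large data egress to a suspicious source address. All names, thresholds and baselines are assumptions for illustration only.

    KNOWN_FLOWS = {("web01", "db01"), ("app01", "db01")}  # expected traffic pairs
    SUSPICIOUS_IPS = {"198.51.100.77"}
    EGRESS_THRESHOLD_MB = 500

    def check_flow(src: str, dst: str) -> None:
        """Flag traffic between servers that falls outside the baseline."""
        if (src, dst) not in KNOWN_FLOWS:
            print(f"unusual traffic: {src} -> {dst} (possible malware probing data stores)")

    def check_egress(session: dict) -> None:
        """Match a large outbound transfer with a suspicious remote address."""
        if session["mb_sent"] > EGRESS_THRESHOLD_MB and session["remote_ip"] in SUSPICIOUS_IPS:
            print(f"egress alert: {session['mb_sent']}MB sent to {session['remote_ip']}")

    check_flow("db01", "db02")  # not in the baseline -> flagged
    check_egress({"mb_sent": 900, "remote_ip": "198.51.100.77"})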

So, in general terms, the news is good. The suppliers that aim to protect IT infrastructure are upping the ante in the arms race with attackers. More and more are making use of their ability to process and analyse large volumes of data in real time to better protect IT systems. But the bad news is that there is no silver bullet and never will be.

A range of security technologies will be required to provide state-of-the-art defences and there will be no standing still. Those who would steal your data are moving the goalposts all the time and they will be doing that before, during and after their attacks.

Bob Tarzey is analyst and director at Quocirca.



This was first published in March 2014

 
