Nearly a year after the discussions that led to the founding of the Anti-Malware Testing Standards Organization, testing seems to be the only thing anyone wants to talk to me about, in and out of my company. So what progress has been made? There is certainly a need for rationalization and standardization: while the industry has been quick to complain about the low level of testing competence, it has failed to provide the sort of information prospective testers need about good testing practice. AMTSO provides an opportunity for the testing and anti-malware industries, as well as their respective customers and independent stakeholders (academia, the media, and independent researchers) to lay down some guidelines and a central, authoritative information resource.
However, it is also a chance to pull testing away from the wild and woolly 1990s, when malware detection was based on static analysis: signatures and generic signatures (the latter detecting multiple variants), and passive heuristics, where the malicious potential of a file is assessed by examining its code rather than by observing its behaviour.
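The static techniques just described can be sketched in a few lines. This is purely an illustrative toy, not any real product's engine: the signature byte patterns, heuristic markers, weights and threshold are all invented for the example.

```python
# Toy static scanner in the style described above: everything here is
# examined without ever executing the file.

# A "signature" is a byte pattern known to occur in one specific sample;
# a "generic signature" is a looser pattern matching multiple variants.
# Both patterns below are hypothetical.
SIGNATURES = {
    b"\xde\xad\xbe\xef": "Example.Family.A",       # hypothetical exact signature
    b"\x90\x90\x90\x90\xeb": "Example.Variant.gen", # hypothetical generic signature
}

# A passive heuristic scores suspicious static features of the code;
# the markers and weights are made up for illustration.
HEURISTIC_MARKERS = {
    b"CreateRemoteThread": 2,  # process-injection API name embedded in the binary
    b"cmd.exe /c del": 3,      # embedded self-deletion command line
}

def scan_static(data: bytes, threshold: int = 3):
    """Return a detection name, a heuristic verdict, or None."""
    # Known-malware detection: exact and generic signature matching.
    for pattern, name in SIGNATURES.items():
        if pattern in data:
            return name
    # Passive heuristics: sum the weights of suspicious static features.
    score = sum(w for marker, w in HEURISTIC_MARKERS.items() if marker in data)
    if score >= threshold:
        return "Heuristic.Suspicious"
    return None
```

The point of the sketch is that every verdict comes from inspecting bytes on disk; nothing is run, which is exactly the property that makes this style of detection easy to exercise with an on-demand test.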
Most current testing is still based on the assumption that this baseline technology is what current anti-malware products use: even where a test group attempts to measure proactive detection, it usually does so using on-demand scanners (sometimes even command-line scanners) in a homogeneous testing environment. This is convenient for testers, and convenience matters: even barely adequate anti-malware detection or performance testing is expensively resource-intensive. But it misses an important, even critical, point.
There are still very competent scanners that depend largely on known-malware detection and passive heuristics, but many are now focused on some form of behaviour analysis (active heuristics, sandboxing, emulation and so on). These techniques examine the behaviour of a program by allowing it to execute in a restricted, in some sense virtualized, environment, so that it cannot damage a real-life system. This is surprisingly effective, which is why the bad guys spend serious R&D time tweaking their creations so that they will get past security software. But static testing does not usually address dynamic detection. Put simply, if you do not test the on-access (real-time) component of a scanner, the suspicious code is never executed, so there is no behaviour to observe.
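To make the contrast concrete, here is a minimal sketch of the behavioural approach. The representation is an assumption made purely for illustration: a sample is modelled as a list of (action, target) events emitted while it runs in a restricted environment, whereas a real sandbox would hook system calls in a virtualized OS. The event names and the flagged pattern are invented.

```python
# Toy behaviour monitor: verdicts come from what the sample actually
# does when executed, not from what its bytes look like.

# A hypothetical malicious pattern: establish persistence, then phone home.
SUSPICIOUS_PATTERN = {
    ("registry_write", "autorun_key"),  # persistence mechanism
    ("net_connect", "remote_host"),     # callback to a controller
}

def observe(events):
    """'Run' the sample and record which suspicious actions occur.

    `events` stands in for the trace a sandbox or emulator would
    collect while the program executes in a restricted environment.
    """
    seen = {e for e in events if e in SUSPICIOUS_PATTERN}
    # Flag only when the full persistence-then-callback pattern appears.
    return "Behaviour.Suspicious" if seen == SUSPICIOUS_PATTERN else None
```

Notice that a sample which is never executed produces no events at all, so this detector can never fire in a purely static, on-demand test: that is the gap the paragraph above describes.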
In consequence, static testing does not provide an accurate reflection of the detection capabilities of scanners using dynamic analysis techniques: in fact, it can seriously understate them. Those testers in tune with the AMTSO thoughtscape are, therefore, moving towards improving their ability to test dynamically, so that the next generation of tests is likely to be less flawed. Meanwhile, AMTSO has published a set of draft guidelines for good practice in testing, in accordance with its published aims of improving objectivity, quality and relevance in testing. Other documents covering static and dynamic testing methodologies should follow shortly.
Most unsponsored individuals with an interest in testing standards will find AMTSO's subscription rates a barrier to full-on engagement, but there are opportunities to join the debate even so, starting with a blog at AMTSO's website. AMTSO cannot fix the entire testing problem in one fell swoop, but it may be the best chance we have right now of establishing testing scenarios that reflect real detection performance.
This was first published in November 2008