Model-based automation can help users keep pace with changing security demands

Setting up automated systems that respond to events can improve adaptability



The IT industry is under pressure to be more responsive to business needs. At the same time there are pressures to reduce costs, even as new technologies increase the complexity that chief information officers and company staff have to deal with.

And then there is the ever-present worry that the next cyber attack could cripple an enterprise's ability to operate.

One approach that offers some hope of alleviating these problems is model-based automation. This describes systems in terms that are closer to business requirements and then automates the mapping of these models to application, middleware, processor, storage and networking configurations.
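As a rough sketch of the idea (the class, field names and thresholds below are invented for illustration, not drawn from any particular product), a business-level model might capture expected demand and data sensitivity, and an automation layer might translate that into a concrete resource configuration:

```python
from dataclasses import dataclass
from math import ceil

@dataclass
class ServiceModel:
    """Business-level description of a service (illustrative fields only)."""
    name: str
    peak_requests_per_sec: int
    availability_target: float   # e.g. 0.999
    data_sensitivity: str        # e.g. "public" or "customer-personal"

def map_to_infrastructure(model: ServiceModel) -> dict:
    """Translate the business-level model into a concrete resource configuration."""
    web_servers = ceil(model.peak_requests_per_sec / 500)   # assume ~500 req/s per server
    db_replicas = 2 if model.availability_target >= 0.999 else 1
    return {
        "service": model.name,
        "web_servers": web_servers,
        "db_replicas": db_replicas,
        "encrypt_at_rest": model.data_sensitivity != "public",
    }

shop = ServiceModel("online-shop", peak_requests_per_sec=4000,
                    availability_target=0.999, data_sensitivity="customer-personal")
print(map_to_infrastructure(shop))
```

The business-facing description changes rarely; the mapping can be re-run whenever demand or the infrastructure changes.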

Model-based automation provides several benefits. Systems can be implemented faster and resources adapted to meet varying demands. And by analysing the models, the consequences of future usage can be simulated and understood.

These models will also be important in encouraging the take-up of utility computing, allowing CIOs to manage the trade-offs between directly controlling day-to-day needs and off-loading peak demands to service providers.

Imagine an online shopping service for an enterprise that sees high seasonal variation in sales. Ideally, the service might like to provide for an average day's sales, and redirect resources from other enterprise applications, such as internal e-mail, as demand rises. On the busiest days of the year it might need to buy additional resources from a utility service provider. Alternatively it might invest in more infrastructure and then offer back spare capacity to the utility service providers.
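A minimal sketch of that decision, using invented capacity units and thresholds, might look like this:

```python
def plan_capacity(forecast_units: int,
                  baseline_units: int = 100,
                  reclaimable_units: int = 30) -> dict:
    """Decide where a day's capacity comes from (illustrative units only)."""
    from_baseline = min(forecast_units, baseline_units)
    shortfall = forecast_units - from_baseline
    from_internal = min(shortfall, reclaimable_units)   # borrowed from internal e-mail, etc.
    from_utility = max(shortfall - from_internal, 0)    # bought from a utility provider
    spare_to_offer = max(baseline_units - forecast_units, 0)
    return {"baseline": from_baseline, "internal": from_internal,
            "utility": from_utility, "spare_offered_back": spare_to_offer}

print(plan_capacity(forecast_units=160))   # a peak shopping day
print(plan_capacity(forecast_units=60))    # a quiet day with capacity to spare
```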

Model-based automation provides the basis for CIOs to make informed choices, see them implemented, monitor them and continuously adapt to changing business needs.

If, alongside the models that describe business needs, there are also models that describe the security policy of the enterprise, then as automation engines make provision for a service they can also ensure that the right security mechanisms are put in place.

Firewalls will automatically be placed and configured appropriately. Memory will automatically be wiped clean between different customers' data being processed. Information will automatically be appropriately encrypted at rest and in transit. And privacy requirements for customer data will be enforced. Automation allows the enterprise to be more confident of its security engineering.
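Hypothetically, such a security policy model could sit alongside the infrastructure plan and be applied automatically at provisioning time; the schema below is invented purely for illustration:

```python
SECURITY_POLICY = {
    # Illustrative policy model, not a real product schema
    "firewall_zones": ["internet", "web-tier", "data-tier"],
    "wipe_memory_between_tenants": True,
    "encrypt_at_rest": True,
    "encrypt_in_transit": True,
}

def provision_with_policy(infra: dict, policy: dict) -> dict:
    """Augment an infrastructure plan with the controls the policy demands."""
    secured = dict(infra)
    zones = policy["firewall_zones"]
    secured["firewalls"] = [f"{a}->{b}" for a, b in zip(zones, zones[1:])]
    secured["tls_everywhere"] = policy["encrypt_in_transit"]
    secured["disk_encryption"] = policy["encrypt_at_rest"]
    secured["scrub_memory_on_reassignment"] = policy["wipe_memory_between_tenants"]
    return secured

print(provision_with_policy({"service": "online-shop", "web_servers": 8}, SECURITY_POLICY))
```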

As new vulnerabilities come to light, or as applications are used in new ways and the understanding of threats changes, it also becomes much easier to maintain business services.

If the security model is modified to describe the threat, the underlying automation engines can determine new ways of mapping the models to the infrastructure, migrating any running services to new configurations of resources without loss of service. So, for instance, a newly discovered vulnerability in a web server used to provide part of an online shopping service could lead to the infrastructure being reconfigured, with firewalls in different places.
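A toy sketch of that re-mapping step, with an invented threat label and configuration keys, might look like the following; the difference between the old and new configurations is what the automation engine would migrate running services towards:

```python
def remap_for_threat(current: dict, threats: set) -> dict:
    """Recompute the configuration when the threat model changes (sketch)."""
    updated = dict(current)
    if "web-server-rce" in threats:
        # Isolate the vulnerable tier behind an extra firewall until it is patched
        updated["firewalls"] = current.get("firewalls", []) + ["web-tier->quarantine"]
        updated["web_tier_egress"] = "deny-all"
    return updated

def migration_plan(old: dict, new: dict) -> dict:
    """The changes to apply to running services, ideally without loss of service."""
    return {k: new[k] for k in new if old.get(k) != new[k]}

before = {"firewalls": ["internet->web-tier", "web-tier->data-tier"]}
after = remap_for_threat(before, {"web-server-rce"})
print(migration_plan(before, after))
```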

By looking at product and security models hand-in-hand, the simulations needed to determine risk can be carried out. The models also allow for "what-if" questions to be answered, such as, "If an intruder gained access to the root password on this machine, would they be able to see all customers' preferences?"
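In the simplest case, such a what-if question reduces to reachability over a model of who can access what. The graph and labels below are invented for illustration:

```python
from collections import deque

# Illustrative access graph: an edge means "having A lets you reach B"
ACCESS = {
    "root@web-01": ["db-credentials"],
    "db-credentials": ["customer-preferences"],
    "root@mail-01": [],
}

def what_if(start: str, target: str) -> bool:
    """Could an intruder holding `start` eventually reach `target`?"""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        if node == target:
            return True
        for nxt in ACCESS.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

print(what_if("root@web-01", "customer-preferences"))   # True in this toy graph
print(what_if("root@mail-01", "customer-preferences"))  # False
```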

But being more secure is not enough: CIOs also need to be convinced that their IT environment is safe. At the same time that systems are deployed on to the underlying infrastructure, security monitoring can be deployed which can provide CIOs with the assurances they need.

The idea is not so much about collecting vast amounts of data, correlating it and, when many alarms are triggered, working out what is happening. Rather it is to have the automation engine deduce from the models what we should be looking for and then selectively put in place appropriate monitoring.
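A hedged sketch of that deduction step, assuming the invented configuration keys from the earlier examples, might derive a short list of targeted monitors directly from the model:

```python
def derive_monitors(secured_config: dict) -> list:
    """Derive a targeted monitoring set from the model, rather than logging everything."""
    monitors = []
    for fw in secured_config.get("firewalls", []):
        monitors.append(f"rule-drift check on firewall {fw}")
    if secured_config.get("tls_everywhere"):
        monitors.append("certificate-expiry and cipher checks on all listeners")
    if secured_config.get("disk_encryption"):
        monitors.append("verify volumes report encryption enabled")
    return monitors

config = {"firewalls": ["internet->web-tier", "web-tier->data-tier"],
          "tls_everywhere": True, "disk_encryption": True}
for monitor in derive_monitors(config):
    print(monitor)
```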

This allows CIOs to see the state of their IT environment through a simple dashboard, with traffic light-style alerts that can quickly focus attention on key problems.
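Rolling those monitor results up into a single traffic-light status is then a small aggregation step, for example:

```python
def traffic_light(statuses: dict) -> str:
    """Roll individual monitor results up into a single dashboard colour."""
    if any(s == "fail" for s in statuses.values()):
        return "red"
    if any(s == "warn" for s in statuses.values()):
        return "amber"
    return "green"

print(traffic_light({"firewall rule drift": "ok",
                     "certificate expiry": "warn",
                     "disk encryption": "ok"}))   # -> "amber"
```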

From Sarbanes-Oxley to data protection, this model-based assurance provides a much stronger technological foundation for showing that the enterprise is meeting compliance and governance requirements.

There are also advantages in sharing these security and assurance models - and some of the monitoring information - across enterprise boundaries. If systems are tightly coupled with those of suppliers or customers, I do not just want to know my cyberworld is safe; I would also like to know what state their IT environments are in.

As well as service level agreements between enterprises, models that explicitly define the trust relationships and specify which security information should be shared in supply chains and ecosystems are becoming increasingly important.
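One illustrative way to express such a trust and sharing model, with invented partner names and information categories, is as a mapping from each partner to the categories of security information they are trusted to see:

```python
# Illustrative sharing model: which categories each partner is trusted to see
SHARING_MODEL = {
    "parts-supplier-ltd": {"availability", "patch-status"},
    "logistics-partner":  {"availability"},
}

STATUS = {
    "availability": "green",
    "patch-status": "amber",
    "intrusion-alerts": "red",      # kept internal in this sketch
}

def share_with(partner: str) -> dict:
    """Release only the security information the trust model allows for this partner."""
    allowed = SHARING_MODEL.get(partner, set())
    return {k: v for k, v in STATUS.items() if k in allowed}

print(share_with("parts-supplier-ltd"))
print(share_with("logistics-partner"))
```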

At HP Laboratories, we are working on the nature of these security models - how to link them with other aspects of model-based automation to produce secure environments, how the automatic provisioning of monitoring leads to assurance, and how trust is built between enterprises as appropriate parts of the security models are shared.

We see such security models and automation technology as increasingly central to realising a vision of a truly adaptive enterprise.

Martin Sadler is director of the Trusted Systems Lab at HP Laboratories in Bristol and manages security research at Hewlett-Packard
