Simulation or emulation - the model way to build and analyse your virtual network

Network modelling can iron out performance problems in advance.

What do large corporate computer networks and spacecraft have in common? Apart from being complex and expensive, they are also critical pieces of equipment. If one goes wrong in operation, the results can be disastrous.

Astronauts get around this problem by working through different flight scenarios on the ground in a simulator. Thanks to network modelling software, network managers can do the same.

Network modelling software creates a virtual model of a computer network by documenting its component parts and enabling you to simulate traffic running across it. These software products enable users to analyse ways to improve network performance, trace the causes of network problems, or assess the impact of changes to the network - all without touching a single router port in the real world.

There are many scenarios in which modelling network performance can be useful. For example, an IT department that hopes to use its network for VoIP might want to assess the impact on existing network applications, working out in advance how and where to use compression and packet shaping technology.

Such operations have traditionally been based on what is referred to as "baseline techniques". Companies will monitor network performance over time, extrapolating this historical data to predict what will happen when they add more applications and servers, said Andy Woyzbun, lead analyst for telephony and networks at Info-Tech Research Group in Canada.

"They look at bandwidth as the measure of capacity. They look at purchasing capacity and don't necessarily look at other characteristics of network performance. That, for many organisations, will be good enough," said Woyzbun.

Baselining gives you a steady linear trend with which to plot gradually changing network statistics, but networks do not always behave predictably. Historical data may also not exist for certain situations - if you are merging companies, for example, and need to assess the effect of the new firm's applications and clients on your network.
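To make the idea concrete, here is a minimal sketch of baselining in Python: fit a straight line to historical utilisation figures and project it forward. The monthly figures are invented for illustration.

```python
# A minimal baselining sketch: linear extrapolation of utilisation data.
import numpy as np

months = np.arange(12)                                # 12 months of history
util_mbps = np.array([41, 43, 46, 48, 50, 53,         # hypothetical monthly
                      55, 58, 60, 63, 65, 68])        # utilisation, in Mbps

slope, intercept = np.polyfit(months, util_mbps, 1)   # fit a straight line

# Project six months ahead - reasonable while growth stays linear,
# useless for step changes such as a merger or a new application.
for m in range(12, 18):
    print(f"month {m}: predicted {slope * m + intercept:.1f} Mbps")
```

The projection is only as good as the assumption of linear growth - exactly the assumption that bursty traffic or a merger breaks.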

One company attempting to address this issue is Opnet. Its IT Guru platform for network modelling draws together information about the devices on your network and creates a virtual representation of your infrastructure, including the servers hosting your applications, the client machines, switches and routers.

The system also assembles a set of baseline network performance data, drawn from sources including third-party network management and monitoring tools, which do a similar job to configuration management databases.

Network managers can mix the baseline data with the virtual model to analyse network performance using Opnet's model library. This contains virtual models of equipment operating a variety of network protocols and standards, and includes virtual versions of equipment from different network suppliers.

While most suppliers apply network modelling techniques to resolving performance and reliability problems, some focus on different scenarios altogether.

For example, Skybox Security sells a network modelling tool that gathers configuration data from the network and feeds it into a model that can be used to assess security. Its simulations are designed to conduct "what-if" scenarios for security purposes, such as, "What if we deployed our intrusion prevention system but moved it from point A to point B?" or "If we infect a virtual model with a worm, what propagation attributes will the worm adopt?"
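As a rough illustration of the second scenario, a what-if worm simulation can be as simple as spreading infection across a model topology step by step. The topology and infection probability below are invented for the sketch; a product such as Skybox's works from real device and security configurations rather than a toy graph.

```python
# A toy worm-propagation "what-if" over a hypothetical network model.
import random

topology = {                       # invented adjacency list of devices
    "fw":   ["dmz1", "dmz2"],
    "dmz1": ["fw", "lan1"],
    "dmz2": ["fw", "lan1", "lan2"],
    "lan1": ["dmz1", "dmz2", "lan2"],
    "lan2": ["dmz2", "lan1"],
}

infected = {"dmz1"}                # seed the outbreak on one host
p_infect = 0.5                     # chance of crossing each link per step

for step in range(5):
    # Each step, the worm may jump from infected hosts to their neighbours.
    newly = {nbr for host in infected for nbr in topology[host]
             if nbr not in infected and random.random() < p_infect}
    infected |= newly
    print(f"step {step}: infected = {sorted(infected)}")
```

Re-running the simulation with the intrusion prevention system modelled at a different point in the graph is the kind of comparison such what-if tools are built for.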

The answers to these questions can be found if you combine simulation with the virtual model itself, said Skybox Security's vice-president of worldwide marketing, Ed Cooper.

Strictly speaking, network modelling involves simulating all the devices within your infrastructure. But this is not the only way of understanding your network. Whereas network simulators embody the full complexity of a company's network, emulators effectively create a "black box" version of a network, recreating high-level conditions using a limited number of parameters.

One company specialising in this approach is Itheon. Its network emulator is essentially a box that you plug in between the devices to be tested. "We behave like a piece of wire, but you can change the characteristics of the wire," said Frank Puranik, chief technology officer of Itheon. The emulator can then be used to increase the latency of the link and restrict its bandwidth.
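As a loose sketch of that "piece of wire" idea, the following Python relay sits between a client and a server, injecting latency and capping throughput. The addresses, delay and rate are illustrative, and a hardware emulator such as Itheon's operates at the packet level rather than as an application-layer proxy like this.

```python
# A crude application-layer link emulator: forward bytes with added
# latency and an approximate bandwidth cap.
import asyncio

DELAY_S = 0.150          # one-way latency to inject (150 ms)
RATE_BPS = 256_000       # crude bandwidth cap, in bytes per second

async def relay(reader, writer):
    # Delay each chunk by the injected latency plus its serialisation time.
    while data := await reader.read(4096):
        await asyncio.sleep(DELAY_S + len(data) / RATE_BPS)
        writer.write(data)
        await writer.drain()
    writer.close()

async def handle(client_reader, client_writer):
    # Forward each client connection to the real server behind the "wire".
    srv_reader, srv_writer = await asyncio.open_connection("127.0.0.1", 9000)
    await asyncio.gather(relay(client_reader, srv_writer),
                         relay(srv_reader, client_writer))

async def main():
    server = await asyncio.start_server(handle, "127.0.0.1", 8000)
    async with server:
        await server.serve_forever()

asyncio.run(main())
```

Point a real application at the relay instead of the real server, and it experiences the degraded link while everything else stays genuine - which is the emulation approach in miniature.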

The main difference between complex network modelling and network emulation is that with the latter, you can run tests with real applications and real equipment in the network, said Puranik. "It's not a modelling thing because you can be really using your application. You're using it for real but the emulator is restricting bandwidth."

Shunra sells software that encapsulates high-level network performance data and uses it to create a test environment between real-world servers and applications.

Boaz Grinvald, chief executive of Shunra, said, "Traditional network modelling requires a high level of sophistication from the end-user. You have to feed a mathematical model with lots of information and then calibrate and tweak it to be as close as possible to reality." Shunra's network emulation product enables network managers to capture network behaviour by monitoring packets. This can provide network engineers with an accurate model of network behaviour for testing purposes.

Even VMware is getting into emulation. The virtualisation software company uses a concept called virtual machine teaming, said European technical marketing manager Richard Garsthagen.

Customers use the company's technology, which runs multiple operating systems in a single machine, to create clients, routers, firewalls and application servers in one environment, building virtual network connections between them. "We can control what types the networks are," said Garsthagen. "We can control the bandwidth and how much traffic will get lost."

Should your company choose the traditional network modelling or the emulation route? The answer lies both in your IT department's level of maturity and in the complexity of the network you are trying to analyse.

Devesh Satyavolu, product marketing director for enterprise solutions at Opnet, argues that diagnostics can be difficult with emulation tools that approximate a network into three or four different parameters. You may be able to test the network under different conditions, but he said it is difficult to find out why a problem is occurring or to decide what to do about it.

If your application servers in Singapore are taking 15 seconds to respond, should you apply accelerators for the applications on the edge of the network, or increase Wan bandwidth? Should you subdivide your Singapore Lan into more subnets, or add more backup capacity?

"It is not about testing a specific device configuration," said Grinvald. "It's more about seeing whether your disaster recovery scheme can stand significant failure, or what happens if your branch offices shift from using a primary data centre to a secondary data centre."

In other words, the model is designed to answer the "what if" question, rather than "what just happened, and what should we do about it?" In this sense, Grinvald suggests the two approaches to network analysis complement each other.

The simpler emulation tools can be used to help track down the causes of problems, as long as you have an engineer's instinct and an inquiring mind.

One US-based network analyst who did not want to be named used Itheon's tool to help trace the cause of a network problem in a large aerospace company. His company's Wan-based Citrix installation was performing poorly, and he was trying to prove to senior management that there was a problem in the Lan, rather than the wide area infrastructure.

"Everyone said it was the Wan and I said it wasn't," he said,explaining that he used tools to reproduce real-world network operation in the labs and run through different scenarios.

"We developed the applications in the lab environment and simulated a couple of sites going up to 120-180ms latency and we saw the applications still performing in the lab."

Using a process of elimination, the analyst was able to determine that the problem lay with some workstations on the Lan which were misconfigured with an inadequate TCP transmission window size, which meant machines were retransmitting too many packets. "It took a combination of a whole bunch of tools, but it would have been extremely difficult to narrow that down to the Lan," he said. The Itheon tool told him what was not causing the problem, giving him more confidence to identify possible trouble spots in the network.
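The underlying arithmetic helps explain why an undersized TCP window hurts so much over a Wan: throughput is bounded by roughly one window of data per round trip. The figures below are illustrative rather than drawn from this case.

```python
# TCP throughput ceiling: one window of data delivered per round trip.
def max_tcp_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    return window_bytes * 8 / (rtt_ms / 1000) / 1_000_000

for window in (8_192, 65_535):        # undersized vs typical default window
    print(f"{window} byte window at 150 ms RTT: "
          f"{max_tcp_throughput_mbps(window, 150):.2f} Mbps")
```

At 150ms of round-trip latency, an 8Kbyte window caps a connection below half a megabit per second regardless of how much bandwidth the Wan link offers - which is why the fault looked like a Wan problem while living on the Lan.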

The bottom line with traditional network modelling technologies is that many companies simply will not have the expertise to use them properly, said Woyzbun.

"Most networked organisations typically do not have the kind of network engineers that it requires to put together a model in the first place, and if the model shows negative performance, to know what changes to make," he said.

"So the bottom line is that if you are big and complex enough and the application is critical enough, these tools are workable. But either you will have to hire people for the organisation who are familiar with modelling and with that particular tool, or outsource that to a specialised consulting organisation."

Woyzbun also does not believe network modelling or emulation can help to solve all of a network manager's real-world problems. He recalls routers misbehaving when they became saturated.

"Diagnosing router problems was an issue but modelling would not have fixed that problem. It was just a case of the [quality of the] routers, which have been improved by the suppliers."  Woyzbun said many problems lie within servers and applications rather than on the network. Stress-testing the servers can help to identify the causes of those problems, but that is handled by an entirely different class of tool.

His point is not entirely valid; modelling tool suppliers like Opnet will help to capture and visualise application and server activity as part of your model. But again, expect the overhead for learning and deploying these tools to be much higher than simply running real applications over a network emulator in the lab.

The ultimate goal for any company deploying network modelling or emulation products is to enhance its planning capability and optimise its network, but for many the main reason for turning to such tools is firefighting.

With a high level of expertise required to use traditional modelling products, mid-range businesses are less likely to use this class of application.

Network emulation may be more appropriate here, but larger companies with the resources to invest in building and understanding a virtual model of their network could find themselves able to respond more rapidly to problems - and might end up being able to plan and optimise their networks more effectively, too.

