Efficiency must be balanced with security

Networks will be bigger and faster in future but they may not be safer

Last month, scientists from Caltech in the US and the Cern laboratory in Switzerland transferred data from California to Geneva at an average rate of 6.25 gigabits per second. That is roughly 10,000 times faster than a typical home broadband connection.
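As a rough sanity check of that comparison - assuming a typical home broadband connection of the era runs at about 512Kbps, a figure not given in the report - the ratio can be worked out in a few lines of Python:

# Rough comparison of the Caltech/Cern transfer rate with home broadband.
# The 512Kbps figure for "typical home broadband" is an assumption, not from the article.
transfer_rate_bps = 6.25e9    # 6.25 gigabits per second
home_broadband_bps = 512e3    # assumed 512 kilobits per second
ratio = transfer_rate_bps / home_broadband_bps
print(f"Roughly {ratio:,.0f} times faster")   # prints about 12,207 - of the order of 10,000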

With the next generation internet - IPv6 - just around the corner, Wans are getting quicker, and that is good news. However, new risks and additional responsibilities demand a rethink of how we manage network performance.

Millions of PCs were infected in the UK this month by a family of internet worms exploiting a loophole in Microsoft's Windows operating systems.

One variant of the worm, Sasser D, was said to have scanned the internet so aggressively for computers to infect that networks slowed to a crawl as they became congested with its traffic, in much the same way as a denial-of-service attack. Many businesses' systems simply ground to a halt as a result.

Installing anti-virus software and firewalls to keep out virulent pests is now as standard a practice in organisations as having immunisation jabs before travelling to far-flung destinations. It is a necessary precaution.

But as most companies have found with the Sasser worm, having the protection in place is no longer enough. The protection is only as efficient as the Wan or Lan it runs on.

Viruses and worms are spreading faster, hitting computers before organisations can safeguard their networks and systems by uploading patches and updates. The internet worm Code Red, for example, doubled the number of machines it had infected every 37 minutes; Slammer, which followed in January 2003, was even more deadly, doubling every 8.5 seconds.
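To put those doubling times in perspective, here is a minimal exponential-growth sketch in Python, assuming a single infected host at the start and unconstrained spread (both illustrative assumptions, not figures from this article):

# Simple exponential growth: infected(t) = initial * 2 ** (t / doubling_time)
# A starting population of one host and unconstrained spread are illustrative assumptions.
def infected_after(seconds, doubling_time_seconds, initial=1):
    return initial * 2 ** (seconds / doubling_time_seconds)

ten_minutes = 10 * 60
print(infected_after(ten_minutes, 37 * 60))   # Code Red pace: about 1.2 hosts after 10 minutes
print(infected_after(ten_minutes, 8.5))       # Slammer pace: about 1.8e21 - saturation long before

At Code Red's pace almost nothing has happened after ten minutes; at Slammer's pace the theoretical count already dwarfs the number of hosts on the internet, which is why patching windows measured in hours are no longer enough.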

Here lies the problem with network speed. As networks and data transfer become faster, so do the threats of supercharged attacks. Without proper management, your greatest asset also becomes your Achilles' heel.

The growing trend towards network monitoring and optimisation tools is one indicator that the user community is looking for turbo-charged performance packages. But these tools need to be refined for the network challenges of tomorrow.

We all know that speed is not always derived from fatter pipes and faster processors, but at the same time, hitting the Nitro button in times of need is not always the answer either. IT departments must be clear that issues such as network latency, data compression, application prioritisation, security, caching and hybrid networks need to be treated as one collective performance challenge. Dealing with them one at a time will always leave part of the network vulnerable and slow.

In the case of a virus attack, rapid response is needed, but even the less glamorous worlds of replication and synchronisation require capacity, speed and careful management. Data bottlenecks can hurt a business as much as a Love Bug, but the network has to respond in different ways to deal with these problems.

As networks become faster, IT departments have to be smarter in how they think about optimisation. The anti-virus issue is a very tangible example of the respect with which network speed should be treated and it offers a lesson for business applications as a whole.

Efficiency is delivered through a delicate balancing act of resources with clear planning including contingencies. Without a more insightful understanding of the power offered by next generation performance, the 'better, faster, safer' network will always be a megabit away.

Steven Wastie is marketing director at Peribit Networks
