Recent research commissioned by BT found that the top 2,000 European businesses are wasting more than three million working hours every year, equating to £165m, trying to get to the root of poor application performance.
These are rather worrying statistics given that business success relies heavily on mission-critical applications running smoothly.
As convergence becomes the norm and organisations start to run increasing numbers of applications over a single corporate network, the task of assuring application performance looks set to get even more complex.
Each application has different operating metrics and requirements and each has the potential to negatively impact the performance of another. With the rise of real-time applications, the effect of this becomes even more apparent - for example, voice and video can only work with a minuscule amount of delay before their quality deteriorates audibly and visibly, impacting the end-user's experience.
However, it is not just legitimate business practice that IT departments need to take into account. Rogue activities, such as excessive internet surfing and downloading music and films, can also have a detrimental effect by stealing bandwidth from mission-critical applications.
Until now the standard tactic has been to throw extra bandwidth at the problem of poor application performance, but with network prices beginning to stabilise and IT budgets under continued rigorous control, this cannot remain a viable long-term option.
In addition, companies' operations and customers are becoming increasingly dispersed, so business success relies more heavily on the underlying information and communications technology being failsafe. As a result, it is important that IT departments get to grips with the activities taking place on their corporate networks.
So how can network managers control and reduce overall costs and improve service levels? One alternative is to analyse network activity so that applications can be prioritised and managed more effectively to support the performance of the business as a whole. There are hundreds of different network and application monitoring tools on the market, all of which promise to help IT departments gain better visibility and greater understanding of network traffic.
However, the BT research suggested that over-stretched IT departments are finding it difficult to get to grips with this sort of specialised work. More than 60% of IT departments which responded are struggling to resolve issues around application performance because of a lack of resources - money, people and time - and 11% are short of the necessary skills and expertise.
The tricky part is not putting probes into a network to get a view of performance. The challenge lies in interpreting the data and in bringing network, application and consultancy skills together to boost the efficiency of the infrastructure and overall performance of the business.
It is important that companies weigh up whether or not they have the required expertise to carry out this task in-house. If not, they could hand over application and network monitoring to a trusted third party with the necessary resources, skills and expertise to turn the captured data into meaningful information.
What is more, with an economic upturn hopefully on the horizon, IT managers will be looking to spend their time exploiting new opportunities rather than acquiring new skills and fire-fighting.
There has never been a better time for IT departments to de-clutter their corporate networks. However, the sheer choice of monitoring tools on the market and the complexity of interpreting the data collected means many organisations will find it beneficial to work with a third-party provider who can help guide them through the maze. Ultimately, those businesses that get their infrastructure in order now will fare best in the future.
Ivor Kendall is general manager of IP infrastructure at BT