Doing more with video conferencing

Rob Bamforth

While video conferencing is not a new technology, several factors appear to be driving forward the adoption of video as a communications tool, especially for personal use. Whether Skyping with distant relatives, using video in gameplay or creating recordings to share on YouTube, video has become much more broadly accepted and valued. So, just as in other areas of technology, is this consumer appetite for video translating into more widespread business usage? Not really, which raises the question: why not?

First, any investment in technology must demonstrate clear benefits. With video conferencing, two are generally quoted - saving travel costs and environmental benefits. The current economic climate means that the environmental argument is no longer the burning issue it once was, and travel cost savings are more difficult to measure than they seem at first glance. The problem is that while travel costs themselves can be measured, the savings made because video conferencing was used instead are harder to quantify.

This can mean that the business incentive for increasing the use of video could be muted from a narrow financial perspective. However, take a wider look at the value proposition and the position changes.

Among existing users of video conferencing, both frequently quoted benefits are recognised, but only travel cost savings are really thought of as important. What is more interesting are the secondary benefits, which all revolve around more effective and efficient collaborative and individual working. These benefits in particular appear to be magnified as the adoption of video becomes more widespread across an organisation and as frequency of its use increases.

Once the financial imperative is understood, the things that are holding adoption back fall into two categories - technology and people.

Many organisations have taken a half-hearted or tentative approach to installing video conferencing systems: they have put them in boardrooms where only a select few have access; they buy different, often incompatible, systems when they do broaden deployments to other areas; or they have been slow to roll out video to regular desktop, laptop and mobile devices. Generally, training comes after the event, rather than getting everyone up to speed and familiar at the time of installation. In many cases there are disconnects between the needs of the business, those in IT running the video conferencing setups and the facilities groups that manage the rooms and the workplace.

From the individual's perspective, the technology is perceived as hard to use, unreliable and probably not worth trying without a fair amount of hand holding. Many are already uncomfortable being 'on camera', something that's often not helped if the primary use of video conferences is for mass meetings, where the screen enhances a feeling of formality and not even the friendliness of a handshake or a shared cup of coffee is on offer.

No wonder the ones who embrace video the most are those most comfortable with the technology and those in senior positions or with the greatest access. Digital natives, the young group most likely to be very comfortable with video, are much less likely to be embracing it in the workplace - why? Probably simply because they are not yet in sufficiently senior positions to take advantage. Given the potential productivity benefits, this would seem to be a huge mistake.

Most of the technology challenges surrounding video have a direct impact on the people issues too. Past experiences colour current judgment, and many will have had bad experiences with video conferencing, which knock their confidence.

Despite these negative perceptions, current technology has moved on a great deal. Video and audio quality can be incredibly high, interoperability issues are largely contained, and it can be easy and reliable to set up video calls if the right investment decisions have been taken.

Video can be used on every device from a smartphone to the big screens in the boardroom, and the formality of setting up calls in advance should be a thing of the past, making it simpler to encourage more frequent usage and familiarity on all screens. This breadth and increased frequency of use is when the real benefits of productivity and collaboration kick in, but of course it requires funding.

Those that already have video will often find a mix of systems, some of which will require upgrading to gain consistency across all their video platforms, and almost all will require further investment in software platforms to ensure that even the smallest mobile devices can be added into the video mix.

Consistency, widespread availability and encouragement to use video regularly will have a real impact on adoption. If this is mixed with induction education to build early familiarity, and further training on how to feel comfortable, relaxed and proficient on camera, both the people and the technology challenges will have been adequately addressed.

It might require some investment in a mix of hardware, some upgrades, software and services, but creating a democratic and pervasive communication culture that is comfortable with using video will pay off.

For some thoughts on how to finance the changes required to address an enterprise video conferencing strategy as a whole, download this free report.


Redundant array of inexpensive 'things'

Rob Bamforth

There has been plenty of hype surrounding the internet of things (IoT), especially super-smart things such as the iconic Nest (now Google) thermostat. While many of these devices are interesting, they often come with premium pricing for an expansive set of features that does not fit all needs - some things are a bit less smart (and sexy) but could bring significant value.

This is where historically the term 'machine to machine' (M2M) has been applied, especially by mobile telcos desperate to seek out new revenue streams from connecting lots of remote, low-data-demand devices. Here, it is often only one or two attributes or states of each relatively dumb 'thing' that are of interest - a sensor taking a measurement, a system being switched on. One sensor on its own is not really interesting, but applying the concept en masse with dozens, hundreds or thousands of sensors or devices makes things much more interesting.
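As a toy illustration of the point (all device names and readings here are hypothetical), consider a feed of single-attribute readings from a fleet of simple temperature sensors. Any one reading says little, but aggregated across the estate the dumb readings become actionable - for example, spotting a unit whose reading is way off the fleet average:

```python
# Toy sketch: aggregate many single-attribute M2M readings into fleet-level
# information. All identifiers and values are hypothetical.

from statistics import mean

# (device_id, temperature_reading) pairs, as they might arrive from an M2M feed
readings = [("s-001", 18.2), ("s-002", 19.1), ("s-003", 35.7), ("s-004", 18.8)]

avg = mean(r for _, r in readings)

# Readings far from the fleet average might indicate a failing unit
outliers = [d for d, r in readings if abs(r - avg) > 10]

print(round(avg, 2), outliers)
# → 22.95 ['s-003']
```

The individual devices stay dumb; the intelligence sits in the aggregation.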

This does not mean suddenly flooding the internet with masses of data - no matter how much the telcos might like that idea. The term IoT conjures up an image that all of these things, smart and dumb, will be connected to a single network, when in reality most have very little in common, except their ability to converse using a basic universal protocol set, based on IP. An effective IoT application is one that might take advantage of some connectivity to the wider internet, but is also built on internet technologies that exploit the economies of scale of standard components and common protocols.

When this concept of simple, mass connectivity crosses into the physical aspects of the 'things' as well, the proposition becomes even more interesting, even in what seems like the simple case of under-floor heating.

In the industrial research centre SPECIFIC, in Swansea, a combination of academic research from Swansea University and industrial skills from Tata Steel, NSG Pilkington and BASF is leading to some interesting developments of 'things'. The SPECIFIC consortium has the concept of 'buildings as power stations' at its core, and is creating low cost, robust items to capture, store and release energy.

One example currently being commercialised is a heated floor tile. This is a standard 600mm metal-wrapped square of chipboard, designed to be stood on free-standing posts to provide a raised-access flooring system, typical of most office, business and educational environments. The only difference is a coating on the top and a power connector; with power applied, the upper surface of the tile warms up.

Under-floor heating is not new, but fine-grained control of a tiny area at very low cost is. However, being able to individually heat every single floor tile is only of real interest when intelligent controls can be applied. What is the temperature in the office, how many people are currently in there, which tiles are exposed or covered by furniture, which rooms are those people in, which tiles is the sun currently shining on, and so on?

A single smart system that detects some of this data from sensors, has other elements filled in from information in employees' calendars or room booking systems and then access to weather sites and other external data sources over the internet can start to be very effective, efficient and comfortable.
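A minimal sketch of that kind of decision logic might look like the following. All names, data structures and the target temperature are hypothetical, purely to illustrate combining sensor data, room bookings and weather-derived information on a per-tile basis:

```python
# Hypothetical per-tile heating control: decide which tiles to power on by
# combining room temperature sensors, a room-booking feed and sunlight
# exposure inferred from a weather source.

from dataclasses import dataclass

@dataclass
class Tile:
    tile_id: str
    room: str
    covered_by_furniture: bool   # from a facilities plan or load sensor
    in_direct_sunlight: bool     # inferred from orientation plus weather feed

def tiles_to_heat(tiles, room_temps, rooms_booked, target_temp=21.0):
    """Return the IDs of tiles worth powering on right now."""
    heat = []
    for t in tiles:
        if t.covered_by_furniture:        # heating under a cabinet is wasted
            continue
        if t.room not in rooms_booked:    # nobody scheduled to be there
            continue
        if t.in_direct_sunlight:          # the sun is doing the job already
            continue
        if room_temps.get(t.room, 0.0) >= target_temp:
            continue                      # room already warm enough
        heat.append(t.tile_id)
    return heat

tiles = [
    Tile("A1", "meeting-1", covered_by_furniture=False, in_direct_sunlight=False),
    Tile("A2", "meeting-1", covered_by_furniture=True,  in_direct_sunlight=False),
    Tile("B1", "office-2",  covered_by_furniture=False, in_direct_sunlight=True),
]
print(tiles_to_heat(tiles, {"meeting-1": 18.5, "office-2": 17.0},
                    {"meeting-1", "office-2"}))
# → ['A1']
```

Each rule is simple, but applied across thousands of tiles and refreshed continuously, the combined effect is the distributed, situation-aware behaviour described here, with no 'learning' required in the tiles themselves.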

This type of application requires some integration and might not look as elegant as the smart Nest thermostat, but could deliver significant benefits such as cost and energy reduction by applying heating only when and where required. This would not be a smart IoT object that 'learns', but a distributed one that constantly takes into account the current and forthcoming situation and responds accordingly.

Some IoT applications focus too much on putting smarts into every individual 'thing'. Without fine-grained control of the related physical attributes (in this case heating and energy), the cost of deploying some IoT applications might be higher than the value of the benefits they deliver.

Smart applications can be built by intelligently assembling and connecting a great many dumb components, not simply by adding expensive smarts into something that was historically dumb. This will be increasingly true for enterprise IoT applications, which have many legacy components and systems to accommodate.

With so much attention on sexy consumer IoT applications, there is a danger that the skills necessary for commercial IoT integration with physical things, as well as the requirements for IoT applications that deliver real value for businesses, will be overlooked. It would be a shame if the current IoT bandwagon left great masses of 'dumb', but worthwhile, things behind.

Extending unified communications

Rob Bamforth

In the last couple of decades the number of digital communications options for most workers has soared, bringing with it information overload and post-holiday inbox anxiety, now only offset by taking mobile devices everywhere. Where once simply not answering the phone or opening the mail would cut off interruptions, most now have multiple forms of telephony - online, mobile, desktop - plus instant messaging, email and a plethora of social networks clamouring for their attention.

It might seem easier if one preferred mechanism for communication were to become the default for each individual; however, things are not so simple. Individual communications preferences vary depending on the task being performed, and while this can clearly be seen with personal communications, very similar behaviours are just as prevalent in the workplace.

This means that employers need to provide a broad kit-bag of communications tools which will undoubtedly be added to by the ones employees bring themselves.

However, having to switch from one task to another can be very disruptive, hence the emergence of the idea of pulling together the multiple strands of communication, known as unified communications (UC) or to some vendors as unified communications and collaboration (UC&C).

Unfortunately, unifying the communications networks and 'plumbing' was initially seen as the most important aspect of this, especially by most vendors and those in IT managing the infrastructure. However, the critical element for the individual (and the business) is dealing with the flow of work and interruptions, as these affect personal productivity. It also impacts working with colleagues and third parties, and so introduces a pressing need to sort out the collaboration element, as this affects overall business process productivity.

So what's the best strategy for addressing unified communications? The most important thing is to recognise that ultimately it is about connecting people, not just networks. UC is therefore as fundamental as the heart of traditional business telephony, the private branch exchange or PBX.

When many organisations were initially sold on the idea of UC, it was the PBX element that got their attention, as UC was often pitched as a cheaper way to make phone calls and simplify network management through IP telephony. These are not a true reflection of the benefits of UC, nor do they generally justify the total cost of investment, especially when new network equipment is needed.

The real benefits come from how UC simplifies tasks for individuals, not just simplifies the network. UC has to provide flexible mechanisms and choices, enabling the right tools to be selected for productive and efficient communication and collaboration.

It is clear that business use of UC encompasses a huge set of capabilities, from message immediacy to media-rich content sharing. Its effective use, however, requires agile user behaviour, seamlessly gliding between these different strands of media. This requires consistency in approach and deployment, otherwise employees will spend more time focussing on the tool than on the job.

Why is it now important to push UC into all corners of the business?

First, because working patterns are changing: more people are working from home, on the move, or flexibly in multiple locations across their workplaces. Many are working on remote sites, and even those sitting at the traditional desks, offices and areas of their 'static' workplaces often have many more connections to remote co-workers than in the past.

Second, there is a need for 'friction free' collaboration. The internet and globalisation are great levellers and so gaining any edge or just staying ahead of the competition is getting harder. Economic pressures mean that big budgets are no longer easy to come by, and organisations need to sweat more of their assets and this includes getting the best out of their workforce - not just individually, but as a collective team.

The challenge with UC is that it needs to be applied everywhere and consistently, which leads to a need for significant investment in a broad range of elements - software for UC clients, diverse hardware from new IP phones to servers, and services to ensure deployment success. All elements are important to avoid the pitfalls encountered when moving to an enterprise-wide, full-scale production roll out.

Getting all the elements in sync will require the certainty of funding to completion. It might be fine to trial different elements to see where preferences lie with different features or tools, but once the decision is made, UC needs to be a giant leap, not a timorous small step. The 'U' might stand for unified, but it might just as easily stand for 'universal', given that it is by achieving this that the real benefits will be realised.

For some thoughts on how to finance the changes required to address a unified communications strategy as a whole, download this free report.


Online security in insurance sector

Bob Tarzey

Much of Quocirca's research looks at the differing attitudes to IT between various business sectors. For example, a 2014 report titled Online domain maturity, which was sponsored by Neustar, showed that retailers and financial services were the most likely to interact online with consumers. Another, the 2015 report Room for improvement: building confidence in data security, which was sponsored by Digital Guardian, showed that by some measure financial services were the most confident about data security.

Such comparisons are useful as they show what one sector is achieving and how another sector might benefit by taking similar measures. However, even within a given sector there are extremes; whilst more than half of financial services organisations are very confident about data security, 4% are not that confident. More granular research is needed to tease out where in a sector such differences lie.

Quocirca was recently invited to attend an insurance industry round table focussed on IT security. The event was hosted by Entrust Datacard, a provider of strong authentication tools, digital certificates and online fraud prevention products. If the views of the dozen or so attendees, who represented some of the best known names in the UK insurance industry, are anything to go by, their sub-sector has a lower level of confidence about data security than banking (some organisations have a foot in both camps - so-called bancassurance).

Why should this be so? For a start, whereas banks deal directly with their customers' money, for insurance companies money is largely secondary; in other words, if your bank account is hacked money may be transferred, whereas it is harder to exploit an online insurance account. Secondly, it was evident that one of the biggest concerns for insurers is insurance fraud, however carried out, and it was not clear whether this has become harder or easier to deal with as the industry has moved online.

Before the round table, Quocirca had considered in what areas insurance companies may be vulnerable. It was agreed that the two obvious ones were the protection of personal data and of payment card data. Protecting both is of course a regulatory requirement, but it also makes good business sense. An insurance company may be targeted for such data not because it is an insurance company per se, but because its defences are weaker.

However, during the discussion some interesting insurance-specific threats emerged. Stealing lists of policy holders would be useful for planning crimes, for example the targeted theft of high value cars; the task would be much easier with a current list of owners and their addresses than having to travel the streets searching for targets. Another involved intellectual property (IP): as quoting for insurance has moved online, the industry has become highly competitive. Appearing high on the listings of comparison sites, where many insurance buyers end up, involves quoting via tightly guarded algorithms, and some felt there was a possibility of industrial espionage in this area.

Another area of concern was the insurance supply chain; many policies are sold via agents and brokers. However good a given insurance company's own data security is, its Achilles' heel could well turn out to be a smaller partner. It was noted that some well publicised data breaches relied on compromising smaller partners to find a way into a larger organisation's IT systems. There should be an onus on insurers to advise and certify the security of their supply chain partners.

There are of course many benefits to being able to safely transact online. Quocirca research, to be published later this year, shows that confidence in the omni-channel (the mix and match of mobile apps, web sites, telephone, face-to-face and so on for communication with customers) goes hand-in-hand with higher levels of confidence in data security. All agreed the insurance industry had to further embrace the omni-channel. Another benefit is being able to verify the ownership of insured assets, many of which can now be certified electronically via the internet of things (IoT), reducing the possibility of fraud.

Another opportunity for some insurance companies is insuring their business customers against online risk. Just as in other areas, those who have taken measures to mitigate the risk will get cheaper premiums. As the sector relies more and more on online interaction to keep up with its customers, insurers cannot afford to be seen to fall short of the IT security standards they expect of those they insure.


Changing the world - company-by-company, city-by-city, person-by-person?

Clive Longbottom

I had a good conversation with a large computer company recently about its corporate social responsibility (CSR) programme.  Alongside all the normal stuff of recycling, minimising the use of chemical and elemental nasties and minimising the impact on the surrounding environment, we got on to the subject of how a large organisation should view the use of its technology in the wider sense of being "good for the world".

This uncovered some interesting ground.  The first point is that there are three main approaches to what is "good for the world":
1. minimising the impact of the company, its suppliers and its customers on the usage of non- (or slowly) renewable resources
2. creating a sustainable ecosystem such that the impact of the company, its suppliers and customers is nett neutral
3. creating a means of ensuring that the overall impact of the company, its suppliers and customers on the planet is nett positive.

From there, we came down to the issue of what is "good"?  In today's global village, there are many different cultural cliques; providing technology that helps one group may be completely against the aims, beliefs or cultural norms of another group.  This doesn't even have to be by country: the increasing mobility of global citizens can lead to these issues being seen across very small areas, such as within individual cities.

From there, we also have to look at what it is that technology can do, and what those with access to the technology are aiming to do - which can be completely different things.  For example, big data analysis to identify people missing from home with medical needs can also be used to track other people for other reasons as they go about their daily business.  This can get into problems around the concept of "one man's freedom fighter is another man's terrorist".

However, the main discussions centred around what it is that technology vendors can do for the greater good of the planet, rather than for the greater good of individual commercial concerns.  This is where it becomes apparent that the cultural differences between different groups can get in the way.

During the Industrial Revolution, there was mass migration from the countryside into the cities - the saying that London's streets were "paved with gold" being a major draw pulling workers from the villages and towns into the high-growth cities.  However, not everyone found wealth - workhouses flourished and deaths from malnutrition were commonplace.  The size of cities grew, and people suffered as land lay fallow and unused in the countryside for lack of workers.

Wind forward a couple of hundred years, and we see the same happening: not only in emerging countries such as India, China and Brazil, but still within the UK, as workers from within and without the UK still try to follow the money to London.

The company I was talking to had the approach that it could provide technology to ameliorate the problems caused by this sort of mass movement within the cities themselves - dealing with areas such as energy grids, intelligent water usage, traffic and logistics movement and so on.  It wasn't particularly aimed at ensuring that there would be jobs available for everyone, which could be more of an issue.  My view was slightly more radical - technology should be used to try to stop the mass movement of people from one area to another.

Why?  The world is increasingly dependent on a smaller number of people within agrarian systems to provide enough food for a growing number of people who are becoming further removed from any understanding of the food chain.  Agrarian economies are being ripped apart as high-tech, retail and manufacturing companies grow within their regions.  Families find themselves saving up money to send a child away to a city, in the hope that they can make their fortune - even though it is unlikely that they will.

Why not use technology to keep families and communities together?  Make a farmer a better, more effective farmer; encourage farmers to get together as co-operatives to provide a better mix of farm produce to buyers; increase economies of scale of purchasing and production and share workforces.  Encourage artisans to work from their own towns and villages, providing work for others in their communities.  Use technology, such as video conferencing and screen sharing alongside IoT monitoring, for education and medical support for those towns and villages that are remote - and should stay viable as remote communities.

It seems to me that many tech vendors see technology as something that fits a series of small problems that are not always joined up into a bigger picture - intelligent cities, organisations and so on.  The knock-on impact is that technology is doing the exact opposite of what it should: it is creating a two-tier environment of technology haves and have-nots, with the haves focused on specific centres.  These centres then create problems such as the need for increased energy distribution, travel congestion, and rising housing and food costs, where more technology is needed to minimise the impact, and the spiral continues.

No - let's improve the planet one person at a time: let's find out what an individual truly wants out of their life and apply technology to help them.  As each person gets what they need, they will work more closely together.  As communities become empowered, they will be more efficient and effective.  As communities build to provide support for themselves and to others outside the community, the growing world population may find that it can be adequately supported.

And only then can technology be seen to have had a nett positive impact on the planet.

Supporting serious software investment

Rob Bamforth

The language of IT is not only peppered with technical jargon and the ubiquitous TLAs (three letter acronyms), it also afflicts the purchasing processes. The term 'big iron' is not just for the benefit of the 'tin-shifters' involved in selling computer hardware, it also helps those 'server huggers' and 'bean counters' in control of IT budgets understand they are getting something substantial.

Indeed, when open systems burst through the cosy proprietary computer industry in the 1990s, many buyers were a little put off to be told that their IT hardware systems, only a few years old and the size of a couple of bulky washing machines, could easily be replaced by a 'pizza box' whose performance and capacity would also provide future-proofing for up to a decade or so.

Despite the frequently encountered resistance to spending any budget, IT managers generally find getting approval to buy hardware easier than for software. Those in control of the purse strings, as well as the IT department itself, want to see something tangible for their large-scale investment. That software covers so many things - from the hidden depths of operating systems, through application servers and databases, to the applications users recognise - does not help its investment case.

Software may also suffer from being seen not only as a bit ephemeral, but also somewhat simpler than hardware, which after all needs to be fabricated in sheets of silicon, hot dipped in solder, cased in steel and eventually enveloped in plastic (thankfully less often beige than in the past).

Surely software can be knocked up in someone's bedroom - most people outside the software industry have a young niece or nephew who has been writing programs at home in their spare time. And it is so easy and cheap to copy, so perhaps it is not worth that much after all?

This hobbyist view of software stretches from the teenage geek owners of early home computers right up to the present-day image of casual software developers creating mobile apps at home. In reality the industry is massively more complex and sophisticated, despite never quite living up to the term 'software engineering', coined in the late 1960s.

Software is complicated and takes time and effort to get right, even with the latest moves towards more rapid application development and shortening the develop/test/deploy cycle through 'DevOps' processes. Despite being a virtual product, good commercial software with sufficient industrial strength to run a business costs real money.

Software is just as worthy of investment, yet it has been viewed differently to hardware, with many companies also finding it difficult to fund through a financial package, as many finance providers also view software as separate and different. There is no reason why this should be so, and there are opportunities to spread the cost of software investment through external financing; yet too often companies will try to cut corners in order to keep software costs down. These approaches are mistaken when alternatives allow software investment decisions to be made for business reasons rather than spurious financial ones.

So what are the potentially harmful software investment avoidance practices that might be dealt with through proper financing?

Delaying software upgrades. While many vendors might rightly be accused of capitalising on maintenance and frequent major re-releases of their software, there comes a time when delay will simply increase long-term costs and could expose an organisation to other problems. A recent case in point has been the procrastination by many companies over leaving behind Windows XP; the other major one is Windows Server 2003. There will always be extra costs incurred, and mass upgrades should not be made too early, to avoid making mistakes; but leaving it too late costs more, opens up security risks and makes staff less productive because they have to use out-of-date software.

Delaying hardware upgrades with software dependencies. This is often a problem where companies struggle to find a way of financing both the hardware and software elements of a major upgrade programme. Rather than delaying the whole process, which will have been started for good business reasons, there is a need to find a way to finance the whole project.

Bearing down on licence numbers. Restricting licences on some arbitrary basis simply to keep software costs down is a false economy. While some software products may be available through a more flexible software-as-a-service model that allows for incremental users, not all will be. Forcing users to work less efficiently because of restricted access to certain software products, rather than providing what is best for the business, hurts productivity. Spend all the way up to the numbers required, look for opportunities where other users could be added to the mix, and there may also be ways for the business overall to gain from the economies of scale of enterprise or site licences.

Good enough? In many areas of software there are products that approximately compete or overlap in certain functionality. They may on the face of it appear comparable, but without meeting the complete set of user or business needs. Those without in-depth knowledge of the requirements may assume they are good enough, especially if they are cheaper than a more functional alternative, which could end up causing problems for the business or leaving users frustrated and less productive. Good business value comes from investing in what is actually needed rather than in cheap and cheerful alternatives.

Hardware may appear to many to be the driver for IT investment, but it is software that delivers the business value in the end. Getting the right software to do the job is far more important than reducing the cost of investment. It needs to be fully and appropriately funded, and can be externally financed just like hardware to ensure that the right tools are delivered, at the right time, to get the right business outcomes.

For some thoughts on how to finance the changes required to address an enterprise software strategy as a whole, download this free report.


The software game - Perforce's new Helix Platform

Bob Tarzey | No Comments
| More

A few years ago a friend of mine told me he was concerned that his son wanted to take a degree and find employment in the software games industry. I think my friend thought it might be a frivolous career path. Not at all, I reassured him: the games industry is deadly serious; it is to software what Formula 1 is to the motor industry, at the bleeding edge, and the UK is a world leader.

A 2014 report from Nesta, a charity that promotes innovation in the UK, estimates there are nearly 2,000 games companies in the UK contributing billions to the economy. The industry is estimated to be growing at 22% per year. Some may consider playing games frivolous, but building and selling them is not.

However exciting a game may be to play, building robust, performant and attractive packages requires the same rigorous processes that must be applied to any software development project. For games software the project management tools must be capable of handling not just software source code, with all the version control, configuration building and testing that entails, but also the wide range of other content, not least high definition video.

The team members that build games generally fall into two camps: the more technical software developers (generally working in C/C++, with some scripting in Python and Lua) and more artistic types working on the models and textures that make up the game's landscape. The management tools must co-ordinate between them and bring their efforts together.

When it comes to software configuration management (SCM) in the gaming sector, the market leader is Perforce Software, which claims 18 of the top 20 games developers as its customers, including Electronic Arts and Ubisoft. So, although Perforce is active across all industry sectors, it is not surprising that its biggest ever product announcement, on March 6th, had plenty of new stuff for its gaming customers, including extended support for multi-media.

The headline was effectively a re-branding of Perforce's SCM product set as what it now calls the Helix Platform. Perforce's aim is for the Helix Platform to sit at the heart of its customers' development operations as a centre for co-ordination, like the role DNA plays in a cell - the structure of which the new platform's name alludes to.

New capabilities include:

  • Extended support for multiple content repositories: this goes beyond just those used for software, such as Perforce's own P4D (now renamed the Helix Versioning Engine) and the popular Git-based GitHub. Helix also enables the sharing of assets from cloud storage systems such as Dropbox, which are often used to share large multi-media files, and their inclusion in controlled software builds - something Perforce says has been tricky to do in the past.

  • Collaboration for multiple developers is enhanced through Helix Swarm and GitSwarm, which connect widely dispersed contributors and improve project workflows.

  • Protection for software IP (intellectual property) is added through Helix Threat Detection, a kind of SIEM (security information and event management) capability that is specific to Helix and the digital assets it holds. It looks for unusual behaviour, such as a user downloading an abnormal (for them) number of files. This might be a sign of a compromised account or an insider stealing IP as they head off to a new job (not uncommon, as Quocirca reported in its 2014 report, What Keeps Your CEO Up At Night? The Insider Threat: Solved With DRM).
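The kind of behavioural check described for threat detection can be sketched as a simple baseline comparison: flag a user whose activity today far exceeds their own historical norm. This is a minimal, hypothetical illustration of the idea, not Perforce's actual algorithm:

```python
from statistics import mean, stdev

def is_anomalous(history, todays_count, threshold=3.0):
    """Flag a user whose download count today far exceeds their own
    historical baseline (mean plus threshold standard deviations)."""
    if len(history) < 2:
        return False  # not enough data to establish a baseline
    baseline = mean(history)
    spread = stdev(history)
    # Guard against a perfectly flat history (stdev of zero)
    limit = baseline + threshold * max(spread, 1.0)
    return todays_count > limit

# A user who normally fetches a handful of files per day...
normal_days = [4, 6, 5, 7, 5, 6, 4]
print(is_anomalous(normal_days, 6))    # a typical day
print(is_anomalous(normal_days, 250))  # a sudden bulk download
```

A real system would of course track far richer signals (time of day, file sensitivity, peer-group comparisons), but the per-user baseline is the essence of "abnormal for them".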

If the games and other software-based industries in the UK are to continue to be a source of growth they need to continue to produce high quality products, and that requires good management tools. To capture start-ups, Perforce is planning a free, cloud-based Community Edition due to be available later in 2015. This will be missing some key capabilities; for example, it will not have threat detection, which is only available via a paid upgrade to Helix Cloud Premium Edition or to Helix Enterprise, which can be managed on-premise or hosted by Perforce.

As for my friend's son - he got a First in Gaming Technology and has gone travelling. As it happens he has never heard of Perforce, although project management was covered on his course. He looks forward to a great career in the software industry, in gaming or elsewhere, where he will get fully involved in the rigours of software configuration management and may well discover Helix.

Boosting video conferencing confidence - simple and small steps

Rob Bamforth | No Comments
| More

The spread and adoption of video conferencing in the office might have been held back in the past when the technology was expensive and network capacity was limited. Today, however, high definition screens are available even on devices as small as a smartphone, along with high quality cameras, putting video into almost everyone's pocket. In addition, with the widespread availability of fibre and high-speed mobile networks, sufficient bandwidth to reach even remote locations is less and less of a barrier.

With many of the technical limitations largely removed, it is often the human issues that remain. Using and being filmed on video is something that many people have become much more familiar with as consumers, especially on smaller, mobile and non-traditional IT devices like tablets, but how does this familiarity turn into acceptance of video as a tool for the workplace?

New worldwide research surveying over 800 business video conferencing users shows that in many companies video usage has spread well beyond the boardroom to general meeting rooms, desktops and even mobile devices. Several interesting questions remain, especially around getting employees to feel really comfortable with using video, so I asked Roger Farnsworth, a senior director of services at Polycom, the sponsor of the research, how he felt these issues were being addressed.

Rob: Why do users still feel they need handholding?

Roger: Often users find purpose-built video environments most intimidating. Users are not reluctant to click links and mash buttons on their mobile devices or laptops; however, when entering a panelled boardroom with chic electronics they fear breaking something. We've found that if a user is exposed to the group conference tools, either through training or simple how-to videos, they are far more likely to jump in and give it a go. In fact those users that have personal access to video as a regular part of their portfolio of business tools, personal Virtual Meeting Rooms (VMRs) for example, quickly catch on and begin using ad-hoc video more often.

So, some of the issue is a reluctance by IT to let go. Our research found that those whose video calls are most often assisted by IT are also the most likely to blame video for being complicated to use or unreliable. The survey found more than 50 percent of people who regularly use video rarely or never need IT's help to place a call. This is because video solutions have come a long way from their original iterations. Often, those with the most trepidation have previously been burned by a poor user experience, which is why it is so important to get it right first time round.

Familiarity, coupled with regular usage, normalises the use of video, and this seems to have the biggest impact on whether employees can manage without IT assistance. From the research it was also clear that user confidence rises with informal use of video, and it seemed higher when video was put to use in a variety of communications applications - not simply replacing regular meetings such as team meetings, but also in personal communications like interviews and one-to-ones.

In this way, video becomes just another day-to-day communication and not a 'special event' conference.

This normalisation is further improved when location restrictions are removed. At one time, video conferencing systems were only available in boardrooms or other special meeting rooms and customer briefing centres. Desktop conferencing systems widened access further, but often only to the desks of senior management. The move to general desktop PCs with low cost cameras, laptops and now smartphones and tablets completely removes location as a barrier to video usage - except perhaps for reasons of privacy - and video on the move can add a new dimension to its value.

Rob: What does small screen mobile usage contribute to video collaboration?

Roger: The average individual today uses one kind of mobile device or another; in fact it is hard to find a person who doesn't. A majority of companies equip their workforce with mobile devices and support a BYOD culture. Companies are embracing the mobility of their workforce because it means that business-as-usual can be conducted flexibly from any location at any time. Flexible working is a boon of small screen devices, allowing workers to deliver their work from outside the office premises, and it will soon become the norm. Its benefits extend to emergencies, such as extreme weather conditions or train strikes, when video collaboration over mobile devices ensures business continuity.

This is where proliferation of the small screen devices industry converges with the advances in video collaboration. Ultimately, this is how the investment in technology is paying off for businesses and chiselling the shape of the future of workplaces. It really is a case of #videoforall and the real question is why should your business be left behind?

Ultimately the success of mobile devices has been less to do with technology and more about people and in the work context, process. For individuals there is increased choice - from BYOD, but also from the variety of form factors and sizes of mobile devices available. Good design, styling and even just fashion branding have helped foster personal connections with mobile devices.

The business process impact is more on the organisation than the individual, but it is important to both. At one time the constraints of technology imposed themselves too much on the process and the person: you had to return to your desk to communicate with someone (or visit them in person), and return to your desk again to access the wealth of data stored in your organisation's IT systems. For those who are not office workers - on a factory floor, treating patients in hospitals, on site in the field, travelling with goods - this distraction of having to go somewhere special to use a communications tool affects productivity, and let's face it, even office workers are not glued to their desks.

Mobile devices and the shifting of IT access and communications tools directly to where the individual needs to be to work on their business tasks helps them be more productive, responsive and to collaborate better. Now that video can be an intrinsic part of that approach too, there is no reason why it should not be adding similar value to the business process. To read more about the impact of video adoption, download this free report.

Making more of print

Rob Bamforth | No Comments
| More

Mobile, cloud, the internet of things and other fast expanding technologies might be making dramatic changes in many organisations, but the much heralded 'paperless office' is still not only an out-of-reach concept, but for most it is way out-of-sight. The reality is that a great many companies - large and small - still operate a hybrid mix of paper and digital workflows. And both of these workflow models need effective management.

Many organisations and especially their employees have bought into the concept of bring your own device (BYOD), where some of the complexities and costs of mobile working are essentially borne by the individual. When it comes to getting any mobile thoughts and experiences onto paper however, even among small and medium enterprises (SMEs), the onus quickly moves back to the organisation. Quocirca research shows that the business activities of three quarters of SMEs depend on printing and over half are struggling to control costs.

It is clear that trying to cut costs by simply and blindly limiting access to printing is not the right approach, yet without some controls the entire process of managing print resources and using paper workflows becomes unnecessarily problematic.

This means all companies need a well thought out strategy for print, and a complete plan for how to accomplish what is required. Major elements to address include:

  • Print reliability - On the face of it, printing might seem a simple, low cost investment - after all, who hasn't been amazed at the low cost of a personal printer for home computing needs? However, paper workflows are integral to most business processes and the consequences of downtime can be serious; cutting corners with poor quality devices that break down too often, or lax service with too many interruptions, will disrupt business processes and frustrate employees.

  • Software and services - While this seems a heavily hardware-dependent part of the business support environment, there are plenty of key roles for software and services to play in the print environment, either installed and managed directly by a company or as part of a set of managed services for print (MPS). Not only is there a need to extend printer access to different types of devices and provide suitable queuing controls and management, but many organisations will also want to limit who has access to which print resources and to have a way of accounting for usage.

  • Asset management - In addition to this operational management requirement, there is a need to manage the entire lifecycle of individual assets, including hardware systems - printers and multifunction printer (MFP) devices for document scanning, copying and print - as well as a wide range of consumables such as the ink, toner and paper which will require processes for ordering, stocking and effective deployment and installation.

  • Appropriate hardware - It is generally a false economy to keep any old IT equipment for too long. All technology improves, becomes cheaper to run and offers increased flexibility to users over time, and print is no different. New MFPs are far more efficient than older printers, both in power consumption and in more sophisticated paper, ink and toner handling, reducing the use of precious resources and keeping running costs down.

  • Network connectivity - MFP devices also provide greater flexibility through their additional functionality, especially when connected to the internet. Increasingly they have features such as the ability to scan a document, perform OCR to determine its content and then route it automatically to other locations - for example, sending expenses or invoices to be processed. Connectivity means they can share 'upwards' to offer storage of scanned information in a cloud service and 'downwards' to provide wireless printing from mobile devices and on-demand printing.
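The scan, OCR and route behaviour in the last point can be sketched as a simple keyword-based dispatcher. This is purely illustrative - real MFP workflow engines are configured through vendor tools rather than coded like this, and the rules and queue names here are invented:

```python
def route_document(ocr_text):
    """Pick a destination queue from keywords found in the OCR'd
    text of a scanned document (illustrative rules only)."""
    text = ocr_text.lower()
    rules = [
        ("invoice", "accounts-payable"),
        ("expense", "expenses-processing"),
        ("contract", "legal-review"),
    ]
    for keyword, destination in rules:
        if keyword in text:
            return destination
    return "general-inbox"  # fall through when nothing matches

print(route_document("INVOICE No. 1044 - due on receipt"))  # accounts-payable
print(route_document("Holiday request form"))               # general-inbox
```

Even this toy example shows why the capability matters: the routing decision happens at the device, so paper enters the digital workflow without anyone re-keying or forwarding it manually.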

Printing has long been a key part of IT systems and, despite advances in displays and other technologies and the appeal of paperless (or at least 'less paper') offices, paper workflows still drive and support significant elements of almost all business processes.

Purchasing the right hardware to meet all user print needs will make employees more able to reach their productivity objectives and help ensure business continuity.

It should keep costs down with more efficient use of energy and other resources and have the management attributes that allow the organisation to not only securely control access, but also understand and identify what is being done at a sufficiently fine-grained level to properly account for usage.

A casual investigation of any print environment reveals a complex mix of requirements and the need for serious investment in many areas. However, the load can be spread to match usage through intelligent use of external financing and by investigating managed services for print. The key is to ensure that available financing covers all aspects - increasingly complex hardware, software demands and managed services - to keep the ink and paper flowing.

For some thoughts on how to finance the changes required to address an enterprise print strategy as a whole, download this free report.

Stepping up to the cloud

Rob Bamforth | No Comments
| More

The concept of services provided from 'out there' over the network is often portrayed as what cloud is all about. Indeed, the commonly accepted definition of cloud is that it offers infrastructure, platform or software as a service. It might be a public cloud provider offering services to anyone, an internal service from the organisation's own data centre, or a hybrid of the two. There might often be reasons to start out exclusively with either public or private cloud, but the pragmatic approach will generally balance the two, with an accompanying agility to allow for the movement of workloads.

A key reason behind cloud is flexibility. With a virtualised core, any operating platform or system can be provided on demand with a pay-as-you-use public commercial offering or as an adaptable base for a private cloud model.

The resulting lowering of the total cost of ownership is often an important driver for those considering adopting cloud. However, cost saving alone is rarely the whole picture, as organisations are also looking to drive efficiency and new working practices, and here cloud at the core is complementary to the adoption of mobile at the edge.

There are also drivers offering opportunities to move into new services, such as extending access to core infrastructure to external users, again often for business efficiency, or gaining access to new market sectors, as well as adding functionality or extending applications.

The cloud model also permits experimentation, not only for technical proofs of concept, but also for business trials to, for example, extend into new territories or markets, or try out alternative business models. This is a prime case for public cloud, minimising the impact on internal resources.

A public cloud model has an impressively low upfront commitment or investment, which leads to several strong use cases in addition to the ability to trial or experiment with new ideas - for example, dealing with planned and unplanned peaks or dips in capacity, as well as offering lower cost redundancy and software testing. Taking advantage of this will, however, require investment in existing internal systems to architect them to take full advantage of the external resources on offer.

Many organisations will have known peak times at year or quarter end or at particular stages. Owning the extra capacity required to handle this will rarely be cost effective, so the ability to call upon public cloud resources is very valuable. Unplanned changes in demand could occur on the back of an unexpected success that requires a rapid ramp up of capability, perhaps only for a short duration.

Rare use of additional capability is why on-demand public cloud services are ideal as a failover platform. Again, owning and maintaining an entire system for business continuity reasons is an unnecessary expense if use can be made of a public cloud provider, scaled up only when actually required in the event of a failure of primary systems.

Private cloud on the other hand is sometimes viewed more sceptically from a cost benefit perspective, but it is here that investment in architecting a cloud-like core and transforming the data centre can also pay dividends over time. Not only is it a pre-requisite foundation for the fully flexible hybrid cloud model where workloads can be moved from private to public data centre capacity as demand arises, but it also encourages a more effective internal IT infrastructure capitalising on server virtualisation and therefore more efficient pooling of resources.

Putting aside sufficient resources and finding the right skills is difficult with any technology investment, and cloud is no different. It is these resourcing issues that often hold back cloud projects, rather than security per se.

Even those with an enthusiastic attitude to cloud adoption recognise this; despite believing they have more of the skills, they understand there is a need for significant investment in resources to make safe and secure use of the potential that cloud brings.

The cloud model of service delivery to an increasingly diverse and often mobile edge offers huge flexibility and agility advantages in both public and private capabilities to support workloads. As with mobile, this is a difficult task to perform incrementally since it requires investment and a re-architecting at the heart of IT, but with the right strategy it ultimately reduces both costs and risks while delivering greater capability for IT to provide increased value to the business.

There are often incremental improvements as technology advances - the oft-quoted marketing mantra of smaller, faster, more. However, every so often there are bigger changes that involve a different structure or way of thinking about what has been built before. There may be a need to rip up old systems and throw some things out - although with planned migrations this can be minimised - and there will certainly be a need to invest in something new.

Rather than always trying to take this route in a gradual fashion, some innovations like cloud imply a step change in thinking in order to take full advantage of the opportunities offered. The diversity of access to IT in an increasingly small, smart and mobile fashion, coupled with a cloud-based core using flexible service provision is one such instance where change is significant and needs significant attention and investment to maximise its value.

For some thoughts on how to finance the changes required to address an enterprise cloud strategy as a whole, download this free report.

Avoiding a piecemeal mobile strategy

Rob Bamforth | No Comments
| More

It is easy to see that the world is 'going mobile', from smartphones and tablets to radical innovations such as wearable technologies and the highly connected internet of things. The impact on consumers is wide-ranging and fast-changing, but despite this, some businesses seem to think that this is a phenomenon they can take their time to evaluate to see how it 'pans out'. Or their IT departments think that it will be ok to slowly edge towards decisions by perhaps dealing with those shouting loudest (often senior executives) first.

This would be a mistake.

While it is true to say that early mobile adoption was often the domain of the 'pink collar' executives (so termed because they might buy their shirts at Thomas Pink in London or New York) with devices such as the business-like BlackBerry, mobile usage, acceptance and even eagerness has spread to all job roles through a multitude of desirable smartphones and tablets from Apple, Samsung and others.

This led to the trend of bring your own device (BYOD) among many employees, but rather than reducing company expenditure by eliminating the potential need for the organisation to purchase the hardware, it transfers investment demands into other areas, particularly IT management and security.

In previous deployments of mobile technology, select individuals could be given, for example, a corporate laptop, and costs could be foreseen and planned by making incremental increases in the number of employees to whom laptops were deployed.

In the modern mobile world the challenge rapidly surges to include all employees and all devices. Some organisations are trying to contain the growth of mobile access, but employees are aware of technology options through their consumer use and think that the technology they have at home is often better than in the workplace. Clearly this is not efficient or effective, so organisations need to embrace the mass adoption of mobile and apply the right resources to make it safe, secure and sustainable.

Acceptance of user choice, either by permitting a subset of BYO devices or by the organisation buying devices more in line with employee preferences essentially means that almost any type of device will need to be supported and managed. This means investment in a mobile device management (MDM) system, ideally closely related to desktop management tools, but with further capabilities to deal with mobile specific issues such as cellular and public Wi-Fi network access and airtime contracts.

The risk of loss or theft of mobile devices is not insignificant, although somewhat reduced when personal choices are exercised or the device belongs to the individual, and although insurance can cover hardware costs, loss of working time and data is more significant. A suitably sophisticated MDM will make it simple to not only apply some uniform protection and configuration controls to the complete fleet of devices, but also to quickly re-instate a new device with the setup of one that has been lost or stolen.

However just protecting the hardware is no longer sufficient, especially as employees expect to be able to use mobile devices for personal as well as business purposes. Organisations must also take a strong interest in mobile applications and apply suitable levels of security to both what apps are on the devices and what corporate applications and data can be created, used or accessed on the move.

It might be that the best way to serve a large fleet of mobile users in mixed roles is to put in place a corporate app store and some way for employees to self-provision access to central services and applications, with appropriate controls to ensure that only those employees who are entitled to certain applications can obtain them.
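The entitlement check at the heart of such a self-provisioning app store can be sketched in a few lines. The role names and application lists below are invented purely for illustration:

```python
# Hypothetical role-to-application entitlements for a corporate app store
ENTITLEMENTS = {
    "sales":   {"crm", "email", "expenses"},
    "finance": {"ledger", "email", "expenses"},
    "all":     {"email", "directory"},  # granted to every employee
}

def can_install(role, app):
    """Allow self-provisioning only for apps the employee's role is
    entitled to, plus anything in the everyone ('all') set."""
    allowed = ENTITLEMENTS.get(role, set()) | ENTITLEMENTS["all"]
    return app in allowed

print(can_install("sales", "crm"))    # sales role is entitled to CRM
print(can_install("finance", "crm"))  # finance role is not
```

In practice these entitlements would come from the corporate directory or MDM system rather than a hard-coded table, but the principle - self-service within centrally defined limits - is the same.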

To get enough future flexibility to bring the best out of mobile working, this type of strategy and support model needs to be put in place today, rather than enduring the chipping away of IT support resources by the ad hoc creep of a diversity of mobile devices, users and applications.

Getting it right might require more investment upfront, but it will save time and money over time and ensure that employees are as productive as possible from the outset.

In addition to controlling who has access to what apps and on what devices, the vulnerability and integrity of data accessed and used on the move needs to be assessed and securely managed.

This is the issue most likely to be keeping IT managers awake at night, and no wonder. There have been plenty of high profile losses of data, and mobile makes the task of data security much harder. There are software tools that will protect against data loss and leakage, as well as applying digital rights management. In addition to the technology, this also requires one vital ingredient that is frequently overlooked - training. All too often only simple 'how to use' training is put in place, but mobile technologies encourage such a dynamic shift in work practices that employees would benefit greatly from coaching in how to safely and securely get the best out of the tools at their disposal.

Effective mobility benefits from a wide strategy that encompasses productivity and training. It is thus not a piecemeal approach, but all embracing. It doesn't necessarily mean providing everyone with the latest gadget from California, but it does mean having a way to cope with managing a portfolio of technologies, dealing with complexity, and requiring a step change from thinking small (focusing on the devices) to thinking big (how it changes the business).

Overall, mobile brings huge benefits but significant changes to organisations, and the old model of small proofs of concept and slow rollout is no longer valid. Employees are well aware of what technology is available and want to participate in selecting what works best for them as individuals. However, the collective needs of the organisation mean that controls need to be put in place, and IT departments need a strategy for the safe management of devices, apps and, most critically, data used by employees on the move.

For some thoughts on how to finance the changes required to address an enterprise mobile strategy as a whole, download this free report.


Samsung's bold ambitions to transform office print

Louella Fernandes | No Comments
| More

At its recent press event in Monaco, Samsung outlined its latest plans to expand its foothold in the enterprise printing market. As a company, Samsung is already in the midst of looking for new revenue streams for growth, as focus shifts from its consumer business and it looks to better serve larger businesses and deepen its relationship with the enterprise customer. 

In the printer market, Samsung is banking on its new "Office Transformer" A3 multi-function printer MX7 as key to establishing a stronger presence in the enterprise, offering a credible alternative to established competitors such as Canon, Xerox and Ricoh.

Samsung's target is to be a tier 1 manufacturer of A3 MFPs by 2017. This is a bold ambition given the competitive space - however, its OEM agreements with vendors such as Xerox mean that as a manufacturer it already has a strong presence in the market. Its aim is now to further develop its presence in the enterprise market under the Samsung brand.

The MX7 is designed for heavy-duty work environments and is capable of printing up to 300,000 pages a month, and works with large toner cartridges that give users up to 30,000 pages of colour or 45,000 pages in mono. Other add-ons include an automated document stapler capable of stapling 60 sheets of paper or an 80-page booklet.

In particular, Samsung is targeting high-print volume markets such as professional service organisations, financial institutions and government organisations with its latest MX7 series products. Certainly, the Samsung MX7 series has strong credentials to help propel Samsung into the enterprise space.

 Highlights include:

  • Fast processing power. Today, the Samsung MX7 is the only A3 MFP on the market powered by a quad-core CPU. The 1.5 GHz quad-core CPU enables faster processing speeds than a dual-core CPU.  This is combined with 1,200 dpi high resolution quality output. Along with a dual-scan document feeder which gives up to 120 single-sided images per minute, and up to 240 double-sided images, these products are an ideal choice for high speed document capture.
  • Smart User Interface. Samsung X7600 MFPs also boast the industry's first Android-based printer user interface. The Samsung SMART UX Center 2.0 functions just like a tablet with a touch-to-print display screen that can be pivoted to get true document views.
  • Downloadable Apps. Samsung's new Printing App Center enables users to set up printers by downloading essential apps from the app centre's web portals. This includes the Workbook Composer which gives users the ability to crop desired content and scan and save without the need for a PC.
  • Secure mobile printing. Samsung Cloud Print uses Samsung's private cloud which can be enhanced with its mobile device management (MDM) solution for full integration with enterprise mobility. Also offered is a wireless option with active near field communication (NFC) which enables printing, scanning or faxing of documents from any NFC-supported mobile device.
  • Customised solutions. Samsung's eXtensible Open Architecture (XOA) provides customised enterprise solutions integrated with its MFPs, such as output management, document security and document management solutions.
  • Samsung Smart Service. A key differentiator is Samsung's Smart Printer Diagnostic System (SPDS) which aims to reduce time spent on maintenance. SPDS is a mobile application which provides a technical service system for service engineers. It can also guide others with little technical knowledge and experience to fix a printer issue without the need to call the technical support engineer and incur cost.

Quocirca believes that Samsung is well positioned to grow its presence in the enterprise space with its latest models. Its sweet spot is likely to be the entry-level space rather than competing head to head with its more entrenched competitors. Samsung is wisely focusing on expanding its solutions and services capabilities to gain further traction in the enterprise market.

 Although this is not the first time Samsung has talked of a move to grow its enterprise presence, it now seems more energised with some real focus and clear strategies to achieve its goals. Perhaps the biggest challenge is to be seen as a credible player in the managed print services (MPS) space where it is late to the market. Here, Samsung needs to move quickly to establish a presence, and will need to leverage partnerships rather than building an infrastructure from the ground up. Certainly MPS could be its strongest weapon to grab more share in the coveted enterprise market.

Getting value from video conferencing

Rob Bamforth | No Comments
| More

Despite advances in technology often bringing business costs down, IT investment always requires justification. With communications in particular, the challenge is tougher as there are knock-on costs, such as further infrastructure investment to support the changes, or significant impacts on user behaviour that require training and perhaps updated HR policies.

Video conferencing is a case in point. Businesses may well believe that the value - not only from reducing travel or benefitting the environment, but also from improved productivity and responsiveness to customers - is worth it. But they will still need to be sure that they are taking the right investment decisions, especially when starting out on a new installation.

Quocirca's 2014 worldwide research project, surveying over 800 current business video conferencing users, makes it clear that while most companies believe they have been getting good value from their investment in video, it still has to be regularly justified. In an age where many believe consumer technology is 'good enough', making this justification at the start of the project is even harder. I asked Roger Farnsworth, a senior director of services from Polycom, the sponsors of the research, what he hears about the value of video conferencing on a daily basis from talking to those who are starting out on the journey.


Rob: Video collaboration solutions are expensive compared to some well-known free tools - where does the extra value come from?

Roger: Generally it boils down to three things - quality, security and choice. Most organisations wouldn't consider free security tools or phone systems, and it's for the same reason that they should invest in a video conferencing system. The quality of free, web-based software is often inferior to the full HD video you get from a specialist like Polycom.  Investment in a specialised, more comprehensive solution delivers better audio and video quality that enhances the host company's brand.

Compliance is also a major consideration for enterprises; many are legally obliged to conform to data protection and privacy regulations. Paid-for systems aid them in this.

A dedicated video collaboration solution also allows for better integration into your specific workflows.  This is partly because of its integration with standard enterprise tools such as Microsoft Lync and also because it can be customised to suit your specific needs.


According to the research, the quality of the overall experience is an important factor for boosting adoption of video, and thus gaining greater overall benefits. Some of this was expected to come from having a more reliable system and improving infrastructure such as network availability, but higher definition video was also seen as important.  Video experiences do not all have to be high end immersive telepresence, but decent quality does play a significant part in making employees more comfortable with using video.

Many employees will have experienced some challenges using early video systems or will have heard stories about problems in the past from colleagues. In an organisation that is either installing video for the first time, or extending existing systems to be used more widely, this 'video folklore' or perception of problems will not help adoption.

When Quocirca dug deeper into the research and talked directly to installers of video systems, it became clear that many are not doing enough after making the purchase decision to get the best out of their installation. This is not helpful and can result in reinforcing negative perceptions about using video in the workplace, or denting the confidence of employees so that they only use video conferencing if there is someone on hand to provide assistance or set up the communications for them.


Rob: What can be done to ensure new video collaboration customers get off to an effective start?

Roger: There are several simple steps that an organisation can follow to ensure the smoothest possible roll out of video collaboration. The most important is thinking how video is actually going to address the business challenges and needs and then anticipating how it will fit in to the end users' daily routine. Video that is integrated into workflows will be much more rapidly adopted than a system that doesn't seem contextually relevant.

The second step is to prepare end users for what's coming, to make sure they are comfortable with the process and ready to engage. Think about the user profile and pick the methods best suited to them. For example, your digital natives and millennials will be happy to watch YouTube videos and tweet their questions to your support desk, but baby boomers might prefer a more personal and formal approach such as webinars, online tutorials and physical workshops. It's key that users know what to expect and do not become concerned or nervous about this being a tool for them to use in the future.

Lastly, remember you only get one chance to make a first impression. Users should find collaboration tools easy to use wherever they are working. People who have an experience that is simple, with clear menu options and error codes, quick and reliable connections, and who get a satisfactory audio and video experience the first time they try are much more likely to become return users. Ensure that your users have a positive, quality experience first time and every time.


It is quite easy to look at consumer usage of video conferencing and think it will translate directly into straightforward use in the workplace, but this is rarely the case. While regular consumer usage builds awareness and familiarity, it is not sufficient for the rigorous challenges of the workplace. Things do not only need to be easy to use, they have to be reliable and build confidence that they will portray a professional image.

Partly this is down to the conferencing and collaboration tools and how well the infrastructure supports them, as well as how conducive the overall workplace is for video use. Some of these factors are environmental and need to be put in place to provide the right settings, easy mechanisms for establishing calls and so on. However, some factors are personal. Pro-active training and facilitation from the outset will help establish confidence, and this can be further developed with increasing awareness of the value and management commitment to video usage - fostering a positive culture of video adoption.

It is a significant investment, so it would seem foolish to do anything other than take it seriously and ensure that everybody in the organisation gets the best out of it. To read more about video adoption, download this free report.


Should everybody be on video?

Rob Bamforth | No Comments
| More

Video conferencing has many times been presented as THE solution for many business communications challenges, and yet beyond walnut veneer boardrooms and despite the wider usage of video in personal communications, business adoption often seems tantalisingly muted.

Having embarked on a worldwide research project surveying over 800 current business video conferencing users, Quocirca identified several usage patterns, but some burning questions remained, which I put to Roger Farnsworth, a senior director of services from Polycom, the sponsors of the research.

The first question concerns the matter of how many working hours are taken up with what seem to be endless, pointless meetings - a point the vendor agrees on, suggesting how video could change this working pattern.


Rob: Many employees feel that they have too many meetings already - isn't video collaboration just a way to hold meetings remotely?

Roger: "It's not that employees have too many meetings; that's a function of business culture. Organisations still have to be smart about time management; however, video collaboration can make necessary meetings more productive. In the UK alone, time wasted being unproductive in meetings is estimated to cost the economy £26 billion every year. That's because, of the four hours the average worker spends in meetings each week, 2 hours 39 minutes is wasted. This is down to travel time, waiting for rooms when the previous meeting runs over, waiting for latecomers and so on. Workers can be more productive when they don't have to physically go to a meeting room and wait for a meeting to start. When dialling into a meeting room from your desk you can continue to work right up until the moment the meeting starts.

Video as a medium also speeds up the meeting process. Essentially, meetings are a way to reach consensus on issues and make decisions. In our recent research, more than 80 percent of those using video collaboration said they experience faster decision making. The ability to launch a group video collaboration anytime, anywhere means no more long, convoluted email trails as a preamble to a lengthy meeting. And of course both remote and external participants can join easily, so that the group can be effective and efficient. Video collaboration promotes smarter and faster decision-making."


It is clear that for video to effectively change the way people work, share information and make decisions more efficiently, more people - in fact, pretty much all employees - would need to be using it. The reality is that in many organisations video conferencing usage exists only in pockets: either the walnut-veneered boardroom, certain team meeting rooms or a few privileged desktops. This seems oddly restrictive when so many have become so accustomed to advanced communications, including video, as consumers.

However, the research also indicated that some organisations had a much more progressive attitude than others. In these, video conferencing usage had become accepted, normalised like using the phone and very widely adopted. So what makes them different?


Rob: What do you think are the characteristics of an adoptive video culture?

Roger: "Organisations with a high percentage of digital natives and millennials will see a video culture develop rapidly. This is because these workers are more used to using video in their personal lives, with consumer solutions such as Skype and FaceTime.

However, there are other key factors. Organisations that are constantly revisiting process and policy in pursuit of improvement adapt and evolve more quickly. Those organisations where IT is a more active participant at C-level will see video collaboration integrated into business processes and therefore adopted quickly too. Having IT advise lines of business leaders on integration of video collaboration drives adoption from the top down.

In order to foster a bottom-up movement in terms of video adoption it's important to develop a more democratic work environment where employees feel empowered to run with the tools provided. This means making video for all, not just managers. Dissolving hierarchical access limitations is absolutely essential."


The research backs up these comments. The video conferencing industry has progressed through several stages of evolution, with the perceptions from some earlier hang-ups lingering a little longer than necessary. The technology needed to mature to become easier to use and more reliable, and networks needed to grow in capacity to support higher definition video. This has largely happened, with perhaps the odd rough edge in usability still needing some polish.

The next steps involve non-technical challenges such as social, psychological and political (the internal politics of management) acceptance. These influence the culture in the workplace and attitudes to how people communicate. Getting them right will bring greater adoption and should lead to the intended goal - more effective communication. To read more about video adoption, download this free report.



DevOps and the IT platform

Clive Longbottom | No Comments
| More

Historically, the development environment has been a bit of a double-edged sword for the development team. It has tended to be separate from the production environment, so developers can do whatever they want without any risk to operational systems. The network tends to be fairly self-enclosed as well, so they get super-fast speeds while they are working. However, those positives are also negatives: what worked so blazingly fast in the development environment then fails in the user experience stakes in the production environment, due to slower servers, storage and networks.

On top of that, the development infrastructure has its own issues. Provisioning development environments can take a long time, even where golden images are used for the base versions. Tearing down these environments after a development cycle is not as easy as it could be. Declarative systems, such as the open source Puppet, allow scripts to be written that set up environments in a more automated manner, but this still leaves a lot to be desired.
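
The declarative idea behind tools like Puppet is that you describe the desired end state of an environment, and the engine works out what needs to change to get there, so a second run with nothing to do changes nothing. A minimal sketch of that convergence model in Python (purely illustrative - not Puppet's real DSL or API, and the resource names are invented):

```python
# Illustrative sketch of declarative configuration management:
# describe the desired state; the engine computes and applies the difference.

def converge(current: dict, desired: dict) -> list:
    """Compare current state with desired state, apply and return the actions needed."""
    actions = []
    for resource, state in desired.items():
        if current.get(resource) != state:
            actions.append(f"set {resource} -> {state}")
            current[resource] = state
    for resource in list(current):
        if resource not in desired:
            actions.append(f"remove {resource}")
            del current[resource]
    return actions

# A development environment declared as an end state, not as setup steps
# (hypothetical resource names for illustration):
desired = {"package:postgresql": "installed", "service:postgresql": "running"}
current = {"package:postgresql": "installed", "service:old-db": "running"}

print(converge(current, desired))  # actions taken to reach the desired state
print(converge(current, desired))  # a second run is idempotent: nothing to do
```

The same declaration can therefore stand up an environment and, run against an empty state, tear one down again, which is exactly the repeatability that manual provisioning lacks.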

Physically configuring hardware and software environments, even with the help of automation like Puppet, still leaves the problem of getting hold of the right data. Making physical copies of a database at a single point in time, or taking subsets as a form of pseudo-data, does not address the central issue. In neither case is the data a true reflection of current real-world production data, so results from the development and test environments cannot be guaranteed to hold when code is pushed into production.

Trying to continue with inconsistent data between development, test and production environments will be slow and costly. The common workaround - taking full copies of production databases and refreshing them regularly - is a lengthy process that also affects the performance of the operational network itself. Organisations particularly struggle with continuous integration, where an application requires data from multiple production databases (e.g. Oracle, Sybase and SQL Server). As developers move towards big data, the problem gets worse: multiple different databases and data types (for example, Hadoop and NoSQL sources alongside existing SQL sources) may need to be used at the same time, and bringing these together as distinct copies across three different environments is just not viable.

Continuous development, integration and delivery require systems that are adaptable and are fast to set up and tear down. Existing approaches make agile development difficult, requiring cascade processes that take too much time and involve too many iterations to fulfil the business' needs for continuous delivery.

What is needed is an infrastructure that bridges the gap between the different environments. Server and storage virtualisation does this at the hardware level, and virtual machines and container mechanisms such as Docker allow for fast stand-up of development and test environments within the greater IT platform. However, there still remains the issue of the data.

Creating an effective data environment requires the capability to use fresh data, without adversely impacting overall storage needs or the time required to set up and tear down environments. The common approach of database snapshots, clones or subset copies involves too many compromises and costs; a new approach, using an abstraction of the database into a virtual environment, is needed.

The data virtualisation technology I've been looking at from start-up Delphix does just this. It can create a byte-for-byte full size virtual "copy" of a database in minutes, using near live data and requiring barely any extra storage.  The data created for development and testing can be refreshed or reset at any point and then deleted once the stage is over. Suddenly each developer or tester can have their own environment without any impact on infrastructure.
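
The reason such a virtual copy needs barely any extra storage is the copy-on-write principle: the copy shares the source's data blocks for reads, and privately stores only the blocks it changes. A conceptual sketch in Python (illustrative only - this is the general technique, not Delphix's actual implementation or API):

```python
# Copy-on-write sketch: a "virtual copy" presents a full-size view of the
# source data while consuming storage only in proportion to its own changes.

class VirtualCopy:
    def __init__(self, source_blocks: dict):
        self._source = source_blocks   # shared, read-only view of the source
        self._delta = {}               # only the blocks this copy has changed

    def read(self, block_id):
        # Reads fall through to the source unless this copy changed the block
        return self._delta.get(block_id, self._source[block_id])

    def write(self, block_id, data):
        # Writes are stored privately, leaving the source untouched
        self._delta[block_id] = data

    @property
    def extra_storage(self):
        return len(self._delta)  # cost is proportional to changes, not size

# A stand-in for a large production database:
production = {i: f"block-{i}" for i in range(1000)}
dev_copy = VirtualCopy(production)

assert dev_copy.read(42) == "block-42"  # full view of near-live data
dev_copy.write(42, "test-data")         # the developer mutates freely...
assert production[42] == "block-42"     # ...and production is unaffected
print(dev_copy.extra_storage)           # 1 changed block out of a thousand
```

Resetting or deleting the environment is then just discarding the delta, which is why each developer or tester can have their own copy without draining storage.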

By embracing DevOps and data virtualisation, everyone wins. Developers and testers get to spin up environments that exactly represent the real world; DBAs can spend more time on complex tasks that add distinct business value rather than creating routine copies; sysadmins don't have to struggle to support multiple IT dev/test environments; network admins can sleep easy knowing that huge data sets are no longer being copied back and forth; and storage admins don't have their capacity drained by pointless copies of the same data.

More to the point, the business gets what it wants - fast, continuous delivery of new functionality, enabling it to compete far more strongly in its market. All without having to invest large amounts of extra hardware and time in an approach whose end result could never be guaranteed.


Disclosure: Delphix is a Quocirca client

How managed print services accelerate business process digitisation

Louella Fernandes | No Comments
| More

Despite the ongoing transition from paper to digital workflows, many organisations are struggling to integrate paper and digital information. A recent Quocirca study amongst 210 organisations revealed that 40% plan to increase their spending on workflow automation, but also that there is much progress still to be made - and it is primarily those using a managed print service (MPS) that are most confident in their digitisation initiatives.

Organisations remain reliant on printing. Quocirca's research reveals that overall, 30% of organisations view printing as critical to their business processes - this rises to 73% for financial services, followed by 41% for the public sector.  Given the financial and environmental implications of a continued dependence on printing - not to mention the inherent inefficiencies with paper-based processes - 72% of organisations indicated that they are planning to increase their digitisation efforts, and many are using their MPS providers to support this transition.

Today, MPS has moved beyond the realms of hardware consolidation to encompass a much broader strategy to drive business efficiency around paper-dependent processes. With cost remaining a top driver for most MPS engagements, many MPS customers are seeing significant cost reductions not only through a rationalised printer fleet, but also through the implementation of solutions that reduce or eliminate wasteful printing and better integrate document workflows.

Certainly, MPS is proving an effective approach to the digitisation of business processes.  At a foundation level, this could be through harnessing the sophistication of today's advanced multifunction peripherals (MFPs) which enable documents to be scanned and routed directly to applications (think expense reporting or HR applications), minimising the need for multiple hard copies.

Beyond this, many leading MPS providers offer a range of business process services (BPS) as an extension of their MPS offerings. These services are increasingly sophisticated, analysing existing business process workflows for optimisation opportunities to reduce the paper burden - for instance in areas such as mortgage or loan origination, or accounts payable and receivable applications.

Despite the clear need to better integrate paper and digital workflows, Quocirca's study revealed that overall, only 29% of organisations believe they are effective or very effective at integrating paper and digital workflows. However, there is a stark difference between organisations using and not using MPS. While only 9% of organisations not using MPS rated their ability to integrate paper and digital workflows as effective or very effective, this rose to 51% for those using MPS. Quocirca expects this figure to climb over the next year as more organisations move further along their MPS journey and begin implementing document workflow tools and business process optimisation.


So MPS is certainly making an impact on digitisation efforts, and confidence is most prevalent in the financial services and professional services sectors.  Due to legal and security needs, these organisations have made the most headway in eliminating or minimising their paper dependencies and tend to be the most mature in their adoption of MPS.  At the other end of the scale is the public sector - despite a huge dependence on paper, they are still behind the curve on digitisation efforts - reflecting the disparate nature of these organisations and a lower use of centralised MPS.


Quocirca recommends that businesses looking to better integrate their paper and digital business processes should look closely at the broader services that many leading MPS providers are now offering. Vendors such as HP, Lexmark, Ricoh and Xerox are all developing a range of solutions - some focused on enterprise content management (ECM), others on business process automation. While some businesses may consider business process optimisation as something to be implemented later in the MPS journey, it is increasingly paper dependent processes that are being analysed at the outset as part of the initial assessment service. This is because automating such processes can have a real impact on improving productivity, efficiency and cost reduction.

Businesses looking to start or extend their MPS journey should look for an MPS provider that can have a truly transformative business impact. MPS is no longer just about devices; it has the potential to help organisations focus on their core business and innovate, rather than be hindered by slow, manual, paper-based processes. This demands a new kind of MPS provider that can not only tame the complexity of the print infrastructure, but also has the expertise, resources and tools to accelerate paper-to-digital initiatives.

The path to continuous delivery

Clive Longbottom | No Comments
| More

What is it that a company wants from its IT capability?  High availability?  Fast performance?  The latest technology?

Hardly.  Although these may be artefacts of the technical platform that is implemented, what the company actually wants is a platform that adequately supports its business objectives.  The purpose of the business is to be successful - this means that its processes need to be effective and efficient.  Technology is merely what makes this possible.

The problem has been that historically the process has been 'owned' by the application.  The business had to fit the process to the application: flexibility was not easy.  Under the good times (remember those?), poorly performing processes could be hidden - profit was still being made; however, more could have been made with optimised processes.  As the bad times hit and customer expectations changed to reflect what they saw on their consumer technology platforms, poorly running processes became more visible, and the business people started to realise that things had to change.

In came the agile business - change had to be embraced and flexibility became king.  Such agility was fed through into IT via Agile project management - applications became aggregations of chunks of function which could be developed and deployed in weeks rather than months. However, something was still not quite right.

Continuous delivery from the business angle needs small incremental changes to be delivered on a regular basis. Agile IT aims to do the same, but there are often problems in the development-to-operational stage. Sure, everything has been tested in the testing environment; sure, operations understand how to implement those changes in the run-time environment. Yet according to a survey by VersionOne, 85% of respondents stated that they had encountered failures in Agile projects, with a main reason being that the company culture was at odds with an Agile approach. No matter how agile the project methodology itself became, the impact of a wrong culture was far reaching: without changes in other areas of the business and in the tooling used, the Agile process hit too many obstacles and the whole system would short-circuit.

DevOps has been touted as the best way to try and remove these problems - yet in many organisations where Quocirca has seen DevOps being embraced, the problems remain, or have changed to being different, but equally difficult ones.

The problems lie in many areas: some vendors have been re-defining DevOps to fit their existing portfolios into the market, while some users have been looking to DevOps as a silver bullet to solve all their time-to-capability problems without having to change the thought processes at a technical or business level. Nevertheless, DevOps is becoming a core part of an organisation's IT: research by CA found that 70% of organisations have identified a strong need for such an approach. Business needs around customer/end user experience and dealing with mobility are seen as major requirements.

Even at the basic level of DevOps creating a slicker process of getting new code into the production environment, there is a need to review existing processes and put in place the right checks and balances so that downstream negative impacts are minimised.  At the higher end of DevOps, where leading-edge companies are seeing it as a means of accelerating innovation and transformation, a whole new mind-set that crosses over the chasm between the business and IT is required - the very chasm that was stated as the main reason for failure of Agile projects by VersionOne.

Traditional server virtualisation offers some help here in speeding up things like image deployment. However, it only solves one part of the issue: if development is still air-locked in its own environment, the new image still requires testing in the production environment before being made live. This not only takes up time; the system will still behave differently because it runs against different data. Proving the system in production is not the same as testing it in development: problems will still occur, and iterations will slow down any chance of continuous delivery.

The issue is in the provisioning of data to the test systems. Short sprint cycles, fast provisioning and tear-down of environments, and a successful Agile culture all require on-demand, near-live data for images to run against. This is the major bottleneck to successful Agile and DevOps activities. Only through the use of full, live data sets can real feedback be gained and the loop between development and operations be fully closed. DevOps then becomes a core part of continuous delivery: IT becomes a major business enablement function.

Today's solution - taking snapshots of production databases - is problematic: each copy takes up a large amount of space, which can mean subsets end up being used in the hope that the data chosen is representative of the larger whole. Provisioning takes a lot of time through an overly manual process, and typically the end result is that dev/test data is days, weeks or even months old.

DevOps requires a new type of virtualisation, going beyond the server and physical storage down into the data. Delphix, a US start-up, has created an interesting new technology that I believe could finally unlock the real potential of Agile and DevOps: data virtualisation. More on this in a later post - but worth looking further into Delphix as a company.

Disclosure: Delphix is a Quocirca client.

Video conferencing - why use it?

Rob Bamforth | No Comments
| More

What is it with video conferencing?

The technology has been around for decades; it's been seen as an inherent part of sci-fi on film and TV over a similar period; networks from fibre to 3G have been touted as being great for it; and yet it still doesn't appear to have made the transition from unusual to everyday.

Some of the fault lies with technology. Video conferencing was once cumbersome and difficult to use, which has engendered a persistent perception of users needing handholding. Differences between vendors and systems have led to stubborn interoperability issues, which even standards have struggled to completely eradicate. Plus, there are lingering inconsistencies between any single vendor's own systems as they make rapid product improvements in what is still a relatively dynamic sector.

There have been many technical advances in business video systems, but according to a recent worldwide survey, commissioned by Polycom, of over 800 existing business video conferencing users, over a quarter find video conferencing to be too complicated, and making it easier to use is the number one thing most believe would increase usage.

The consumer experience of video conferencing has evolved significantly too. While the marketing of video calling over 3G phones turned out to be a complete flop and even mighty Apple has not been able to switch everyone into a mobile video call with FaceTime, there is no doubt that video usage has become more popular elsewhere. The usage might not be regular 'calling' or 'conferencing' but through a combination of easy (and free) tools like Skype, cheap video cameras and YouTube uploads, more have become acclimatised to the use of video.

The quality of the experience might often be poorer than that of business video conferencing, but the user is comfortable with it, and this is critical to generating more widespread use of video for business. User comfort, or the lack of it, is a major reason that holds back the adoption of video conferencing. It has not yet become as natural a thing to do as making a phone call in the workplace.

Does there need to be more widespread business use of video? Yes, but the reasons are more complex than portrayed by early video conferencing solution marketing messages. Saving money by reducing the amount of business travel is certainly a prime driver for increasing the use of video. These are tangible savings which, although rarely actually measured by most organisations, are at least directly attributable.

While positive, travel savings are generally insufficient on their own to stimulate investment in video, and it is here that the less tangible, but potentially far more valuable, benefits become more important. Part of the benefit of travel reduction is in reality saving time: travel time, of course, but also setup time, 'waiting for someone to respond' time and time spent afterwards trying to sort out what it was all about.

This can be far more critical than simply saving a business traveller from a tedious journey.

The NHS in Lancashire and Cumbria has implemented tele-health services using high definition video to normalise behaviour, meaning patients feel comfortable and can easily connect at the touch of a button. This approach has worked particularly well for renal patients, reducing the need for hospital visits and allowing a large network of doctors to collaborate without scheduling or travel restrictions.

Removing wasted time not only makes individuals more efficient, it also speeds up the overall decision-making process and therefore customer responsiveness. These benefits are all harder to define and measure for a straightforward ROI calculation, but most people have known they are there since the first time they picked up a phone to avoid making a journey.

Phone calls are only of real value if the caller knows they can reach someone wherever they need or want to, and knows the recipient will have a means of answering; in other words, ubiquitous communications. Adding video to re-introduce the non-verbal aspects of remote communication seems a natural progression, but only if it touches everybody, equally.

Many organisations already have some video conferencing systems, but with different levels of adoption. In some there are pockets of frequent or proficient users; it might be the main board, a team of engineers or a distributed marketing group. In others there are handfuls of systems that sit idle: meeting rooms used for other purposes, executive desktops that no one else is allowed to touch, or systems no one quite remembers how to work.

To encourage individuals to feel more comfortable with video in a business setting requires a shift in the attitude and culture of the organisation. Video needs to become a normal, everyday activity, used by everyone, wherever they are (any room, any device) whenever it is required. It needs to be instilled in an organisation from top to bottom and in an individual's working practices from day one.



It might feel like a bigger leap, but just as with many other forms of communication - public speaking, using the phone, writing letters, document sharing - not only do the right sort of facilities need to be in place, but people need to feel comfortable using them in order to get the most out of them.

It takes practice, but with regular use anyone can become an effective communicator in any medium, including video, and better communication builds better collaboration and ultimately a more efficient and effective business. For a more detailed look at cultures of video adoption, click here for a free report based on a worldwide survey of over 800 video conferencing users.

Many attacks may still be random; security should not be

Bob Tarzey

With all the talk of targeted attacks, it is easy to lose sight of the fact that for the majority of us, especially in our lives as consumers, random malware is still the greatest danger. Random malware is distributed en masse, by whatever means, in the hope it will find its way onto the most vulnerable of devices. A targeted attack, on the other hand, means it is you and/or your organisation that an attacker specifically wants to penetrate, however that might be achieved.

The best protection against random attacks is still regular patching and host-based anti-malware packages. That was the message from Kaspersky Labs at a recent press round table. Of course, as a vendor of such products, Kaspersky was keen to remind all present that it was not time to ditch more traditional security capabilities just because you have now invested in state-of-the-art protection against targeted attacks. Quocirca agrees, having issued similar advice in a free 2013 research report, 'The trouble heading for your business'.

If anything, the issue of random attacks is set to get worse. More devices, with more diverse systems software, often attached to public network access points, increase the attack surface, especially as mobile devices are used more and more for online banking and payments. This will mean random attacks are not quite as random as before: malware variants will be needed for different operating systems, browsers and apps (whereas in the old days it was Windows, Windows, Windows).

However, it should still be worth the cyber-criminals' effort, as at present many mobile devices do not have anti-malware installed. Kaspersky says the focus has been on Android, but iOS users are becoming more and more of a target. Overall, Kaspersky saw 295,539 new mobile malware samples in the first half of 2014.

There is also the potential for collateral damage. Although a mobile device user's personal, banking and/or payment card details may be the primary target, where data protection controls are not in place business data may make its way on to personal devices too. This too may be compromised, with the potential to land data controllers in regulatory deep water if PII (personally identifiable information) is involved.

Security distributor Wickhill was also at the round table and pointed out that one of the problems resellers find is that too many organisations are still rolling out applications without giving up-front consideration to appropriate security. This is especially true of SMBs, which see security as a cost rather than a benefit. Wickhill also finds that security is being overlooked in mobile deployments.

There was general agreement that security needs to focus on data itself rather than the rapidly dissolving network edge. This requires a holistic approach that applies to data wherever it is transmitted or stored. Measures are needed to control what access internal and external users have to data and what they can do with it, which was the subject of two free 2014 Quocirca reports, 'What keeps your CEO up at night?' and 'Neither here nor there?'
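The idea of data-centric controls can be illustrated with a toy sketch: permissions attach to the data's classification, not to a network location. All of the names, roles and classifications below are invented for the example, not taken from any product or from the reports cited above.

```python
# Toy sketch of data-centric access control: permissions follow the data
# classification, not the network edge. All names here are illustrative.

POLICY = {
    # classification: {role: allowed actions}
    "public":       {"internal": {"read", "share"}, "external": {"read"}},
    "confidential": {"internal": {"read"},          "external": set()},
}

def can(role, action, classification):
    """Return True if the role may perform the action on data of this classification."""
    return action in POLICY.get(classification, {}).get(role, set())

print(can("internal", "read", "confidential"))   # True
print(can("external", "read", "confidential"))   # False
print(can("internal", "share", "confidential"))  # False
```

The point of the sketch is that the same check applies wherever the data happens to be stored or transmitted; the decision depends only on who is asking, what they want to do, and how the data is classified.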

Technology helps drive all this, but as Wickhill pointed out, education is also needed, both of users and the IT teams which deploy and manage the devices and applications they use. For the more lackadaisical SMBs, help is at hand. Many resellers, that are already trusted advisors to their customers, are adding managed security services to their portfolio.

Quocirca expects this will increase the uptake of cloud services amongst SMBs. This is now seen as the best way for many to acquire both infrastructure and security, as another free Oct 2014 Quocirca research report, 'Online domain maturity', shows. Kaspersky found that many early adopters of cloud services found security lacking; however, the Quocirca report shows that more recent adopters now see security as one of the main benefits of online services.

Random attacks may still be a problem to worry about, but there is no excuse for random security. The products and services are out there to make organisations, if not 100% safe, at least safer than many others. If you are targeted, you will have a better chance of withstanding the onslaught, and random attacks should pass you by to trouble a weaker organisation.

Securing virtual infrastructure

Bob Tarzey

When considering the security of virtual environments, it helps to be clear about which part of the virtual stack is under discussion. There are two basic levels: the virtual platform itself, and the virtual machines (VMs) and associated applications deployed on such platforms. This is the first of two Quocirca blog posts aiming to provide some high-level clarity regarding security in a virtual world, starting with the platform itself.


Virtual platforms can be privately owned or procured from cloud service providers. Organisations that rely 100% on the use of public platforms, or that outsource 100% of the management of their virtual and/or private cloud infrastructure, need read little further through this first post. They have outsourced the responsibility for platform security to their provider and should refer to their service level agreement (SLA).


As Amazon Web Services (AWS) puts it: "AWS takes responsibility for securing its facilities, server infrastructure, network infrastructure and virtualisation infrastructure, whilst customers choose their operating environment, how it should be configured and set up its own security groups and access control lists".


The AWS statement points out the areas that those deploying their own virtual platforms and private clouds need to address to ensure base security. The risk falls into three areas:

  • Security of the virtualisation infrastructure (the hypervisor)
  • Security of the resources that the hypervisor allocates to VMs
  • Virtualisation management tools and the access rights they provide to the virtual infrastructure


The third point includes the use of cloud orchestration tools such as OpenStack and VMware's vCloud Director, which can be used for managing private clouds or moving VMs between compatible private and public clouds (hybrid cloud).


Hypervisor security

All hypervisors can, and do, contain errors in their software that lead to vulnerabilities which can be exploited by hackers. So, as with any software, there needs to be a rigorous patching regime for a given organisation's chosen hypervisor and the management tools that support it. That said, hypervisor vulnerabilities are of little use unless they open access either to the hypervisor's management environment or to resources it has access to. Most press reports reflect this; for example, picking on the most widely used hypervisor, VMware's ESX:
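A patching regime ultimately comes down to knowing which hosts are behind the build level a vendor advisory says is safe. The sketch below shows one minimal way to flag such hosts; the host names, version strings and build numbers are hypothetical placeholders, not taken from any real VMware advisory.

```python
# Minimal sketch of a patch-currency check across a fleet of hypervisor hosts.
# Version strings and build numbers below are illustrative placeholders only.

# Minimum build considered "patched" per a (hypothetical) vendor advisory
MINIMUM_PATCHED_BUILD = {
    "ESXi 5.5": 2068190,
    "ESXi 5.1": 2000251,
}

def unpatched_hosts(inventory):
    """Return the hosts whose hypervisor build is below the patched minimum."""
    flagged = []
    for host, (version, build) in inventory.items():
        minimum = MINIMUM_PATCHED_BUILD.get(version)
        if minimum is not None and build < minimum:
            flagged.append(host)
    return flagged

inventory = {
    "esx-host-01": ("ESXi 5.5", 1331820),  # behind the advisory build
    "esx-host-02": ("ESXi 5.5", 2068190),  # at the patched build
    "esx-host-03": ("ESXi 5.1", 2000251),
}
print(unpatched_hosts(inventory))  # -> ['esx-host-01']
```

In practice the inventory would be pulled from the management tools rather than typed in, but the principle is the same: an advisory is only useful if it can be mechanically compared against what is actually running.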


ThreatPost, Dec 2013: "VMware has patched a vulnerability in its ESX and ESXi hypervisors that could allow unauthorised local access to files". The article goes on to report that the vulnerability has the effect of extending privilege, something hackers are always seeking.


Network World, Oct 2013, reporting on an ESX vulnerability: "To exploit the vulnerability an attacker would have to intercept and modify management traffic. If successful, the hacker would compromise the hosted-VMDBs, which would lead to a denial of service for parts of the program".


In both cases, VMware went on to issue a patch, ensuring that fast-acting customers were protected before hackers had much time to act.


Security of resources allocated by hypervisors

Both of the above examples underline the need to address the basic security of underlying resources: networking, storage, access controls and so on. For those that do everything in house, that includes physical access to the data centre. The considerations are pretty much the same as for non-virtual deployments, with one big caveat. In the virtual world many of these resources are themselves software files that are easy to create, change and move, so compromise of a file server may provide access to more than just confidential data; it may allow the virtual environment itself to be manipulated.


Securing use of virtual management tools

As with all IT management there are two dangers here: the outsider finding their way in with privilege, and the privileged insider who behaves carelessly or maliciously. A virtual administrator, however their privileges are obtained, can change the virtual environment as they see fit without needing physical access. That may include changing the configuration and/or security settings of virtual components and/or deploying unauthorised VMs for nefarious use.


When it comes to access control, the management of privilege (who has it, when they have it, and auditing what they do with it) is similar to that for physical environments. However, there are other considerations that apply in a virtual world over and above those in a physical one. Principally this is about being able to monitor hypervisor-level events: controlling and auditing access to key files and the copying and movement of VMs, capturing hypervisor event streams and feeding all this to security information and event management (SIEM) tools. There is also the need to define hypervisor-level security and take action when it is breached, for example by closing VMs or blocking traffic to and from them.
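The event-monitoring side of this can be sketched very simply: watch the hypervisor event stream, forward the event types that matter (VM copies and moves, configuration changes) and tag anything unauthorised for the SIEM tool to act on. The event type names and record fields below are invented for illustration; they do not correspond to any real hypervisor's API.

```python
# Illustrative sketch of filtering a hypervisor event stream for a SIEM feed.
# Event type names and fields are invented for the example.

SUSPICIOUS_EVENTS = {"vm.file.copied", "vm.moved", "hypervisor.config.changed"}

def events_for_siem(event_stream):
    """Filter a hypervisor event stream down to the records worth forwarding
    to a SIEM tool, tagging each with a simple severity."""
    forward = []
    for event in event_stream:
        if event["type"] in SUSPICIOUS_EVENTS:
            # Unauthorised copies/moves/config changes get the high severity tag
            severity = "high" if not event.get("authorised") else "info"
            forward.append({**event, "severity": severity})
    return forward

stream = [
    {"type": "vm.powered_on", "vm": "web-01", "authorised": True},
    {"type": "vm.file.copied", "vm": "db-01", "authorised": False},
    {"type": "hypervisor.config.changed", "host": "esx-02", "authorised": True},
]
for record in events_for_siem(stream):
    print(record["type"], record["severity"])
```

A real deployment would of course use the platform's own event API and the SIEM tool's ingestion format, but the shape of the problem is this filter-and-tag loop.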


Specialist vendors

Certain specialist vendors focus purely on the security of the virtual infrastructure layer. For example, Catbird specialises in reporting on and controlling the security of VMware-related deployments, while GroundWork focuses on monitoring data flows in open source-based virtual environments. The suppliers of virtual platforms and tools provide support too, not least access to urgent patching advice.


When many mainstream IT security vendors talk about virtual security, they refer to the security of deployed VMs and associated applications. Security at this level is of course important and has its own special considerations, which will be covered in the second blog post. For those that have outsourced the virtual platform and/or its management, and are confident in their supplier, the focus will already be at this higher level.
