Fewer than 20% of IT departments in North America and Europe have incorporated data virtualisation into their integration toolkits. Even fewer are realising its true potential, according to Forrester analyst Brian Hopkins.
Forrester defines data virtualisation as a technology that abstracts, transforms, federates and delivers data taken from a variety of heterogeneous information sources. This allows consuming applications or users to access data from these various sources via a request to a single access point.
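The pattern Forrester describes can be illustrated with a minimal sketch: a single access point that federates a query across heterogeneous sources, joining rows on the fly rather than physically copying the data into one store. The sources, field names and functions below are hypothetical illustrations, not taken from any particular product.

```python
import csv
import io

# Hypothetical source 1: an in-memory "database" of customer records.
DB_ROWS = [
    {"id": "1", "name": "Acme", "region": "EU"},
    {"id": "2", "name": "Globex", "region": "NA"},
]

# Hypothetical source 2: a CSV feed of order records.
CSV_FEED = "id,customer_id,total\n10,1,250\n11,2,990\n"


def query_db(region):
    """Fetch customers from the simulated database, filtered at the source."""
    return [r for r in DB_ROWS if r["region"] == region]


def query_csv(customer_ids):
    """Fetch orders from the CSV feed for the given customer ids."""
    rows = csv.DictReader(io.StringIO(CSV_FEED))
    return [r for r in rows if r["customer_id"] in customer_ids]


def virtual_view(region):
    """Single access point: federate both sources into one result set.

    The data stays where it lives; only the rows needed to answer
    this request are pulled from each source and joined here.
    """
    customers = query_db(region)
    orders = query_csv({c["id"] for c in customers})
    by_customer = {o["customer_id"]: o for o in orders}
    return [
        {"name": c["name"], "total": by_customer[c["id"]]["total"]}
        for c in customers
        if c["id"] in by_customer
    ]
```

A consumer would simply call `virtual_view("EU")` and receive a joined result, without knowing that the answer was assembled from two different sources.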
In a Forrester report, "Data virtualization reaches critical mass", Hopkins notes two factors limiting the adoption of data virtualisation. First, he says many early product releases have not lived up to expectations. Second, some of the bigger suppliers are underplaying the benefits, while IT is more interested in individual projects than in enterprise-wide architectures.
In the report he writes: "Over the next 18 to 36 months, we expect this market attitude to change as technology advancement, more third-party integration, and new usage patterns lead to increasing awareness of data virtualisation's potential. Already, many early adopters are having significant success with recent releases of the market-leading products."
As an example, he says that an interviewee from Qualcomm stated: "When we realised that we didn't have to physically move data around for integration, the technology started to really make sense. Now we have gone from point solutions to an enterprise deployment of data virtualisation."