While moving databases into a private cloud environment is a relatively straightforward proposition, the same isn’t necessarily true of migration to the public cloud.
But the key challenges have less to do with the technology itself – although without doubt there are some technical hurdles to using cloud databases – and more to do with contractual and process matters. For example, at the contract level, organisations need to consider both support issues and exit strategies.
Few companies today are opting to outsource large chunks of their IT infrastructure to any single cloud computing vendor. Most are instead going for a mix-and-match approach based on their individual requirements. This situation can generate database and application interoperability problems both internally and between different cloud providers.
As a result, it’s necessary to evaluate from the outset how easy it will be to integrate such systems and how such work should be done. A successful strategy will also entail defining data standards and devising suitable means of managing master data in order to prevent vendor lock-in.
But it is likewise crucial to understand which vendors will be responsible for, and will support, which elements of the overall system, and how those responsibilities fit together. A final consideration is to ensure that the service-level agreements of all the parties involved are aligned.
Exit strategies for cloud databases
Another thing to be mindful of is making upfront provisions to exit a cloud computing deal as inexpensively and efficiently as possible, should the need arise. For instance, Alastair McAulay, managing consultant at PA Consulting, said one of his clients found that it needed a significant amount of expensive technical support to migrate when the Software as a Service offering it had adopted for its UK operations didn’t scale well enough to roll out across Europe.
“You’ve got to get the data and reconstruct the database schemas in the old system in order to create the new one, so you need assistance,” he said. “The application was fairly vanilla, but over two years, it had been extended enormously – so they had some shared functionality as well as their own and they needed technical help to get it off.”
Although the customer already had an exit plan in place, it ended up “paying more than it would have liked,” McAulay added.

Contractual issues aren’t the only things that need to be thought about before moving your data to cloud databases. A key issue for many organisations is that their information security processes often aren’t as watertight as they believe, largely because those processes evolve over time as workarounds are introduced.
While the cracks can be papered over internally or even dealt with adequately by traditional outsourcing suppliers that tend to have close relationships with their customers, this might not be the case with the more arms-length arrangements of many cloud services providers.
“Processes won’t necessarily be good enough to slot into a generic cloud offering, as it’s one-size-fits-all and what is missing is a sense of personal relationship management. So if you’re phoning up about something, they’ll just say, ‘You’re not authorised.’ They won’t look at ways around a situation,” McAulay said.
But increased rigour over access rights will nonetheless be required to prevent unauthorised online access to cloud databases. This means ensuring that default accounts in databases have been removed or locked down, and that security processes are smart enough to change privileges as staff take on new roles or leave the company.
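That kind of housekeeping can be scripted as part of a pre-migration review. The sketch below is a minimal illustration in Python; the account names, the list of vendor defaults and the record layout are invented for the example, not taken from any particular database’s catalogue:

```python
# Illustrative pre-migration account audit. The names below are assumptions
# for the example, not a real vendor's default-account list.
DEFAULT_ACCOUNTS = {"admin", "guest", "sa", "scott"}  # defaults to remove or lock

def audit_accounts(accounts):
    """Return (name, reason) pairs for accounts that need attention.

    Each account is a dict with 'name', 'active' (person still in role)
    and 'privileges' keys.
    """
    findings = []
    for acct in accounts:
        if acct["name"] in DEFAULT_ACCOUNTS:
            findings.append((acct["name"], "default account - remove or lock"))
        elif not acct["active"]:
            findings.append((acct["name"], "leaver/mover - revoke privileges"))
    return findings

accounts = [
    {"name": "sa", "active": True, "privileges": ["ALL"]},
    {"name": "jsmith", "active": False, "privileges": ["SELECT"]},
    {"name": "reporting", "active": True, "privileges": ["SELECT"]},
]
for name, reason in audit_accounts(accounts):
    print(f"{name}: {reason}")
```

In a real deployment the account list would be pulled from the database’s own system views and the check run on a schedule, but the principle is the same: defaults and stale accounts are flagged before the database is exposed online.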
Cloud databases and security sensitivities
Yet another important consideration is classifying data assets correctly by assigning them a value in sensitivity terms and knowing where they’re located. If such classifications aren’t applied, warned John Walker, a member of the London chapter of the Information Systems Audit and Control Association’s security advisory group, the danger is that “you could end up migrating highly sensitive, confidential data out into the cloud.”
Related to this is the need to have effective information lifecycle management processes in place to ensure that data is created, stored and destroyed in line with internal policies and external regulations, such as the Data Protection Act, to minimise the costs of processing large amounts of unnecessary data.
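Retention rules of that kind can also be checked programmatically. The sketch below is a minimal illustration; the record categories and retention periods are invented for the example and are not legal guidance on the Data Protection Act:

```python
from datetime import date, timedelta

# Illustrative retention policy; the categories and periods are assumptions.
RETENTION = {
    "marketing_leads": timedelta(days=365),
    "invoices": timedelta(days=6 * 365),
}

def expired(record, today):
    """A record is a dict with 'category' and 'created' (a date) keys."""
    return today - record["created"] > RETENTION[record["category"]]

records = [
    {"id": 1, "category": "marketing_leads", "created": date(2008, 1, 1)},
    {"id": 2, "category": "invoices", "created": date(2010, 6, 1)},
]
to_destroy = [r["id"] for r in records if expired(r, date(2010, 7, 1))]
```

Running a sweep like this before migration keeps unnecessary data out of the cloud database in the first place, which directly reduces the processing costs the article mentions.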
Other issues, meanwhile, relate to disaster recovery. While it’s important to keep a true copy of the data held in a cloud database, it’s just as crucial to know where the copy is being kept – and to ensure that it has been suitably cleansed.
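Verifying that a copy held elsewhere really is a true copy can be as simple as comparing cryptographic digests of the two files. A minimal Python sketch, assuming both the local dump and the recovered copy are accessible as files:

```python
import hashlib

def file_digest(path):
    """SHA-256 of a backup file, read in chunks so large dumps fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def copies_match(local_path, recovered_path):
    """True if the recovered copy is byte-for-byte identical to the original."""
    return file_digest(local_path) == file_digest(recovered_path)
```

A matching digest proves integrity but says nothing about location or cleansing, so the checks the article calls for, knowing where the copy sits and what it contains, still have to be made separately.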
A final thing to bear in mind, particularly in an Infrastructure as a Service (cloud infrastructure) scenario, is performance. “If you’ve not got the database on your network, it’s going to be slower as your application has to go and talk to it over the Internet,” explained a development and database manager at a UK charity. “You also have to think about firewall calibrations and the like.”
While adding support for Web services or using HTTP traffic are two ways of getting around the performance issue, such activity is likely to require work for which it may be tricky to obtain buy-in.
“It’s not a simple switch, and you can’t just say, ‘the database is over there at the moment’ because of security concerns and fears that sensitive data will be hacked and revealed. So a lot of organisations are having to do quite a lot of work to weigh up the convenience of cloud versus the risk,” the development and database manager concluded.
This article is part of our UK network of sites' package on cloud computing trends.