At the Rehabilitation Research Institute of Singapore, Colin Quek is laying the foundation for a new data platform that will serve as the backbone for future research projects.
“This is the proverbial ‘not if, but when’ scenario which no one hopes to encounter,” says Quek, its deputy director for informatics, referring to cyber threats and data breaches. “The next challenge lies with local regulatory restrictions on data.”
While security is also of utmost concern at Australia’s MinterEllison, a professional services firm, it is the exponential growth of data that has been posing challenges.
“Managing on-premise data growth comes with a heavy upfront investment,” says Nathan MacGregor, head of infrastructure at MinterEllison. “To manage data growth, we use tiering policies to archive old data to low-cost cloud storage.”
MinterEllison also needs to ensure it has the best performance for business-critical applications that it is running on tier 1, all-flash arrays. For file storage data, it uses tier 2 storage comprising a mixture of SAS (serial-attached SCSI) and solid-state drives.
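Tiering policies like the ones MacGregor describes typically move data down the tiers as it goes cold. A minimal sketch in Python of an age-based placement rule — the tier names and thresholds here are illustrative assumptions, not MinterEllison’s actual policy:

```python
from datetime import datetime, timedelta
from typing import Optional

# Illustrative thresholds -- real policies are tuned per workload
TIERS = [
    (timedelta(days=30), "tier1-all-flash"),   # hot: recently accessed data
    (timedelta(days=365), "tier2-sas-ssd"),    # warm: file storage data
]
ARCHIVE_TIER = "cloud-archive"                 # cold: low-cost cloud storage

def pick_tier(last_accessed: datetime, now: Optional[datetime] = None) -> str:
    """Return the storage tier for an item based on how long since it was accessed."""
    now = now or datetime.now()
    age = now - last_accessed
    for max_age, tier in TIERS:
        if age <= max_age:
            return tier
    return ARCHIVE_TIER
```

Under these assumed thresholds, a file untouched for over a year would be archived to cloud storage, while recently used data stays on the all-flash tier.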
Using multiple storage tiers will undoubtedly result in storage complexity – one of the top storage challenges faced by enterprises today.
“Storage has so many ‘built for purpose’ protocols and approaches that result in customers having to manage multiple platforms from multiple vendors,” says Paul Haverfield, chief technologist and presales manager for hybrid IT at Hewlett Packard Enterprise (HPE) Asia-Pacific.
“This, in turn, creates cost inefficiency as well as infrastructure and interoperability complexity. Quite simply put, storage technology is becoming too complex for humans alone to support and manage from end to end.”
Inability to access and analyse data
Another major storage challenge facing enterprises is the inability to access and analyse the data they have stored.
Haverfield says many companies still retain and hoard their data because it is easier to do so than to delete or cull it. But that retained data does not provide value because it is not connected to analytics tools.
“Data is transformative only when it can be refined and accessed at the right place and at the right time, driving actionable insights into new revenue streams,” he says. “Storage needs to be intelligent so that you can unlock data’s potential.”
The growing cost of storage is another challenge, given the amount of data being generated each day. Companies like MinterEllison are tackling the problem by migrating some workloads to the cloud.
“The benefits of using cloud mean we have more choices and flexibility regarding where the data can be stored,” says MacGregor. “Therefore, inactive data can be tiered to the appropriate level of cloud storage, helping to reduce the costs of holding old data.”
Although MinterEllison has risk and governance policies that must be adhered to, sometimes limiting its use of cloud, it uses cloud storage whenever it makes business sense.
“The primary business requirement is that the data must remain within Australia,” says MacGregor. “Even with this limitation, we can still utilise major cloud players like Amazon Web Services and Microsoft Azure, given that they now have instances in Australia.”
Extreme elasticity and on-demand storage
MinterEllison has moved entire workloads to the cloud based on specific requirements, such as those that need extreme elasticity and on-demand storage that can vary from day to day.
For example, in e-discovery, where electronic information is sought for legal proceedings, cloud storage elasticity has enabled the firm to scale up and down as required, based on demand.
“Depending on the requirements of an e-discovery project, the workload may require storage of 100GB one day and 10TB the next,” says MacGregor. “Cloud storage provides a platform with which to provide this agile data expansion and contraction.”
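MacGregor’s 100GB-to-10TB example can be put into rough numbers. The sketch below compares per-day elastic billing against provisioning on-premise capacity for the peak day — the per-GB rate is an assumption for illustration, not actual cloud pricing or MinterEllison’s figures:

```python
# Illustrative arithmetic only -- the rate below is an assumption,
# not a real cloud provider's price list.
PRICE_PER_GB_MONTH = 0.025   # assumed flat $/GB-month rate
DAYS_PER_MONTH = 30

def elastic_cost(daily_gb):
    """Cloud-style billing: pay only for the storage actually held each day."""
    daily_rate = PRICE_PER_GB_MONTH / DAYS_PER_MONTH
    return sum(gb * daily_rate for gb in daily_gb)

def provisioned_cost(daily_gb):
    """Fixed provisioning: capacity must cover the peak day for the whole project."""
    daily_rate = PRICE_PER_GB_MONTH / DAYS_PER_MONTH
    return max(daily_gb) * daily_rate * len(daily_gb)

# A 30-day e-discovery project holding 100GB most days, spiking to 10TB twice:
usage = [100.0] * 28 + [10_000.0] * 2
```

With these assumed figures, `elastic_cost(usage)` comes to about $19 for the month, against $250 when capacity is sized for the 10TB peak throughout — which is the “agile data expansion and contraction” the firm is exploiting.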
In recent years, the firm has also bought on-premise storage that can archive inactive data to the cloud. It also ensures that any new on-premise storage purchased has a cloud equivalent, allowing for data migration, redundancy and seamless replication.
But Ted Aravinthan, Dell’s general manager and sales director for cloud and converged systems in Asia-Pacific and Japan, says enterprises will continue to balance storage between on-premise and cloud.
“There are those that are migrating data from the cloud back onto on-premise platforms because of total cost of ownership, and to reduce their reliance on cloud service providers,” says Aravinthan.
“On the other hand, we see many customers moving some non-critical data into the cloud where some are on a trial basis. It’s an ongoing balancing act to arrive at what works best for their organisations.”
In any case, the challenge of operating in a hybrid cloud environment needs to be addressed from the outset, through an integrated platform that makes it easier to manage and migrate workloads seamlessly, says Aravinthan.
“Often, organisations are struggling to manage their applications portfolio, comprising existing and new built-in-the-cloud applications,” he adds. “With VMware Cloud Foundation on Dell EMC VxRail, organisations can implement a multi-cloud environment and manage and move workloads seamlessly across their operational systems.”
Enterprises will also need better tools to manage storage pools strewn across a hybrid cloud environment. One such tool, HPE InfoSight, uses artificial intelligence and telemetry data from systems beyond storage to provide actionable recommendations.
HPE’s Haverfield says: “This is important, as we have shown that more than half the IT problems ‘believed to be’ in storage are actually root-caused outside the storage. Focusing on storage alone is myopic and does not resolve the customer’s challenge efficiently.”
HCI may help
In some cases, deploying hyper-converged infrastructure (HCI) may help to alleviate the complexities of storage and infrastructure management by providing a platform that is easier to manage, scales readily and comes with out-of-the-box capabilities.
While HCI is “solving a lot of operational challenges”, Aravinthan says care is needed to ensure it fits the right workloads, or enterprises could risk not achieving the desired business outcomes.
“Choices of virtualisation, independent scaling of storage or compute requirements, criticality of workloads, and input/output operations per second requirements all play key roles in identifying if HCI would be the best option,” he says. “It is more than likely that HCI would be the option. However, there are situations where it might not be.”
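The screening Aravinthan describes can be thought of as a checklist over a few workload attributes. A hypothetical sketch based on the factors he names — the thresholds, field names and supported-hypervisor list are illustrative assumptions, not a vendor sizing tool:

```python
# Hypothetical HCI-suitability screen; every threshold here is an
# illustrative assumption, not vendor guidance.
def hci_concerns(workload: dict) -> list:
    """Return the reasons a workload may be a poor fit for HCI (empty if none)."""
    concerns = []
    if workload.get("needs_independent_scaling"):
        concerns.append("storage and compute must scale independently")
    if workload.get("iops_required", 0) > 500_000:
        concerns.append("extreme IOPS may exceed typical HCI node limits")
    if workload.get("hypervisor") not in {"vmware", "hyper-v", "kvm"}:
        concerns.append("virtualisation platform not commonly supported")
    if workload.get("criticality") == "mission-critical":
        concerns.append("mission-critical SLAs warrant a dedicated-array review")
    return concerns

# Example: a tier 2 VDI workload with modest IOPS raises no concerns.
vdi = {"hypervisor": "vmware", "iops_required": 50_000,
       "criticality": "tier2", "needs_independent_scaling": False}
```

Here `hci_concerns(vdi)` returns an empty list, matching the “more than likely” verdict, while a mission-critical workload needing independent storage scaling would accumulate several flags.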
According to Haverfield, the consolidation of tier 2 applications such as virtual desktop infrastructure, HR and finance, as well as databases, is driving most HCI deployments.
For enterprises – such as those in healthcare – that face difficulties in running unstructured and unpredictable workloads on HCI, HPE provides two architectural approaches: aggregated and hyper-converged, as well as disaggregated and converged management.
Haverfield says these two approaches address both predictable and unpredictable workloads, while still delivering the underlying benefits of hyper-convergence.