Inside the Singapore government’s cloud journey
The Smart Nation Group’s chief digital technology officer outlines the government’s cloud journey, including its approach to cloud migration and how it came to host mission-critical workloads on AWS
The Singapore government has been a big proponent of cloud computing, starting with its Government Commercial Cloud (GCC) service that makes it easier for public sector agencies to manage and secure their use of public cloud services from the likes of Amazon Web Services (AWS), Google Cloud and Microsoft Azure.
By the end of the year, it expects to have at least 70% of eligible government systems on commercial cloud services, including mission-critical systems involving sensitive or confidential data, which will run on AWS’s new Dedicated Local Zones service.
Speaking to Computer Weekly on the sidelines of the recent AWS re:Invent 2023, Chan Cheow Hoe, government chief digital technology officer at the Smart Nation Group, said the government’s cloud journey began about seven years ago, when it started hosting unclassified workloads such as school websites on public cloud services through the GCC.
At the time, Chan said, the work on GCC enabled the government to clarify the policies and regulations that would determine whether it could host its workloads on public cloud services.
“There were three things that were non-negotiable – one was for critical and private data to be ‘geofenced’ to Singapore, a requirement that came not from me or security, but from a legal point of view so that those workloads are subject to the laws of the country, like the Official Secrets Act and the Computer Misuse Act,” he said.
Other requirements included the need for secure connectivity between the government’s on-premise backend systems and front-end systems on the public cloud, along with baseline security measures, such as using infrastructure-as-code to automate deployments and prevent cloud misconfigurations that could be exploited by threat actors.
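To illustrate the infrastructure-as-code baseline Chan describes, the sketch below uses AWS CDK in TypeScript. This is an assumed toolchain, not one named in the article, and the stack and bucket names are hypothetical; the point is that secure settings are declared in code and applied identically on every deployment, rather than configured by hand.

```typescript
// Minimal sketch of infrastructure-as-code enforcing secure defaults.
// Assumes AWS CDK v2 (aws-cdk-lib); names are illustrative only.
import * as cdk from 'aws-cdk-lib';
import * as s3 from 'aws-cdk-lib/aws-s3';

class SecureBaselineStack extends cdk.Stack {
  constructor(scope: cdk.App, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // Declaring the configuration in code means every deployment gets
    // the same settings, avoiding the manual misconfigurations (such as
    // accidentally public buckets) that threat actors exploit.
    new s3.Bucket(this, 'AgencyDataBucket', {
      blockPublicAccess: s3.BlockPublicAccess.BLOCK_ALL, // no public access
      encryption: s3.BucketEncryption.S3_MANAGED,        // encrypt at rest
      enforceSSL: true,                                  // reject non-TLS requests
      versioned: true,                                   // recover from overwrites
    });
  }
}

const app = new cdk.App();
new SecureBaselineStack(app, 'SecureBaselineStack');
```

Because the template itself is the deployment, it can also be reviewed and scanned for policy violations before anything reaches production.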
But for mission-critical workloads with more sensitive and confidential data to be moved to public cloud, Chan said there had to be more transparency on how public cloud infrastructure is run, with even higher levels of security. “We don’t know what’s happening inside there,” he said. “We don’t even know how they run it. Everything is kind of like a black box, so transparency is important.”
Ensuring resiliency
Chan said having transparency would enable the government to ensure the resiliency of mission-critical systems across availability zones – as well as their survivability. “What happens if there’s a shutdown and the zones are disconnected from the internet? How long will they survive and what are the consequences?”
But such criteria for hosting mission-critical systems would drive up costs significantly and deprive the government of the benefits of the rich ecosystem of third-party services available through the public cloud.
Wanting to have its cake and eat it too, the government approached the major hyperscalers for a service that would provide the transparency and higher levels of security it needed to host mission-critical systems without compromising the benefits of cloud.
Chan said that while there were sovereign cloud offerings in the market, “everything would be locked up” so there was no offering that met the government’s requirements. “We did workshops and deep-dived into things like security caging and dedicated personnel with the hyperscalers – AWS was most ready for the job and that was how Dedicated Local Zones was conceived,” he said.
With Dedicated Local Zones, Chan said the government now offers public sector agencies three cloud deployment options – commercial cloud services for quick deployment of workloads; GCC for those concerned about localisation; and Dedicated Local Zones for highly sensitive and mission-critical workloads.
Read more about cloud in APAC
- Malaysia-based startup Aerodyne is tapping AWS to power its drone platform that enables drone operators to onboard, analyse and make sense of drone data in a variety of use cases.
- Spending on public cloud services in Australia will reach 50% of the total addressable market for the first time in 2024, according to Gartner.
- Microsoft is making its largest investment in Australia to expand its infrastructure footprint, alongside plans to bolster skills training and cyber security in the country.
- HPE has evolved its GreenLake strategy over the past five years, expanding beyond its as-a-service offerings to include a capital purchase model and a strong focus on hybrid cloud management and AI.
Besides AWS, the Singapore government also uses public cloud services from Microsoft Azure and Google Cloud. Chan said the government will evaluate other hyperscaler offerings similar to Dedicated Local Zones if they become more mature.
“From a safety and resiliency point of view, we don’t want to put all our eggs in one basket,” he said. “The design philosophy of each cloud provider can be quite different, and it’s not about whether one is better than the other. We’re working with all of them, and when a product comes along that makes sense, we’ll look at it.”
But running workloads on multiple public cloud providers can be daunting as it requires technology teams to be skilled in the inner workings of each cloud. While the government doesn’t dictate the choice of cloud provider for agencies, it has called for cloud providers to invest in training, said Chan.
AWS, for example, has trained over 200,000 individuals in Singapore on cloud skills since 2017 through various programmes such as Skill Builder, as well as re/Start, which prepares individuals for careers in the cloud. Earlier this year, it partnered with Ngee Ann Polytechnic to provide over 500 ICT students with access to its training and certification programmes over three years.
Cloud migration approaches
The Singapore government has taken pragmatic approaches to its cloud migration efforts. These include lifting and shifting workloads from on-premise environments; re-platforming, such as porting code from legacy platforms like Solaris to Java; replacing legacy systems with software-as-a-service alternatives; rearchitecting applications to be cloud-native; and retiring workloads that are hardly used.
Chan said of all the approaches, the lift-and-shift approach was most criticised as it was perceived to be of little value. But he defended the approach, noting that lifting and shifting legacy workloads gives agencies and their suppliers time to learn how to work with cloud technologies.
“Also, you can’t rearchitect all the applications to the cloud because it’s a lot of work and you don’t have the time and money,” he added. “Rearchitecting can happen when an application reaches its end of life, and the faster you move the better because it allows you to rationalise your infrastructure.”
The archetype of each application is also a factor in determining which migration approach to take. Chan said customer-facing applications tend to be rearchitected because of the need to make them more scalable, elastic and customer-friendly using cloud-native services and application programming interfaces that provide access to third-party services such as payment systems.
Just as important is the need for applications to fail over to a different availability zone during a datacentre outage. He said that while this is much easier to do in the cloud today, with multiple availability zones available, IT teams still need to know how to configure applications properly for the failover to occur.
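As a concrete example of the configuration work Chan alludes to, the sketch below (again using AWS CDK in TypeScript as an assumed toolchain; the resource names and database engine are illustrative, not details from the article) deploys a database in multi-AZ mode so the platform can promote a standby in another availability zone if the primary’s zone goes down.

```typescript
// Minimal sketch of a multi-AZ deployment; names and engine choice
// are illustrative assumptions, not details from the article.
import * as cdk from 'aws-cdk-lib';
import * as ec2 from 'aws-cdk-lib/aws-ec2';
import * as rds from 'aws-cdk-lib/aws-rds';

class FailoverStack extends cdk.Stack {
  constructor(scope: cdk.App, id: string, props?: cdk.StackProps) {
    super(scope, id, props);

    // Spread the network across at least two availability zones so
    // there is somewhere for workloads to fail over to.
    const vpc = new ec2.Vpc(this, 'AppVpc', { maxAzs: 2 });

    // multiAz keeps a synchronous standby in a second zone; the
    // platform promotes it automatically if the primary zone fails.
    new rds.DatabaseInstance(this, 'AppDatabase', {
      engine: rds.DatabaseInstanceEngine.postgres({
        version: rds.PostgresEngineVersion.VER_15,
      }),
      vpc,
      multiAz: true,
    });
  }
}

const app = new cdk.App();
new FailoverStack(app, 'FailoverStack');
```

The zone-level failover here is automatic, but the application only survives it if clients are also configured to reconnect gracefully, for example with connection retries and short DNS caching, which is the kind of know-how Chan says teams must build.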
“The technology might be there, but the way it is used is important,” said Chan. “That’s why training is so important, not just for students but also our vendors. If we don’t upgrade the industry fast enough, we will have problems.”