With a strategic five-year plan, Bureau Veritas (BV) set out on its digital transformation project in 2015. Core to that transformation was migration of the vast bulk of its applications and infrastructure to the AWS cloud. Along the way, it faced challenges in migration, day-to-day management, costs and security. We talk to BV’s IT director, Jean-Marc Devos Plancq, about the transition.
Bureau Veritas – or BV to those who know it well – is one of the oldest companies still active anywhere. Formed in 1828 in Belgium, but now headquartered in France, the testing, inspection and certification firm employs 75,000 people in 1,500 offices in 150 countries.
Its original raison d’être was maritime risk assessment, but since then, Bureau Veritas has diversified into sectors that include automotive, railways, public sector infrastructure, transport, supply chain, energy, agro-food and health. With a presence on five continents, the group turned over €4.6bn in 2020.
“Acceleration of digitisation was a key pillar of the plan,” says Devos Plancq. “And that pillar comprised two aspects: delivery of digital services to our clients, built on the improvement and digitisation of our internal processes.”
So, the IT director began to look at what was on offer from public cloud providers. “Historically, we had used private datacentres to host our applications,” he says. “Capex had been a major budget component, but we wanted to gain some financial agility and not be tied to amortisation cycles over three, four, five years for different solutions.”
Devos Plancq also points out the time-consuming nature of such procurement cycles. “You have calls for tenders to replace a SAN that will cost hundreds of thousands or even millions of euros,” he says. “Then it takes months to select the vendor and the product, and to deliver and install it. It can tie up a large part of the IT department for 18 months just to add storage capacity. We wanted to avoid this inertia.”
Besides the investment and time needed for new deployments on-site, the IT chief wanted to reduce the burden of managing all these elements, as well as networks, patching, and so on. And so the agility promised by IaaS, PaaS, SaaS (infrastructure, platform and software as a service) and the cloud appeared very seductive.
“We wanted to orient ourselves towards what would bring value for customers, whether internal or external, that use the applications we had traditionally hosted,” says Devos Plancq.
At the time, six years ago, Bureau Veritas took the view that AWS was “the most mature provider, with a platform that had the largest number of directly usable services”, he adds.
And so, by February 2021, BV hosted 85% of its applications on AWS – but that transition didn’t happen overnight.
Three phases to cloud transition
“We started with a discovery phase, so we could understand how the cloud worked and how we could integrate it into our application environment,” says the IT director. “Also, we had to prepare support for our teams because they were about to completely change the tools they worked with.”
This exploratory period – lasting 18 to 24 months – saw applications move to the cloud as the opportunity arose. “We moved applications to AWS that were simple to deploy, notably those already delivered through DevSecOps pipelines, automated, and built on technologies like Java,” says Devos Plancq.
With time, confidence and knowledge acquired, BV adopted its “cloud-first” approach. “According to this principle, any new applications had to be developed in the cloud, unless it was technically impossible,” he says. That period lasted two more years before the third stage was reached.
“When we believed we had good knowledge of the AWS platform, we decided to migrate all servers for our corporate solutions to the cloud, so we could shut down our on-site infrastructure,” says Devos Plancq.
This phase of migration to the AWS cloud meant moving Oracle and SQL Server databases to Amazon’s RDS database management system (DBMS). But the Bureau Veritas IT teams didn’t settle for a simple “lift and shift”. “We integrated a version upgrade to our databases into their move to the cloud,” says Devos Plancq.
“It was easier to migrate our databases to the cloud than to carry out an update on-site, because that would have required infrastructure changes too. We limited ourselves to creating backup and recovery partitions on the instances that we mounted on AWS, and that was it.”
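The article doesn’t detail the tooling BV used for these combined migration-and-upgrade moves. As an illustration only, an engine-version upgrade on an RDS instance can be requested through boto3’s `modify_db_instance`; the instance identifier and target version below are hypothetical, and the real call needs AWS credentials and an existing instance.

```python
# Sketch: bundling an engine-version upgrade into an RDS migration step.
# The parameters are kept in a pure function so they can be checked
# without touching AWS.

def upgrade_params(instance_id: str, target_version: str) -> dict:
    """Build the keyword arguments for boto3's rds.modify_db_instance."""
    return {
        "DBInstanceIdentifier": instance_id,
        "EngineVersion": target_version,
        # Required when jumping major versions of the engine.
        "AllowMajorVersionUpgrade": True,
        # Apply during the next maintenance window rather than immediately.
        "ApplyImmediately": False,
    }

# Hypothetical instance name and SQL Server engine version string.
params = upgrade_params("bv-corp-db", "14.00.3281.6.v1")

# With credentials configured, the upgrade would be requested like this:
# import boto3
# boto3.client("rds").modify_db_instance(**params)
```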
Regression testing was carried out “to make sure everything was working, that there were no connectivity problems”, he adds.
Successful cloud migration needs a few tricks
Migrating a DBMS to Amazon RDS can sometimes bring surprises, but the BV IT teams didn’t have any problems. “The bulk of the functionality in SQL databases is taken care of by RDS,” says Devos Plancq. “But there is some functionality that can’t be taken on by AWS. If you use it, you have to find another solution.”
This was one of those times when outside help was needed, so BV subscribed to the AWS Migration Acceleration Programme (MAP).
The applications that use those RDS-managed databases run on Elastic Beanstalk, which is one of the longest-standing services provided by AWS, and is used heavily by Bureau Veritas. “This PaaS allows deployment of applications and the benefits of automated platform scaling,” says the IT chief. “You manage the environment rather than the servers because the platform manages itself according to the number of users at any one time.”
The IT team administers about 50 applications in this way, out of a total of 115.
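BV’s exact environment definitions aren’t public. As a sketch of the “manage the environment, not the servers” model described above, an Elastic Beanstalk environment can be created with autoscaling bounds via boto3; the application name, environment name and solution stack below are hypothetical, and the actual call needs AWS credentials.

```python
# Sketch: defining an Elastic Beanstalk environment whose fleet size is
# managed by the platform between a min and max, as the quote describes.

def environment_spec(app: str, env: str, min_size: int, max_size: int) -> dict:
    """Keyword arguments for boto3's elasticbeanstalk.create_environment."""
    return {
        "ApplicationName": app,
        "EnvironmentName": env,
        # Hypothetical Java (Corretto) solution stack name.
        "SolutionStackName": "64bit Amazon Linux 2 v3.4.9 running Corretto 11",
        "OptionSettings": [
            # Elastic Beanstalk scales the instance fleet between these bounds.
            {"Namespace": "aws:autoscaling:asg", "OptionName": "MinSize", "Value": str(min_size)},
            {"Namespace": "aws:autoscaling:asg", "OptionName": "MaxSize", "Value": str(max_size)},
        ],
    }

spec = environment_spec("bv-demo-app", "bv-demo-prod", 2, 8)

# With credentials configured:
# import boto3
# boto3.client("elasticbeanstalk").create_environment(**spec)
```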
Read more on cloud computing
- Commerzbank picks Google for cloud migration. German bank expands its work with Google Cloud and plans to move 85% of applications to its platform.
- Hidden cost of cloud puts brakes on migration projects. Lack of skills and complexities in rebuilding software for the cloud are among the main factors inhibiting cloud success.
“For custom developments, the PaaS allows us to guarantee a level of performance according to the time of day, the number of users connected, but also to optimise our costs when activity is low or non-existent,” says Devos Plancq.
Most of the applications developed by Bureau Veritas are written in Java. Elastic Beanstalk was built with Java in mind and supports frameworks and languages that include .NET, Node.js, PHP, Python, Go and Ruby.
On the other hand, Elastic Beanstalk also requires you to take into account several peculiarities, says Devos Plancq. “It is important that your applications aren’t dependent on user sessions,” he adds. “Because the platform decides which server is active or not, users can lose their progress within a task. So, you have to manage sessions in a shared cache.”
For that, Bureau Veritas uses Amazon’s ElastiCache, which is a service based on in-memory databases Redis and Memcached.
“This requires a little tweaking in the application to externalise user sessions in the cache, but equally it’s important the sessions don’t have a big footprint when serialisation takes place,” says Devos Plancq. “Ideally, you should use stateless applications.”
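The session-externalisation pattern Devos Plancq describes can be sketched as follows. This is not BV’s code: a Redis client (as used against ElastiCache) exposes `setex`/`get`, and here a dict-backed stub stands in so the example runs without a live cache endpoint.

```python
# Sketch of externalising user sessions to a shared cache, so any app
# instance behind the load balancer can serve any user.
import json

class SessionStore:
    """Keep sessions out of app-server memory, in a shared cache."""

    def __init__(self, cache, ttl_seconds: int = 1800):
        self.cache = cache          # redis.Redis(...) in production
        self.ttl = ttl_seconds

    def save(self, session_id: str, data: dict) -> None:
        # Serialise only small, flat state: a large footprint here slows
        # every request, which is why stateless apps are preferred.
        self.cache.setex(f"session:{session_id}", self.ttl, json.dumps(data))

    def load(self, session_id: str) -> dict:
        raw = self.cache.get(f"session:{session_id}")
        return json.loads(raw) if raw else {}

class FakeCache:
    """Minimal stand-in for a Redis client (ignores expiry)."""
    def __init__(self):
        self.data = {}
    def setex(self, key, ttl, value):
        self.data[key] = value
    def get(self, key):
        return self.data.get(key)

store = SessionStore(FakeCache())
store.save("abc123", {"user": "jdoe", "step": 3})
print(store.load("abc123"))   # {'user': 'jdoe', 'step': 3}
```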
Cloud-first bears fruit
Devos Plancq is full of praise for how quickly it is possible to develop and deploy solutions via AWS services. He points to the example of Bureau Veritas’s Restart your Business, which provides services to help customers reopen workplaces and spaces to the public after Covid restrictions. The application was developed in 14 days and deployed in 85 countries in “three or four days”, he says.
He points to the number of new services regularly announced by AWS, many of them pushing towards a serverless approach. “We’re taking PaaS to the next level with services like Lambda, which let us buy milliseconds’ worth of compute to execute applications,” he says.
Automation of processing is also on the right track. BV’s IT teams are committed to an infrastructure-as-code approach to deploy and upgrade the technical infrastructure, to upgrade operating systems, applications and other services.
On this subject, Devos Plancq says his teams make use of services that help automate operations and use alerts to warn, for example, when S3 storage reaches a set threshold and another bucket needs to be deployed.
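An alert of the kind described can be built as a CloudWatch alarm on S3’s daily storage metric. The bucket name, threshold and SNS topic ARN below are hypothetical, and `put_metric_alarm` itself needs AWS credentials, so the parameters are built in a pure function.

```python
# Sketch: a CloudWatch alarm that fires when a bucket's stored bytes
# cross a threshold.

GIB = 1024 ** 3

def bucket_size_alarm(bucket: str, threshold_gib: int, topic_arn: str) -> dict:
    """Keyword arguments for boto3's cloudwatch.put_metric_alarm."""
    return {
        "AlarmName": f"{bucket}-size-threshold",
        "Namespace": "AWS/S3",
        "MetricName": "BucketSizeBytes",   # published daily by S3
        "Dimensions": [
            {"Name": "BucketName", "Value": bucket},
            {"Name": "StorageType", "Value": "StandardStorage"},
        ],
        "Statistic": "Average",
        "Period": 86400,                   # one day, matching the metric
        "EvaluationPeriods": 1,
        "Threshold": threshold_gib * GIB,
        "ComparisonOperator": "GreaterThanThreshold",
        "AlarmActions": [topic_arn],       # e.g. an SNS topic for the ops team
    }

alarm = bucket_size_alarm(
    "bv-app-data", 500, "arn:aws:sns:eu-west-1:123456789012:storage-alerts"
)
# With credentials configured:
# import boto3
# boto3.client("cloudwatch").put_metric_alarm(**alarm)
```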
“We’re working to connect our systems via APIs [application programming interfaces],” he says. “We use API Gateway to allow local applications to talk to applications across the group, but also to allow access to customers and partners.”
API Gateway is used, for example, in Code’n’go, the highway code learners’ application developed by BV and delivered via driving schools.
Obviously, to adapt all an organisation’s services to such a new way of working demands financial vigilance. Bureau Veritas has gradually shifted to a FinOps approach that makes use of EC2 Savings Plans as well as automation of startup and shutdown of test-and-dev environments used by developers.
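The automated shutdown of test-and-dev environments mentioned above is typically done with a scheduled Lambda. BV’s implementation isn’t described, so this sketch assumes a hypothetical `env=dev` tag; the selection logic is kept pure so it runs without AWS credentials.

```python
# Sketch of a FinOps job that stops running instances tagged as dev
# environments outside working hours.

def dev_instances_to_stop(reservations: list) -> list:
    """Pick running instance IDs carrying the env=dev tag from a
    describe_instances-shaped response."""
    ids = []
    for reservation in reservations:
        for inst in reservation["Instances"]:
            tags = {t["Key"]: t["Value"] for t in inst.get("Tags", [])}
            if inst["State"]["Name"] == "running" and tags.get("env") == "dev":
                ids.append(inst["InstanceId"])
    return ids

# In a Lambda triggered on an evening schedule, the handler would be:
# import boto3
# def handler(event, context):
#     ec2 = boto3.client("ec2")
#     for page in ec2.get_paginator("describe_instances").paginate():
#         ids = dev_instances_to_stop(page["Reservations"])
#         if ids:
#             ec2.stop_instances(InstanceIds=ids)

sample = [{"Instances": [
    {"InstanceId": "i-dev1", "State": {"Name": "running"},
     "Tags": [{"Key": "env", "Value": "dev"}]},
    {"InstanceId": "i-prod1", "State": {"Name": "running"},
     "Tags": [{"Key": "env", "Value": "prod"}]},
]}]
print(dev_instances_to_stop(sample))   # ['i-dev1']
```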
After six years, adapting to these services and their limits is part of the daily life of BV’s IT teams, but they also face other difficulties.
“We are often constrained by the technical prerequisites of software packages,” says Devos Plancq. Bureau Veritas uses Documentum for EDM, Sybele for disaster recovery, Tableau for BI and reporting, and SAP for financials.
Cyber security: Encryption obligatory for Bureau Veritas
To monitor its cloud infrastructure and various services, BV uses the AWS tool Amazon CloudWatch. “It allows us to get a granular view of what’s happening in our environments,” says Jean-Marc Devos Plancq. “We also use AWS Config to track events and actions at any moment in time.”
As regards its core activities around certification and testing, Bureau Veritas has a heavy burden to bear in data and customer security, he says.
“As an organisation, you have to understand how responsibilities are shared between AWS and the customer,” says Devos Plancq. “From the moment you know which elements are in your charge and those that AWS administers, it is easy to define the security perimeter, and to do that, there are lots of AWS services available.”
From strong partitioning of applications and environments at network level, via isolation of applications from each other, up to data encryption, the BV security team uses the AWS tools at its disposal. “We have very strict security rules at Bureau Veritas, with network security, data encryption in-transit and at-rest, and deployment of layers of application protection like AWS WAF,” says Devos Plancq.
He also cites BV’s use of Amazon Virtual Private Cloud (VPC) and AWS Key Management Service (KMS). “It’s the responsibility of our information security function to clearly define the rules for these building blocks and to put in place all the auditing measures and monitoring required to ensure their application,” he says.
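The encrypt-at-rest rule combining S3 and KMS can be illustrated with server-side encryption on upload. This is a sketch, not BV’s code: the bucket name and key alias are hypothetical, and the real `put_object` call needs AWS credentials.

```python
# Sketch: enforcing at-rest encryption on an S3 upload with a
# customer-managed KMS key (SSE-KMS).

def encrypted_put_params(bucket: str, key: str, body: bytes, kms_key: str) -> dict:
    """Keyword arguments for boto3's s3.put_object with SSE-KMS."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        # Server-side encryption handled by S3, keyed through KMS.
        "ServerSideEncryption": "aws:kms",
        "SSEKMSKeyId": kms_key,
    }

params = encrypted_put_params("bv-reports", "2021/audit.pdf", b"...", "alias/bv-data-key")

# With credentials configured:
# import boto3
# boto3.client("s3").put_object(**params)
```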
Like all European enterprises, Bureau Veritas is bound by the General Data Protection Regulation (GDPR). “Besides the fact that our data is hosted in Europe by AWS, we have to implement all the specific requirements of GDPR: processing documentation, classification of data, and so on,” says Devos Plancq. “Customers are made aware that their data is hosted on the Amazon infrastructure.”
But with a US-based cloud provider, the company is exposed to US extra-territorial laws. The most well-known of these is the Cloud Act, but the Patriot Act and Fisa also represent a risk for some Bureau Veritas customers, and so BV has taken up the recommendation repeated many times by AWS.
“We’ve made things very simple,” says Devos Plancq. “Absolutely everything is encrypted, with no questions about whether it is possible, with no debate. It’s the rule we apply without exception.”
However, certification files are not always kept in the cloud, such as in the case of BV clients that are subject to particularly strict rules or regulations that mean they simply can’t pass data to AWS or its competitors. “If we work with a ministry of defence in a particular country or a customer with specific needs, their needs are addressed locally,” says Devos Plancq.