The Winter Olympic Games in Sochi have certainly sparked their share of controversy, from protests over Russia’s anti-gay laws to anger at the venue’s lack of snow. But although speed-skaters and snowboarders may snipe about a slushy course, one thing has so far run remarkably smoothly: the technology.
With a slew of sensors monitoring athletes’ every twist and turn, and hundreds of physical servers to process the results data in real time – pumping it out to venues, broadcasters, subscribers and a cloud-based website serving information to some eight billion devices – these are the most data-intensive games ever staged.
Results systems are generating more than 15TB of data every day, a figure that Atos, the games’ official worldwide IT partner, engagingly estimates is “the equivalent of every spectator in the speed-skating arena tweeting once a minute for 27 years”.
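Atos’s comparison can be sanity-checked with some back-of-the-envelope arithmetic. The figures below are assumptions, not from the article: roughly 8,000 spectators in the speed-skating arena and a tweet of about 130-140 bytes of text.

```python
# Rough check of the "tweeting once a minute for 27 years" comparison.
# Assumed figures (not stated in the article): ~8,000 spectators in the
# speed-skating arena; the article supplies 15TB/day and 27 years.

SPECTATORS = 8_000
MINUTES_PER_YEAR = 60 * 24 * 365
YEARS = 27
DAILY_DATA_BYTES = 15e12  # 15 TB of results data per day

total_tweets = SPECTATORS * MINUTES_PER_YEAR * YEARS
bytes_per_tweet = DAILY_DATA_BYTES / total_tweets

print(f"{total_tweets:.3e} tweets, ~{bytes_per_tweet:.0f} bytes each")
# ~132 bytes per tweet -- close to a full 140-character tweet,
# so the comparison is roughly consistent with a 15TB day.
```

Under those assumptions the comparison works out to around 132 bytes per tweet, which is close to the old 140-character limit, so the analogy is at least internally plausible.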
Putting in place the infrastructure to support this has been no mean feat, but Marta Sanfeliu, Atos’s chief integration officer for the games, has plenty of experience. “This is my eighth Olympics,” she says. “I’ve been involved since Sydney in 2000, although this is the first time I’ve handled full delivery and led the project on site.”
The scope of Sanfeliu’s role is breathtaking. As well as being responsible for the results systems across 14 sporting venues, she has also had to deliver all the games management systems, including those for security officials, volunteers and logistical operations such as human resources.
“I’m also in charge of integrating systems from the various technology partners – notably Omega, which is delivering all the sensor and timing technology; networking partner Avaya; and several other companies providing things like security and load balancing systems,” she says.
“We also help the different technology partners and Organising Committee with their policies and procedures for service management, incident management, change management and the rapid deployment necessary at the various competition venues, as well as non-competition venues such as the International Broadcasting Centre and the three Olympic villages at these games.”
She leads a team of business technologists that has grown from around 100 in the early stages to 3,000 on-site during the competition. For the past four years Sanfeliu has also been navigating through a political and contractual minefield involving the International Olympic Committee (IOC), the Organising Committee for the Sochi games, Russian government officials and a raft of technology partners across the globe.
“Much of the equipment we needed to procure was not available in Russia, so we’ve had to go through long purchasing procedures via the Organising Committee to get things out here,” she says.
And with the dexterity of a skier negotiating the slalom, she’s had to ensure that all the teams, partners and stakeholders have been able to work successfully around one another, as well as around the numerous builders and contractors on-site.
Building the infrastructure
For the first time, these games boast two separate main datacentres. “When I first came out to Russia in 2010 to define the requirements for physical equipment with a core team of people and the Organising Committee, we had to base the games management systems in Moscow, since the Sochi site was not ready,” she says. “The results systems required a separate datacentre at the Technology Operations Centre in Sochi, which we didn’t start building until December 2011.”
The Sochi datacentre was fully operational by May 2012. In addition, all the individual venues have their own standalone mini-datacentre to process results on-site. “It’s a strict IOC requirement that the results for each venue have to be delivered locally,” says Sanfeliu.
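The local-delivery rule implies an architecture in which each venue can score and publish results even if its link to the central site fails. A minimal sketch of that pattern is below; every name in it (`VenueResultsNode`, `record`, the replication queue) is invented for illustration, as Atos’s actual results systems are not public.

```python
# Hypothetical sketch of "results delivered locally": each venue's
# mini-datacentre publishes results on-site first, then queues a copy
# for asynchronous replication to the central Technology Operations
# Centre. All class and method names here are illustrative inventions.

from dataclasses import dataclass, field

@dataclass
class Result:
    event: str
    athlete: str
    time_seconds: float

@dataclass
class VenueResultsNode:
    venue: str
    local_board: list = field(default_factory=list)  # on-site scoreboard feed
    toc_queue: list = field(default_factory=list)    # async replication queue

    def record(self, result: Result) -> None:
        # Publish locally first: venue results must never depend on the
        # central datacentre being reachable.
        self.local_board.append(result)
        # Queue a copy for later replication to the central site.
        self.toc_queue.append(result)

node = VenueResultsNode("Adler Arena")
node.record(Result("Speed skating 5000m", "Skater A", 370.5))
print(len(node.local_board), len(node.toc_queue))  # 1 1
```

The design choice being illustrated is simply ordering: the on-site publish happens before, and independently of, any network transfer, which is what the IOC requirement as described demands.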
Altogether the datacentre systems comprise 400 HP ProLiant servers running Microsoft Hyper-V and System Center software for virtualisation, disaster recovery and remote access management. “Despite the increased workloads and performance, that’s less than half the number we had in Vancouver four years ago,” she says.
The major difference between Sochi and previous Olympic cities Sanfeliu has worked in, she says, is the fact that all the infrastructure has had to be built from scratch.
“Since the games were awarded here seven years ago, what we’ve been able to achieve is quite impressive. It’s pretty complicated to put together a technology solution when you don’t have a stable infrastructure to build on,” she says. “Add to that the fact the infrastructure is not fixed, and you have an even bigger challenge.”
Unsurprisingly, the project has not been without its stresses. Even at the end of 2013, when the datacentre had been live for 18 months, site construction was still disrupting the team’s work.
“We had a lot of service downtime due to construction. For example, on one occasion the contractor building the train link accidentally cut through all the communications lines,” says Sanfeliu.
Power to the people
But the biggest challenge was power. “When we started working in Sochi there simply wasn’t enough power available – not only for the datacentre, but for the whole infrastructure of the games. So at that early stage, we had to ensure we got hold of appropriate generators and other power systems. It was vital that power was managed in the right way because we couldn’t risk damaging the equipment, which is meant to be kept as part of the post-games legacy,” she says. “Fortunately, one of the local partners, Rostelecom, was able to help us get the generators and power equipment we needed.”
In all, getting the systems ready for the games has involved around 100,000 hours of testing, and so far they’ve been performing very smoothly.
Sanfeliu's biggest worry before the start of the games was that either power or communications would go down at a critical moment, particularly since power became unstable in one of the venues just a day before the opening ceremony.
“Luckily, it came back in time for the start of the games and so far that’s the only major glitch we’ve seen,” she says. “Apart from that, we’ve just faced small operational issues, such as a few days ago in Laura [venue for the biathlon and cross-country skiing] where lighting for broadcasters caused some interference with the transponders. But that’s the sort of minor issue you get at any games.”
It’s taken a lot of hard work, and a lot of hours, to reach this point, but now the games are underway, Sanfeliu says she’s managed to take a little time to enjoy them, so far attending the opening ceremony and figure skating. And she says even after 14 years she still loves what she does.
This is the first time she’s worked in Russia and she says it’s been a very rewarding experience. “When I was first offered the opportunity, I didn’t know much about the working culture of the Russians, but they’ve been incredibly warm and welcoming. And their spirit of collaboration runs deep, especially when you engage them on a personal level. This has been a very rewarding project from all sorts of perspectives,” she says.
“When I was originally hired for the Sydney games, I was so happy to be involved in something that’s so special for everybody. Now I look for that same sparkle in the eyes of the people I hire. Because it’s incredibly hard work and you can only do it if you love what you’re doing.”