NTT taps data analytics and cloud for Tour de France
NTT’s analytics and cloud capabilities stood up to the test amid crashes and poor weather conditions during the first stage of this year’s Tour de France
On the opening day of the 2020 Tour de France, wet weather, twisting roads and race tensions created problems not only for the peloton, but for NTT as well.
The technology supplier for one of the world’s most watched sporting events and sponsor of the NTT Pro Cycling team found itself dealing with distorted GPS data transmitted from race bikes due to poor atmospheric conditions.
Making things worse were multiple crashes along the 156km route, with as many as 20% of the riders in the peloton having to use replacement bikes that were not fitted with sensors, said Peter Gray, senior vice-president at NTT’s advanced technology group for sport.
“It was an incredibly complex stage to manage with all the bike changes and complexity around the challenging weather conditions that affected radio transmission and GPS accuracy,” Gray told Computer Weekly.
“Our analytics platform had to do a lot of data cleansing and interpolation to position riders, and in some instances almost having to make an educated guess on their locations because of those external factors.”
Gray said NTT’s algorithm has been refined over time, so even in the challenging conditions of this year’s first stage, it was able to snap riders back to their probable locations on the course. “We employ data quality confidence levels for different riders, so there is a level of confidence that the position we’re calculating for a rider is correct.”
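NTT has not published its positioning algorithm, but the interpolation-plus-confidence idea Gray describes can be sketched in a few lines of Python. Everything here is invented for illustration: the extrapolation from the last two good GPS fixes and the staleness-based confidence score are assumptions, not NTT’s actual method.

```python
from dataclasses import dataclass

@dataclass
class Fix:
    t: float   # seconds since the stage start
    km: float  # distance along the course, in km

def interpolate_position(last: Fix, prev: Fix, now: float, max_gap: float = 30.0):
    """Estimate a rider's course position at time `now` by extrapolating
    the speed implied by the last two good GPS fixes, and attach a
    confidence score that decays as the data grows stale."""
    speed = (last.km - prev.km) / (last.t - prev.t)  # km per second
    estimate = last.km + speed * (now - last.t)
    # Confidence is 1.0 for fresh data, falling linearly to 0 once the
    # last good fix is max_gap seconds old (hypothetical threshold).
    confidence = max(0.0, 1.0 - (now - last.t) / max_gap)
    return estimate, confidence
```

For example, with good fixes at 100s (km 40.0) and 110s (km 40.1), asking for the position at 120s extrapolates to roughly km 40.2, with the confidence already reduced because the data is 10 seconds old. A production system would blend many more signals, such as road shape and neighbouring riders, to make the "educated guess" Gray mentions.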
Like most modern sports in the age of cloud and big data analytics, professional cycling generates heaps of data that can be used to manage races, as well as to enrich the fan experience at a time when fewer spectators are allowed.
During each leg of the 21-stage race, which spans about 3,500km of flat, hilly and mountainous roads, 2.5 million records of raw tracking data are collected, with the volume of raw real-time data reaching 800MB. Each record is further enriched in real time with more than 50 attributes, including weather conditions and road gradient.
All teams in the race get access to the same data, which is shown on live TV and available on the event’s website and official mobile app. Gray said the teams use the data to understand how each race progresses, including the number of riders they have in the tête de la course – or lead group – as well as at the back of the race.
“Those types of information are really useful for the teams, and sports directors also have a radio connection with each of the riders,” said Gray. “They can then decide if they want to chase it down or save their energy for the next day.”
This year, NTT has integrated a machine learning model into its fantasy league game that can forecast, among other predictions, which riders will do well in each stage of the race.
“We’re using that model to give players insights into which of the riders they should be watching out for today and give them a bit of advice on who they should be adding to their fantasy teams,” said Gray.
Gray claimed the model had accurately predicted the top three riders in the general classification category, those with the fastest cumulative times across all stages. For each stage, it has also successfully predicted the riders likely to finish in the top five.
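NTT has not disclosed how its prediction model works. As a toy illustration of the general approach, forecasting stage performance from rider features, the sketch below ranks riders by a weighted sum of hypothetical attributes; the feature names, weights and scoring scheme are all invented.

```python
def rank_riders(riders, weights):
    """Toy ranking model: score each rider by a weighted sum of
    (hypothetical) features and return names sorted best-first."""
    def score(rider):
        return sum(weights[feature] * rider[feature] for feature in weights)
    return [r["name"] for r in sorted(riders, key=score, reverse=True)]

# Invented example: on a mountain stage, climbing ability is weighted
# as heavily as recent form.
riders = [
    {"name": "Rider A", "recent_form": 0.9, "climbing": 0.2},
    {"name": "Rider B", "recent_form": 0.6, "climbing": 0.8},
    {"name": "Rider C", "recent_form": 0.4, "climbing": 0.9},
]
weights = {"recent_form": 0.5, "climbing": 0.5}
print(rank_riders(riders, weights))  # → ['Rider B', 'Rider C', 'Rider A']
```

A real model would be trained on historical race data rather than using hand-picked weights, but the output, a per-stage ranking players can use to pick their fantasy teams, is the same shape.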
NTT’s relationship with the Tour de France started in 2015, when it deployed a portable datacentre, clusters of servers housed in a truck, to process all the data on-site.
In 2016, it moved to virtualised servers in the cloud. That proved to be a godsend when bad weather prevented NTT from deploying its on-site infrastructure at a mountaintop finish line for one stage of the race.
“Because we were replicating our environments in the cloud, we completely virtualised our physical environments and rerouted all the data to our cloud infrastructure,” Gray said. “We demonstrated that we were able to make a fully cloud model work successfully.”
This year, NTT took things further by deploying Docker containers for its real-time analytics capabilities, along with the use of code-based automation. “Our DevOps teams can literally type a single command and deploy our new environments, including infrastructure and applications,” he said.
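NTT has not named its tooling, but the pattern Gray describes, containerised services brought up by a single command, can be sketched with a hypothetical Docker Compose file. The service names, images and ports below are invented for illustration.

```yaml
# docker-compose.yml (hypothetical): two containerised services for
# real-time race analytics, started together with one command.
services:
  ingest:
    image: example/gps-ingest:latest   # receives raw tracking data
    ports:
      - "9000:9000"
  analytics:
    image: example/race-analytics:latest  # cleanses, enriches, positions riders
    depends_on:
      - ingest
    environment:
      INGEST_URL: http://ingest:9000
```

With a file like this in place, `docker compose up -d` brings up the whole environment in one step, the kind of single-command deployment Gray describes, typically alongside infrastructure-as-code tooling for the underlying cloud resources.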
Moving forward, Gray said NTT is looking at technologies to enhance the fan experience and support the event’s massive logistics operations.
“It’s almost like a village – you’re moving hundreds of kilometres every day and so using services around the internet of things, geolocation, wayfinding and augmented reality to enhance the experience of the fans and people running the race is very much on the agenda.”
Read more about cloud and analytics in APAC
- Asia-Pacific organisations are relying on cloud scalability to extend digital services to more users and customers.
- Australian retailer Kmart is lifting and shifting some old Cobol code to AWS while rebuilding others into microservices in its mainframe migration move.
- Tableau bolsters its presence in Asia-Pacific following its acquisition by Salesforce, and is combining its analytics capabilities with Einstein Analytics to deliver AI-powered insights.
- Malaysia’s Affin Hwang taps data analytics to better profile its customers, increase sales leads and identify new customer segments.