Case study: How the BBC uses cloud to process media for iPlayer

Archana Venkatraman

The BBC’s IT team has transferred its video-on-demand service iPlayer to the cloud using a system called Video Factory. The broadcaster has achieved scalability, reliability and elasticity with the new system, and has cut the time taken to make a video available on iPlayer from nine or 10 hours down to between 20 and 30 minutes.

The BBC’s iPlayer is the UK’s biggest audio and video-on-demand service. It is free to use in the UK and receives about seven million requests per day.

The team adds approximately 500 hours of unique content to iPlayer every week, making it available immediately after broadcast for at least seven days, and in some cases for up to 30 or 45 days.


The content is made available for streaming as well as download on more than 1,000 types of device, including connected and smart TVs, mobile devices, PCs, Android and iOS systems, and cable boxes. The iPlayer app also has an install base of 20 million.

When BBC iPlayer was launched in 2007, it was “ground-breaking”, but seven years later, as demand grew and users’ expectations rose, the IT team found that the iPlayer infrastructure was ageing and needed to be replaced, according to Marina Kalkanis, head of core services at the BBC.

“Our old system was very monolithic, pretty slow, couldn’t cope with spikes, and had mixed ownership with third-party suppliers,” said Phil Cluff, the team lead at BBC Media Services, in a video recorded at the AWS re:Invent 2013 conference.

“We wanted to bring it all together to allow us to make changes very quickly, have complete ownership and react to the changing industry by adopting new, responsive delivery techniques,” Cluff added.

“We also wanted a highly elastic transcode system, and wanted it to be completely reliable and scalable, with full ownership.”

Cloud computing provided the answer to the team’s requirements, allowing it to process media at scale in ways that were previously impossible, or constrained by a lack of infrastructure.

To improve the BBC iPlayer service and make it ready for the next decade, the IT team designed and built Video Factory – a complete in-house rebuild of the ingest, transcode and delivery workflows for the iPlayer – on a public cloud platform.

“It is a scalable, elastic, message-driven cloud architecture that will last us for the next ten years,” said Cluff. “It is the result of a year of development by 18 engineers.”

Soon after the first AWS re:Invent developer conference in 2012, the team rolled the first set of Video Factory components out into production.

But why use public cloud computing services?

Many people rely on iPlayer to catch up on their favourite programmes. “We get complaints when programmes take too long to become available. Live programmes are particularly challenging for us because we have to do all the online processing after the broadcast completes. This is even more challenging when many live programmes all broadcast at the same time,” Kalkanis said in her blog.

When the IT team analysed users’ media requests over a typical week, it found peaks and lulls that made the usage pattern ideal for elastic cloud computing services. So the team decided to move live processing to the cloud.
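To illustrate why such a peaky demand curve favours elastic capacity, the sketch below uses invented hourly job counts (not BBC data) and an assumed fixed per-worker throughput to compare the worker fleet needed at peak and in the overnight lull:

```python
# Illustrative only: hourly transcode-job counts for one day (made-up numbers),
# showing the peak/lull pattern that makes elastic capacity pay off.
HOURLY_JOBS = [2, 1, 1, 1, 2, 4, 10, 25, 30, 28, 22, 20,
               18, 17, 16, 18, 24, 40, 55, 60, 45, 30, 12, 5]

JOBS_PER_WORKER_PER_HOUR = 5  # assumed throughput of one transcode worker

def workers_needed(jobs_per_hour: int) -> int:
    """Smallest worker count that clears the hour's jobs (ceiling division)."""
    return -(-jobs_per_hour // JOBS_PER_WORKER_PER_HOUR)

peak, trough = max(HOURLY_JOBS), min(HOURLY_JOBS)
print(workers_needed(peak))    # fleet size needed at the evening peak
print(workers_needed(trough))  # fleet size needed in the overnight lull
```

With fixed infrastructure the peak fleet sits mostly idle overnight; an elastic platform only bills for the workers each hour actually needs.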

The Video Factory architecture was built using AWS cloud services – Amazon Simple Queue Service (SQS) and Amazon Simple Notification Service (SNS). “We were one of the very early adopters of Elastic Transcoder, and we are very happy with the service,” Cluff said.
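In a message-driven architecture like this, components exchange small JSON job descriptions rather than media. A minimal sketch of what such a message might look like – the field names and bucket name are invented for illustration, not the BBC’s actual schema – with the SQS send shown only as a comment:

```python
import json

def build_transcode_request(source_key: str, preset: str) -> str:
    """Serialise a transcode job description. Only S3 keys travel in the
    message body; the media itself never passes through the messaging layer."""
    message = {
        "type": "transcode-request",
        "source_bucket": "example-ingest-bucket",  # hypothetical bucket name
        "source_key": source_key,
        "preset": preset,
    }
    return json.dumps(message)

# In production the string would be posted to a queue, e.g. with boto3:
#   boto3.client("sqs").send_message(QueueUrl=queue_url, MessageBody=body)
body = build_transcode_request("live/bbc-one/2014-05-01.ts", "hd-h264")
print(json.loads(body)["preset"])  # → "hd-h264"
```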


AWS’s Elastic Transcoder is a cost-effective and scalable cloud-based media transcoding service.

Cluff demonstrated the simple live workflow of Video Factory. The team captures live broadcasts and sends these to cloud-based storage (Amazon S3) for a transcode job. The broadcast content is then sent to the transcoder.

Once transcoded, the media either goes back to the cloud storage to be packaged on demand when a request comes in from the internet, or is used in the BBC’s other distribution services, Kalkanis explained.

The whole system is automated in the cloud. “We never actually touch the video itself, we use only Amazon S3 API calls,” Cluff said. The team finds the system quick and has reported no performance issues so far.
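The flow Cluff describes can be sketched as a toy simulation: each stage reads and writes a shared key/value store standing in for Amazon S3, and the orchestration code only ever handles keys, never the media bytes. Everything here – store, stage names, key layout – is invented for illustration:

```python
# Toy stand-in for S3: a key -> bytes store. The pipeline code below only
# passes keys around, mirroring "we never actually touch the video itself".
store = {}

def ingest(broadcast_id: str, media: bytes) -> str:
    """Capture lands in storage; the caller gets back a key, not bytes."""
    key = f"ingest/{broadcast_id}.ts"
    store[key] = media
    return key

def transcode(source_key: str, preset: str) -> str:
    """A real transcoder would read and write storage directly; the
    orchestrator only learns the output key."""
    out_key = source_key.replace("ingest/", f"transcoded/{preset}/")
    store[out_key] = store[source_key]  # placeholder for actual transcoding
    return out_key

src = ingest("bbc-one-news", b"\x00raw-mpeg-ts\x00")
out = transcode(src, "hd")
print(out)  # → "transcoded/hd/bbc-one-news.ts"
```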

“The only failure we get in the new system is when someone puts in a dodgy video clip – that’s how reliable it is,” he added.

The other advantage of using cloud services is that the BBC is not tied to a fixed amount of storage, so it does not have to limit the hours of content it can process, or worry about the extra space its HD content demands.

“And because it is easier to add in new services, our system is much more flexible in creating content for new devices,” Kalkanis added.

But Cluff had a word of caution for users of AWS in media services. “One thing you want to be careful of is that Amazon S3 isn’t immediately consistent – it is eventually consistent – so you have to build your individual components with that in mind,” he warned.
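The practical consequence of eventual consistency is that a component may have to retry a read until a freshly written object becomes visible, rather than treating the first miss as a failure. A generic poll-with-backoff helper – an illustration of the pattern, not BBC code – might look like:

```python
import time

def wait_until_visible(exists, retries=5, delay=0.1, backoff=2.0):
    """Poll exists() until it returns True, doubling the delay each attempt.

    Under eventual consistency an object written moments ago may not yet be
    readable; callers retry for a bounded number of attempts before failing.
    """
    for _ in range(retries):
        if exists():
            return True
        time.sleep(delay)
        delay *= backoff
    return False

# Fake storage check that only sees the object on the third poll:
checks = iter([False, False, True])
assert wait_until_visible(lambda: next(checks), delay=0.0)
```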


The team then added monitoring tools to the cloud-based media processing system to manage and evaluate how the individual components of Video Factory behave and perform. “The monitoring framework that we built is called iSpy – it saves us a lot of time diagnosing failures and very rarely do we have to look at logs,” he added.
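A monitoring framework of this kind can be approximated by having each component emit a timestamped event as a message passes through, then computing per-stage latencies instead of grepping logs. A minimal sketch, with invented stage names and timestamps:

```python
from collections import defaultdict

# message_id -> list of (stage, timestamp) events, in processing order
events = defaultdict(list)

def record(message_id: str, stage: str, timestamp: float) -> None:
    """Each pipeline component calls this as the message passes through it."""
    events[message_id].append((stage, timestamp))

def stage_durations(message_id: str) -> dict:
    """Seconds spent between consecutive stages for one message."""
    trace = events[message_id]
    return {f"{a}->{b}": t2 - t1
            for (a, t1), (b, t2) in zip(trace, trace[1:])}

record("msg-1", "ingested",   100.0)
record("msg-1", "transcoded", 160.0)
record("msg-1", "packaged",   175.0)
print(stage_durations("msg-1"))
# → {'ingested->transcoded': 60.0, 'transcoded->packaged': 15.0}
```

A slow or stalled stage then shows up directly in the durations for a message, which is the kind of at-a-glance diagnosis that avoids digging through raw logs.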

Other pieces of technology within the Video Factory architecture include Apache Camel, Splunk’s web-interface software and Java applications running on Tomcat.

Because of cloud’s agility and responsiveness, the BBC’s IT team is easily able to improve quality, increase the amount of HD content, and make live programmes available much more quickly.

For instance, the broadcaster has about 20 regional news programmes that go on air every night across the country and have to be made available on the iPlayer platform as soon as possible. “In the previous infrastructure it could take nine or 10 hours before all those transcodes were available for public use. We wanted a system where all those transcodes could happen in parallel and be delivered at the same time, and now with the new cloud-based system we take 20 to 30 minutes to deliver those news programmes,” Cluff said.

The team is now looking to develop Audio Factory based on the same cloud principles.

