Shifting the goalposts - the Euro 2000 website

A huge amount of traffic is expected to hit the official website of the European football championships. To handle it, a new network design has been produced.

The second largest tournament in the world's most popular sport has just kicked off, and while England may be struggling to match the success of 1966, or even 1996, the creators of the official website are hoping that fans across the globe won't be let down. The site is trying to break the traditional mould of the official tournament site. Instead of acting as a standard marketing tool with basic team information and match reports, perhaps a lesson learned from the last World Cup, UEFA has brought in partners to aim the content of the site at the fans. In doing so, it expects the site to be the most popular ever during the three weeks of the tournament.

While tailoring content to what the fans want may not seem like a revolutionary idea, many businesses still use their websites purely as marketing tools to promote the brand name. A little lateral thinking on the part of UEFA may pay huge dividends, and similar thinking from other companies could have the same effect.

To produce the content UEFA was after, it turned to Sportal, a young sports Internet company that now runs more than 40 sporting websites around the world. The company has developed a reputation for unbiased and interactive content, which UEFA believed would benefit its own site.

Mike Sturrock, head of engineering at Sportal, said: "The key to the site is that it is a website for the fans. UEFA didn't want it to be some crusty official site for the press to pick up the official UEFA line. So we put a lot of work into making it attractive to the fans. The supporters section, games championship, forums and chat rooms are all designed to really engage the fans."

As such, the site not only contains pictures and text, but will also carry streaming audio commentary throughout matches, and streaming video highlights posted just after each game has finished. The commentary, statistics and results will all be available in real time through a special Java applet available for download on the site. In addition, the site will be available in five languages: English, French, German, Spanish and Italian.

Of course, this generous use of streaming multimedia, combined with the extremely large anticipated number of hits on the site, would be enough to put most normal sites out of action completely. To this end, more partners were brought in to build what is seen by some as the first of the next generation of high-demand websites.

PSINet has been working with UEFA since March 1999, and was also the web host for the 1998 World Cup. That put the company in an ideal position to take on the challenge of hosting the Euro 2000 site. PSINet's task was to provide enough bandwidth to satisfy the huge demand placed on the site, estimated at over one gigabit per second, with enough reliability that content was always fresh and accessible to the fans.

As part of the design process, PSINet and Sportal got together in January to decide on the best way forward. Given the vast amount of traffic expected - around 150 million hits per day, with the majority falling in a four-hour period around the matches - both companies decided that, to guarantee every user a good response, the information would have to be held as close as possible to the user. The service also needed the capability to route requests to different servers if one became congested.

These requirements led to a network design in which the content was hosted at two main data centres in Holland and Los Angeles. This content was then replicated onto caching servers throughout Europe and the US. The content would be created by Sportal on Vignette servers, which would then be pushed onto Apache web servers when it was ready for release. From here, software would detect new content on the web servers and notify the caches to delete the old content and go and pick up the new. This guaranteed that despite the fact that users would only see cached content, it would always be up to date. Now all that was needed was to find the right caching and routing partners.
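The publish-and-invalidate flow described above - content released onto the web servers, caches notified to drop their old copy and fetch the new one - can be sketched roughly as follows. This is an illustrative model only; the class and function names are hypothetical, not part of the Vignette, Apache or Cacheflow products.

```python
import time


class OriginServer:
    """Stands in for the Apache web servers holding released content."""

    def __init__(self):
        self.pages = {}  # path -> (version, body)

    def publish(self, path, body):
        version = time.time()  # release timestamp acts as the version
        self.pages[path] = (version, body)
        return version


class EdgeCache:
    """Stands in for one of the distributed caching servers."""

    def __init__(self, origin):
        self.origin = origin
        self.store = {}  # path -> (version, body)

    def invalidate_and_refetch(self, path):
        # Drop the stale copy, then pull the fresh one from the origin.
        self.store.pop(path, None)
        self.store[path] = self.origin.pages[path]

    def serve(self, path):
        return self.store[path][1]


def push_release(origin, caches, path, body):
    """Publish new content, then notify every cache to refresh it."""
    origin.publish(path, body)
    for cache in caches:
        cache.invalidate_and_refetch(path)
```

The key property the article describes is preserved: users only ever see cached copies, but because every release triggers an invalidate-and-refetch, those copies are never stale.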

Vicky Reeves, implementation manager for PSINet, said: "Throughout the whole of January and the beginning of February we stress tested the concept of the design with Sportal to make sure that it really could do what we wanted it to do. Throughout our stress testing we tested about five different suppliers from each category. It turned out that the best products for what we wanted to do were from ArrowPoint and Cacheflow."

These two partners were then brought on board the project and helped to finalise the design plans for the network. The final network now contains 28 Cacheflow 500 servers dotted around the globe in nine cities, including London, Frankfurt, Paris and Valencia.

Nigel Hawthorn, marketing director for Cacheflow, said: "The network infrastructure has the built-in ability to push new information to all of these caches held in Europe and the US, so that they are providing information that's bang up to date. That was the challenge: to be able to make sure the content is always fresh. The whole network is centrally driven and the content is pushed out via the switches to the distributed caching servers."

Directing and monitoring traffic to these 28 caches are 10 ArrowPoint switches. ArrowPoint, which recently announced it was to be bought by Cisco, supplied a mixture of CS-100 and CS-800 switch models depending on the expected load at each site. These switches route a user's request to the nearest cache, so users always receive the fastest possible response time. This can include redirecting requests away from congested servers to less busy ones.

Mike Holden, project manager for ArrowPoint, said: "Let's say lots of users were coming from Germany while the Germany vs. England match was on, and we hit a capacity issue. If we experience a lot of demand for content going into Germany we can point users off to say, Paris, which might be lightly loaded. Also, if one of the websites goes down for any reason, our switches can detect that and automatically route users to an alternative server where the content is held."
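The routing logic Holden describes - prefer the nearest cache, spill over to a lightly loaded one when it is congested, and skip any site that is down - can be sketched as below. The function, threshold and data layout are assumptions for illustration, not ArrowPoint's actual algorithm.

```python
def route_request(caches, nearest_order, capacity_threshold=0.9):
    """Pick the first cache in the user's proximity order that is
    healthy and below the capacity threshold; otherwise fall back to
    the least-loaded healthy cache anywhere."""
    for name in nearest_order:
        cache = caches[name]
        if cache["up"] and cache["load"] < capacity_threshold:
            return name
    # Every preferred site is congested or down: take the least-loaded
    # healthy site, wherever it is.
    healthy = [n for n, c in caches.items() if c["up"]]
    if not healthy:
        raise RuntimeError("no healthy caches available")
    return min(healthy, key=lambda n: caches[n]["load"])
```

In the Germany vs. England example, a German user whose nearest site (say, Frankfurt) is at 95 per cent load would be routed to a lightly loaded Paris; if a site fails outright, it is simply excluded from selection.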

Each site has a link to both hosting centres, in Amsterdam and LA, with one assigned as the primary and the other as the secondary. Should one centre fail, the switches automatically switch to the other.

Although the network has yet to be fully tested by user demand, it is hoped that its topology will provide not only resilience but also security. At such a high-profile event, the site's designers almost expect some sort of attack.

"At an event like this people will inevitably try and do something to the site," said Holden. " With a global load sharing, attackers will find it hard to disrupt."

The main concern had been a denial of service attack, following the spate of high-profile sites taken down earlier this year, including Yahoo. However, the distributed nature of the site is expected to make it almost impossible to severely affect its speed. Should one cache come under attack, traffic would simply be routed to another. The ArrowPoint switches can also detect the junk data often used in such attacks and simply drop the connection.
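One simplified way to picture that junk-dropping behaviour is a per-source filter that rejects connections sending malformed payloads or flooding from a single address. This is a minimal sketch under those assumptions; the class name, window budget and checks are hypothetical, not the switches' real detection logic.

```python
from collections import Counter


class JunkFilter:
    """Rough sketch of per-source request filtering: drop connections
    whose payload is not a plausible HTTP request line, or whose
    source exceeds a request budget within the current window."""

    def __init__(self, max_per_window=100):
        self.max_per_window = max_per_window
        self.counts = Counter()

    def allow(self, src_ip, payload):
        self.counts[src_ip] += 1
        if self.counts[src_ip] > self.max_per_window:
            return False  # flood from a single source
        parts = payload.split(" ")
        if len(parts) != 3 or parts[0] not in ("GET", "POST", "HEAD"):
            return False  # malformed junk data
        return True

    def reset_window(self):
        # Called periodically so legitimate sources are not locked out.
        self.counts.clear()
```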

Hackers will have an even more difficult time if they actually want to get into the site to mess around with it. First, they have to get past the firewall capabilities of the switches, but even if they manage that they will not be able to get at the web servers. Since only the caches communicate with the web servers and all users have access to the various caches, the hacker will find it almost impossible to launch a direct attack on the server itself.

Of course, all of this is still prediction. The tournament is only in its infancy, but if the figures leading up to it are any indicator, the site could well pass the predicted 150 million hits per day and 40 million per hour at peak times. The full results will not be known until the competition ends on 2 July, and can really only be judged on user satisfaction. Nevertheless, those involved in creating the site and the network believe they have started something.

"This is a benchmark we have created," said PSINet's Vicky Reeves. "This is the first in the next generation of websites, in terms of the number of hits it can handle, its scalability and resilience."

All the companies involved have said that this type of network will work for almost any type of website that needs to handle high volumes of traffic. If everything goes to plan during the tournament we could well see many high profile companies adopting this type of topology as the popularity of the Internet continues to grow rapidly.

Paul Grant
