100Gbps network powers petabyte research cloud at Australia’s Monash University

Australian university implements powerful network to support data-hungry research cloud

Australia's largest university is to implement a 100Gbps network backbone that will drive a new generation of academic understanding.

Pressure not only to manage an exploding volume of research-related data, but also to make it available to researchers for real-time collaboration, has driven the development.

Monash University – a multidisciplinary institution with 55,000 students and 8,100 academic staff located in Melbourne, Australia's second most populous city – has long worked to keep up with the deluge of research data being generated by more than 100 research centres, 17 co-operative research centres, and major facilities such as the Australian Stem Cell Centre and Australian Synchrotron.

The combination of demand for big data and high-performance computing (HPC) systems – which Steve Quenette, deputy director of the Monash eResearch Centre, calls “big compute” – had pushed the technology team to “give researchers an environment to find things they haven't seen before, and to do it much faster”.

This approach reflects a growing shift in the role of computing capacity: from something primarily used to process data after it has been collected, to an intrinsic part of the research process from the data-collection phase onwards.

“Where all the leading and interesting research is occurring is where we have both big data and big compute,” he said. He added that new instruments can generate 500TB of data in a single run. “That makes for a really interesting ICT environment, where we need to converge HPC and data infrastructure, and do it in such a way that it is really permeable.”
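For a sense of scale, a back-of-the-envelope calculation – purely illustrative, and not a description of Monash's actual tooling – shows why backbone capacity dominates at these volumes:

```python
# Illustrative arithmetic only: how long a 500TB instrument run takes to move
# across links of different speeds, ignoring protocol overhead and contention.

DATASET_TB = 500                      # one instrument run, per Quenette
DATASET_BITS = DATASET_TB * 1e12 * 8  # terabytes -> bits

for link_gbps in (10, 25, 50, 100):
    seconds = DATASET_BITS / (link_gbps * 1e9)
    print(f"{link_gbps:>3} Gbps link: {seconds / 3600:6.1f} hours")

# 10 Gbps:  ~111 hours (more than four days)
# 100 Gbps:  ~11 hours (an overnight transfer)
```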

That permeability meant the Monash IT team designed not just a large data repository – it already had one, with some five petabytes of raw data under management – but an “immensely low latency” environment that would enable researchers and academics to “create big-data clouds of information”.

With a range of 25Gbps and 50Gbps connections already linking some of the research facilities, it was clear that further development of the environment would require even bigger backbone capacity. Evaluation of available options eventually led the university to settle on Mellanox's CloudX platform, which combines that company's Spectrum SN2700 Open Ethernet switches, ConnectX-4 NICs and LinkX cables.

The CloudX platform, which delivers 100Gbps of switched bandwidth, has significantly increased available bandwidth to compute nodes – as well as to the group's SAN – and has also reduced the potential impact of network congestion from competing high-bandwidth applications.

Introducing 100Gbps capacity across the network – bandwidth normally associated with telcos such as Australia's Optus, which recently built out transnational 100Gbps backhaul links – drew on many of the things that the team had learned from its work in HPC environments.

“The interconnect inside the HPC systems has long been at the forefront of the network,” Quenette said. “When you're dealing with hundreds and thousands of [CPU] cores, it needs to be immensely low latency. You can't tolerate message failures. And when you start to bring that across the entire network fabric, you need to do the same for data as well.”

Based on the open-source OpenStack cloud infrastructure, the new environment is almost all software defined and, unlike previous dedicated HPC networks, is broadly available across a range of network topologies.
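As a rough illustration of what “software defined” means in an OpenStack environment like this – a minimal sketch using the public openstacksdk library, where the cloud entry, network name and address range are invented placeholders rather than Monash's actual configuration:

```python
# Minimal sketch: provisioning a research network in software via openstacksdk.
# Cloud name, network name and CIDR are hypothetical placeholders.
import openstack

conn = openstack.connect(cloud="monash-research")  # hypothetical cloud entry

# Networks, subnets and routing become API objects, not physical rewiring.
net = conn.network.create_network(name="genomics-project")
subnet = conn.network.create_subnet(
    network_id=net.id,
    ip_version=4,
    cidr="10.42.0.0/24",             # placeholder address range
    name="genomics-project-subnet",
)
print(f"Created {net.name} ({subnet.cidr}) without touching a switch")
```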

“The fabric is heterogeneous on purpose,” Quenette said. “We can have some of it, which is really high performance, and some which isn't. We shape it as needed, and we can pragmatically push [bandwidth] into our environment.”
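One mechanism that maps onto this kind of shaping – an assumption about tooling, as the article does not name the specific feature Monash uses – is OpenStack Neutron's QoS policies. A hedged sketch, reusing the hypothetical network from the previous example:

```python
# Sketch: capping bandwidth for a lower-priority traffic class with a Neutron
# QoS policy. Policy name and limits are illustrative assumptions.
import openstack

conn = openstack.connect(cloud="monash-research")  # hypothetical cloud entry

policy = conn.network.create_qos_policy(name="bulk-transfer-cap")
conn.network.create_qos_bandwidth_limit_rule(
    policy,
    max_kbps=10_000_000,        # ~10 Gbps ceiling for this traffic class
    max_burst_kbps=1_000_000,
)

# Attaching the policy to one network leaves latency-sensitive HPC traffic
# with the headroom of the 100Gbps fabric.
net = conn.network.find_network("genomics-project")
conn.network.update_network(net, qos_policy_id=policy.id)
```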

This approach not only accommodates expanded data-processing needs, but also suits evolving research models in which researchers increasingly treat big-data services as a utility that enables new kinds of collaboration.

“Every discipline is pooling data together, and people are trading information and creating big-data clouds,” said Quenette. “These data sets are too big to fit on a laptop, and even if you could do it, nobody can share it with you on a laptop. But if we bring it to our internal cloud, all of it is intrinsically accessible.”

Researchers “need to be able to extract the bits they want out of that, and be able to run filters, do analysis and play with the knobs,” he continued. “Researchers can work with other researchers with very little pain.”
