Singapore researchers will get priority access to the supercomputing resources at the National Supercomputing Centre (NSCC) to aid their work on any effort related to the Covid-19 pandemic.
NSCC said this will allow researchers who are involved in Covid-19-related research work to have “fast-tracked” access to supercomputing resources, without having to wait for the biannual call for projects.
It added that this special call is open to all NSCC stakeholders, as well as non-stakeholders, as part of a national effort to deploy advanced high-performance computing (HPC) resources to accelerate Covid-19 research in mitigating the effects of the coronavirus pandemic.
These resources include the Aspire 1 petascale supercomputer with 1,288 CPU nodes and 128 accelerator nodes equipped with Nvidia K40 graphics processing units (GPUs), an artificial intelligence (AI) system built on Nvidia DGX-1 GPU systems, and 13PB of high-performance storage.
NSCC chief executive Tan Tin Wee said speed is one of the key elements that have been cited by global experts as a leading factor in helping to resolve the pandemic.
“Supercomputers can play a crucial role in speeding up research whether it is in detecting the virus, tracing the infected, studying mutations in the coronavirus genome, building faster test kits or developing vaccines for the virus,” he added.
Peter Ho, chairman of the NSCC steering committee, said the Singapore government is treating the Covid-19 situation not as a medical crisis but as a national crisis.
“This means that all resources of government and of the nation can and should be deployed in support of managing the outbreak,” he added. “Hence, it is appropriate for NSCC to deploy its HPC resources in support, and also taking priority over other commitments where reasonable and necessary.”
Besides Singapore’s NSCC, the Partnership for Advanced Computing in Europe (PRACE) is also speeding up approval of proposals for Covid-19-related research projects. These could include biomolecular research to understand infection mechanisms, bioinformatics research to understand mutations, and epidemiological analyses to forecast the spread of the disease.
In the US, a new consortium was formed by the White House Office of Science and Technology Policy, the US Department of Energy, and IBM to bring together federal government, industry and academic leaders who are volunteering compute time and resources on their HPC systems.
One of these machines is the Frontera supercomputer at the Texas Advanced Computing Center at the University of Texas at Austin. It is already being tapped by researchers to develop a full computer model of the virus that can be used to develop new drugs and vaccines.
Read more about HPC in APAC
- Singapore’s national AI programme is getting its own high-performance computing infrastructure to support more compute-intensive workloads.
- The Gadi supercomputer at the Australian National University will run 10 times faster than its predecessor, giving researchers access to high-performance computing resources to solve the toughest research problems.
- The University of Sydney has upgraded its supercomputing infrastructure to answer big questions on cosmology and keep pace with growing research needs.
- A Huawei executive makes the case for HPC infrastructure, despite the growth of public cloud services that have democratised access to HPC and AI capabilities.