Reports suggest that Graphics Processing Units (GPUs) made by Nvidia are helping scientists advance their HIV/AIDS research.
Researchers in Barcelona have simulated the crucial initial step of the HIV maturation process, the step that starts the infectious phase of the virus.
With this new visibility into the process, biotech researchers can potentially design antiretroviral drugs that halt HIV maturation before the virus becomes infectious.
The Barcelona researchers achieved this breakthrough by harnessing thousands of GPU accelerators on a distributed network of individual computers (not unlike high-end PC gaming rigs), which allowed them to utilise processing power typically available only on dedicated multi-million-dollar supercomputing setups.
Using GPUGrid.net – a volunteer distributed-computing effort that uses spare time on the GPUs of thousands of volunteers – Spanish researchers made an important breakthrough in the quest to better understand HIV.
The story is detailed in a blog post from GPU specialist Nvidia:
It turns out the HIV protease acts like a pair of scissors. These "scissor proteins" cut the long chain of connected proteins that form HIV into individual proteins, which assemble into virions that carry viral genomes from one cell to another. Using GPU-accelerated software called ACEMD, researchers showed how the first HIV "scissor proteins" can cut themselves out of the middle of these poly-protein chains, beginning the infectious phase of HIV.
With this tremendous computing power at their disposal, the researchers were able to run thousands of complex computer simulations of HIV protease, each lasting hundreds of nanoseconds, for a total of almost a millisecond of simulated time. That gives them high confidence that their simulations represent real-world behaviour.
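The aggregation above is simple arithmetic: many short runs add up to a long total simulated time. A minimal sketch, using assumed round numbers (the article says only "thousands" of runs of "hundreds of nanoseconds" each):

```python
# Back-of-the-envelope check that thousands of short runs sum to
# roughly a millisecond. The run count and per-run length below are
# illustrative assumptions, not figures from the study.
n_runs = 5000          # assumed number of independent simulations
ns_per_run = 200       # assumed length of each run, in nanoseconds

total_ns = n_runs * ns_per_run
total_ms = total_ns / 1_000_000  # 1 millisecond = 1,000,000 nanoseconds

print(f"Aggregate simulated time: {total_ms} ms")
```

With these assumed values the aggregate comes to 1.0 ms, consistent with the "almost a millisecond" figure quoted above.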
Simulations of this length and complexity would have been infeasible on a computing system based on CPUs alone.
These findings have been published in the latest edition of Proceedings of the National Academy of Sciences of the United States of America (PNAS).