Gaming supercomputers tackle HIV/AIDS research

By Adrian Bridgwater | December 11, 2012

Reports suggest that Graphics Processing Units (GPUs) made by Nvidia are helping scientists advance HIV/AIDS research.

Researchers in Barcelona have simulated the crucial first step in the HIV maturation process, the step that starts the infectious phase of HIV.

By providing new visibility into this process, the work could help biotech researchers design new antiretroviral drugs that halt HIV maturation before the virus becomes infectious.

The Barcelona researchers achieved this breakthrough by harnessing the power of thousands of GPU accelerators on a distributed network of individual computers - not unlike high-end PC gaming rigs - which gave them supercomputing power typically available only on dedicated multi-million-dollar setups.

Using GPUGrid.net - a volunteer distributed-computing project that uses spare time on the GPUs of thousands of volunteers' machines - the Spanish researchers made an important breakthrough in the quest to better understand the HIV virus.
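To give a flavour of how such volunteer distributed computing works: a project server splits long simulations into short, independent work units and hands them out to whichever volunteer GPUs are free. The sketch below is purely illustrative Python - the names, numbers and scheduling are assumptions, not GPUGrid.net's actual protocol (which, like BOINC-based projects generally, also handles validation, redundancy and credit).

```python
from dataclasses import dataclass
from queue import Queue

# Purely illustrative sketch of BOINC-style work-unit distribution;
# GPUGrid.net's real server/client protocol is more involved.

@dataclass
class WorkUnit:
    sim_id: int       # which independent simulation this fragment belongs to
    start_ns: float   # where in that trajectory the fragment starts
    length_ns: float  # simulated time the volunteer GPU must compute

def make_work_units(n_sims: int, sim_ns: float, fragment_ns: float) -> Queue:
    """Split n_sims independent simulations into fragments short enough
    for a volunteer's gaming GPU to finish in hours, not weeks."""
    pending = Queue()
    for sim_id in range(n_sims):
        start = 0.0
        while start < sim_ns:
            pending.put(WorkUnit(sim_id, start, min(fragment_ns, sim_ns - start)))
            start += fragment_ns
    return pending

# e.g. 2,000 simulations x 500 ns each, cut into 10 ns fragments
pending = make_work_units(n_sims=2000, sim_ns=500.0, fragment_ns=10.0)
print(pending.qsize(), "work units ready to distribute")  # 100000
```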

The story is detailed in a blog post from GPU specialist Nvidia, excerpted below:

It turns out that HIV protease acts like a pair of scissors. These "scissor proteins" cut the long chain of connected proteins that form HIV into individual proteins. The cut proteins then assemble into infectious virus particles, or virions, which carry viral genomes from one cell to another. Using GPU-accelerated software called ACEMD, researchers showed how the first HIV "scissor proteins" can cut themselves out from the middle of these poly-protein chains, beginning the infectious phase of HIV.
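For readers curious what "GPU-accelerated" means here: the heart of any molecular dynamics engine is an integration loop that repeatedly evaluates inter-atomic forces and advances atom positions by a tiny timestep. The sketch below is a generic velocity-Verlet integrator in plain Python with a toy force field - it is not ACEMD's API, which implements this loop in highly optimised CUDA; the force evaluation is the part that maps naturally onto thousands of GPU cores.

```python
import numpy as np

# Generic velocity-Verlet integrator - the kind of inner loop an MD engine
# such as ACEMD runs millions of times per trajectory. Everything here is
# illustrative: the toy harmonic force stands in for a real protein
# force field.

def forces(positions: np.ndarray, k: float = 1.0) -> np.ndarray:
    """Toy force field: every atom on a harmonic spring to the origin."""
    return -k * positions

def velocity_verlet(positions, velocities, masses, dt, n_steps):
    f = forces(positions)
    for _ in range(n_steps):
        velocities += 0.5 * dt * f / masses[:, None]  # half-kick
        positions += dt * velocities                  # drift
        f = forces(positions)                         # recompute forces
        velocities += 0.5 * dt * f / masses[:, None]  # half-kick
    return positions, velocities

# 1,000 "atoms" in 3D, advanced 1,000 steps (arbitrary units)
rng = np.random.default_rng(0)
pos = rng.standard_normal((1000, 3))
vel = np.zeros((1000, 3))
mass = np.ones(1000)
pos, vel = velocity_verlet(pos, vel, mass, dt=0.001, n_steps=1000)
```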

With this tremendous computing power at their disposal, the researchers were able to run thousands of complex computer simulations of HIV protease, each covering hundreds of nanoseconds, for a total of almost a millisecond of simulated time. That gives a high probability that the simulations represent real-world behaviour.

Simulations of this length and complexity would have been infeasible on a computing system based on CPUs alone.
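As a back-of-the-envelope check on those figures (the counts below are illustrative assumptions; the exact numbers are in the paper):

```python
# Illustrative check: "thousands" of simulations of "hundreds of
# nanoseconds" each really do add up to about a millisecond.
n_simulations = 2000        # assumed count
ns_per_simulation = 500     # assumed length of each run, in nanoseconds

total_ns = n_simulations * ns_per_simulation
total_ms = total_ns / 1e6   # 1 millisecond = 1,000,000 nanoseconds
print(f"{total_ns:,} ns simulated = {total_ms} ms")  # 1,000,000 ns = 1.0 ms
```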

These findings have been published in the latest edition of Proceedings of the National Academy of Sciences of the United States of America (PNAS).
