Fifty years ago, Intel co-founder Gordon Moore outlined his vision of how microelectronics would power the modern world, introducing Moore's Law: his observation that the number of transistors on a chip would double roughly every two years, bringing a corresponding rise in computing power. Today, Intel's business model is under threat.
The demise of the PC as the computing device of choice for consumers and businesses has meant that the biggest share of Intel's revenue comes from its server division.
But with datacentre consolidation, virtualisation and a shift to cloud-based computing there is likely to be less demand for mass-market x86-powered commodity PC servers.
Moore’s Law drives industry trends
In a video message on the Intel website commemorating the publication of his prediction in the 19 April 1965 edition of Electronics magazine, Moore said: "The message I was trying to get across was that integrated circuits were the road to less expensive electronics. It really evolved from being a measure of what goes on in the industry to something that more or less drives the industry."
This drive has taken the industry from the era of pocket calculators to modern smartphones that have the equivalent computing power of Deep Blue, the $100m machine that beat chess champion Garry Kasparov in 1997.
Intel innovation strategist Steve Brown said: "Transistors dramatically decrease in cost at an exponential pace. This gives us amazing developments in technology and the economy."
Today's transistors run 90,000 times more efficiently and are 60,000 times cheaper to manufacture than those in Intel's first processor, the 4004. But the laws of physics have worked against hardware companies. Chipmakers once talked about processor clock rates and frequency; today performance is measured in terms of parallelism and vectorisation, said Rajeeb Hazra, vice-president of Intel's technical computing group.
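The shift from clock speed to parallelism and vectorisation can be illustrated with a minimal sketch (Python and NumPy are chosen here for illustration; the article does not name any particular tools). The scalar loop below does one multiply-add per iteration, while the vectorised version hands the whole array operation to a library that can map it onto SIMD hardware:

```python
import numpy as np

# Scalar version: one multiply-add per loop iteration,
# performance bounded by how fast a single core can step through it.
def saxpy_scalar(a, x, y):
    out = [0.0] * len(x)
    for i in range(len(x)):
        out[i] = a * x[i] + y[i]
    return out

# Vectorised version: NumPy dispatches the whole operation to
# optimised native code that can process many elements per instruction.
def saxpy_vector(a, x, y):
    return a * x + y

x = np.arange(4, dtype=np.float64)
y = np.ones(4)
print(saxpy_vector(2.0, x, y))  # [1. 3. 5. 7.]
```

Both compute the same result; the difference is that the vectorised form exposes the data-parallel structure of the work to the hardware, which is the kind of performance Hazra is describing.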
Speaking at the launch of Lenovo's new Intel-powered high-performance computing (HPC) site in Stuttgart, Hazra discussed how software could not take full advantage of the theoretical performance of the hardware. In many ways HPC pushes the limits of the hardware and the software.
This is a market Intel will try to grow, by delivering supercomputing to a wider market, in its relentless pursuit of fulfilling Moore's Law.
Simplifying parallel programming
Intel dominates the Top500 Supercomputing sites list. The Tianhe-2 (Milkyway-2) is a 3,120,000-core machine at the National Supercomputer Centre in Guangzhou, China. It is the world’s fastest supercomputer, capable of delivering 33.86 petaflops.
Hazra said: "Hardware outpaces the ability of software to keep up – so we need to optimise software. This is critical to the success of HPC and to make Intel's products relevant."
Traditionally, HPC has been associated with research institutes that have the funding to buy multimillion-pound supercomputers and the expertise to use them to solve computationally challenging problems. Hazra believes that, going forward, HPC will be much more relevant than it is today.
"Modelling, simulation and data analytics are fundamental to the digital economy," he said. Intel's strategy is to make such applications available to SMEs and others who have not used HPC before. But he admits this will not be easy: "The methods we use for HPC have to be adapted."
For instance, developing software for highly parallel HPC environments is not easy. Hazra said: "We are not turning everyone into a PhD programmer." His ambition is to continue driving hardware innovation. But Intel has recognised this needs to be supplemented by software. Hazra predicted that programmers would be able to use off-the-shelf software libraries to enable them to build HPC applications more easily.
Open source and HPC
In fact, analyst Gartner noted in its High-Performance Computing Delivering Value to All Participants report: "As the x86 architecture available from Intel and AMD has penetrated deeper within the large-scale HPC environment, the entire HPC market has been driven to much higher levels of parallelism. As the number of servers, processors and software tasks grows to the tens or hundreds of thousands, the management aspects of these systems have become more complex and the burden has fallen on to the software developers."
Hazra said: "It is critical to get the open source community up and running." An open source marketplace of HPC software libraries could lower the barrier to adopting HPC technologies. In Hazra's opinion, this could lead to the democratisation of HPC: organisations would harness the power of such technology by buying converged systems, HPC appliances or turnkey systems with a set of APIs that allow programmers who are not versed in the nuances of parallel programming to create HPC-based business applications.
If the trend continues, Moore's Law will drive down the cost of computing to the point where processing power on the scale of the Tianhe-2 supercomputer is available to any business.
The field of bioinformatics is an example of where HPC is combined with big data analysis. Using the vast amount of processing power that is now on tap, bioinformatics gives researchers the ability to analyse the human genome, which could eventually lead to medicines that uniquely target a condition based on the genetic makeup of the patient.
Intel's Steve Brown said: "Pharmaceutical companies are using powerful supercomputers for the research which will one day provide a cure for cancer."
But just as Intel needs to make programming supercomputers easier, the exponential rise in the power of microelectronics predicted by Moore's Law may be held back by forces beyond the control of the IT industry. Jurgi Camblong, CEO of bioinformatics firm Sophia Genetics, said: "In the field of bioinformatics, Moore's Law proves to be one-dimensional. Of course the increase in the rate of processing speed is essential, and has had an impact on the speed and cost of Next Generation Sequencing. However, the most important factor to consider in clinical genomics is accuracy, and that has not had the same exponential rise as speed."