The United States has wrested back its number one position in the race to own the world’s most powerful supercomputer, a race that China had led ever since it overtook the US in 2013.
On Friday (June 8), engineers at the U.S. Department of Energy’s Oak Ridge National Laboratory (ORNL) in Tennessee took the lid off Summit – the United States’ latest pride and joy, which has now surpassed China’s Sunway TaihuLight in processing power.
“Today’s launch of the Summit supercomputer demonstrates the strength of American leadership in scientific innovation and technology development,” said Secretary of Energy Rick Perry.
“It’s going to have a profound impact in energy research, scientific discovery, economic competitiveness and national security,” he said.
At peak performance, the IBM-built Summit can do 200 petaflops, or 200 quadrillion calculations per second, making it a million times faster than the average home computer and about 60% faster than Sunway TaihuLight. So yes, China has quite some catching up to do if it wants to regain the lead.
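The article's comparisons can be sanity-checked with simple arithmetic. This sketch assumes a typical home computer delivers on the order of 200 gigaflops and that TaihuLight's peak is roughly 125 petaflops (neither figure is stated in the article itself):

```python
# Back-of-the-envelope check of the performance ratios quoted above.
summit_peak = 200e15      # 200 petaflops, in calculations per second
taihulight_peak = 125e15  # Sunway TaihuLight peak (assumed figure)
home_pc = 200e9           # rough home-computer estimate (assumed figure)

vs_home = summit_peak / home_pc                    # ratio vs. a home PC
vs_taihulight = summit_peak / taihulight_peak - 1  # fractional lead over TaihuLight

print(f"Summit is {vs_home:,.0f}x a home PC")          # 1,000,000x
print(f"and {vs_taihulight:.0%} faster than TaihuLight")  # 60%
```

Both of the article's claims (a million times a home PC, about 60% faster than TaihuLight) fall out of these assumed baselines.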
Titan – another supercomputer housed at the Oak Ridge facility – had until now held the record as the fastest of its kind in the US, but is relegated to a distant second place, as Summit is roughly eight times faster.
However, Summit will be formally declared the Numero Uno supercomputer in the world only when TOP500, the organization that maintains the official ranking of supercomputers, updates its list later this month.
Here are a few mind-boggling basics about Summit.
- Summit’s 4,608 servers are housed in a space equal to the size of two tennis courts
- It comprises more than nine thousand 22-Core IBM Power9 CPUs and over 27,000 NVIDIA Tesla V100 GPUs
- Summit’s cooling system requires 4,000 gallons of water every minute to dissipate about 13 megawatts of heat that the monster generates
- The energy required to run Summit could easily power more than eight thousand homes.
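The power and cooling figures in the list above are mutually consistent, as a quick check shows. The assumption here, not stated in the article, is that an average home draws a bit over 1 kW on a continuous basis:

```python
# Sanity check on the cooling and power figures above.
heat_mw = 13          # megawatts of heat Summit's cooling system removes
homes_powered = 8000  # homes the article says Summit's draw could power

kw_per_home = heat_mw * 1000 / homes_powered
print(f"{kw_per_home:.3f} kW per home")  # 1.625 kW, a plausible average household load
```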
While there is no denying the fact that the US has grabbed back the supercomputing initiative from China, there is much more to Summit than just being a symbol of national pride.
Summit was conceptualized with artificial intelligence as one of the primary focuses of the team behind its development.
“Summit takes accelerated computing to the next level, with more computing power, more memory, an enormous high-performance file system and fast data paths to tie it all together,” said Jeff Nichols, ORNL associate laboratory director for computing and computational sciences.
According to Nichols, this will enable researchers to get more accurate results much faster.
“Summit’s AI-optimized hardware also gives researchers an incredible platform for analyzing massive datasets and creating intelligent software to accelerate the pace of discovery,” Nichols added.
The supercomputer’s machine learning and deep learning capabilities will play a significant role in powering health research, physics, chemistry, astronomy, biology, global weather forecasts and climate modeling, to name a few domains where it is expected to excel.
Running a climate model, for example – basically a system of differential equations based on the laws of physics, chemistry and fluid motion – requires scientists to divide the planet into a 3D grid, process huge amounts of relevant data such as precipitation and wind patterns, apply the necessary equations, and finally evaluate the results.
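The grid-based approach described above can be sketched with a toy example: divide the domain into a 3D grid and repeatedly apply a local update rule derived from a differential equation. This uses simple heat diffusion as a stand-in; real climate models couple many such equations over vastly larger grids, so this is only an illustration of the pattern:

```python
import numpy as np

def diffuse_step(field, alpha=0.1):
    """One explicit finite-difference step of 3D diffusion (periodic boundaries)."""
    lap = (
        np.roll(field, 1, 0) + np.roll(field, -1, 0) +
        np.roll(field, 1, 1) + np.roll(field, -1, 1) +
        np.roll(field, 1, 2) + np.roll(field, -1, 2) -
        6 * field
    )
    return field + alpha * lap

grid = np.zeros((16, 16, 16))
grid[8, 8, 8] = 100.0       # a single "hot" cell in the middle of the grid
for _ in range(50):         # step the equation forward in time
    grid = diffuse_step(grid)
print(grid.sum())           # diffusion with periodic boundaries conserves the total
```

Each cell is updated from its six neighbors; supercomputers like Summit parallelize exactly this kind of stencil computation across millions of grid cells and many coupled variables.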
This is where Summit comes in handy, as these are not the kind of tasks that can be performed on the run-of-the-mill cloud computing services offered by internet companies, says Ian Buck, vice president and general manager of NVIDIA’s Tesla Data Center Business.
“Industry is great, and we work with them all the time,” said Rick Stevens, an associate director of the Argonne National Laboratory in Illinois. “But Google is never going to design new materials or design a safe nuclear reactor.”
Researchers at ORNL have already demonstrated that Summit can live up to its super-hype, using the mega-computer to run exascale scientific calculations analyzing genomic information.
In fact, Summit not only broke the exascale barrier but nearly doubled it, running at 1.88 exaops (1 exaop equals one billion billion calculations per second).
Summit, which is capable of even more – up to 3.3 exaops using mixed-precision calculations – is a big step toward full-scale exascale computing capabilities for research in the near future.
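The exaop figures above are easier to grasp spelled out in plain numbers, using the article's own definition of 1 exaop as one billion billion (10^18) calculations per second:

```python
EXAOP = 1e18                    # one billion billion calculations per second

genomics_run = 1.88 * EXAOP     # rate measured on the genomics analysis above
peak_mixed = 3.3 * EXAOP        # Summit's mixed-precision ceiling

print(f"{genomics_run:.3g} calculations per second")   # 1.88e+18
print(f"headroom: {peak_mixed / genomics_run:.2f}x")   # roughly 1.76x the genomics run
```

Mixed-precision arithmetic trades some numerical precision per operation for a much higher operation rate, which is why the 3.3-exaop figure exceeds Summit's 200-petaflop double-precision peak by more than an order of magnitude.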
“I am truly excited by the potential of Summit, as it moves the nation one step closer to the goal of delivering an exascale supercomputing system by 2021,” Perry said.
“Summit will empower scientists to address a wide range of new challenges, accelerate discovery, spur innovation and above all, benefit the American people,” Perry said.
Aurora, a supercomputer expected to be operational by 2021, will be able to deliver one quintillion calculations per second, five times what Summit can do.
Summit will be involved in a number of upcoming projects, including the study and analysis of supernovas to determine how elements such as gold traveled through the universe.
Other work Summit is expected to take on in the near future includes running simulations of new materials, such as superconductors, and analyzing huge amounts of health-related data to find possible cures for diseases like Alzheimer’s, cancer, coronary disease, and addiction.
“The complexity of humans as a biological system is incredible,” said ORNL computational biologist Dan Jacobson. “Summit is enabling a whole new range of science that was simply not possible before it arrived.”