The World's Most Powerful Supercomputer Is an Absolute Beast



dynamo
16th June 2018, 18:45
https://www.bibliotecapleyades.net/imagenes_sociopol3/internet275_01_small.jpg (https://www.bibliotecapleyades.net/imagenes_sociopol3/internet275_01.jpg)
A row of Summit's server racks.
Photo: Oak Ridge National Laboratory




Behold 'Summit,' a new supercomputer capable of making 200 million billion calculations per second.



It marks the first time in five years that a machine from the United States has been ranked as the world's most powerful. The specs for this $200 million machine defy comprehension.



Built by IBM and Nvidia for the US Department of Energy's Oak Ridge National Laboratory, Summit is a 200 petaflop (https://en.wikipedia.org/wiki/FLOPS) machine, meaning it can perform 200 quadrillion calculations per second.



That's about a million times faster than a typical laptop computer.



As the New York Times (https://www.nytimes.com/2018/06/08/technology/supercomputer-china-us.html) put it, a person would require 6.3 billion years to do what Summit can do in a single second.



Or as stated by MIT Technology Review (https://www.technologyreview.com/s/611077/the-worlds-most-powerful-supercomputer-is-tailor-made-for-the-ai-era/),


"everyone on Earth would have to do a calculation every second of every day for 305 days to crunch what the new machine can do in the blink of an eye."

The machine, with its 4,608 servers, 9,216 central processing chips, and 27,648 graphics processors, weighs 340 tons.



The system is housed in a 9,250 square-foot room at Oak Ridge National Laboratory's (ORNL (https://en.wikipedia.org/wiki/Oak_Ridge_National_Laboratory)) facility in Tennessee.



To keep this machine cool, 4,000 gallons of water are pumped through the system. The 13 megawatts of energy required to power this behemoth could light up over 8,000 US homes.
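The homes comparison holds up under a quick check, assuming (my figure, not the article's) an average continuous US household draw of about 1.2 kW, i.e. roughly 10,500 kWh per year:

```python
SUMMIT_POWER_W = 13e6    # 13 megawatts, per the article
AVG_HOME_W = 1200        # assumed average continuous US household draw (~10,500 kWh/year)

homes_powered = SUMMIT_POWER_W / AVG_HOME_W
print(f"{homes_powered:.0f} homes")   # roughly 10,800 -- consistent with "over 8,000"
```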



Summit is now the world's most powerful supercomputer, and it is 60 percent faster than the previous title holder, China's Sunway TaihuLight. It's the first time since 2013 that a US-built computer has held the title, showing the US is keeping up with its main rival in this area, China.



Summit (https://en.wikipedia.org/wiki/Summit_(supercomputer)) is eight times more powerful than Titan (https://en.wikipedia.org/wiki/Titan_(supercomputer)), America's other top-ranked system.





https://www.bibliotecapleyades.net/imagenes_sociopol3/internet275_02_small.jpg (https://www.bibliotecapleyades.net/imagenes_sociopol3/internet275_02.jpg)

Photo: Oak Ridge National Laboratory





As MIT Technology Review explains, 'Summit' is the first supercomputer specifically designed for AI applications such as machine learning and neural networks.



Its thousands of AI-optimized (https://dzone.com/articles/why-hardware-matters-in-the-cognitive-enterprise) chips, produced by Nvidia and IBM, allow the machine to crunch through enormous amounts of data in search of patterns imperceptible to humans.



As noted in an Energy.gov release (https://www.energy.gov/articles/oak-ridge-national-laboratory-launches-america-s-new-top-supercomputer-science),


"Summit will enable scientific discoveries that were previously impractical or impossible."

Summit and machines like it can be used for all sorts of processor-heavy applications, such as,




designing new aircraft

climate modeling

simulating nuclear explosions

creating new materials

finding causes of disease


Indeed, its potential to help with drug discovery is huge:




Summit, for example, could be used to hunt for relationships between millions of genes and cancer.


It could also help with precision medicine, in which drugs and treatments are tailored to individual patients.


From here, we can look forward to the next generation of computers, so-called "exascale" computers capable of executing a billion billion (or one quintillion) calculations per second.



And we may not have to wait long:


The first exascale computers may arrive by the early 2020s...
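For scale: one exaflop is 10^18 calculations per second, so even the first exascale machine would be at least five times faster than Summit:

```python
EXAFLOP = 1e18           # one billion billion (one quintillion) calculations/second
SUMMIT_FLOPS = 200e15    # Summit: 200 petaflops

print(EXAFLOP / SUMMIT_FLOPS)   # 5.0 -- an exascale machine is at least 5x Summit
```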

chancy
16th June 2018, 22:35
hello everyone:
here's another fast computer; it came online in the fall of 2013:
https://www.theblaze.com/news/2013/07/01/seven-stats-to-know-about-nsas-utah-data-center-as-it-nears-completion

7 Stats to Know About NSA's Massive Utah Data Center as It Nears Completion
Jul 1, 2013 5:54 pm
Liz Klimas

Although the National Security Agency (NSA)'s Utah Data Center had a ribbon-cutting in late May, it wasn't the official opening of the massive data facility. The center, located in Bluffdale, just over 20 miles from Salt Lake City, is expected to open its doors — at least to a very select group with the appropriate security clearances — this fall.
NSA Data Center
This June 7, 2013, file photo shows a military no-trespassing sign in front of Utah's NSA Data Center in Bluffdale, Utah. The nation's new billion-dollar epicenter for fighting global cyberthreats sits just south of Salt Lake City, tucked away on a National Guard base at the foot of snow-capped mountains. The long, squat buildings span 1.5 million square feet, and are filled with super-powered computers designed to store massive amounts of information gathered secretly from phone calls and emails. (Photo: AP/Rick Bowmer, File)

But the Salt Lake Tribune has been keeping tabs on the latest developments of the data center being constructed in its back yard. Here are a few stats and facts about the upcoming facility:

Project cost: around $1.5 billion

Size: 1 million square feet. 100,000 square feet will house four data halls with data-storage servers; the remaining 900,000 square feet will serve technical and administrative support staff, which is expected to number fewer than 200 employees.


Why Utah?: Lots of water for cooling massive servers, low utility rates, an available workforce, and low potential for extreme weather-related disasters. There's also room for expansion.

Energy use: 65 megawatts of electricity continuously

Water use: up to 1.7 million gallons a day

NSA Phone Records
A detailed aerial view of the NSA's Utah Data Center in Bluffdale, Utah, Thursday, June 6, 2013. The government is secretly collecting the telephone records of millions of U.S. customers of Verizon under a top-secret court order, according to the chairwoman of the Senate Intelligence Committee. The Obama administration is defending the National Security Agency's need to collect such records, but critics are calling it a huge over-reach. (AP Photo/Rick Bowmer)

Computer capabilities: Cray XC30 supercomputers will serve the facility, running up to 1 million Intel Xeon core processors simultaneously at speeds of up to 100 petaflops. The Salt Lake Tribune described one petaflop as about one thousand trillion calculations per second. This would make the system three times faster than the world's fastest supercomputer.

Storage: Estimates range from thousands of zettabytes up to yottabytes. As a comparative example, the military has estimated that its entire network storage exceeds exabytes and possibly yottabytes.

A new National Security Agency (NSA) data center is seen June 10, 2013 in Bluffdale, Utah. The center, a large data farm that is set to open in the fall of 2013, will be the largest of several interconnected NSA data centers spread throughout the country. The NSA has come under scrutiny after two large-scale data surveillance programs were leaked to the press. (Photo: George Frey/Getty Images)

The facility has a projected completion date of September 2013.

Read more details about the project and how recent leaks about the NSA’s classified information collection program could someday relate to the new facility in the Salt Lake Tribune’s full article.

This story has been updated to correct the number of gallons speculated to be used by the Utah Data Center from 17 million gallons to 1.7 million gallons.
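The article's "three times faster" claim for the 100-petaflop Cray systems checks out roughly against the TOP500 list of the time: the then-fastest machine, China's Tianhe-2, benchmarked at about 33.9 petaflops on Linpack in June 2013 (my figure, not the article's):

```python
UTAH_PETAFLOPS = 100.0      # claimed top speed of the facility's Cray XC30 systems
TIANHE2_PETAFLOPS = 33.9    # Tianhe-2 Linpack score, TOP500 June 2013

print(UTAH_PETAFLOPS / TIANHE2_PETAFLOPS)   # roughly 2.95, i.e. about 3x
```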

Look at the progress in just 5 years. Unbelievable, the doubling of speed in that short time.

chancy