SDSC Triples Speed of BlueGene Data Supercomputer

Like movie makers who dazzle audiences with virtual imagery created with supercomputers, scientists rely on massive computational power to build digital worlds that simulate major earthquakes to help design better buildings, predict the path of plumes from smokestacks, and even recreate the birth of stars.

And just like in the movies, bigger supercomputers help scientists move virtual worlds closer to reality.

Toward that end, the San Diego Supercomputer Center (SDSC) at UC San Diego announced today that its IBM eServer BlueGene supercomputer has been tripled in size, giving a peak performance of 17.2 Teraflops (trillion calculations per second). It would take a person operating a hand-held calculator more than 500,000 years to do the calculations this supercomputer completes every second.
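As a quick back-of-envelope check on that claim, here is a sketch in Python, assuming the person manages one calculation per second on the calculator:

```python
# Back-of-envelope check: how long would one second of the machine's
# work take a person doing one calculation per second?
peak_calcs_per_second = 17.2e12   # 17.2 Teraflops, from the announcement
by_hand_rate = 1                  # assumption: one calculation per second
seconds_per_year = 365.25 * 24 * 3600

years = peak_calcs_per_second / by_hand_rate / seconds_per_year
print(f"{years:,.0f} years")      # ~545,000 years, i.e. "more than 500,000"
```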

SDSC was the first academic institution to deploy a BlueGene supercomputer. Since then, the BlueGene architecture has been adopted more widely: a BlueGene system is currently the fastest supercomputer in the world, and BlueGene machines hold the first and third spots on the Top500 list of the world’s fastest supercomputers, compiled by researchers at the University of Mannheim, the University of Tennessee, and the Department of Energy.

“As more scientists have used SDSC’s BlueGene Data system, we’ve had increasing demand for time on it, which led us to expand it,” said Richard Moore, Director of Production Systems at SDSC. “With the expansion, we will have tripled the capacity, and scientists will routinely be able to get access to up to 6,144 processors with excellent overall performance, a rare opportunity that will enable new scientific breakthroughs.”

SDSC’s BlueGene Data system has proven well-suited to applications in a range of disciplines, from physics and molecular dynamics to fluid dynamics and other fields. For example, scientists in a major particle physics collaboration are probing the ultimate building blocks of matter using the newly expanded machine. Scientist David Baker of the University of Washington can run his Rosetta protein structure prediction code on the BlueGene system to design proteins more complex than previously possible, opening the way for new life-saving drugs. Researcher P. K. Yeung of Georgia Tech runs simulations of how substances mix in the chaos of turbulent flows, yielding important insights for such engineering applications as improving combustion efficiency. And Southern California Earthquake Center (SCEC) scientists create virtual earthquakes to guide preparations for the “big one” that threatens California.

“The most powerful supercomputers in the world, Blue Gene systems like the one at SDSC are at the forefront of enabling the next generation of computational science and engineering,” said Dave Jursik, vice president of Deep Computing sales at IBM. “The architecture, optimized for bandwidth, scalability, and handling large amounts of data, supports the most advanced applications while consuming only a fraction of the power and floor space required by other systems.”

SDSC’s powerful IBM eServer BlueGene system is housed in just three computer racks. Each rack holds 1,024 compute nodes and 128 I/O nodes, the maximum ratio of I/O to compute nodes the architecture supports, which is needed for data-intensive computing. Each node contains two PowerPC processors running at 700 megahertz that share 512 megabytes of memory, giving the system an aggregate peak speed of 17.2 Teraflops.
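Those figures are consistent with the quoted peak. A minimal sketch of the arithmetic, assuming BlueGene’s usual accounting of four floating-point operations per processor per clock cycle (two fused multiply-adds on the dual floating-point pipeline):

```python
# Peak-speed arithmetic for the expanded three-rack system.
racks = 3
nodes_per_rack = 1024
processors_per_node = 2
clock_hz = 700e6
flops_per_cycle = 4   # assumption: dual FPU, 2 flops per fused multiply-add

peak = racks * nodes_per_rack * processors_per_node * clock_hz * flops_per_cycle
print(f"{peak / 1e12:.1f} Teraflops")   # 17.2 Teraflops
```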

BlueGene’s efficiencies in power consumption, cooling, and space requirements are vital for institutions hosting large computing resources. As the person responsible for managing SDSC’s supercomputers, Moore is clearly enthusiastic about this efficiency.

“We’re very pleased at how cost-effective this upgrade is,” said Moore. “We’re adding more than 11 Teraflops of computing power for scientists and engineers, with very little incremental system administration time or operations costs.

“It’s also impressive that we were able to get the entire system into full production less than a week after the new racks were delivered. IBM has produced an easily extensible machine, and both IBM’s personnel and our staff deserve a lot of credit for making this happen.”

SDSC’s BlueGene Data machine is also playing an important role in the march toward “petascale” supercomputers, systems that can run at the blinding speed of one thousand trillion calculations per second, hundreds of thousands of times faster than a typical PC. SDSC staff have worked with users to scale three important science codes to run on up to 40,960 processors of the largest open system in the world, IBM’s BlueGene Watson system, which has a peak speed of 114 Teraflops. The Rosetta protein structure prediction code, P. K. Yeung’s turbulence simulations, and the SCEC earthquake simulations have all achieved unprecedented scaling, setting the stage for more accurate “virtual realities” that can illuminate the next generation of scientific progress.
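The Watson figure lines up with the per-processor rate implied by the SDSC configuration above; a rough cross-check, assuming the same 2.8 Gflops per processor (700 MHz at 4 flops per cycle):

```python
# Cross-check of the BlueGene Watson numbers using the per-processor
# peak implied above (700 MHz x 4 flops/cycle = 2.8 Gflops).
processors = 40_960
gflops_per_processor = 2.8   # assumption carried over from the 17.2 TF math

peak_tf = processors * gflops_per_processor / 1000
print(f"{peak_tf:.1f} Teraflops")   # ~114.7, matching the quoted 114 peak
```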

About SDSC

For more than two decades, the San Diego Supercomputer Center (SDSC) has enabled breakthrough data-driven and computational science and engineering discoveries through the innovation and provision of information infrastructure, technologies, and interdisciplinary expertise. A key resource for academia and industry, SDSC is an international leader in Data Cyberinfrastructure and computational science, and serves as a national repository for nearly 100 public and private data collections. SDSC is an Organized Research Unit and an integral part of the University of California, San Diego, and one of the founding sites of NSF’s TeraGrid.
