Supercomputers getting more super
Computer scientists Henry Tufo, left, and Richard Loft show off their world record-holding supercomputer, an IBM Blue Gene/L, at the University Corp. for Atmospheric Research in Boulder, Colo.

WASHINGTON — The federal government is pushing computer scientists and engineers to greatly step up the speed and capacity of America's supercomputers.

Officials say much faster performance is needed to handle a looming tidal wave of scientific, technical and military data.

Powerful new telescopes, atom-smashers, climate satellites, gene analyzers and a host of other advanced instruments are churning out enormous volumes of data that will overwhelm even the swiftest existing machines.

In the next five years, the government's goal is a computer system that can process at least a quadrillion (a million times a billion) arithmetic operations per second. The best current machines operate in the trillions (a thousand times a billion) of calculations per second.

"Within the next five to 10 years, computers 1,000 times faster than today's computers will become available. These advances herald a new era in scientific computing," according to Raymond Orbach, undersecretary for science at the Department of Energy.

A quadrillion-rated computer, known technically as a "petascale" system, will be more than three times faster than today's top supercomputer, IBM's Blue Gene/L, which holds the world's record at 280 trillion operations per second.

"Peta" is the prefix for a quadrillion in the metric system. "Tera" stands for a trillion, so Blue Gene is a terascale system.

On a more familiar level, a petascale computer will be at least 75 times faster than the most powerful game machine, such as Microsoft's Xbox 360, and 100 times faster than a top-of-the-line desktop personal computer, such as the Apple Power Mac.

Last Tuesday, RIKEN, a Japanese research agency, announced that it had built a computer system that theoretically can perform 1 quadrillion operations per second. If so, this would be the world's first true petascale computer.

Henry Tufo, a computer scientist at the University of Colorado, Boulder, who operates a Blue Gene/L system, said it would take petascale computer power to solve problems that stump present-day systems.

"One of the most compelling and challenging intellectual frontiers facing humankind is the comprehensive and predictive understanding of Earth and its biological components," Tufo said in an e-mail message. "Petascale systems will open up new vistas (for) scientists."

To meet this goal, the National Science Foundation asked researchers on June 6 to submit proposals to develop the infrastructure for a petascale computing system to be ready by 2010.

As examples of difficult questions that only a petascale system could handle, the NSF listed:

  • The three-dimensional structure of the trillions of proteins that make up a living organism. Proteins are the basic building blocks of all living things.
  • The ever-changing interactions among the land, ocean and atmosphere that control the Earth's maddeningly complex weather and climate systems.
  • The formation and evolution of stars, galaxies and the universe itself.

The Department of Energy also is offering $70 million in grants for teams of computer scientists and engineers to develop petascale software and data-management tools.

"The scientific problems are there to be solved, and petascale computers are on the horizon," said Walter Polansky, senior technical adviser in the DOE'S Office of Advanced Scientific Computing.

For example, the Energy Department wants ultra-fast computers in order to determine the 3-D structure of molecules that let drugs pass through cell walls, knowledge that can be vital against cancer.

"This is completely new," Orbach wrote in the current issue of Scientific Discovery through Advanced Computing, a DOE publication. "No one has ever probed that region of science before."

The Energy Department also needs petascale computing to help solve problems that are blocking the development of nuclear fusion, an unlimited, nonpolluting energy source that's baffled designers for decades.

The DOE and NASA, the space agency, are collaborating in an effort to determine the nature of the dark energy and dark matter that are thought to make up 95 percent of the universe. Petascale computer power will be needed here, too.