From forecasting the weather to improving the earthquake resistance of architecture, supercomputers have become a vital tool in just the span of a few decades.
But while the machines' blinding calculation speeds are seen as holding the potential to create comprehensive artificial intelligence, the industry also faces down-to-earth realities, particularly the economic downturn.
Following are questions and answers regarding supercomputer technology:
What is a supercomputer?
A supercomputer is usually defined as a computing unit with a calculating capability ranging from 100 to 1,000 times that of the average personal computer.
Supercomputers are used to simulate a variety of phenomena and provide mathematical analysis that would take decades for regular computers to complete.
According to Genki Yagawa, a professor at Toyo University and former director of the Japan Society for Simulation Technology, supercomputers solve systems of linear equations much like those studied in junior high school, except that they can handle calculations with 1 million variables rather than just the familiar x and y.
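As a rough illustration of the kind of problem Yagawa describes, the same routine that solves a two-variable school exercise scales, in principle, to systems with many more unknowns. A minimal sketch in Python (NumPy is used here purely for illustration; real supercomputer codes rely on far more specialized solvers):

```python
import numpy as np

# A junior-high system: 2x + y = 5 and x - y = 1, whose solution is x = 2, y = 1.
a = np.array([[2.0, 1.0],
              [1.0, -1.0]])
b = np.array([5.0, 1.0])
xy = np.linalg.solve(a, b)  # xy -> [2., 1.]

# The same call handles thousands of unknowns on a PC; supercomputers
# tackle systems with millions. (Random, diagonally dominant matrix so
# the system is well-conditioned and solvable.)
n = 1000
big_a = np.random.rand(n, n) + n * np.eye(n)
big_b = np.random.rand(n)
x = np.linalg.solve(big_a, big_b)
```

The leap from two unknowns to a million is what turns a classroom exercise into a simulation of a turbine or an earthquake.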
What can such fast calculations provide?
The technology can be applied to nuclear power plant safety, for example, by calculating fluid dynamics of a steam turbine, analyzing fatigue fractures of boilers and evaluating maximum earthquake shock waves from structural geology in the vicinity of a plant.
Supercomputers also play a key role in breaking encrypted messages in wartime. IBM's Deep Blue supercomputer evaluated some 200 million chess positions per second to defeat world champion Garry Kasparov in 1997.
Supercomputers have also been used to study the flu virus and simulate the molecular structure of drugs best able to blunt its harm to the human body.
“Tamiflu, which is under the spotlight now, was developed using supercomputer technology and its simulation powers,” Yagawa said of the widely used flu drug.
How fast is the latest supercomputer?
Supercomputer performance is measured by floating point operations per second, or “flops.” According to the TOP500 project Web site, which compiles statistics on the fastest computers on the globe, IBM’s Roadrunner is second to none with its 1 petaflop performance (1 quadrillion calculations per second).
Roadrunner, the first computer to surpass the petaflop mark, is housed at the Los Alamos National Laboratory in New Mexico. It comprises 278 refrigerator-size racks that occupy some 468 sq. meters and are interlinked with about 100 km of fiber-optic cable, according to an IBM press release. Roadrunner is used for research in various fields, including nuclear safety and studies of genomes, astronomy and climate change.
Japan’s most famous supercomputer, Earth Simulator, achieved 35 teraflops (35 trillion calculations per second) in 2002, becoming the world’s fastest unit at the time. The simulator has since been upgraded to handle 131 teraflops. The Japan Agency for Marine Earth Science and Technology uses it to make global warming projections and other meteorological calculations.
Supercomputers were first built around the late 1970s and boasted speeds of about 100 megaflops. Nowadays, a high-spec personal computer can easily reach 10 gigaflops.
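The prefixes in these figures span enormous gaps, which a quick back-of-the-envelope sketch makes concrete (the speeds are the rounded figures quoted in this article, so the ratios are only illustrative):

```python
# Units of computing speed, in floating point operations per second.
MEGA, GIGA, TERA, PETA = 1e6, 1e9, 1e12, 1e15

roadrunner = 1 * PETA        # IBM Roadrunner, 2008
earth_simulator = 35 * TERA  # Earth Simulator at its 2002 debut
pc = 10 * GIGA               # a high-spec personal computer

# One second of Roadrunner's work would keep the PC busy for roughly 28 hours.
seconds_on_pc = roadrunner / pc
print(seconds_on_pc / 3600)

# Roadrunner vs. the original Earth Simulator: roughly 29 times faster.
print(roadrunner / earth_simulator)
```

By the same arithmetic, Roadrunner is some 10 million times faster than the late-1970s machines that ran at about 100 megaflops.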
Experts working with the education ministry said in 2005 that due to the extreme speed of technological advancement, it takes only about two years for a state-of-the-art supercomputer to become outdated.
Is Japan a leading power in the supercomputer industry?
Japan was a major contributor in the field beginning in the 1980s, with NEC Corp., Fujitsu Ltd. and Hitachi Ltd. outdoing their American counterparts. Earth Simulator's staggering speed left its U.S. rivals far behind and ignited a Sputnik-like shock in the U.S. that was dubbed the "computenik" incident.
But those days are long gone.
Earth Simulator ranks 73rd on the TOP500 Web site list as of November, and only 17 of the 500 fastest computers are Japanese. Japan now treads well behind the U.S., which has 290 computers on the 500 list. The United Kingdom has 46, France 26 and Germany 25.
Why did Japan fall behind?
The reasons include the falling price of computing technology, which has made it easier for smaller companies to join the supercomputing race.
Back in the 1980s only a handful of companies could afford the extensive investment, but that is no longer the case.
The decline has also been attributed to Japan’s economic slide.
NEC revealed last month that it will withdraw from a government program to build the next supercomputer, citing lack of funding.
Another possible factor is that Japan’s supercomputing industry does not engage in defense contracting, unlike in the West, where firms involved in military procurements are key investors in the development of simulation technology.
What is the latest development in Japan?
The Next-Generation Supercomputer Project, run by the Ministry of Education, Culture, Sports, Science and Technology, is scheduled to start running a unit with a 10-petaflop capability within fiscal 2010. The government has so far invested ¥115 billion in the project.
But Tomoyoshi Ito, a Chiba University professor of computer engineering who specializes in computer-generated holography, questions the project and its mounting costs.
“The objective shouldn’t be to build the world’s fastest computer,” he said, comparing the supercomputer to a huge mansion that fails to make its occupants comfortable.
Ito, who was involved in creating a supercomputer for gravitational computations in 1989 for a mere ¥200,000, said supercomputers should be more geared toward aiding researchers than just making fast calculations.
Will supercomputers be capable of emulating the human brain?
With Roadrunner breaking the petaflop mark, the next big achievement in sight is an exaflop computer capable of 1 quintillion calculations per second. That goal could be reached within a decade if development continues at its current pace.
Although experts are divided on how fast the human brain functions, it seems just a matter of time before computers can emulate the brain in terms of speed.
Toyo University’s Yagawa predicted it would take decades for computers to simulate human intelligence. But there could also be ethical issues even if technology reaches that level, he said, and supercomputer development will likely face restrictions similar to those on DNA cloning.
Chiba University’s Ito said speed alone won’t pave the way for a complete simulation of the brain, because human decision-making operates on a different algorithmic level.
“Computers are accurate in doing what they are programmed to do. But a human brain has the element of inspiration and flashes of insight,” he said.
While improved calculation speed and the faster flow of information are helpful for researchers, Ito acknowledged there is a frightening dimension to computer advancement, even if it doesn’t involve artificial intelligence becoming independent and, say, a supercomputer taking it upon itself to launch a nuclear war.
“Information travels so fast today and influences the entire world in an instant, as seen in the spread of the recent economic downturn. And I also feel that quick abundance of the same information everywhere has created a homogenous world,” he said.