Supercomputer

What is a supercomputer? Quoting Wikipedia, a supercomputer is a computer designed for extremely fast processing, particularly in calculation speed and data throughput. So who manufactures supercomputers? What are they used for? How do they differ from an ordinary server? All of this will be reviewed here.

The XT5 Jaguar, a supercomputer built by Cray Inc. and housed at Oak Ridge National Laboratory in Tennessee, tops the TOP500 list of the world's fastest supercomputers. Its speed reaches 1.7 petaflops, meaning it can perform 1.7 quadrillion computing operations per second. The Chinese-built supercomputer Dawning Nebulae, at the National Supercomputing Center in Shenzhen, China, follows close behind Jaguar at 1.27 petaflops. And a newcomer from IBM, Sequoia, is ready to compete with both.
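To get a feel for these numbers, here is a small Python sketch (ours, not a real benchmark) that estimates an ordinary PC's floating-point throughput with a NumPy matrix multiplication and compares it with Jaguar's 1.7 petaflops. The matrix size is arbitrary, and serious measurements use tools like LINPACK:

```python
# Rough sketch: estimate this machine's floating-point throughput with a
# NumPy matrix multiply, then compare it to Jaguar's 1.7 petaflops.
# Illustrative only; real benchmarks are far more careful about timing,
# caches, and precision.
import time
import numpy as np

n = 2000
a = np.random.rand(n, n)
b = np.random.rand(n, n)

start = time.perf_counter()
_ = a @ b
elapsed = time.perf_counter() - start

flops = 2 * n**3 / elapsed          # a matmul does ~2*n^3 floating-point ops
print(f"this machine: ~{flops / 1e9:.1f} gigaflops")
print(f"Jaguar (1.7 petaflops) is ~{1.7e15 / flops:,.0f}x faster")
```

On a typical desktop this prints a few tens of gigaflops, five or six orders of magnitude short of Jaguar.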

Sequoia is expected to be completed in 2011 with performance 20 times that of Roadrunner, equivalent to more than 400,000 3 GHz quad-core processors. How big is 20 petaflops? IBM put it this way: what Sequoia calculates in a single day would take 6 billion people 1,000 years to count with calculators.
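That claim can be sanity-checked with a back-of-the-envelope calculation. The sketch below assumes, purely as our own guess, that one person with a calculator manages about one operation per second; on that assumption the two figures agree to within about an order of magnitude:

```python
# Back-of-the-envelope check of IBM's comparison. The one-operation-per-
# second rate for a person with a calculator is our assumption, not IBM's.
SECONDS_PER_DAY = 86_400
SECONDS_PER_YEAR = 365 * 24 * 3600

sequoia_ops = 20e15 * SECONDS_PER_DAY          # 20 petaflops for one day
human_ops = 6e9 * 1_000 * SECONDS_PER_YEAR     # 6 billion people, 1,000 years

print(f"Sequoia, one day:      {sequoia_ops:.2e} operations")
print(f"humanity, 1,000 years: {human_ops:.2e} operations")
print(f"ratio: {sequoia_ops / human_ops:.1f}")  # ~9, i.e. the same ballpark
```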

Knowing this amazing performance, the question arises: what are supercomputers used for? What can they do? One answer is research and science. A typical example is simulating natural processes on a computer. With a supercomputer we can model how a poison spreads through the soil and how long it takes to break down. We can also calculate the magnetic currents in the earth's core, simulate the formation of galaxies, show what happens during a supernova, or study the effects of laser shots on various materials (important for the research and development of particle accelerators such as those used to treat tumors).
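As an illustration of what "simulating a natural process" means in code, here is a minimal sketch of one-dimensional diffusion of a contaminant through soil, solved with explicit finite differences. All parameters are invented for illustration; real soil models are three-dimensional, far finer, and run on thousands of cores:

```python
# Toy 1-D diffusion of a contaminant through soil (explicit finite
# differences). Grid size, diffusivity, and time step are illustrative.
import numpy as np

nx, steps = 100, 500
dx, dt, D = 1.0, 0.1, 1.0             # grid spacing, time step, diffusivity

c = np.zeros(nx)
c[nx // 2] = 1.0                      # contaminant released at the center

for _ in range(steps):
    # second spatial derivative, zero-flux boundaries via edge padding
    lap = np.diff(np.pad(c, 1, mode="edge"), 2) / dx**2
    c += dt * D * lap

print(f"peak concentration after {steps} steps: {c.max():.4f}")
```

The same idea, scaled up to billions of cells and realistic chemistry, is exactly the kind of workload that fills a supercomputer.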

With a supercomputer, the effects of new therapies can also be studied without tormenting animals as test subjects. Supercomputers are even used to predict when the next earthquake will happen and which areas will be affected. They are now widely applied to complex problems such as quantum physics, weather forecasting, climate research, molecular modeling, and the simulation of physics experiments. All of these problems demand fast, accurate calculation, for which a supercomputer is far better suited than an ordinary PC.

"Supercomputers are very important for research," explained Klaus Wolkersdorfer, head of high-performance computer systems at the research center in Jülich, Germany. Alongside theory and experiment, supercomputing has developed into the third pillar of science.

IBM received the order to build Sequoia from the National Nuclear Security Administration, a division of the U.S. Department of Energy, which wants to know whether its decades-old stockpile of nuclear weapons can still be used or poses an uncontrollable danger.

Sequoia will also be used for civilian purposes. Meteorologists and seismologists will be able to make more precise forecasts, according to IBM at a press conference. Local weather, for instance, could be predicted down to areas of 100 square meters. Earthquake forecasts and safe evacuation routes could be calculated up to 50 times better than today: researchers could estimate the effects of an earthquake on a city, even its effect on every individual building.
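A quick count shows why that kind of resolution needs a supercomputer. The sketch below uses Germany's surface area and a 60-layer atmosphere purely as our own illustrative assumptions, not figures from IBM:

```python
# How many grid cells does a 100-square-meter forecast resolution imply?
# The region (Germany) and the 60-layer atmosphere are assumed by us.
area_m2 = 357_000 * 1e6      # Germany: ~357,000 km^2, in square meters
cell_m2 = 100                # one forecast cell of 100 m^2
layers = 60                  # vertical atmospheric layers (assumed)

cells = area_m2 / cell_m2 * layers
print(f"~{cells:.2e} grid cells")   # ~2.1e11 cells, far beyond a PC's memory
```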

The supercomputer in Jülich is used entirely for civilian purposes. To gain access to the facility, researchers must sign a statement that their findings will be published. What Sequoia will calculate in 2011, by contrast, remains a secret.

Another perspective comes from Herbert Cornelius, Director of the Advanced Computing Center EMEA at Intel. His focus is on the future of supercomputing technology: processors, interfaces, and new software.

Today's technology enables supercomputer simulations that were unimaginable just a few years ago, said Cornelius. Processes from very different fields of science can now play a role in a single model. Take the simulation of blood flow through the heart: besides chemical and biological reactions, physical characteristics such as the flow behavior of blood must also be considered. This, of course, demands enormous computing power, Cornelius confirmed.


10 Years to Achieve Exaflops
There are, in fact, still a number of problems that supercomputers cannot compute. One of them is simulating the airflow around an entire aircraft. To get definitive results, the aircraft model must be covered with a very fine mesh that approximates curved surfaces with flat polygons. The more detailed the model, the more complicated the calculation. Aircraft designers therefore only ever calculate the flow characteristics of individual components. Flow simulations are run two to three times per day, so the engineers quickly get a result and can continue their work, said Cornelius.
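The cost of refining such a mesh grows brutally fast. The following sketch, with baseline figures invented for illustration, shows the rough scaling for a 3-D explicit flow solver: halving the mesh spacing multiplies the cell count by 8 and, because the time step must shrink as well, the total work by roughly 16:

```python
# Rough illustration of mesh-refinement cost in a 3-D explicit flow solver.
# Baseline cell count and runtime are hypothetical.
base_cells, base_hours = 10e6, 2.0    # hypothetical component-level run

for halvings in range(4):
    cells = base_cells * 8**halvings      # 2x finer in each of 3 dimensions
    hours = base_hours * 16**halvings     # plus a proportionally smaller dt
    print(f"mesh halved {halvings}x: {cells:.1e} cells, ~{hours:,.0f} h")
```

Three halvings already turn a 2-hour component run into roughly 8,000 hours, which is why a whole-aircraft simulation remains out of reach.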

Supercomputer technology and innovation will keep advancing. "I predict that within 10 years there will be a supercomputer that breaks the exaflops barrier. What I do not know is whether a nuclear power plant will be needed to supply electricity to a computing center the size of a city, and whether such a supercomputer will become a gigantic mega-project," Cornelius said.
