Titan grabs the supercomputer crown!


In the race to build the fastest supercomputer on the planet, Oak Ridge National Laboratory's Titan has taken the crown. In early November, technicians at the Oak Ridge lab installed 18,688 Tesla K20X GPU modules, made by NVIDIA, boosting the supercomputer to a sustained 17.59 petaflops, the equivalent of 17,590 trillion calculations per second! To understand what this means, let's break down a few terms that might be foreign to a typical user.

GPU stands for Graphics Processing Unit; CPU stands for Central Processing Unit. The CPU is the traditional way of running a computer. When computers were first introduced, the CPU ran the machine and acted as its brain. A CPU core can handle a few instructions at a time, but it is not well suited to carrying out many instructions at once. CPUs build up complex tasks by feeding the answer from one calculation into the next, and so on. To gain processing speed, though, technicians began using GPUs, which are slower at any single calculation but can run MANY calculations at the same time instead of just a few. Combining CPUs with GPUs provides a way to divvy up the work: each calculation goes to whichever processor will produce the quickest result. Still with me?
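To make the difference concrete, here is a toy Python sketch (not Titan's actual software, just an illustration): one function where every step depends on the previous answer, which suits a fast CPU core, and one where every calculation is independent, which is the pattern a GPU's thousands of slower cores can run all at once.

```python
# Serial chain: each step needs the previous answer, so steps cannot
# overlap -- the kind of workload a fast CPU core handles well.
def serial_chain(x, steps):
    for _ in range(steps):
        x = x * 1.0000001 + 1.0  # must finish before the next step starts
    return x

# Independent work: many identical calculations with no dependencies
# between them -- every item could be computed at the same time, which
# is exactly what a GPU's many cores are built for.
def independent_work(values):
    return [v * v + 1.0 for v in values]

print(serial_chain(0.0, 3))
print(independent_work([1.0, 2.0, 3.0]))
```

The constants and function names here are made up for the example; the point is only the shape of the two workloads, dependent versus independent.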

At the moment, Titan has about 90% of its workload running on GPUs and around 10% on CPUs. With the introduction of NVIDIA's new Tesla GPUs, Titan took the crown from IBM's Sequoia, also a United States supercomputer. Titan sustains 17.59 petaflops and has a peak performance of more than 20 petaflops, so it can and should stay the leader for a while as the GPUs are pushed harder and harder. That gives us another term to understand: what is a petaflop? Break the word into two parts, peta and flops. FLOPS stands for floating-point operations per second, which sounds even more confusing, but it is basically a measure of a computer's performance; a floating-point operation is essentially one calculation. Peta belongs to the metric scale of prefixes that starts with kilo and goes as high as yotta. Peta means 10 to the 15th power, so if kilo is 10 to the 3rd power, or 1,000, you can see that peta is quite a bit more.
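The arithmetic above can be checked in a few lines of Python. This is just a back-of-the-envelope sketch using the standard metric prefixes and the 17.59-petaflop figure quoted earlier:

```python
# Metric prefixes as powers of ten, from kilo up through yotta.
PREFIXES = {"kilo": 3, "mega": 6, "giga": 9, "tera": 12,
            "peta": 15, "exa": 18, "zetta": 21, "yotta": 24}

peta = 10 ** PREFIXES["peta"]      # 1 petaflop/s = 10^15 operations per second
titan_flops = 17.59 * peta         # Titan's sustained rate from the article

# Express that rate in trillions (10^12) of calculations per second.
print(round(titan_flops / 10 ** 12))  # -> 17590
```

So 17.59 petaflops works out to 17,590 trillion calculations per second, matching the figure above.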

Titan will be used for researching biofuels, modeling climate change, and developing energy-efficient engines for vehicles, though it can also be rented out to third parties. If you would like to learn more about supercomputing, including who is on the list and their specifications, visit www.top500.org.
