Three Ways In Which Supercomputers Paved The Future

Thomas Wellburn
August 16, 2017

Although the term ‘supercomputer’ is thrown around rather casually these days, back in the 50s and 60s it was a very different scene.

It would not be far-fetched to say that, without the aid of supercomputers, computer technology would never have advanced as far as it eventually did. After all, the advent of the very first supercomputer was way back in 1953.

Before that, in the 1940s, the precursors of supercomputers were used mainly for large-scale calculations and code-breaking during the Second World War. The effort to crack the German Enigma cipher, carried out with electromechanical Bombe machines at Bletchley Park, helped turn the tide of the war in the Allies’ favour and is one of the best-known examples.

You might want to keep that bit of history in mind the next time you hand your laptop over for repairs.

Back then, who would have thought that within five or six decades supercomputers would be so central to our lives? That they would end up playing such a pivotal role in the large-scale success and progress of the human race in the field of computing? At the time, they were used mainly by the military wing of government. Today, however, supercomputers have spread everywhere, from weather forecasting to gaming.

On that note, let’s take a good look at the three main ways in which supercomputers have ultimately paved the way to the future:

  • CPU speed: The faster a CPU is, the more computations it can perform; it is as simple as that. With modern CPUs hitting a bottleneck in maximum clock speed, the answer was simple: add more CPUs. A supercomputer harnesses not just one CPU but thousands of them working in parallel, each taking a slice of the workload (see the short sketch after this list). It should come as no surprise, then, that its operations per second exceed those of a standard PC by many orders of magnitude. The current fastest supercomputers operate at petascale speeds, meaning one quadrillion calculations per second (1,000,000,000,000,000). The U.S. is currently funding the development of exascale computers, which would perform over a quintillion operations per second (1,000,000,000,000,000,000)!

  • Giant strides in memory & storage: In the good old days, computers had a mere 64 KB of RAM, which sounds almost unreal now, considering how much has changed since then. Mass storage lived on magnetic tape, whereas now we use solid-state drives. Nowadays, most PCs ship with at least 8-12 GB of RAM and 1 TB of internal storage. In comparison, the Titan supercomputer (first operational in 2012) contains 693.5 TB of RAM and 40 PB (40,000 TB) of internal storage; the back-of-envelope comparison after this list puts those figures side by side.

  • Solving the cooling problem: Most of the earlier supercomputers suffered from cooling trouble. In fact, they were so inefficient that overheating was a regular occurrence, largely because of the large silicon dies used for their processors. As technology has advanced and processors have shrunk, power requirements and heat output have decreased. The heat problem also spurred new cooling methods, such as liquid cooling: initially reserved for supercomputers, mainframes and servers, it has since trickled down into the consumer space.
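To make the “add more CPUs” idea concrete, here is a minimal Python sketch of the same principle on a single machine. Everything in it is illustrative rather than taken from any real supercomputer codebase; actual supercomputers coordinate thousands of nodes with frameworks such as MPI, but the payoff is the same: more CPUs, more operations per second.

```python
# A minimal sketch: splitting one big computation across CPUs.
# All names here are illustrative; real supercomputers spread work
# across thousands of nodes, not a single Pool on one machine.
import time
from multiprocessing import Pool, cpu_count

def partial_sum(bounds):
    """Sum the squares of integers in [start, stop) -- our stand-in workload."""
    start, stop = bounds
    return sum(i * i for i in range(start, stop))

if __name__ == "__main__":
    N = 20_000_000
    workers = cpu_count()
    # Carve the range into one chunk per CPU.
    step = N // workers
    chunks = [(i * step, N if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]

    t0 = time.perf_counter()
    serial = partial_sum((0, N))
    t_serial = time.perf_counter() - t0

    t0 = time.perf_counter()
    with Pool(workers) as pool:
        parallel = sum(pool.map(partial_sum, chunks))
    t_parallel = time.perf_counter() - t0

    assert serial == parallel  # same answer, computed in parallel
    print(f"{workers} CPUs: serial {t_serial:.2f}s vs parallel {t_parallel:.2f}s")
```

On a quad-core machine, the parallel run should finish in roughly a quarter of the serial time, minus some overhead for starting the worker processes.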
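And to put the memory and speed figures above in perspective, a few lines of arithmetic using the article’s own numbers show just how lopsided the comparison is (the 8 GB and 1 TB PC baselines are the ones quoted above):

```python
# Back-of-envelope scale comparison using the figures quoted above.
PC_RAM_GB = 8                   # typical desktop baseline
TITAN_RAM_GB = 693.5 * 1024     # Titan's 693.5 TB expressed in GB
print(f"Titan holds ~{TITAN_RAM_GB / PC_RAM_GB:,.0f}x the RAM of a typical PC")

PC_STORAGE_TB = 1
TITAN_STORAGE_TB = 40_000       # 40 PB
print(f"...and ~{TITAN_STORAGE_TB // PC_STORAGE_TB:,}x the storage")

PETA = 10**15                   # petascale: one quadrillion ops/sec
EXA = 10**18                    # exascale: one quintillion ops/sec
print(f"Exascale is {EXA // PETA}x faster than petascale")
```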

There is absolutely no doubt that supercomputers have come a long way since their humble origins. The sheer speed at which they have evolved is amazing in itself, and Moore’s law of exponential growth in computing power shows no signs of slowing down.

Starting out with machines that could manage roughly 5,000 calculations per second, today NASA’s Columbia supercomputer performs a mind-numbing 42.5 trillion calculations per second. This quantum leap was achieved in the span of a few decades, as the quick calculation below shows.
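Taking those two endpoint figures at face value, the implied speedup is easy to compute. A quick sketch follows; note that the roughly 50-year span between the machines is my assumption, used only to derive a ballpark annual growth rate:

```python
# How big is the jump from ~5,000 ops/sec to 42.5 trillion ops/sec?
early = 5_000           # early machine, ops/sec (figure quoted above)
columbia = 42.5e12      # NASA Columbia, ops/sec (figure quoted above)
factor = columbia / early
print(f"Overall speedup: {factor:,.0f}x")   # roughly 8.5 billion-fold

# Assuming ~50 years between the two machines (an assumption, not
# a figure from the article), the implied compound annual growth:
years = 50
cagr = factor ** (1 / years) - 1
print(f"Implied growth: ~{cagr:.0%} per year")
```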
