1971 saw the release of the world's first commercially available microprocessor – Intel's 4004. In the years that have followed we have seen unparalleled advancement: if the car had improved at the same rate as computer chips since 1971, by 2015 cars would have been able to hit speeds of about 420 million miles per hour – with that number doubled by the end of this year!
Intel's co-founder, Gordon Moore, keenly noted in 1965 that the number of transistors per square inch on integrated circuits had doubled every year since their invention, and he predicted that processing power would double every two years for the foreseeable future. Despite constant pronouncements that the theory is dead, or will soon be dead, the inevitable march of progress continues to defy speculation. But given that each year more and more money is required to make processors smaller and faster, it is inevitable that new ways of designing computer chips (or new ways of designing computers themselves) will be the driver for the next generation of innovation.
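That doubling compounds remarkably quickly. A quick back-of-the-envelope calculation (a sketch, using the two-year doubling period quoted above) shows the scale:

```python
# Back-of-the-envelope illustration of Moore's law:
# processing power doubling every two years.

def moores_law_growth(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# From 1971 (the Intel 4004) to 2015 is 44 years, i.e. 22 doublings.
factor = moores_law_growth(2015 - 1971)
print(f"Growth factor over 44 years: {factor:,.0f}x")
# Growth factor over 44 years: 4,194,304x
```

Twenty-two doublings yield a factor of roughly four million – which is why the car analogy above produces such an absurd top speed.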
Although it is becoming almost impossible to fit more transistors onto new microprocessors, the idea of stacked microchips is providing new life in the pursuit of speed (processing speed, that is). 3D chips might seem like a logical next step once you think about it, but until recently it has been all but impossible to stack microchips without them melting. However, researchers from Stanford and three other universities have developed a brand new technique for stacking chips called Nano-Engineered Computing Systems Technology, or N3XT. These new chip designs run roughly 1,000 times more efficiently than traditional 2D configurations – so it looks like Moore's law may just be a warm-up for this level of innovation, although this type of sustained improvement is unlikely to continue.
They solved the main problem with stacking chips – the heat required to fabricate each level – by creating nano-materials that can be fabricated at lower temperatures, thus avoiding the risk of frying the lower layers of circuitry as the processor is built up. In addition, they developed electronic "ladders" or "elevators" that can move more data over a short distance, all while using less power than traditional wiring.
Quantum computing may seem like an exotic and far-fetched idea, one only possible in the far reaches of science fiction. However, several tech giants (Microsoft, IBM, and Google) have been pouring money into developing quantum chips – in fact, you can actually try your hand at programming IBM's quantum chip virtually on their website. After a decade in which the field was largely confined to theoretical research, there has finally been a breakthrough: researchers at the University of Sussex have just unveiled plans to build what could be the world's first quantum computer!
Quantum computing takes advantage of the quantum physics phenomenon of superposition: the idea that, on a quantum level, particles can be in two states at once. By taking advantage of this quirk of the quantum world, physicists and computer engineers can create computers with much more processing power than they could using regular silicon microprocessors.
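Superposition can be illustrated in miniature with an ordinary state-vector simulation. The sketch below (purely illustrative – not a real quantum chip API) models a single qubit as a pair of complex amplitudes and applies a Hadamard gate, the standard gate for putting a basis state into an equal superposition:

```python
import math

# A qubit's state is a pair of amplitudes (a, b) for the basis states
# |0> and |1>; measurement probabilities are |a|^2 and |b|^2.

def hadamard(state):
    """Apply a Hadamard gate, which puts a basis state into superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Probability of measuring 0 and of measuring 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

qubit = (1.0, 0.0)           # start firmly in state |0>
qubit = hadamard(qubit)      # now in an equal superposition of |0> and |1>
print(probabilities(qubit))  # ~ (0.5, 0.5): a 50/50 measurement outcome
```

Simulating n qubits this way needs 2^n amplitudes, which is precisely why classical machines cannot keep up and real quantum hardware is so attractive.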
Unfortunately, these advancements are unlikely to ever be seen in personal computers, as quantum computers are generally intended for use in processing mathematical problems, data analytics, and cryptography – problems they can solve much more quickly than any traditional supercomputer!
Computing in the Cloud
There is a third avenue down which we could go to achieve greater processing power: cloud computing. This is hardly a new idea, but its potential for improving computing power on our everyday devices may be lost on some. While it may soon be impossible to fit smaller or more powerful processors into IoT (Internet of Things) devices, any complex computing could soon be done in the cloud. The size of a data centre is not restricted by the limits of a mobile phone or laptop, so by reallocating data processing away from our mobile devices – just as Apple do with Siri – we could massively expand the possibilities we have to process data from our IoT devices!
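The offloading pattern described above can be sketched in miniature. In this toy example the "cloud" is just a local function standing in for a hypothetical remote service (in reality the hand-off would be an HTTPS request to a data centre); all names are illustrative:

```python
import json

# Toy sketch of cloud offloading: a constrained device serialises a
# heavy task and ships it to a stand-in "cloud" service, which does the
# actual computation and returns only the answer.

def cloud_process(payload: str) -> str:
    """Stand-in for a remote service: perform the heavy work server-side."""
    task = json.loads(payload)
    result = sum(x * x for x in task["values"])  # the 'heavy' computation
    return json.dumps({"result": result})

def device_request(values):
    """The IoT device only packages the task and reads back the answer."""
    payload = json.dumps({"values": values})
    return json.loads(cloud_process(payload))["result"]

print(device_request([1, 2, 3, 4]))  # 30
```

The design point is the boundary: the device side does no computation beyond serialisation, so its processor can stay small and cheap while the data centre scales independently.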
Moore's law has provided us with relative certainty about the speed of progress for the last 50 years, but things are about to change. Advancements will now come in fits and starts, as technology scrambles to keep pace with consumer demand for faster and faster devices. We are on the verge of some incredible advancements in terms of processing power, but we could be waiting there a while.
By Josh Hamilton
Josh Hamilton is an aspiring journalist and writer who has written for a number of publications covering cloud computing, fintech, and legaltech. Josh has a Bachelor's Degree in Political Law from Queen's University in Belfast. His studies included Politics of Sustainable Development, European Law, Modern Political Theory, and Law of Ethics.