What’s Next for the Computer after the Demise of Moore’s Law?

January 3, 2018 12:00 pm

It seems that computers and devices are becoming ever more powerful and speedy. In fact they are, or rather, they have been. Moore’s Law is the observation that the number of silicon transistors that can fit on a computer chip doubles at a regular interval, bringing more speed, power, and overall capacity with each generation. It dates to 1965, when Gordon Moore, cofounder of Intel and the law’s namesake, noticed that transistors were shrinking so quickly that the number that could fit on a chip doubled every year. In 1975, however, Moore revised the interval to every two years.
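
To get a feel for how quickly that doubling compounds, here is a back-of-the-envelope sketch in Python. It starts from the roughly 2,300 transistors of Intel’s 4004 processor (1971) and doubles the count every two years; the baseline and the clean biennial schedule are simplifying assumptions for illustration, not a claim about any particular chip.

```python
# Back-of-the-envelope projection of Moore's Law.
# Baseline (assumption for illustration): Intel's 4004 (1971),
# roughly 2,300 transistors, doubling every two years thereafter.

BASE_YEAR = 1971
BASE_TRANSISTORS = 2_300

def projected_transistors(year: int) -> int:
    """Transistor count predicted by doubling every two years."""
    doublings = (year - BASE_YEAR) // 2
    return BASE_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2017):
    print(f"{year}: ~{projected_transistors(year):,} transistors")

# 2017: ~19,293,798,400 -- twenty-three doublings turn a few
# thousand transistors into tens of billions, roughly where the
# largest chips sit today.
```

That relentless compounding is exactly what is now running into physical limits.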

Fast forward to today, and Moore’s Law is on its way out. Transistors can only get so small: at roughly the size of an atom, there is nowhere left to shrink. So there has to be an alternative, something better, more sustainable, something that will outlast the time frame Moore’s Law set forth, right? Scientists and engineers are indeed working on ideas, but research into these alternatives has long been stunted by the unabated doubling process; as long as the transistor count kept multiplying on schedule every two years, there was little pressure to experiment with what comes next. Nevertheless, let’s take a look at some alternatives and why they matter:


  • It’s not just physical limitations – One reason to look seriously at alternative methods is cost: it is becoming far too expensive to keep cramming transistors onto ever-shrinking silicon chips. In previous years, expensive design changes allowed us to keep pace with Moore’s Law, but as we approach the physical limits on size, each generation becomes more expensive than the last.
  • Quantum computers – You’ve probably heard rumors about quantum computers as an emerging technology over the past few years. They differ from our current systems in that they could be possibly millions of times more powerful, performing certain complex computations in a fraction of the time a normal computer would take. One area where this matters is encryption. Much of modern encryption rests on the fact that finding the prime factors of extremely large numbers is impractically slow on classical computers; by the best estimates, factoring the numbers used in practice would take hundreds of years (see the sketch after this list). A quantum computer, however, could process exponentially more possibilities in much less time, so it could theoretically perform such computations in as little as days or even hours. One problem, though, is that quantum computers must be cooled to nearly 0 Kelvin (-459.67 ℉, or -273.15 ℃), absolute zero, the lowest temperature physically possible. Scientists can now routinely get close to this, though it takes an enormous amount of energy to maintain such low temperatures. Google is already working on developing its own quantum computer, but a date for public release is still very far away.
  • An alternative to silicon – We’ve all heard of Silicon Valley, the birthplace of the modern computer industry and home to big companies such as Apple. Silicon is the main element in most computer hardware because it is a semiconductor: a material that can act either as a conductor (letting electrical current flow) or as an insulator (stopping it). That controllable switching is the “on-off” behavior at the heart of computing, and it maps directly onto binary code, a system of 1’s and 0’s, yes and no, on and off. One alternative scientists have been experimenting with is graphene, a two-dimensional hexagonal lattice of carbon atoms a single layer thick. The problem is that while graphene conducts electrons very well, that flow is very hard to switch off, because graphene lacks the natural band gap that lets silicon stop conducting. Rather than being the replacement for silicon, graphene has opened up possibilities for other, similar materials and structures.
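
To see why factoring makes such a good lock for encryption, here is a minimal sketch in Python of the most naive classical approach, trial division. It is illustrative only: real codebreaking uses far more sophisticated algorithms, and the quantum speedup mentioned above comes from a dedicated quantum algorithm (Shor’s), but the basic point, that the work explodes with the length of the number, holds throughout.

```python
def trial_division(n: int) -> list[int]:
    """Factor n by trying every divisor up to the square root of n.

    The loop takes on the order of sqrt(n) steps, so every extra
    digit in n multiplies the work by about sqrt(10): the cost is
    exponential in the *length* of the number. That asymmetry --
    multiplying two primes is instant, recovering them is not --
    is what modern encryption relies on.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever is left over is prime
    return factors

print(trial_division(2021))  # [43, 47] -- instant for small numbers
# A 617-digit number (a typical modern encryption key size) would
# need on the order of 10**308 steps with this method; even the best
# known classical algorithms still take impractically long.
```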


We’re still a few years away from the end of Moore’s Law, but now more than ever we should be looking toward alternatives. That may ultimately mean changing how we build computers altogether, a hugely disruptive prospect, since it would also mean changing how we program them, and much of today’s software would have to change with it. The future of transistors and chips is unclear under our current model. Only with innovative processes and an eventual breakthrough will computers as we know them survive.
