How the end of Moore’s Law will usher in a new era in computing

By the mid-2020s, it is believed, Moore’s Law will have plateaued completely as production costs increase and transistors reach their physical limits. It is predicted that the machines needed to produce such bewilderingly tiny components will cost $10bn.

“Below 20nm transistors cease to get more efficient or more cost-effective as you continue to shrink them,” says Stephen Furber, Professor of Computer Engineering at the University of Manchester. “Sure, they get smaller, and you can fit more on a chip, but the other historic benefits of shrinkage no longer apply.”

With Moore’s Law effectively becoming economically unsound, the technology industry will need to become more creative without an established blueprint to follow. As a result, a new era in computing could emerge.

“This may not be noticeable for some time, because hardware design has not been efficient: Moore’s Law has given manufacturers a free ride,” says Noel Sharkey, Professor of AI at Sheffield University.

“The task now will be to take hardware as close to 100pc efficiency as possible over the coming decade, and that will keep processing power moving along. After that, it is anyone’s guess.”

A computer brain

Some experts believe the next step is to use chips inspired by the human brain, building ‘artificial neural networks’ to accelerate artificial intelligence. Such systems can learn without being explicitly programmed for set tasks: artificial neurons, modelled on their biological equivalents, pass signals to one another over connections similar to the synapses in the brain.
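
As a loose illustration of the idea (a toy Python sketch, not tied to any particular neuromorphic chip), a single artificial neuron can be written in a few lines: it weights its inputs, sums them and ‘fires’ through an activation function, and it learns the logical OR task purely by nudging those weights to match examples, with no task-specific code.

```python
import math
import random

# A single artificial neuron: weighted inputs, a bias, and a sigmoid
# "firing" function standing in for a biological neuron's activation.
def fire(weights, bias, inputs):
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid activation

# Training examples: the logical OR function (an illustrative task only).
examples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

random.seed(0)
weights = [random.uniform(-1, 1), random.uniform(-1, 1)]
bias = 0.0
rate = 0.5  # learning rate

# Learning is just adjusting the weights (the "synapse strengths")
# to reduce the error on each example.
for _ in range(5000):
    for inputs, target in examples:
        output = fire(weights, bias, inputs)
        grad = (target - output) * output * (1 - output)  # sigmoid gradient
        weights = [w + rate * grad * x for w, x in zip(weights, inputs)]
        bias += rate * grad

for inputs, target in examples:
    print(inputs, "->", round(fire(weights, bias, inputs), 2), "expected", target)
```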

“The explosive developments over the last 15 years in machine learning and AI have been paralleled by developments in more brain-inspired approaches: neuromorphics,” says Furber.

“Although developed primarily for brain science, there is growing interest in commercial applications of neuromorphics, though nothing compelling yet. If and when there is a breakthrough in our understanding of how the brain works, this should unleash another huge leap forward in machine learning and AI.”

With big tech companies like Google, Microsoft and Facebook increasingly using AI solutions and algorithms, neural network processors (NNPs) are a big new market for chip producers. 

American company Nvidia, best known for its video game graphics cards, announced its $2bn Tesla P100 chip in 2016, designed to put more power into deep learning.

And Intel demonstrated its own NNP last month, with Naveen Rao, corporate vice president and general manager of the Artificial Intelligence Products Group at Intel, saying that he expects the company’s AI solutions to have generated $3.5bn in revenue this year.

Perhaps more tellingly, Rao also claimed that neural network models are becoming ten times more complex each year, an exponential growth rate greater than that of any technology transition he is aware of.

Quantum supremacy?

Further away, but potentially more broadly disruptive for the computing industry, is the advent of quantum computing. Whereas traditional computing operates in binary, storing data as electrical signals in one of two states (1s and 0s), quantum computing uses memory units called qubits, which can exist in a combination of states simultaneously. Companies around the world, including in the UK, are feverishly working on ways to build working machines at this new frontier of tech.
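
To make the contrast concrete, the toy Python below simulates a single qubit as a vector of two amplitudes (a plain NumPy sketch, not a real quantum-hardware API): a Hadamard gate puts the qubit into an equal superposition of 0 and 1, and measurement then yields each outcome with 50pc probability.

```python
import numpy as np

# A qubit's state is a vector of two complex amplitudes over the basis
# states |0> and |1>; the squared magnitudes give measurement probabilities.
zero = np.array([1.0, 0.0], dtype=complex)  # the definite state |0>

# A Hadamard gate rotates |0> into an equal superposition of |0> and |1>,
# the property that lets a qubit hold several states simultaneously.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
state = H @ zero

probs = np.abs(state) ** 2
print("P(0) =", probs[0], "P(1) =", probs[1])  # 0.5 and 0.5

# Measurement collapses the superposition: each readout gives a plain 0 or 1,
# sampled according to the amplitudes.
rng = np.random.default_rng(seed=1)
print("measurements:", rng.choice([0, 1], size=10, p=probs / probs.sum()))
```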

“There are numerous ways to build quantum computers,” says Winfried Hensinger, the Professor of Quantum Technologies at the University of Sussex, who is working on a quantum computer with the company Universal Quantum.

“We are using silicon microchips that host arrays of electrodes. These electrodes emit electric fields that can trap individual charged atoms just above the surface of the microchip. Each atom forms a quantum bit.”
