A fading Moore’s Law to spark fundamental rethink of microchips: Future of Computers P4

IMAGE CREDIT: Quantumrun

    Computers—they're kind of a big deal. But to really appreciate the emerging trends we've hinted at so far in our Future of Computers series, we also need to understand the revolutions sprinting down the computational pipeline, or simply: the future of microchips.

    To get the basics out of the way, we have to understand Moore's Law, the now famous observation Dr. Gordon E. Moore first made in 1965. Essentially, what Moore realized all those decades ago is that the number of transistors in an integrated circuit doubles roughly every 18 to 24 months. This is why, broadly speaking, the computer you buy today for $1,000 will cost about $500 two years from now.
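    For a rough feel of how that compounding plays out, here's a minimal sketch in Python. The starting transistor count and the doubling period are illustrative assumptions for this example, not industry figures:

```python
# Illustrative sketch of Moore's Law as compound doubling.
# Starting count and doubling period are assumptions, not industry data.

def transistors_after(years, start_count=2_000_000_000, doubling_period=2.0):
    """Projected transistor count after `years`, doubling every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

for years in (0, 2, 4, 10):
    print(f"Year {years:>2}: ~{transistors_after(years):,.0f} transistors")

# The same logic runs in reverse for price: if transistor density doubles
# every ~2 years, equivalent computing power costs roughly half as much.
```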

    For over fifty years, the semiconductor industry has lived up to this law’s compounding trendline, paving the way for the new operating systems, video games, streaming video, mobile apps, and every other digital technology that has defined our modern culture. But while demand for this growth looks set to remain steady for another half century, silicon—the bedrock material all modern microchips are built from—doesn’t appear able to meet that demand much past 2021, according to the final report from the International Technology Roadmap for Semiconductors (ITRS).

    It’s physics, really: the semiconductor industry is shrinking transistors to the atomic scale, a scale silicon will soon be unfit for. And the more the industry tries to shrink silicon past its practical limits, the more expensive each microchip generation will become.

    This is where we're at today. In a few years, silicon will no longer be a cost-effective material to build the next generation of cutting-edge microchips. This limit will force a revolution in electronics by forcing the semiconductor industry (and society) to choose between a few options:

    • The first option is to slow, or end, the costly push to further miniaturize silicon, in favour of novel ways to design microchips that generate more processing power without additional miniaturization.

    • Second, find new materials that can be manipulated at far smaller scales than silicon to stuff ever greater numbers of transistors into even denser microchips.

    • Third, instead of focusing on miniaturization or power usage improvements, refocus on processing speed by creating processors specialized for specific use cases. This could mean that instead of having one generalist chip, future computers will have a cluster of specialist chips (see the sketch after this list). Examples range from the graphics chips used to power video games to Google’s Tensor Processing Unit (TPU), a chip that specializes in machine learning applications.

    • Finally, design new software and cloud infrastructure that can operate faster and more efficiently without needing denser/smaller microchips.
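    To make the generalist-versus-specialist idea from the third option concrete, here's a toy dispatcher in Python. The unit names and relative speedups are hypothetical, chosen only to illustrate the architectural choice:

```python
# Toy illustration of routing workloads to specialist chips.
# Unit names and speedup figures are hypothetical, for illustration only.

SPECIALISTS = {
    "graphics": 10.0,          # e.g., a GPU-style unit
    "machine_learning": 15.0,  # e.g., a TPU-style unit
}
GENERALIST_SPEED = 1.0

def dispatch(workload):
    """Route a workload to the fastest unit available for its type."""
    if workload in SPECIALISTS:
        print(f"{workload}: specialist chip ({SPECIALISTS[workload]}x relative speed)")
    else:
        print(f"{workload}: generalist CPU ({GENERALIST_SPEED}x relative speed)")

for job in ("graphics", "machine_learning", "spreadsheet"):
    dispatch(job)
```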

    Which option will our tech industry choose? Realistically: all of them.

    The lifeline for Moore’s Law

    The following list is a brief glimpse into the near- and long-term innovations competitors within the semiconductor industry will use to keep Moore’s Law alive. This part is a bit dense, but we’ll try to keep it readable.

    Nanomaterials. Leading semiconductor companies, like Intel, have already announced that they will drop silicon once they reach miniaturization scales of seven nanometers (7nm). Candidates to replace silicon include indium antimonide (InSb), indium gallium arsenide (InGaAs), and silicon-germanium (SiGe), but the material generating the most excitement appears to be carbon nanotubes. Derived from graphite—itself a stack of sheets of the wonder material, graphene—carbon nanotubes can be made just atoms thick, are extremely conductive, and are estimated to make future microchips up to five times faster by 2020.

    Optical computing. One of the biggest challenges in designing chips is ensuring that electrons don’t tunnel from one transistor to another—a challenge that gets vastly harder once you approach the atomic scale. The emerging tech of optical computing looks to replace electrons with photons, whereby light (not electricity) gets passed from transistor to transistor. In 2017, researchers took a giant step toward this goal by demonstrating the ability to store light-based information (photons) as sound waves on a computer chip. Using this approach, microchips could operate near the speed of light by 2025.

    Spintronics. Over two decades in development, spintronic transistors attempt to use the ‘spin’ of an electron, instead of its charge, to represent information. While still a long way from commercialization, if solved, this form of transistor would need only 10-20 millivolts to operate, hundreds of times less than conventional transistors; this would also remove the overheating issues semiconductor companies face when producing ever-smaller chips.

    Neuromorphic computing and memristors. Another novel approach to solving this looming processing crisis lies in the human brain. Researchers at IBM and DARPA, in particular, are leading the development of a new kind of microchip—a chip whose integrated circuits are designed to mimic the brain’s more decentralized and non-linear approach to computing. (Check out this ScienceBlogs article to better understand the differences between the human brain and computers.) Early results indicate that chips that mimic the brain are not only significantly more efficient, they also operate on far less wattage than current-day microchips.
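    For a feel of the computing style these chips borrow from the brain, here's a minimal leaky integrate-and-fire neuron in Python—a textbook toy model, not how IBM's or DARPA's hardware is actually implemented, and all parameters are illustrative:

```python
# Toy leaky integrate-and-fire neuron: the kind of unit neuromorphic
# chips emulate in silicon. Parameters are illustrative, not hardware specs.

def simulate_neuron(inputs, leak=0.9, threshold=1.0):
    """Accumulate input current, leak over time, spike when past threshold."""
    potential, spikes = 0.0, []
    for t, current in enumerate(inputs):
        potential = potential * leak + current  # integrate with leakage
        if potential >= threshold:              # non-linear firing decision
            spikes.append(t)
            potential = 0.0                     # reset after spiking
    return spikes

print(simulate_neuron([0.3, 0.4, 0.5, 0.1, 0.9]))  # -> [2]: spike times
```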

    Using this same brain modelling approach, the transistor itself, the proverbial building block of your computer's microchip, may soon be replaced by the memristor. Ushering in the “ionics” era, a memristor offers a number of interesting advantages over the traditional transistor:

    • First, memristors can remember the charge that has flowed through them—even after power is cut. Translated, this means one day you could turn on your computer as fast as you flip on a light bulb.

    • Transistors are binary, either 1s or 0s. Memristors, meanwhile, can hold a variety of states between those extremes, like 0.25, 0.5, 0.747, etc. This makes memristors operate similarly to the synapses in our brains (see the sketch after this list), and that’s a big deal since it could open up a range of future computing possibilities.

    • Next, memristors don’t need silicon to function, opening the path for the semiconductor industry to experiment with using new materials to further miniaturize microchips (as outlined earlier).

    • Finally, similar to the findings made by IBM and DARPA into neuromorphic computing, microchips based on memristors are faster, use less energy, and could hold a higher information density than chips currently on the market.
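    To make the binary-versus-analog distinction concrete, here's a minimal sketch in Python. This is a deliberately crude abstraction—real memristor physics (ion drift, non-linear conductance) is far more involved—and the class and values are hypothetical:

```python
# Toy contrast between a binary transistor state and a memristor-like
# analog state. A crude abstraction; real memristor physics is far richer.

class ToyMemristor:
    def __init__(self):
        self.state = 0.0  # conductance state, anywhere in [0.0, 1.0]

    def apply_charge(self, charge):
        # State shifts with the charge flowing through the device and
        # persists without power (non-volatile "memory").
        self.state = min(1.0, max(0.0, self.state + charge))

bit = 1                  # a transistor cell holds only 0 or 1
synapse = ToyMemristor()
for pulse in (0.25, 0.25, 0.247):
    synapse.apply_charge(pulse)
print(bit, synapse.state)  # -> 1 0.747: a graded, synapse-like value
```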

    3D chips. Traditional microchips and the transistors that power them operate on a flat, two-dimensional plane, but in the early 2010s, semiconductor companies began experimenting with adding a third dimension to their chips. Called ‘FinFETs’, these new transistors have a channel that sticks up from the chip’s surface, giving them better control over what takes place inside that channel, allowing them to run nearly 40 percent faster and operate using half the energy. The downside, however, is that these chips are significantly more difficult (and costly) to produce at the moment.

    But beyond redesigning individual transistors, future 3D chips also aim to combine computing and data storage in vertically stacked layers. Right now, traditional computers house their memory sticks centimetres from their processor. By integrating the memory and processing components, that distance drops from centimetres to micrometres, enabling a giant improvement in processing speed and energy consumption.
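    As a rough back-of-the-envelope estimate of why that distance matters (the signal speed and distances here are illustrative assumptions, and real interconnect delay involves much more than raw propagation time):

```python
# Back-of-the-envelope signal travel time: centimetres vs micrometres.
# Assumes signals propagate at roughly two-thirds the speed of light;
# real interconnect delay also involves RC effects, buffering, etc.

SIGNAL_SPEED = 2e8  # metres per second, ~2/3 the speed of light in vacuum

def travel_time_ns(distance_m):
    return distance_m / SIGNAL_SPEED * 1e9  # nanoseconds

print(f"5 cm (CPU to RAM stick):  {travel_time_ns(0.05):.3f} ns per trip")
print(f"100 um (stacked layers):  {travel_time_ns(100e-6):.6f} ns per trip")
```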

    Quantum computing. Looking further into the future, a large chunk of enterprise level computing could operate under the freaky laws of quantum physics. However, due to the importance of this kind of computing, we gave it its own chapter at the very end of this series.

    Super microchips aren’t good business

    Okay, so what you read above is all well and good—we’re talking ultra energy-efficient microchips modeled after the human brain that can run at the speed of light—but the thing is, the semiconductor chip-making industry isn’t overly eager to turn these concepts into a mass-produced reality.

    Tech giants, like Intel, Samsung, and AMD, have already invested billions of dollars over decades to produce traditional, silicon-based microchips. Shifting to any of the novel concepts noted above would mean scrapping those investments and spending billions more on constructing new factories to mass-produce new microchip models that have a sales track record of zero.

    It’s not just the time and money investment that’s holding these semiconductor companies back. The consumer demand for ever more powerful microchips is also on the wane. Think about it: During the 90s and most of the 00s, it was almost a given that you’d trade in your computer or phone, if not every year, then every other year. This would let you keep up with all the new software and applications that were coming out to make your home and work life easier and better. These days, how often do you upgrade to the latest desktop or laptop model on the market?

    When you think of your smartphone, you have in your pocket what would've been considered a supercomputer just 20 years ago. Aside from complaints about battery life and memory, most phones bought since 2016 are perfectly capable of running any app or mobile game, of streaming any music video or naughty facetiming session with your SO, or most anything else you'd want to do on your phone. Do you really need to spend $1,000 or more every year to do these things 10-15 percent better? Would you even notice the difference?

    For most people, the answer is no.

    The future of Moore's Law

    In the past, most investment funding for semiconductor tech came from military defense spending. That role was later taken over by consumer electronics manufacturers, and by 2020-2023, leading investment into further microchip development will shift again, this time coming from industries specializing in the following:

    • Next-Gen content. The coming introduction of holographic, virtual and augmented reality devices to the general public will spur a greater demand for data streaming, especially as these technologies mature and grow in popularity during the late 2020s.

    • Cloud computing. Explained in the next part of this series.

    • Autonomous vehicles. Explained thoroughly in our Future of Transportation series.

    • Internet of things. Explained in our Internet of Things chapter in our Future of the Internet series.

    • Big data and analytics. Organizations that require regular data crunching—think the military, space exploration, weather forecasters, pharmaceuticals, logistics, etc.—will continue to demand increasingly powerful computers to analyze their ever-expanding sets of collected data.

    Funding for R&D into next-generation microchips will always exist, but the question is whether the funding needed for more complex forms of microprocessors can keep pace with the growth demands of Moore's Law. Given the cost of switching to and commercializing new forms of microchips, coupled with slowing consumer demand, future government budget crunches, and economic recessions, chances are that Moore's Law will slow or halt briefly in the early 2020s, before picking back up by the late 2020s or early 2030s.

    As for why Moore’s Law will pick up speed again, well, let’s just say that turbo-powered microchips aren’t the only revolution coming down the computing pipeline. Next up in our Future of Computers series, we’ll explore the trends fueling the growth of cloud computing.

    Future of Computers series

    Emerging user interfaces to redefine humanity: Future of computers P1

    Future of software development: Future of computers P2

    The digital storage revolution: Future of Computers P3

    Cloud computing becomes decentralized: Future of Computers P5

    Why are countries competing to build the biggest supercomputers? Future of Computers P6

    How Quantum computers will change the world: Future of Computers P7     

    Next scheduled update for this forecast

    2023-02-09
