Eight trends accelerating the age of commercial-ready quantum computing


Every major technology breakthrough of our era has gone through a similar cycle in pursuit of turning fiction into reality.

It starts with scientific discovery: a proof of principle pursued against a theory through a recursive cycle of hypothesis and experiment. Success at the proof-of-principle stage graduates the work to a tractable engineering problem, where the path to a systemized, reproducible, predictable system is generally known and de-risked. Finally, once engineered to the performance requirements, focus shifts to repeatable manufacturing and scale, simplifying designs for production.

Since it was first theorized by Richard Feynman and Yuri Manin, quantum computing has seemed stuck in a perpetual state of scientific discovery: occasionally reaching proof of principle on a particular architecture or approach, but never overcoming the engineering challenges needed to move forward.

That’s until now. In the last 12 months, we have seen several meaningful breakthroughs from academia, venture-backed companies, and industry that look to have cleared the remaining challenges along the scientific discovery curve, moving quantum computing from science fiction that has always been “five to seven years away” to a tractable engineering problem, ready to solve meaningful problems in the real world.

Companies such as Atom Computing*, leveraging neutral atoms for wireless qubit control, Honeywell, with its trapped-ion approach, and Google, with superconducting metals, have demonstrated first-ever results, setting the stage for the first commercial generation of working quantum computers.

While early and noisy, these systems, even in the 40–80 error-corrected-qubit range, may be able to deliver capabilities that surpass those of classical computers, accelerating our ability to perform better in areas such as thermodynamic predictions, chemical reactions, resource optimization, and financial predictions.

As a number of key technology and ecosystem breakthroughs begin to converge, the next 12–18 months will be nothing short of a watershed moment for quantum computing.

Here are eight emerging trends and predictions that will accelerate quantum computing readiness for the commercial market in 2021 and beyond:

1. Dark horses of QC emerge: 2021 will be the year of dark horses in the QC race. These new entrants will demonstrate dominant architectures with 100–200 individually controlled and maintained qubits at 99.9% fidelities, with millisecond-to-second coherence times, representing 2x–3x improvements in qubit power, fidelity, and coherence. These dark horses, many venture-backed, will finally prove that resources and capital are not the sole catalysts for a technological breakthrough in quantum computing.

2. Hybrid classical-quantum applications will power the first wave of commercial applications. Using quantum systems, we can natively simulate quantum theory or elements of nature, such as the characteristics of electrons, and thus molecules and their behaviors. Hybrid systems rely on early quantum processors only for what surpasses what is possible on a classical computer: taking advantage of their limited but specialized capabilities while passing the computed variables back to the classical system to complete the computation. We’ve already seen this emerge for chemistry-related research across materials engineering and pharma.
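The hybrid loop described above can be sketched in miniature: a classical optimizer repeatedly calls a quantum step to evaluate an expectation value, then updates its parameters from the result. This is a minimal, assumption-laden sketch using NumPy to simulate a single parameterized qubit (a real workload would dispatch `quantum_expectation` to quantum hardware and estimate it from measurement shots); the function and variable names are illustrative, not any vendor's API.

```python
import numpy as np

# Pauli-Z observable whose expectation value the "quantum" step returns.
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def quantum_expectation(theta):
    """Stand-in for the quantum step: prepare RY(theta)|0> and measure <Z>.
    <Z> works out to cos(theta) for this one-qubit circuit."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)
    return float(np.real(state.conj() @ Z @ state))

def hybrid_minimize(steps=100, lr=0.4):
    """Classical outer loop: gradient descent on the quantum expectation,
    using the parameter-shift rule so only circuit evaluations are needed."""
    theta = 0.1  # start near |0>, where <Z> = +1
    for _ in range(steps):
        grad = 0.5 * (quantum_expectation(theta + np.pi / 2)
                      - quantum_expectation(theta - np.pi / 2))
        theta -= lr * grad
    return theta, quantum_expectation(theta)

theta, energy = hybrid_minimize()
# The loop drives <Z> toward its minimum of -1, reached at theta = pi.
```

The division of labor mirrors the trend: the "quantum" call does the one thing classical machines handle poorly at scale (evaluating a quantum state), while everything else, including the optimization logic, stays classical.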

3. Early consolidation: We will start to see early consolidation among quantum hardware companies as conglomerates realize they need to abandon, bolster, and/or diversify their current architectural approaches. Companies without existing investments in quantum will need to acquire their way in to gain access. A number of architectural methods won’t work as well as anticipated (see Microsoft’s elusive particle). As we saw with consolidation in the hard disk drive and semiconductor industries, those with proven early technology successes, indicating an approach may become dominant, will be the first to be subsumed.

4. The ‘quantum software developer’ generation emerges thanks to various layers of the quantum stack beginning to become accessible to developers:

  • Access to quantum hardware thanks to cloud providers such as Google, Microsoft, and Amazon deploying new managed services.
  • Access to software frameworks thanks to various quantum developer kits released and open-sourced (Microsoft, Google, Baidu, IBM).
  • Access to applications thanks to companies such as Zapata, QCWare, and Cambridge Quantum, building quantum-ready applications and simulations across chemistry, finance, logistics, pharma, and more that position companies to be ready to leverage new quantum hardware technologies as they become available.

5. Venture capital investments into QC hardware companies will invert by stage, focusing on late-stage, proven technologies and slowing down investments into new Seed and Series A QC hardware companies. Most of the venture capital firms that go deep into new forms of computing have already made their early-stage QC hardware bets, leaving few firms to target. At the same time, there will likely be an increase in venture investment into later-stage (Series B and on) QC hardware companies as a result of material technical de-risking, end-to-end algorithmic computation, and a path to error correction and scale. As we saw with the semiconductor industry, mainstream venture funds will double down on dominant technical approaches.

6. A surge in commercial and government funding for QC companies thanks to a number of tailwinds:

  • More companies are starting to invest in being ‘quantum ready.’ This ranges from internal training to build deeper awareness of the power of QC, to building quantum-ready applications and simulations for high-value problems, spending upward of $500K–$1M per application use case or algorithm.
  • An increasing number of companies are actively paying for access to early quantum hardware in order to build ahead of the curve, even if those systems aren’t capable of accurate or complete computations yet.
  • The National Quantum Initiative Act has earmarked $1.2B for quantum research. While these funds will trickle out over ten years, mostly through university research programs, they will lead to a number of new spin-outs and shared research across the quantum computing ecosystem.
  • Various legislative draft proposals have been in the works for a “National Science and Technology Foundation” replacing the NSF, which would spend $100 billion over five years on research into technologies such as artificial intelligence, quantum computing, and 5G telecommunications.
  • Our national security and defense priorities are beginning to crystallize around quantum computing use cases, mirroring many of the ‘quantum ready’ intentions and use cases of enterprises mentioned above, leading to new SBIR and OTA contract awards.

7. Geopolitics is going to push quantum computing into the mainstream. Intensifying competition from China, Canada, Australia, Finland, and others will introduce new existential risks around encryption and technological dominance. What if an adversary suddenly gained access to a computing power advantage? As with the politicization of 5G and AI, quantum computing’s move into the national spotlight will increase pressure on the federal government to accelerate US-based quantum leadership.

8. Post-quantum encryption will become a top priority for every CISO. Even if we are ten to fifteen years away from enough error-corrected qubits to break public-key encryption, this isn’t an all-or-nothing problem. Encryption lives in shades of grey, shaped by regional policies, encryption standards, deprecated or legacy installations, and more. For example, there are still over 25 million publicly visible sites relying on SHA-1, a cryptographic standard deprecated in 2011. While the most advanced encryption protocols are likely safe from the next few generations of quantum computers, deprecated yet active encryption protocols remain rampant across the web and could go from a small nuisance to a major problem overnight. NIST is leading the charge on a post-quantum cryptographic standard to be approved by 2024, in hopes of full deployment before 2030. In the meantime, it’s best to upgrade to the latest protocols.
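The SHA-1 point above is one place where "upgrading to the latest protocols" is a one-line code change. A minimal sketch, assuming only Python's standard-library `hashlib` (the `payload` value is a hypothetical stand-in for whatever data is being fingerprinted):

```python
import hashlib

# Hypothetical token to fingerprint; any bytes would do.
payload = b"example session token"

# SHA-1: deprecated by NIST for digital signatures since the early 2010s,
# with practical collision attacks demonstrated publicly in 2017.
legacy = hashlib.sha1(payload).hexdigest()

# SHA-256: the current baseline. It also fares better against quantum
# attacks, since Grover's algorithm only halves a hash's effective
# preimage security (256 bits -> roughly 128 bits).
modern = hashlib.sha256(payload).hexdigest()

assert len(legacy) == 40   # 160-bit digest in hex
assert len(modern) == 64   # 256-bit digest in hex
```

The asymmetry in the article's argument shows up here: symmetric primitives like hashes only need longer outputs to stay safe, whereas public-key schemes (RSA, elliptic curves) face Shor's algorithm and need replacement by the post-quantum standards NIST is selecting.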

Every renaissance or golden age in history has begun at the intersection of capability, community, access, and motivation. Quantum computing is entering the beginning stages of that age.

Technology breakthroughs are demonstrating stable, durable qubits that can be controlled and scaled. Underlying technologies such as arbitrary waveform generators, software-defined radios, and rapid FPGA development have accelerated the speed of development. New entrants are proving new methods and architectures far superior to those of the past. An ecosystem is developing to support the applications, distribution, and funding needed to enable access to these systems. Industry is seeing firsthand the power and capability of a quantum system, racing to be first in line to get its hands on one.

Quantum computing will represent the most fundamental acceleration in computing power that we have ever encountered, leaving Moore’s law in the dust. Welcome to the quantum age.

Venrock is an investor in Atom Computing.

This post was originally authored and published as a guest column for TechCrunch.

Written by

Venture Capitalist, Partner @Venrock, writing about software & hard things for developers, space, and modern computing.
