
Global Intelligence for the CIO


Is quantum computing ready to leap out of the labs and into the commercial world? If so, what are the implications for IT leaders?

In the 1980s, leading physicists Richard Feynman and David Deutsch both proposed the idea of computers built around quantum processes that would be able to perform feats beyond anything possible using classical, binary machines. Indeed, they believed that these computers would be the only ones capable of accurately modeling the complexity of a universe itself built on quantum processes — from the formation of molecules and stars to the activity of the human brain.

Now, following decades of painstaking work, this vision is becoming reality. Research labs across the globe (some funded by the world's biggest technology companies) are today developing and building quantum computers. At some point soon — possibly within the next 10 years — we could see the commercial launch of a general-purpose quantum computer capable of solving in seconds calculations that would take a classical computer millions or even billions of years. But well before that, perhaps in a matter of years, current developments in quantum computing could have a measurable impact on your organization. Read on or skip to a specific section:

- What is a quantum computer?
- How close are we?
- What's happening now?
- When will quantum computing start benefiting our business?
- What should we be doing to prepare?

What is a quantum computer?

In classical computing, a bit represents either zero or one. A quantum computer, by contrast, takes advantage of the strange properties of quantum mechanics where, at a subatomic level, the classical laws of physics break down. A quantum bit (qubit) can be in two states simultaneously — in other words, it can be both zero and one at the same time. This is known as superposition and is most famously illustrated by the Schrödinger's cat thought experiment.

Quantum computing also takes advantage of another principle of quantum physics — entanglement — which allows pairs of particles to become inextricably bound together in such a way that a measurement on one of the pair instantly determines the correlated state of the other, no matter how far apart they are. This principle, which Einstein referred to as "spooky action at a distance," allows qubits to be controlled for the purposes of computation and communication.
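These two ideas can be mimicked, very loosely, in an ordinary program. The sketch below is a toy classical simulation in plain Python, not real quantum hardware: it represents a two-qubit entangled Bell state as four amplitudes, and sampling it always yields perfectly correlated bits.

```python
import random

# Bell state (|00> + |11>)/sqrt(2): amplitudes over the four basis
# states, indexed 0..3 as |00>, |01>, |10>, |11>
bell = [2 ** -0.5, 0.0, 0.0, 2 ** -0.5]

def measure(state):
    """Sample one basis state with probability |amplitude|**2."""
    probs = [a * a for a in state]
    return random.choices(range(len(state)), weights=probs)[0]

for _ in range(100):
    outcome = measure(bell)       # only ever 0 (|00>) or 3 (|11>)
    q0, q1 = outcome >> 1, outcome & 1
    assert q0 == q1               # the two bits always agree
```

A classical program can only reproduce the correlated statistics after the fact; the physics delivers them without any hidden shared list of outcomes, which is what makes entanglement useful for computation.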

Because a qubit can represent zero and one simultaneously, every logical qubit added to a system doubles the number of states it can hold at once, so its computational capacity grows exponentially. Two qubits can process (in parallel) as much information as four classical bits, three qubits as much as eight bits, and so on. Once a machine has 56 logical qubits, it could outperform any simulation of itself running on a classical supercomputer (a point known, somewhat misleadingly, as quantum supremacy).
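That exponential growth is easy to see in a toy state-vector simulation (plain Python, purely illustrative): putting each of n qubits into equal superposition yields a single state vector spanning all 2^n basis states at once.

```python
import math

def equal_superposition(n):
    """State vector after a Hadamard gate on each of n qubits starting
    in |00...0>: an equal superposition over all 2**n basis states."""
    amp = 1.0 / math.sqrt(2 ** n)
    return [amp] * (2 ** n)

state = equal_superposition(3)            # three qubits
print(len(state))                         # 8 basis states tracked at once
print(sum(a * a for a in state))          # probabilities sum to ~1.0
```

This is also why classical simulation hits a wall: each extra qubit doubles the memory needed, so a vector for 56 qubits already has tens of quadrillions of entries.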

The more ambitious projects today are aiming to build machines scalable to thousands (and ultimately millions) of logical qubits. Clearly, if the technology can be made to work — and scale — the implications are revolutionary. Quantum computing could herald the ability to manipulate matter at a molecular level, model hitherto unfathomable natural processes and massively advance the field of artificial intelligence, among other exciting possibilities.

A recent paper (“Technical Roadmap for Fault Tolerant Computing”) by Oxford University researchers Amir Fruchtman and Iris Choi notes: “The applications for such a machine are vast… from linear algebra, simulating physics or chemistry faster than classical computers, to machine learning applications, fast database searches, financial analysis, and more.”

How close are we?

In reality, this new computing paradigm is in its very early stages. If you compare the evolution of quantum computing to that of classical computing, we're still in the pre-transistor age. And not all qubits are created equal. Different groups are using a variety of techniques to produce them, and different types exhibit different levels of stability — there are superconducting qubits, ion-trap qubits, photonic qubits, and more. While some are more stable than others, the vast majority (of whatever type) cannot hold their state of superposition long enough to perform as a logical qubit. It is therefore important to distinguish between physical qubits and the logical qubits required for general, circuit-based computation.

Error correction and fault tolerance are major challenges, and creating a single, logical qubit can take between tens and thousands of physical qubits, depending on the approach used. There are other thorny problems too, such as engineering and powering all the other elements of the computer, like gates and buses.

Nonetheless, Michele Mosca, research chair at Canada’s University of Waterloo and a leading light in the quantum world, believes the field has reached a fundamental turning point. The challenge is now one of engineering rather than theory. “Scientists at Yale said the major harbinger of the development of a large-scale quantum computer would be when 100 or fewer physical qubits could reliably behave as one logical qubit, and several projects are on the edge of achieving that,” he noted in 2017.

In 2015, most researchers were predicting that it would be between 20 and 30 years before the arrival of a large-scale quantum computer, but advancements have led an increasing number of them to revise that estimate down to within the next decade.

What’s happening now?

There are around 30 credible, well-funded projects across the globe developing quantum computers today in countries such as the US, China, the UK, Canada, Australia, France and the Netherlands. The approach that has borne the most fruit to date is the superconducting qubit (supported by the likes of Google and Intel, among others), with ion trap qubits also showing considerable promise. Superconducting qubits need to be stored at temperatures close to absolute zero, but are proving popular with some traditional tech firms because they can be built into semiconductors and have a fast gate time. Ion traps are a few years behind, but have the advantage of working at room temperature and can hold their state for longer. However, they have a slower gate time and use massive amounts of power.

Microsoft, meanwhile, is basing its quantum computer on a new and potentially far more stable and scalable ‘topological’ qubit. This uses quantum particles called Majorana fermions to string qubits together in a lattice, dramatically reducing the need for error correction. Professor Leo Kouwenhoven, whose lab at Delft University of Technology in the Netherlands was the first to observe the Majorana fermion, is now leading Microsoft’s research in this area and is confident the approach will pay off. “Putting normal qubits together is like building a house of cards. Topological quantum computers using Majorana fermions is more like Lego — and that means we can build larger structures to create quantum circuits,” he said.

One quantum computing luminary who spoke with I-CIO, but requested anonymity, said: “Microsoft scares all the others. If it can make a working processor, it will beat everybody else hands-down. Its competitors say the company hasn’t proved the Majorana fermion can be generated or manipulated, but others believe it wouldn’t be making these claims unless it had already successfully generated a qubit in this way.”

There was a flurry of other significant developments in 2017 – including Intel's delivery of a 17-qubit chip and Google's boast that it would soon be able to demonstrate a machine with around 50 physical qubits that (while not true logical qubits) could hold their state long enough to perform useful calculations. Microsoft underlined its commitment to the field by announcing a full-stack quantum computing solution, including the launch of a high-level Quantum Development Kit that integrates with Visual Studio. Developers will also be able to test their quantum algorithms on an Azure-based classical-computer simulation of a quantum machine.

When will quantum computing start benefiting our business?

Don’t expect to install a general-purpose quantum computer in your server room any time soon. Today’s lab-based machines look more like something out of an HG Wells novel — a tangle of wires, rods and chambers suspended in midair. The volatile superconducting qubits currently leading the race have to be housed in cryogenic chambers to function optimally with as little interference as possible (as do Microsoft’s topological qubits). They also consume phenomenal amounts of power. For these reasons, the first commercial, general-purpose quantum computers will almost certainly only be available as cloud services.

While the scalable, general-purpose quantum computer may be some years off, quantum developments are already having a commercial impact. Canadian company D-Wave Systems, for example, has been selling what are known as adiabatic quantum computers for several years; the latest model boasts 2,000 (physical) qubits. These machines work in a very different way to circuit-based quantum computers and have no logical qubits. And while the maths and science behind them are a little fuzzy, they have been shown to solve one very specific group of problems much faster than classical computers: optimization problems (for example, finding the fastest route for a delivery driver given the locations of all drop-off points).
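The delivery-route example is an instance of the classic travelling salesman problem. A brute-force sketch in Python (with hypothetical stop names and distances) shows why classical exhaustive search struggles: the number of orderings grows factorially with the number of drop-off points, which is exactly the kind of blow-up annealing-style optimizers aim to sidestep.

```python
from itertools import permutations

def shortest_route(depot, stops, dist):
    """Try every ordering of drop-off points and keep the cheapest
    round trip from the depot. Runtime is O(n!) in the number of stops."""
    best_order, best_cost = None, float("inf")
    for order in permutations(stops):
        route = (depot,) + order + (depot,)
        cost = sum(dist[a][b] for a, b in zip(route, route[1:]))
        if cost < best_cost:
            best_order, best_cost = order, cost
    return best_order, best_cost

# Symmetric distances between a depot "D" and three drop-off points
dist = {
    "D": {"A": 1, "B": 2, "C": 2},
    "A": {"D": 1, "B": 1, "C": 2},
    "B": {"D": 2, "A": 1, "C": 1},
    "C": {"D": 2, "A": 2, "B": 1},
}
order, cost = shortest_route("D", ["A", "B", "C"], dist)
print(order, cost)    # an optimal round trip of total length 5
```

Three stops mean only six orderings; thirty stops mean roughly 2.6 × 10^32, which is why heuristic and annealing approaches dominate in practice.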

It is also possible to emulate quantum annealing in software running on classical hardware. Fujitsu, working with the University of Toronto, has developed the Digital Annealing Unit, which can rapidly solve combinatorial optimization problems using existing semiconductor technology — and operates at standard room temperature. According to Dr Joseph Reger, Fujitsu CTO for EMEIA: "This quantum-inspired technology will have a significant impact, since it allows businesses to solve common combinatorial optimization problems up to 17,000 times faster than they can today."
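For a flavour of how annealing-style optimizers search for solutions, here is a minimal classical simulated-annealing sketch (an illustrative stand-in, not Fujitsu's or D-Wave's actual algorithm) that minimizes a tiny Ising-style energy function over binary spins:

```python
import math
import random

def anneal(w, steps=20000, t0=2.0):
    """Minimize the Ising energy E(s) = sum over i<j of w[i][j]*s[i]*s[j]
    for spins s in {-1, +1}^n, given a symmetric weight matrix w.
    High temperature lets the search escape local minima; cooling
    gradually freezes it into a low-energy configuration."""
    n = len(w)
    s = [random.choice((-1, 1)) for _ in range(n)]
    for step in range(steps):
        temp = t0 * (1 - step / steps) + 1e-9     # linear cooling schedule
        i = random.randrange(n)
        # energy change if spin i is flipped
        delta = -2 * s[i] * sum(w[i][j] * s[j] for j in range(n) if j != i)
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            s[i] = -s[i]
    energy = sum(w[i][j] * s[i] * s[j]
                 for i in range(n) for j in range(i + 1, n))
    return s, energy

random.seed(0)
spins, energy = anneal([[0, 1], [1, 0]])   # two spins that "prefer" to differ
print(spins, energy)                       # anti-aligned spins, energy -1
```

Many combinatorial problems (routing, scheduling, portfolio selection) can be encoded as such an energy function, which is what makes dedicated annealing hardware commercially interesting.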

Fujitsu has also partnered with and invested in Vancouver-based 1QBit — currently the only vendor of commercial quantum computing software — to create software and quantum algorithms for the Digital Annealer. Andrew Fursman, CEO of 1QBit, says: "The Fujitsu Digital Annealer is one of the first pieces of really usable hardware that can leverage all of the research that 1QBit has engineered over the past four years."

There are also advances underway at various companies in circuit-based quantum computing that are likely to have commercial impact, notably the race to demonstrate quantum supremacy. Although the first machine to do so might not be a true scalable quantum computer, the milestone will no doubt generate a lot of headlines and further raise commercial interest in the field.

Recent experiments have shown even small-scale circuit-based quantum computers can do significant calculations faster than classical machines. We could see computers like this being made available within 3-5 years, with the first applications likely to be in areas such as quantum chemistry, smart materials and machine learning.

What should we be doing to prepare?

Apart from keeping an eye on developments in the field, the biggest issue IT leaders should be thinking about now is their current use of encryption. In 1994, mathematician Peter Shor developed a quantum algorithm capable of cracking any encryption scheme whose security rests on the difficulty of factoring large numbers into primes, including RSA. Since such encryption currently protects most online data and communications, as soon as a quantum computer emerges that can run Shor's algorithm at scale, businesses will need to have quantum-resilient encryption in place. If Mosca, Microsoft and others are right that such a machine could emerge within 10 years — and given that it could take a large business several years to ensure all its sensitive data is effectively protected — the time to start understanding the implications and preparing is now.
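To see what is at stake, a toy RSA round trip in Python (deliberately tiny, insecure primes, for illustration only) shows that the private key falls out immediately once n is factored, and efficient factoring is precisely what Shor's algorithm provides:

```python
# Toy RSA with tiny primes - purely illustrative, never use such sizes.
p, q = 61, 53                     # the secret prime factors
n, e = p * q, 17                  # the public key (n = 3233, e = 17)

phi = (p - 1) * (q - 1)           # trivial to compute once p and q are known
d = pow(e, -1, phi)               # private exponent (Python 3.8+ modular inverse)

message = 42
cipher = pow(message, e, n)       # anyone can encrypt with the public key
assert pow(cipher, d, n) == message   # only the private-key holder can decrypt
```

Classically, recovering p and q from a 2,048-bit n is infeasible; a large enough quantum computer running Shor's algorithm could do it in polynomial time, exposing d. Quantum-resilient ("post-quantum") schemes avoid relying on factoring for exactly this reason.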

First published March 2018
