Quantum Noise has long been acknowledged as a formidable barrier in the realm of quantum computing. This disturbance, arising from the interaction of fragile quantum states with their surroundings, can significantly degrade a system's performance. Addressing quantum decoherence is a prerequisite for elevating quantum computing to broader levels of implementation and application.

Quantum Computing is perceived as a revolutionary technology, poised to make significant strides in sectors such as pharmaceuticals, chemistry, automotive, and finance.

Nevertheless, it confronts various developmental challenges. The emphasis is not solely on predicting the widespread availability of quantum computers but also on overcoming the technical impediments currently constraining this technology. Of these challenges, Quantum Noise is paramount, critically affecting the precision of computations.


Quantum Noise presents a major obstacle in the progression of Quantum Computing. It is a disturbance capable of altering the results of a quantum system, caused by factors including magnetic fields, electronic device interference, and qubit interactions. Overcoming the resulting loss of quantum information, known as quantum decoherence, is vital for advancing quantum computing toward greater reliability and accuracy.
Strategies like error suppression, error mitigation, and Quantum Error Correction (QEC) have been devised to reduce the noise’s impact on qubits, thereby improving calculation precision.
These advancements are essential for achieving quantum supremacy, the point at which a quantum computer solves a problem that is beyond the practical reach of classical computers.

Understanding Quantum Noise

In Quantum Computing, noise is seen as a disruption stemming from various sources such as magnetic fields, Wi-Fi and mobile phone interference, and the mutual influence of qubits (quantum bits, the basic units of quantum information) due to their proximity.

This disturbance is significant; it can lead to information loss in inactive qubits, erroneous rotations in active qubits, and a general deviation of the system from its intended state. This issue, known as decoherence or quantum decoherence, has been a focus of research and experimentation for an extended period.

In 2017, American physicist John Preskill coined the term Noisy Intermediate-Scale Quantum (NISQ) to describe the error-prone nature of the quantum computers of that time, a challenge that persists today due to environmental disturbances (his keynote is available here: https://www.youtube.com/watch?v=h4nUyF9cSaw&t=13s).

The Causes of Quantum Noise and Decoherence

Quantum decoherence is commonly attributed to four distinct causes:

  • 1 – The environment: Fluctuations in temperature and in electric or magnetic fields can degrade the quantum information held in the computer. Even weak cosmic radiation can affect qubits and degrade them.
  • 2 – Crosstalk and interference: Quantum computers rely on qubits operating in unison, manipulated by laser or microwave pulses. Sometimes a pulse affects neighboring qubits as well as the targeted qubit; this phenomenon is referred to as crosstalk or interference.
  • 3 – Degradation of state: The quantum state of a qubit deteriorates rapidly, often within fractions of a second, so algorithms must complete before the quantum states collapse (a short decay sketch follows this list).
  • 4 – Implementation errors: Algorithms apply various rotations to qubits, implemented via laser or microwave pulses. Any imprecision in these pulses leads to errors in the subsequent calculation.
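
To give a feel for point 3, here is a minimal sketch in plain Python/NumPy of how an idle qubit's state decays, using the standard T1 (energy relaxation) and T2 (dephasing) model; the coherence times used are assumed, illustrative values, not figures from any specific device.

import numpy as np

# Hypothetical coherence times, loosely in the range of today's superconducting qubits.
T1 = 100e-6   # energy relaxation time in seconds (assumed value)
T2 = 80e-6    # dephasing time in seconds (assumed value)

# Start in the superposition (|0> + |1>)/sqrt(2), written as a density matrix.
rho = 0.5 * np.array([[1, 1],
                      [1, 1]], dtype=complex)

for t in [0, 20e-6, 50e-6, 100e-6, 200e-6]:
    # Populations relax toward |0> with rate 1/T1; coherences decay with rate 1/T2.
    p1 = rho[1, 1].real * np.exp(-t / T1)    # excited-state population
    coh = rho[0, 1] * np.exp(-t / T2)        # off-diagonal coherence term
    print(f"t = {t*1e6:6.1f} us  |  P(|1>) = {p1:.3f}  |  coherence = {abs(coh):.3f}")

Within a couple of hundred microseconds the off-diagonal coherence, which quantum algorithms depend on, has largely faded, which is why computations must finish well before the state collapses.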

Decoherence Impairs Precision and Reliability

Decoherence therefore poses a significant problem, currently making quantum computers less dependable than classical computational systems for practical problem-solving. Despite ongoing efforts to improve qubit reliability, some studies report precision rates exceeding 99.9%: certainly a high value, but not yet sufficient for quantum computers to execute the long, complex algorithms needed to surpass existing classical computers.
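
A rough back-of-the-envelope calculation shows why even 99.9% per operation falls short. Assuming, as a simplification, that errors accumulate independently from gate to gate, the probability that an entire circuit runs without error decays exponentially with the number of gates:

# Rough estimate of how per-gate errors compound over a whole circuit.
# Assumes independent errors; real devices are more complicated.
for gate_fidelity in (0.99, 0.999, 0.9999):
    for n_gates in (100, 1_000, 10_000):
        circuit_fidelity = gate_fidelity ** n_gates
        print(f"fidelity/gate = {gate_fidelity}  gates = {n_gates:>6}  "
              f"whole-circuit success ~ {circuit_fidelity:.2e}")

At 99.9% per gate, a 10,000-gate circuit runs error-free only a few times in every hundred thousand attempts, which is why error rates must fall much further, or be actively corrected, before deep algorithms become practical.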

In January 2022, an international team of researchers led by Professor Andrea Morello from UNSW Sydney announced operations above 99% fidelity with nuclear spin qubits embedded in silicon. Their work, published in Nature that month, demonstrated 99.95% accuracy for single-qubit operations and 99.37% accuracy for two-qubit operations.
Identifying and correcting quantum errors is a critical challenge on the path towards quantum advantage, a term that denotes the point at which quantum computers will solve real-world problems faster than classical computers.

Andrea Morello explains:

“Typically, error rates need to be below 1% to effectively apply quantum error correction protocols. Having now reached this threshold, we can begin to design scalable and reliably functioning silicon-based quantum processors for practical computations.”

It’s worth noting that classical or non-quantum algorithms consist of a finite sequence of instructions, executable on a “traditional” computer. Similarly, quantum algorithms comprise a sequence of instructions and utilize some essential features of quantum computation, such as quantum superposition or quantum entanglement.

For certain problems, quantum algorithms are faster and more efficient than their classical counterparts. For instance, Shor’s algorithm for factorization is exponentially faster than the best known classical algorithms, while Grover’s algorithm, used for searching an unstructured database or unordered list, is quadratically faster than a classical algorithm performing the same task.
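
As a small illustration of Grover's quadratic speedup, the sketch below simulates the algorithm's statevector with plain NumPy on a toy 8-entry "database" (the size and the marked index are arbitrary choices): about pi/4 * sqrt(N) applications of the oracle and diffusion step suffice, versus roughly N/2 lookups for an average classical search.

import numpy as np

N = 8                      # size of the unstructured "database" (3 qubits)
marked = 5                 # index of the item we are searching for (arbitrary choice)

state = np.full(N, 1 / np.sqrt(N))          # uniform superposition over all indices

oracle = np.eye(N)
oracle[marked, marked] = -1                 # oracle flips the sign of the marked entry

s = np.full(N, 1 / np.sqrt(N))
diffuser = 2 * np.outer(s, s) - np.eye(N)   # "inversion about the mean" step

iterations = int(round(np.pi / 4 * np.sqrt(N)))   # ~2 iterations for N = 8
for _ in range(iterations):
    state = diffuser @ (oracle @ state)

print("quantum queries:", iterations)                  # ~sqrt(N)
print("classical queries (average):", N // 2)          # ~N/2
print("probability of reading the marked item:", round(abs(state[marked]) ** 2, 3))

With N = 8 the marked item is read out with roughly 95% probability after just two iterations, and the quadratic gap between sqrt(N) and N/2 widens as the database grows.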

Therefore, to overcome the problem of quantum decoherence, it is necessary to perform error correction: analyzing the system to determine which disturbances have occurred and then reversing them.

Why Correcting Quantum Decoherence is Important

Without effective error correction, as the size and complexity of a system increase, quantum computations become unreliable.

As early as 2019, Joschka Roffe, a researcher from the Department of Physics and Astronomy at the University of Sheffield, wrote in a paper titled “Quantum Error Correction: An Introductory Guide”:

“Quantum error correction protocols will play a central role in the realization of quantum computing; the choice of error correction code will influence the entire quantum computing stack, from the arrangement of qubits at the physical level to gate compilation strategies at the software level. Thus, familiarity with quantum coding is an essential prerequisite for understanding both current and future quantum computing architectures.”

Currently, some working groups in academia and industry have developed algorithms that appear to yield positive results on computers limited in size and coherence. These algorithms are evaluated in noisy environments to determine how well they might function in the short term under realistic scenarios. Noise simulation provides a systematic approach to evaluating noisy algorithms, allowing different types of noise to be selected and their intensity adjusted.
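
A toy version of such a noise simulation, in plain NumPy, is sketched below: a single-qubit depolarizing channel whose strength p can be dialed up or down (the state and the values of p are arbitrary, illustrative choices), showing how an ideal expectation value drifts as the noise intensity increases.

import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarize(rho, p):
    """Depolarizing channel of strength p applied to a single-qubit density matrix."""
    return (1 - p) * rho + (p / 3) * (X @ rho @ X + Y @ rho @ Y + Z @ rho @ Z)

# Ideal state after some computation: |+> = (|0> + |1>)/sqrt(2), chosen for illustration.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_ideal = np.outer(plus, plus.conj())

for p in (0.0, 0.01, 0.05, 0.2):
    rho_noisy = depolarize(rho_ideal, p)
    exp_x = np.real(np.trace(X @ rho_noisy))   # equals 1 for the ideal, noiseless state
    print(f"noise strength p = {p:.2f}  ->  <X> = {exp_x:.3f}")

Swapping in other channels (dephasing, amplitude damping) or sweeping p is what lets researchers rank algorithms by how robust they remain under realistic conditions.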

The underlying idea of these efforts is that, as quantum hardware development progresses, it will become possible to better understand the influence of noise, model the errors, and hence advance research on quantum error correction.

What Lies Ahead and the Cloud’s Role

The situation appears to be far from stagnant. Indeed, in the past two years, theoretical and experimental progress seems to be heading in the right direction: a combination of hardware and software strategies is proving promising for suppressing, attenuating, and cleaning up quantum errors, and this is happening sooner than expected. The cloud also deserves some credit for this advancement.

On May 4, 2016, a date not chosen by chance, IBM gave physicists and researchers worldwide the opportunity to use its qubits by launching the first cloud-accessible quantum computer, bringing the problem into sharp focus. With 7,000 users registered in the first week and 22,000 in the first month, there was sufficient engagement to highlight not only the interest in Quantum Computing but also its limitations.

The starting point was a 5-qubit system, which later expanded to 12, with the ideal goal of eventually reaching hundreds of thousands of qubits working together. It quickly became clear that noise was “THE” problem. Of course, some noise was expected: thermal radiation, temperature fluctuations, and the application of energy pulses to set the qubits into correct states were all sources of noise.

However, the sheer number of trials made possible by the cloud has allowed researchers to learn much more quickly than was previously possible.

Managing Quantum Noise

At present, physicists and researchers are working on a series of solutions to help control noise, thereby overcoming the limits of quantum decoherence.

– The first step is error suppression. This is likely the most basic approach and is based on analyzing the behavior of qubits and circuits. Here, the research involves redesigning circuits and reconfiguring how instructions are imparted, in order to better protect the information contained in the qubits. This increases the likelihood that quantum algorithms can produce a correct response (the first code sketch after these three levels illustrates one such technique).

– The second level of intervention focuses on error mitigation. The underlying assumption is that noise does not always lead to complete computational failure: often it results in “simple” alterations that can be corrected. The goal here is to introduce measures that can reduce noise and, consequently, computational errors. The analogy with noise suppression tools in the audio world is not coincidental (the second code sketch after these three levels shows one such measure).

– The third level of intervention is what is known as Quantum Error Correction (QEC). This is an intriguing approach: instead of storing the information of a qubit in a single qubit, QEC encodes it in the quantum states of a set of qubits. This approach could significantly reduce the effect of noise-induced errors: by monitoring each of the additional qubits, any changes can be detected and corrected before the information becomes unusable.
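
Picking up the first level above, error suppression: one concrete technique of this kind, chosen here purely for illustration (the text above stays general), is dynamical decoupling. The minimal NumPy sketch below models a qubit that picks up an unwanted phase drift while idle, and shows how inserting a pair of X flips around the idle period cancels a slowly varying drift:

import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)

def unwanted_drift(theta):
    """Unwanted Z rotation picked up while the qubit sits idle (toy model of slow noise)."""
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)   # state we would like to preserve
theta = 0.8                                           # arbitrary drift angle over the idle period

# Without suppression: the drift acts undisturbed for the whole idle time.
bare = unwanted_drift(theta) @ plus

# Echo-style suppression: drift for half the time, flip with X, drift again, flip back.
echoed = X @ unwanted_drift(theta / 2) @ X @ unwanted_drift(theta / 2) @ plus

print("fidelity with ideal state, no suppression:", round(abs(np.vdot(plus, bare)) ** 2, 3))
print("fidelity with ideal state, with echo     :", round(abs(np.vdot(plus, echoed)) ** 2, 3))

In this idealized model the echoed sequence restores the state exactly; on real hardware the cancellation is only partial, but the principle of rescheduling pulses to protect idle qubits is the same.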

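For the second level, error mitigation, one widely used technique (again an illustrative choice, not one named in the text) is zero-noise extrapolation: the same circuit is executed with its noise deliberately amplified, and the results are extrapolated back to the zero-noise limit. The sketch below stands in for the noisy executions with a simple linear model just to show the bookkeeping; run_at_noise_scale and all of its numbers are hypothetical.

import numpy as np

rng = np.random.default_rng(0)
true_value = 1.0      # the noiseless answer we would like to recover (chosen for illustration)
base_noise = 0.04     # hypothetical error rate of the device

def run_at_noise_scale(scale):
    """Stand-in for executing the circuit with its noise deliberately amplified by `scale`."""
    damping = 1 - base_noise * scale          # noise biases the result downward
    shot_noise = rng.normal(0, 0.002)         # finite-sampling fluctuation
    return true_value * damping + shot_noise

# Measure at noise scales 1x, 2x, 3x, then extrapolate the trend back to scale 0.
scales = np.array([1.0, 2.0, 3.0])
measured = np.array([run_at_noise_scale(s) for s in scales])

slope, intercept = np.polyfit(scales, measured, 1)
print("measured values:", np.round(measured, 4))
print("zero-noise extrapolation:", round(intercept, 4))   # lands near the noiseless 1.0

The extrapolated value lands much closer to the noiseless answer than any single noisy run, at the cost of extra executions rather than extra qubits.
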
Techniques of Quantum Error Correction

With Quantum Error Correction (QEC), a set of specific techniques comes into play: unlike classical error correction, which operates on bits that are simply 0 or 1, QEC must manage qubits that exist in superpositions of states. The most common approach encodes a logical qubit using several physical qubits. Thanks to the entanglement of qubits (i.e., quantum correlation) and a careful choice of encoding, errors in individual qubits can be detected and corrected, preserving the quantum information. In QEC, information is deliberately and redundantly encoded across multiple qubits, so that the effects of noise on the system can be, as the name suggests, corrected.
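
The simplest textbook example of this encoding is the three-qubit bit-flip repetition code. The sketch below is a deliberately classical caricature in plain Python/NumPy: a real quantum implementation measures the two parity checks with ancilla qubits, without ever reading the data qubits directly, but the syndrome logic and the resulting suppression of the error rate are the same.

import numpy as np

rng = np.random.default_rng(1)

def encode(bit):
    """Encode one logical bit into three physical bits (bit-flip repetition code)."""
    return np.array([bit, bit, bit])

def noisy_channel(bits, p_flip):
    """Flip each physical bit independently with probability p_flip."""
    flips = rng.random(3) < p_flip
    return bits ^ flips

def correct(bits):
    """Measure the two parity checks (the syndrome) and flip the bit they point to."""
    s1 = bits[0] ^ bits[1]     # parity of qubits 1 and 2
    s2 = bits[1] ^ bits[2]     # parity of qubits 2 and 3
    if s1 and not s2:
        bits[0] ^= 1
    elif s1 and s2:
        bits[1] ^= 1
    elif s2 and not s1:
        bits[2] ^= 1
    return bits

p_flip = 0.05
trials = 100_000
failures = 0
for _ in range(trials):
    received = correct(noisy_channel(encode(0), p_flip))
    if received[0] != 0:       # logical bit read out after correction
        failures += 1

print("physical error rate :", p_flip)
print("logical error rate  :", failures / trials)   # ~3*p**2 for small p

With a 5% physical flip probability, the logical error rate after correction drops to roughly 3p², about 0.7%, which is the basic payoff of redundant encoding, bought at the cost of extra qubits.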

This is not a straightforward approach. Although the implementation of QEC is considered essential for the journey towards large-scale quantum processing, it cannot be overlooked that it requires a significant overhead. The standard error correction architecture, known as the surface code, requires at least 13 physical qubits to protect a single useful “logical” qubit. When connecting logical qubits together, this number increases: a practical processor might require 1,000 physical qubits for each logical qubit. [Detailed information on this aspect can be found in this paper]
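
To make the overhead concrete, here is one common way of counting qubits for the unrotated surface code as a function of its code distance d (a sketch under standard assumptions; variants such as the rotated surface code need somewhat fewer qubits). The "at least 13 physical qubits" quoted above corresponds to the data qubits of the smallest useful distance, d = 3; the measurement qubits come on top of that, and the totals grow quickly with the distance needed for practical error rates:

# Qubit budget of the unrotated surface code at code distance d:
# d**2 + (d-1)**2 data qubits plus 2*d*(d-1) measurement qubits, (2d-1)**2 in total.
for d in (3, 5, 11, 23):
    data = d ** 2 + (d - 1) ** 2
    measure = 2 * d * (d - 1)
    print(f"distance {d:>2}: {data:>4} data + {measure:>4} measurement "
          f"= {data + measure:>4} physical qubits per logical qubit")

At distances in the low twenties the count per logical qubit is already in the thousands, which is where figures like 1,000 physical qubits per logical qubit come from.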

It should be noted that many entities, from giants like AWS, Google, and IBM to smaller-scale operations, are working on these aspects. The development of so many noise management techniques is significant. These innovations go hand in hand with general hardware performance improvements and with increasing the number of qubits per processor, both indispensable for bringing quantum computing to large-scale use and making it comparable, if not preferable, to traditional computational systems. In particular, the aim is to have systems that can compete with high-performance computing centers, with significantly reduced energy consumption.

Comparable performance, reduced consumption, lower management costs: these are the goals. Achievable? Probably over the course of a five-year period.

Glimpses of Futures

Addressing Quantum Noise is essential if Quantum Computing is to become the ‘Next Big Thing’, supporting key activities in medical research, chemistry, and finance.

Let’s examine the prospects according to the STEPS matrix (Social, Technological, Economic, Political, Sustainability):

S – SOCIAL: Quantum computing has the potential to bring various benefits to society, including enhancing decision-making processes, accelerating the development of drugs and vaccines, and improving complex services. However, its development is not without ethical considerations, which concern responsible use, equitable access, and the potential misuse of technology. In particular, there are legitimate fears of increasing the existing digital divide between those with access to advanced technologies and those without. Consequently, there is a growing focus on defining guidelines to accompany the development of the technology.

T – TECHNOLOGY: Quantum computing is an innovative technology capable of reshaping the world. Powered by the principles of quantum mechanics, quantum computers promise to revolutionize industries, solve complex problems, and open new frontiers of knowledge.

E – ECONOMY: From a strictly economic perspective, quantum computing will have an economic impact where it is used to accelerate the resolution of complex problems. Not surprisingly, the finance sector is considered one of the potential beneficiaries of this technology. However, there are also concerns about its possible impacts on security, particularly where quantum computing is used in decryption activities. Additionally, there are significant concerns from an employment perspective, both in terms of potential job losses and, more importantly, due to the shortage of specialized personnel capable of supporting the large-scale development of this technology.

P – POLITICAL: The pursuit of quantum supremacy is pushing governments and major technology companies to invest in quantum computing research and development. The United States, China, and the European Union are leading this race, each striving to achieve the Quantum Advantage already mentioned. This commitment is driven by the desire to secure strategic advantages in fields such as cryptography, material science, and drug discovery. It is here that the main concerns arise about the potential geopolitical impact of quantum computing. In particular, in the field of cryptography, the capabilities of quantum computers could compromise the maintenance of secure communication channels used in financial transactions and the exchange of sensitive governmental and corporate information.

S – SUSTAINABILITY: On the one hand, quantum computers can significantly accelerate complex tasks, potentially reducing energy and time requirements. On the other hand, as Deloitte research explains, quantum computing will likely transform the fight against climate change. By accelerating the development of innovative solutions to tackle technical challenges, quantum and digital solutions promise to hasten the advancement towards sustainable solutions and informed decision-making.

Written by:

Maria Teresa Della Mura

Journalist