You are reading a sample from the TechMIX newsletter, in which Pavel Kasík and Matouš Lázňovský bring a handful of comments and observations from the world of science and new technology every Wednesday. If TechMIX interests you, subscribe!
Of course, the company used the crossing of this symbolic boundary for publicity, but the truly interesting milestones in the development of quantum computers lie elsewhere. After all, IBM’s own roadmap for the further development of its quantum devices shows that it will place greater emphasis on other aspects of the technology.
Let us recall that quantum computers are expected to be significantly more efficient than classical ones at solving a certain range of problems, for example in cryptography. In a quantum processor, bad solutions are “cut away” faster, so in some cases (far from all) the goal can be reached with significantly less computing power.
“Some people in the industry say that quantum computers can solve practically anything. I have also seen materials that, without exaggeration, promised a breakthrough in cancer treatment, but in reality the range of problems in which they promise significant improvement is quite narrow,” says Jakub Mareček of the Czech Technical University, who worked for IBM in the past and still collaborates with its research team on some projects.
Although quantum computers have been talked about for decades, and quantum processors and computers can even be bought (in very limited numbers and at extremely high prices), a truly practical computer of this type does not yet exist. But progress in the field has been unexpectedly rapid of late, and after years of promises it now looks as though quantum computers may arrive sooner than expected.
However, it is only a possibility, not a certainty. “There are still too many unknowns and a great deal of uncertainty in play,” adds Mareček.
Three different paths
The development of quantum computers is proceeding along several paths. The first is the aforementioned increase in the number of qubits per processor. This is undoubtedly an important step. In principle, small processors can only handle small problems, and today’s advanced silicon digital processors are sufficient for those, even if they do not solve them optimally. Working with a handful of qubits is relatively well mastered, and such machines are not difficult to build; they are simply of no practical use.
In recent years, however, engineers and designers have managed to increase the number of quantum bits per chip relatively quickly, roughly doubling it each year. Last year IBM’s record processor had 433 qubits; this year’s Condor model already has 1121 superconducting qubits in a honeycomb structure.
However, increasing raw capacity is only one of the necessary conditions. The other, and so far more problematic one, is catching the errors that quantum computers are full of.
Part of the error rate stems from the very principle of the device: not every qubit state can be captured in a form that is easy to work with mathematically, so certain “roundings” are necessary and inevitably introduce errors. At the same time, the quantum devices themselves are inherently very sensitive, and various errors and noise arise in them as a consequence of how they are built.
As a result, the error rate is so high that in today’s quantum computers, even after a relatively small number of operations, nothing but noise remains of the original values. In practice these devices are therefore unusable, despite the very bold claims of some developers and companies.
How to get rid of the errors? A proven approach is to make computations more robust by combining several physical qubits (each encoded in, say, a superconducting circuit or a single ion) into larger “logical qubits.” These are designed to detect errors and enable their correction.
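To get a feel for the redundancy principle, here is a minimal sketch using the classical three-bit repetition code with majority voting. Real quantum error correction is far subtler, since quantum states cannot simply be copied, but the basic trade of several noisy physical units for one more reliable logical unit is the same; the error probability below is purely illustrative.

```python
# A toy illustration of the redundancy idea behind logical qubits,
# using the classical 3-bit repetition code with majority voting.
import random

def encode(bit):
    """One logical bit -> three physical bits."""
    return [bit, bit, bit]

def add_noise(bits, p):
    """Flip each physical bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    """Majority vote: correct as long as at most one bit flipped."""
    return int(sum(bits) >= 2)

p, trials = 0.05, 100_000
raw = sum(add_noise([1], p)[0] != 1 for _ in range(trials))
corrected = sum(decode(add_noise(encode(1), p)) != 1 for _ in range(trials))
print(f"raw error rate:       {raw / trials:.2%}")        # ~5%
print(f"corrected error rate: {corrected / trials:.2%}")  # ~0.7%
```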
Such inefficient use of qubits is a good reason why it is important to keep increasing the capacity of quantum computers. If we had a machine with billions of qubits at our disposal, sacrificing a large portion of them to “quality control” would not be a problem.
Unfortunately, we are not in that position. If capacity keeps growing at the current rate, quantum computers could reach the order of billions of qubits in about 20 years. That is, if the current pace of development can be maintained, which is not at all certain.
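The arithmetic behind that estimate is easy to check; a back-of-envelope sketch, assuming a strict annual doubling from Condor’s 1121 qubits (the big “if” mentioned above):

```python
# Back-of-envelope check of the "billions in about 20 years" estimate,
# assuming the qubit count keeps doubling every year (the big "if").
qubits, year = 1_121, 2023  # IBM Condor
while qubits < 1_000_000_000:
    qubits *= 2
    year += 1
print(f"~{qubits:,} qubits around {year}")  # ~1.18 billion around 2043
```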
Fewer inaccurate parts and less demanding repairs
Another way is to reduce the number of errors in the qubits themselves. “The problem is not in one single qubit, but in the error rate of a gate composed of two qubits, which is the basic building block that computers are made of,” explains Jakub Mareček.
Progress has been made in this area too in recent years, though significantly slower than in increasing the number of qubits per processor. This year, however, the field received a pleasant surprise: in December, IBM presented, alongside its record-breaking Condor chip, a significantly smaller chip called Heron. Although it has “only” 133 qubits, it boasts a record-low error rate, three times lower than that of IBM’s previous quantum processor. The error rate is still not as low as it needs to be, but it is definitely an unexpected and welcome step in the right direction, one that gives hope for the future.
The last piece of the puzzle that gives hope for an unexpectedly fast transition to quantum computing is the “soft” part of computers, i.e. software. In addition to sufficient capacity and robust qubits, the reliability and practical usability of quantum computers must also be ensured by what “runs” on them, i.e. the algorithms such machines are supposed to use.
Since the main problem in this area is errors, a key role is played by the development of quantum error-correcting codes, which are to guarantee the reliability of the results (provided, that is, that the processor does not make too many errors in the first place, which the aforementioned more robust logical qubits, for example, should ensure). Development in this area, however, had not moved much. Until recently, estimates held that guaranteeing a sufficiently low error rate would require about 1000 times more physical qubits than “logical” ones.
In that case, quantum computing would carry a thousandfold overhead. That means, for example, that the new Condor chip would in practice be not a 1000-qubit machine but a one-qubit one. Millions of qubits would shrink to thousands, billions to millions, and we would be waiting decades for truly powerful quantum computers.
Recently, however, the field has been excited by an alternative correction scheme based on low-density parity-check codes (abbreviated qLDPC, for “quantum low-density parity-check code”). It is a self-correction technique commonly used in classical communications, but until now no one had managed to adapt it to quantum devices.
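To illustrate the classical mechanism being borrowed here: a parity-check matrix defines a set of constraints that every valid codeword satisfies, and the pattern of violated checks (the “syndrome”) reveals where an error occurred. A minimal sketch using the small (7,4) Hamming code as a toy stand-in; real LDPC codes use much larger, sparser matrices, and qLDPC adds quantum-specific constraints on top:

```python
# A toy look at the classical mechanism qLDPC borrows: a parity-check
# matrix H defines constraints every valid codeword satisfies, and the
# "syndrome" H @ received (mod 2) flags which checks a noisy word breaks.
import numpy as np

H = np.array([
    [1, 0, 1, 0, 1, 0, 1],   # check 1: parity of bits 1, 3, 5, 7
    [0, 1, 1, 0, 0, 1, 1],   # check 2: parity of bits 2, 3, 6, 7
    [0, 0, 0, 1, 1, 1, 1],   # check 3: parity of bits 4, 5, 6, 7
])

codeword = np.array([1, 1, 0, 0, 1, 1, 0])
assert not (H @ codeword % 2).any()  # a valid codeword passes all checks

received = codeword.copy()
received[4] ^= 1                     # flip bit 5 to simulate noise
syndrome = H @ received % 2
print("syndrome:", syndrome)         # [1 0 1] -> binary 101 = position 5
```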
In the last two years, a growing number of results have indicated that this complex and rather exotic problem may well be solvable, and that this resource-saving technique could be deployed on larger quantum processors as well. That could mean a major reduction in the “checking backup” needed per qubit. A team of IBM experts recently concluded that the ratio could be around 25:1 instead of 1000:1, which would let powerful quantum computers “mature” faster than recently expected.
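What that change in ratio would mean in practice, using the article’s figures (purely illustrative arithmetic):

```python
# What the improved overhead would mean in practice, using the
# article's figures (1000:1 vs. 25:1); purely illustrative arithmetic.
for overhead in (1000, 25):
    for physical in (1_121, 1_000_000, 1_000_000_000):
        print(f"{physical:>13,} physical @ {overhead:>4}:1 "
              f"-> {physical // overhead:>10,} logical qubits")
```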
It is still too early for celebration or definitive confirmation. There are too many uncertainties and unknown variables in play. Deployment of the qLDPC technique is still largely theoretical, or realistic only on the simplest models. It may take years to master the codes well enough to consider implementing them on more powerful machines.
Nor will it work everywhere. To use this type of code, each physical qubit must be connected to at least six others; on current IBM chips, each is connected to only two, so a number of things will have to change even at the hardware level. However, the field is still in its infancy, and since developers are not yet locked into production requirements, the willingness to change is high.
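In code terms, the constraint is just a degree check on the chip’s coupling map (the list of qubit pairs joined by a two-qubit gate). A minimal sketch with a made-up toy layout, not a real IBM coupling map:

```python
# A minimal sketch of the connectivity constraint: given a chip's
# coupling map, check whether every qubit reaches the degree >= 6
# that qLDPC-style codes need. The toy layout below is illustrative.
from collections import Counter

coupling_map = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # toy 4-qubit chip
degree = Counter()
for a, b in coupling_map:
    degree[a] += 1
    degree[b] += 1

REQUIRED = 6
for q in sorted(degree):
    verdict = "OK" if degree[q] >= REQUIRED else "too few neighbours"
    print(f"qubit {q}: degree {degree[q]} -> {verdict}")
```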
You can find much more in the full version of the TechMIX newsletter. Subscribe to receive it in your inbox every Wednesday.