Characteristics of quantum computers

"The power of a quantum computer is measured in qubits, the basic unit of measurement in a quantum computer." (source)

I facepalm every time I read a sentence like this. It has done me no good: my eyesight is starting to fail, and I'll have to contact Meklon soon.

I think it's time to systematize the main parameters of a quantum computer a little. There are several:

  1. Number of qubits
  2. Coherence retention time (decoherence time)
  3. Error rate
  4. Processor architecture
  5. Price, availability, maintenance, depreciation time, programming tools, etc.

Number of qubits

Everything is obvious here: the more, the better. In reality you have to pay for qubits, so ideally you buy exactly as many as the task requires. For a developer of exclusive slot machines, one qubit per machine is enough (to generate honest randomness). For brute-forcing RSA-2048, at least 2048 qubits.
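
To make that concrete, here is a minimal Q# sketch of such a one-qubit random generator (the namespace and operation name are mine):

```qsharp
namespace Sketches {
    open Microsoft.Quantum.Intrinsic;

    // One qubit is enough for honest randomness: put it into an
    // equal superposition and measure.
    operation NextRandomBit() : Result {
        use q = Qubit();   // allocated in |0>
        H(q);              // |0> -> (|0> + |1>)/sqrt(2)
        let r = M(q);      // collapses to Zero or One, 50/50
        Reset(q);          // qubits must be returned in |0>
        return r;
    }
}
```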

The most publicized quantum algorithms are named after Grover and Shor. Grover's lets you "crack" hashes: to break Bitcoin you need a computer with at least 256 qubits on board (you can quibble about Bitcoin's actual parameters, but let's settle on this round number). Shor's lets you factor numbers: factoring an n-bit number requires at least n qubits.

Current maximum: 50 qubits (or already 72?). And in fact, 50 qubits is the limit: the limit of quantum computer simulation. In theory we can simulate any number of qubits on classical computers. In practice, adding one qubit to a simulation requires doubling the classical computing resources. Throw in the rumors about qubit counts doubling every year, and ask yourself: how do you debug algorithms for 256, 512, 1024, 2048 qubits? There is no simulator, and you can't set a breakpoint on a quantum processor.
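
The doubling is easy to verify. A rough estimate, assuming a dense state vector with 16-byte double-precision amplitudes (namespace and function name are mine):

```qsharp
namespace Sketches {
    open Microsoft.Quantum.Convert;
    open Microsoft.Quantum.Math;

    // A state vector of n qubits holds 2^n complex amplitudes,
    // 16 bytes each in double precision.
    function SimulationBytes(n : Int) : Double {
        return 16.0 * PowD(2.0, IntAsDouble(n));
    }
    // SimulationBytes(50) is about 1.8e16 bytes, i.e. ~18 petabytes
    // of RAM; every extra qubit doubles it.
}
```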

Coherence retention time (decoherence time)

Coherence and decoherence are not the same thing. I like to compare coherence with RAM refresh. A RAM stick has billions of cells, each holding a charge: zero or one. This charge has a very interesting property: it leaks away. An initially "one" cell becomes a 0.99 cell, then 0.98, and so on; correspondingly, a zero accumulates 0.01, 0.02, 0.03... So the charge has to be renewed, "regenerated": anything below one half is reset to zero, everything above is pushed back up to one.

Quantum processors cannot be regenerated. Accordingly, you get one cycle for all your calculations, until the first qubit "leaks". The time until the first "leak" is called the decoherence time, and coherence is the state in which the qubits have not "leaked" yet. A slightly more grown-up explanation can be found here.

Decoherence is related to the number of qubits: the more qubits, the harder it is to maintain coherence. On the other hand, with a large number of qubits, some of them can be used to correct the errors caused by decoherence. It follows that the number of qubits by itself does not prove anything. You can double the number of qubits and spend 90% of them fighting decoherence.

This is where the concept of a logical qubit comes in. Roughly speaking, if you have a processor with 100 qubits, but 40 of them are dedicated to fixing decoherence, you have 60 logical qubits left: the ones on which you run your algorithm. The concept of logical qubits is still rather theoretical; I personally have not heard of practical implementations.
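
In code, this rough model is a single subtraction (namespace and function name are mine):

```qsharp
namespace Sketches {
    // In this simplified model, logical qubits are what remains
    // after the error-correction overhead is subtracted.
    function LogicalQubits(physical : Int, correction : Int) : Int {
        return physical - correction;
    }
    // LogicalQubits(100, 40) == 60: the qubits your algorithm gets.
}
```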

Errors and their correction

Errors are another scourge of quantum processors. If you invert a qubit, there is a 2% chance the operation will fail. If you entangle 2 qubits, the error rate rises to 8%. Now take a 256-bit number, hash it with SHA-256, count the number of operations involved, and calculate the probability of performing ALL of those operations without a single error.
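
You don't have to take this on faith; here is a back-of-the-envelope check, assuming independent per-gate errors (namespace and function name are mine):

```qsharp
namespace Sketches {
    open Microsoft.Quantum.Convert;
    open Microsoft.Quantum.Math;

    // Probability that all n gates succeed if each one fails
    // independently with probability p.
    function AllSucceed(p : Double, n : Int) : Double {
        return PowD(1.0 - p, IntAsDouble(n));
    }
    // AllSucceed(0.02, 1000) is about 1.7e-9: a thousand single-qubit
    // gates at a 2% error rate essentially never all succeed.
}
```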

Mathematicians offer a solution: error correction. There are algorithms for it. Implementing a single entanglement of 2 logical qubits requires 100,000 physical qubits. The bitcoin apocalypse will not come any time soon.

Processor architecture

Strictly speaking, there are no quantum computers. There are only quantum processors. Why would you need RAM when the working time is limited to milliseconds? I program in Q#, but it is a high-level language: allocate yourself 15 qubits and do whatever you want with them. Want to? Entangle the first qubit with the tenth. Feel like it? Entangle the first six.
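
Something like this (a simulator-only sketch; the namespace and operation name are mine):

```qsharp
namespace Sketches {
    open Microsoft.Quantum.Intrinsic;

    // On a simulator, connectivity is unconstrained: any qubit can
    // be entangled with any other directly.
    operation EntangleAnyPair() : Unit {
        use qs = Qubit[15];
        H(qs[0]);
        CNOT(qs[0], qs[9]);   // the first with the tenth, one gate
        ResetAll(qs);
    }
}
```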

On a real processor there is no such freedom. Ask to entangle the first qubit with the fifteenth, and the compiler will generate 26 extra operations. If you're lucky. If you're not, it will generate a hundred. The point is that a qubit can only be entangled with its neighbors, and I have not seen more than 6 neighbors per qubit. In principle, there are compilers that optimize quantum programs, but they are still rather theoretical.
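
What the compiler does behind your back looks roughly like this, assuming a linear nearest-neighbor layout (a hand-written sketch, not real compiler output; names are mine):

```qsharp
namespace Sketches {
    open Microsoft.Quantum.Canon;
    open Microsoft.Quantum.Intrinsic;

    // Entangle q[0] with q[3] when only neighbors can interact:
    // drag the control over with SWAPs, apply CNOT, drag it back.
    operation DistantCNOT(qs : Qubit[]) : Unit {
        SWAP(qs[0], qs[1]);
        SWAP(qs[1], qs[2]);   // the control now sits next to qs[3]
        CNOT(qs[2], qs[3]);
        SWAP(qs[1], qs[2]);   // restore the original layout
        SWAP(qs[0], qs[1]);
    }
    // Each SWAP is itself 3 CNOTs, so this one logical CNOT costs
    // 13 physical ones.
}
```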

Each processor has its own instruction set, and the connections between qubits differ. In an ideal world we have arbitrary Rx, Ry, Rz rotations and combinations of them, plus free entanglement across a dozen qubits, plus Swap: look at the operators in Quirk. In real life we have a few pairs of qubits, and the entanglement CNOT(q[0], q[1]) costs one operation while CNOT(q[1], q[0]) already takes 7. And meanwhile the coherence melts away...
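
The standard trick for the "expensive" direction is to conjugate with Hadamards; in Q# it looks like this (a sketch; real hardware would still decompose it into its own native gates):

```qsharp
namespace Sketches {
    open Microsoft.Quantum.Intrinsic;

    // CNOT(b, a) equals (H x H) CNOT(a, b) (H x H): if the hardware
    // supports only one direction, the other costs four extra gates.
    operation ReversedCNOT(a : Qubit, b : Qubit) : Unit {
        within {
            H(a);
            H(b);
        } apply {
            CNOT(a, b);   // the hardware-native direction
        }
    }
}
```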

Price, availability, maintenance, depreciation time, programming tools…

Prices are not advertised, availability to the ordinary citizen is near zero, depreciation time is not calculated in practice, and the programming tools are only just emerging. The documentation lives on arxiv.org.

So what information should we demand from the experts when a new quantum computer is released?

Apart from the list above, I like the suggestions from Perl Power and age2:

If only every article about a new quantum computer began with two characteristics: the number of simultaneously entangled qubits and the qubit retention time.

Or better still, with the execution time of the simplest benchmark, for example finding the prime factors of the number 91 (= 7 × 13).

Source: habr.com
