πŸ“‘ The Many-Worlds Theory, Explained

Bookmarked The Many-Worlds Theory, Explained (The MIT Press Reader)

A mind-bending, jargon-free account of the popular interpretation of quantum mechanics.

With the release of Doctor Strange in the Multiverse of Madness, John Gribbin takes a dive into the history of the Many Worlds Interpretation (MWI). He describes how, ironically, Erwin SchrΓΆdinger and Hugh Everett each arrived at the idea independently. However, it was actually David Deutsch who later came up with the precise version associated with quantum computing.

What makes a quantum computer qualitatively different from a conventional computer is that the β€œswitches” inside it exist in a superposition of states. A conventional computer is built up from a collection of switches (units in electrical circuits) that can be either on or off, corresponding to the digits 1 or 0. This makes it possible to carry out calculations by manipulating strings of numbers in binary code. Each switch is known as a bit, and the more bits there are, the more powerful the computer is. Eight bits make a byte, and computer memory today is measured in terms of billions of bytes β€” gigabytes, or GB. Strictly speaking, since we are dealing in binary, a gigabyte is 2^30 bytes, but that is usually taken as read.

Each switch in a quantum computer, however, is an entity that can be in a superposition of states. These are usually atoms, but you can think of them as electrons that are either spin up or spin down. The difference is that in the superposition they are both spin up and spin down at the same time β€” both 0 and 1. Each switch is called a qubit, pronounced β€œcubit.”
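To make that concrete, here is a minimal sketch in plain NumPy (not any particular quantum-computing library; the variable names are illustrative, not from the excerpt) of a single qubit as a two-component state vector in an equal superposition of 0 and 1:

```python
import numpy as np

# One qubit's state lives in a two-component vector:
# "spin up" is the basis state |0>, "spin down" is |1>.
up   = np.array([1.0, 0.0])   # |0>
down = np.array([0.0, 1.0])   # |1>

# An equal superposition: the qubit is 0 and 1 at the same time.
# Dividing by sqrt(2) keeps the total probability equal to 1.
qubit = (up + down) / np.sqrt(2)

# Measuring collapses the superposition; the probability of each
# outcome is the squared magnitude of its amplitude.
probs = np.abs(qubit) ** 2
print(probs)  # [0.5 0.5] -- an even chance of reading 0 or 1
```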

Because of this quantum property, each qubit is equivalent to two bits. This doesn’t look impressive at first sight, but it is. If you have three qubits, for example, they can be arranged in eight ways: 000, 001, 010, 011, 100, 101, 110, 111. The superposition embraces all these possibilities. So three qubits are not equivalent to six bits (2 Γ— 3), but to eight bits (2^3). The equivalent number of bits is always 2 raised to the power of the number of qubits. Just 10 qubits would be equivalent to 2^10 bits, actually 1,024, but usually referred to as a kilobit. Exponentials like this rapidly run away with themselves. A computer with just 300 qubits would be equivalent to a conventional computer with more bits than there are atoms in the observable Universe. How could such a computer carry out calculations? The question is all the more pressing since simple quantum computers, incorporating a few qubits, have already been constructed and shown to work as expected. They really are more powerful than conventional computers with the same number of bits.
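To see that exponential scaling in action, here is a rough sketch (again plain NumPy; the Hadamard gate and variable names are my own illustration, not Gribbin's) showing that the joint state of n qubits is a vector of 2^n amplitudes, all eight of which are occupied at once for n = 3:

```python
import numpy as np
from functools import reduce

# Hadamard gate: sends one qubit into an equal superposition of 0 and 1.
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)

n = 3  # number of qubits

# The joint state of n qubits is a vector of 2**n amplitudes.
state = np.zeros(2 ** n)
state[0] = 1.0                       # start in |000>
H_all = reduce(np.kron, [H] * n)     # one Hadamard per qubit, 8x8 for n = 3
state = H_all @ state

# All 2**n = 8 arrangements (000 through 111) are present at once,
# each with amplitude 1/sqrt(8).
for i, amp in enumerate(state):
    print(f"{i:0{n}b}: {amp:.4f}")
```

Setting n = 300 in this sketch would demand a vector of 2^300 amplitudes, more numbers than any conventional memory could hold, which is exactly the runaway scaling the excerpt describes.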

This is an excerpt from Gribbin’s book Six Impossible Things.
