In February, two physicists made a bet on Twitter. Jonathan Dowling, a professor at Louisiana State University, and John Preskill of Caltech wagered a pizza and a beer over whether, 10 years from now, someone will have finally invented a machine of longtime physics fantasy: the so-called topological quantum computer.

Preskill bet yes; Dowling bet no. “Preskill immediately liked the idea of the bet,” says Dowling, who initiated it. “He and I have been going at it, back and forth, on this topic for some time.”

To document the agreement, Dowling typed up the terms in PowerPoint on a clip art parchment backdrop. The specific terms of the bet are laid out in a blurry image inside a tweet on the two physicists’ feeds. (“I blame the pixelation on Preskill,” says Dowling. “He resaved it as a PDF.”) They will settle the bet on March 1, 2030, at midnight, Coordinated Universal Time.

To most, the subject of their bet is fairly esoteric. But among experts, the building of a topological quantum computer has been a decades-long moon shot, first championed by academics and later taken up by Microsoft, where researchers continue to pursue its development today. “It’s so beautiful and so elegant,” says Preskill, of topological quantum computing.

Indeed, “beautiful” and “elegant” are perhaps the words physicists use most often to describe topological quantum computing. First proposed in 1997 by the Russian-American physicist Alexei Kitaev, a topological quantum computer represents information in clusters of electrons, known as non-Abelian anyons, inside a material. Theory predicts that these clusters retain a sort of memory of how they have moved around one another, so a computer could encode information in the way they are swapped. For example, with a pair of anyons, a 0 might be represented by swapping the left anyon over its right-hand neighbor, and a 1 by swapping the right anyon over the left.
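
To make the idea a little more concrete, here is a toy numerical sketch, not taken from the article or from any real device, using the Fibonacci-anyon model, one of the standard theoretical examples of non-Abelian anyons (the matrices follow one common convention). Each exchange of neighboring anyons multiplies the encoded qubit by a fixed matrix, and the order of exchanges matters:

```python
# Toy sketch: braiding three Fibonacci anyons acts on a two-dimensional fusion
# space, which serves as one qubit. Matrices follow a common textbook convention.
import numpy as np

tau = 2 / (1 + np.sqrt(5))                # inverse golden ratio

# F-matrix: relates the two natural bases of the fusion space.
F = np.array([[tau, np.sqrt(tau)],
              [np.sqrt(tau), -tau]])

# R-matrix: the effect of exchanging the first pair of anyons.
sigma1 = np.diag([np.exp(-4j * np.pi / 5), np.exp(3j * np.pi / 5)])
# Exchanging the second pair is the same operation seen in the other basis
# (F is its own inverse here, since F @ F is the identity).
sigma2 = F @ sigma1 @ F

# "Memory of movement": the two swaps do not commute, which is what
# "non-Abelian" means; the order of exchanges is itself information.
print("order of swaps irrelevant?", np.allclose(sigma1 @ sigma2, sigma2 @ sigma1))

# Every braid acts as a unitary, so the encoded qubit is shuffled, never erased.
qubit = np.array([1.0, 0.0])              # start by encoding a 0
braided = sigma2 @ sigma1 @ sigma1 @ qubit
print("norm preserved after braiding?", np.isclose(np.linalg.norm(braided), 1.0))
```

The point of the sketch is only the structure: each swap is a matrix multiplication, the matrices don’t commute, and nothing about the anyons’ exact positions enters at all.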

Physicists liken swapping two anyons to braiding two strands of hair. The bit of information is represented by which strand lies on top in the braid, not by the physical properties of the hair itself. Information encoded this way is also much harder to corrupt than information in a conventional quantum computer. The quantum bits, or qubits, of a topological machine should commit far fewer errors than qubits based on the properties of physical objects, such as the superconducting circuits that make up Google’s and IBM’s noisy quantum machines. When those machines execute complicated algorithms, for example, one circuit can accidentally change a property of its neighbor, producing errors that researchers don’t yet fully know how to correct. Topological quantum computers would be resistant to this type of error.
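
As a deliberately simplified cartoon of why that matters (an illustration of the general idea, not a model of any real qubit), compare a value stored as a continuous angle, which drifts as small disturbances pile up, with a bit stored in a whole-number crossing count, which small disturbances cannot reach:

```python
import numpy as np

rng = np.random.default_rng(0)

# Cartoon of a conventional qubit: the stored value is a continuous angle,
# so every small disturbance nudges it and the error accumulates.
angle = 0.0                                  # encode a 0 as angle 0
for _ in range(1000):
    angle += rng.normal(0.0, 0.01)
print(f"analog encoding has drifted by {abs(angle):.3f} radians")

# Cartoon of a braid-encoded bit: the same kind of disturbance only jiggles
# where the two anyons sit, not how many times their paths have crossed.
positions = np.array([0.0, 1.0])             # two anyons on a line
crossings = 0                                # encode a 0 as "no net exchange"
for _ in range(1000):
    positions += rng.normal(0.0, 0.01, size=2)   # worldlines wiggle a little...
    # ...but a crossing is a whole, deliberate exchange; no small jiggle adds one.
print("braid-encoded bit still reads", crossings % 2)
```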

Topological quantum computing exploits the branch of mathematics known as topology, hence its name. Topologists study properties of objects that stay the same despite deformation. For example, imagine shaping a piece of clay into a doughnut. You can then smoothly morph that doughnut into the shape of a coffee cup without tearing the clay or gluing pieces together. Thus, a coffee cup and a doughnut are what’s known as topologically equivalent.
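
To make “topologically equivalent” slightly more precise, mathematicians use invariants, numbers that smooth deformation cannot change. One standard example (textbook material, not something spelled out in the article) involves the genus g, the number of holes in a closed surface, and the Euler characteristic χ:

```latex
\[
\chi = 2 - 2g, \qquad
\chi_{\mathrm{sphere}} = 2 \;(g = 0), \qquad
\chi_{\mathrm{doughnut}} = \chi_{\mathrm{mug}} = 0 \;(g = 1).
\]
```

The doughnut and the coffee cup each have exactly one hole (the cup’s is in its handle), so every such invariant agrees on them; a sphere, with no hole at all, can never be smoothly deformed into either one.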

In the same way, a topological qubit will preserve its contained information as long as it remains in a topologically equivalent state, which means you can deform that qubit “as much as a doughnut is different from a coffee cup, and it still works,” says Dowling. Proponents say that such a machine would not suffer the computation errors that plague existing quantum machines, if only physicists could figure out how to build it.

Preskill learned of topological quantum computing in 1997, during Kitaev’s first visit to the US from Moscow, and he immediately fell in love with the idea. Previously, researchers thought the only way to avoid quantum computing errors was to run additional error-correcting algorithms in software, algorithms that researchers are still working to develop. Kitaev, now Preskill’s colleague at Caltech, presented a design that protects the computer from errors by virtue of the hardware itself, without the need for extra error-correcting code. His machine would use qubits that could be stretched and deformed, so to speak, while retaining their information.