Why comparing qubits to bits gets quantum computing completely wrong

A qubit is not a classical bit that can be both 0 and 1! A qubit doesn't store values at all; it defines probabilities.

October 9, 2025

Dear Quantum Machine Learner,

Most explanations of quantum computing start with the same line: a qubit is like a bit that can be both 0 and 1 at once. It sounds catchy. But it's wrong.

That idea sends people down the wrong path before they even begin. I am certainly not the only one pointing out that a qubit is not 0 and 1 at the same time.

But I might be the only one telling you that a qubit is not 0 or 1, either. This is because a qubit doesn't store binary values. It only generates binary outcomes when you measure it.
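
In the standard formalism (the notation here is my addition, not from the post), a qubit's state is a pair of complex amplitudes, and the Born rule turns those amplitudes into outcome probabilities:

$$ |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1 $$

Measuring in the computational basis yields 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$. The state defines a distribution over outcomes; it does not hold a 0 or a 1 waiting to be read out.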

What may seem like splitting hairs is actually crucial. There is a clear distinction between what a qubit is and what it produces as an outcome. The mere fact that it produces something should already raise doubts that it could be anything like a classical bit.
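
To make that distinction concrete, here is a minimal sketch in plain NumPy (my illustration, not code from this post or any particular quantum SDK): the qubit is represented by two amplitudes, and measurement samples a binary outcome from the probabilities those amplitudes define.

```python
import numpy as np

# A qubit state is a pair of complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Nothing here "stores" a 0 or a 1.
state = np.array([np.sqrt(0.3), np.sqrt(0.7)], dtype=complex)

# The Born rule turns amplitudes into outcome probabilities.
probs = np.abs(state) ** 2  # -> [0.3, 0.7]

# Measurement *generates* a binary outcome rather than reading out
# a stored value. Measuring 1000 identically prepared qubits
# samples from this distribution.
rng = np.random.default_rng(seed=42)
outcomes = rng.choice([0, 1], size=1000, p=probs)

print("P(0) =", probs[0], " P(1) =", probs[1])
print("Empirical frequency of 1:", outcomes.mean())
```

Notice that the binary values exist only in `outcomes`, after sampling; the state itself is just the distribution they are drawn from.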

If you’ve ever struggled with the both 0 and 1 analogy, today's post will replace it with a mental model that actually makes sense: one that is consistent with how quantum computers really work. Of course, I also prepared a PDF companion for this post.


—Frank Zickert
Author of PyQML