Chapter 1. New Ways to Think about Bits
Classical computing is everywhere. The phone in your pocket, the laptop on your desk, and the world's fastest supercomputers all use classical computing hardware. A classical computer encodes everything in bits, and bits are quite simple. The rules for dealing with bits have been known since the mid-1800s. (George Boole published his groundbreaking work on logic in 1854.) The computer hardware that manipulates bits has developed steadily since the mid-1940s.
Before you read about quantum computing, you need to be comfortable with certain mathematical concepts. With that in mind, this chapter shows you how math applies to ordinary, classical bits.
If you’ve done any coding, you’re already familiar with the and, or, and not operators, ...
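As a quick refresher, here is a minimal sketch of how those three operators act on classical bits in Python. The snippet is my own illustration, not code from the book; it treats a bit as the integer 0 or 1 and uses Python's built-in bitwise operators.

```python
# Print the truth tables for the classical bit operators AND and OR.
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} AND {b} = {a & b}    {a} OR {b} = {a | b}")

# NOT acts on a single bit, flipping 0 to 1 and 1 to 0.
for a in (0, 1):
    print(f"NOT {a} = {1 - a}")
```

Running this prints all four rows of the AND and OR truth tables, followed by the two rows of the NOT table.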