Quantum Futures

Rob Bell

Issue 15, September 2018

Imagine a world where Quantum computing is the norm...

It’s not that long ago that we couldn’t possibly have conceived of smartphones in our pockets with more computing power than was used to fly an entire Apollo mission to the moon and back.

While it’s difficult for us to imagine (if you’re younger) or remember (if you’re older), the Apollo guidance computer operated at just over 1MHz with 4kB RAM and 32kB storage. That’s less than we pack into a single DIP8 ATtiny, certainly less than an Arduino UNO, and barely a dot on the horizon compared to a modern iPhone or laptop computer. But all of that pales in comparison to where we’re headed.

Many of us have laughed at the joke: “There are 10 types of people in the world. Those who understand binary, and those who don’t”. Well, qubits (pronounced cue-bits) throw all that out the window. A qubit doesn’t have a binary state. It can be a one, a zero, or a superposition of both at once!

That upends the whole binary approach to programming and execution. But it also means you can do a lot more with a single qubit than you can do with a single binary bit.
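The idea can be sketched with ordinary Python - not a real quantum simulation, just the rule that a qubit carries two “amplitudes”, and the squares of their magnitudes give the odds of measuring a zero or a one (the function name here is ours, purely for illustration):

```python
import math

# A qubit's state can be written as a pair of amplitudes (alpha, beta):
# measuring it gives 0 with probability |alpha|^2 and 1 with probability |beta|^2.
# A classical bit is the special case where one amplitude is 1 and the other 0.

def measurement_probabilities(alpha, beta):
    """Return the probability of reading 0 and of reading 1."""
    p0 = abs(alpha) ** 2
    p1 = abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0), "amplitudes must be normalised"
    return p0, p1

# An equal superposition: a 50/50 mix of one and zero at once.
alpha = beta = 1 / math.sqrt(2)
print(measurement_probabilities(alpha, beta))  # roughly (0.5, 0.5)
```

A classical bit only ever sits at `(1, 0)` or `(0, 1)`; the continuum in between is what the binary joke can’t capture.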

The inner workings of quantum computing are outside the scope of this column, but it’s coming, and it’s going to change everything!


One of the ways computational power advances our capabilities so drastically is with complicated mathematical problems. No, this isn’t trying to cheat on a high school maths exam, this is the basis for much of the cryptography on the internet.

What provides the security is, essentially, that there are no known shortcuts to solving certain complex maths problems. The only way to find the correct answer is to work through the possibilities until you hit it.

Let’s take something human-computable.

If you take the number 6, you can pretty easily work out that it’s the product of the prime numbers 2 and 3.

But let’s now take the number 256,027. This is the product of the prime numbers 503 and 509. That just got a lot harder.

Could you work that out in your head? Eventually, perhaps, and faster with a calculator and notepad. A computer could do it in a fraction of a second. However, things become exponentially more difficult as the factors grow larger.
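To see what “work through all possibilities” means in practice, here’s a minimal brute-force factoriser in Python (the function name is ours, not from any library). It does exactly what you’d do with a notepad - try each divisor in turn:

```python
def factor_by_trial_division(n):
    """Brute-force a factor of n by trying every candidate in turn -
    the 'work through all possibilities' approach, with no shortcut."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d  # found a factor pair
        d += 1
    return n, 1  # no divisor found: n itself is prime

print(factor_by_trial_division(6))       # (2, 3)
print(factor_by_trial_division(256027))  # (503, 509)
```

At this scale the loop finishes instantly; at the scale of real cryptographic keys, the same loop would outlive the universe.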

What if we take this 30-digit prime number: 671998030559713968361666935769

You can barely fathom how to say it, because everyday English doesn’t really deal with numbers like these.

How about a 100-digit prime number: 2074722246773485207821695222107608587480996474721117292752992589912196684750549658310084416732550077

Yes - that is a prime number. Divisible by itself, one, and nothing else.

It’s tempting to think there’s an easy recipe for large primes. One famous family is the Mersenne numbers, 2^x-1 - that is 2, to the power of some number x, minus 1. However, only certain values of x produce a prime (most don’t), so even generating large primes takes genuine computational effort.
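Testing the 2^x-1 formula quickly in Python shows it needs a caveat: it only yields a prime for certain values of x (these are the famous Mersenne primes). The `is_prime` helper below is ours, a simple trial-division check:

```python
def is_prime(n):
    """Simple trial-division primality check (fine for small n)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Check 2^x - 1 for a run of exponents.
for x in range(2, 12):
    m = 2 ** x - 1
    print(f"2^{x} - 1 = {m}: {'prime' if is_prime(m) else 'not prime'}")
```

Running it shows 3, 7, 31 and 127 are prime, but 15, 63, 255, 511, 1023 and even 2047 (with prime exponent 11) are not - so 2047 = 2^11 - 1 is a nice counterexample to any “plug in x, get a prime” shortcut.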

Alone, these prime numbers aren’t particularly special, but interesting things happen when you multiply them together as we’ve already discovered, especially when the prime numbers are rather large.


When you multiply two large primes together, you get a very large non-prime number. Determining which primes were used to create it is exceptionally difficult, and there’s no known shortcut. If you have the answer, you can “validate” its accuracy almost instantly with a single multiplication. If you have to brute-force your way in, however, you have no other method available than to run the maths!
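This asymmetry is easy to demonstrate in Python, even at our toy scale: checking a claimed answer is one multiplication, while finding it from scratch takes hundreds of trial divisions here (and an astronomically larger number at real key sizes):

```python
n = 256027

# Validating a claimed answer takes a single multiplication.
claimed_factors = (503, 509)
print(claimed_factors[0] * claimed_factors[1] == n)  # True

# Finding that answer with no shortcut means trying divisor after divisor.
divisor = 2
trials = 0
while n % divisor != 0:
    divisor += 1
    trials += 1
print(f"smallest factor {divisor} found after {trials} trial divisions")
```

For our six-digit example that’s 501 failed attempts before the first hit; for a 600-digit modulus, the same search is hopeless on any classical machine.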

While the implementation of this theory varies between different methods of cryptography, the fundamental basis of their security comes down to the computational power required to break them. It’s not that they cannot be “guessed” - we know they can be - it’s that the computational power required to do it is so significantly large that it’s virtually impossible.

This is the basis for much of the internet’s encryption, namely RSA (Rivest-Shamir-Adleman - the three inventors of the methodology).
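As a sketch of how that one-way maths becomes encryption, here is “textbook” RSA in a few lines of Python, reusing our small primes from earlier. This is only the bare idea - real RSA uses primes hundreds of digits long plus padding schemes this toy deliberately omits, and it is utterly insecure at this size:

```python
# Textbook RSA with small primes (for illustration only - far too small to be secure).
p, q = 503, 509
n = p * q                # public modulus; anyone can see this
phi = (p - 1) * (q - 1)  # computing phi requires knowing the secret factors
e = 3                    # public exponent (must share no factors with phi)
d = pow(e, -1, phi)      # private exponent via modular inverse (Python 3.8+)

message = 42
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
recovered = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
print(recovered == message)        # True
```

The whole scheme stands on the fact that deriving `d` requires `phi`, and `phi` requires the secret factors `p` and `q` - exactly the factoring problem above.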

It is only a matter of time, though, before quantum computing technology will be able to defeat the RSA methodology. Why? Because quantum algorithms such as Shor’s algorithm can factor the large numbers at the heart of RSA exponentially faster than any known classical approach, recovering the private key in a tiny fraction of the time. Once the private key can be recovered in a short space of time, the entire basis for these systems is no longer sound.

Even without quantum computing, the size of the numbers we’re dealing with in encryption is increasing steadily. As the computational power of standard transistor-based chips keeps increasing, so must the size of the keys required for security.


While we might picture supercomputers sitting there calculating Pi to the nth decimal place, there are thousands of real-world applications for this kind of processing power.

But what about extensive computer modelling, predicting weather patterns, or working out where an asteroid will be in 10,000 years? All of these things demand exceptionally high amounts of processing power, crunching through massive formulae.

Graphics rendering and modelling is mostly based around mathematics too. Think of where we’ve come in the last 20 to 30 years with video game graphics. We’ve gone from black and white 2D games to fully immersive 3D VR (or even just multi-screen UHD HDR) in the blink of an eye.

All that is made possible by ever-increasing computational power. While you might not think of it as maths, the processors are constantly calculating polygon shapes, lighting effects, and everything else that makes your graphics look better. CGI can now look better than reality, and that’s only going to continue.


Just think - if you could mine the entire array of tokens for a cryptocurrency with a single computer in a fraction of a second, what knock-on effect would that have for the blockchain, and indeed all cryptography? Sure, quantum computers could create vastly more complex cryptography in the same way, but they could also instantly render current technologies obsolete.

There are currently numerous efforts to create quantum-resistant (“post-quantum”) cryptography methods since inevitably, we’re going to need them!

Stacking qubits together creates challenges, but the companies working on quantum computing technology are already making huge leaps and bounds.