Quantum Computing: Past the Buzzwords
So many buzzwords surround the world of quantum computing, and yet so few people know what it really is. Learn a bit about two types of quantum computers, and walk through a simple quantum algorithm. It might be simpler than you think.
What are quantum computers? We've all heard the buzzwords and we can't avoid the hype. Sure, they are based on "quantum superposition" and use magical "qubits" that are neither a zero nor a one, but somehow a blend of both. All these buzzwords and technical terms are just names for things. Richard Feynman believed strongly (and pontificated regularly) that "names don't constitute knowledge." The short video below is from 1973 and has Feynman explaining this in his own inimitable style:
Feynman was one of the finest minds in the rarefied world of theoretical particle physics, and in my opinion he will remain so for the ages. His books are a delight to read, and he gave a series of lectures that were captured on audiotape. Amazon has a variety to select from, and Bill Gates has reportedly put them on the web (sadly, in a proprietary IE/Silverlight format). You can still Google a large number of free, very interesting YouTube videos. But I digress.
Feynman first proposed the idea of a "quantum computer" in 1982 in his keynote paper Simulating Physics with Computers published in the International Journal of Theoretical Physics. (Due credit: Feynman based his ideas on some earlier work done by Yuri Manin in 1980.) It is typical Feynman in that it is fun, irreverent, clear, insightful, educational, and especially thought-provoking.
Adiabatic Quantum Computers
Back to talking about what quantum computers are and are not. There is a computing system manufactured by D-Wave Systems that has been on the market since 2011. While the system is based on quantum principles, much of the quantum computing community does not consider it general-purpose enough to qualify as a true quantum computer. It uses the adiabatic principle, which allows the result bits to "anneal" toward a pattern matching a solution.
The concept of annealing is taken from physical processes such as metalworking. When metal is formed with a hammer and anvil the metal shape (pattern) changes, but as a side effect, many microscopic dislocations in the crystalline structure build up, causing stress and brittleness. Heating the metal permits some of the atoms to move enough to shift their positions and find lower-energy positions in the metallic crystal. In a similar way, the adiabatic quantum computer's bits are "stressed" with a pattern and then allowed to relax (anneal) into a solution close to the pattern. Note: it does not have a sense of computational completeness (e.g., exhaustive search) but rather a sense of approximation (close fit). You might even think of it as a conceptual bridge between digital and analog computation.
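Quantum annealing itself can't be reproduced on a laptop, but its classical cousin, simulated annealing, captures the same relax-into-a-low-energy-state intuition. Here is a minimal Python sketch; the energy landscape and cooling parameters are invented purely for illustration:

```python
import math
import random

def anneal(energy, start, steps=10_000, temp=10.0, cooling=0.999):
    """Classical simulated annealing: always accept downhill moves, and
    accept uphill moves with probability exp(-dE/T) so the system can
    escape local minima while it is still 'hot'."""
    x = start
    for _ in range(steps):
        candidate = x + random.choice([-1, 1])
        dE = energy(candidate) - energy(x)
        if dE < 0 or random.random() < math.exp(-dE / temp):
            x = candidate
        temp *= cooling  # cool slowly, like metal relaxing
    return x

# A toy "stress" landscape: a shallow local minimum at x=0 (energy 4)
# and the true minimum at x=7 (energy 0), with a barrier between them.
f = lambda x: min((x - 7) ** 2, x * x + 4)

random.seed(42)
print(anneal(f, start=0))  # usually escapes x=0 and settles at 7
```

A greedy "hill climbing" search started at 0 would be stuck forever; the early high-temperature jostling is what lets the state cross the barrier before the system freezes.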
One of the earlier tests of the D-Wave technology was solving a sudoku puzzle: imagine the numbers you can see initially as the input, with the empty boxes being jostled with the other possible numbers and settling (crystallizing?) on a solution. It has yet to be demonstrated (and probably won't be) that any of the hoped-for massive speedups are possible. The company is focused on simply demonstrating a modest constant-factor advantage over classical computers.
Quantum Gate Array Computers
On the surface, this type of quantum computer would seem much more familiar to those of us versed in classical computing. There is the concept of registers holding numbers, and there are concepts of adding, multiplying, raising a number to a power, etc. The only difference (and it's a really, really big one) is that a register can represent every possible number at once. In a classical computer a three-bit register can represent any one number between zero and seven (e.g., 101), but in a quantum gate array computer a three-bit register can represent every number between zero and seven simultaneously. If you follow this idea further and multiply two quantum registers together, you get a resulting register which contains every possible product of all of those potential inputs. I can already sense that you are asking (maybe even out loud): How on earth can that possibly help? You input a gobbledygook of numbers and you get gobbledygook times gobbledygook ... all of them! What can you do with that?
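To make the "every number at once" idea a bit more concrete, here is a toy classical simulation of a three-bit register in equal superposition. This is just bookkeeping with amplitudes in plain Python, not real quantum hardware, and the multiplication step is shown as the brute-force enumeration a simulator would perform:

```python
from math import sqrt

# A 3-qubit register's state is a list of 8 amplitudes, one per value
# 0..7. Putting every qubit into superposition weights all eight
# values equally.
n = 3
amp = 1 / sqrt(2 ** n)
state = [amp] * (2 ** n)

# On measurement, each value 0..7 appears with probability |amplitude|^2,
# i.e. 1/8 each, and the probabilities sum to 1.
probs = [a * a for a in state]
print(probs)

# "Multiplying two registers": every product of every pair of inputs
# is present in the result at once.
products = sorted({a * b for a in range(8) for b in range(8)})
print(products)  # 0 through 49, every achievable product
```

The catch, of course, is that measuring such a register collapses it to just one of those values at random; the art of a quantum algorithm is arranging for the interesting answer to dominate.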
Okay, no one said that programming a quantum computer would be just like programming a classical computer. Those of you classical programmers who have moved into massively parallel GPU programming understand that it comes with different programming methods as well as different underlying assumptions. So, as you might expect, a quantum computing algorithm is fundamentally different. In a sense, in classical programming we compute results and may refine them in a "hill climbing" approach, whereas in quantum computing we compute all the possible answers at once and use some characteristic of the result set to "identify" the correct answer(s). Okay, this still seems very "hand wavy".
I think Shor's algorithm is a great place to start understanding the mindset of quantum-based problem-solving. The algorithm is designed to factor numbers.
And in case you didn't already know, the RSA encryption standard used for all of our protected information (banking, medical records, etc.) is based solely on the difficulty classical computers have finding the prime factors of one big number that was generated by multiplying two prime numbers.
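To see how completely RSA leans on factoring, here is a toy round-trip with textbook-sized primes (the specific values are illustrative; real keys use primes hundreds of digits long, which is exactly what makes the attacker's trial division below hopeless):

```python
# Toy RSA with tiny primes.
p, q = 61, 53
N = p * q                 # 3233 -- the public modulus
phi = (p - 1) * (q - 1)   # 3120
e = 17                    # public exponent, coprime to phi
d = pow(e, -1, phi)       # private exponent (modular inverse, Python 3.8+)

msg = 65
cipher = pow(msg, e, N)          # encrypt with the public key (e, N)
assert pow(cipher, d, N) == msg  # decrypt with the private key d

# An attacker who can factor N recovers p, q, phi, and therefore d:
p_found = next(k for k in range(2, N) if N % k == 0)
print(p_found, N // p_found)  # 53 61 -- the whole secret, by factoring
```

Shor's algorithm threatens exactly that last step: it factors N in polynomial time on a quantum computer, where the best known classical attacks are super-polynomial.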
Shor's algorithm is insidiously simple:
N = the number we want to factor
A = a quantum register
B = another quantum register
pick a random number X where 1 < X < N
raise X to the power of A
do a MOD N operation on the result
put the result in register B
I know what you're thinking: I raise some number to the power of gobbledygook, then divide it by the number I wanted to factor, and end up with a lot of gobbledygook stored in register B!? But wait: it turns out that if you read out all of the numbers in register B, there is a pattern. It isn't just gobbledygook. In fact, you can test this with a classical program (it'll just take longer) by looping through all of the possible values of A, performing the exponentiation and MOD operations, and putting each result in an array B. When you look at the array B you will notice a cyclical repeat of results. I don't want to get into the math weeds here, but the length of the "cycle" of repeats is associated with a factor of N, and we can compute that factor directly.
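Here is that classical replay for a tiny example, N = 15 with X = 7 (values chosen for illustration). The quantum computer would fill register B for all values of A at once; we just loop:

```python
from math import gcd

N = 15
X = 7   # a random pick with 1 < X < N, coprime to N

# B[A] = X^A mod N for each A -- the contents of "register B".
B = [pow(X, A, N) for A in range(16)]
print(B)  # [1, 7, 4, 13, 1, 7, 4, 13, ...] -- a repeating cycle

# The cycle length r (the "period") is what the quantum computer
# extracts: the first A > 0 where X^A mod N comes back to 1.
r = B[1:].index(1) + 1   # r = 4 here

# A factor of N falls out of gcd(X^(r/2) +/- 1, N):
print(gcd(pow(X, r // 2) - 1, N), gcd(pow(X, r // 2) + 1, N))  # 3 5
```

And indeed 3 times 5 is 15. (In the real algorithm there are caveats: r must be even and the gcd step can fail for an unlucky X, in which case you pick a new X and try again.)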
To recap, we did three functions: exponentiation, MOD, assignment. Then we looked for the repeat cycle frequency. Plugged it into a simple equation. Got a factor. Done.
There is an excellent walk-through of Shor's algorithm on the web. It's about two pages long and has diagrams that are very easy to understand. I highly recommend it.
Diagram from "A Brief History of Quantum Computing" By Simon Bone and Matias Castro
Of course you can find many explanations of Shor's algorithm on the web but be warned they are usually very dense with mathematics and a bit pretentious with the narrative.
Quantum computing uses slightly different techniques and operates under different assumptions. These differences seem a little strange at first, but on closer examination the methods themselves aren't that mysterious. In fact, I think the high-level concepts are actually simpler than what we use in classical computing. Those of you "seasoned" enough to have made the transition from procedural programming to object-oriented programming may remember how it felt when you suddenly "got it".
Of course, the hardware technology has a long way to go. Quantum gate computers only exist in the lab right now and the largest quantum entangled registers are on the order of 10 bits. But, those 10 bits do work and the rest is just engineering, right?