19

We are at the advent of quantum computing: quantum programming languages, written in anticipation of hardware quantum computers, are now available at both high and low levels for simulated quantum computers. Quantum computing brings new elementary operations such as entanglement and teleportation of qubits, measurement of qubits, and imposition of superposition on qubits.

What kinds of statistical problems are likely to benefit from quantum computation?

For example, will quantum computers provide more ubiquitous true random number generation? What about computationally cheap pseudorandom number generation? Will quantum computing help accelerate MCMC convergence, or ensure upper bounds on convergence time? Will there be quantum algorithms for other sampling-based estimators?

This is a broad question, and acceptable answers will also be broad, but kudos if they differentiate quantum and classical computation. (If this is too broad a question, please help me make it a better one.)

Bayequentist
  • 330
  • 1
  • 2
  • 15
Alexis
  • 26,219
  • 5
  • 78
  • 131
  • 7
    +1 I think it's a good and interesting question. Since it invites many (and potentially speculative) answers it's on the borderline of what kind of question works here. It shares that borderline with some of our most popular and enduring threads and, like those, deserves CW status. – whuber Dec 31 '17 at 16:06
  • 7
    Since machine learning is sort of a subdiscipline of statistics, you might find [Quantum algorithms for supervised and unsupervised machine learning](https://arxiv.org/pdf/1307.0411.pdf) interesting. – Jakub Bartczuk Dec 31 '17 at 19:04
  • 2
    Faster computing is always valuable, but quantum computing is currently in an infant stage and it has yet to beat classical computing. I appreciate this question because it got me to go learn something about it. So far I find it difficult to understand. – Michael R. Chernick Jan 01 '18 at 01:50
  • 1
    Does it matter that quantum computing is still in its infancy? It works and it does beat the classical computing when it was a baby. Also not so unimportant, the speed-up can be *exponential* for such problems as solving matrix equations or finding the inverse of functions and black boxes. Now we only need to get it to grow up. The algorithms that could run on such future computers have already been around for decades. It is only straightforward (although *very* broad, just think of the matrix equations) to come up with applications for statistics. – Sextus Empiricus Jan 01 '18 at 21:43
  • 1
    I think the first and most important point is that quantum computing can theoretically speed up arithmetic by a significant degree. Is that correct? If so, then all the linear algebra routines already see a benefit. – AdamO Jan 31 '18 at 17:52
  • @AdamO What do you mean precisely by "speed up the arithmetic?" Which arithmetic? Computers are pretty damn fast at adding integers, and even pretty damn fast at adding floating point numbers. Source? (Not trying to pick on you, but the point of my question is to move us past the 'Gee, whiz!' regarding quantum computing and start thinking concretely in terms of what we can actually do with it.) – Alexis Jan 31 '18 at 18:13
  • @Alexis By "speed up" I mean fewer flops. By arithmetic, I mean `+`, `-`, `*`, `/`, `|`, `&`, and so on. I agree they're fast already, but with billions of replications, the difference between a nanosecond and a picosecond become hours vs. weeks. I'm reminded of Grace Hopper's lecture on "cut me off a nanostring". Is your requested source more clarification on these definitions? Or is it whether quantum computing can actually speed up arithmetic? – AdamO Jan 31 '18 at 18:18
  • @AdamO I don't think that the promise of quantum computing is that it does *computing* (i.e. those operators you indicated) faster, I think that the promise of quantum computing is that it has elementary operations that are not part of classical computing (e.g., teleportation, entanglement, superposition, super-compact data encoding, etc.), and that these new operations give rise to new quantum algorithms that can indeed work faster than classical computing. My question asks about specific differentiation of quantum from classical computing approaches in statistics. – Alexis Jan 31 '18 at 18:24
  • 1
    Entanglement and superposition together provide a sort of 'speed up' (it would be more like 'scale up'). Together they make it possible to perform brute-force search computations *simultaneously* on $2^n$ states with $n$ quantum bits (classically you would brute-force the $2^n$ different states by *repeating* the computation $2^n$ times). So the computational power has the potential to grow exponentially with the number of bits. – Sextus Empiricus Mar 29 '18 at 07:42
  • 1
    Because you cannot perform any possible computation (only ones that act on the $2^n$ states at the same time), and because in the end you can only read out (collapse) a single one of the $2^n$ states (the rest is lost), the possible useful algorithms are limited (only those where the most likely state to be read out is the right answer). Although I would not say it is as limited as @eSurfsnake says. I would say that the current limitations are more technical/physical: getting the chips larger while keeping the entanglement, the superposition, and the operations on them reliable. – Sextus Empiricus Mar 29 '18 at 07:43
  • 1
    I'm not a big fan of Medium, but I'm just going to leave this here: https://medium.com/quantum-bits/top-3-quantum-myths-and-misconceptions-2ae797550746 – The Laconic Jul 07 '18 at 22:23

4 Answers

5

About quantum computing

Quantum computing makes use of superposition and entanglement.

  • Superposition means that the state of a system is a combination of multiple states superposed on each other. Intuitively, you can think of quantum computing as based on wave mechanics: the state of the system is a wave, and multiple waves can lie on top of each other.

    The superposition allows manipulations of multiple (superposed) waves at once in a single operation.

  • Entanglement means that the states of multiple components in the system are linked by a single wave equation/function.

    For instance, when two electrons are created in a single event, their spins may be linked (one must be up, the other down) due to conservation rules (similar to conservation of energy or momentum).

    The big philosophical deal with entanglement is in wave function collapse. It occurs when a measurement of the state of the wave function is made (which breaks down the superposed state into a single state).

    • The philosophical "problem" is that the event of wave function collapse instantaneously collapses the state of all entangled components, even over a large distance.
    • In addition there is a problem about the interpretation of this wave function collapse (and this is more related to the superposition). Wave function collapse does not follow from other basic principles or mechanisms from which we could get an intuitive grasp; it is more like an axiom or a base principle. For quantum computing it does not matter: it just works. But the human mind feels there should be something more behind this collapse. It feels a bit flimsy. Wave function collapse is our present-day 'atom' that we are trying to split into parts.

    The entanglement allows the exponential increase in the number of states that are modeled by the superposed wave functions.

The power of quantum computing

The power of quantum computing lies in the combination of superposition and entanglement. The $n$ entangled quantum bits create $2^n$ different states for the system that are superposed on each other. The superposition allows us to manipulate (make computations with) all of those $2^n$ states simultaneously. The contrast with non-quantum computing is that you would have to repeat the computations $2^n$ times for each state separately.
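
To make the bookkeeping concrete, here is a minimal classical simulation (plain NumPy, not quantum hardware): the state of $n$ qubits is a vector of $2^n$ complex amplitudes, and a single matrix-vector product, here a Hadamard on every qubit, updates all $2^n$ amplitudes at once.

```python
import numpy as np

n = 4                                  # number of qubits
N = 2 ** n                             # 2^n basis states

# Start in |00...0>: one amplitude equal to 1, the rest 0.
state = np.zeros(N, dtype=complex)
state[0] = 1.0

# A Hadamard gate on a single qubit.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Applying H to every qubit is the tensor (Kronecker) product of n Hadamards.
H_n = H
for _ in range(n - 1):
    H_n = np.kron(H_n, H)

# One matrix-vector product updates all 2^n amplitudes "at once".
state = H_n @ state
print(np.allclose(np.abs(state) ** 2, 1 / N))   # True: uniform superposition
```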

No free lunch

Note that this is not a free lunch. A quantum computer is not deterministic but instead will give a different answer each time according to some distribution.

We can manipulate multiple superposed waves made out of the entangled bits at the same time. But we cannot know the state/amplitude of all those superposed waves. Once we read out the state of the system we only get to read one single state out of all superposed states in the system. This is the wave function collapse when the multiple superposed waves turn into the wave of a single state.

The behavior of this collapse is probabilistic. The probability of observing (collapsing onto) a particular state depends on the wave function. Quantum algorithms are designed to start with a wave that encodes the $2^n$ states with roughly equal probability; the manipulations/computations then make the states close to the solution the ones most likely to be observed.
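
A rough illustration of this idea is Grover-style amplitude amplification. The sketch below is a plain classical NumPy simulation, not code for a quantum device; the index `marked` is an arbitrary stand-in for "the solution", and the amplitudes of the $2^n$-state register are stored explicitly.

```python
import numpy as np

n = 6                      # qubits
N = 2 ** n                 # 2^n candidate states
marked = 42                # hypothetical "solution" index we want to read out

# Start with every state equally likely (uniform superposition).
state = np.full(N, 1 / np.sqrt(N))

# Grover iteration: flip the sign of the marked state (oracle),
# then reflect all amplitudes about their mean (diffusion).
for _ in range(int(np.floor(np.pi / 4 * np.sqrt(N)))):
    state[marked] *= -1
    state = 2 * state.mean() - state

# Measurement probabilities: the marked state now dominates.
probs = state ** 2
print(marked == np.argmax(probs), probs[marked])   # True, ~0.997
```

After roughly $\frac{\pi}{4}\sqrt{2^n}$ iterations the marked state carries almost all of the probability, so a measurement (collapse) is very likely to return the right answer.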

What sort of benefits?

Basic computations

There are several basic algebraic manipulations that quantum computing can speed up. All statistical methods that rely on such manipulations will benefit from quantum computing.

Basically, quantum computing reduces the number of operations by performing computations simultaneously. For instance, to compute a discrete Fourier transform with $2^n$ components, a quantum algorithm can use $n$ qubits to encode those $2^n$ components, and manipulations on the $n$ qubits then effectively act on all $2^n$ components at once.
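
As a sketch of the Fourier example (classical NumPy, not a circuit decomposition): the quantum Fourier transform on $n$ qubits is the unitary DFT matrix acting on the $2^n$ amplitudes; the quantum circuit realizes it with roughly $O(n^2)$ gates instead of handling the $2^n$ components one by one.

```python
import numpy as np

n = 3
N = 2 ** n   # the QFT on n qubits acts on 2^n amplitudes

# The quantum Fourier transform is just the unitary DFT matrix ...
j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
QFT = np.exp(2j * np.pi * j * k / N) / np.sqrt(N)

# ... applied to the 2^n amplitudes stored in only n qubits.
amps = np.random.default_rng(0).normal(size=N) + 0j
amps /= np.linalg.norm(amps)

# Same convention as NumPy's orthonormal inverse FFT.
print(np.allclose(QFT @ amps, np.fft.ifft(amps, norm="ortho")))  # True
```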

Which problems?

The problems that will benefit most from this simultaneous computation are large-scale problems that are currently limited by computational power (and that can be parallelized): for instance, machine learning problems, Monte Carlo simulations, or problems with large dimensionality. The first two of these are themselves nondeterministic (which blunts the weak point of a quantum computer): like the quantum computer, they produce approximations from estimates that depend on chance.
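
A tiny example of that "estimation based on chance" point (ordinary classical Monte Carlo, only to illustrate the run-to-run variability):

```python
import numpy as np

# Estimate pi by sampling random points in the unit square and counting how
# many fall inside the quarter circle. Reruns give slightly different answers,
# just as a quantum computer's read-outs vary from run to run.
rng = np.random.default_rng()
x, y = rng.random(100_000), rng.random(100_000)
print(4 * np.mean(x**2 + y**2 <= 1))   # ~3.14, varies slightly per run
```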

Maybe the better question is which kinds of (computation-heavy) statistical problems are *not* going to benefit from quantum computing. There probably won't be many.

Sextus Empiricus
  • 43,080
  • 1
  • 72
  • 161
3

Brute force methods are most likely to benefit, because of what quantum computing is. Why? One possible physical explanation of the path of a pitched baseball is that all possible quantum paths are explored automatically and the path of least action, i.e., the path of least resistance available, is chosen, all without having to build a calculator; the calculations are ineffable. Generalizing, nature can be viewed as a quantum calculator. Thus the problems that are similar, the ones that do optimization, such as regression by minimizing some criterion, be that goodness of fit or something else (goodness of fit is, in some cases, ill posed), are the ones that will benefit.

BTW, the intermediate steps (the iterations) of the optimization would not be calculated, only the final result, just as when a baseball is pitched: only the actual path of the baseball occurs, and the alternative paths are automatically excluded. One difference between a statistical implementation and a physical event, however, is that the error of the statistical calculation can be made as small as desired by arbitrarily increasing the precision (e.g., to 65 decimal places), which is not typically achievable physically. For example, even a pitching machine will not throw a baseball along an exactly duplicated path.

Carl
  • 11,532
  • 7
  • 45
  • 102
  • +1 Thank you. Would you say that Monte Carlo methods, bootstrapping methods and other quantitative approaches to solutions fit the label "brute force?" – Alexis Feb 15 '18 at 18:34
  • 1
    Potentially, they may, but not in the same fashion as linear programming. For example, Metropolis and Ulam's method (Monte Carlo simulation) was originally applied by Ulam to calculate the critical mass of the atomic bomb. With true quantum computing, a simulated bomb will either undergo a simulated explosion, or not, at about the same speed as the real explosion. BTW, I met Ulam in 1964, I was a young guy back then. – Carl Feb 15 '18 at 19:44
  • 1
    Thank you, that point about "simulated explosion" really helps, and I think is building my intuition about this topic. Also: :D ***Wow!*** – Alexis Feb 15 '18 at 19:54
1

I liked the answer above on baseball. But I would be cautious about what quantum computing might do well.

It seems like it might do very well at things like cracking cryptographic schemes: being able to superimpose all candidate solutions and then collapse onto the actual one could be very fast.

But in the 1980s - which was a very long time ago - there was a very high-profile company named Thinking Machines. See this article: https://en.wikipedia.org/wiki/Thinking_Machines_Corporation

The whole idea had a whiff of quantum computing. It utilized an n-dimensional hypercube arrangement. Imagine, if you will, four (very simple) microprocessors connected in a square. Each could do a computation, then share the result with the processor before it (counterclockwise), after it (clockwise), or opposite it (across). Next imagine 8 processors in a cube, extending that concept to three dimensions: each processor can now share its output with one or more of 7 others (3 along the edges of the cube, 3 across the faces of the squares the processor is part of, and 1 along the diagonal in 3-space).

Now take this up to, say, 64 processors in a 6-dimensional hypercube.
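
The id arithmetic behind that wiring is simple. A minimal sketch (the helper `hypercube_neighbors` is hypothetical, not Connection Machine code): in a d-dimensional hypercube of 2^d processors, processor i is wired to the d processors whose ids differ from i in exactly one bit.

```python
def hypercube_neighbors(i, d):
    # Flip each of the d bits of processor id i to get its d neighbors.
    return [i ^ (1 << bit) for bit in range(d)]

print(hypercube_neighbors(0b000000, 6))   # the 6 neighbors of processor 0
print(hypercube_neighbors(42, 6))         # [43, 40, 46, 34, 58, 10]
```

That single-bit-difference wiring is what lets a message reach any other processor in at most d hops.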

This was one of the hottest ideas of the time (along with the dedicated 36-bit Lisp machine that Symbolics put out, and the slightly bizarre cache-only memory system put out by Kendall Square Research; both have Wikipedia pages worth reading).

The problem was that there was precisely one, and only one, algorithm that actually worked well on the TM architecture: a Fast Fourier Transform using what was called the "perfect shuffle" algorithm. It was a genius insight into how to use a binary mask technique, the bespoke algorithm, and the architecture to process an FFT in parallel in a brilliantly clever and fast way (a sketch of the shuffle itself follows below). But I don't think they ever found another single use for it. (See this related question: https://cs.stackexchange.com/questions/10572/perfect-shuffle-in-parallel-processing)
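
For reference, here is the permutation under its usual definition (interleave the two halves of the data); in binary it is a left rotation of the index bits, which is, loosely, the kind of data movement an FFT butterfly network needs between stages. The function name `perfect_shuffle_index` is mine, purely for illustration.

```python
def perfect_shuffle_index(i, n):
    # Left-rotate the n-bit index: the most significant bit becomes the LSB.
    top = (i >> (n - 1)) & 1
    return ((i << 1) & ((1 << n) - 1)) | top

data = list(range(8))                  # [0..7]
shuffled = [None] * 8
for i, x in enumerate(data):
    shuffled[perfect_shuffle_index(i, 3)] = x
print(shuffled)                        # [0, 4, 1, 5, 2, 6, 3, 7]
```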

I have been around long enough to realize that technologies that seem brilliant and powerful often end up not solving a problem (or enough problems) to make them useful.

There were lots of brilliant ideas at the time: TM, Symbolics, KSR, as well as Tandem (gone) and Stratus (amazingly, still alive). Everyone thought these companies - at least some of them - would take over the world and revolutionize computing.

But, instead, we got Facebook.

eSurfsnake
  • 1,004
  • 5
  • 13
  • You are right to call out the hype, and I like your historical perspective, eSurfsnake. I grew up in Santa Clara County as it became Silicon Valley... I have long had a deep appreciation of universal computation. One of the reasons statistics moves me is that probability—true randomness—is outside the domain of computation. We can simulate it... very well for many purposes, but there are aspects of nature, seemingly, that *aren't* computation. Quantum computing seems to offer elementary operations that are also not Turing computation... so I want to understand what such tools might signify. – Alexis Mar 29 '18 at 06:02
  • @Alexis Actually, quantum computers don't have any super-Turing abilities. Any problem that can be computed using a quantum computer can also be computed using a classical computer, which follows from the fact that classical computers can simulate quantum computers. But, there are a few known problems that can provably be solved more efficiently using quantum computers. – user20160 Jul 07 '18 at 21:47
  • @user20160 True randomness is a super-Turing ability. Superposition is a super-Turing ability. *Simulation* is not the thing itself. – Alexis Feb 13 '19 at 17:07
  • @Alexis Not sure if we're talking about the same thing, but what I mean by super-Turing is the ability to compute a function that a Turing machine cannot. Interestingly, true randomness doesn't give the ability to compute any function that could not be computed deterministically. I completely agree that simulation is not the thing itself, but it's at the very heart of computational equivalence (where we abstract away the thing itself). If machine A can simulate machine B, then A can compute any function that B can. More in Nielsen & Chuang, *Quantum Computation and Quantum Information*. – user20160 Feb 13 '19 at 19:05
0

What kinds of statistical problems are likely to benefit from quantum computing?

On page 645 of "Physical Chemistry: Concepts and Theory" Kenneth S. Schmitz explains:

Quantum effects become important when the de Broglie wavelength becomes comparable to, or is greater than, the dimensions of the particle. When this occurs the wave functions can overlap, giving different properties of the system.
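
A quick back-of-envelope check of that criterion, using $\lambda = h/(mv)$ (the numbers below are rough, chosen only for illustration):

```python
h = 6.626e-34                     # Planck's constant, J*s

# Electron in a hydrogen atom: m ~ 9.11e-31 kg, v ~ 2.2e6 m/s
print(h / (9.11e-31 * 2.2e6))     # ~3.3e-10 m, about the size of an atom

# Pitched baseball: m ~ 0.145 kg, v ~ 40 m/s
print(h / (0.145 * 40))           # ~1.1e-34 m, utterly negligible
```

The electron's wavelength is comparable to atomic dimensions, so quantum effects dominate; the baseball's is some 30 orders of magnitude smaller than the ball itself, which is why the classical description works.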

Macroscopic systems can be analyzed by classical methods, as the Wikipedia page on classical mechanics explains:

More refined consideration distinguishes classical and quantum mechanics on the basis that classical mechanics fails to recognize that matter and energy cannot be divided into infinitesimally small parcels, so that ultimately fine division reveals irreducibly granular features. The criterion of fineness is whether or not the interactions are described in terms of Planck's constant. Roughly speaking, classical mechanics considers particles in mathematically idealized terms even as fine as geometrical points with no magnitude, still having their finite masses. Classical mechanics also considers mathematically idealized extended materials as geometrically continuously substantial. Such idealizations are useful for most everyday calculations, but may fail entirely for molecules, atoms, photons, and other elementary particles. In many ways, classical mechanics can be considered a mainly macroscopic theory. On the much smaller scale of atoms and molecules, classical mechanics may fail, and the interactions of particles are then described by quantum mechanics.


For example, will quantum computers provide more ubiquitous true random number generation?

No. You don't need a computer to generate a true random number, and using a quantum computer to do so would be a huge waste of resources with no improvement in randomness.

ID Quantique has SoCs, stand-alone devices, and PCIe cards for sale, ranging from U\$1200 to U\$3500. It's a little more than photons travelling through a semi-transparent mirror, but it has sufficient quantum random properties to pass AIS 31 ("Functionality classes and evaluation methodology for true (physical) random number generator - Version 3.1 Sept 29 2001", .PDF). This is how they describe their method:

Quantis is a physical random number generator exploiting an elementary quantum optics process. Photons – light particles – are sent one by one onto a semi-transparent mirror and detected. These exclusive events (reflection – transmission) are associated to “0” – “1” bit values. This enables us to guarantee a truly unbiased and unpredictable system.
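
Purely to illustrate the described process (this is a classical pseudo-random stand-in, not what the device does internally), one can mimic the reflection/transmission coin flips and run crude sanity checks on the raw bit stream; real certification, e.g. AIS 31, involves a much larger battery of statistical tests:

```python
import numpy as np

# Stand-in for the described process: each photon is reflected (0) or
# transmitted (1) with probability 1/2. A real Quantis device gets these bits
# from photon detectors, not from a pseudo-random generator as here.
rng = np.random.default_rng()
bits = rng.integers(0, 2, size=1_000_000)

print(bits.mean())                          # close to 0.5 (bias check)
print(np.mean(bits[:-1] != bits[1:]))       # close to 0.5 (transition rate)
```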

A faster (1 Gbit/s) system is offered by QuintessenceLabs. Their quantum random number generator, qStream, is compliant with NIST SP 800-90A and meets the requirements of the draft NIST SP 800-90B and 800-90C. It uses Esaki tunnel diodes. Their products are new and pricing is not yet publicly available.

Also available are systems from Comscire for several hundred to a couple of thousand dollars. Their PCQNG and post-quantum RNG methods and patents are explained on their website.

Quantum Numbers Corp. has developed a chip-sized device to quickly (1 Gbit/s) produce quantum random numbers, which they claim will be available soon.

What about computationally cheap pseudorandom number generation?

If you mean "computationally cheap" as in few instructions and rapid execution = yes.

If you mean that any computer is an inexpensive means to generate true random numbers: no.

A properly implemented QRNG won't produce pseudorandom numbers.

Will quantum computing help accelerate Markov Chain Monte Carlo (MCMC) convergence, or ensure upper bounds on convergence time?

I'll let someone else take a crack at that for now.

Will there be quantum algorithms for other sampling-based estimators?

Probably.

Please edit and improve this Wiki answer.

Alexis
  • 26,219
  • 5
  • 78
  • 131
Rob
  • 2,050
  • 1
  • 6
  • 23
  • I am not sure I agree about the "true waste of resources" for reliable true RNG. For one thing, pseudo-RNG takes *time*, which adds up quickly in large-scale simulation work. For another, RNG takes *memory*, likewise for large-scale simulation work. Having a rapid, guaranteed source of true randomness from a known distribution does not seem so wasteful. Moreover, *other* solutions for true RNG do not preclude quantum computers from also providing such a solution. – Alexis Feb 13 '19 at 17:11