First posted 26 July 2004. For a long time, I was vexed by the prevailing non-realist interpretation of quantum mechanics. I refused to accept that reality's evolution from moment to moment could have an unavoidable element of chance in it, causal effects that 'travelled' faster than light, or both. You can pick up the strength of my reaction against it from my tone. I even set my own humility sufficiently to the side to accuse one of the greatest scientists of all time of a humility shortfall!
I'm more comfortable with these possibilities now, and I've actually embraced an even more 'extreme' interpretation, which I'll share in a separate post.
Call me a realist. I just can't help believing that there is a world out there whose existence does not rely on whether I or anyone else perceives it, measures it, imagines it or defines it. I believe this not just about the myriad objects and beings I encounter every day in my comings and goings but also about the microscopic, quantum world of subatomic particles. This puts me in a small minority among people who bother to think about such things, squarely against the orthodox interpretation of quantum mechanics (QM).
That orthodox view, espoused most influentially by Bohr and Heisenberg and dubbed the Copenhagen interpretation, holds that sub-atomic particles cannot be said to have definite properties such as position or momentum until such time as those attributes are measured. In fact, in its stronger statements, this interpretation says we cannot really speak of a quantum world at all. We can only speak of the observed measurements we obtain when we interact with that world in certain ways. We should, on this account, simply celebrate the predictive success of QM's formal mathematical model, the most successful predictive model in the history of science.
The slightly discomfiting corollary - that the world at its most basic levels is plagued by inescapable uncertainty - is simply something we need to accept, by throwing away our long-standing classical misconceptions about reality. Moreover, we need to accept that the theory as it stands is fundamentally impossible to improve upon. It constitutes the final word on the subject.
But there is an alternative interpretation 'on the table', one first presented in 1952 by David Bohm. This interpretation, an example of a 'hidden variables' (causal-realist) theory, is wholly consistent both with the formal mathematics of Schrodinger's equation and with experimental observation. It holds, though, that the same causal relationships we observe at the macro level continue to hold at the quantum level. Given that this just sits much better with me, I plan to read much more about it and report back.
In the meantime, let me address a few problems with the orthodox, or Copenhagen, view. The overall point of this post is to say not that we should reject QM but rather that we should use it for the predictive model it is while continuing to search for ways to make it more complete. We certainly shouldn't elevate the orthodox interpretation of it to the status of unassailable doctrine. Here are some reasons why:
First, it falls short of what we have traditionally held to be the criteria for an adequate (let alone a complete) scientific theory. Although it has impressive predictive power, it does not subsume the classical theory that it replaces. That is to say, it does not show how, under certain 'limit' conditions, it reduces to the previous best model.
This is in stark contrast to the advances that Einstein's theories of special and general relativity made on Newton's mechanics and theory of gravity. Einstein showed how, for objects that are not too massive and for velocities that are not too great, his formulae simplify to those discovered and used by Newton.
QM throws out the baby with the bathwater. It discards the classical order completely, erecting a wall around the quantum world and declaring it once and for all a special kingdom where old languages shall not be spoken.
A second, closely related point is that QM doesn't say where that 'wall' is. As Schrodinger showed with his reductio ad absurdum example of 'Schrodinger's cat', QM cannot delineate where the classical (macro) world stops and where the quantum (micro) world begins. Schrodinger's thought experiment uses a mechanism that connects the life-or-death fate of a feline to whether or not a particle is emitted by a nearby radioactive substance.
Importantly, this set-up is contained within a closed, opaque box. Under the orthodox interpretation of QM, unless and until an experiment is carried out by a knowledgeable observer to measure the emission, the particle (or potential particle) exists in a superposed state between being emitted and not being emitted from the radioactive source. Because (in Schrodinger's thought experiment) any particle emission would trigger the release of a deadly gas within, the cat must therefore also be in a suspended state, neither alive nor dead, but rather a superposition of the two! Not until the box is opened and the observer inspects it does the poor cat's fate become real.
It is difficult enough to accept that quantum particles take on concrete properties only when an observer 'collapses the wave function' - the mathematical, probabilistic function that describes a system's dynamics. That such indeterminacy should encroach so glaringly into the macro-world is, to me, evidence of a shortfall in the theory (or our interpretation of it).
Third, Bohr and others slip without due care between statements about what we can KNOW and statements about what actually IS THE CASE. If we take QM to be saying that, given our current best understanding, we can do no better than deal with the quantum world in a probabilistic manner because we do not understand the finer detail, then that is fine. But Bohr and his followers go on to make two much stronger statements.
Clearly suffering from an appalling lack of humility, they suggest that our current best understanding cannot be improved upon. Now this would be a great break indeed from the entire history of science, which is filled with (and one might say defined by) a succession of advances whereby one 'best working model' is superseded by another. Why should it be that science (or this particular branch of it) has suddenly reached its ultimate answer?
Bell's inequality and the Aspect experiments suggest that we probably have to sacrifice the principle of locality (which creates problems with Einstein's theory of special relativity). But they do NOT show that no hidden variables theory is consistent with observed evidence and theoretical formalism. Indeed, Bohm's theory is already before us as one example, and Tim Palmer has even shown how a chaotic (but wholly deterministic) system with intertwined, riddled basins could satisfy Bell's inequality while fitting perfectly well with QM's formal mathematics.
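To make the Bell constraint concrete, here is a minimal sketch of the CHSH form of the inequality. The detector angles and the singlet-state correlation formula are the standard textbook values (my assumptions for illustration, not anything drawn from Bohm's or Palmer's own constructions): any local hidden-variable theory keeps the combination S within ±2, while quantum mechanics predicts 2√2.

```python
import math

# Standard CHSH set-up for a spin-singlet pair: two detector angles per side
# (in radians), chosen to maximise the quantum violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

def E(x, y):
    """Quantum-mechanical correlation for singlet spins measured at angles x and y."""
    return -math.cos(x - y)

# CHSH combination: any local hidden-variable theory keeps |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

print(abs(S))  # 2.828..., i.e. 2*sqrt(2), comfortably above the classical bound of 2
```

The Aspect experiments measured correlations matching the quantum prediction, which is why locality (rather than realism as such) is the principle under pressure.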
And at their most bold (and bizarre), they say that our ultimate knowledge limit - as defined by our human powers of perception and computation - actually imposes limits on existence. In philosophical terms, they jump from epistemological claims to ontological ones. If, they say, we cannot verify that something exists, then it does not exist. To which I challenge: if existence is not bound by the perceptual and reasoning powers of spiders, rabbits or dogs, then why should it be constrained by our abilities?
Isn't it much more reasonable to believe that a mind-independent, verification-transcendent world exists, and that the scientific advances we have seen to date constitute an ever-better understanding of that world? This seems preferable (and certainly more humble) than asserting that each scientific advance is not a discovery but an act of creation!
Looking at all we are capable of verifying and then inferring to the best explanation leads me not to reject realism but to embrace it confidently. Watch this space for a bit more on both Bohm and Bell (who was himself a causal realist).
First posted 21 Oct 2003. Not much to add or change now. My 'favourite' interpretation is that of Julian Barbour, which I'll share as a separate post.
Is light a wave or a particle? If your answer is, 'Who cares?' you may have a point, but you probably want to read a different post or visit a different blog. It seems that our answer is that we cannot be sure, but that light behaves in some ways as if it were the first and in others as if it were the second, and this duality is just one example of the counter-intuitive, but unprecedentedly accurate, conclusions quantum mechanics helps us reach.
The title of this post recalls famous physicist Richard Feynman, who said that anyone who thinks he understands quantum mechanics doesn't. Feynman admitted that he didn't really understand it.
The famous (at least among physicists) double slit experiment at its most basic shows that light behaves as a wave. Point a lamp towards a photographic plate, but on its way, have it pass through a single vertical slit in a screen. Consider the lamp plus the slitted screen to be the 'light source' - in modern times we can replace this set with a laser. Now, erect another screen between the light source and the photographic plate, and cut two parallel vertical slits in it.
Some of the light shone at the screen passes through the left slit and some through the right. Like the ripples created by two pebbles dropped a few feet apart in a pond, the light coming through one slit interferes with the light coming through the other. As the light reaches the plate, at some points two 'peaks' of the ripples overlap and reinforce each other. At other points, two 'troughs' reinforce each other. In between, varying degrees of reinforcement or cancellation occur. The visual result is a series of vertical white and black 'stripes' with varying grey bits in between. This is called an interference pattern, and it serves as great evidence that light does in fact behave as a wave. Seems simple enough. This experiment discredited Newton's 'corpuscular' theory of light in a single sweep.
Photo-electric effect - Einstein lays a foundation for the quantum revolution
But in the early 20th century, Einstein showed that when light was shone on a certain metallic surface, it knocked electrons from the surface like bullets chipping a stone wall. The brighter the light, the more electrons are knocked free, but the brightness has no effect on the speed with which the electrons travel when knocked loose. Rather, the higher the frequency of the light (moving up from red to violet), the greater the velocity (and therefore energy) of the electrons knocked loose. This all showed light to behave as a particle. Hmm, getting confusing...
Moreover, when performed across a range of light frequencies, it also verified Planck's quantum hypothesis - showing that the possible sizes of the energy 'transfers' to the dislodged electrons were not continuous, but rather discrete (quantum) levels. Photons (as light particles came to be known) below a minimum frequency (and hence energy level) for a given metal will not knock any electrons free, no matter how high the intensity of the light shone.
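The arithmetic behind the threshold is just the Planck-Einstein relation: each photon carries energy hf, and an electron escapes with at most hf minus the metal's work function. A minimal sketch, assuming a work function of 2.3 eV (roughly sodium's, used purely as an illustration):

```python
# Planck-Einstein relation: photon energy E = h*f; ejected electron carries
# at most h*f - phi, where phi is the metal's work function.
H_EV = 4.136e-15       # Planck's constant in eV·s
WORK_FUNCTION = 2.3    # work function in eV (illustrative, roughly sodium)

def max_kinetic_energy(freq_hz):
    """Max kinetic energy (eV) of an ejected electron, or None below threshold."""
    surplus = H_EV * freq_hz - WORK_FUNCTION
    return surplus if surplus > 0 else None

print(max_kinetic_energy(7.5e14))  # violet light: electrons ejected (~0.8 eV)
print(max_kinetic_energy(4.3e14))  # red light: None, however bright the lamp
```

Note that intensity never enters the function: a brighter lamp means more photons, hence more ejected electrons, but each electron's energy depends on frequency alone, exactly as described above.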
Photons and the double-slit experiment
Eventually, scientists were able to ‘shine’ light in smaller and smaller amounts, culminating in the ability to fire individual photons (they can do the same with electrons and get the same results). Here is where things take a very strange turn.
Turning back to the double-slit experiment, when photons are fired one at a time at the photographic plate, via the slits in the screen, the result is not a distribution like one would see when bullets are fired through gaps in one wall at a target on another wall, with light 'dots' concentrated in the areas aligned with the bullets’ possible trajectories. Instead, you get just the same interference pattern as in the original double slit experiment, only now built up slowly as more and more photons are fired one after another. Whoaaaa. Wait a minute!
So…the individual particles of light behave in a probabilistic, wave-like manner. Each particle seems to know what the others have done and to ‘interact’ with them, even though each is not fired until the previous one has already hit the plate. Viewed another way, each photon has a probability function, with a certain likelihood of landing in different places on the plate. Each photon registers at a single point, but the photons do not 'pick' their places independently of the wave pattern. Across many shots, the strikes distribute themselves on the plate according to that function. Another question is, how do the photons 'know' that both slits are there? Yes. This is headache material.
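The one-photon-at-a-time build-up can be mimicked with a toy Monte Carlo: each simulated 'photon' lands at one random point drawn from the two-slit probability distribution, and the stripes emerge only in the statistics of many shots. All the numbers here are my own illustrative assumptions.

```python
import math
import random

# Illustrative set-up (assumptions, not measured values):
wavelength, slit_sep, screen_dist = 500e-9, 50e-6, 1.0
fringe = wavelength * screen_dist / slit_sep  # spacing between bright stripes

def prob(x):
    """Two-slit landing probability (small-angle cos^2 formula), max 1."""
    return math.cos(math.pi * slit_sep * x / (wavelength * screen_dist)) ** 2

def fire_photon(rng, span=None):
    """Rejection-sample a single photon's landing position on the plate."""
    span = 5 * fringe if span is None else span
    while True:
        x = rng.uniform(-span, span)
        if rng.random() < prob(x):
            return x

rng = random.Random(0)  # fixed seed so the run is repeatable
hits = [fire_photon(rng) for _ in range(20000)]

window = fringe / 10
near_bright = sum(1 for x in hits if abs(x) < window)             # centre stripe
near_dark = sum(1 for x in hits if abs(x - fringe / 2) < window)  # first dark stripe
print(near_bright, near_dark)  # many hits at the bright stripe, almost none at the dark one
```

Each call to fire_photon returns exactly one dot, yet the histogram of 20,000 dots reproduces the interference stripes: one-at-a-time firing, collective wave statistics.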
'Collapse' of the probability function
Now, even stranger, when you set up a contraption to measure and register when a photon passes through either the left or the right slit, the pattern on the plate is not the interference, wave-like one, but rather the bullet-type one that you might have expected based on common sense. What has happened?
It seems that as long as we don’t know which slit a particle passes through, each photon fails to ‘commit’ to one or the other, instead behaving probabilistically, as described above. But by measuring which slit a photon passes through, we force it to commit to one slit or the other and thereby to a single, bullet-like point of impact with the plate.
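The difference measurement makes can be stated without metaphors in textbook wave terms (the geometry below uses my own illustrative numbers): with no which-slit information, the complex amplitudes from the two slits add before squaring; with a which-slit measurement, the two probabilities add instead, and the stripes vanish.

```python
import cmath
import math

# Illustrative geometry (assumptions, not measured values):
wavelength, slit_sep, screen_dist = 500e-9, 50e-6, 1.0

def amplitude(x, slit_offset):
    """Complex amplitude at plate position x contributed by one slit."""
    path = math.sqrt(screen_dist**2 + (x - slit_offset)**2)
    return cmath.exp(2j * math.pi * path / wavelength)

def unwatched(x):
    """No which-slit measurement: amplitudes add first, then square -> stripes."""
    return abs(amplitude(x, slit_sep / 2) + amplitude(x, -slit_sep / 2)) ** 2

def watched(x):
    """Which-slit measured: probabilities add -> no stripes."""
    return abs(amplitude(x, slit_sep / 2)) ** 2 + abs(amplitude(x, -slit_sep / 2)) ** 2

x_dark = wavelength * screen_dist / (2 * slit_sep)  # a dark stripe of the pattern
print(unwatched(x_dark), watched(x_dark))  # ~0 with interference, 2 without it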
But we all know that photons don't think and that they don't 'commit'. We can't rely on such metaphors. How do we really describe what is happening? You may find the answer a bit disappointing. Although the theory of quantum mechanics can model this behaviour mathematically and predict the behaviour of microscopic particles with unparalleled accuracy, scientists cannot agree how to explain this aspect of our world. Many refuse to even take up the challenge, defining it as within the domain of philosophy rather than science.
Various interpretations of this quantum mechanical outcome have been put forward, ranging from ones requiring an infinite number of dimensions in the universe to ones that question the existence of concrete properties independent of their observation by intelligent beings. The truth is that, theoretically, any number of interpretations are consistent with the mathematics and the observations with which they correspond so well.
The problem, for those of us who trust common sense, is that none of them correspond with it. We shouldn't be shocked by this. Our common sense was formed by our experiences in life, none of which deal with phenomena at the microscopic level. Like Einstein's theory of relativity, which only leads to different predictions than Newton's classical physics at velocities approaching the speed of light, quantum theory deals with the world at scales completely outside our direct experience.
No hidden variables
Still, many are uncomfortable with the lack of a satisfactory explanation of the unquestionable maths behind quantum theory. Einstein hoped that the theory of quantum mechanics was simply incomplete and that once we discovered other variables at play, the ‘uncertainty’ (I'll write some other time on Heisenberg's Uncertainty Principle) would disappear. Detailed experimental tests of John Bell's theorem suggest that no such local variables could supplant quantum uncertainty. We may just have to celebrate quantum mechanics' great predictive powers and live, perhaps uneasily, with the uncertainty that it ascribes to the microscopic world.