
Article Information

  • Title: CHANCE: INDIVIDUAL INDETERMINACY OR COLLECTIVE RANDOMNESS?
  • Author: Bunge, Mario
  • Journal: The Review of Metaphysics
  • Print ISSN: 0034-6632
  • Year: 2018
  • Issue: December
  • Publisher: Philosophy Education Society, Inc.

CHANCE: INDIVIDUAL INDETERMINACY OR COLLECTIVE RANDOMNESS?


Bunge, Mario



The first drops of a spring shower splash on a tiled floor in a random fashion, even though they all come from the same rain cloud. In contrast, when a source of light emits a pair of photons traveling in different directions, they remain entangled from the moment they emerge from a random atomic decay. The drops of rain, like the shrapnel of a grenade, may be said to be "classons" or referents of classical mechanics, whereas the photon twins may be called "quantons." Mesophysical entities, such as pollen grains, dust specks, and bacteria, will be ignored in this paper.

While the drops of rain start out as classons but end up distributed at random, the twin photons are quantons throughout their existence, which ceases only when both are absorbed, perhaps by atoms scattered at random. This intertwining of chance with causation has bedeviled for centuries the thinking about both modes of becoming. The genuine issues and the confusions that accompanied the birth of quantum physics did not spare Albert Einstein and Niels Bohr, two of the giants of the last century.

I

The Einstein-Bohr Debate. Over many years, Albert Einstein and Niels Bohr held a friendly but intense debate on the foundations of quantum theory (QT), one of the most important and philosophically unsettling scientific theories in history. Their debate culminated in 1935 with Einstein's paper authored together with Boris Podolsky and Nathan Rosen. (1) This paper is still being vigorously debated as if it had been published yesterday.

It is generally agreed that the paper in question, known as EPR, is so long lived because it discusses a raft of deep and hard metaphysical and epistemological matters, such as whether or not physical objects come with all their properties or acquire them only upon being observed, and whether chance is in nature or only in our knowledge of it. Presumably, EPR would have interested thinkers as different as Berkeley and Laplace, Boltzmann and Planck, as much as Democritus and Epicurus.

Although the center of the debate in question was the very existence of what Kant called "things in themselves," no living philosophers were invited as discussants. And yet some of them could have been helpful, for example, by pointing out that some of the physicists, starting with Einstein (or perhaps Podolsky), confused realism, the principle that the universe preexists the emergence of human observers, with classicism, the attempt to replace QT with a theory containing only scatter-free or nonfluctuating variables, (2) hence not including Heisenberg's indeterminacy inequalities.

Judging from his lifelong fight against subjectivism, (3) Einstein was far more upset by the claim that nature depends on the observer, than by the thesis that QT enshrines randomness. As Wolfgang Pauli explained to Max Born, (4) Einstein's point of departure is realistic rather than deterministic--but people, Born included, would not listen.

The present paper will be confined to the determinacy/randomness dilemma. As is well known, Bohr and Einstein held opposite views on this basic philosophico-scientific problem. Indeed, when Einstein held that "God [sive natura] does not play dice," Bohr retorted: "Einstein, don't tell God what to do." Still, we heard through the grapevine that Bohr kept thinking about this matter until the eve of his death in 1962, when he filled his blackboard with diagrams and formulas concerning one of the thought experiments he devised to rebut his late friend, who had died seven years earlier.

Deep problems are not wiped out by Kuhn's irreverent eraser, but instead reappear under different guises. This is certainly the case with the bundle of philosophico-scientific conundrums called EPR, as evidenced by the interviews of seventeen foundational workers in QT collected by Maximilian Schlosshauer. (5) Indeed, no two of the said experts coincide in everything concerning those problems. The great Leibniz would have been surprised at such cacophony, since he thought that logic would eventually be mathematized to the point that, when confronted with an intellectual debate, the discussants would accept his invitation: Calculemus.

II

Indivisibles or Ensembles? Einstein did not dispute the sensational success of QT in accounting for any number of experimental results in fields apparently as distant as spectroscopy and astrophysics, atomic and solid state physics, high-energy physics and chemistry, nuclear engineering and machine computation. But he claimed that all those cases concern ensembles, not individual quantum-mechanical entities or quantons, as I call them. This view gave rise to the so-called stochastic quantum theory. (6) But a number of experiments performed on single quantons, such as photons or electrons, cast doubt on the thesis that QT, like classical statistical mechanics, is a theory of collective events.

Consider the following crucial experiments designed in part to resolve the dilemma: the double-slit, the partial-reflection mirror, and the Stern-Gerlach ones.

Double-slit. What happens to a light beam when it strikes a screen with two very narrow and parallel vertical slits? If light were composed of corpuscles, as Newton thought, some of them would go through the left slit whereas the others would go through the right one. What if the beam intensity is decreased so that a single photon strikes the screen at any given time? Since the photon is indivisible, it "comes through both slits, spreads over the entire screen, and collapses to an atom-size disturbance on interacting with the [second] screen's atoms." (7)
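Hobson's description may be rendered in standard textbook notation (an illustration, not Bunge's own formalism): the single photon's state is a superposition of the two slit amplitudes, and the detection density on the second screen contains a cross term responsible for the fringes.

```latex
% Standard two-slit superposition (textbook notation, for illustration):
% the single photon's state combines the amplitudes for the two slits,
\[
  \psi \;=\; \tfrac{1}{\sqrt{2}}\,(\psi_{L} + \psi_{R}),
\]
% and the detection density on the second screen carries a cross term,
\[
  |\psi|^{2} \;=\; \tfrac{1}{2}|\psi_{L}|^{2} + \tfrac{1}{2}|\psi_{R}|^{2}
  + \operatorname{Re}\,(\psi_{L}^{*}\psi_{R}),
\]
% which yields the interference fringes that a 50/50 mixture of "left"
% corpuscles and "right" corpuscles could not produce.
```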

Semitransparent mirror. If a light beam strikes a semitransparent mirror inclined at 45 degrees with respect to the light beam, it splits into two roughly equal branches: one goes through the mirror while the other is reflected. Next, decrease the intensity of the beam so that a single photon strikes the mirror at a time. What will the indivisible photon do when about to hit the splitter? Will it go straight through the mirror, or will it reflect? The answer is that both courses are equally possible, so that if the light intensity is increased, we will observe two beams with roughly the same intensity.
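A minimal Monte Carlo sketch of the last point, assuming only that each photon independently takes either branch with probability 1/2 (the function and parameter names are mine): individual outcomes are unpredictable, yet the relative intensities of the two beams converge to 1/2.

```python
import random

def beam_splitter(n_photons: int, p_transmit: float = 0.5) -> dict:
    """Send n_photons at a semitransparent mirror; each one is independently
    transmitted with probability p_transmit (assumed to be 1/2)."""
    transmitted = sum(random.random() < p_transmit for _ in range(n_photons))
    return {"transmitted": transmitted, "reflected": n_photons - transmitted}

# Each photon's fate is random, but the frequencies stabilize at 1/2:
for n in (10, 1_000, 100_000):
    counts = beam_splitter(n)
    print(n, counts, f"{counts['transmitted'] / n:.3f}")
```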

Stern-Gerlach splitter. If a beam of silver atoms is subjected to a vertical inhomogeneous magnetic field, it splits into two branches: one bends upward and the other downward. The standard account is that, before entering the magnetic field, the ingoing electrons are in a so-called superposition of two spin states: one with the magnetic moments aligned with the field and one with the magnetic moments antiparallel to it. When traveling through the field, the spin-up electrons acquire the additional energy μH, where μ stands for the Bohr magneton, whereas the spin-down ones release the same energy, so that the field separates the upper from the lower branch of the superposition. The alternative field-theoretic account is this: the incoming matter wave splits into two branches with opposite polarizations.
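In textbook notation (an illustrative rendering, keeping the paragraph's sign convention), the standard account reads:

```latex
% The standard spin-superposition account, in textbook notation:
\[
  \psi_{\text{in}} \;=\; \alpha\,\psi_{\uparrow} + \beta\,\psi_{\downarrow},
  \qquad |\alpha|^{2} + |\beta|^{2} = 1,
\]
% with the inhomogeneous field shifting the two components' energies by
\[
  E_{\uparrow} = +\mu H, \qquad E_{\downarrow} = -\mu H,
\]
% so that the opposite forces steer the two components into the upper
% and lower branches.
```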

III

Two Kinds of Chance. Let us now go back to the Einstein-Bohr dialogue. Would an omniscient being play dice? Of course not, for he would be able to predict the outcome of every throw. Indeed, he could find out the initial positions and velocities of all the dice and, using the laws of classical mechanics for solids, he could compute the final configuration. In other words, such a being would beat chance of the second kind, or derivative chance, as I will call it. A much harder question, however, is whether such a being could beat primary chance, as exemplified by typically quantum processes such as the radioactive decay of an atomic nucleus or the radiative decay of an excited atom.
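The classical computation invoked here is deterministic in the textbook sense: Hamilton's equations plus initial data fix the whole trajectory (standard notation, my illustration).

```latex
% Classical determinism: Hamilton's equations,
\[
  \dot{q} = \frac{\partial H}{\partial p}, \qquad
  \dot{p} = -\frac{\partial H}{\partial q},
\]
% together with the initial data (q_0, p_0), determine the entire motion,
% which is why an omniscient being could compute away derivative chance.
```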

Derivative or secondary chance is inherent in the disorder of an ensemble of mutually independent items. Entropy is the degree of such disorder. The description and prediction of a large ensemble of similar entities or events can be made knowing the initial values of a handful of collective properties like entropy, temperature, and pressure, jointly with the laws that bind them. (These are the laws of statistical mechanics and thermodynamics. Information theory, sometimes invoked in our controversy, lacks such laws and is therefore incapable of computing such predictions.) In sum, derivative chance can be produced and forecast exactly. By contrast, primary or irreducible chance is inimitable and unpredictable in detail.

A note of caution: Although the primary/secondary split is well grounded, it may not be definitive. Indeed, in principle it is possible that future experiment may tag the components of atomic nuclei, and that future theory may predict which of them will acquire, through collisions, the energy necessary to escape the attractive force.

IV

QT: Probabilistic or Statistical? In the preceding section we distinguished primary or irreducible chance from secondary or reducible chance. They are different, but fortunately they are related by the famous formula incorrectly credited to Ludwig Boltzmann, namely, S = k log W, where W is the number of microphysical configurations or orders compatible with the macrostate characterized by the entropy S and other macroproperties.
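A worked miniature may help, under the assumption of N independent two-state components (my illustration, not from the original):

```latex
% Assumption for illustration: N independent two-state components, so that
\[
  W = 2^{N}
  \quad\Longrightarrow\quad
  S = k \log W = N\,k \log 2 ,
\]
% and the entropy, the measure of secondary chance, grows with the size
% of the disordered ensemble.
```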

QT is sometimes said to be probabilistic and at other times statistical. Which of these characterizations is correct? This question is pertinent because probabilistic theories are centered on the probability concept, while statistical ones are centered on statistics such as frequency, average, and mean standard deviation or fluctuation.

Furthermore, the theories in question are rooted in different but interrelated problems:

Direct: Given a probability distribution, calculate the corresponding statistics.

Inverse: Given the statistics, guess the underlying probability distribution. (Both problems are illustrated in the sketch below.)

In other words, my answer to the original question is this:
   When looking forward, at the occurrence of future random events, QT is probabilistic.

   When looking backward, at a list of past data on ensembles, QT is statistical.
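The forward/backward contrast can be made concrete with a small sketch, using a plain Bernoulli setup (an illustration of the direct/inverse distinction, not of QT's machinery): the direct problem computes statistics from a given distribution, while the inverse problem estimates the distribution from recorded data.

```python
import random

# Direct problem: given a Bernoulli(p) distribution, compute its statistics.
def direct(p: float) -> tuple:
    mean = p                   # expected frequency of "success"
    variance = p * (1.0 - p)   # mean squared fluctuation
    return mean, variance

# Inverse problem: given a list of past outcomes, guess the underlying
# distribution (the maximum-likelihood guess for p is the observed frequency).
def inverse(outcomes: list) -> float:
    return sum(outcomes) / len(outcomes)

p = 0.5                                        # looking forward: a probability
data = [random.random() < p for _ in range(10_000)]
print(direct(p))                               # (0.5, 0.25)
print(round(inverse(data), 3))                 # looking backward: a statistic
```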


In sum, the single quanton behaves randomly, whereas the ensemble of quantons behaves statistically. In other words, we find irreducible randomness at the microphysical level and emergent determinacy at the macrophysical one. The randomness in question is irreducible or primary in the sense that it cannot be eliminated by altering either the incoming quantons or the splitter; the value 1/2 for either probability does not depend on any hidden variables or properties of the things in question. But, by increasing the beam intensity, we transform a probability into a frequency. In other words, for large numbers, frequencies emerge from probabilities, not the other way around, as the frequentists like John Venn and Richard von Mises claimed, largely because of their empiricist bias.

Further, I claim that the quantum probabilities are objective properties of individual microphysical items, not degrees of belief, as the Bayesians like Bruno de Finetti and Leonard Savage claimed. So much so that objective probabilities can be evaluated via measurements of the corresponding frequencies, which are objective properties of ensembles. A Geiger counter will do this in the case of radioactive decay.
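A sketch of the Geiger-counter remark, with a hypothetical decay constant (the values of LAMBDA, T, and N below are illustrative): the measured frequency of decays within the counting window estimates the objective probability of decay in that window.

```python
import random

LAMBDA = 0.1    # hypothetical decay constant, per second (illustrative)
T = 5.0         # counting window, seconds
N = 100_000     # number of nuclei watched

# Simulated decay times (exponentially distributed); the counter only
# records how many fall inside the window:
decays = sum(random.expovariate(LAMBDA) <= T for _ in range(N))

# The frequency estimates the objective probability 1 - exp(-LAMBDA * T):
print(f"measured frequency: {decays / N:.4f}")   # ~0.3935
```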

However, many philosophers and a few scientists use the Bayesian (subjectivist or personalist) view of probability as a measure of an individual's degree of belief. Since beliefs are personal and such probability assignments are arbitrary, Bayesianism is hardly scientific: it is only a variety of gambling. (8) In the sciences, probability is introduced only when the referent is quantum-mechanical or, in the case of secondary chance, when a randomization mechanism, such as shaking, heating, or spinning a wheel, can be identified. In short: no randomization process, no randomness, hence no legitimate use of probability.

Many philosophers, by contrast, assign probabilities to propositions, in particular those included in theories, which are ordered by the implication relation. They do not justify the assignment of probabilities to propositions, and they could hardly object to someone speaking of the areas or volumes of propositions, since probability theory is but an application of measure theory, a chapter of abstract mathematics.
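The point can be stated in Kolmogorov's standard setting (a textbook rendering, not Bunge's own wording): probability is a measure defined on sets, not on propositions.

```latex
% Kolmogorov's setting: a probability space is a triple (\Omega, \mathcal{F}, P),
% where \mathcal{F} is a field of subsets of \Omega and
\[
  P : \mathcal{F} \to [0,1], \qquad P(\Omega) = 1, \qquad
  P\Big(\bigcup_{i} A_{i}\Big) = \sum_{i} P(A_{i})
  \;\;\text{for disjoint } A_{i} \in \mathcal{F},
\]
% so the bearers of probability are sets in \mathcal{F}; propositions, like
% the "areas" or "volumes" of propositions, fall outside this setting.
```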

In short, the splitting experiments have shown that primary chance is objective. In either case, probability is a measure of possibility, the precursor of actuality.

V

Heisenberg's Indeterminacies. In 1927 Werner Heisenberg wrote one of the most famous and most misunderstood formulas in the history of science: his misnamed uncertainty principle. The formula in question, which is actually a theorem implied by certain physical and mathematical assumptions, states that the indeterminacies or standard deviations of the position and the momentum of a quanton are the duals of one another, in that the increase of one of them is compensated for by the decrease of the other.
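In its standard textbook form, with Δx and Δp the standard deviations of position and momentum, the theorem reads:

```latex
% Heisenberg's theorem in standard form: \Delta x and \Delta p are the
% standard deviations (objective scatters), not anyone's uncertainties,
\[
  \Delta x \,\Delta p \;\geq\; \frac{\hbar}{2},
\]
% so shrinking the position scatter forces the momentum scatter to grow.
```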

The popular version of this formula is that our uncertainty about the exact value of the position is inversely proportional to our uncertainty about the exact value of the momentum. But, of course, uncertainties are states of human knowledge, whereas the said assumptions speak only of quantons and their dynamical properties. At stake in QT are objective indeterminacies, not subjective uncertainties, (9) as suggested by the fact that Heisenberg's theorem is implied by premises that refer only to physical items.

The Heisenberg theorem should have had a strong impact on ontology. Indeed, it should have told us that (a) some of the variables we use to describe physical reality are indeterminate, in the sense that most of the time they have no precise values but have nonzero fluctuations; (b) their potential values are correlated with probabilities, in the sense that, at any given place and time, what matters is the density ψ*Xψ of a variable ("observable") X representing a physical property of a quanton in state ψ; (c) some of the properties of a quanton constitute a system, in the sense that they are interdependent.
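Points (a) and (b) can be written out in standard notation (an illustration):

```latex
% Standard notation for the mean and scatter of an "observable" X of a
% quanton in state \psi:
\[
  \langle X \rangle = \int \psi^{*} X \psi \, dx ,
  \qquad
  (\Delta X)^{2} = \langle X^{2} \rangle - \langle X \rangle^{2} ,
\]
% a nonzero \Delta X being a property of the quanton itself, not of our
% knowledge of it.
```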

VI

Old Roots. The notion of chance was alien to the ancient and medieval worldviews, nearly all of which imagined the universe as an orderly cosmos with no place for unrealized possibility, disorder, spontaneity, or emergence. Yet contemporary ideas about chance have three ancient roots: the games of chance played by soldiers and other commoners, Epicurus's clinamen or spontaneous deviation from the straight path, and Spinoza's distinction between natura naturans and natura naturata.

The professional soldiers of old were fond of dicing with sheep knuckles to while away the time. But the idea that there could be laws of chance emerged only in the early modern period, when Galileo, Descartes, Pascal, and a few other scholars were curious and ambitious enough to try to find such laws. This is how probability theory was born, namely, as the calculus of chance.

Epicurus, one of the great ancient atomists, speculated that atoms moved in straight lines as Democritus had imagined, except for the clinamen or small irregular and spontaneous (uncaused) departures. This may have been the earliest concept of spontaneity--something utterly alien to the artificial intelligence community. It came back unexpectedly about two millennia later under the guise of the Zitterbewegung or trembling motion of the relativistic electron that Erwin Schrödinger discovered in Dirac's theory of electrons and positrons. In the 1940s, the attention of physicists shifted to quantum electrodynamics, partly because it showed that, far from being utterly void, the vacuum is filled with a subtle substance subject to uncaused fluctuations that affect electrons (Lamb) and even metallic plates (Casimir) immersed in it. Some years later, the present author introduced a position coordinate, usually called the Feynman-Corben-Bunge operator, subject to somewhat subdued spontaneous oscillations and that, combined with momentum, yields six new constants of the motion, as well as new indeterminacy formulas involving energy and time. (10)

Spinoza drew a sharp distinction between natura naturans (nature in the making or creating) and natura naturata (nature made or created). The pairs possibility/actuality, past/future, input/output, and probability/frequency belong in the study of the naturans/naturata duality, whereas the notion of present or now is not only an egocentric particular, as Bertrand Russell pointed out, but also a hinge between naturans and naturata.

The same distinction may also be exemplified by the difference between the decay process and the decaying thing. When combined with Epicurus's clinamen, we get the gist of radiative and radioactive decays, for both are regarded as spontaneous (uncaused) as well as random and therefore subject to probabilistic patterns derivable from the principles of QT. This theory tells us that a nucleus or an atom in an excited state is bound to decay to a lower energy state while at the same time emitting a "particle" or a photon, as in the case of the reaction "neutron → proton + electron + antineutrino." The theory does not predict when such a decay will happen, but it may predict the probability that it will happen during a given time interval.
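The probabilistic pattern in question is the familiar exponential decay law (standard form, with λ the decay constant fixed by the theory):

```latex
% Standard decay law: the probability that a given excited nucleus or
% atom decays during the interval [0, t] is
\[
  P(t) = 1 - e^{-\lambda t},
\]
% QT yields \lambda, and hence P(t), but not the moment of any
% individual decay.
```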

VII

Concluding Remarks. In conclusion, there are two very different kinds of chance: primary and secondary. Both occur in reality, though the former operates mainly at the microlevel, whereas the latter occurs only at the macrolevel. In between, both kinds of chance occur. For example, flu vaccines, used to avoid flu contagion, are designed in anticipation of certain gene mutations, which happen in between those levels. But in some years the forecasts fail to identify the prevailing flu strains, and huge quantities of ineffective vaccines are manufactured in vain. A Bayesian or subjectivist approach to this problem, which does not take objective chance seriously, would only worsen matters for immunologists and the pharmaceutical industry. In sum, let us take chance just as seriously as causation, for both modes of becoming occur in the real world, and not even God could beat primary chance any more than he could lift himself by pulling his shoelaces.

McGill University

Correspondence to: Department of Philosophy, McGill University, Leacock Building, 855 Sherbrooke Street West, Montreal, Canada, H3A 2T7.

(1) Albert Einstein, Boris Podolsky, and Nathan Rosen, "Can Quantum-mechanical Description of Physical Reality Be Considered Complete?" Physical Review 47 (1935): 777-80.

(2) See Mario Bunge, "The Einstein-Bohr Debate over Quantum Mechanics: Who Was Right about What?" Lecture Notes in Physics 100 (1979): 204-19.

(3) See, for example, Albert Einstein, "Autobiographical Notes," in Albert Einstein: Philosopher-Scientist, ed. P. A. Schilpp (Evanston, Ill.: Library of Living Philosophers, 1949), 2-94.

(4) Wolfgang Pauli, "Letter to Max Born, 31 March 1954," in The Born-Einstein Letters (New York: Walker and Company, 1971), 221.

(5) Maximilian Schlosshauer, Elegance and Enigma: The Quantum Interviews (Heidelberg: Springer, 2011).

(6) See Luis de la Peña, Ana María Cetto, and Andrea Valdés-Hernández, The Emerging Quantum: The Physics Behind Quantum Mechanics (Heidelberg: Springer, 2015).

(7) Art Hobson, Tales of the Quantum (Oxford: Oxford University Press, 2017).

(8) See Mario Bunge, Evaluating Philosophies (Dordrecht: Springer, 2012).

(9) See Mario Bunge, Philosophy of Physics (Dordrecht: Reidel, 1973).

(10) See Mario Bunge, "A Picture of the Electron," Nuovo Cimento, series 10, 1 (1955): 977-85; and Mario Bunge, "Velocity Operators and Time-energy Relations in Relativistic Quantum Mechanics," International Journal of Theoretical Physics 42 (2003): 135-42.
