Wolfgang Smith
A golden age of physics, one can see in retrospect, commenced around the year 1900: the dazzling era of quantum mechanics had begun. That golden age, however, has proved to be short-lived.
It began, if you will, in the year 1897 with J. J. Thomson’s discovery of the electron, the first so-called quantum particle to see the light of day. This epochal breakthrough was followed in quick succession by the detection of protons and neutrons: the structure of the chemical elements was about to present itself to scientific view in the form of the now familiar atom, with its heavy, positively charged proton-neutron nucleus and light, negatively charged electrons revolving around it in shells. At one stroke the mystery of the chemical elements, as enshrined in the periodic table, had been resolved, reducing chemistry in principle to physics. Meanwhile, in the fateful year 1925, the dynamics of these so-called quantum particles was successfully encoded in the famous Schrödinger wave equation, which extends the classical laws of motion into the newly discovered quantum realm. The venerable mechanics of Newton—which for three centuries had been perceived as the master-key to the secrets of Nature—was now demoted to the status of an approximation, valid in the limit as Planck’s constant h tends to zero.
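In schematic terms the claim can be displayed as follows (a standard textbook limit, sketched here merely to fix ideas): writing the wave function as ψ = A e^{iS/ℏ} and letting ℏ tend to zero, Schrödinger’s equation passes over into the Hamilton–Jacobi equation of classical mechanics,

\[
i\hbar\,\frac{\partial \psi}{\partial t} \;=\; -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi + V\psi
\qquad\longrightarrow\qquad
\frac{\partial S}{\partial t} \;+\; \frac{|\nabla S|^{2}}{2m} \;+\; V \;=\; 0,
\]

whose solutions S generate precisely the Newtonian trajectories.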
In the wake of this mathematization of quantum theory, discovery succeeded discovery in its triumphant march. The first enigma to be resolved was that of atomic spectra: those astounding sequences of spectral lines in the radiation emitted by atoms, which could be measured with uncanny precision and used to identify not only the chemical elements, but even their isotopes. It did not take long, following the discovery of the Schrödinger wave equation, for the likes of Wolfgang Pauli to calculate these spectral frequencies to degrees of accuracy never before attained. There was reason to believe that, in certain domains at least, physics had at last reached its ground level and was now conversant, not merely with epiphenomena, but with Nature’s fundamental laws.
It was a period of unprecedented discovery, in which entirely new domains of inquiry and spheres of application came rapidly into view: think of the cyber revolution, to mention but one example, the foundations of which rest upon quantum theory. And let us not forget the atom bomb, which in its own horrific way proclaims the stupendous magnitude of the quantum-mechanical breakthrough. What presently concerns us, however, is not the triumphs of quantum physics in the course of what we have termed its golden age, but rather the emergence of its limitations: for scarcely had the new theory come to birth when phenomena came to light demonstrating its incompleteness.
The atomic nucleus, for example, posed problems the new physics alone could not resolve: what is the force which prevents the protons within the nucleus from flying apart? Given that neither electromagnetic nor gravitational forces are strong enough, it became apparent that there must exist an as yet unknown force. So too, soon enough, a host of other so-called quantum particles made their appearance: the positron, for example—a positively charged electron—was predicted in 1928 and observed four years later. And as a matter of fact, two other so-called “antiparticles”—the antiproton and antineutron—were likewise predicted in 1928, and observed in 1955 and 1956, respectively. Moreover new parameters descriptive of quantum particles—such as the measure of spin—came to be defined, and new particles, characterized in terms of these various indices, made their appearance in droves. What ensued was an embarrassment of riches, soon dubbed the particle zoo: a curious ensemble of weird semi-entities, with half-lives as small as 10⁻²⁰ seconds, seemingly void of rhyme or reason. The need to bring order into this seeming chaos became paramount, and clearly the newly discovered quantum mechanics was not, by itself, up to the task.
By mid-century more or less, the beautiful new theory had apparently delivered its quota of epochal discoveries, and came to be viewed as something quite familiar and well explored. Yet when it comes to the foundations, Richard Feynman’s “No one understands quantum theory”—far from having been rendered obsolete—has rather been confirmed by the fact that all attempts to “understand quantum theory”—as opposed to merely applying it—seem to the neutral observer to have failed. But be that as it may, the theory functioned magnificently wherever it was applied. By about mid-century, however, questions took center stage which seemed to call for other means, and thus the focus of fundamental physics began to shift: from quantum mechanics to the emerging vistas of particle physics. On its foundational level, physics was fast entering a subsequent epoch: an era defined by newly formulated conceptions and staffed by a distinctly new breed of physicists.
By now—the second decade of the twenty-first century—the golden age of quantum mechanics has long since passed, and both the style and the content of fundamental physics have changed substantially. Generally speaking, we appear to know far more than in the preceding “golden” age, but with incomparably less assurance. Meanwhile it has become harder than ever for the non-specialist to catch so much as a glimpse of what is actually taking place on the foundational plane—to which one might add that the task of distinguishing scientific fact from scientistic speculation has never been so daunting. What is urgently called for is a definitive treatise on the state of particle physics today, authored by an insider of the physics elite astute enough, honest enough—and above all, courageous enough—to tell the story as it is.
* * *
Enter Sabine Hossenfelder, a particle physicist of stature amply endowed with the aforesaid qualifications, who in a recent book, entitled Lost in Math,1 has lifted the veil. She begins with a comment on the interplay between theory and experiment, noting that “in the last century, the division of labor between the theorists and the experimentalists worked very well. But my generation has been stunningly unsuccessful” (1).2 And with astounding candor she admits: “I witnessed my profession slip into crisis… I’m no longer sure any more that what we do here, in the foundations of physics, is science” (2).
Between the glory days of quantum theory, moreover, and the present impasse, there is a transitional phase defined by a signal accomplishment: namely, the theoretical formulation and subsequent empirical verification of the so-called standard model, which reduces the list of quantum particles—inclusive of the so-called particle zoo—to precisely twenty-five “elementary” particles, of which all the rest are said to be composites. Out of the original electron, proton, neutron triad basic to chemistry, only the electron has survived as an indecomposable. I might mention that nearly half a century elapsed before the last of the standard model particles, the so-called Higgs boson, predicted in the 1960s, was detected: it took the Large Hadron Collider at CERN—boasting a 16-mile tunnel and a price tag of 6 billion dollars—to accomplish this feat.
Curiously enough, however, particle physicists, as a rule, are not pleased with their magnum opus; no one, it seems, cares for the standard model! For one thing, the newly found “elementary particles” have failed so far to contribute significantly to the understanding of what might be termed “our” world. Whereas the discovery of electrons, protons and neutrons has revolutionized science, beginning with chemistry, and has given rise to hitherto undreamed-of applications, the identification of semi-entities such as quarks and gluons has had no remotely comparable effect beyond the confines of particle theory as such. It could in fact be argued that nothing of empirically appreciable consequence would be lost if we were to regard the familiar protons, electrons and neutrons as the indecomposable constituents of physical entities, so long of course as we bear in mind that they obey the laws of quantum theory as distinguished from Newtonian mechanics. As Sabine Hossenfelder explains: “This way we can speak of atoms and their shell structure, of molecules and their vibrational modes, and of metals and their superconductivity—even though in the underlying theory there aren’t any such things as atoms, metals, or their superconductivity, only elementary particles” (44). And as to the magnum opus of twentieth-century particle physics, she concludes: “The most astounding fact about high-energy particle physics is that you can get away with knowing nothing about it.”
It is unclear to what extent this “uselessness” has been an issue for present-day particle physicists. I will mention in passing that from the vantage point of the metaphysical traditions, their quest points “downwards” in reference to the scala naturae: from the pole of morphe to that of hyle, a descent which cannot but lead eventually to the “nothingness” of prima materia. And let me note that this view of the matter actually accords with Werner Heisenberg’s stipulation regarding the ontological status of quantum particles as something “midway between being and nonbeing,” which as such are reminiscent of Aristotelian potentiae. Ontologically speaking, it thus appears that contemporary particle physicists are actually moving in the wrong direction: “away from reality,” towards the nether pole of hyle, where nothing at all exists. Their passion, according to this optic, is to hypostatize the ontological underside of whatever entities or semi-entities they encounter, as if “further down” signified “closer to reality.” Is it any wonder, then, that physicists seem to be losing what Nick Herbert has so aptly called their “grip on reality”!
But hylomorphic considerations, evidently, do not carry much weight these days, and one might recall that even Heisenberg—the son of a classicist—stood alone among quantum theorists of the time in his ontological reflections. Meanwhile it should be noted that what to some extent at least compensates for the contemporary lack of ontological discernment is a passion for mathematics, coupled moreover with a deep and authentic apprehension of its beauty: the majesty, one could say, of its sovereign symmetries. And this is something Sabine Hossenfelder touches upon again and again, in full recognition of the well-nigh central role this factor currently plays in the dynamics of fundamental research. It appears moreover to be the underlying reason why no one among leading physicists is satisfied with the standard model, which seems nowadays to be perceived as a kind of makeshift, something that for aesthetic reasons alone could not possibly be the last word. As Sabine Hossenfelder explains:
The standard model, despite its success, doesn’t get much love from physicists. Michio Kaku calls it “ugly and contrived.” Stephen Hawking says it’s “ugly and ad hoc,” Matt Strassler disparages it as “ugly and baroque,” Brian Greene complains that the standard model is “too flexible,” and Paul Davies thinks it “has the air of unfinished business” because “the tentative way it bundles together the electroweak and strong forces” is an “ugly feature.” I have yet to find someone who actually likes the standard model. (70)
* * *
Yet its most serious flaw in the eyes of the physics community is doubtless the fact that it fails to account for one of the basic forces: gravity, that is. And perhaps the single most intensely pursued and erstwhile promising attempt to remedy this perceived deficiency is the discipline known as “string theory,” which since about the mid-1980s has been the field of choice for many of the most highly gifted theorists. The sheer amplitude of its underlying mathematical structure—including spatial manifolds of up to fifteen dimensions—seemed well-nigh to guarantee access to the ultimate secrets of the quantum realm.
A key condition imposed itself from the start, namely supersymmetry, which entailed the fascinating prospect that every particle of the standard model had a supersymmetric partner, the observable parameters of which string theorists could in principle calculate. This enlargement, moreover—if it were to prove real—would “round out” the picture as described by the standard model, and hopefully cure its proverbial “ugliness”: the very “beauty” of supersymmetry seemed almost to guarantee this consummation. The expectation was rife, moreover, that at sufficiently high energies, particle collisions would produce these much wished-for particles; and it is by no means implausible to suppose that the lure of supersymmetry may in fact have been the decisive motivation for the construction of the LHC. But as Sabine informs us: “The LHC began its first run in 2008. Now it’s December 2015 and no signs of supersymmetry have been found…” To which I would add that the situation remains unchanged to the present day. Yet whereas many if not most theorists have by now abandoned the postulate of supersymmetry, others continue to believe the predicted “partners” will someday be found.
Meanwhile something far more portentous even than the apparent demise of supersymmetry has come to pass; as Sabine explains:
In reaction, most string theorists discarded the idea that their theory would uniquely determine the laws of nature and instead embraced the multiverse, in which all the possible laws of nature are realized somewhere. They are now trying to construct a probability distribution for the multiverse according to which our universe would at least be likely. (173)
* * *
“I begin to worry,” she commented upon returning from a conference, “that physicists are about to discard the scientific method”: no wonder she was concerned! For by the time one postulates a multiverse, the line has already been crossed. Let us be clear on this pivotal issue: it is by no means possible to accommodate multiverse theory to the scientific method, for the simple reason that no empirical communication between one universe and another can conceivably take place: no signal of any kind, after all, can pass between them. Inasmuch, therefore, as one can neither confirm nor falsify the existence of “another universe” by empirical means, the hypothesis of a multiverse is in principle bereft of scientific sanction. The so-called multiverse reduces thus to a kind of deus ex machina missioned to rescue a scientific theory which fails to square with the facts. How marvelous: an erroneous prediction comes thus to be verified, howbeit in another world!
The multiverse, however, constitutes by no means the only dubious conception currently in vogue: it appears that in this regard, contemporary astrophysics leads the way. And I would add that since the confirmation of what has been dubbed “the axis of evil” (based upon data from the Planck satellite), the fundamental premise underlying relativistic astrophysics—the so-called Copernican principle—has been demonstrably disqualified.3
Yet to this day particle physicists avail themselves freely of tenets pertaining to this invalidated domain as if they were well-established scientific facts. A case in point is so-called “inflation”: “Inflation has some evidence speaking for it, though not overwhelmingly so,” states Sabine, noting moreover that “physicists have further extrapolated this extrapolation to what is known as ‘eternal inflation’” (104). Yet despite these implicit uncertainties, “the string theory landscape conveniently merged with eternal inflation,” she goes on to say (105). Nor does Sabine Hossenfelder stand alone in her doubts; as one of her colleagues remarked: “One can always find inflationary models to explain whatever phenomenon is represented by the flavor of the month… Inflationary cosmology, as we currently understand it, cannot be evaluated using the scientific method” (212). No wonder Sabine “is not sure anymore that what we do here, in the foundations of physics, is science”!
On a lighter note, she relates how a bump in the LHC data, observed in December 2015, sparked the idea of a so-called “diphoton anomaly,” which was subsequently put to rest by data gathered on August 4th of the following year: “In the eight months since its ‘discovery’ more than five hundred papers were written about a statistical fluctuation. Many of them were published in the field’s top journals. The most popular ones have already been cited more than three hundred times. If we learn anything from this, it’s that current practice allows theoretical physicists to quickly invent hundreds of explanations for whatever data happens to be thrown at them” (235). From which she concludes:
How long is too long to wait for a theory to be backed up by evidence?… Five hundred theories to explain a signal that wasn’t and 193 models for the early universe are more than enough evidence that current quality standards are no longer useful to assess our theories. (236)
* * *
The need for a theory “to be backed up by evidence”: this takes us to the heart of perhaps the major problem which has increasingly afflicted contemporary physics on its foundational level. As physicists began to probe beyond the realm of such truly basic quantum particles as the proton—and the gap between theory and application began to widen by leaps and bounds—“evidence” became ever more scarce. And by the time one arrives at the string theories—with their “invisible” dimensions—the temptation to dispense with that increasingly inconvenient requirement becomes ever more imperious. Emboldened moreover by the fact that the pursuit of “mathematical elegance” had often in the past led to notable success, an increasing cadre among the most gifted specialists has apparently succumbed to the lure of “unconfirmed” and “unconfirmable” theory—right up to the reductio ad absurdum of the multiverse.
The most noteworthy commentary, in that regard, comes from the well-known physicist George Ellis, whom Sabine interviewed: “There are physicists now saying we don’t have to test their ideas because they are such good ideas,” he begins. “They’re saying—explicitly or implicitly—that they want to weaken the requirement that theories have to be tested. To my mind, that’s a step backwards by a thousand years” (213). Well said, obviously. The big question, however, it seems to me, is whether “beyond a certain depth” empirical verification, in the true sense, is still feasible. Compare a proton, say, to a quark: one can still manipulate the former, place it in an electric field, for instance, and send it off in some more or less specified direction. In spite of “quantum indeterminacy,” a proton is yet close enough to a “thing” to be somehow observed and tested. So much so, in fact, that scientists have been tempted to “reify” such pseudo-entities in their imagination, “corporealize” them as it were. My point is that the picture changes drastically as one “descends” to a still deeper level of pure materiality: for as we have noted earlier, the ontological shift is directed towards prima materia, which, technically speaking, does not actually exist. In place of a “theory of everything,” therefore, the further evolution of fundamental physics appears to be moving inexorably towards a theory of “nothing at all”: the very nothingness, if you will, of the multiverse itself.
Clearly Sabine Hossenfelder is onto something when she writes: “I can’t believe what this once venerable profession has become. Theoretical physicists used to explain what was observed. Now they try to explain why they can’t explain what was not observed” (108).
* * *
But regardless of whether contemporary particle physics may have “crossed the line” beyond which its modus operandi begins to fail, I would like now to touch upon another fundamental issue—utterly different—which strikes me as a likely source of invalidity. I am referring specifically to the wholesale acceptance of Einsteinian relativity (in both of its forms), along with the Einsteinian astrophysics, which has left its mark upon particle physics by way of pivotal conceptions such as “inflation” and “dark energy.” This incorporation of Einsteinian theory into particle physics, I must admit, strikes me as uncritical, and I suspect that it may be the prime source of error affecting particle physics at this time. I have contended elsewhere4 in reference to the special theory that Einstein’s original argument was based ultimately upon ideological, as opposed to scientific, considerations, and that crucial experiments have in fact proved him wrong. And as regards the Einsteinian astrophysics, I have come to surmise that the theory has been kept alive from the outset by way of ad hoc hypotheses, missioned to neutralize corresponding domains of hostile data.
One of the earliest conceptions thus to impose itself upon fundamental physics is that of “inflation,” which, as we have come to see, has risen to the status of a foundational premise in particle physics. Inasmuch, however, as inflation alone could not fully account for the Einsteinian expansion it was supposed to explain, a second ad hoc conception was brought into play: i.e., “dark energy.” I will leave it to the experts to inform us whether this does or does not “explain” the seemingly accelerated expansion of the cosmos at large; what I wish to point out, first of all, is that the notion has sparked an immense amount of research. As Sabine Hossenfelder informs us: “There is an extensive literature on conjectured dark energy fields, like chameleon fields, dilaton fields, moduli, phantom fields, and quintessence.” The prime question, of course, is whether such a thing as “dark energy” actually exists. And it might not be irrelevant to point out that the concept bears the distinction of having led to what Michio Kaku calls “the biggest mismatch between theory and experiment in the history of science”: to a vacuum energy, namely, which is “120 orders of magnitude too big”!5
Leaving aside notions imported into particle physics from “big bang” cosmology, the fact remains that particle physicists have, from the start, built the basic conceptions of Einsteinian relativity into the foundations of their discipline. The big question, of course, is whether this “importation” can be justified on rigorous grounds. To begin with, one might ask whether the conspicuous success of Richard Feynman’s QED—his quantum electrodynamics—may have tilted the scales: for this would have been illegitimate, given that electrodynamics—in contrast to mechanics—has been “Einsteinian” from its inception by virtue of its Lorentz invariance. The question remains whether the Einsteinian symmetries at large upon which present-day particle physics is based are in fact justified; and for my part—speaking, to be sure, as an outsider—this appears open to doubt.
It should be recalled, in this connection, that “foundational physics,” as Sabine Hossenfelder points out, “is as far from experimental test as science can be while still being science” (27). And surprisingly enough, the natural presumption that Einsteinian physics had been carefully and objectively “vetted” before being de facto elevated to the status of a foundational axiom appears questionable. From the start the new physics seems to have imposed itself upon the scientific community at large—not merely as a fascinating hypothesis—but rather as a long-awaited breakthrough, the “coming of age,” almost, of physics per se. As in the case of Darwinian evolution, evidence was hardly the prime concern: Einsteinian relativity was simply “too good” not to be true. Such, in brief, is my reading of the phenomenon, which might in any case explain how sober and often brilliant physicists could conceivably have succumbed to a fantasy, seduced, say, by the “beauty” and “elegance” of space-time geometry.
The fact, in any case, is that particle physics has been Einsteinian from its inception. As regards the standard model, Sabine Hossenfelder informs us that “the gauge symmetries and the symmetries of special relativity dictate most of the standard model’s structure” (55). What remains unaccounted for, at this point, is the force of gravity, which, needless to say, particle physicists have conceived from the outset in terms of general relativity. And that is where string theory comes into play. Although the theory originated as an attempt to understand strong nuclear interactions, its rise to fame began when, as Sabine points out, it was noticed that “the strings were exchanging a force that looked just like gravity” (172). In short, string theory “is a theory of quantum gravity” (188). And this is the theory, let us recall, which has given us the multiverse!
Keith Olive, an authority on supersymmetry, may yet be right when he confides to Sabine: “At the end of the day, the only thing we know to be true is the standard model” (65). It may be time to take a second look at the foundations.
* * *
It remains to ask why “in the field of quantum foundations, our colleagues want to improve a theory which has no shortcomings whatsoever” (5). Sabine Hossenfelder is referring to that “golden age” of quantum theory with which we began these reflections: the good old physics of “protons, electrons and neutrons” which worked—and continues to work—so well. And she answers the question, at least in part: “It irks them”—the “foundational” physicists, namely—“that, as Richard Feynman, Niels Bohr, and other heroes of last century’s physics complained, ‘nobody understands quantum mechanics’” (6). Whatever may have sparked the movement to “improve” upon that physics, it turns out, had little to do with rectifying flaws or correcting deficiencies:
Alas, all experiments have upheld the predictions of the not-understandable theory from the last century. And the new theories? They are still untested speculations. (6)
Admittedly the physics “from the last century” had its limitations; but is this not perhaps a sine qua non of bona fide science per se? One is reminded of Goethe’s memorable dictum “In der Beschränkung zeigt sich der Meister”: could it be that the prowess of science derives ultimately from its Beschränkung, its limitation? And does this not, finally, entail that the quest for a “theory of everything” is bound to fail? For my part, I am persuaded that such is indeed the case: in the final count, physics “works” precisely because it is not a “theory of everything.”
What is it, however, which renders that (admittedly “limited”) “theory from the last century”—the one that “has no shortcomings whatsoever”—“not-understandable,” to use Sabine’s own term? It appears that Steven Weinberg—one of the greatest physicists of our time—has provided the crucial clue. In response to a question put to him by Sabine Hossenfelder, he states:
You can very well understand quantum mechanics in terms of an interaction of the system you’re studying with an external environment which includes an observer. But this involves a quantum mechanical system interacting with a macroscopic system that produces the decoherence between different branches of the initial wave function. And where does that come from? That should be described also quantum mechanically. And, strictly speaking, within quantum mechanics itself there is no decoherence. (126)
The impasse of long standing could not have been stated more clearly! The reason why “no one understands quantum theory” resides thus in the measurement problem. And what renders this conundrum insoluble to the physicist is the fact that “strictly speaking, within quantum mechanics itself there is no decoherence.” Here we have it: the very Beschränkung, it turns out, which bestows upon the physicist his sovereign power to comprehend the physical universe, renders the measurement problem insoluble—i.e., to the physicist!—by restricting his vision to the realm of the physical as such.
What is it, then, that this vision excludes? I answered this question in the first paragraph of my first book: “It excludes the blueness of the sky and the roar of breaking waves,” I wrote, “the fragrance of flowers and all the innumerable qualities that lend color, charm and meaning to our terrestrial and cosmic environment.” To which of course the “scientific” response will be: “But these are all subjective attributes: that color and that sound—that’s all in your head!” Here we have it: the Beschränkung is yet in force! It has not been transcended: the aficionados of physical science have apparently become de facto incapable of transcending it.
Let me say apodictically that there is in truth no scientific basis whatsoever for the aforesaid response, the aforesaid reductionism; and in truth there cannot be. What confronts us here is not a scientific, but rather a philosophical issue. The subjectification of the qualities is actually none other than the Cartesian postulate, what Alfred North Whitehead terms bifurcation, which affirms that reality divides ontologically into two disjoint realms: an external world consisting of res extensae or “extended things,” all else being relegated to a subjective domain of so-called res cogitantes or “thinking entities.” Introduced in the seventeenth century by René Descartes,6 it ushered in the period known as the Enlightenment, and has bedeviled our intellectuals ever since. And this premise, I say, proves to be none other than the Beschränkung upon which physics per se has been based since the days of Newton.
No wonder “nobody understands quantum mechanics”! The very limitation upon which physics as such is based renders this impossible. But whereas our Weltanschauung has been askew since the Enlightenment, what has changed with the advent of quantum theory is that henceforth physics itself can no longer be understood on a bifurcationist basis. The true significance of the measurement problem resides in the seemingly paradoxical fact that decoherence does take place, even though “within quantum mechanics itself there is no decoherence.” But if not “within,” there must be a “without”: and this recognition opens the door to a rediscovery of the authentic “unfiltered” world, along with the fullness of our humanity, beginning with our very sanity.7
The remedy, therefore, is simple in principle: rectify this misconception—expunge the fallacy of bifurcation, root it out—and the resolution of the “insoluble” measurement problem stares you in the face: at one stroke quantum theory ceases to be “not-understandable,” and turns comprehensible in a trice.8
Which brings us finally to the question: how does this resolution of the long-standing quantum enigma impact the present-day particle physics quandary? For my part, I entertain the hope that this rediscovery of the integral cosmos may disclose the way out of that formidable-seeming impasse.
1. New York: Basic Books, 2018.
2. Page numbers will be given in parentheses.
3. I have dealt with this issue in some detail in Physics and Vertical Causation: The End of Quantum Reality (Philos-Sophia Initiative, 2023), ch. 5.
4. Op. cit., ch. 5.
5. Chad Orzel, Lost in Math, op. cit., p. 136.
6. Strictly speaking, the idea traces back to the atomism of Democritus, his celebrated fragment: “By convention there exist color, the sweet and the bitter, but in reality only atoms and the void.” And let me note that both Plato and Aristotle were wise enough to recognize this doctrine as inherently heterodox.
7. As I have pointed out repeatedly, the Cartesian Weltanschauung has plunged us into a state of collective schizophrenia: one moment “the grass is green,” and the next—when again we turn Cartesian—it is not.
8. A summary presentation of that resolution may be found in my lecture “From Schrödinger’s Cat to Thomistic Ontology,” reprinted in Ancient Wisdom and Modern Misconceptions (Philos-Sophia Initiative, 2023), ch. 1. A complete and rigorously “non-bifurcationist” account of physics per se is given in The Quantum Enigma: Finding the Hidden Key (Philos-Sophia Initiative, 2023).