Kindle Notes & Highlights
Read between April 8 – April 13, 2021
Such experiments have provided precise quantitative confirmation of the predictions of the theory, even out to the fifth decimal place.20 Thus, general relativity now stands as one of the best confirmed theories of modern physics.
Questions about the applicability of the theory of general relativity arise, however, for extremely small subatomic and quantum-level phenomena. In the subatomic realm (10⁻¹² cm or smaller), strange phenomena can occur, such as light or electrons acting like both waves and particles at the same ...
some leading physicists, including Paul Steinhardt, one of the originators of inflationary cosmology, now think there are significant reasons for doubting all inflationary cosmological models.)
Within a decade, Borde, Vilenkin, and a third physicist, Alan Guth, one of the original proponents of inflation, had come to a startling conclusion: the universe must have had a beginning, even if inflationary cosmology is correct.
We’ve seen that previous attempts to prove a cosmological singularity at the beginning of the universe were based upon Einstein’s theory of general relativity. This made sense given the strongly intuitive basis of Hawking’s initial insight: if the universe is expanding, the density of mass and thus the curvature of the universe will eventually reach a limit in the reverse direction of time. Arguably, that insight and the mathematical arguments based on it remain strong indicators of a beginning, even if those arguments cannot conclusively prove the validity of extrapolating all the way back to
…
Nevertheless, in 2003, Borde, Guth, and Vilenkin developed a proof for a beginning of the universe that did not depend on using Einstein’s field equations of general relativity or on any energy condition.36 Instead, the Borde-Guth-Vilenkin (BGV) theorem is based sol...
Recall from Chapter 5 that special relativity addresses the relationship between the speed of light and time. The BGV theorem applies to any universe that meets very general conditions, including those implied by inflationary cosmological models. As Alexander Vilenkin explained, “A remarkable thing about this theorem is its sweeping generality. We made no assumptions about the material content of the universe. We did not even assume that gravity is described by Einstein’s equations. So, if Einstein’s gravity requires some modification, our conclusion will sti...
Consequently, the theorem applies to nearly all plausible and realistic cosmological models. It states that any universe that is on average expanding is “past incomplete.” In other words, if one follows any spacetime trajectory back in time, any expanding universe, including one expanding as a consequence of an “infla...
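Stated a bit more formally (a paraphrase of the published 2003 result, not a quotation from the text): if the expansion rate averaged along any timelike or null geodesic is positive, that geodesic cannot be extended indefinitely into the past,

\[
H_{\text{avg}} > 0 \;\Longrightarrow\; \text{the geodesic is past-incomplete,}
\]

where H_avg denotes the expansion rate averaged along the geodesic in question.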
Borde, Guth, and Vilenkin have shown that all cosmological models in which expansion occurs—including inflationary cosmology,39 multiverses,40 and the oscillating and cosmic egg models—are subject to the BGV theorem.41 Consequently, Vilenkin argues that evidence for a beginning is now almost unavoidable. As he explains, “With the proof now in place, cosmologists can no longer hide behind the possibility of a past-eternal universe. There is no escape; they have to face the problem of a cosmic beginning.”42 Since our universe is expanding and the Borde-Guth-Vilenkin theorem does not depend upon
…
For now, though, it’s worth noting that a proof (in the case of the BGV theorem) and a strong indicator (in the case of the Hawking-Penrose-Ellis singularity theorems) have reinforced the testimony of observational astronomy: as best we can tell, the universe did have a beginning.
Astrophysicist Sir Fred Hoyle (Fig. 7.1) pioneered research on how the nuclear reactions in stars transform hydrogen into the many chemical elements, including carbon and oxygen, necessary for life.1 He started his scientific career as a staunch atheist who saw no evidence of design in the universe. As he said in his early years as a scientist, “Religion is but a desperate attempt to find an escape from the truly dreadful situation in which we find ourselves. . . . No wonder then that many people feel the need for some belief that gives them a sense of security, and no wonder that they become
…
His atheism played a major role in his approach to science, priming him to reject the idea that the universe had a beginning. In fact, as we saw, he coined the term “big bang” to ridicule the idea of a cosmic beginning and later developed the steady-state model as an alternative. Unfortunately for Hoyle, after the discovery of the cosmic microwave background radiation (CMBR), support for his steady-state model dwindled as more and more astronomers came to accept the big bang theory. Nevertheless, it was not the discovery of the CMBR, but a different discovery that eventually shook Hoyle’s
…
Indeed, since the 1950s, physicists have discovered that life in the universe depends upon a highly improbable set of forces and features as well as an extremely improbable balance among many of them. The precise strengths of the fundamental forces of physics, the arrangement of matter and energy at the beginning of the universe, and many other specific features of the cosmos appear delicately balanced to allow for the possibility of life. If any one of these properties were altered ever so slightly, complex chemistry and life simply would not exist.
The Mysterious Prevalence of Carbon in the Universe
Hoyle’s contribution to the discovery of fine tuning began in the 1950s. What he discovered shocked him and eventually shook his atheism. Hoyle knew that the universe contained a surprising abundance of carbon. He also knew the production of the element carbon was crucial to all known forms of life. Carbon forms long chain-like molecules that can carry information and store the energy that living cells need to survive.6 People have speculated about life based on other elements, such as silicon, existing somewhere in the cosmos. But physicists
…
Indeed, carbon-based life is the only known form of life, and carbon has features that make it uniquely suitable as the basis for complex chemistry and life. For instance, carbon is essential for forming sufficiently stable, long, chain-like molecules capable of storing and processing genetic information. Carbon also combines with oxygen to form carbon dioxide in essential chemical reactions. Carbon dioxide is a gas, so it can easily escape cells as waste and readily mix throughout the biosphere. In contrast, silicon dioxide is a solid (familiar to us in the form of sand), and it cannot
…
Hoyle was stunned by these and other “cosmic coincidences” that physicists began to discover after the 1950s.28 Whereas before he affirmed atheism and denied any evidence of design, he began to see fine tuning as obvious evidence of intelligent design. As he put it in 1981, “A common-sense interpretation of the facts suggests that a super-intellect has monkeyed with physics, as well as with chemistry and biology, and that there are no blind forces worth speaking about in nature. The numbers one calculates from the facts seem to me so overwhelming as to put this conclusion almost beyond
…
As the British physicist Paul Davies put it in 1988, “The impression of design is overwhelming.”31 Similarly, astrophysicist Luke Barnes notes: “Fine tuning suggests that, at the deepest level that physics has reached, the Universe is well put-together. . . . The whole system seems well thought out, something that someone planned and created.”
The most fundamental type of fine tuning pertains to the laws of physics and chemistry. Typically, when physicists say that the laws of physics exhibit fine tuning, they are referring to the constants within those laws.33 But what exactly are the “constants” of the laws of physics?
The laws of physics usually relate one type of variable quantity to another. A physical law could tell us that as one variable (say, force) increases, another (say, acceleration) also increases proportionally by some factor. Physicists describe this type of relationship by saying that one variable quantity is proportional to another. Conversely, a physical law may stipulate that as one factor increases, another decreases by the same factor. Physicists describe this type of relationship by saying that the first variable quantity is inversely proportional to the other.
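A standard textbook example (not drawn from the passage itself) shows where a “constant” sits in such a law. In Newton’s law of universal gravitation, the force is proportional to each of the two masses and inversely proportional to the square of the distance between them, and the gravitational constant G sets the overall strength of the relationship:

\[
F = G\,\frac{m_1 m_2}{r^2}, \qquad G \approx 6.674 \times 10^{-11}\ \mathrm{N\,m^{2}\,kg^{-2}}.
\]

Fine-tuning claims concern the measured values of constants such as G: the equation would keep the same mathematical form if G were larger or smaller, but the physical consequences would differ.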
As Paul Davies (Fig. 7.6) has marveled, “The really amazing thing is not that life on earth is balanced on a knife-edge, but that the entire universe is balanced on a knife-edge, and would be total chaos if any of the natural ‘constants’ were off even slightly.”34 Or as Stephen Hawking noted, “The remarkable fact is that the values of these numbers seem to have been very finely adjusted to make possible the development of life.”35
We now know that many other constants of physics also exhibit fine tuning as a condition of a life-permitting universe.36 The electromagnetic force constant exhibits moderate fine tuning of 1 part in 25.37 The strong nuclear force constant is fine-tuned to 1 part in 200.38 Moreover, the ratios of the values of the different force constants also require significant fine tuning. For example, the ratio of the weak nuclear force constant to the strong nuclear force constant had to have been set with a precision of 1 part in 10,000.39 If the weak force had been weaker or stronger by that small
…
More impressively, the ratio of gravity to the electromagnetic force must be accurate to 1 part in 10⁴⁰. Were this ratio a bit higher, the gravitational attraction would be too strong in comparison to the contravening force of electromagnetism pushing nuclei apart. In that case, stars would, again, burn too quickly and unevenly to allow for the formation of long-lived stars and stable solar systems. Were this ratio a bit lower, gravitational attraction would be too weak in comparison to electromagnetism. That would have prevented stars from burning hot enough to produce the heavier elements
…
Cambridge theoretical physicist Sir John Polkinghorne, who has argued that cosmic fine tuning now provides the basis for a revived program of natural theology.
There I had the opportunity to interview another Cambridge physicist, Brian Josephson, a Nobel laureate. In our conversation, Josephson explained why he thought the choice of a prior intelligent mind provided a natural explanation of the fine-tuning evidence. As he explained, “It could have been [that there was] some mind around before the kind of universe we know came into being. And if that were right, that mind could, as it were, have intentions for the universe and been able to set it up so that the end result came out right.”43 Interestingly, in an interview for the PBS program Closer to Truth, Josephson later estimated his own confidence in intelligent design as the best explanation for the conditions that would make evolution possible at “about 80 percent.”44
Another physicist I interviewed, the late Henry Margenau, a distinguished Yale professor of quantum physics, put the point even more emphatically. When I asked him how he explained the fine tuning of the laws and constants of physics, he simply said, “There is a mind which is responsible for the laws of nature and the existenc...
Physicist and self-described atheist George Greenstein has confessed that, in the face of his materialistic predilections, “the thought insistently arises that some supernatural agency, or rather Agency, must be involved. Is it possible that, suddenly, without intending to, we have stumbled upon scientific proof for the existence of a supreme being? Was it a God who providentially stepped in and crafted the cosmos for our benefit?”46
Attempts to explain the evidence by invoking chance alone or multiple other universes (more on that in Chapter 16) seemed to him to betray a kind of metaphysical special pleading, even desperation. As Longley explained, the anthropic design argument “is of such an order of certainty that in any other sphere of science, it would be regarded as settled.” He continued: “To insist otherwise is like insisting that Shakespeare was not written by Shakespeare because it might have been written by a billion monkeys sitting at a billion keyboards typing for a billion years. So it might. But the sight of
…
Even so, there are at least two other general types of fine tuning: the fine tuning of the initial conditions of the matter and energy at the beginning of the universe, and the fine tuning of other contingent features of the universe. These don’t get mentioned quite as often as the fine tuning of the laws and constants of physics, but they are every bit as important for the existence and maintenance of life in the universe.
the configuration of matter and energy at the beginning of the universe determined the distribution of matter and energy later in the history of the cosmos. Only the extreme fine tuning of that initial configuration enabled galaxies, stars, and planetary systems to form.
When engineers blast a tunnel, the precise angle and force of the dynamite charges determine the outcome. In the same way, the initial configuration of matter and energy at the beginning of the universe determines whether or not a life-permitting universe will result.
The Oxford physicist Sir Roger Penrose, who collaborated with Stephen Hawking in proving cosmological singularity theorems and later calculated the exquisite and hyper-exponential fine tuning of the initial entropy of the universe.
That’s putting it mildly. The number in question, 10 raised to the power of 10¹²³, is what mathematicians call a hyper-exponential number: 10 raised to an exponent that is itself a 1 followed by 123 zeros. To put that number in perspective, it might help to note that physicists have estimated that the whole universe contains “only” 10⁸⁰ elementary particles (a huge number, a 1 followed by 80 zeroes).
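To make the comparison explicit (a back-of-the-envelope check, not a calculation from the text): written out in ordinary decimal notation, 10 raised to the power of 10¹²³ would be a 1 followed by 10¹²³ zeros, and

\[
\frac{10^{123}}{10^{80}} = 10^{43},
\]

so the zeros required would outnumber the estimated elementary particles in the observable universe by a factor of roughly 10⁴³.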
But that number nevertheless represents a minuscule fraction of 10 raised to the power of 10¹²³.13 In fact, if we tried to write out this number with a 1 followed by all the zeros that would be needed to represent it accurately without the use of exponents, there would be more zeros in the resulting number than there are elementary particles in the entire universe. Penrose’s calculation thus suggests an incredibly improbable arrangement of mass-energy—a degree of initial fine tuning that really is not adequately reflected by the word “exquisite.” I’m not aware of a word in English that does justice to the kind of …
...more
Nevertheless, these physical factors are themselves independent of each other and probably finely tuned.17 For example, the expansion rate in the earliest stages of the history of the universe would have depended upon the density of mass and energy at those early times. And the density of the universe one nanosecond (a billionth of a second) after the beginning had to have the precise value of 10²⁴ kilograms per cubic meter. If the density were larger or smaller by only 1 kilogram per cubic meter, galaxies would never have developed.18 This corresponds to a fine tuning of 1 part in 10²⁴.
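The quoted tolerance follows from a one-line calculation using the figures just given:

\[
\frac{\Delta\rho}{\rho} = \frac{1\ \mathrm{kg\,m^{-3}}}{10^{24}\ \mathrm{kg\,m^{-3}}} = 10^{-24},
\]

that is, a fine tuning of 1 part in 10²⁴.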
The most conservative estimate for that fine tuning is 1 part in 10⁵³, but the number 1 part in 10¹²⁰ is more frequently cited.19 Physicists now commonly agree that the degree of fine tuning for the cosmological constant is no less than 1 part in 10⁹⁰.20 To get a sense of what this number means, imagine searching the vastness of the visible universe for one specially marked subatomic particle.
Then consider that the visible universe contains about 200 billion galaxies, each with about 100 billion stars,21 along with a panoply of asteroids, planets, moons, comets, and interstellar dust associated with each of those stars. Now assume that you have the special power to move instantaneously anywhere in the universe to select—blindfolded and at random—any subatomic particle you wish. The probability of your finding a specially marked subatomic particle—1 chance in 10⁸⁰—is still 10 billion times better than the probability—1 part in 10⁹⁰—that the universe would have happened upon a
…
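The “10 billion times better” comparison is simply the ratio of the two probabilities just cited:

\[
\frac{1/10^{80}}{1/10^{90}} = 10^{10} = 10\ \text{billion}.
\]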
Equally problematic, increasing the mass of electrons by a factor of 2.5 would result in all the protons in all the atoms capturing all the orbiting electrons and turning them into neutrons. In that case, neither atoms, nor chemistry, nor life could exist.22 What’s more, the mass of the electron has to be less than the difference between the masses of the neutron and the proton, and that difference represents fine tuning of roughly 1 part in a thousand.23 In addition, if the mass of a special particle known as a neutrino were increased by a factor of 10, stars and galaxies would never have formed.
…
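The electron-mass condition mentioned above can be made concrete with standard measured values (figures from ordinary particle-physics references, not from the text): the electron’s rest energy must stay below the neutron-proton mass difference,

\[
m_e c^2 \approx 0.511\ \mathrm{MeV} \;<\; (m_n - m_p)\,c^2 \approx 1.293\ \mathrm{MeV}.
\]

One common way to read the roughly 1-part-in-a-thousand figure is that this mass difference amounts to only about 0.14 percent of the mass of the nucleons themselves (roughly 939 MeV).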
Notice too that the WAP advocates focus on the wrong phenomenon of interest. They think that what needs to be explained (or explained away) is why we observe a universe consistent with our existence. It’s true that such an observation is not surprising. What needs explanation, though, is what caused the fine tuning of the universe in the first place—not our later observation of it. Thus, WAP advocates offer as a cause of the event that does need explanation a statement of a necessary condition of another event that does not need explanation.
The mathematician and philosopher William Dembski (Fig. 8.3) has developed a theory about how we detect the activity of intelligent agents in the effects they leave behind. His theory helps explain why the fine tuning evidence suggests design to so many physicists. It also reinforces the conclusion that the fine tuning of the laws, constants, and initial conditions of the universe does indeed point to a designing mind. In his groundbreaking book The Design Inference, Dembski explicated the criteria by which rational agents recognize the effects of other rational agents and distinguish them
…
According to Dembski, extremely improbable events that also exhibit “an independently recognizable pattern” or set of functional requirements, what he calls a “specification,” invariably result from intelligent causes, not chance or physical-chemical laws.32
Mathematician and philosopher William Dembski. In his groundbreaking book The Design Inference, Dembski established a rigorous method of detecting the activity of intelligent agents and distinguishing such activity from purely natural causes.
I’ve often explained Dembski’s theory by asking students to think about the faces on Mt. Rushmore in South Dakota. If you look at that famous mountain you will quickly recognize the faces of the American presidents inscribed there as the product of intelligent activity. Why? What about those faces indicates that an artisan or sculptor acted to produce them? You might want to say it’s the improbability of the shapes. By contrast, we would not be inclined to infer that an intelligent agent had played a role in forming, for example, the common V-shaped erosional pattern between two mountains
…
Nevertheless, as Dembski points out, the precise arrangement of the rocks at the bottom of the mountain also represents an extremely improbable configuration, especially when one considers all the other possible ways those rocks might have settled. So in addition to the improbability of the shapes, what help...
The answer is the presence of a special kind of pattern. In addition to an improbable structure we see a shape or pattern that matches one we know from independent experience, namely, from seeing the human face and even the specific faces of the presidents on money or in history books. Thus, Dembski suggests that the improbability of the structure by itself does not trigger our awareness of prior intelligent design. Instead, intelligent agents recognize intelligent activity whe...
Consider another example. The cartoon in Figure 8.4 depicts a pattern of flowers on the hillside of the harbor in the city of Victoria on Canada’s Vancouver Island. I’ve occasionally ridden a high-speed ferry from Seattle that docks in the Victoria harbor. Once, while standing on the bow of the ferry boat as it came into the harbor, I noticed this pattern of red and yellow flowers on the hillside. While I was still at some distance, the pattern caught my attention and made me curious, so I put on my glasses. When I did, I immediately made a design inference. Why? I realized the red and yellow
…
Specified complexity or functional information as an indicator or “signature” of intelligence. The inner harbor of Victoria, Canada houses flower beds that spell out the phrase “Welcome to Victoria.” The arrangement of flowers conveys “specified” or functional information, an unmistakable sign of intelligence. No one, for example, would attribute this pattern of flowers to an undirected process such as birds flying over the harbor randomly dropping seeds.
Dembski’s theory and his two criteria explain why I was right to make this inference. Given the many other ways the flowers might have been arranged and given how wind and rain and other natural forces would be expected to scatter the seeds for growing them, the specific arrangement of flowers qualified as an extremely improbable pattern. In addition, however, the arrangement exemplified several patterns that I recognized independently, namely, the shapes of several English letters. The arrangement of flowers also exhibited a functionally significant pattern in the sense that it met a set of
…
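To give a feel for the “extremely improbable” half of Dembski’s criteria, consider a toy calculation (not from the text). Suppose, purely for illustration, that an undirected process fills each of the 17 letter positions of “WELCOME TO VICTORIA” (ignoring spaces) independently and uniformly from a 26-letter alphabet. The chance of landing on that exact string would be

\[
\left(\tfrac{1}{26}\right)^{17} \approx 10^{-24}.
\]

On Dembski’s account, though, the improbability alone does not license a design inference; it is the conjunction of that improbability with an independently recognizable pattern, here English words spelling an intelligible greeting.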
For now, it’s worth noting that scientists have discovered another class of unexpected evidence that suggests, to many leading physicists at least, the need to revive a God or design hypothesis. As John Polkinghorne notes, “We are living in an age where there is a great revival of natural theology taking place. That revival of natural theology is taking place not on the whole among theologians, who have lost their nerve in that area, but among the scientists.”35 Polkinghorne further observes that although this new natural theology generally has more modest ambitions than the natural theology
…
The DNA Enigma
When Watson and Crick discovered the structure of DNA, they made a startling finding: DNA could store information in the form of a four-character digital code. Their structural model of DNA showed that strings of precisely sequenced chemical subunits called “nucleotide bases”—affixed along the interior of the DNA double-helix backbone—could store and transmit information (Fig. 9.1). Crick developed this idea further in 1958 with his now famous “sequence hypothesis,” according to which the chemical subunits of DNA (the nucleotide bases) function just as letters in a written text
…
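The “four-character digital code” description can be quantified in the standard information-theoretic way (a gloss, not Crick’s or the text’s wording): with four possible bases at each position along the molecule, each base can register up to

\[
\log_2 4 = 2\ \text{bits},
\]

so a stretch of n bases can in principle specify one of 4ⁿ possible sequences, which is why precisely sequenced bases can function like letters in a written text.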