It would be reassuring to most people to discover that the universe is constructed to favor life. If the human race isn’t a freakish outcome of chance events, we have every right to see the universe as our home. But this psychological reassurance strikes physicists as wishful thinking – the bulwark of modern science, from the most minuscule events at the quantum scale to the Big Bang itself, is the assumption that creation is random, without guidance, plan, mind, or purpose.
Only very slowly has such a blanket view been challenged, but these new challenges are among the most exciting possibilities in science. We’d like to outline the argument for a “human universe” with an eye to showing why it is important for understanding why the human race exists. This question is too central to be left to a small cadre of professional cosmologists – everyone has a personal stake in it.
The most accepted theory of the large-scale structure of the universe is big bang cosmology, which has achieved impressive results. Yet when you try to model the universe, you can’t escape the problems surrounding what seems like a simple act: observing it. Measuring the cosmos is intricately interwoven with limits imposed by the process of observation itself. As you go back in time or ahead into the future, as you reach so far into space that light takes billions of years to reach Earth, any possible model encounters horizons of knowledge at some ultimate, faint observational limit. Beyond such a horizon, not just observation is blocked, but so is physics, mathematics, and the human mind.
For example, with the big bang theory, light cannot be used to observe far enough back in time, or across immense enough distances, to arrive close to the very beginning. The first instant of the big bang remains forever hidden from the present. Knowledge about the early universe has to be inferred. We can examine the parts that scattered after the big bang, but we cannot grasp the whole. As such, our observational limitations prevent us from verifying cosmological theories to arbitrary accuracy through any observational test. So the Hubble telescope, marvelous as it is for sending back photos of distant galaxies, can’t reveal reality independent of cosmological theory. Theory cannot be verified with complete certainty, which means that important topics like the expansion of the universe and the evolution of galaxies are our own mental constructs – they reflect who we are as observers, not independent reality.
Fine Tuning in Cosmology
What we can see and infer is certainly fascinating. We want to touch upon the inexplicable fact that the cosmos fits together with the smallest and largest aspects fine tuned beyond anything that pure chance can explain. Discussion of this fine tuning is conducted mathematically, in a language beyond the reach of non-scientists. Yet as soon as anyone ventures to suggest a creation that departs from randomness, two bad things happen. Religionists leap into the breach with God, and in reaction scientists become hotly defensive. We aren’t out to add to either of these bad things. But we can’t ignore the human implications of what we’re about to discuss, because God and science will both be forced to take new shapes.
The most basic aspect of fine tuning is the consistency of the cosmos, which is the smoothest of cream soups compared with lumpy oatmeal. The universe we observe is essentially flat, which has given rise to the Flatness Problem. To be nearly flat today, the universe had to be exactly flat close to the time of the big bang itself, to one part in 10^50 (1 followed by 50 zeros, an unimaginably vast number). Why? The usual interpretation, proposed in the 1980s, is that early on the universe was in an inflationary state, washing out any departures from flatness on extremely short time scales of 10^-35 sec. (Imagine one of those whirling paintings sold at carnivals, with the colors swirling outward with incredible force – not a single drop would leap up off the paper.) In more general terms, it would appear that the universe followed the simplest possible theoretical construct (flatness) in its large-scale geometry.
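Why near-flatness today demands extreme flatness early on can be sketched with the standard Friedmann relation – a textbook result, added here for illustration, where Ω is the ratio of the actual density to the critical density, a the scale factor, H the Hubble parameter, and k the curvature constant:

```latex
\Omega(t) - 1 \;=\; \frac{k c^{2}}{a^{2} H^{2}},
\qquad
a \propto t^{1/2} \ \Rightarrow\ aH \propto t^{-1/2}
\ \Rightarrow\ |\Omega - 1| \propto t \quad \text{(radiation era)}
```

Because |Ω − 1| grows with time in a decelerating universe, the near-flatness observed today, run backward, forces the early universe to have been flat to fantastic precision.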
The inflationary model was developed to account for the flatness of the universe and also supposedly solves the horizon problem. That problem arose because, looking in all directions, the universe is remarkably homogeneous, as revealed by the microwave background radiation that fills it — the temperature of this radiation is constant across different parts of the sky, to 1 part in 10^6. Such consistency isn’t easy to explain. Observations indicate that the background radiation was emitted around 100,000 years after the beginning, meaning that opposite sides of the sky at that time were separated by approximately 10,000,000 light years. How could two opposite parts of the sky be so similar to each other if information had no chance to get from one to the other? Imagine a hot pancake fresh off the griddle that you tear into pieces and fling into the air. A hundred thousand years later, all the pieces have the same temperature as one another, even though they never came into contact again – this is like the horizon problem.
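The mismatch can be checked with simple arithmetic, using the round numbers quoted above (a sketch with the post’s own illustrative figures, not precise cosmological values):

```python
# Back-of-envelope check of the horizon problem, using the round
# numbers quoted in the text (illustrative, not precise values).

emission_age_years = 100_000   # time after the big bang when the radiation was emitted
separation_ly = 10_000_000     # quoted separation of opposite sky patches at that time

# Light travels one light year per year, so the farthest any signal
# could have reached by the emission time is:
horizon_ly = emission_age_years * 1   # 100,000 light years

# How many times too far apart were the two patches to have exchanged information?
gap_factor = separation_ly / horizon_ly
print(gap_factor)   # 100.0
```

On these numbers, the two patches were about 100 times farther apart than any signal could have traveled – hence "causally disconnected," yet at the same temperature.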
Yet the biggest fine tuning is the value of the so-called cosmological constant, introduced by Einstein as part of general relativity. The cosmological constant is a value that describes the density of energy everywhere in empty space. Such constants, like the gravitational constant and the speed of light, are necessary for the mathematical computations of physics to work. In this case, we are talking about the dark energy in empty space that cannot be seen. The cosmological constant was originally needed to counter gravity and produce a closed, static universe — it essentially acts as negative gravity.
It was later abandoned when observations by Hubble and others favored an expanding universe, unlike the static one that Einstein’s generation assumed. Now the cosmological constant has been reintroduced because current observations seem to indicate that the universe is not only expanding but accelerating in its expansion. The standard model of particle interactions predicts a value that is 10^122 times larger than the actual observed value. Had the value been what standard particle theory predicts, the universe could not exist in its present form. This is known as the Cosmological Constant Problem.
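Where a number like 10^122 comes from can be sketched by comparing the "natural" quantum (Planck) energy density with the observed dark energy density. The constants below are standard values; the derivation is our back-of-envelope illustration, not part of the original argument:

```python
import math

# Rough estimate of the cosmological constant discrepancy:
# compare the Planck energy density (the natural scale of quantum
# gravity) with the observed dark-energy density. Sketch only.

hbar = 1.055e-34   # reduced Planck constant, J*s
G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
H0   = 2.2e-18     # Hubble constant, 1/s (~68 km/s/Mpc)

# Planck energy density ~ c^7 / (hbar * G^2), about 1e113 J/m^3
rho_planck = c**7 / (hbar * G**2)

# Observed dark energy ~ 70% of the critical energy density
rho_crit_energy = 3 * H0**2 / (8 * math.pi * G) * c**2
rho_dark = 0.7 * rho_crit_energy   # roughly 5e-10 J/m^3

discrepancy = math.log10(rho_planck / rho_dark)
print(round(discrepancy))   # ~123 orders of magnitude
```

The quoted figure varies between about 10^120 and 10^123 in the literature, depending on the cutoff assumed; the point is the staggering size of the mismatch, not the exact exponent.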
This last has shaken our confidence that we can rely upon observation in the normal sense if we want to grasp what the universe is. Regular matter (i.e., atoms and molecules) contributes 4% or less of the total density of the universe right now. As such, if one insists on exact flatness, one needs to introduce unknown forms of “dark matter” (around 25%) and “dark energy” (around 70%) to make a flat universe. Worse still for cosmologists, a non-zero cosmological constant requires unknown physics. The mathematical model for a flat universe is simple in its initial assumptions, but the underlying physics required to maintain it is complex and even unknown.
Let’s translate the dilemma into everyday terms. Only 4% of the universe – meaning all the stars, galaxies, planets, light, heat, and interstellar dust – fits into science. We are perched as if on the cherry that tops an ice cream sundae, trying to make the whole dessert conform to being like a cherry, since that’s the only world we know. But the universe refuses to be a cherry, and what it insists on being may be inconceivable. For unlike a bacterium that may have floated on to the top of an ice cream sundae through the air, we were born on the cherry, are made of its substance, and can think only in terms of our small specific surroundings.
Yet fine tuning has always lurked on the edges of standard physics, being ignored only because for a century, observation was triumphant, carrying theory along with it. You can do wonders with subatomic particles, relativity, and quantum calculations before you have to worry about events that occurred over 13 billion years ago. The universe “as it presents itself” was good enough, as it has been for a long, long time. But the numbers are inescapable, and on every side they point to a universe that is fine tuned at the smallest and largest levels. An ancient Indian proverb can be put in modern terms: “As is the large, so is the small. As is the microscopic, so is the macroscopic.”
This similarity defies randomness. Pure chance is the clumsiest, most inelegant, and least probable way to explain a fine tuned cosmos, which means that it isn’t good science. In an ironic twist, the numbers game of modern physics has revealed that the numbers match too well. It’s like a bingo game where the machine spits out the same ball millions of times in a row. How can that be? More importantly, why does creation fit together seamlessly? We’ll look for plausible answers in the next post.
(To be cont.)
Why the Universe Is Our Home – It’s Not a Coincidence (Part 2)
At the human level, everyone would like to feel that life has meaning, which implies that the setting for life – the universe at large – isn’t a cold void ruled by random chance. There is a huge gap here, and for the past century, science hasn’t budged from its grandest assumption, that creation is ruled by random events. There was good reason for this adamant position. The mathematics of modern physics is a marvel of precision and accuracy. No guiding hand, creator, higher intelligence, or deity was needed as long as the equations worked.
Now there is a crack in the theory, tiny at first but opening into a fissure, that casts doubt on how science observes the universe. The fault isn’t that the mathematics was wobbly and loose. Quite the opposite. The universe is too finely tuned to fit the random model. God isn’t going to leap into the breach, although religion has reason to feel better about not accepting the so-called “accidental universe.” The real fascination lies in how to match reality “out there” with the potentiality of the human mind. Both are up for grabs.
In the modern era, Sir Arthur Eddington and Paul A.M. Dirac first noticed certain “coincidences” among dimensionless ratios. These ratios link microscopic with macroscopic quantities. For example, the ratio of the electric force to the gravitational force (presumably a constant) is a large number (Electric Force/Gravitational Force = E/G ~ 10^40), while the ratio of the observable size of the universe (which is presumably changing) to the size of an elementary particle is also a large number, surprisingly close to the first: Size of Universe/Elementary Particle = U/EP ~ 10^40. It is hard to imagine that two very large and unrelated numbers would turn out to be so close to each other. Why are they? (For earlier examples of fine tuning, please see our first post, which gives some general background as well.)
Dirac argued, as fundamental physics, that they must be related. The essential problem is that the size of the universe is changing as the cosmos expands, while the first ratio is presumably constant, since it involves only two supposed “constants.” Why should two very large numbers, one variable and the other not, be so close to each other? (It’s like watching a person’s vocal cords vibrating in all kinds of ways and yet discovering that each word he speaks comes exactly half a second apart – even this image is a simplification compared to the actual problem, which spans ratios ranging from billions of light years down to trillionths of a second.)
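The two large numbers can be checked with standard physical constants. Taking the Hubble radius for "size of the universe" and the classical electron radius for "elementary particle" is one common convention, assumed here purely for illustration:

```python
import math

# Numerical check of the Eddington-Dirac "large number coincidence."
# Standard SI constants; the choice of length scales is one common
# convention, assumed here for illustration.

e    = 1.602e-19   # elementary charge, C
eps0 = 8.854e-12   # vacuum permittivity, F/m
G    = 6.674e-11   # gravitational constant
m_p  = 1.673e-27   # proton mass, kg
m_e  = 9.109e-31   # electron mass, kg
c    = 2.998e8     # speed of light, m/s
H0   = 2.2e-18     # Hubble constant, 1/s

# Ratio of electric to gravitational attraction between proton and electron
force_ratio = e**2 / (4 * math.pi * eps0 * G * m_p * m_e)

# Ratio of the Hubble radius to the classical electron radius
r_electron = 2.818e-15   # m
size_ratio = (c / H0) / r_electron

print(f"{force_ratio:.1e}")   # ~2e39
print(f"{size_ratio:.1e}")    # ~5e40
```

Two quantities built from completely different physics land within roughly an order of magnitude of each other, at around 10^40 – the coincidence Dirac found so striking.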
Dirac’s Large Number Hypothesis attempts to link these ratios in such a way that they aren’t coincidental. But fine tuning has proved a pervasive finding elsewhere, too, where unexpected ratios match – for example, the number of particles in the universe compared with the entropy of the whole system. Such “coincidences” extend beyond cosmology to all-encompassing relationships at many levels. Let’s look at some cases at our level of reality, where matter is comfortably composed of atoms and molecules. The “fine structure constant” determines the properties of these atoms and molecules. It is a pure number, ~ 1/137. If the fine structure constant were different by as little as approximately 1%, no atoms or molecules would exist as we know them.
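The value ~1/137 is not arbitrary; it follows from a definite combination of fundamental constants, α = e²/(4πε₀ħc). A quick check with standard SI values:

```python
import math

# The fine structure constant alpha = e^2 / (4*pi*eps0*hbar*c)
# is dimensionless: the units cancel, leaving a pure number.

e    = 1.602e-19   # elementary charge, C
eps0 = 8.854e-12   # vacuum permittivity, F/m
hbar = 1.055e-34   # reduced Planck constant, J*s
c    = 2.998e8     # speed of light, m/s

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(round(1 / alpha))   # 137
```

Because it is a pure number, the same value would be measured by any observer with any system of units – which is what makes its apparent fine tuning so hard to dismiss.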
For example, the fine structure constant determines how solar radiation is transmitted and absorbed in the Earth’s atmosphere; it also applies to how photosynthesis works. Now the Sun “happens” to emit the majority of its radiation in a part of the spectrum where the atmosphere of the Earth “happens” to be transparent to it. Meanwhile, the spectrum of radiation from the Sun is determined by the value of the gravitational constant. Why would a “macroscopic” quantity, namely the force of gravity, be such that the spectrum of radiation would just happen to be the right one to be transmitted through the Earth’s atmosphere and absorbed by plants (the atmospheric transmission being determined by the “microscopic” fine structure constant)? If these two effects did not work together exactly right, there would be no life as we know it. The initial question we asked, “How do humans fit into the universe?” isn’t satisfactorily answered when one coincidence must be piled on another.
The coincidences don’t end there. For instance, massive supernova explosions that occurred billions of years ago are responsible for forming the heavy elements, like iron, that are in your body today. The specifics of a supernova explosion are determined by the weak force, which operates at the infinitesimally small scale of the atomic nucleus. If this weak force were different by as little as 1% or so, there would be no supernova explosions, no formation of heavy elements, and therefore no life as we know it.
The problem of fine tuning is one of the biggest embarrassments facing modern physical and biological science. These “coincidences” may be indicating the existence of some deep, underlying unity involving the fundamental constants, linking the microcosm to the macrocosm just as the ancients saw without mathematics. The so-called Anthropic Principle has been proposed to account for fine tuning, as follows: if the constants were not exactly right, there would be no life on Earth, no humans, and so on. Being here, we look around and find that the cosmos led to our existence. This is an attempt to preserve reality “out there” by limiting it to the aspects a human mind and the five senses can understand. However, if you think about it, the Anthropic Principle just states the obvious – “we are here because we are here” – and has little explanatory power.
As far as we are concerned, the dilemma boils down to two clear choices. On the one hand, fine tuning is indeed just coincidence piled higher and deeper, and humans “happen to be in the right universe.” This is the point of view favored by M-theory proponents, including Stephen Hawking, using superstring theory. M-theory posits trillions of possible universes (the multiverse) that bubble away, churning out every possible combination of constants, zillions of which do not match to form life. But one did, and we live in it. This is the cosmic equivalent of putting a hundred monkeys to work tapping randomly on typewriters, eventually producing the complete works of Shakespeare after also producing an almost infinite mountain of gibberish. (At the mathematical level, superstring theory fits any number of models, beyond counting. Unfortunately, no observations support which model is correct, and worse still, it may be that no possible observation can, since superstrings exist – if they exist at all – beyond time and space.)
Although superstring theory may one day be proven, there is no reason to try to invoke it for the extreme fine tuning we observe, simply as a means to avoid a huge embarrassment for today’s science. M-theory waves its hands around, invoking a randomly picked universe that “happens” to be right. As such, nothing in the end needs to be explained: Pure randomness rules if we live in nothing more than an incredibly unlikely universe of our own – lucky us.
How unlikely is it? Estimates from superstring theory yield odds of one out of 10^500 (1 followed by 500 zeros), a number far greater than the number of particles in the universe. But it gets even more cumbersome: from chaotic inflation theory, the chances of being in the right universe are much smaller still, on the order of 1 in 10^(10^(10^7))! The proponents insist that we just live in one of the many, many universes and there is nothing to explain. It is hard to swallow this view and see how scientists can be proposing anything so empty. It’s one thing to claim that a hundred monkeys can write Shakespeare, but it’s quite another to declare that there is no other way. The fact is that all these fine-tuned constants fit together far more precisely than any work of literature, even at Shakespeare’s level of genius.
We said that two clear choices exist. The other, which we favor, is that the universe is self-driven. The fine tuning is self-designed (not the red herring of “intelligent design” by a supernatural God in the sky). The self-design of the universe is driven by quantum processes, rapidly picking choices that lead to optimal final results, as required by acts of observation. Certain assumptions are needed for this alternative, as they are for any theory:
• We are able to observe the universe because we are woven into its unfolding existence.
• The universe reinforces the patterns and forms that are successful in evolving from random ingredients.
• The same organizing principles that exist in us were inherited from the universe. These include creativity, intelligence, and evolution.
• In some way the universe monitors and governs itself. Call it a self-aware or conscious universe; the terminology is secondary to a primary fact: Something knows how to self-design on the grandest scale as well as the most minute.
• This something permeates creation, including ourselves. The linkage between self-aware humans and a self-aware cosmos is necessary – it must exist or we wouldn’t be able to observe the world that surrounds us.
• Randomness isn’t the opposite of meaning and purpose but serves them, just as the randomly smeared colors on a painter’s palette serve the highly organized picture that is being made.
The two choices are infused into everyone’s life: either incredibly small odds of finding the right universe among a random set of universes “out there,” or a universe imbued with life and consciousness that drives itself. We hold the latter to be not only simpler and more logical but also more scientific. It fits the facts without abolishing any cherished, precise observations. Indeed, we are going to all this trouble in order to preserve observation. The precise matchup of so many constants, ruling the biggest and smallest domains in creation, can be taken as natural rather than accidental. Einstein hit upon a deep truth when he said, “I want to know the mind of God; everything else is just details.” Substitute “the mind of the universe,” and you have a goal worth pursuing for the coming century.