Reason and Belief
Human beings have all sorts of beliefs. The way in which they arrive at them varies from reasoned argument to blind faith. Some beliefs are based on personal experience, others on education, and others on indoctrination. Many beliefs are no doubt innate: we are born with them as a result of evolutionary factors. Some beliefs we feel we can justify, others we hold because of "gut feelings."
Obviously many of our beliefs are wrong, either because they are incoherent, or because they conflict with other beliefs, or with the facts. Two and a half thousand years ago, in ancient Greece, the first systematic attempt was made to establish some sort of common grounds for belief. The Greek philosophers sought a means to formalize human reasoning by providing unassailable rules of logical deduction. By adhering to agreed procedures of rational argument, these philosophers hoped to remove the muddle, misunderstanding, and dispute that so characterize human affairs. The ultimate goal of this scheme was to arrive at a set of assumptions, or axioms, which all reasonable men and women would accept, and from which the resolution of all conflicts would flow.
It has to be said that this goal has never been attained, and it may not even be attainable. The modern world is plagued by a greater diversity of beliefs than ever, many of them eccentric or even dangerous, and rational argument is regarded by a lot of ordinary people as pointless sophistry. Only in science, and especially mathematics, have the ideals of the Greek philosophers been upheld (and in philosophy itself, of course). When it comes to addressing the really deep issues of existence, such as the origin and meaning of the universe, the place of human beings in the world, and the structure and organization of nature, there is a strong temptation to retreat into unreasoned belief. Even scientists are not immune from this. Yet there is a long and respectable history of attempts to confront such issues by rational and dispassionate analysis. Just how far can reasoned argument take us? Can we really hope to answer the ultimate questions of existence through science and rational inquiry, or will we always encounter impenetrable mystery at some stage? And just what is human rationality anyway?
The Scientific Miracle
Throughout the ages all cultures have extolled the beauty, majesty, and ingenuity of the physical universe. It is only the modern scientific culture, however, that has made any systematic attempt to study the nature of the universe and our place within it. The success of the scientific method at unlocking the secrets of nature is so dazzling it can blind us to the greatest scientific miracle of all: science works. Scientists themselves normally take it for granted that we live in a rational, ordered cosmos subject to precise laws that can be uncovered by human reasoning. Yet why this should be so remains a tantalizing mystery. Why should human beings have the ability to discover and understand the principles on which the universe runs?
In recent years more and more scientists and philosophers have begun to study this puzzle. Is our success in explaining the world using science and mathematics just a lucky fluke, or is it inevitable that biological organisms that have emerged from the cosmic order should reflect that order in their cognitive capabilities? Is the spectacular progress of our science just an incidental quirk of history, or does it point to a deep and meaningful resonance between the human mind and the underlying organization of the natural world?
Four hundred years ago science came into conflict with religion because it seemed to threaten Mankind's cozy place within a purpose-built cosmos designed by God. The revolution begun by Copernicus and finished by Darwin had the effect of marginalizing, even trivializing, human beings. People were no longer cast at the center of the great scheme, but were relegated to an incidental and seemingly pointless role in an indifferent cosmic drama, like unscripted extras that have accidentally stumbled onto a vast movie set. This existentialist ethos -- that there is no significance in human life beyond what humans themselves invest in it -- has become the leitmotif of science. It is for this reason that ordinary people see science as threatening and debasing: it has alienated them from the universe in which they live.
In the chapters that follow I shall present a completely different view of science. Far from exposing human beings as incidental products of blind physical forces, science suggests that the existence of conscious organisms is a fundamental feature of the universe. We have been written into the laws of nature in a deep and, I believe, meaningful way. Nor do I regard science as an alienating activity. Far from it. Science is a noble and enriching quest that helps us to make sense of the world in an objective and methodical manner. It does not deny a meaning behind existence. On the contrary. As I have stressed, the fact that science works, and works so well, points to something profoundly significant about the organization of the cosmos. Any attempt to understand the nature of reality and the place of human beings in the universe must proceed from a sound scientific base. Science is not, of course, the only scheme of thought to command our attention. Religion flourishes even in our so-called scientific age. But as Einstein once remarked, religion without science is lame.
The scientific quest is a journey into the unknown. Each advance brings new and unexpected discoveries, and challenges our minds with unusual and sometimes difficult concepts. But through it all runs the familiar thread of rationality and order. We shall see that this cosmic order is underpinned by definite mathematical laws that interweave with one another to form a subtle and harmonious unity. The laws are possessed of an elegant simplicity, and have often commended themselves to scientists on grounds of beauty alone. Yet these same simple laws permit matter and energy to self-organize into an enormous variety of complex states, including those that have the quality of consciousness, and can in turn reflect upon the very cosmic order that has produced them.
Among the more ambitious goals of such reflection is the possibility that we might be able to formulate a "Theory of Everything" -- a complete description of the world in terms of a closed system of logical truths. The search for such a TOE has become something of a holy grail for physicists. And the idea is undoubtedly beguiling. After all, if the universe is a manifestation of rational order, then we might be able to deduce the nature of the world from "pure thought" alone, without the need for observation or experiment. Most scientists reject this philosophy utterly, of course, hailing the empirical route to knowledge as the only dependable path. But as we shall see, the demands of rationality and logic certainly do impose at least some restrictions on the sort of world that we can know. On the other hand, that same logical structure contains within itself its own paradoxical limitations that ensure we can never grasp the totality of existence from deduction alone.
History has thrown up many physical images for the underlying rational order of the world: the universe as a manifestation of perfect geometrical forms, as a living organism, as a vast clockwork mechanism, and, most recently, as a gigantic computer. All of these images capture some key aspect of reality, though each is incomplete on its own. We shall examine some of the latest thinking about these metaphors, and the nature of the mathematics that describes them. This will lead us to confront the questions: What is mathematics? And why does it work so well in describing the laws of nature? And where do these laws come from anyway? In many cases the ideas are easy to describe; sometimes they are rather technical and abstract. The reader is invited to share this scientific excursion into the unknown, in search of the ultimate basis of reality. Though the going gets rough here and there, and the destination remains shrouded in mystery, I hope that the journey itself will prove exhilarating.
Human Reason and Common Sense
It is often said that the factor which most distinguishes human beings from other animals is our power to reason. Many other animals seem to be aware of the physical world to a greater or lesser extent, and to respond to it, but humans claim more than mere awareness. We also possess some sort of understanding of the world, and of our place within it. We are capable of predicting events and of manipulating natural processes to our own ends, and although we are part of the natural world, we somehow distinguish between ourselves and the rest of the physical universe.
In primitive cultures, understanding of the world was limited to everyday affairs, such as the passage of the seasons, or the motion of a slingshot or an arrow. It was entirely pragmatic, and had no theoretical basis, except in magical terms. Today, in the age of science, our understanding has vastly expanded, so that we need to divide knowledge up into distinct subjects -- astronomy, physics, chemistry, geology, psychology, and so on. This dramatic progress has come about almost entirely as a result of "the scientific method": experiment, observation, deduction, hypothesis, falsification. The details need not concern us here. What is important is that science demands rigorous standards of procedure and discussion that set reason over irrational belief.
The concept of human reasoning is itself a curious one. We are persuaded by "reasonable" arguments, and feel happiest with those that appeal to "common sense." Yet the processes of human thought are not God-given. They have their origin in the structure of the human brain, and the tasks it has evolved to perform. The operation of the brain, in turn, depends on the laws of physics and the nature of the physical world we inhabit. What we call common sense is the product of thought patterns deeply embedded in the human psyche, presumably because they confer certain advantages in dealing with everyday situations, like avoiding falling objects and hiding from predators. Some aspects of human thought will be fixed by the wiring of our brains, others inherited as "genetic software" from our ancestors of long ago.
The philosopher Immanuel Kant argued that not all our categories of thought derive from sensory experience of the world. He believed that some concepts are a priori, by which he meant that, although these concepts are not necessary truths in the strictly logical sense, nevertheless all thought would be impossible without them: they would be "necessary for thought." One example Kant gave was our intuitive understanding of three-dimensional space through the rules of Euclidean geometry. He supposed that we are born with this knowledge. Unfortunately, scientists have now discovered that Euclidean geometry is actually wrong! Today, scientists and philosophers generally suppose that even the most basic aspects of human thought must ultimately refer back to observations of the physical world. Probably the concepts that are most deeply etched in our psyche, the things that we find it hard to imagine could be otherwise -- such as "common sense" and human rationality -- are those that are genetically programmed at a very deep level in our brains.
It is interesting to speculate whether alien beings who evolved under very different circumstances would share our concept of common sense, or indeed any of our thought patterns. If, as some science-fiction writers have mused, life existed on the surface of a neutron star, one could not begin to guess how such beings would perceive and think about the world. It is possible that an alien's concept of rationality would differ from ours so greatly that this being would not be at all persuaded by what we would regard as a rational argument.
Does this mean that human reasoning is suspect? Are we being excessively chauvinistic or parochial in supposing that we can successfully apply the thought patterns of Homo sapiens to the great issues of existence? Not necessarily. Our mental processes have evolved as they have precisely because they reflect something of the nature of the physical world we inhabit. What is a surprise is that human reasoning is so successful in framing an understanding of those parts of the world our perceptions can't directly reach. It may be no surprise that human minds can deduce the laws of falling objects, because the brain has evolved to devise strategies for dodging them. But do we have any right to expect extensions of such reasoning to work when it comes to nuclear physics, or astrophysics, for example? The fact that it does work, and works "unreasonably" well, is one of the great mysteries of the universe that I shall be investigating in this book.
But now another issue presents itself. If human reasoning reflects something of the structure of the physical world, would it be true to say that the world is a manifestation of reason? We use the word "rational" to mean "in conformity with reason," so my question is whether, or to what extent, the world is rational. Science is founded on the hope that the world is rational in all its observable aspects. It is possible that there may be some facets of reality which lie beyond the power of human reasoning. This doesn't mean that these facets are necessarily irrational in the absolute sense. Denizens of neutron stars (or supercomputers) might understand things that we, by the very nature of our brains, cannot. So we have to be aware of the possibility that there may be some things with explanations that we could never grasp, and maybe others with no explanation at all.
In this book I shall take the optimistic view that human reasoning is generally reliable. It is a fact of life that people hold beliefs, especially in the field of religion, which might be regarded as irrational. That they are held irrationally doesn't mean they are wrong. Perhaps there is a route to knowledge (such as through mysticism or revelation) that bypasses or transcends human reason? As a scientist I would rather try to take human reasoning as far as it will go. In exploring the frontiers of reason and rationality we will certainly encounter mystery and uncertainty, and in all probability at some stage reasoning will fail us and have to be replaced either by irrational belief or frank agnosticism.
If the world is rational, at least in large measure, what is the origin of that rationality? It cannot arise solely in our own minds, because our minds merely reflect what is already there. Should we seek explanation in a rational Designer? Or can rationality "create itself" by the sheer force of its own "reasonableness"? Alternatively, could it be that on some "larger scale" the world is irrational, but that we find ourselves inhabiting an oasis of apparent rationality, because that is the only "place" where conscious, reasoning beings could find themselves? To explore these sorts of questions further, let us take a more careful look at the different types of reasoning.
Thoughts About Thought
Two sorts of reasoning serve us well, and it is important to keep a clear distinction between them. The first is called "deduction." This is based on the strict rules of logic. According to standard logic, certain statements, such as "A dog is a dog" and "Everything either is, or is not, a dog," are accepted as true, while others, like "A dog is not a dog," are deemed false. A deductive argument starts out with a set of assumptions called "premises." These are statements or conditions which are held to be the case without further questioning, for the purposes of the argument. Obviously the premises should be mutually consistent.
It is widely believed that the conclusion of a logico-deductive argument contains no more than was present in the original premises, so that such an argument can never be used to prove anything genuinely new. Consider, for example, the deductive sequence (known as a "syllogism"):
1. All bachelors are men.
2. Alex is a bachelor.
3. Therefore, Alex is a man.
Statement 3 tells us no more than was present in statements 1 and 2 combined. So, according to this view, deductive reasoning is really only a way of processing facts or concepts so as to present them in a more interesting or useful form.
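The mechanical character of deduction can be made vivid in a short program. The sketch below (the rule names and data layout are my own invention, purely for illustration) applies the one inference rule the syllogism needs, over and over, until nothing new follows; the "conclusion" is simply what falls out of the premises.

```python
# Deduction as mechanical rule-following: the conclusion contains
# nothing beyond what the premises already hold.
premises = {
    "all": [("bachelor", "man")],    # 1. All bachelors are men.
    "is_a": [("Alex", "bachelor")],  # 2. Alex is a bachelor.
}

def deduce(premises):
    """Apply one rule -- if X is a P, and all P are Q, then X is a Q --
    repeatedly, until no new facts emerge."""
    facts = set(premises["is_a"])
    changed = True
    while changed:
        changed = False
        for (x, p) in list(facts):
            for (p2, q) in premises["all"]:
                if p == p2 and (x, q) not in facts:
                    facts.add((x, q))  # 3. Therefore, Alex is a man.
                    changed = True
    return facts

conclusions = deduce(premises)
print(("Alex", "man") in conclusions)  # True
```

Running the rule a second time adds nothing: the premises have been exhausted, which is exactly the sense in which deduction "proves nothing genuinely new."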
When deductive logic is applied to a complex set of concepts, the conclusions can often be surprising or unexpected, even if they are merely the outworking of the original premises. A good example is provided by the subject of geometry, which is founded on a collection of assumptions, known as "axioms," on which the elaborate edifice of geometrical theory is erected. In the third century B.C. the Greek geometer Euclid enumerated five axioms on which conventional school geometry is founded, including such things as "Through every two points there is a unique straight line." From these axioms, deductive logic can be used to derive all the theorems of geometry that we learn at school. One of these is Pythagoras' theorem, which, although it has no greater information content than Euclid's axioms, from which it is derived, is certainly not intuitively obvious.
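The theorem in question can be stated in a single line. For a right triangle with legs of lengths a and b and hypotenuse c:

```latex
c^2 = a^2 + b^2
```

A triangle with sides 3, 4, and 5 satisfies it, since 9 + 16 = 25. Nothing in this relation goes beyond Euclid's axioms, yet it is hardly obvious at a glance.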
Clearly a deductive argument is only as good as the premises on which it is founded. For example, in the nineteenth century some mathematicians decided to follow up the consequences of dropping Euclid's fifth axiom, which states that through every point it is possible to draw a line parallel to another given line. The resulting "non-Euclidean geometry" turned out to be of great use in science. In fact, Einstein employed it in his general theory of relativity (a theory of gravitation), and, as mentioned, we now know that Euclid's geometry is actually wrong in the real world: crudely speaking, space is curved by gravity. Euclidean geometry is still taught in schools because it remains a very good approximation under most circumstances. The lesson of this story, however, is that it is unwise to consider any axioms as so self-evidently correct that they could not possibly be otherwise.
It is generally agreed that logico-deductive arguments constitute the most secure form of reasoning, though I should mention that even the use of standard logic has been questioned by some. In so-called quantum logic, the rule that something cannot both be and not be such-and-such is dropped. The motivation for this is that in quantum physics the notion of "to be" is more subtle than in everyday experience: physical systems can exist in superpositions of alternative states.
Another form of reasoning that we all employ is called "inductive." Like deduction, induction starts out from a set of given facts or assumptions, and arrives at conclusions from them, but it does so by a process of generalization rather than sequential argument. The prediction that the sun will rise tomorrow is an example of inductive reasoning based on the fact that the sun has faithfully risen every day so far in our experience. And when I let go of a heavy object, I expect it to fall, on the basis of my previous experiences with the pull of gravity. Scientists employ inductive reasoning when they frame hypotheses based on a limited number of observations or experiments. The laws of physics, for instance, are of this sort. The inverse-square law of electric force has been tested in a number of ways, and always confirmed. We call it a law because, on the basis of induction, we reason that the inverse-square property will always hold. However, the fact that nobody has observed a violation of the inverse-square law does not prove it must be true, in the way that, given the axioms of Euclidean geometry, Pythagoras' theorem must be true. No matter on how many individual occasions the law is confirmed, we can never be absolutely certain that it applies unfailingly. On the basis of induction, we may conclude only that it is very probable that the law will hold the next time it is tested.
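For concreteness, the inverse-square law of electric force is Coulomb's law, giving the force between two charges q1 and q2 separated by a distance r:

```latex
F = \frac{1}{4\pi\varepsilon_0}\,\frac{q_1 q_2}{r^2}
```

It is the exponent 2 in the denominator that induction underwrites: every test so far has confirmed it, but no number of confirmations can make it a logical necessity.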
The philosopher David Hume cautioned against inductive reasoning. That the sun has always been observed to rise on schedule, or that the inverse-square law has always been confirmed, is no guarantee that these things will continue in the future. The belief that they will is based on the assumption that "the course of nature continues always uniformly the same." But what is the justification for this assumption? True, it may be the case that a state of affairs B (e.g., dawn) has invariably been observed to follow A (e.g., dusk), but one should not construe this to imply that B is a necessary consequence of A. For in what sense might B have to follow A? We can certainly conceive of a world where A occurs but B doesn't: there is no logically necessary connection between A and B. Might there be some other sense of necessity, a sort of natural necessity? Hume and his followers deny that there is any such thing.
It seems we are forced to concede that conclusions arrived at inductively are never absolutely secure in the logical manner of deductive conclusions, even though "common sense" is based on induction. That inductive reasoning is so often successful is a (remarkable) property of the world that one might characterize as the "dependability of nature." We all go through life holding beliefs about the world (such as the inevitability of sunrise) which are inductively derived, and considered to be wholly reasonable, and yet rest not on deductive logic, but on the way the world happens to be. As we shall see, there is no logical reason why the world may not have been otherwise. It could have been chaotic in a way that made inductive generalization impossible.
Modern philosophy has been strongly influenced by the work of Karl Popper, who argues that in practice scientists rarely use inductive reasoning in the way described. When a new discovery is made, scientists tend to work backward to construct hypotheses consistent with that discovery, and then go on to deduce other consequences of those hypotheses that can in turn be experimentally tested. If any one of these predictions turns out to be false, the theory has to be modified or abandoned. The emphasis is thus on falsification, not verification. A powerful theory is one that is highly vulnerable to falsification, and so can be tested in many detailed and specific ways. If the theory passes those tests, our confidence in the theory is reinforced. A theory that is too vague or general, or makes predictions concerning only circumstances beyond our ability to test, is of little value.
In practice, then, human intellectual endeavor does not always proceed through deductive and inductive reasoning. The key to major scientific advances often rests with free-ranging imaginative leaps or inspiration. In such cases an important fact or conjecture springs ready-made into the mind of the inquirer, and only subsequently is justification found in reasoned argument. How inspiration comes about is a mystery that raises many questions. Do ideas have a type of independent existence, so that they are "discovered" from time to time by a receptive mind? Or is inspiration a consequence of normal reasoning which takes place hidden in the subconscious, with the result being delivered to the conscious only when complete? If so, how did such an ability evolve? What biological advantages can such things as mathematical and artistic inspiration confer on humans?
A Rational World
The claim that the world is rational is connected with the fact that it is ordered. Events generally do not happen willy-nilly: they are related in some way. The sun rises on cue because the Earth spins in a regular manner. The fall of a heavy object is connected with its earlier release from a height. And so on. It is this interrelatedness of events that gives us our notion of causation. The window breaks because it is struck by a stone. The oak tree grows because the acorn is planted. The invariable conjunction of causally related events becomes so familiar that we are tempted to ascribe causative potency to material objects themselves: the stone actually brings about the breakage of the window. But this is to attribute to material objects active powers that they do not deserve. All one can really say is that there is a correlation between, say, stones rushing at windows and broken glass. Events that form such sequences are therefore not independent. If we could make a record of all events in some region of space over a period of time, we would notice that the record would be crisscrossed by patterns, these being the "causal linkages." It is the existence of these patterns that is the manifestation of the world's rational order. Without them there would be only chaos.
Closely related to causality is the notion of determinism. In its modern form this is the assumption that events are entirely determined by other, earlier events. Determinism carries the implication that the state of the world at one moment suffices to fix its state at a later moment. And because that later state fixes subsequent states, and so on, the conclusion is drawn that everything which ever happens in the future of the universe is completely determined by its present state. When Isaac Newton proposed his laws of mechanics in the seventeenth century, determinism was automatically built into them. For example, treating the solar system as an isolated system, the positions and velocities of the planets at one moment suffice to determine uniquely (through Newton's laws) their positions and velocities at all subsequent moments. Moreover, Newton's laws contain no directionality in time, so the trick works in reverse: the present state suffices to fix uniquely all past states. In this way we can, for example, predict eclipses in the future, and also retrodict their occurrences in the past.
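A toy computation makes this two-way determinism concrete. The sketch below takes a far simpler system than the solar system -- a single mass falling under constant gravity -- but the logic is the same: the present state (position, velocity) fixes the future, and because the law has no arrow of time, running it with the velocity reversed reconstructs the past.

```python
# Newtonian determinism in miniature: a mass falling under constant
# gravity. The integrator (velocity Verlet) shares Newton's time
# symmetry: reversing the velocity and applying the same rule
# "retrodicts" the initial state.

G = -9.81  # gravitational acceleration in m/s^2

def step(x, v, dt):
    """One velocity-Verlet step: advance the state (x, v) by dt."""
    x = x + v * dt + 0.5 * G * dt * dt
    v = v + G * dt
    return x, v

x, v = 100.0, 0.0            # present state: at rest, 100 m up
dt, n = 0.01, 300
for _ in range(n):           # run the clockwork 3 seconds forward
    x, v = step(x, v, dt)

v = -v                       # reverse the velocity...
for _ in range(n):           # ...and the same law runs the film backward
    x, v = step(x, v, dt)

print(x, v)  # back, up to rounding error, to x = 100.0, v = 0.0
```

Nothing in the rule distinguishes prediction from retrodiction, which is precisely how eclipses can be computed for the past as readily as for the future.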
If the world is strictly deterministic, then all events are locked in a matrix of cause and effect. The past and future are contained in the present, in the sense that the information needed to construct the past and future states of the world are folded into its present state just as rigidly as the information about Pythagoras' theorem is folded into the axioms of Euclidean geometry. The entire cosmos becomes a gigantic machine or clockwork, slavishly following a pathway of change already laid down from the beginning of time. Ilya Prigogine has expressed it more poetically: God is reduced to a mere archivist turning the pages of a cosmic history book already written.
Standing in opposition to determinism is indeterminism, or chance. We might say that an event happened by "pure chance" or "by accident" if it was not obviously determined by anything else. Throwing a die and flipping a coin are familiar examples. But are these cases of genuine indeterminism, or is it merely that the factors and forces that determine their outcome are hidden from us, so that their behavior simply appears random to us?
Before this century most scientists would have answered yes to the latter question. They supposed that, at rock bottom, the world was strictly deterministic, and that the appearance of random or chance events was entirely the result of ignorance about the details of the system concerned. If the motion of every atom could be known, they reasoned, then even coin tossing would become predictable. The fact that it is unpredictable in practice is because of our limited information about the world. Random behavior is traced to systems that are highly unstable, and therefore at the mercy of minute fluctuations in the forces that assail them from their environment.
This point of view was largely abandoned in the late 1920s with the discovery of quantum mechanics, which deals with atomic-scale phenomena and has indeterminism built into it at a fundamental level. One expression of this indeterminism is known as Heisenberg's uncertainty principle, after the German quantum physicist Werner Heisenberg. Roughly speaking, this states that all measurable quantities are subject to unpredictable fluctuations, and hence to uncertainty in their values. To quantify this uncertainty, observables are grouped into pairs: position and momentum form a pair, as do energy and time. The principle requires that attempts to reduce the level of uncertainty of one member of the pair serves to increase the uncertainty of the other. Thus an accurate measurement of the position of a particle such as an electron, say, has the effect of making its momentum highly uncertain, and vice versa. Because you need to know both the positions and the momenta of the particles in a system precisely if you want to predict its future states, Heisenberg's uncertainty principle puts paid to the notion that the present determines the future exactly. Of course, this supposes that quantum uncertainty is genuinely intrinsic to nature, and not merely the result of some hidden level of deterministic activity. In recent years a number of key experiments have been performed to test this point, and they have confirmed that uncertainty is indeed inherent in quantum systems. The universe really is indeterministic at its most basic level.
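In modern notation the position-momentum form of the uncertainty principle reads

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

where ħ is Planck's constant divided by 2π. Squeezing the uncertainty Δx toward zero forces Δp to grow without bound, and vice versa. (The energy-time relation has the same form, though its interpretation is subtler.)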
So does this mean that the universe is irrational after all? No, it doesn't. There is a difference between the role of chance in quantum mechanics and the unrestricted chaos of a lawless universe. Although there is generally no certainty about the future states of a quantum system, the relative probabilities of the different possible states are still determined. Thus the betting odds can be given that, say, an atom will be in an excited or a nonexcited state, even if the outcome in a particular instance is unknown. This statistical lawfulness implies that, on a macroscopic scale where quantum effects are usually not noticeable, nature seems to conform to deterministic laws.
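This statistical lawfulness is easy to sketch numerically. Assume, purely for illustration, an atom that a measurement finds in its excited state with probability 0.3 (the number is an arbitrary choice, not taken from any real atom):

```python
# Quantum-style chance: each individual outcome is indeterminate,
# yet the relative frequencies are sharply determined.
import random

random.seed(1)       # fixed seed so the run is repeatable
p_excited = 0.3      # assumed probability of finding the atom excited

def measure():
    """One measurement: excited (True) or not, decided by chance."""
    return random.random() < p_excited

# A single outcome cannot be predicted...
print(measure())

# ...but over many trials the betting odds are fixed by law.
trials = 100_000
freq = sum(measure() for _ in range(trials)) / trials
print(freq)  # close to 0.3
```

The individual `measure()` calls are as lawless as coin flips, yet the long-run frequency is pinned down, which is why deterministic regularities emerge at the macroscopic scale.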
The job of the physicist is to uncover the patterns in nature and try to fit them to simple mathematical schemes. The question of why there are patterns, and why such mathematical schemes are possible, lies outside the scope of physics, belonging to a subject known as metaphysics.
Metaphysics: Who Needs It?
In Greek philosophy, the term "metaphysics" originally meant "that which comes after physics": Aristotle's treatise on the subject was found, untitled, placed after his treatise on physics. But metaphysics soon came to mean those topics that lie beyond physics (we would today say beyond science) and yet may have a bearing on the nature of scientific inquiry. So metaphysics means the study of topics about physics (or science generally), as opposed to the scientific subject itself. Traditional metaphysical problems have included the origin, nature, and purpose of the universe, how the world of appearances presented to our senses relates to its underlying "reality" and order, the relationship between mind and matter, and the existence of free will. Clearly science is deeply involved in such issues, but empirical science alone may not be able to answer them, or any "meaning-of-life" questions.
In the nineteenth century the entire metaphysical enterprise began to falter after being critically called into question by David Hume and Immanuel Kant. These philosophers cast doubt not on any particular metaphysical system as such, but on the very meaningfulness of metaphysics. Hume argued that meaning can be attached only to those ideas that stem directly from our observations of the world, or from deductive schemes such as mathematics. Concepts like "reality," "mind," and "substance," which are purported to lie somehow beyond the entities presented to our senses, Hume dismissed on the grounds that they are unobservable. He also rejected questions concerning the purpose or meaning of the universe, or Mankind's place within it, because he believed that none of these concepts can be intelligibly related to things we can actually observe. This philosophical position is known as "empiricism," because it treats the facts of experience as the foundation for all we can know.
Kant accepted the empiricist's premise that all knowledge begins with our experiences of the world, but he also believed, as I have mentioned, that human beings possess certain innate knowledge that is necessary for any thought to take place at all. There are thus two components that come together in the process of thinking: sense data and a priori knowledge. Kant used his theory to explore the limits of what human beings, by the very nature of their powers of observation and reasoning, could ever hope to know. His criticism of metaphysics was that our reasoning can apply only to the realm of experience, to the phenomenal world we actually observe. We have no reason to suppose it can be applied to any hypothetical realm that might lie beyond the world of actual phenomena. In other words, we can apply our reasoning to things-as-we-see-them, but this can tell us nothing about the things-in-themselves. Any attempt to theorize about a "reality" that lies behind the objects of experience is doomed to failure.
Although metaphysical theorizing went out of fashion after this onslaught, a few philosophers and scientists refused to give up speculating about what really lies behind the surface appearances of the phenomenal world. Then, in more recent years, a number of advances in fundamental physics, cosmology, and computing theory began to rekindle a more widespread interest in some of the traditional metaphysical topics. The study of "artificial intelligence" reopened debate about free will and the mind-body problem. The discovery of the big bang triggered speculation about the need for a mechanism to bring the physical universe into being in the first place. Quantum mechanics exposed the subtle way in which observer and observed are interwoven. Chaos theory revealed that the relationship between permanence and change was far from simple.
In addition to these developments, physicists began talking about Theories of Everything -- the idea that all physical laws could be unified into a single mathematical scheme. Attention began to focus on the nature of physical law. Why had nature opted for one particular scheme rather than another? Why a mathematical scheme at all? Was there anything special about the scheme we actually observe? Would intelligent observers be able to exist in a universe that was characterized by some other scheme?
The term "metaphysics" came to mean "theories about theories" of physics. Suddenly it was respectable to discuss "classes of laws" instead of the actual laws of our universe. Attention was given to hypothetical universes with properties quite different from our own, in an effort to understand whether there is anything peculiar about our universe. Some theorists contemplated the existence of "laws about laws," which act to "select" the laws of our universe from some wider set. A few were prepared to consider the real existence of other universes with other laws.
In fact, in this sense physicists have long been practicing metaphysics anyway. Part of the job of the mathematical physicist is to examine certain idealized mathematical models that are intended to capture only various narrow aspects of reality, and then often only symbolically. These models play the role of "toy universes" that can be explored in their own right, sometimes for recreation, usually to cast light on the real world by establishing certain common themes among different models. These toy universes often bear the name of their originators. Thus there is the Thirring model, the Sugawara model, the Taub-NUT universe, the maximally extended Kruskal universe, and so on. They commend themselves to theorists because they will normally permit exact mathematical treatment, whereas a more realistic model may be intractable. My own work about ten years ago was largely devoted to exploring quantum effects in model universes with only one space dimension instead of three. This was done to make the problems easier to study. The idea was that some of the essential features of the one-dimensional model would survive in a more realistic three-dimensional treatment. Nobody suggested that the universe really is one-dimensional. What my colleagues and I were doing was exploring hypothetical universes to uncover information about the properties of certain types of physical laws, properties that might pertain to the actual laws of our universe.
Time and Eternity: The Fundamental Paradox of Existence
"Eternity is time
Time, eternity
To see the two as opposites
Is Man's perversity"
The Book of Angelus Silesius
"I think, therefore I am." With these famous words the seventeenth-century philosopher René Descartes expressed what he took to be the most primitive statement concerning reality about which any thinking person could agree. Our own existence is our primary experience. Yet even this unexceptionable claim contains within it the essence of a paradox that obstinately runs through the history of human thought. Thinking is a process. Being is a state. When I think, my mental state changes with time. But the "me" to which the mental state refers remains the same. This is probably the oldest metaphysical problem in the book, and it is one which has resurfaced with a vengeance in modern scientific theory.
Though our own selves constitute our primary experience, we also perceive an external world, and we project onto that world the same paradoxical conjunction of process and being, of the temporal and the atemporal. On the one hand, the world continues to exist; on the other hand, it changes. We recognize constancy not just in our personal identities, but in the persistence of objects and qualities in our environment. We frame notions like "person," "tree," "mountain," "sun." These things need not endure forever, but they have a quasi-permanence that enables us to bestow upon them a distinct identity. However, superimposed on this constant backdrop of being is continual change. Things happen. The present fades into the past, and the future "comes into being": the phenomenon of becoming. What we call "existence" is this paradoxical conjunction of being and becoming.
Men and women, perhaps for psychological reasons, being afraid of their own mortality, have always sought out the most enduring aspects of existence. People come and go, trees grow and die, even mountains gradually erode away, and we now know the sun cannot keep burning forever. Is there anything that is truly and dependably constant? Can one find absolute unchanging being in a world so full of becoming? At one time the heavens were regarded as immutable, the sun and stars enduring from eternity to eternity. But we now know that astronomical bodies, immensely old though they may be, have not always existed, nor will they always continue to do so. Indeed, astronomers have discovered that the entire universe is in a state of gradual evolution.
What, then, is absolutely constant? One is inevitably led away from the material and the physical to the realm of the mystical and the abstract. Concepts like "logic," "number," "soul," and "God" recur throughout history as the firmest ground on which to build a picture of reality that has any hope of permanent dependability. But then the ugly paradox of existence rears up at us. For how can the changing world of experience be rooted in the unchanging world of abstract concepts?
Already, at the dawn of systematic philosophy in ancient Greece, Plato confronted this dichotomy. For Plato, true reality lay in a transcendent world of unchanging, perfect, abstract Ideas, or Forms, a domain of mathematical relationships and fixed geometrical structures. This was the realm of pure being, inaccessible to the senses. The changing world of our direct experience -- the world of becoming -- he regarded as fleeting, ephemeral, and illusory. The universe of material objects was relegated to a pale shadow or parody of the world of Forms. Plato illustrated the relationship between the two worlds in terms of a metaphor. Imagine being imprisoned in a cave with your back to the light. As objects passed by the entrance to the cave they would cast shadows on the cave wall. These shadows would be imperfect projections of the real forms. He likened the world of our observations to the shadow-world of cave images. Only the immutable world of Ideas was "illuminated by the sun of the intelligible."
Plato invented two gods to have dominion over the two worlds. At the pinnacle of the world of Forms was the Good, an eternal and immutable being, beyond space and time. Locked within the half-real and changing world of material objects and forces was the so-called Demiurge, whose task it was to fashion existing matter into an ordered state, using the Forms as a type of template or blueprint. But, being less than perfect, this fashioned world is continually disintegrating and in need of the creative attentions of the Demiurge. Thus arises the state of flux of the world of our sense impressions. Plato recognized a fundamental tension between being and becoming, between the timeless and eternal Forms and the changing world of experience, but made no serious attempt to reconcile the two. He was content to relegate the latter to a partially illusory status, and regard only the timeless and eternal as of ultimate value.
Aristotle, a student of Plato, rejected the concept of timeless Forms, and constructed instead a picture of the world as a living organism, developing like an embryo toward a definite goal. Thus the cosmos was infused with purpose, and drawn toward its goal by final causes. Living things were ascribed souls to guide them in their purposive activity, but Aristotle regarded these souls as immanent in the organisms themselves, and not transcendent in the Platonic sense. This animistic view of the universe laid stress on process through progressive goal-oriented change. Thus it might be supposed that, in contrast to Plato, Aristotle gave primacy to becoming over being. But his world remained a paradoxical conjunction of the two. The ends toward which things evolved did not change, nor did the souls. Moreover, Aristotle's universe, though admitting continuous development, had no beginning in time. It contained objects -- the heavenly bodies -- that were "ungenerated, imperishable, and eternal," moving forever along fixed and perfect circular orbits.
Meanwhile, in the Middle East, the Judaic world view was based on Yahweh's special covenant with Israel. Here the emphasis was placed on God's revelation through history, as expressed in the historical record of the Old Testament, and represented most obviously in Genesis with the account of God's creation of the universe at a finite moment in the past. And yet the Jewish God was still declared to be transcendent and immutable. Again, no real attempt was made to resolve the inevitable paradox of an immutable God whose purposes nevertheless changed in response to historical developments.
A systematic world view that tackled seriously the paradoxes of time had to await the fifth century A.D., and the work of Saint Augustine of Hippo. Augustine recognized that time was part of the physical universe -- part of creation -- and so he placed the Creator firmly outside the stream of time. The idea of a timeless Deity did not rest easily within Christian doctrine, however. Special difficulty surrounded the role of Christ: What can it mean for a timeless God to become incarnate and die on the cross at some particular epoch in history? How can divine impassibility be reconciled with divine suffering? The debate was continuing in the thirteenth century, when the works of Aristotle became available in translation in the new universities of Europe. These documents had a major impact. A young friar in Paris, Thomas Aquinas, set out to combine the Christian religion with the Greek methods of rational philosophy. He conceived of a transcendent God inhabiting a Platonic realm beyond space and time. He then attributed a set of well-defined qualities to God -- perfection, simplicity, timelessness, omnipotence, omniscience -- and attempted to argue logically for their necessity and consistency after the fashion of geometrical theorems. Though his work was immensely influential, Aquinas and his followers had terrible difficulty relating this abstract, immutable Being to the time-dependent physical universe, and the God of popular religion. These and other problems led to Aquinas' work being condemned by the Bishop of Paris, though he was later exonerated and eventually canonized.
In his book God and Timelessness, Nelson Pike concludes, after an exhaustive study: "It is now my suspicion that the doctrine of God's timelessness was introduced into Christian theology because Platonic thought was stylish at the time and because the doctrine appeared to have considerable advantage from the point of view of systematic elegance. Once introduced, it took on a life of its own." The philosopher John O'Donnell arrives at the same conclusion. His book Trinity and Temporality addresses the conflict between Platonic timelessness and Christian-Judaic historicity: "I am suggesting that as Christianity came into greater contact with Hellenism...it sought to achieve a synthesis which was bound to break down precisely at this point....The gospel, combined with certain Hellenistic presuppositions about the nature of God, led to impasses from which the church has yet to extricate itself." I shall return to these "impasses" in chapter 7.
Medieval Europe witnessed the rise of science, and a completely new way of looking at the world. Scientists such as Roger Bacon and, later, Galileo Galilei stressed the importance of obtaining knowledge through precise, quantitative experiment and observation. They regarded Man and nature as distinct, and experiment was seen as a sort of dialogue with nature, whereby her secrets could be unlocked. Nature's rational order, which itself derived from God, was manifested in definite laws. Here an echo of the immutable, timeless Deity of Plato and Aquinas enters science, in the form of eternal laws, a concept that achieved its most persuasive form with the monumental work of Isaac Newton in the seventeenth century. Newtonian physics distinguishes sharply between states of the world, which change from moment to moment, and laws, which remain unchanging. But here once more the difficulty of reconciling being and becoming resurfaces, for how do we account for a flux of time in a world founded upon timeless laws? This "arrow-of-time" conundrum has plagued physics ever since, and is still the subject of intense debate and research.
No attempt to explain the world, either scientifically or theologically, can be considered successful until it accounts for the paradoxical conjunction of the temporal and the atemporal, of being and becoming. And no subject confronts this paradoxical conjunction more starkly than the origin of the universe.
Copyright © 1992 by Orion Productions