
expected waiting time (for the next bus, for example) is the same, however long since the last one passed. Whether we have been waiting for one minute or half an hour makes no difference to our prediction. More exactly, this can be expressed mathematically to tell us what the frequency distribution of inter-arrival times should be if this were true. It turns out to be what is called the exponential distribution. It looks like the frequency distribution sketched in figure 3.1.

Fig. 3.1 The form of the exponential frequency distribution
This distribution has its highest frequency near time zero and declines steadily as the waiting time increases.3 Short waiting times are, therefore, much more common, relatively speaking, than large ones. This distribution occurs whenever the condition for randomness is met, so we might legitimately refer to it as a law. It is a regular and universal occurrence under specified conditions – and that is what laws usually are – but it is a law generated by randomness, which seems rather odd because we tend to think of laws as determining what will necessarily happen. Here we suppose that there are no determining factors, and yet we have a law.

3 Students are often confused by what seems to be an implied contradiction about the distribution of waiting times. The frequency does decline as stated, yet the expected waiting time is constant no matter how long one has waited. The first relates to someone newly arrived, the second to someone who has already waited some time. When you join a queue the chance of having to wait time t certainly decreases as t increases. But once you have been in the queue for a length of time t your position is no different from that of someone who has just joined.
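The memoryless property is easy to check by simulation. A minimal sketch, assuming Python with NumPy, with a mean gap of ten minutes chosen purely for illustration:

    import numpy as np

    rng = np.random.default_rng(0)
    gaps = rng.exponential(10.0, 1_000_000)   # exponential times between buses

    # Average wait for someone newly arrived at the stop ...
    print(gaps.mean())                        # close to 10 minutes

    # ... and average *further* wait for someone who has already waited 5.
    waited = gaps[gaps > 5.0]
    print((waited - 5.0).mean())              # also close to 10: memorylessness

However long we have already waited, the expected further wait is unchanged, exactly as described above.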
But this is not the end of the story, because this exponential law is not actually what is usually known as Poisson's law. The latter arises if we describe the process in another way. Instead of observing the times between events we could count how many events occur in fixed intervals of time. Standing at the bus stop we could count how many buses arrive in successive ten-minute intervals, say. The result would be a string of numbers which might be something like the following:

3, 0, 1, 3, 2, 0, 0, 2, . . .

These do not look very lawful, but if we collect them together
a pattern begins to emerge. In a long sequence the successive
proportions will display a characteristic pattern known as the
Poisson distribution or Poisson's law. In earlier times it was
sometimes referred to as the Law of Small Numbers (in con-
trast to the Law of Large Numbers) because it showed that,
even with rare events, there may be stable patterns.
I do not need to go into the mathematical details beyond
saying that the proportion of 3s, say, is simply related to the
proportion of 2s, and so on. The former is obtained by multi-
plying by the average and dividing by 3 (or whatever the cur-
rent number is). There are simple tests which can be applied
to see whether the proportions do conform to this pattern, and
it is that pattern which constitutes the law.
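In symbols, if p(k) is the proportion of intervals containing k events and m is the average count per interval, the rule just described is p(k) = p(k-1) × m/k. A short check, assuming Python with NumPy and an arbitrary window width:

    import numpy as np

    rng = np.random.default_rng(1)
    times = np.cumsum(rng.exponential(1.0, 1_000_000))   # random events in time
    window = 1.4
    counts = np.bincount((times // window).astype(int))  # events per successive window

    props = np.bincount(counts) / counts.size            # proportions of 0s, 1s, 2s, ...
    m = counts.mean()
    for k in (1, 2, 3):
        print(props[k], props[k - 1] * m / k)            # the two columns agree closely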
In reality it is, as we have seen, just another way of describing randomness in time. There is no sense in which occurrences have to conform to the law. It is merely a description of what happens when there is no law.
At this point the reader may still be slightly misled by the bus
example that I have used to illustrate these ideas. The objection
may be raised that there are, in fact, reasons why buses arrive
when they do. The fact that we are unable to predict them is
because we lack the necessary knowledge. This is correct, and
it illustrates the limitation of the illustration and also hints at
serious theological issues. Must there be a reason, as in the bus example, or could it be that there is no reason at all?
This brings us to the example of the emission of radioactive
particles. These do seem to conform to the requirements of
randomness in time with little prospect of our being able to
look behind the phenomenon and discover what is going on. It
is in such circumstances that some theologians have invoked
God as the controller. After all, if every event must have a cause, and if no other cause can be detected, the only alternative seems to be God.4 This is a matter to which we must return but the point being made here is quite simple. Even when we specify a process which seems to be totally random, patterns emerge which display all the characteristics we associate with the idea of law. This must have profound implications for what we mean when we speak of the lawfulness of the world and what it means to attribute those laws to God.

4 This would be the view of all those who take a broadly Calvinistic position.

the normal law
The oldest and best known of all statistical laws is the so-called Normal Law or, reflecting its origin, the Normal Law of Error. This goes back to Gauss but is now known to be so fundamental and widespread that its historical origins are incidental. The Normal Law5 refers to the shape of a frequency distribution often described as bell-shaped. Figure 3.2 illustrates the shape.

Fig. 3.2 The form of the normal frequency distribution

5 The Normal Distribution is often referred to as the Bell Curve, a term made popular by the somewhat notorious book with that title by Richard Herrnstein and Charles Murray (Free Press Paperbacks, 1994) on intelligence and class structure in American life. Ironically the name is somewhat tangential to the main message of that book.
A quantity which is normally distributed will have a fre-
quency distribution like that shown above, where most values
are clustered in the middle of the range and then tail off towards
the extremes. It occurs widely in nature – human heights are
one example. Is this because it was built into the creation as
something to which a wide range of phenomena would be
required to conform? The answer is, almost certainly, no. It
may be thought of as another example of order out of chaos.
It simply describes one of the regularities of randomness. In
other words it takes a world of chance to produce this law.
A simple example will help to illustrate one way in which
the law comes about. As always in this book, I shall ignore the
technical conditions which must be imposed if the mathematics
operating behind the scenes is to deliver the results. For present
purposes this is of no practical significance and enables us to
penetrate to the essential idea with greater ease.
Imagine that we are interested in the age distribution of the
members of a human society. Because they are too numerous
to question individually, we select a random sample. (A fuller
discussion of random sampling is given in chapter 10.) A ran-
dom sample is one chosen so that all possible samples have an
equal chance of being selected. In other words it is the fairest possible method of selection – no sample has any advantage over any other. If the sample is fairly small – under a hundred, say – it will not be too difficult a job to ascertain all the ages. Suppose we do this and find the average. Now imagine this
to be repeated many times on different, independent, samples
with the average being computed on each occasion. Surpris-
ingly the frequency distribution will have a form very close to
the Normal Law. The remarkable thing is that we can say this
without any knowledge whatsoever of the population age distri-
bution. The result must, therefore, be due to the method of
drawing the samples and have little to do with the underlying
distribution itself. It is thus a product of the sampling mech-
anism, and human choice had nothing to do with it nor did it
depend on any other feature of the world. Order, in the shape
of the distribution, has come out of the randomness of the
sampling procedure. It tells us nothing about any Creator but
more about our own ingenuity – or does it?
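A simulation shows how strong the result is. A sketch assuming Python with NumPy, with a deliberately lumpy, invented age distribution:

    import numpy as np

    rng = np.random.default_rng(2)
    population = np.concatenate([
        rng.integers(0, 18, 40_000),     # children
        rng.integers(18, 65, 50_000),    # adults
        rng.integers(65, 100, 10_000),   # elderly
    ])

    # The average of each of 10,000 independent random samples of size 80.
    averages = [rng.choice(population, 80).mean() for _ in range(10_000)]

    # The counts rise to a single central peak and fall away symmetrically:
    # the bell shape, whatever the population distribution looked like.
    hist, _ = np.histogram(averages, bins=15)
    print(hist)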
To take this example a little further, I return to the emission
of particles from a radioactive source. If we observe many
emissions we shall find that the frequency distribution of the
time to the next emission is highly skewed. As already noted,
short intervals will be the most common and the frequencies
will decline as the duration increases. This is nothing like the
normal distribution, but the normal is not far away. Suppose
we repeatedly form the average of, say, twenty emission times
and look at their distribution. It will turn out that this is quite
close to normal. Once again, this has little to do with the
waiting-time distribution but neither, on this occasion, can
we attribute it to the randomness of sampling, for we did not
sample the intervals but took them as they came. This time
the randomness lies in the variation in waiting time and the
independence between successive emissions.
The surprising thing here is that the form of the waiting-
time distribution has very little to do with the result. It would
have been much the same whatever that distribution had been.
The unpredictability of the individual emission turns into the
predictability of the final distribution, with minimal conditions
on what the initial distribution is. It does not even matter
whether all the distributions are the same nor is it vital that
they be totally independent.
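The same point can be checked for the emission times, with no sampling involved at all. A sketch assuming Python with NumPy:

    import numpy as np

    rng = np.random.default_rng(3)
    gaps = rng.exponential(1.0, 200_000)          # highly skewed waiting times
    averages = gaps.reshape(-1, 20).mean(axis=1)  # average of each consecutive twenty

    def skew(x):                                  # zero for a perfectly normal shape
        return ((x - x.mean()) ** 3).mean() / x.std() ** 3

    print(skew(gaps))        # about 2: strongly skewed
    print(skew(averages))    # close to 0: already nearly normal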
The ubiquity of the normal distribution arises not so much
because it is built into the structure of the world but rather that
it is imposed onto the world by the way we choose to observe
it. It may, therefore, tell us very little about the intentions of
the Creator. But it does tell us that order and randomness are
so closely tied up together that we can hardly speak of one
without the other. Total disorder seems impossible. Order is
discernible at the aggregate level in even the most chaotic
micro-level happenings.

dynamic order
Order can be more subtle than that considered so far. It can
refer to stable patterns which develop and persist over time.
Stuart Kauffman6 has explored this kind of order over many
years in an effort to discover how biological complexity might
have emerged. I shall later dismiss the common simplistic arguments about the origin of life which imagine all the ingredients shaken up together and linking themselves together at random. This kind of process bears no relation to any kind of plausible biological process. Kauffman's approach may be crude but it is far more realistic in that it allows the interaction among the ingredients to play their part. Kauffman's aim is not to model biological processes in minute detail – that would be quite impossible – but to capture the essential qualitative features of this system. This may depend on the broad characteristics of the system rather than its details. He constructs a model consisting of elements linked together in a random fashion and which exert an influence on one another via links. In reality the elements might be genes or enzymes but, at this stage, it is the broad qualitative properties of such systems that I wish to study. Many results for systems of this kind are given in Kauffman (1993) or, at a more popular level, in Kauffman (1995).

6 Stuart Kauffman is a theoretical biologist who has championed the view that the self-organisation of the relevant molecules may have created the complexity necessary for life to appear. This has attracted the opposition of Overman because it would mean that life would have appeared spontaneously, without the need for any Designer. Kauffman, himself, sees his own approach as meeting the objections of those who claim that life could not have arisen 'by chance'. For example he draws attention to the extremely small probabilities for the origin of life given by Hoyle and Wickramasinghe 1981 (see Kauffman 1993, p. 23) as showing that such a random assembly was impossible. Kauffman sees such calculations as irrelevant. If we reject special creation, this leaves the way clear for his own approach. There are other, chemical, arguments for believing that life might arise spontaneously in the conditions obtaining on the primeval earth. Recent examples are reported on page 12 of New Scientist, 18 January 2003 and page 16 of 16 December 2006.
For illustrative purposes Kauffman imagines the elements
as being electric light bulbs which can be on or off. At any
one time, therefore, there will be a display of lights formed by
those bulbs which happen to be on.
The pattern now changes in the following manner. To take
a particular case, let us suppose that each bulb is connected to
three other bulbs, selected at random. The state of the initial
bulb at the next point in time is supposed to depend on those of
the three bulbs to which it has been connected. There has to be
a rule saying how this works. One possibility is that the initial
bulb will only light if all those to which it has been connected
are also alight. What has been done for one bulb is done for
all, so that we end up with a network of connections, such
that the state of any bulb at the next step is determined by the
three to which it was connected. Such a system is set off by
having an initial set of bulbs illuminated. Successive patterns
will then be generated by the on/off rules. A network of this kind is called a Boolean7 net because the two sorts of rules correspond to the 'and' or 'or' rules of Boolean logic. One or other of these rules is attached to each bulb for determining whether it is switched on or off.

7 George Boole (1815–64) was a British mathematician who is remembered as the originator of Boolean algebra, which is central to computers which work with binary numbers.
The question now arises: what will happen if such a system is allowed to run indefinitely? The first thing is to note that there will usually be a very large number of possible patterns. For example if there are 100 bulbs there would be 2^100 possible patterns, which is an extremely large number. More realistic examples with, say, 1,000 bulbs, would be capable of generating an unimaginably large number of possible states. To run through the full gamut, even at the rate of one million per second, would take longer than the time the universe has been in existence! This might lead one to expect a seemingly endless succession of meaningless patterns.
The surprising and remarkable thing is that this may not
happen. In some circumstances the system may reach a rela-
tively small set of states around which it cycles for ever. This
small set may have only a few hundred patterns which is neg-
ligibly small compared with the original set. It turns out that
the critical numbers in determining the behaviour of the sys-
tem are the number of bulbs and the number of other bulbs to
which each is linked. We used three as an illustration. It turns
out that if there is only one connection, the behaviour is rather
uninteresting, whereas if it is large, greater than four or five,
say, the behaviour is chaotic and there are unlikely to be any
simple patterns. It seems that two connections are capable of
producing more interesting behaviour, sandwiched between
the monotony of just one connection and the chaos of many
connections. It is on this borderland of chaos and order that
the interesting things happen. There is just sufficient order
to produce simple patterns which persist, but not so much
as to render the result chaotic. Order on the edge of chaos
seems to be a key characteristic necessary for the emergence
of interesting phenomena.
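A toy version of such a network can be run in a few lines. A sketch assuming Python, with the number of bulbs and the random seed chosen arbitrarily:

    import random

    random.seed(4)
    N, K = 16, 3
    # Each bulb watches K other bulbs and obeys either an 'and' or an 'or' rule.
    inputs = [random.sample([j for j in range(N) if j != i], K) for i in range(N)]
    rules = [random.choice((all, any)) for _ in range(N)]
    state = tuple(random.choice((0, 1)) for _ in range(N))

    seen = {}
    step = 0
    while state not in seen:          # run until some earlier pattern recurs
        seen[state] = step
        state = tuple(int(rules[i](state[j] for j in inputs[i])) for i in range(N))
        step += 1

    print("cycle of length", step - seen[state], "among", 2 ** N, "possible states")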
The patterns here are not static. The order lies in the
repeatability of a small set of states through which the
system continually cycles. The order, remember, arises from
a random specification of the connections. It does not require
any detailed design work to produce the patterns. These
persist in spite of the randomness in the structure. It can
also be shown that other minor variations of the rules gov-
erning the behaviour leave the broad characteristics un-
changed.

it's a small world
There are about six billion people in the world so it would seem
obvious that almost all of them are so remote from us that we
could have nothing in common. Yet there are so many surprising occurrences – as when we meet two people in different spheres who turn out to know one another, and we comment 'it's a small world'. This turns out to be truer than we might think when we start to wonder how we might measure our closeness to other people. There is a considerable literature on this and related phenomena to which Barabási (2003) and Buchanan (2002) provide a useful point of entry.
There are some well-known examples of the phenomenon. Two widely discussed cases relate to the media and mathematics. Kevin Bacon acted in films. Other actors who appeared with him in a particular film were obviously close to him in a certain sense. People who did not appear in any films with him, but who appeared with someone who had appeared with Kevin Bacon, are less close. Let us say that they are distance 2 from Bacon. Those who came no closer than acting with those with number 2, may be said to have distance 3, and so on. What is the distribution of these distances, or, what is a typical distance? It seems that the typical distance is around 6 which, given the number of actors and films, seems to be a remarkably small number.
Mathematicians and many others whose work involves mathematics have what is known as an Erdős number. Paul Erdős was a prolific mathematician who collaborated with many other mathematicians. One's closeness to Erdős can be measured by one's Erdős number.8 Those who co-authored a paper with Erdős have an Erdős number 1; there are about 509 of them. Those who collaborated with a collaborator, but not with Erdős himself, have an Erdős number 2; there are currently about 6984 of these and their names have been published. Almost every mathematician has a small Erdős number – usually in single figures. Again, this seems rather remarkable given the vast size of the published literature in mathematics.

8 The Erdős Number Project has a website: http://www.oakland.edu/enp, from which this and similar information can be obtained.
It turns out that this phenomenon is widespread and in totally different fields, ranging from power grids to the study of infectious diseases. Is there something going on that we have missed or is this another example of order arising out of chaos? The chaos in this case is the haphazard collection of links which are formed between pairs of individuals. 'Haphazard' seems to be the right word to use here because it conveys a sense of lack of purpose. There may, indeed, be purpose in the formation of individual links, but it is the overall pattern with which we are concerned. It does not need anyone to oversee the overall process. Purely local connections of a haphazard nature are sufficient for the 'small-world' structure to emerge.
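The effect is easy to reproduce. A sketch assuming Python: build a haphazard network of 10,000 nodes with a few random links each, then measure every node's distance from one of them by breadth-first search.

    import random
    from collections import deque

    random.seed(5)
    n = 10_000
    neighbours = [set() for _ in range(n)]
    for a in range(n):                        # purely local, haphazard connections
        for b in random.sample(range(n), 5):
            neighbours[a].add(b)
            neighbours[b].add(a)

    dist = {0: 0}                             # breadth-first search from node 0
    queue = deque([0])
    while queue:
        a = queue.popleft()
        for b in neighbours[a]:
            if b not in dist:
                dist[b] = dist[a] + 1
                queue.append(b)

    print(max(dist.values()))                 # the remotest node: single figures
    print(sum(dist.values()) / len(dist))     # the typical distance: about 4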
The study of small-world phenomena appears to have begun with an experiment conducted by Stanley Milgram9 in 1967. His work was concerned with how far any of us might be from some public personage. We might ask whether we can find links through personal acquaintances which stretch from us to the said personage. Although many of the details of the experiment might be criticised, it established the hypothesis that the average number of such links was about six. If one wanted to establish contact, for example, with the President of the United States by going through 'someone, who knew someone, who knew someone . . .' it should not therefore need too many steps to do so.

9 Stanley Milgram (1933–84) was a psychologist who served at Yale University, Harvard University and City University of New York. While at Yale he conducted experiments on obedience to authority which attracted a good deal of criticism on the grounds that they were unethical. Also at Yale, in 1967, he conducted his 'small-world experiment'. This was an empirical investigation in which he sent letters to sixty randomly chosen people in Omaha, Nebraska. Recipients were asked to hand the letters to someone they thought might be able to reach the target – a stockbroker in Sharon, Massachusetts. The results were somewhat ambiguous and were subject to a good deal of criticism but it was a truly pioneering study and has proved a point of reference for much work since on the small-world phenomenon.

order in networks
The small-world phenomenon is one example of what happens
in networks created by linking individuals in a haphazard way.
Networks10 are everywhere and have become such a familiar
feature of our world that the word has passed into the language

as a verb. In spite of their diverse origins, networks display many common features remarkable for their simplicity. They are a striking example of order out of disorder. Even highly random networks exhibit order.

10 There has been a great deal of interest in networks in the last few years. This has resulted from the discovery that many of the networks of real life share a common architecture. Much of this work has been done by physicists who have seen parallels between the physical systems with which they work and networks arising in the social sciences. Since about the year 2000 there have been several books published which aim to make the results obtained available to the wider public. Some, such as Strogatz (2003) and Barabási (2003), are authored by active researchers in the field. Others, such as Buchanan (2002), are by experienced popularisers who are able to take a broad but informed view. On 13 April 2002 the New Scientist published an article by David Cohen entitled 'All the world's a net'.
The simplest example is appropriately called the random
net. This appears to have had its origin in the study of neural
nets, where the interest was in such things as paths through
the network along which communication could take place.
A clutch of papers was published in and around the 1950s in
the Journal of Mathematical Biophysics by Anatol Rapoport11
and others. It was quickly recognised that the results also had
applications in the theory of epidemics, where the nodes are
people and the links are paths of infection. This work was
summarised in chapter 10 of Bartholomew (1982) (though
rather more detail is given in the first two editions of that
book, 1967 and 1973).
Rather surprisingly this work was not noticed by the mathematical community until a new impetus was given to it when Erdős and Rényi weighed in with a series of eight papers on random graph theory beginning in 1959. (The details will be found at the top of page 245 in Barabási (2003).) This, together with the growing power and accessibility of computers, opened the way for the rapid contemporary development of the field. An interesting example of the application of a random net is given by Kauffman in his book Investigations (2000, pp. 35ff.). Kauffman's interest is in the chemistry behind
the origin of life but what he calls 'the magic' comes across without going into the background.

11 Anatol Rapoport (1911–2007) was born in Russia but emigrated to the United States in 1922. He has worked on a wide range of topics but the focus was on the mathematics of decision making. From 1970 he was based at the University of Toronto where he held chairs in Psychology and Mathematics and in Peace and Conflict Studies. It is surprising that his work on nets was unknown to Erdős and his associates.
Kauffman supposes that we have a large number of buttons
scattered across a hard wooden floor. We begin by picking two
buttons at random and joining them with a coloured thread.
Next we repeat the process, joining another pair of buttons.
It could happen that one or both of those picked the second
time could include one of the original pair. This process is
repeated again and again. Every so often we pause and pick
up a single button chosen at random. It may turn out to be an
isolated button or it may be linked to others which will also
be lifted because of the threads which tie them together. In
the early stages these linked clusters will usually be small but
as time goes on they will get larger. One way of charting the
progress of this clustering is to see how the size of the largest
connected cluster increases as the number of threads (con-
nections) increases (expressed as a proportion of the number
of buttons). The magic occurs when this proportion comes
close to a half. Below a half the size of the largest cluster will
be small, but as it passes through a half it increases rapidly
and soon comes close to the maximum possible. In chemical
terms this is a phase change. We move from a situation in which
groups are small to one in which connectivity is high. This
happens without any 'design'. The connections are purely
random and yet this haphazard sort of linking produces a situ-
ation which, to some eyes, might appear to express purpose or
design.
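The phase change can be watched directly. A sketch assuming Python, using a standard union-find structure to keep track of the largest cluster as threads are added:

    import random

    random.seed(6)
    n = 20_000                              # buttons
    parent = list(range(n))
    size = [1] * n

    def find(a):                            # which cluster a button belongs to
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a

    biggest = 1
    for threads in range(1, n):
        a, b = find(random.randrange(n)), find(random.randrange(n))
        if a != b:                          # a thread tying two clusters together
            if size[a] < size[b]:
                a, b = b, a
            parent[b] = a
            size[a] += size[b]
            biggest = max(biggest, size[a])
        if threads % 2_000 == 0:
            print(threads / n, biggest / n)

The second column stays negligible while the ratio of threads to buttons is below a half, then shoots towards one as it passes through a half.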
It is worth re¬‚ecting on this example a little longer to see
what these random connections have done for us. Suppose,
instead, that we initially lay the buttons in a straight line. If we
now connect neighbouring pairs we shall eventually create a
single cluster and the number of links will be one fewer than
the number of buttons. Expressed as a proportion, the ratio of
links to buttons is very close to one – much higher than was
obtained using random connections. The clusters obtained by
the two methods are likely to be very different and for some
purposes one or the other might be preferred. The point is
that it does not take many links, randomly assigned, to create
something very close to a single structure.
Networks can be created in all sorts of ways and the features
of interest may vary greatly. What does not vary is the presence
of patterns or regularities which seem to appear in spite of the
chaotic way in which things happen.
Epidemics provide another example. Infections such as
influenza spread in human populations largely through con-
tacts. Who meets whom may be fairly random as in business,
shopping, education, and so on. Contact, in individual cases,
may be planned or unplanned yet the spread may be very
rapid. Diseases such as AIDS spread through sexual contacts.
Rumours spread in a manner rather similar to infections. They
are passed from friend to friend and often, it seems, at light-
ning speed. A sufficiently hot piece of news will spread quickly
enough without anyone going to the trouble of engineering
the spread. This fact is well known by those who wish to
counter rumours. To counter such a rapid spread, broadcast-
ing in some form which reaches everyone almost immediately
is essential.
There is a very extensive scientific literature on how epi-
demics, whether of infection or rumour, spread. Not surpris-
ingly, some things about such processes are predictable and
constant in spite of the fact that they are driven by purposeless
activity. In the simplest examples, anything spread in a given
population will eventually reach everyone if enough time is
allowed. This is because in a freely mixing population every-
one will meet everyone else at least once and possibly several
times. But not many epidemics are like this; usually there is
some self-limiting factor. People who become infected, for
example, will withdraw from the population to recover. This
is voluntary isolation but, in serious cases, such as SARS,12
they will be removed from the exposed population. Rumours,
similarly, often die out before the epidemic has run its course.
Passing on a piece of salacious gossip ceases to be attractive
if potential hearers have already heard. This may cause the
spreader to give up.

12 Severe Acute Respiratory Syndrome.
Models have been constructed to quantify some of these
intuitively based observations. Under suitable conditions it is
possible to predict the proportion of those who have heard
or been infected at any stage of the epidemic. The growth
curve also has a characteristic shape, rising slowly at first, then more rapidly before finally levelling off. This shape also
depends remarkably little on the precise details of how the
spreading takes place. What is interesting for present purposes
is that haphazard and seemingly unguided processes prove to
have more pattern than we might have guessed. Randomness
achieves easily that which, by design, might have been very
difficult.
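The simplest such model takes only a few lines. A sketch assuming Python: in a freely mixing population, each random meeting between an infected and an uninfected person passes the infection on.

    import random

    random.seed(7)
    n = 10_000
    infected = [False] * n
    infected[0] = True
    count = 1

    meetings = 0
    while count < 0.99 * n:
        a, b = random.randrange(n), random.randrange(n)   # a chance meeting
        if infected[a] != infected[b]:                    # infection is passed on
            infected[a] = infected[b] = True
            count += 1
        meetings += 1
        if meetings % 20_000 == 0:
            print(meetings, count / n)

The printed proportions trace the characteristic S-shape: slow at first, then rapid, then levelling off.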
In all of this work it is supposed that the population of nodes
(e.g. people) is fixed. In practice many networks are constantly
growing; the World Wide Web is, perhaps, the prime example.
It is found that such networks differ in some important respects
from the random net. For example, in a random net the number
of incoming paths to a node will vary but not by much. The
actual numbers will have a distribution fairly tightly packed
around the average. Growing networks are not like this. They
have a number of what are called hubs which are at the centre
of large numbers of links. To account for this Barabási and
his colleagues proposed a model leading to what they called

scale-free networks. They supposed that as each new node
appeared it established links with already existing nodes. The
choice of destination could be purely at random or the prob-
ability could be proportional to the number of links already
established. It is intuitively clear that nodes which appeared
near the beginning of the process will tend to have more
links, simply because they have been around longer and so
have had more opportunity. The effect of growing in this
way is to create a hierarchy of hubs, some with very large
numbers of links. The distribution of the number of links per
node now becomes highly skewed, with large numbers hav-
ing very few links and a few with a very large number. This
pattern is preserved as the system grows and hence accounts
for the description scale-free. For our purposes it is important
to note that a large measure of randomness is still present in
the formation of the net but the resulting pattern, though still
simple, is very different from that of a random net. It is also
interesting to note, in the context of this book, that one of
the examples Barabási uses is the spread of early Christianity through the activities of St Paul, who created a network
as he travelled between centres of population in the ancient
world.
The number of links per node emerges from these analyses
as a key descriptor of a network. Once we get away from the
random net, the distribution of that number is, as we have
seen, highly skewed. But more can be said about it than that.
The long upper tail of that distribution can be approximated
by what is known as a power law. Such distributions have
been known for a long time in other contexts of which the best
known is, perhaps, the Pareto Law of Income Distributions.
They are yet another example of order out of disorder. Much
of that work is brought together in chapter 7 of Bartholomew
(1982).
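The growth mechanism, often called preferential attachment, is easily simulated. A sketch assuming Python, with each newcomer making one link:

    import random
    from collections import Counter

    random.seed(8)
    targets = [0, 1]                     # node 0 and node 1 share the first link
    degree = Counter({0: 1, 1: 1})

    for new in range(2, 50_000):
        old = random.choice(targets)     # chance proportional to links already held
        degree[new] += 1
        degree[old] += 1
        targets += [new, old]            # both endpoints become easier to hit later

    tail = Counter(degree.values())
    for k in (1, 2, 4, 8, 16, 32, 64):
        print(k, tail.get(k, 0))         # falls away roughly as a power of k

A few early nodes end up as hubs with hundreds of links, while the vast majority have only one or two.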
Order out of chaos 51

approximate order
Appearances can be deceptive. Order can appear, persist for a
time and then disappear. I describe a simple example merely
to illustrate the possibilities. In a sense, the example is like a
sequence of independent tosses of a coin with the difference
that the probability of a head at any toss changes from time
to time depending on the outcomes of previous tosses. The
experiment goes as follows. Toss a fair coin 100 times, say, and
compute the proportion of heads. Make a further 100 tosses
but this time make the probability of a head be equal to the
proportion of heads in the previous 100 tosses. Repeat this
procedure and at each stage let the probability of a head be
equal to the proportion of heads in the previous set of 100
tosses. What will happen in the long run to the probability?
Intuition might lead us to expect it to remain around 0.5. We expect to get about 50 per cent of heads the first time round; that should lead to around the same figure next time, and so on. In the very long term this argument will let us down.
Sooner or later all of the 100 tosses will be all heads or all
tails. When that happens the future is ¬xed, because if we
estimate the probability to be 0 or 1 we shall get nothing but
all heads or all tails for ever. Nevertheless, in the medium
term, the probability is likely to remain around 0.5 for some
time, thus exhibiting a degree of order which, though giving
the appearance of constancy, is doomed to disappear. Thus
order, when it arises, is not necessarily something which is
permanent.
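The experiment is easy to run. A sketch assuming Python:

    import random

    random.seed(9)
    p = 0.5
    rounds = 0
    while 0.0 < p < 1.0:
        heads = sum(random.random() < p for _ in range(100))
        p = heads / 100                  # next round's probability of a head
        rounds += 1
        if rounds % 50 == 0:
            print(rounds, p)             # hovers near 0.5 for a long while

    print("absorbed at", p, "after", rounds, "rounds")

The apparent stability around 0.5 is real but temporary; eventually a round of all heads or all tails fixes the future for ever.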

sync
Christiaan Huygens invented the pendulum clock in about
1655. Ten years later he suffered a slight indisposition and
was in a room with two such clocks when he noticed 'a marvellous thing' about which he wrote to his friend, Sir Robert Moray.13 He noticed that two similar clocks, hanging together
in the same room, were beating in time. This was an early
and very simple example of synchronisation. This might have
happened because the clocks were set going together and,
being very accurate timekeepers, had kept in step ever since.
But Huygens found that, however they were started, they
would always synchronise in about half an hour. There must
have been some means by which the action of one clock
influenced the other in a manner which brought them into
harmony.
Pendulum clocks are not the only things which interact in
this way. Synchronisation is a very widespread phenomenon
and includes some spectacular examples. Fireflies, as their name implies, produce flashes of light and these occur at regular intervals. If one encountered a large number of flies, all flashing, one would expect to see a chaotic mixing of the individual flashes but this is not what happens. In Southeast Asia and elsewhere thousands of fireflies gather along river banks and provide a spectacular display of synchronised flashing.
Sometimes this type of phenomenon can be harmful, as in
epilepsy, when the synchronised activity of many brain cells
can lead to convulsions. At other times it can be essential as
when many pacemaker cells in the heart combine to produce
a regular heartbeat. This kind of phenomenon is remarkably
widespread, which suggests that there is a common pattern in
what is going on which might be susceptible to mathematical
analysis. This turns out to be the case and the topic is the
subject of a fascinating account by Steven Strogatz, one of the principal researchers in this field. (See Strogatz (2003) for example.)

13 Further details of this example and some further background will be found in Strogatz (2003, chapter 9).
This phenomenon is certainly an example of spontaneous
order arising from disorder. It is not necessarily a case of
order arising from chance, so some justi¬cation is required
for its inclusion here. The essential point is that there is no
overarching control which produces the order represented by
the regularly beating heart. There is no central timekeeper or
conductor orchestrating the behaviour of fireflies. The pat-
tern arises from local interactions which bear no obvious rela-
tionship to the global pattern. The clocks will beat in time
however much or little their starting positions differ. It would
make no difference if the starting positions were determined
randomly and that means that this aspect does not have to
be designed or planned. Viewed theologically, the regularities
can be achieved without any detailed planning and hence there
needs to be no detailed control at the micro level. Spontaneous
synchronisation can be achieved almost for nothing.
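A standard mathematical toy for this behaviour, not discussed in the text but capturing the same idea, is the Kuramoto model: each oscillator's phase is nudged towards the average phase of the rest. A sketch assuming Python with NumPy:

    import numpy as np

    rng = np.random.default_rng(10)
    n, coupling, dt = 50, 1.5, 0.01
    freq = rng.normal(1.0, 0.1, n)            # slightly different natural speeds
    phase = rng.uniform(0, 2 * np.pi, n)      # random, 'undesigned' starting positions

    for step in range(5_000):
        z = np.exp(1j * phase).mean()         # summarises the crowd's behaviour
        phase += dt * (freq + coupling * abs(z) * np.sin(np.angle(z) - phase))
        if step % 1_000 == 0:
            print(abs(z))                     # 0 means disorder, 1 perfect synchrony

Whatever the random starting positions, the printed order parameter climbs towards one: the flashes fall into step with no conductor.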

concluding remarks
I have ranged widely over many fields to show that order often arises as a consequence of disorder. The 'laws' I have considered might speak to the uninitiated of the direct intervention of a God, at every stage, to ensure that things turned out exactly as he desired. In fact the beautiful shape of something like the Normal Distribution might be regarded as eloquent testimony to the nature of the Designer. But as Mark Twain memorably remarked, 'it ain't necessarily so'. Lawful behaviour in the
shape of statistical laws seems to emerge naturally from an
underlying chaos. Indeed one might go so far as to say that
chaos is a precondition of the order which, to some, speaks so
eloquently of the divine mind.
The full theological implications of all this are profound but
a discussion of them must wait until more pieces of the jigsaw
are in place. For the present, I note that the involvement of
God in the creation may be much more subtle than theologians,
rushing to premature closure of the science–religion debate,
are prepared to allow.
chapter 4
Chaos out of order



The transition from chaos to order is not a one-way process. Just
as order can result from chaos, so can chaos result from order. In
this chapter I describe three ways in which this may happen. These
are: accidents, pseudo-random numbers and mathematical chaos. This
transition, also, has to be taken into account by our theology. If these
were the only sources of uncertainty in the world there would be no
need to invoke pure chance or to explain how it might be consistent
with divine purpose.


disorder generated by order
Our theological assertions about what God can or cannot do
in the world depend very much on what kind of place we
believe the world to be. In the last chapter we encountered
the somewhat disconcerting fact that much of the order and
lawfulness which we so readily attribute directly to God has
its roots in disorder. But this is only half of the story – a great
deal of disorder attends the regularities that are all around us.
In this chapter, therefore, we shall look at the other side of the
coin as a prelude to seeing whether the apparently paradox-
ical situation which faces us can be resolved. For example, if
the purposefulness of God is to be discerned anywhere, one
would expect it to be in the regularity and predictability of the
aggregate. But, if this is the case, how does God engineer it?

On the other hand, if the motions of individual molecules in
a gas are to be thought of as purposeless and undirected, how
can we attribute purpose to the aggregate behaviour which is
built on this irregularity?
This will demand a reassessment of what it means to say
that happenings in the world reveal the purposeful activity of God – or the lack of it. Here we shall look at three manifestations of chaos out of order: first, accidents, which in retrospect can often be seen as determined by antecedent events but which constantly surprise us; secondly, pseudo-random numbers – this may seem to be an esoteric branch of mathematics but, in reality, it is very big business in applied science of both the natural and social varieties; finally, chaos theory, which, in a more technical sense, has come into great prominence in recent years and has sometimes been described as the 'third great revolution in physics this [the twentieth] century'. Some theologians have seen chaos theory as a lifeline offering a means for God to exercise control without disturbing the orderliness of nature. To me this claim seems premature and unconvincing.

accidents and coincidences
Accidents often appear to be unintended and unpredictable
happenings. The phrase 'accidents will happen' expresses both their inevitability in the long run and the uncertainty surrounding particular occurrences – as does the remark about 'accidents waiting to happen'. Speaking of 'happy accidents' reminds us that it is not the consequences of the happening that define an accident. The essential idea behind the word is
of something that is not planned or intended. There are no
obvious causative factors and, though there may well be rea-
sons for accidents, they are largely hidden from us. In other
words, they bear all the signs of being chance events. Yet they
can happen in systems that are fully determined.
Coincidences are closely related to accidents. Again there
is no suggestion that they are unnatural in any sense; two
events just happen to have occurred at the same time and
place. The distinctive thing about them is the juxtaposition of
two unrelated events which take on a new meaning because of
their coincidence.
The striking of the earth by a large asteroid may, except
perhaps in the last stages of its approach, have all the appear-
ances of an accident in the sense that it was unintended and
largely unpredictable. But we know that heavenly bodies are
subject to known gravitational forces and, given their trajecto-
ries over a period, collisions could be predicted with certainty.
The whole set-up may well be deterministic but our ignorance
of the contributing paths of causation would lead to our being
just as surprised as we would have been by some truly ran-
dom happening. From the perspective of chapter 2 on 'What
is chance?™ we simply do not have enough information to pre-
dict with certainty what is going to happen; we are therefore
uncertain.
At a more mundane level, accidents also commonly arise
from the coincidence of two independent paths in space and
time. In speaking of paths we are borrowing terminology
which is familiar enough in two-dimensional geometry but
may seem odd in this context. Just as we can plot a path
on a two-dimensional map, so we can imagine doing it in
more dimensions than we can actually visualise. With three
space dimensions and one time dimension we have a four-
dimensional space but we can stretch our language to speak of
a path in the same way as in fewer dimensions where we can
visualise what is going on. When two causal paths coincide at a
particular time and place they may trigger the event we call an
accident. To an all-seeing eye the accident could be foreseen
and would come as no surprise. We sometimes find ourselves
in this position. We may shout a warning to someone who
cannot see what is coming. To them it comes as a surprise
because they do not have enough information to make the
prediction that is a certainty to us.
An accident could, of course, be a truly random event trig-
gered by some purely chance happening as, for example, the
emission of a radioactive particle. Usually, however, it will
only appear to be random because of our ignorance of the
processes which give rise to it. All that we need to establish at
the moment is that, even in a fully deterministic system, events
happen that have, to us, all the characteristics of chance events.
For all practical purposes accidents and coincidences are just
like pure-chance happenings, but the theological issues they
raise are quite different.
At one level accidents pose no serious theological questions.
If from God™s perspective all is known to him in advance, then
his sovereignty is unchallenged. At another level, of course,
accidents raise profound questions as to why God should allow
them to happen. In turn, this leads on to the question of whether
the system could have been designed to avoid all such unde-
sirable happenings. Given the highly complex and interacting
nature of the creation, it is not at all obvious that it would be
possible to 'design out' all unwanted outcomes. The highly
interconnected character of the world may mean that one can-
not have some desired outcome without having others which,
so to speak, necessarily go along with it. This is a problem for
those who demand total sovereignty for God at every level. It
may be less of a problem for those who are prepared to allow
the elasticity which a modicum of uncertainty provides.

pseudo-random numbers
Pseudo-random numbers are big business. The production of
such numbers by computers – even pocket calculators – is a
largely unnoticed yet universal feature of contemporary soci-
ety. Basic scientific research and the selection of winners in
competitions both call for a ready supply of random numbers.
But what are pseudo-random numbers, why do we need them,
and what bearing does their generation have on the present
question of chaos out of order? In passing, it should be noticed
that genuinely random numbers can be produced using phys-
ical processes, such as radioactive decay. If this is so and if the
real thing is available one might ask why we should bother with
something which is second best. The answer is that pseudo-
random numbers are almost as good and do not require any
physical apparatus with all its costs and inconvenience.
It may help to begin by looking at some familiar things
which serve to introduce the general idea. In some sense we
are talking about what might be called 'contrived' accidents. Think first of the tables of figures published by National Statistical Offices. Narrowing it down, imagine we are looking
at the populations of large cities in a big country such as the
United States. One such figure might be 1,374,216. Not all
of the digits have equal importance. The first 1 is the most
important. It immediately gives us an idea of whether this is
a big city or not; in this instance it is in the million or more
bracket. The next most significant digit is the 3 in the second
position. Knowing that the city population was in the region
of 1.3 million would probably tell us most of what we want
to know and it would certainly fix the size in relation to most
other major cities. As we move along the sequence the digits
become progressively less signi¬cant. Another way of putting
the matter is to say that the information, conveyed by these
successive digits, diminishes as we move along the sequence.
The final 6 tells us very little indeed about the size of this city,
or about its size in relation to other cities. In fact, it may not
be accurate given the uncertainties of counting such things. If
we were to collect together the last digits of the populations of
many cities they would be almost devoid of meaning and so
would be rather like random numbers. But we very well know
that they are not strictly random. They have been arrived at
by a process of counting well-de¬ned objects (human beings,
no less). It would not be too far-fetched to describe collections
of digits arrived at in this way as pseudo-random digits on the
grounds that while they are certainly not random, nevertheless
they look very much as if they were.
Any method which involves using some well-defined arith-
metical process and which then discards the most meaningful
parts is a way of producing pseudo-randomness. One tradi-
tional method, long used by children, is the use of counting-
out rhymes to decide who shall be 'it' in a game. A method is
required which is accepted as fair by the participants in the
sense that, over the long run, it does not appear to favour any
particular individual.
There are several variations of the method but the usual
procedure is to count around the circle of children, counting
one word to each child. The child on whom the last word
falls is selected. In principle it would be perfectly possible to
work out in advance who would be selected. To do this we
would divide the number of words in the rhyme by the number
of children in the group. We would then disregard the inte-
ger part of the answer and keep the remainder. This number
determines which child would be selected. If the remainder
was four it would be easy to spot the fourth member of the cir-
cle – much easier than doing the full mental calculation. The
method works because it is not immediately obvious what
the remainder will be, because people do not carry in their
heads the numbers of words in all, or any, of the many rhymes
there are. It is much easier to remember the words than their
number. The remainder is a pseudo-randomly selected digit.
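In code the whole procedure is a single remainder operation. A sketch assuming Python, with invented numbers for the rhyme and the circle:

    words_in_rhyme = 23      # hypothetical rhyme length
    children = 7

    # Counting one word per child around the circle is division in disguise:
    # the integer part of 23/7 is discarded and the remainder picks the child.
    print(words_in_rhyme % children)   # 2: counting stops on the second child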
For the large-scale uses of random numbers these sim-
ple methods would be far too slow, their yields would be
inadequate and there would be serious doubts about whether
they were sufficiently random. These difficulties are over-
come by using what are called random-number generators.
Many of these are based on much the same principle as the
counting-out rhyme. One does a calculation, such as a division
sum, and then retains the least signi¬cant part of the answer.
Nowadays most computers incorporate random-number gen-
erators. One characteristic of many such generators is that, if
you go on long enough, the output will start repeating itself.
This is one sign that the numbers are not genuinely random
but this fact is of little practical importance if the number we
need is very small in relation to the cycle length.
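A classic generator of this kind, given here as an illustration rather than as any particular machine's method, is the linear congruential generator: multiply, then keep only the remainder.

    # Parameters of the well-known 'minimal standard' Lehmer generator.
    MODULUS, MULTIPLIER = 2**31 - 1, 16807

    def lcg(seed):
        while True:
            seed = (seed * MULTIPLIER) % MODULUS   # keep only the remainder
            yield seed / MODULUS                   # scale into the range [0, 1)

    gen = lcg(42)
    print([round(next(gen), 3) for _ in range(5)])

The sequence is completely determined by the seed, yet the successive values pass the usual tests of randomness until, after a very long cycle, they repeat.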
It is worth observing at this stage that coin tossing is also
pseudo-random. It does not produce a sequence of digits from
0 to 9 but it does produce the simplest kind of numbers we
can have – that is, binary numbers. If we code a head as 1 and
a tail as 0, a sequence of tosses will be a string of 0s and 1s.
Whether or not a coin falls heads or tails depends on what is
left over when the number of complete rotations is finished.
This, in turn, depends on the impulse and spin imparted to the
coin by the tosser™s hand. Whether or not it falls heads or tails
depends on which side of the vertical the coin is when it hits the
ground on its return. The binary outcome is thus determined by a remainder to the number of rotations in much the same
way as are pseudo-random numbers. This comparison shows
just how close to being genuinely random a pseudo-random
sequence can be. In turn, this raises the question of whether
there are degrees of randomness and whether they would tell
us anything of theological relevance.
There are two entirely distinct ways of approaching this
question. The traditional statistical approach is to ask what
properties a random series should have and then to examine a
candidate series to see whether it has them. In this approach
there is no such thing as 'a random number', in the singu-
lar. The randomness refers to the process of generation and
can only be detected in a long sequence of such numbers. A
pseudo-random-number generator is a device which produces
a sequence of numbers (usually the digits 0, 1, 2, . . . , 9 or
the binary digits 0 and 1), which for all practical purposes
are indistinguishable from a purely random sequence. And
what would a purely random sequence look like? It would
look like a sequence from a purely random generator! We
might begin to specify some of the required properties. To
begin with, the number occurring at any point in the sequence
should not depend on anything that has gone before. If it did,
it would be, in part, predictable. More generally, nothing we
could learn from part of the series should contain any informa-
tion for predicting any other part. This all depends on asking
about the mechanism which generated the series. According
to this approach one does not ask whether a particular string
is random. Any string whatsoever could have been generated
by a random process. A sequence of 20 sixes generated by
rolling a six-sided die has probability (1/6)^20 which, although
an exceedingly small number, is exactly the same probability
as any other sequence, however random it might look. Instead
one asks whether the generating mechanism was such that all
possible digits had the same probability of occurring inde-
pendently of all other outcomes. From this one can deduce
various properties which a series should have. For example,
in the binary case, each digit should appear roughly equally
often; each time a 1 occurs it should be followed equally often
by a 1 or 0, and so on. By making these comparisons one
can never categorically rule out any series as non-random but
one can get closer to saying whether it is likely to have been
generated by a random generator.
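Such comparisons are easy to set up. A sketch assuming Python, applying the two checks just mentioned to a binary sequence:

    import random

    random.seed(11)
    bits = [random.randint(0, 1) for _ in range(100_000)]

    # Each digit should appear roughly equally often ...
    print(sum(bits) / len(bits))               # close to 0.5

    # ... and a 1 should be followed equally often by a 1 or a 0.
    after_one = [b for a, b in zip(bits, bits[1:]) if a == 1]
    print(sum(after_one) / len(after_one))     # also close to 0.5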
The second approach is to ask how close the actual series
is to randomness. This is based on a clever idea of Gregory
Chaitin1 (see his chapter 2 in Gregerson, 2003a) though, like so
many other good ideas, it had been anticipated some years ear-
lier. To avoid confusion it would be more accurate to describe
the method as measuring the amount of pattern rather than the
degree of randomness. A highly non-random series exhibits a
large degree of pattern and conversely. For example a sequence
of 100 ones is highly patterned and it can be described very
concisely as 100 ones. The sequence 01010101 . . . is almost
equally simple and could be described as 01 repeated 50 times,
say. Chaitin's idea was that the simpler the pattern, the fewer
the words which were needed to describe it. At the other
extreme, randomness occurs when there is no detectable pat-
tern at all and then there is no shorter way of describing it
than to write the sequence out in full. Actually one has to be a
little more precise than this. Descriptions have to be in terms
of the length of the computer program (on an ideal computer)
necessary to reproduce the sequence. Nevertheless the idea
is essentially very simple. Randomness is measured by the
degree to which one can describe the series more economi-
cally than by using the series itself. We shall meet this second
way of measuring randomness again in chapter 7.
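An off-the-shelf compressor gives a crude stand-in for this measure (Chaitin's definition uses programs on an ideal computer, not zip files). A sketch assuming Python:

    import random
    import zlib

    random.seed(12)
    patterned = "01" * 5_000                             # '01 repeated 5,000 times'
    haphazard = "".join(random.choice("01") for _ in range(10_000))

    # The patterned string has a short description; the haphazard one does not.
    print(len(zlib.compress(patterned.encode())))        # a few dozen bytes
    print(len(zlib.compress(haphazard.encode())))        # over a thousand bytes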

chaos
In ordinary usage the word chaos carries a much wider connotation than described in this section and this tends to spill over into theological discussions. I am talking here about what is more accurately described as mathematical chaos.2 In fact it is, at least, debatable whether anything exists in the real world corresponding exactly to mathematical chaos.

1 Gregory J. Chaitin, born in 1947, is a mathematician and computer scientist who apparently had the basic idea mentioned here while still at school. There are many accounts of this way of measuring randomness but the chapter referred to here (entitled 'Randomness and mathematical proof') has the merit of being written by Chaitin himself.
Science has been particularly successful where it can predict
change over a long period. Tide tables are published well
in advance because tides depend on accurate knowledge of
the relative positions of the earth, moon and sun and how
they change with respect to one another. Similarly the great
successes of engineering are the result of good knowledge of
materials and how they respond to the strains and stresses that
are put upon them. The study of dynamical systems has also
depended on being able to determine how the rates of change
depend on the current state of the system.
Mathematical chaos had its origins in the discovery that
there are physical processes which do not appear to behave
in a regular and predictable way. This is especially true of
the weather. Forecasters attempt to model weather systems.
They measure such things as air and sea temperatures at many
points on the earth's surface and use models to predict how
they will change and then construct their forecasts. Yet despite
the growing complexity of their models and vastly increased
computing power available it has proved impossible to look
more than a few days ahead. Edward Lorenz3 was one of
the first to study the situation mathematically and to discover
chaotic behaviour in an apparently deterministic model. He
wrote down a set of differential equations which, he thought,
provided a reasonable approximation to real weather systems. As he projected them into the future it became apparent that the 'futures' which they generated depended critically on the precise starting assumptions. Thus was born chaos theory.

2 There are many books on chaos aimed at different audiences. A popular account is given in Chaos: Making a New Science (1987) by James Gleick. An excellent, somewhat more technical account – but still not making undue mathematical demands – is Explaining Chaos (1998) by Peter Smith.

3 Edward Lorenz (1917–) was a meteorologist working at the Massachusetts Institute of Technology. It was in simulating weather patterns that he discovered, accidentally apparently, that the development of a process could depend heavily on the initial conditions.
A characteristic feature of chaotic processes is this extreme
sensitivity to initial conditions. It is this fact which makes long-
term forecasting almost impossible and which justifies the
introduction of the description chaotic. Two systems, which are
extremely close together at one time, can subsequently pursue
very different trajectories. This near lack of predictability is
the source of the term chaotic.
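
The effect is easily demonstrated numerically. The sketch below integrates the classic three-variable Lorenz system with its textbook parameter values (a stand-in for, not a reconstruction of, his weather model), using crude Euler steps, and starts two trajectories one part in a billion apart:

```python
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one Euler step."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)          # first trajectory
b = (1.0 + 1e-9, 1.0, 1.0)   # second, differing by one part in a billion

for step in range(1, 5001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 1000 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"after {step} steps the separation is {gap:.2e}")
```

The separation grows by many orders of magnitude before levelling off at the size of the attractor itself, after which the two 'futures' bear no useful relation to one another.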
Weather is not the only phenomenon which exhibits this
sort of behaviour. Heartbeats and the turbulent flow of
liquids are two other examples which have attracted attention. It
is not, perhaps, immediately obvious whether chaotic pro-
cesses such as I have described bear any relationship to the
pseudo-random-number generators discussed above. They
do, in the sense that one can choose simple indicators, charting
the path which a chaotic process takes, that are essentially
sequences of digits. In this form their properties can be
compared with those of sequences of pseudo-random numbers.
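
As an illustration of the digit idea, here is one standard construction, using the logistic map, a textbook chaotic process, rather than a weather model: iterate the chaotic rule and record a 1 whenever the state exceeds one half.

```python
def logistic_bits(x0: float, n: int) -> list:
    """Threshold the chaotic logistic map x -> 4x(1-x) to get a bit sequence."""
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x > 0.5 else 0)
    return bits

bits = logistic_bits(0.123456, 10_000)
print(sum(bits) / len(bits))  # proportion of 1s: close to 0.5, like fair coin tosses
```

The stream is completely deterministic, yet simple tests, such as the proportion of 1s, behave much as they would for pseudo-random output.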
This similarity to random processes has certain attractions
for theologians. For example, for those who are uneasy with
any true randomness and hanker after the certainties of deter-
minism, it holds out the prospect of a fully controlled universe
with a fully sovereign God. For if the unpredictability of some
dynamic processes is really the unpredictability of chaos
then, maybe, all the uncertainties of life are rooted in mathe-
matical chaos. This, as I note elsewhere (for example in chapter
12), poses problems of another kind for theology, but at least
it avoids the most flagrant introduction of pure randomness.
A second attraction of chaos theory for theologians is that
it offers the prospect of space for God's action in the world
which does not violate the lawfulness which is so integral to
science. For if such different paths are followed by processes
starting from positions which differ infinitesimally, then per-
haps God can engineer changes of direction at the human
level by such minuscule adjustments that they are not humanly
detectable. This view of things creates enough space for God
to act without the possibility of detection by humans.4 Like
so many seemingly simple recipes, this one raises enormous
problems which we shall come to later. Note here, however,
first, that it makes the major assumption that the world can, in
fact, be controlled in this way and, secondly, that it implies things
about the nature of God from which the orthodox might shrink
if they were made fully explicit.
For the moment I simply add chaos theory to the list of
deterministic processes which contribute to the unpredictabil-
ity of the world.

4 The idea that chaos theory provides space for God to act without violating the lawfulness of nature is particularly associated with the name of John Polkinghorne. He clarifies his views in his contribution to Russell et al. (1995) (pp. 147–56; see especially the section on Chaos theory, p. 153). In the same volume, p. 348, Nancey Murphy makes a similar point, but it should be emphasised that the views of both authors are more subtle than a superficial reading might suggest. We return to the subject of God's action in the world in chapters 8 and 9.
CHAPTER 5
What is probability?



Theologically important arguments sometimes depend on the correct-
ness of probability calculations. In order to evaluate these arguments
it is necessary to know something about elementary probability the-
ory, and in particular, about mistakes which are often made. In this
brief, non-technical, chapter I introduce some basic ideas of measuring
probability and then identify several common fallacies. These involve
the importance of the assumption of independence, conditionality and
the so-called prosecutor's fallacy.


TWO KINDS OF PROBABILITY
Chance is the general-purpose word we use when we can see
no causal explanation for what we observe. Hitherto I have
not needed to quantify our uncertainties but we shall shortly
move into territory where everything depends on exactly how
large these uncertainties are. I have already spoken of prob-
ability in unspecific terms but now I must take the next step.
Probability is a numerical measure of uncertainty and numer-
ical probabilities have been used extensively in theological
discussions. As we shall see shortly, there are events, such as
the appearance of life on this planet, for which the way in
which probabilities are calculated is absolutely crucial. Cer-
tain events are claimed to have extremely small probabilities
and on the validity of these claims hang important consequences for what
we believe about life on earth or, even, the existence of God.
Many of the arguments deployed are little more than
rather wild subjective opinions, so it is vital to be as precise as
possible about probability statements. There is no need to go
into technicalities but it is important to be aware of some of the
major pitfalls and to be able to recognise them. The purpose of
this short chapter is to set out the more important things that
the reader needs to know before we come to such contentious
issues as Intelligent Design, where the correct calculation of
the probabilities is crucial.
Probability theory might seem to be one of the more esoteric
corners of the field which can be passed over without losing the
general drift of the argument. But much can turn on whether
probabilities are correctly calculated and interpreted. This
is just as true in fields outside theology, so I begin with an
example of considerable public interest from forensic science
to illustrate this point. The outcomes of trials and subsequent
imprisonment have depended on probabilities. Sally Clark, a
solicitor and mother of two young children, was imprisoned in
1999 for their murder.1 The prosecution
argued that the probability of two cot deaths in one family was
so small that it could be ignored in favour of the alternative
explanation of murder. The probability calculation played a
major part in the conviction of Sally Clark and, once it was
discredited, in her subsequent release after two appeals. DNA
matching is another field where it is becoming increasingly
common for probabilities to be used in criminal cases. Prob-
abilities of one in several millions are not uncommonly quoted
in court as supporting evidence.

1 This case raised a number of important issues about the presentation of scientific evidence in courts of law. In this particular case the expert witness was a distinguished paediatrician and his evidence was crucial in the court's decision to convict Sally Clark of murder. It was subsequently recognised that he was not competent to give expert testimony on statistical matters. The Royal Statistical Society issued a press release pointing out that a serious mistake had been made and urging that steps be taken to use appropriate expertise in future. In the United Kingdom the Crown Prosecution Service has issued guidelines for expert witnesses in such matters as DNA profiling.
A probability is a number purporting to measure uncer-
tainty. It is commonly expressed as a number in the range 0–1,
with the upper end of 1 denoting certainty and the lower end of
0 corresponding to impossibility. A probability of 0.5, which
is halfway, means ˜as likely as not™. Anything larger than 0.5 is
then more likely than not. The betting fraternity do not often
use probabilities in this form but use odds instead. These are
an equivalent way of expressing uncertainty and are, perhaps,
more widely understood. Odds of 3 to 1 in favour of some
outcome say the same thing as a probability of 0.75, and
'evens' is equivalent to a probability of 0.5. Odds tend to be
favoured when probabilities are very large or small. In Sally
Clark's trial the estimated odds of a cot death were 1 in 8543, a
figure whose import is, perhaps, more readily grasped than the
corresponding probability of 0.00012. Odds are, however, less
convenient when we come to make probability calculations or
to be explicit about the assumptions on which the probability
is calculated.
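
For readers who like to see the conversions explicitly, here is a minimal sketch. The figures are those quoted above; note that the '1 in 8543' is strictly a rate, which at this magnitude is numerically indistinguishable from odds of 1 to 8542.

```python
def odds_to_probability(in_favour: float, against: float) -> float:
    """Odds of in_favour to against -> the equivalent probability."""
    return in_favour / (in_favour + against)

def probability_to_odds(p: float) -> float:
    """Probability -> odds in favour, expressed as a single ratio."""
    return p / (1.0 - p)

print(odds_to_probability(3, 1))   # 0.75, the 3-to-1 example
print(odds_to_probability(1, 1))   # 0.5, 'evens'
print(round(1 / 8543, 5))          # 0.00012, the cot-death figure
```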
But where do the numbers in which such probabilities are
expressed come from? For our purposes we can identify two
sorts of probability.2 First there are those based on frequencies.
These are used by actuaries when constructing life tables for
insurance purposes. If, for example, 60 per cent of forty-year-
old men survive to their seventieth birthday we may say that
the probability of a randomly selected forty-year-old man
reaching that age is 0.6. Armed with such probabilities one
can go on to estimate the probabilities of certain other events
concerning forty-year-old men. The probability of a baby
suffering cot death, quoted above, would have been estimated
by reference to a large number of babies at risk and then
observing how many succumbed. Frequency probabilities are
probabilities of events.

2 This twofold division of probability is, of course, somewhat oversimplified. Elementary textbooks often begin with 'equally likely cases' and do not get much beyond problems concerning coins, dice and cards, where the possible outcomes are, self-evidently, equally likely. Traces of this idea appear in what we shall meet later, where complicated biological entities are mistakenly treated as what we shall call combinatorial objects (see chapter 7, p. 110). What are here called degrees of belief can be further subdivided into objective and subjective (or personal) probabilities.
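
The actuarial example can be made concrete in a few lines; the counts below are invented to match the 60 per cent figure above.

```python
# Hypothetical records: 10,000 forty-year-old men followed to age seventy.
survivors, followed = 6_000, 10_000

# A frequency probability is simply the proportion in the reference class.
p_alive_at_70 = survivors / followed
print(p_alive_at_70)  # 0.6
```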
The second kind of probability is more subjective and is
often referred to as a degree of belief. Betting odds are an
example. Bookmakers arrive at such odds by reflecting on all
the factors of which they are aware and making a subjective
judgement. Although they may be very experienced and have
copious information at their disposal, in the end the number at
which they arrive is subjective. There is no absolute standard
to which their judgement can be referred to decide whether it
is right or wrong. Degrees of belief, like relative frequencies,
can also refer to events as, for example, if we consider the
possibility that there will be a change of government at the
next general election. They can also refer to propositions, such
as when we consider the probability that God exists. There are
sophisticated ways in which degrees of belief can be elicited,
by offering hypothetical bets, for example, and there are also
crude subjective judgements arrived at with little thought. For
the moment I have lumped all of these together in the second
category of degree of belief.

MULTIPLYING PROBABILITIES
Probability theory is not so much concerned with estimating
individual probabilities as with combining them to determine
probabilities of complicated events. For present purposes, by
far the most important of these relate to conjunctions. If some-
thing occurs with probability 0.5 then what is the probability
that it will happen ten times in a row if it is repeated? If the
chance of a baby dying from cot death is 0.00012, then what is
the probability of two babies in the same family dying from
the same cause? The answer to this question was given in Sally
Clark's trial as 0.00012 × 0.00012 = 0.0000000144, or about one
in 69 million. What is the justification for multiplying prob-
abilities in this way? The answer is that it is only valid if the
events in question are independent; that is, if what happens in
one case is entirely uninfluenced by the other. When tossing
coins this is a plausible assumption because it is difficult to
see how the two tosses could be linked physically in any way.
But in the case of cot death it is far from obvious that two
deaths in the same family would be independent. Common
influences, genetic and environmental, are not unusual in
families and it would be surprising if events happening to sib-
lings were independent. If events are not independent then
the second probability in products such as the above must be
modified to take account of what happened to the first baby.
This could substantially alter the answer. The same consid-
erations apply if there are more than two events. Multiplying
several very small numbers produces a product which can be
very small indeed. It is on such calculations that many of the
more startling conclusions in the science–religion debate rest.
The validity of the implicit assumption of independence is
seldom commented on or even noticed!
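
The point is easily made explicit. In the sketch below the first product is the trial's calculation; the tenfold increase in risk for a second child is an invented figure, used only to show how far dependence can shift the answer.

```python
p_one = 0.00012  # probability of a single cot death, as quoted at the trial

# Assuming independence, as the prosecution did:
p_two_independent = p_one * p_one
print(p_two_independent)  # about 1.44e-08, roughly 1 in 69 million

# If a shared family factor made a second death, say, ten times more
# likely once a first had occurred (an invented figure), the correct
# product uses the conditional probability for the second event:
p_second_given_first = 10 * p_one
print(p_one * p_second_given_first)  # about 1.44e-07, roughly 1 in 7 million
```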
In a section entitled 'Universes galore: the problem of dupli-
cate beings', Paul Davies (2006, pp. 201–3) discusses calcula-
tions purporting to show that in a large enough universe, or
collection of universes, all sorts of unlikely things are bound to
happen sooner or later. Thus one would certainly find another
planet, exactly like the earth, on which lives an individual
exactly like oneself in every detail. This would pose obvious
problems about personal identity. Davies begins with coin
tossing, showing that, if you toss a coin sufficiently often, such
extremely unlikely sequences as a thousand heads in a row
will certainly occur. He then reports some calculations by the
cosmologist Max Tegmark about how far, for example, one
would expect to travel through the universe before finding an
exact copy of oneself. The distance is enormous (about 10^29
metres), but we are told that 'Weird though these conclusions
may seem, they follow incontrovertibly from the logic of sim-
ple statistics and probability theory.' Such conclusions may be
weird but they certainly do not follow from 'the logic of simple
statistics and probability theory'. In addition to assumptions
about the uniformity of the laws of physics and suchlike, we
may be sure that, lurking somewhere, there will be assump-
tions of independence such as those used in the introduction
about coin tossing; these do not come as part of the probability
package. They have to be imported as part of what is assumed
to be ˜given™. Probability theory must not be saddled with
responsibility for every weird conclusion drawn from its use.
The weirdness almost always lies in the assumptions, not the
logic. If one has a sufficiently lively imagination it is possi-
ble to make all sorts of assumptions for whose truth one has
no evidence whatsoever. Conclusions are no better than the
assumptions on which they are based. Science fiction thrives
on improbable scenarios, and though readers may enjoy the
thrill of being transported into realms of near-fantasy on the
wings of elementary statistics, they should be on their guard
against being taken for a ride by calculations based on a grain
of fact and a mountain of guesswork.
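
How 'certainly' a long run of heads occurs depends on the length of the sequence, and the calculation itself leans on independence at every toss. Here is a sketch of the standard recursion for a more modest run of ten heads (a thousand is the same calculation with a larger k):

```python
def prob_run(n: int, k: int) -> float:
    """Chance of at least one run of k heads in n independent fair tosses."""
    # state j < k: current streak of j heads, no run of k seen yet;
    # state k: a run of k has occurred (absorbing).
    state = [1.0] + [0.0] * k
    for _ in range(n):
        nxt = [0.0] * (k + 1)
        for j in range(k):
            nxt[0] += state[j] * 0.5      # tails: the streak resets
            nxt[j + 1] += state[j] * 0.5  # heads: the streak lengthens
        nxt[k] += state[k]                # a run, once seen, stays seen
        state = nxt
    return state[k]

for n in (100, 1_000, 10_000, 100_000):
    print(n, round(prob_run(n, 10), 4))  # creeps towards certainty as n grows
```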
The probability of life emerging on earth or the remarkable
coincidence of the parameter values on which the evolution of
an inhabitable world depends are other examples we shall meet
later. Only rarely, with such artificial events as coin tossing,
is one likely to be confident in assuming independence.3

3 The late William Kruskal, a Chicago statistician, also had a serious interest in theology and he clearly recognised the pitfalls of ignoring independence. His paper on 'Miracles and statistics' was subtitled 'The casual assumption of independence' (Kruskal 1988).

CONDITIONAL PROBABILITIES
One of the most common traps for the unwary is in fail-
ing to notice that all probabilities are conditional probabilities.
This means that the numerical value of a probability depends
on what we already know about the event or proposition in
question. The reason that this rather obvious fact is so often
overlooked is that the conditioning circumstances are often
'understood'. For example, we talk about the probability that
a coin falls heads without feeling the need to specify all the
circumstances of the toss because they do not, as far as we
can see, affect the outcome. The probability arguments used
in physics are often of this kind, and the notation used makes
no mention of such irrelevant factors. But many of the falla-
cious probability arguments we shall meet in the next chapter
fail by overlooking this elementary fact. If a doctor is try-
ing to diagnose the cause of a patient's symptoms, then it is
obvious that the probability of a particular diagnosis depends
on how much the doctor knows about the patient. Tests will
be carried out to clarify the position and as the results come
in so will the probability of the possible diagnoses change.
In order to make this explicit, and to keep track of what is
going on, it is necessary to introduce a notation which will
incorporate all the relevant information. All that the reader
will have to cope with is a new notation which is no more
than a convenient shorthand. Probabilities will be written as
follows:
P(A given B).
The P stands for probability and the A and the B within the
brackets denote the two things we need to know to spec-
ify a probability. A is the thing (event or proposition) in
whose probability we are interested. The B specifies the con-
ditions under which the probability is to be calculated (what
is assumed to be known). Thus, for example, we might be
interested in the probability that someone who is now forty
years of age would still be alive at seventy. This obviously
depends on a variety of things about the person. So B might,
in this case, specify male or female, and the numerical value of the
probability will depend on which is chosen. Again, we might
be interested in the effect of smoking on the probability. Thus
we would want to calculate,
P(40-year-old alive at 70 given that they are male and a smoker).
The conditioning event in this case is a compound event
depending on two things about the subject. In general, the
conditioning event may be as complicated as is necessary. We
would not ordinarily include in B anything which we knew
to have no effect on the probability. Thus although we could
write
P(head given six heads in a row already)
we would not usually do so because, in independent tosses, it
is generally recognised that the probability is unaffected by
what has happened prior to that point. The important point to
notice and to bear in mind, especially when we come to con-
sider particular probabilities in the next and later chapters, is
that every probability is calculated under specific assumptions
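
A toy calculation shows how the notation cashes out. All the counts below are invented for illustration; conditioning on B simply means restricting attention to the group B describes and taking proportions within it.

```python
# Invented follow-up counts for forty-year-olds, classified by sex and smoking.
counts = {
    ("male", "smoker"):       {"alive at 70": 350, "dead by 70": 650},
    ("male", "non-smoker"):   {"alive at 70": 700, "dead by 70": 300},
    ("female", "smoker"):     {"alive at 70": 450, "dead by 70": 550},
    ("female", "non-smoker"): {"alive at 70": 800, "dead by 70": 200},
}

def p_alive_given(sex: str, smoking: str) -> float:
    """P(alive at 70 given sex and smoking status): restrict attention to
    the conditioning group B, then take the proportion alive within it."""
    group = counts[(sex, smoking)]
    return group["alive at 70"] / (group["alive at 70"] + group["dead by 70"])

print(p_alive_given("male", "smoker"))        # 0.35
print(p_alive_given("female", "non-smoker"))  # 0.8
```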