Epistemic Irony in Philosophical Narrative.
Dreher, John H.
Introduction
Philosophy is a strange subject in that its name suggests that it
isn't a subject at all, but rather an affection: the love of
wisdom; and yet, philosophy is also viewed as the parent of all
theoretical knowledge. What then is the relation between wisdom and
knowledge? Perhaps it is right to insist that the wise distinguish what
they know from what they do not know. Broadly, the purpose of this
project is to explore this suggestion.
Socrates was the first Western thinker to maintain something like
the view that philosophers can distinguish what they know from what they
do not know. In Apology Socrates reports that he consulted one who had a
reputation for his wisdom, but that "in the process of talking with
him and examining him," Socrates was driven to the conclusion that
his companion wasn't wise at all. In fact, Socrates congratulates
himself, because although he thinks that he doesn't know anything
much worth knowing, at least he doesn't think that he knows
something worth knowing.
So, I left him, saying to myself as I went away: Well, although I
do not suppose that either of us knows anything really worth knowing, I
am at least wiser than this fellow--for he knows nothing, and thinks
that he knows: I neither know nor think that I know. In this one little
point, then, I seem to have the advantage of him. (1) (Plato/Jowett,
Apology, 21df, p. 345)
Ironically, Socrates seems to imply that there is at least one
thing worth knowing, and that is knowing that one doesn't know
anything (else?) worth knowing. Irony is achieved by using a fragment of
language to assert or designate the exact opposite of its literal
meaning. I suppose, if anything is bad news, it is that no one really
knows anything worth knowing.
The ironic response to Socrates' narrative surely is the
current cliche: "Good to know."
The Socratic Paradox: On a Grand Scale
The above passage from Apology has given rise to a series of
puzzles collectively designated as the "Socratic Paradoxes."
Indeed, if one really knows nothing worth knowing (or perhaps
doesn't know anything at all) but thinks that one knows, one must
be seriously mistaken about what knowledge is or what it takes to get
it. Embedded here is a crucial distinction: between being aware of a given proposition while not knowing whether it is true, and believing that there might be (or even must be) some factor relevant to knowing that proposition while not knowing what that factor is, or else what to make of it. Issues of this sort arise in epistemology and metaphysics, in
natural science, in analysis of personal introspection and in the
statistical analysis of significant empirical correlations.
Newton's successors knew that the "wobble" in the
precession of the perihelion of Mercury's orbit around the sun
posed a threat to Newton's own unified account of terrestrial and
celestial motion. One might attribute the deviation to some sort of
intervention by God or perhaps to the gravitational attraction of a
hitherto unobserved object (which was tentatively named
"Vulcan." during the 19th century). Yet for over 200 years no
one suspected or could have even conceived that the explanation for
Mercury's misbehavior lay hidden in the presupposition of Newtonian
science that space is "absolute." Unfortunately for Newton,
Einstein demonstrated that the geometry of space depends upon the
presence of mass. Although space emptied of all mass would be Euclidean,
actual space is Riemannian, meaning that Euclid's Fifth Postulate
is false of actual space, where straight lines within a plane do not
have parallels. It is therefore the "deformation" of space due
to the mass of the sun that accounts for the "wobble" in the
precession in the perihelion of Mercury. (2)
Sometimes the impossibility of knowledge of specific facts is a
consequence of scientific theory itself. For example, a consequence of
the Special Theory of Relativity, which treats of inertial (non-accelerating) frames, is illustrated by Minkowski's famous space-time diagram, which shows that space and time are not independent and that, as a consequence, much of the universe is inaccessible to us. (3) Another obvious example comes from Heisenberg, who demonstrated that it is impossible to determine simultaneously, with arbitrary precision, both the momentum and the position of certain small entities, like electrons. (4) More recently Stephen Hawking
suggested that despite evidence to the contrary, "information"
apparently lost as events are gobbled up by black holes is nevertheless
stored (somehow) at the event horizon itself. (5)
These examples are extremely important from a philosophical point
of view because they show that even the most certain theories can turn
out to be false. The most important of these cases mentioned above is
Euclidean geometry. For two thousand years philosophers thought that
Euclidean geometry was the paradigm of a successful theory of the
structure of physical space. Descartes, for example, thought that any
theory as certain as Euclidean geometry was surely beyond doubt. The
fact that the finest minds were mistaken about the structure of
physical space ought to give us pause. We all believe the deliverances
of the best minds of our time, but mightn't they be mistaken?
Perhaps we do not have a reason now to doubt the great theories of our
time, but that might be only because we are unaware of pertinent data.
All this is to say that we cannot assess the epistemic importance of
information that we do not have, and, by hypothesis, we do not know just
what information that might be.
Worries about mathematical physics are troubling enough, but there are still grander questions. The questions of theoretical science concern particular theories; the grander questions challenge our very capacity to know. Irony arises at this level as well, most explicitly and famously in Descartes, Pascal and Hume. Descartes wonders
whether all putative knowledge is undermined by an evil force so that
even the most certain of beliefs systematically mislead us about the
world and even about ourselves. Only God, Descartes insists, can
extricate us from systematic doubt. (6) Yet as most commentators have
insisted, we can hardly know that God rescues us from doubt without
knowing that God exists, and we can hardly know that God exists unless
we already know that God has given us reliable tools that will rescue us
from doubt about our powers to know.
One way in which Descartes was misled (indeed the most important
way in my view), is that Descartes assumed that mathematics is the
unshakeable foundation of physics. According to Descartes we clearly and
distinctly perceive the attribute of extension, which is the essence of
matter, and therefore the mathematical truths about extension must
describe the physical space in which matter is located. So, for
Descartes, whatever is necessarily true of extension applies, mutatis
mutandis, to matter and the space it occupies. Since we "know"
that Euclidean geometry is necessarily true of extension, it must be
necessarily true of physical space as well. As we have seen above,
however, Einstein successfully argued that matter (though extended) and
the space that it occupies are not Euclidean, but rather are
Riemannian--meaning that straight lines in the neighborhood of mass do
not have parallels.
Pascal, who was reasonably threatened (intimidated?) by thoughts of
the "infinite," accepts the idea that the grand questions of
philosophy are imponderable.
Pascal claims that it is incomprehensible that God should exist,
and incomprehensible that he should not; that the soul should exist
in the body, that we should have no soul; that the world should be
created, that it should not; that original sin should exist, and
that it should not; etc. (7)
Yet, Pascal concludes that one inescapably lives either as if God exists or as if God does not. Arguing that the payoff for a correct bet on the
existence of God is eternal bliss and that an incorrect bet against the
existence of God is eternal damnation, Pascal concludes that the only
wise course is to take a chance and go "all in" for God. (8)
There have been many objections to Pascal's line of reasoning
about how to wager assuming that one "must" wager. Do we
really know that the consequence of an incorrect choice is eternal
damnation and of a correct choice eternal bliss? Isn't that very
claim also open to Socratic doubt? Perhaps the right conclusion
is that we just do not know how to choose. Perhaps Pascal concedes as
much inasmuch as he insists that we "know" of God's
existence only through faith. For many, however, that is just another
way of saying that we really do not know at all; perhaps that is what
Socrates would have said in response to Pascal.
Hume argues famously that our capacity to make simple predictions
is flawed in that predictions presuppose that past and present are
connected or related in a way that assures us that the future will
continue to resemble the past. But knowledge that the future will
resemble the past can be inferred from past successes only if we already
know precisely what we do not know, which is that the future will
continue to resemble the past. There is an additional and even deeper
problem, which is that we cannot ever be certain that we have all the
evidence that is pertinent to a given knowledge claim or even that what
has counted as pertinent evidence in the past will continue to be
counted as pertinent evidence in the future. To summarize: We do not and
cannot know that we have all the pertinent evidence to support a claim
about the future, but even if we did know that in the past we had all
the pertinent evidence, we could not infer that what was then pertinent
evidence is still pertinent nor could we infer that facts that were not
pertinent in the past would not become pertinent. (9)
The Retreat to Probable Knowledge and Statistics
Perhaps all this goes to show only that our assertions need to be
qualified in order to count as certain knowledge. Perhaps we cannot
know, for example, whether or not we shall die this year, but perhaps we
can know the probability that we will die this year. However, as we
shall see, claims to know the probability that an event will occur are
just as problematic as knowledge claims. Indeed, probabilistic claims
are uncertain for the very same reasons that unqualified,
"unhedged" knowledge claims are problematic.
What we learn from statistics is not the probability of a given
event but rather the probability of a given event on certain evidence.
In order to explore this idea, I am going to refer extensively to an
excellent example presented by James Joyce in the Stanford Encyclopedia
of Philosophy. (The passage is fairly long, and it is reproduced for the
reader's convenience at the end of this paper under the heading
"Appendix ").
Joyce begins his example and analysis by referring to an
"unconditioned probability." (10) Our goal in the example is
to calculate the probability that "J. Doe" died during the
year 2000. Our initial information comes from the U.S. Census Bureau,
according to whom "roughly" (my emphasis) 2.4 million of the
275 million American citizens alive on January 1, 2000 died during the
year 2000. According to Joyce, we might reasonably conjecture that the
probability that J. Doe was one of those who died during 2000 is 2.4
m./275m., which is .00873.
Joyce calls this "probability" an "unconditional
probability," and is absolutely right to do that. The reason is
that the unqualified form of the resulting probability statement is:
P(H) = .00873. (11) This is going to be the starting point for our
analysis of the probability that J. Doe died during 2000. As Joyce
implicitly acknowledges by qualifying his data report by the word
"roughly", the calculation is probability not absolutely
accurate. (12) So, if we hope to have absolutely certain knowledge of
probabilities, we'll need to concede at the very outset that we are
not dealing with input data that are absolutely certain. Therefore, the
resulting conditional probability calculation will not be absolutely
certain. Yet, perhaps this is all too quick. Perhaps someone will argue
that lacking confidence that a calculation is precisely correct is
pretty much the same as being absolutely certain that it is roughly
correct. Either way, however, belief in the present case that the
unconditioned probability that J. Doe died during 2000 is .00873 is
fallible.
That unconditioned initial probabilities are uncertain is also
illustrated by the fact that many introductions to the subject, unlike
the one by Joyce, begin with examples that are contrived. For example:
There are three bags of marbles; two are filled with fifty red marbles
each; the other is filled with fifty black marbles. The marbles are emptied from the bags into an empty urn after each of the 150 marbles has been uniquely
numbered from 1 to 150. What is the probability that #29 is red? We are
inclined to say that it is certain that the unconditioned probability is
100/150 = .66667. It is tempting to think that here we have conquered
the unknown, but that is not true. Someone might have miscounted the
marbles in the bags or perhaps didn't get all the marbles back in
their proper bags after numbering them. The point is that we often
generate our unproblematic unconditioned probabilities by ignoring all
the facts that might undermine the calculation. It would be better, I
think, to qualify unconditioned probabilities by saying that they are
unconditioned probabilities for all we know or suppose.
It must have occurred to the reader that in the previous case, it
is natural, indeed essential to ask about how the marbles were numbered.
The example really presupposes that the numbering was random. But
suppose that one bag of red marbles was numbered from 1 to 50; the
blacks were numbered from 51 to 100 and the third bag of reds was
numbered from 101-150. In that case, the probability that #29 is red,
given the information about the numbering process and the other
pertinent information about the case, is 100%. So, if you don't
know about the numbering process, you don't know that it is certain
that #29 is red. What you know is that #29 is red for sure on the
condition that the numbers were inserted as described and that the
marbles were deposited from the three bags into an empty urn.
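To see the flip concretely, here is a minimal sketch (in Python, not part of the original example), assuming the hypothetical numbering scheme just described: the reds numbered 1-50 and 101-150, the blacks 51-100, and all 150 marbles emptied into one urn.

```python
# Minimal sketch of the marble example under the hypothetical numbering scheme described above:
# reds are numbered 1-50 and 101-150, blacks 51-100, and all 150 marbles sit in one urn.
colors = {n: ("red" if n <= 50 or n >= 101 else "black") for n in range(1, 151)}

# The "unconditioned" probability that a randomly drawn marble is red.
p_red = sum(1 for c in colors.values() if c == "red") / len(colors)
print(round(p_red, 5))  # 0.66667 -- 100 of the 150 marbles are red

# Conditional on knowing the numbering scheme, the probability that #29 is red is simply 1.
print(1.0 if colors[29] == "red" else 0.0)  # 1.0
```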
The only unconditioned probability calculations that are arguably
absolutely certain are those that are purely abstract. Consider the
integers from 1 to 100. What is the probability that a given integer
from that group is also in the group from 1 to 40? The answer is clearly
40/100 = .4. Here there isn't room for error because all the relevant information (the "givens") is itself certain.
But these are just the cases where statistical analysis is unnecessary.
All probability calculations of this sort are "a priori,"
which is to say independent of contingent facts. (13,14)
Whatever we think of the unconditioned probability that J. Doe died during 2000, it is obvious that there are factors that might affect the probability of J. Doe's death in 2000. Joyce selects an obvious one, which is age. According to the U.S. Census Bureau, of the 16.6 million American citizens 75 or older, 1.36 million died during the year 2000. Now, suppose that we complicate our calculation in this way: Let E represent the information that J. Doe was a senior. What is the probability that J. Doe died during 2000 on the condition that J. Doe was a senior, that is, on the condition E? Now the probability that J. Doe was a senior in 2000 is P(E) = 16.6m./275m. = .06036, and the probability that J. Doe was a senior who died during 2000 is P(H & E) = 1.36m./275m. = .00495. The probability of H on condition E is written as P_E(H):
P_E(H) = P(H & E)/P(E) = .00495/.06036 = .082 (15)
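The arithmetic can be checked with a minimal sketch (in Python; the counts are those quoted from Joyce, and the variable names are mine):

```python
# Joyce's figures for the year 2000, as quoted above.
population    = 275e6    # American citizens alive on January 1, 2000
deaths_total  = 2.4e6    # of whom roughly this many died during 2000
seniors       = 16.6e6   # citizens aged 75 or older
senior_deaths = 1.36e6   # deaths among seniors during 2000

P_H       = deaths_total / population   # unconditioned probability of death: ~.00873
P_E       = seniors / population        # probability of being a senior:      ~.06036
P_H_and_E = senior_deaths / population  # probability of being a senior who died: ~.00495

P_E_of_H = P_H_and_E / P_E              # P_E(H), the senior mortality rate: ~.082
print(round(P_H, 5), round(P_E, 5), round(P_H_and_E, 5), round(P_E_of_H, 3))
```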
It may seem that the calculation of P_E(H) is certain because it is a mathematical certainty. (Well, keep in mind that whether or not it is a mathematical certainty depends upon the skill of the people who perform the calculation.) Even if P_E(H) is a mathematical certainty, its certainty depends upon the prior calculations of P(H & E) and P(E), and those calculations are themselves based upon data that may be imprecise or actually false.
The idea of conditional probability is made more precise by Bayes Theorem. In order to understand Bayes Theorem in terms of Joyce's example, we need to distinguish P_E(H) from P_H(E). P_H(E) is the probability that J. Doe was a senior on the condition that he or she died during 2000. It is calculated as
P_H(E) = P(H & E)/P(H) = .00495/.00873 = .57.
As Joyce observes, P_H(E) is the "inverse" probability that J. Doe was a senior given that he or she was among the American citizens who died in 2000. This number tells us that 57% of those who died among the total population of American citizens were seniors. The previous calculation of P_E(H) tells us that 8.2% of the senior population died in 2000, and hence the probability that J. Doe died during 2000 on the condition that he or she was a senior is 8.2%.
The calculations P_E(H) and P_H(E) make sense and are intuitively correct, but of course it is always possible that a conceptual error has crept into the calculation. More importantly, however, the air of certainty encouraged by the mathematics disguises the fact that our data might be inaccurate. Even more importantly, if
we want to know the probability that J. Doe died in 2000, there will be
many other variables that are relevant; for example, whether J. Doe is
male or female; whether J. Doe smoked tobacco, and if so how frequently and for how long; whether J. Doe was a consumer of alcoholic beverages or narcotics; whether J. Doe had genetic disorders; and so on. The
calculations of conditional probabilities will need to continue on and
on in order to accurately assess J. Doe's chances. And most
importantly of all, we will never really know whether or not we have
captured all the salient facts about J. Doe. So, while it looks as
though we can be certain of what is probably true, assertions of
probable truth seem to be open to the same worries that we have
encountered previously concerning unqualified apodictic assertions.
Error can creep in at any point, whether it is a mathematical error or a
conceptual confusion or corrupt data.
As Joyce continues, there is an important relation between a conditional probability and its inverse. (16) We have already seen this relation but perhaps have not noticed it: P_E(H) is the inverse of P_H(E). This matters because the relation between a conditional probability and its inverse was captured by Thomas Bayes and is known as Bayes Theorem. It states:
P_E(H) = [P(H)/P(E)] P_H(E)
Recall that
P_E(H) = .082
P_H(E) = .57
P(E) = .06036
P(H) = .00873.
Thus, Bayes Theorem tells us that
P_E(H) = (.00873/.06036) * .57, where .00873/.06036 = .1446, and .1446 * .57 = .082, which is the value of P_E(H).
Intuitively this equation tells us that the ratio of the number of deaths in the total population to the number of seniors in the population is the same as the ratio of the probability that J. Doe died given that he was a senior to the probability that J. Doe was a senior given that he died. In other words:
P_E(H)/P_H(E) = P(H)/P(E).
Showing that this equation is true is a simple calculation:
P_E(H)/P_H(E) = .082/.57 = .144.
and
P(H)/P(E) = .00873/.06036 = .145, allowing for rounding errors.
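A minimal sketch (in Python, again assuming the counts quoted from Joyce) confirms both Bayes Theorem and the ratio form used here:

```python
# Verify Bayes Theorem and its ratio form with the counts from the example.
P_H       = 2.4e6 / 275e6    # ~.00873
P_E       = 16.6e6 / 275e6   # ~.06036
P_H_and_E = 1.36e6 / 275e6   # ~.00495

P_E_of_H = P_H_and_E / P_E   # P_E(H) ~ .082
P_H_of_E = P_H_and_E / P_H   # P_H(E) ~ .57

# Bayes Theorem: P_E(H) = [P(H)/P(E)] * P_H(E)
assert abs(P_E_of_H - (P_H / P_E) * P_H_of_E) < 1e-12

# Ratio form: P_E(H)/P_H(E) = P(H)/P(E)
print(round(P_E_of_H / P_H_of_E, 4), round(P_H / P_E, 4))  # both ~0.1446
```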
Joyce and many others think of Bayes Theorem as an obvious truth,
and even a mathematical triviality. I suppose that this is so, but it is
important to remember that for over two thousand years virtually
everyone thought that Euclid's Fifth Postulate (that there is
exactly one line parallel to any given straight line through a point
outside the line but on its plane) is trivial and obvious--until Gauss,
that is, who introduced differential geometry and thereby enabled us
"to define the curvature of a surface at a point" (17) In this
way Gauss paved the way for the non-Euclidean geometries developed by
Lobachevski and Riemann. (18)
I suppose that someone will say that the foregoing reinforces and
elaborates Disraeli's famous quip that there are lies, damn lies
and statistics. Yet, nothing could be further from the truth.
Statistical analysis has done more to sharpen our understanding of
probability than any other tool. The point of this paper is not to
criticize, much less to denigrate, a subtle, ingenious and essential
tool in identifying the ways in which events are correlated with each
other. The point is simply to emphasize the overlooked (or at least underemphasized) fact that merely qualifying a prediction by saying that it is "probable" or "very probable," even after the most rigorous statistical analysis, is not sufficient to remove any possible doubt about its probability, whether unconditional or conditional. The possibility of errors in reasoning, of falsely characterizing data, and of miscategorizing true data is ever present.
When it comes to statistics and the analysis of probability, we
just don't know what we do not know.
Still More Worries, on a "Smaller" Scale
It is wrong to assume that "grand" ironies are the ones
that affect us most deeply. Knowing one's own position in life is
not a grand issue, but that does not make it less important. Oedipus and
Jocasta were in the dark about who they were. Had they known their
identities, their lives would not have ended in tragedies that not only
engulfed them but also were visited upon the next generation. Indeed,
Oedipus was determined to avoid the fate that the oracle said was
inevitable, and tragically it was his very effort to avoid the
inevitable that resulted in the inevitable. It is indeed difficult to
know just what the moral of the story is meant to be. Sometimes things
don't work out no matter how careful we are, and sometimes it is
the very care that we take that undermines us. Yet, sometimes tragedy
multiplies itself as it does in the Theban tragedies, which end in the
deaths of three of four of Jocasta's children. The only surviving
child is Ismene who lives a cautious and conventional life, leaving
fighting to men; as she says: "We are women and do not fight with
men." (19) One of the lessons of the Theban tragedies is humility.
The one who survives is not the one who tries too hard or takes herself
to seriously.
In her own way, Ismene is humble, and I believe that one of the
lessons of the Theban tragedies is humility. This is the conclusion that
Sophocles draws as he ends Antigone, in the very the last lines of the
tripartite tragedy:
Wisdom is supreme for a blessed life,
And reverence for the gods
Must never cease.
Great words, sprung from arrogance,
Are punished by great blows.
So it is one learns, in old age, to be wise. (20)
Human knowledge and strength are limited, and it is only hubris
that makes us confident that we shall succeed where all others have
failed.
The case of Emma and Charles Bovary is comical as well as tragic.
The problem is not that they were determined, like Oedipus, to avoid an
inevitable, tragic outcome. Their failing was much simpler; they were
simply unaware of the dreadful fate that they tempted. Charles was completely unaware of his wife's dissatisfaction with her life and with their life together; indeed, it did not even occur to him that he might not know
his wife at all. Unfortunately for Charles, Emma felt imprisoned by her
life of bourgeois complacency; as a nineteenth century Justice Ginsburg
might have said: neither perched on a pedestal nor recumbent upon a soft couch, but rather imprisoned within a cage. Emma longs for
accomplishments of her own and a life of glamour and adventure, not the
lazy comforts of a pretty little house in the French countryside.
Charles's pathetic ignorance of his wife's undernourished identity persisted even beyond her death. It simply never occurred to him that his wife could have wanted more than he could provide. After
all, what could a woman want more than her own secure home, in a
comfortable place, respected by her neighbors and loved by a
"successful" husband? Perhaps an identity of her own?
Emma too was in the dark insofar as she expected liberation from
Charles to lead automatically to a life of excitement and fulfillment.
Music is a hard master, and it is easy to think that one can play like
the masters until one actually takes lessons and tries hard. More
importantly, Emma's life of extravagance was not based upon her own
modest achievements or a realistic assessment of her prospects away from
Charles' secure home. As she took on more and more debt and became more and more desperate, she turned to a former lover, Rodolphe, in an attempt to renew their love. At first, he responded ardently, but when it came time for him to make a life for her, he thought better of it. As he contemplated the inevitable burdens of a life with Emma, he concluded: "And besides, the worry, the expense! Ah no, no, no, no,
a thousand times no." (21) It is as though Flaubert wryly observes
that there is nothing that so quickly chills a renewed love affair more
than a request for financial support.
It might be that there is an Iago ready to worm his (or her) way
into every intimate relation, but there are similar worries about any
friendship or professional association. The thought of not knowing what
is really in the hearts and on the minds of others is the rot in which
suspicion grows. From an emotional point of view, it affects us more
deeply and tragically than the arcane doubts that undermine our
confidence in theoretical science or even reason itself. Who, at one
time or another, hasn't been an Othello? And then there are those
who think that they can live again. Goethe's Faust longs for
renewed love, but instead he only breaks his beloved's heart. His
efforts to recover in acts of grotesque sensuality and vain searches for
the extraordinary lead him instead to the perfectly ordinary:
engineering land reclamation, which is a project that an old man of
common sense might have undertaken in the first place. No fool like an old fool, they say; but apparently the saying comes only to the minds of those who already know.
These and innumerable other examples that could be offered show
that in our relations with others and more importantly our relations
with ourselves, true motives are often obscured, or misconstrued, or are
obsessions determined to save us from the very truth that would save us.
All these forms of ignorance (whether setting aside the important, or ignoring what others see, or mischaracterizing one's own intentions in compulsive self-deception) are epistemological failures.
We ought to know better, but we don't; and even when we do, it is
often too late. Introspective knowledge is perhaps the least secure of
all. There is no hope there from deductively valid reasoning or from
subtle statistical analysis. The truth would be worth knowing, but a
lesson of tragic literature is that truths worth knowing are often
beyond us.
Public Policy
It is difficult to think of an area of human endeavor in which it is more important than in public policy to take care not to act in or from ignorance. It was Donald Rumsfeld, former Secretary of Defense of the
United States, who emphasized the problem of acting on unknowns. The
issue arose at a NATO conference in Prague during 2002. The purpose of
the meeting was to determine NATO's response to global terrorism.
Secretary Rumsfeld was worried especially about difficulties in planning
that seem to require intelligence that NATO did not have. The principal
issues that arose concerned terrorism and weapons of mass destruction.
The following is a part of the news conference:
Q: Regarding terrorism and weapons of mass destruction, you said
something to the effect that the real situation is worse than the facts
show. I wonder if you could tell us what is worse than is generally
understood.
Rumsfeld: Sure. All of us in this business read intelligence
information. And we read it daily and we think about it and it becomes,
in our minds, essentially what exists. And that's wrong. It is not
what exists.
I say that because I have had experiences where I have gone back
and done a great deal of work and analysis on intelligence information
and looked at important countries, target countries, looked at important
subject matters with respect to those target countries and asked, probed
deeper and deeper and kept probing until I found out what it is we knew,
and when we learned it, and when it actually had existed. And I found
that, not to my surprise, but I think anytime you look at it that way
what you find is that there are very important pieces of intelligence
information that countries, that spend a lot of money, and a lot of time
with a lot of wonderful people trying to learn more about what's
going in the world, did not know some significant event for two years
after it happened, for four years after it happened, for six years after
it happened, in some cases 11 and 12 and 13 years after it happened.
Now what is the message there? The message is that there are no
"knowns." There are things we know that we know. There are
known unknowns. That is to say there are things that we now know we
don't know. But there are also unknown unknowns. There are things
we don't know we don't know. So, when we do the best we can
and we pull all this information together, and we then say well
that's basically what we see as the situation that is really only
the known knowns and the known unknowns. And each year, we discover a
few more of those unknown unknowns.
It sounds like a riddle. It isn't a riddle. It is a very
serious, important matter.
There's another way to phrase that and that is that the
absence of evidence is not evidence of absence. It is basically saying
the same thing in a different way.
Simply because you do not have evidence that something exists does
not mean that you have evidence that it doesn't exist. And yet
almost always, when we make our threat assessments, when we look at the
world, we end up basing it on the first two pieces of that puzzle,
rather than all three. (22)
I have reproduced Secretary Rumsfeld's remarks because I think
that they are of interest right now in light of worries about the spread
of nuclear weapons. Rumsfeld's reply is also interesting in that it
illustrates the complexity of Socrates's worries and the
awkwardness of even describing them. There are, as Rumsfeld says,
"the known knowns, but there are also unknown unknowns." They
are tricky because we don't know them, but "each year we
discover a few more of those unknowns." The secretary closes with
an interesting epistemological insight, where he asserts "because
you do not have evidence that something exists does not mean that you do
have evidence that it does not exist." That obviously applies only
to the happy case in which you are aware of possible evidence, but then
there are the unhappy cases in which you do not have any idea of what
you are looking for, and yet the unhappier case in which you are simply
unaware that there is anything to look for at all.
It is simply impossible to exaggerate the consequences of the
inadequate intelligence concerning weapons of mass destruction at this
juncture. The decision that was taken to go to war in Iraq over the
issue of weapons of mass destruction crucially depended upon the
existence of those weapons. And that, unfortunately, was uncertain. Many
believed that President George W. Bush deliberately overstated the case
for war. Many even attributed bad motives to the president. Perhaps they
were right. But it is also possible that the president thought that the
consequences of a wrong decision would be so horrific that he really had
no choice but to go to war on the evidence (or perhaps suspicion) that
he had.
This case is especially difficult because we do not have a group of
cases that are very similar, and so we cannot mimic the earlier
calculation about the probability of the death of J. Doe in 2000. In
that case we could count the number of individuals with a certain
property and distinguish them by that property from the whole. This is
what happened in the case concerning the deaths of seniors. (23) Cases
like those concerning the development of nuclear weapons as well as the
personal stories developed from great literature lend themselves to subjectivist accounts of probability. What makes an account more
subjective is the subjectivity of the initial "unconditioned"
probability. In the examples of the senior citizens, we can simply find
the ratio of seniors to the total population by counting. Sure, we might
miscount; we might leave some out, we might not really know whether or
not an actual person is an American citizen, but even so, there appears
to be a more or less objective standard to be applied in calculating the
initial unconditioned probability. In the one-off cases, our initial
probability is more speculative. In fact, if we cannot compare a part of
a set to the whole set, just how are we to assign a value to a
probability function when we consider the more subjective cases?
This leads us to consider a more subjectivist account of
probability. On this account, probability assignments are not meant to represent the world (e.g., how many senior citizens will die next year), but rather are meant to represent a willingness to bet.
Here the magnitude as well as the "direction" of the bet needs
to be taken into consideration. Obviously, President Bush and Secretary
Rumsfeld attached great importance to the matter of weapons of mass
destruction. The negative value of the presence of those weapons led both officials to assign a huge negative expected value to the possession of nuclear weapons by Iraq, which justified a very large bet (the "ante"), that is, the cost of conflict, even at a very low probability of success. That meant, as President Kennedy might have said, that it
would be rational to "bear any burden and to pay any price" to
disarm the adversary.
To be sure, a decision to place a bet is not always an indicator of
uncertainty. Suppose, for example, that we are placing bets on which
card will be randomly drawn next from an ordinary deck of 52 cards. The
probability that a Heart will be drawn is 1/4 because there are 13 hearts in every deck of 52 cards. Thus, we actually know that a fair bet is 1:4 (one dollar staked to four dollars returned, including the stake). If we put up one dollar, and a Heart comes up, we should expect to receive four dollars (the one we put up plus three more). (24) Someone taking the other side, who puts up three dollars, should expect only one dollar (plus the original three put up) if a non-Heart comes up, since the probability that a non-Heart will be drawn is 3/4 (because there are 39 non-hearts in a deck of 52 cards). On the other hand, if a non-Heart
is drawn, we should expect to lose our dollar.
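The sense in which this bet is fair can be put in a one-line expected-value check, a minimal sketch (Python) under the stated assumptions of a good deck and a fair draw:

```python
# Fair bet on drawing a Heart from a standard 52-card deck.
p_heart = 13 / 52                # 0.25
stake, net_winnings = 1.0, 3.0   # put up $1; if a Heart comes up, receive $4 back ($1 stake + $3)

expected_value = p_heart * net_winnings - (1 - p_heart) * stake
print(expected_value)  # 0.0 -- neither side gains in expectation, so the bet is fair
```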
This reasoning depends upon the assumption that the deck of cards
is good and that the draw is fair, that is, that each card has an equal
chance of being drawn. That, of course, is quite an assumption, and we
are loath to take it for granted. However, given that the assumptions
are correct, the probability calculation is certain, and we can actually
know whether a bet is fair or whether it is to our advantage or
disadvantage. In this context, although we are thinking of probability
on a subjective basis, as indicated by a willingness to bet, the
probability calculation is merely a matter of counting, and not a matter
of chance, even though the card actually drawn is a matter of chance.
It is useful to consider probability as a measure of willingness to
place a bet where the probability of an outcome cannot be determined as
a matter of ordinary mathematical calculation. Cases of this sort might
arise in card games as well. Suppose that we are playing Blackjack (also called "21"). In this case we bet on whether or not the sum of a series of cards drawn from a fair deck (or usually four fair decks) at random will come closer to 21 than the sum of the series of cards drawn
by our opponent. Now, as the game progresses, some cards are drawn and
discarded. So, if we could only keep them in mind, we would know how the
probability of drawing a given card, say the 3 of Hearts, changes after
each draw. However, the unfortunate truth is that most people cannot
keep all the drawn cards in mind, and even if they could, they could not
calculate the changes in probability fast enough to place a favorable
bet.
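As an illustration only (a sketch in Python, assuming a single fair deck rather than the four decks usually used), this is the sort of bookkeeping a perfect card-counter would have to perform after every draw:

```python
# How the chance that the next card is the 3 of Hearts changes as cards are drawn and discarded.
remaining = 52
three_of_hearts_seen = False

# Suppose ten cards have been drawn and discarded, and none of them was the 3 of Hearts.
for _ in range(10):
    remaining -= 1

p_next = 0.0 if three_of_hearts_seen else 1 / remaining
print(round(p_next, 4))  # 0.0238, up from 1/52 ~ 0.0192 before any cards were drawn
```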
How shall we understand a bet in these circumstances? Perhaps a bet is just a matter of irrational commitment, a thought that might well provoke the conclusion that honest labor really is better than gambling.
But never mind! Someone who places a bet in those circumstances is
acting upon incomplete information and a flawed understanding of the
circumstances in which the bet is made. Unfortunately, the decisions
made in the ordinary course of life, viewed as bets, are just like this.
So, the question that we all face is just how much to risk on the basis
of imperfect understanding. The subjectivist says that the probability
that is actually assigned by a person should be understood, for better
or worse, as a bet placed on the basis of incomplete information. The
broader philosophical point is that because there are always "unknown unknowns," every probability assignment,
except those that are purely abstract, should be viewed as a bet.
As Ian Hacking observes, during the nineteenth century, statistical
generalizations were thought to be "reducible" to
deterministic calculations of the probability of events. This is
essentially the "classical" model of statistical analysis, and
perhaps it is still the prevailing theory, which is illustrated by the
example concerning the probability of the death of J. Doe. However, by
the end of the nineteenth century, the reducibility of statistical
generalizations to deterministic models began to be questioned and
statistical "laws" were thought to capture regularities among
natural phenomena. (25) Thus the subjectivist model has gained favor
throughout the contemporary era. However, the subjectivist model is not
new. Colin Howson traces this understanding of statistical analysis back to David Hume and to seventeenth-century philosophy of science. Many
philosophers now have taken up the subjective approach implied or at
least anticipated by Hume. (26) In fact, this more subjective view of
probability can be traced even farther back, all the way to philosophers
like Pascal. (27) As we have seen, on this theory, rationality is
thought to be exhibited as a kind of betting.
The interpretation of even very simple subjectivist probabilistic
calculations can be misleading and must be undertaken cautiously. For
example, suppose that there are two research proposals, R and R*, that
promise to develop a new pesticide that will increase food production.
Suppose that R and R* each require $1 m. in research funding. (This is
the analogue of the "ante" in the previous example concerning
betting on cards.) Suppose that the research committee judges each
proposal to be equally promising. But the committee is convinced that if R works out, the estimated payoff is 4m, whereas if R* works out the estimated payoff is 5m. (In each case the original 1m is included in the
payout, just as "the ante" was included in the previous
example concerning cards.) Now, from these expectations it follows that a "fair bet" on R requires that the probability that R will be successful is .25. On the other hand, a bet on R* is fair if the
probability of a successful outcome is .2. Now the rational bet is R*,
which may initially seem to be counter-intuitive because it appears that
the rational bet is on the less probable outcome. Nevertheless, by
hypothesis R and R* are equally likely to succeed. And the previous
calculation does not undermine the hypothesis. The above calculation
shows that the committee can afford to take a greater risk on R* than on
R because the payout on R* is greater. The "probability"
deduced in the example from the ante and payoff is not a function of the
likelihood of the success of the research proposal but merely the
probability that is required to make the research grant a "fair
bet" on the funded research. That indeed is exactly what common
sense would expect: the higher the payout on the same investment, the
greater the risk that may be rationally accepted.
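The committee's arithmetic reduces to a break-even calculation; here is a minimal sketch (Python; the dollar figures are those of the hypothetical example above):

```python
# Break-even ("fair bet") probability: the stake divided by the total payout (stake included).
def break_even_probability(stake, payout):
    return stake / payout

print(break_even_probability(1_000_000, 4_000_000))  # R:  0.25
print(break_even_probability(1_000_000, 5_000_000))  # R*: 0.20 -- the larger payout tolerates the lower probability
```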
In this connection it is instructive to return to the example of the spread of nuclear weapons. In the case of the decision to go to war against Iraq, it is clear that Secretary Rumsfeld and President Bush thought that the expected value of a decision to go to war to prevent the spread of nuclear weapons was very high and that it justified an effort even if the effort had a low probability of success. In the event,
President Bush did not emphasize the possible cost, perhaps because
he thought that the effort was so important that it would be worth
pursuing even if the probability of success were very low. Yet, many
Americans were worried that the cost, both direct and indirect, might be
very high, and in the ensuing event they came to think that the costs
were very high. What might well have misled people about the decision to
go to war could very well have been a different assessment about what
its costs would be and whether or not a successful outcome could justify
those costs. That is why it pays to consider probable costs, benefits
and risks in explicit detail. In that way, it is more likely that
communication failures can be avoided. Nevertheless, critics may claim
that the Bush administration exaggerated the probability of a positive
outcome in order to justify the decision to go to war. Perhaps that would
explain why it is that so many people were outraged when in the end
weapons of mass destruction were not discovered and the cost of the war
was very high. It was not that he was mistaken; it was rather that he represented the probability of success that he believed would justify the war as the probability that the war would actually be successful.
There are many "everyday" examples of the ways in which
incomplete knowledge can undermine rational deliberation when it comes
to betting. We all know that driving while intoxicated increases the
risk of fatal accidents. Exactly how much the rate is increased for a
given person, however, appears to be imponderable. It is true that there
might well be statistics showing how many people on average who have a given level of blood alcohol cause fatal accidents while driving. Yet,
there are some who believe that they can "handle it." They
might attach a very small probability that they will cause a fatal
accident with the standard permitted limit of alcohol in them. When they
take the wheel, they are in effect betting their own lives (and the
lives of others) on their assessment that the risk they are assuming is
not significant. The probability they assign to a negative outcome in that case is significantly lower than the probability that would be calculated assuming that each person is equally affected by blood alcohol levels.
Absent this reasonable assumption, there is cognitive space for a
delusory, reckless driver to hide from the obvious conclusion that the
risk being assumed is too high given the usual aversion to death. But
some evidently believe, as reflected by the bet they take in getting
behind the wheel, that their risk is significantly lower than average.
(28)
Putting It All Together
The conclusion that appears to be warranted from all this is that
it is best to remain extremely cautious in making knowledge claims. And here, as everywhere, it is best to practice what we preach. Socrates
said that he neither knows nor thinks he knows anything worth much; but
how about that piece of wisdom itself? Wisdom is certainly worth
something (as Socrates implies) and so it would be worth knowing whether
or not one is wise.
We have observed that even many "time-honored" (or
perhaps merely persistent) "facts" of pure mathematics must be
qualified. Great theories of mathematical physics have fallen. Not only
are there non-Euclidean, purely abstract geometries, but physical space
itself is non-Euclidean. We have seen that our conception of nature at the atomic level has changed to the point that we no longer think that
every "object" has a definite position and momentum at any
given time. Perhaps that means that our very conception of reality at
the atomic level has been separated from the common-sense reality of
medium size physical objects.
Someone might have thought that we could have real knowledge if
only we are more cautious in characterizing that knowledge. Perhaps
knowledge is essentially statistical. The field of statistics is of
course immensely complicated. Some statistical analyses appear to be
more or less objective, like the analysis of the probability that J. Doe
died in 2000 based upon statistical information about American citizens
and the age distribution of deaths among them. Although in those cases the mathematics of conditional probability and Bayes Theorem makes sense,
we saw that the possibility of error always creeps in as we identify the
unconditioned probabilities that are refined, over and over, by adding
new conditions to our probability judgments.
Moreover, it appears that many probability calculations are very
subjective. When we decide to go to war, there doesn't seem to be a
place for statistics at all. Information about possible enemies is
unavailable, and momentous decisions are ultimately based upon
"unknowns," sometimes on nothing more than mere suspicion. It
just is not true that these cases are rare or are consequential only
when the decision affects millions, or perhaps everyone on Earth.
Individual tragedy often hinges upon misinformation or perhaps on total
ignorance of salient facts. How do we know what we don't know when
we are completely "clueless"? In all these cases, probable
calculations appear to be essentially bets. What makes a bet rational
depends upon the probability of a favorable outcome and the value of the
outcome. But when the value is immense (whether positive or negative),
very small probabilities are overwhelmed, and the resulting expected
values approach "infinity" or zero. Outcomes of bad bets are
often tragic, not because whoever made the bet is foolish (or evil) for
having made it, but rather because tragic outcomes occur despite our
best, good faith efforts.
Wisdom
Surely wisdom consists in at least recognizing and accepting what
we might call epistemological contingency. We can be fully confident in
our belief systems and schemes of rational choice only if we know that
their underlying assumptions are true, and that is exactly what Socrates
claims that the wise do not think that they know. Yet, that raises
another, even more difficult question. Just what does "fully
confident" actually mean? In particular, how are we to understand
the contrast between "fully confident" and "partially
confident," "somewhat confident," or "more or less
confident? More precisely, this is a request for an understanding of the
meaning of "probable" itself. As we have seen, one of the most
important ways of construing probability is to consider it in contexts
where we face choices and therefore a choice among various bets. (29)
This conception of probability assignments (that is, as bets) is to
be contrasted with classical models of probability that were illustrated
previously in calculating the probability of the death of a certain
person in a certain year. As previously noted, this is the type of
calculation that arises in fields like epidemiology. In addition to
these conceptions of probability, there are innumerable others. This
essay has only touched the surface of the subject. As we have also
observed, there are issues about knowledge that do not seem to involve
probabilistic thinking in an obvious way, like grand issues of
philosophy and mathematics and physics. Moreover, there are issues about
self-knowledge, which appear to involve "knowledge" gained
only by introspection, which is often augmented by psychoanalysis.
These disparate conceptions of deductive knowledge, introspective
knowledge and knowledge of probable outcomes tempt us to think that
knowledge is really contextual. What counts as knowledge depends upon the context of the question and the discipline that has been designed to answer it. Yet, even if "knowledge" is contextual, the
philosophical mind is drawn to the possibility that there is a
"broader," more encompassing conception of knowledge that can
unify all the others by providing a comprehensive account of the methods
that can be applied in disparate cases. In a way it would be a
"super-context," broad enough, with methods powerful enough,
to account for all the others.
I believe, however, that this vision is seriously misguided. It is
true that we yearn here and elsewhere for super-contexts, but a
super-context looks suspiciously like the context of all contexts. The
context of all contexts would define a method by which we could assess
the relative merits of all the various methods of coming to know.
Unfortunately, in order to make a case for the context of all contexts
we shall encounter the Cartesian dilemma. How shall the deliverances of
the context of all contexts be evaluated? If by its own standards, it
will surely be deemed to be question-begging; if not by its own
standards, then by which, since all the others are to be evaluated by
it? The idea that knowledge is fragmentary is offensive to the
philosophical mind. From Plato on we have tried to convince ourselves
that there is a special state of mind in which all judgments will be
validated. That state of mind has often been defined by its objects: in
Plato's case the forms; in Descartes' innate ideas; in
Kant's the pure categories of the understanding and the pure forms
of intuition. Yet, all these attempts are at least inconclusive because
they cannot reasonably take account of themselves and according to them,
there is nowhere else to turn. All this is unavoidably vague, but
perhaps it can be best understood as a return to pragmatism. Rather than relying on a single validating standpoint, each discipline focuses on methods that have proved
successful, and the ensuing division of labor strongly suggests the
conclusion that each discipline, usually by trial and error, finds its
own methodology and assesses its validity by its own methods. That is
perhaps the best that can be done, and although the best is not the
realization of all our hopes for objectivity, it has unquestionably
served us well enough--for more ideas about just how and how well, see
Romeijan. (30)
Toleration
Perhaps a final irony is that Socratic frustration might well turn
out to be one of humankind's greatest blessings. Justifying the
imposition of our will on others depends upon knowing what is good for
them in the circumstances in which they find themselves. We require
parents to provide medical care for their children, even if those same
parents are sure that God will look after their children better than
medical doctors. We require students to master physics even though we
know that physical theories come and go; even though we know that
mathematics describes abstract structures and not necessarily physical
space and time. Finally, when it comes to matters of the heart, the
unfortunate truth appears to be that we do not really know much of
anything at all, including our own hearts.
All this suggests that the wise course is not to proselytize. This
doctrine of toleration applies not only within families, to parents and
spouses with "big ideas" for those they love, but also to
large enterprises that rigidly enforce policies that discourage and even
undermine creativity. Even more importantly, following Montesquieu,
toleration must apply to cultures and nations with differing ideas about
what is good and about how best to govern themselves.
Aristotle distinguished intellectual virtue from moral virtue.
Intellectual virtue properly guides belief formation; moral virtue
properly guides our actions. Toleration, I suggest, is an example of a
virtue that is both intellectual and moral. Intellectual toleration
recognizes that beliefs that have been inviolate for centuries can
nonetheless be overturned. More importantly, intellectual toleration recognizes that the mind cannot be its own judge; it cannot determine whether what it thinks it knows it really does know, and it cannot determine whether or not it has taken all "unknowns" into account. This intellectual
modesty leads to moral tolerance. We can disagree with others about how
best to live and what to value, but that disagreement must be tempered
by the confession that nothing is so certain that it can be justifiably
forced upon others.
Even so, our conclusions about toleration must be cautious and
couched. One can go too far with toleration: To the point of tolerating
intolerance. Some things just cannot be tolerated, like genocide, which
perhaps is the ultimate act of intolerance. To be sure, there
aren't many absolutely clear cases of the intolerable, and the best
remedy for it is hardly ever obvious. Surely doing the least violence is
best, even if in the last resort it is sometimes unavoidable. After all,
it is one thing to say that we really do not know how best to live; it
is quite another thing to say that we might as well live up to someone
else's standards and allow them to dominate our lives. The more
important the issues, the more difficult it is to sort them out.
Good to know!
Summary and Conclusion
We began by reflecting upon the wisdom Socrates imparts to those
who had voted to put Socrates to death for corrupting their youth, for
teaching them to question values that their elders cherished.
Socrates' only possible defense, the one he made, is that he
didn't really teach the young anything positive at all; he only
taught them not to accept without question what others taught. Perhaps
we shall rediscover Socratic wisdom in our time. This analysis, as brief
and inadequate as it is, suggests that assessing our own epistemic
state, even when it is carefully couched as probabilistic knowledge,
must be cautious, if only because we cannot consider completely unknown
factors or factors that our methods cannot treat adequately. Our
conclusion is that all knowledge is tentative, including this knowledge.
Moreover, knowledge depends upon method, which in turn depends upon
context. The various branches of knowledge are fragmented, and there isn't likely to be a single standard that will emerge by which to adjudicate all their claims. However, all this leads to an intellectual
toleration that embraces moral toleration. Although moral toleration
cannot be coherently extended to the point of tolerating intolerance, it
constrains dogmatic judgments about the ways of other people and their
cultures. Some may feel liberated in this way; others perhaps sorely
disappointed to forego the unity of thought sought by great forebears
like Plato, Descartes and Kant.
Either way, we are left with Socratic wisdom: It is easy to think
we know when we do not know; as one might enigmatically say: We just do
not know what we do not know, but perhaps it is enough, therefore, to
know that it is best to tread lightly; to refrain from judging others;
to do no harm.
Good to know; really--I think!
Appendix
"The probability of a hypothesis H conditional on a given body
of data E is the ratio of the unconditional probability of the
conjunction of the hypothesis with the data to the unconditional
probability of the data alone.
Definition.
The probability of H conditional on E is defined as P_E(H) = P(H & E)/P(E), provided that both terms of this ratio exist and P(E) > 0. [1]
To illustrate, suppose J. Doe is a randomly chosen American who was
alive on January 1, 2000. According to the United States Center for Disease Control, roughly 2.4 million of the 275 million Americans alive on that date died during the 2000 calendar year. Among the approximately 16.6 million senior citizens (age 75 or greater) about 1.36 million died. The unconditional probability of the hypothesis that our J. Doe
died during 2000, H, is just the population-wide mortality rate P(H) =
2.4M/275M = 0.00873. To find the probability of J. Doe's death
conditional on the information, E, that he or she was a senior citizen,
we divide the probability that he or she was a senior who died, P(H
& E) = 1.36M/275M = 0.00495, by the probability that he or she was a
senior citizen, P(E) = 16.6M/275M = 0.06036. Thus, the probability of J.
Doe's death given that he or she was a senior is [P.sub.E](H) = P(H
& E)/P(E) = 0.00495/0.06036 = 0.082. Notice how the size of the
total population factors out of this equation, so that [P.sub.E](H) is
just the proportion of seniors who died. One should contrast this
quantity, which gives the mortality rate among senior citizens, with the
"inverse" probability of E conditional on H, [P.sub.H](E) =
P(H & E)/P(H) = 0.00495/0.00873 = 0.57, which is the proportion of
deaths in the total population that occurred among seniors.
Here are some straightforward consequences of (1.1):
* Probability. P_E is a probability function.
* Logical Consequence. If E entails H, then P_E(H) = 1.
* Preservation of Certainties. If P(H) = 1, then P_E(H) = 1.
* Mixing. P(H) = P(E)P_E(H) + P(~E)P_~E(H).
The most important fact about conditional probabilities is
undoubtedly Bayes' Theorem, whose significance was first
appreciated by the British cleric Thomas Bayes in his posthumously
published masterwork, "An Essay Toward Solving a Problem in the
Doctrine of Chances" (Bayes 1764). Bayes' Theorem relates the
"direct" probability of a hypothesis conditional on a given
body of data, P_E(H), to the "inverse" probability of
the data conditional on the hypothesis, P_H(E).
Bayes' Theorem.
P_E(H) = [P(H)/P(E)] P_H(E)
In an unfortunate, but now unavoidable, choice of terminology,
statisticians refer to the inverse probability P_H(E) as the
"likelihood" of H on E. It expresses the degree to which the
hypothesis predicts the data given the background information codified
in the probability P.
In the example discussed above, the condition that J. Doe died
during 2000 is a fairly strong predictor of senior citizenship. Indeed,
the equation P_H(E) = 0.57 tells us that 57% of the total deaths
occurred among seniors that year. Bayes' theorem lets us use this
information to compute the "direct" probability of J. Doe
dying given that he or she was a senior citizen. We do this by
multiplying the "prediction term" P_H(E) by the ratio of
the total number of deaths in the population to the number of senior
citizens in the population, P(H)/P(E) = 2.4M/16.6M = 0.144. The result
is P_E(H) = 0.57 x 0.144 = 0.082, just as expected." (31)
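To make the quoted arithmetic easy to check, the following minimal
Python sketch (an illustration added here, not part of Joyce's text)
recomputes the conditional probability, the inverse probability, and the
Bayesian product from the rounded counts quoted above.

# A minimal sketch (not from Joyce's text): recompute the quoted figures
# from the rounded counts used in the appendix example.
population = 275e6       # Americans alive on January 1, 2000 (approx.)
deaths = 2.4e6           # deaths during 2000 (approx.)
seniors = 16.6e6         # persons aged 75 or greater (approx.)
senior_deaths = 1.36e6   # deaths among seniors (approx.)

p_h = deaths / population                # P(H): J. Doe died during 2000
p_e = seniors / population               # P(E): J. Doe was a senior citizen
p_h_and_e = senior_deaths / population   # P(H & E): J. Doe was a senior who died

p_h_given_e = p_h_and_e / p_e            # direct probability P_E(H), about 0.082
p_e_given_h = p_h_and_e / p_h            # inverse probability P_H(E), about 0.57

# Bayes' Theorem: P_E(H) = [P(H)/P(E)] * P_H(E); agrees with the direct division.
assert abs(p_h_given_e - (p_h / p_e) * p_e_given_h) < 1e-12
print(round(p_h_given_e, 3), round(p_e_given_h, 2))   # 0.082 0.57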
References
Boyer, Carl, A History of Mathematics, New York, John Wiley &
Sons, 1968.
Descartes, Rene, John Cottingham, et al., trans., The Philosophical
Writings of Descartes, Vol. 2, Cambridge, Cambridge University Press,
1984/orig. 1641.
Einstein, Albert and Leopold Infeld, The Evolution of Physics, New
York, Simon & Schuster, 1938.
Flaubert, Gustave, Madame Bovary: A Tale of Provincial Life,
Project Gutenberg (originally in Revue de Paris), 2008/orig. 1857.
Hacking, Ian, The Taming of Chance, Cambridge, Cambridge University
Press, 1991.
Hawking, Stephen, "Into a Black Hole,"
http://www.hawking.org.uk/into-a-black-hole.html; 2008.
Howson, Colin, Hume's Problem: Induction and the Justification
of Belief, Oxford, at the Clarendon Press, Oxford University Press, 2000.
Hume, David, Norton and Norton, eds., A Treatise of Human Nature,
Vol. 1, Oxford, at the Clarendon Press, 2007/orig. 1739-40.
Joyce, James, "Conditional Probabilities and Bayes'
Theorem," Stanford Encyclopedia of Philosophy, 2003,
Pascal, Blaise, Honor Levi, trans., Pensees and Other Writings, New
York/Oxford, Oxford University Press, 1995/orig. 1670.
Plato, Jowett, trans., The Dialogues of Plato, 4th edition, Vol. 1,
Amen House, London, revised at the Clarendon Press, Oxford University
Press, 1953/orig. 399 BCE.
Rumsfeld, D., NATO news conference,
http://www.nato.int/docu/speech/2002/s020606g.htm; 2002.
Romeijan, Jan, "Philosophy of Statistics," Stanford
Encyclopedia of Philosophy: The Metaphysics Research Lab, Center for the
Study of Language and Information, (CSLI), Stanford University, ISSN
1095-5054, 2016), 2014.
Sophocles, Paul Woodruff, trans., Antigone,
Indianapolis/Cambridge, Hackett Publishing Company, 2001/orig. 442 BCE.
Stanford University SPRG, Lecture 3: Spacetime Diagrams, Spacetime,
Geometry, https://web.stanford.edu/~oas/SI/SRGR/notes/SRGRLect3 2015.pdf
Stark, P.B., Glossary of Statistical Terms,
http://stat.berkeley.edu/~stark/SticiGui/Text/gloss.htm, 2016.
Texas Tribune, http://texastribune.org/2018/08/29/us-denying-passports-americans-along-border-throwing-their-citizenship/, 2018/08/29.
Van Fraassen, Bas C., Laws and Symmetry, at the Clarendon Press,
Oxford University Press, 1989.
John H Dreher, University of Southern California, Associate
Professor of Philosophy
(1) Plato, trans. Jowett, 1953/orig. 399 BCE, 21df, p. 345.
(2) Einstein and Infeld, 1938, pp. 209-239.
(3) SPRG, Stanford University, Lecture 3: Spacetime Diagrams,
Spacetime, Geometry, 2015, p. 1.
(4) Op. cit., Einstein and Infeld, pp. 287-91.
(5) Hawking, 2008, p.2.
(6) Descartes, Rene, trans. Cottingham et al., 1984/orig. 1641,
§23 6, ¶40f., pp. 27f.
(7) Pascal, Blaise, trans. Honor Levi, 1995/orig. 1670,
§656, p. 148.
(8) Ibid., §680, pp. 152-55.
(9) Hume/Norton & Norton, 2007/orig. 1739-40, [section]3.2-3.4,
pp. 52--58.
(10) What Joyce calls "unconditional probabilities" are
also called "prior probabilities" in explicitly Bayesian
contexts.
(11) In this example, the unconditioned probability appears to be
"objective," meaning that it is more than a mere guess.
Sometimes, as we shall see later, probability judgments are blatantly
subjective. In that case, as intimated above in fn. 10, the starting
point of a probability calculation is sometimes called a "prior
probability," which is also "unconditioned."
(12) Indeed, we could complicate the story by trying to measure the
probability that the U.S. Census Bureau's estimate of deaths of
American citizens during 2000 is correct. Problematic features might
include the difficulty of knowing just who actually are American
citizens. (For example, recently and during the Bush and Obama
administrations, difficulties arose in determining whether or not
certain people born near the Texas/Mexico border really were born in the
United States, because well-meaning, charitable, sympathetic medical
workers erroneously recorded the United States rather than Mexico as
their patients' place of birth. See Texas Tribune, 2018/08/29.)
(13) Even this claim needs to be more carefully circumscribed. When
we are considering infinite sets, probability calculations are arguably
problematic. Is the probability that an integer is greater than 30
greater than the probability that an integer is greater than 40? It
might seem so, but the matter is complicated by the fact that the
cardinality of the set of integers greater than 30 is the same as the
cardinality of the set of integers greater than 40. It is worth
emphasizing that even in this case the certainty of the calculation
presupposes that the calculation is done correctly. So, whatever doubts
we have about our mathematical and logical abilities will of course
affect the certainty of our calculations of probabilities. As Descartes
suggests in Meditation One, perhaps an evil demon deceives us even when
we judge that 2 + 3 = 5. Descartes, Cottingham, et al., op. cit.,
§21, p. 14.
(14) There is a distinction often drawn between subjective and
objective probability, or between interpretations of probability.
Probabilities concerning purely abstract identities are calculated by
purely deductive reasoning. They are "subjective" only to the
degree that we doubt our own powers of reasoning. On the other hand,
many beliefs about what is probable that are based upon introspection
are purely subjective. Examples might be drawn from the testimony of
mystics, who claim to have experiences of the supernatural that others
do not have. In the middle are probable judgments of one degree or
another. The notion of a conditional probability (X is probable to
degree x on condition y) applies to those judgments that are more or
less objective as well as to those that are obviously subjective.
Bayes' Theorem instructs us to revise each probability assessment
in the light of new information. It applies equally to more or less
objective as well as to more or less subjective probability claims. A
detailed explanation of this understanding of probability claims and
their relation to Bayes' Theorem and statistical analysis is beyond
the scope of this paper.
(15) At first this may seem counter-intuitive, because it may not
be obvious why P(H & E) is divided by P(E). The answer is that P(H &
E) is the ratio of seniors who died to the entire population, so it
reflects seniors only in proportion to their share of the total
population; the probability that Doe died, however, is being conditioned
on his or her membership in the smaller portion of the population who
are seniors, and dividing by P(E) rescales the joint frequency to that
smaller group, as the worked equation below makes explicit.
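Written out with the figures quoted in the appendix (a restatement
added here for clarity, not part of the quoted text), the total
population cancels from numerator and denominator:

\[
P_E(H) = \frac{P(H \,\&\, E)}{P(E)}
       = \frac{1.36\,\mathrm{M}/275\,\mathrm{M}}{16.6\,\mathrm{M}/275\,\mathrm{M}}
       = \frac{1.36\,\mathrm{M}}{16.6\,\mathrm{M}} \approx 0.082.
\]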
(16) This distinction is illustrated by the following example. Let
us suppose that of the people who come down with lung cancer, Y percent
have been smokers. That obviously is different from the claim that of
the people who have been smokers, Y percent come down with lung cancer.
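With purely hypothetical numbers, invented here only to display the
asymmetry (they are not drawn from any study): suppose that in some
population 1,000 people come down with lung cancer, 800 of them have
been smokers, and 100,000 people in all have been smokers. Then

\[
P(\text{smoker} \mid \text{cancer}) = \frac{800}{1000} = 0.8,
\qquad
P(\text{cancer} \mid \text{smoker}) = \frac{800}{100000} = 0.008,
\]

so the two conditional probabilities differ by two orders of magnitude
even though they share the same numerator.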
(17) Boyer, 1968, p. 568.
(18) Ibid., pp. 586-90.
(19) Sophocles, trans. Paul Woodruff, 2001/orig. 442 BCE, p. 3,
lines 58-62.
(20) Ibid., p. 58, lines 1348-53.
(21) Flaubert, Gustave, 2008/orig. 1857, p. 250.
(22) Rumsfeld, D., 2002.
(23) On the other hand, it should not be thought that all public
policy is like the calculation to go to war over possible weapons of
mass destruction. For example, policies that are developed on the basis
of epidemiological models are like the cases that we previously analyzed
concerning the probable death of a senior. In general, epidemiologists
try to develop drugs that produce the greatest increase in healthy years
across the population at the lowest cost. So, medications to reduce blood
pressure are cheap to produce and distribute and have an enormously
positive effect on a great mass of the population. Successive
applications of the drugs will be tailored to avoid objectionable side
effects. Obviously, calculations of this sort involve conditional
probabilities that are continually revised.
(24) Stark, P.B., 2016, "A fair bet is one for which the
EXPECTED VALUE of the payoff is zero, after accounting for the cost of
the bet. For example, suppose I offer to pay you $2 if a fair coin lands
heads, but you must ante up $1 to play. Your expected payoff is -$1 + $0
x P(tails) + $2 x P(heads) = -$1 + $2 x 50% = $0. This is a fair bet--in
the long run, if you made this bet over and over again, you would expect
to break even. The expected value of a random variable is the long-term
limiting average of its values in independent repeated experiments. The
expected value of the random variable X is denoted EX or E(X). For a
discrete random variable (one that has a countable number of possible
values) the expected value is the weighted average of its possible
values, where the weight assigned to each possible value is the chance
that the random variable takes that value. One can think of the expected
value of a random variable as the point at which its probability
histogram would balance, if it were cut out of a uniform material.
Taking the expected value is a linear operation: if X and Y are two
random variables, the expected value of their sum is the sum of their
expected values (E(X+Y) = E(X) + E(Y)), and the expected value of a
constant a times a random variable X is the constant times the expected
value of X (E(a x X) = a x E(X))."
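As a quick check of the quoted coin-flip arithmetic, here is a minimal
Python sketch (an illustration added here, not part of Stark's glossary)
that computes the expected net payoff of the $1 bet as a weighted
average of its possible values.

# A minimal sketch (not from Stark's glossary): expected net payoff of the
# quoted bet, where you ante $1 and receive $2 if a fair coin lands heads.

def expected_value(outcomes):
    # Weighted average of (value, probability) pairs for a discrete variable.
    return sum(value * prob for value, prob in outcomes)

ante = -1.0                              # cost of playing, paid up front
payout = expected_value([(0.0, 0.5),     # tails: no payout
                         (2.0, 0.5)])    # heads: $2 payout
print(ante + payout)                     # 0.0 -- the bet is fair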
(25) Hacking, 1991, pp. vii; 1-10.
(26) Howson, 2000, pp. 116-11.
(27) Van Fraassen, 1989, pp. 293-300.
(28) Examples like the foregoing are instructive in analyzing the
logic of self-deception. Self-deception is not a matter of deliberately
ignoring pertinent evidence (How could it be?) but is rather a matter of
mischaracterizing the strength of the evidence one has. Who, for
example, wants to be bothered by the anxiety and inconvenience of a trip
to the doctor over a mere mole on the hand?
(29) Admittedly, this interpretation of probability is only one
among several. See Romeijn, 2014, §4.3.1.
(30) Romeijn, op. cit., §2-5.
(31) Joyce, James, 2003, §1.1; 11.2.