Re: Just one message on random variables

Kevin S. Van Horn (ksvhsoft@xmission.com)
Fri, 26 Jun 1998 08:12:44 -0600

Denver Dash wrote:

> However, I interpreted the above paragraph as an argument _supporting_
> the Bayesian definition, by attempting to discredit--even given the
> frequentist definition of probability--the frequentist notion that
probabilities represent objective physical quantities. The argument,
> as I understand it, says that the probability of the coin landing heads
> is just a manifestation of our uncertainty of the initial conditions
> of the coin toss.

Yes, you've understood my argument correctly here.

> My argument is that the ML calculation averaged over all conditions which
> are not intrinsic properties of the coin (i.e. the initial conditions)
> yields a well-defined value that is different for different coins. Why
> isn't this then a "physical property" of the coin?

Averaged *how*? You can't get away from specifying some sort of probability
distribution over initial conditions to produce this number. To do the
averaging, you might choose a distribution that is uniform in what you
consider the appropriate parameter space. Whether or not
this distribution is really the right one to use depends on your own state
of knowledge. It will be inappropriate for someone who has detailed
information on the initial conditions. Once you've worked out a
distribution over the initial conditions, you can then combine it with
the physical properties of the coin to obtain the probability of heads vs.
tails. But the probability you end up with is not, in itself, a property
of the coin; it is a combination of *properties of the coin* and
*properties of YOU*.

And this is why I say that, with the possible exception of quantum mechanics,
there are no physical probabilities. All purported physical probabilities
are in fact combinations of physical properties with personal states of
knowledge. The illusion of physical probabilities arises when
there is some canonical state of knowledge (usually some form of ignorance)
that is often a close approximation to one's true state of knowledge, and
thus one can report a canonical probability that is often (but not always)
appropriate for describing the system.

One shortcoming of thinking in terms of physical probabilities is that it
gives you no guidance in how to proceed if you are uncertain about the
physical properties of the system in question. The view of probabilities
as describing states of knowledge handles this quite tidily -- you can
incorporate your uncertainty about the system's physical properties by doing
a weighted average:
P(event | I) =
    SUM over p of  P(system has properties p | I) * P(event | I, p)

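To make the weighted average concrete, here is a small sketch in Python. The two hypotheses about the coin's physical properties and their prior weights are invented purely for illustration; the point is only the mechanics of summing P(p | I) * P(event | I, p) over the possible properties p.

```python
# Hypothetical example: we are unsure whether the coin is fair or has
# a bias toward heads, and we marginalize that uncertainty out.

# P(system has properties p | I): our prior over the coin's properties.
# (These numbers are made up for the sake of the example.)
prior = {
    "fair":   0.8,   # coin that lands heads with probability 0.5
    "biased": 0.2,   # coin that lands heads with probability 0.7
}

# P(event | I, p): probability of heads given each hypothesis p.
p_heads_given = {
    "fair":   0.5,
    "biased": 0.7,
}

# P(event | I) = SUM over p of P(p | I) * P(event | I, p)
p_heads = sum(prior[p] * p_heads_given[p] for p in prior)
print(p_heads)  # 0.8*0.5 + 0.2*0.7 = 0.54
```

Note that the answer, 0.54, is a property of neither coin hypothesis alone; it depends on the prior, i.e., on the state of knowledge I.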
As a final note, for the example in question (coin tossing), we in fact
usually do not have detailed information on the physical properties of
the system. So why do we assign a probability of 1/2 to heads? Largely
because the way the coin lands has a lot more to do with how it is flipped
than with its detailed physical properties, and it is usually flipped in
a manner that is highly sensitive to initial conditions. Jaynes discusses
this in Chapter 10, Physics of "Random Experiments", of his book, under the
heading "How to Cheat at Coin and Die Tossing". I'll send a postscript
file of this chapter to anyone who requests it. (It's also available at
bayes.wustl.edu.) Jaynes shows, both by analysis and experiment, that with
a bit of practice one can obtain any frequency of heads desired,
*regardless of the bias of the coin*. It all depends on how the coin is
tossed. So for this particular example, physical properties of the coin
have very little bearing on the problem.