I concur with what Zadeh had to say. My own interest has been in the
application of Bayesian and similar theories to the provision of useful
predictions in 'closed' law enforcement environments such as air and sea
cargo terminals and passenger halls at airports (looking for a better way to
target contraband smugglers and quarantine risks). I use the term 'closed'
in so far as the number and type of data variables are more limited than in
street policing and therefore possibly more manageable from an analytical
perspective.
The problem that those of us who have examined the application of Bayesian
systems to such environments have faced is that of usefully dealing with the
'possible but not probable'. The worst-case scenario is that everyone in a
passenger hall at a given time is trying to smuggle contraband - 'possible'
but hardly 'probable'.
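A back-of-envelope calculation (with invented numbers) makes the asymmetry concrete: under an assumed 1% per-passenger base rate, the worst case has nonzero probability, so it is 'possible', yet it is so improbable that it underflows ordinary floating point.

```python
import math

# Hypothetical numbers for illustration only.
n_passengers = 300
base_rate = 0.01  # assumed per-passenger probability of smuggling

# The "everyone smuggles" worst case: possible, but its probability
# underflows an IEEE double (0.01**300 == 0.0), so work in log space.
log10_p_all = n_passengers * math.log10(base_rate)

p_none = (1 - base_rate) ** n_passengers  # nobody smuggles
expected = n_passengers * base_rate       # expected number of smugglers

print(f"log10 P(all smuggle) = {log10_p_all:.0f}")  # -600
print(f"P(none smuggle)      = {p_none:.3f}")
print(f"Expected smugglers   = {expected:.1f}")
```

So the worst case sits some 600 orders of magnitude below even the mundane "nobody smuggles" outcome, which is itself unlikely.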
But as you work your way back from this extreme, the whole application of
probability to the question of who is carrying the drugs, guns, weapons-grade
plutonium, etc. becomes blurred and therefore imprecise. The fact that such
imprecision exists is not an argument against attempting to
allocate/quantify risk factors and undertake considered analysis.
Rather it is an argument in favour of also having sniffer dogs and x-ray
machines at airports. I would be interested in the views of others.
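For what it is worth, one common way to allocate/quantify risk factors of this kind is naive-Bayes scoring on the log-odds scale. The sketch below is mine, not a description of any deployed system, and every indicator name and likelihood ratio in it is invented.

```python
import math

# Hedged sketch: all indicator names and numbers below are invented.
prior = 0.01  # assumed base rate of smuggling in the passenger hall

# likelihood ratio = P(indicator | smuggler) / P(indicator | innocent)
likelihood_ratios = {
    "paid_cash_for_ticket": 4.0,
    "high_risk_route": 3.0,
    "no_checked_luggage": 2.0,
}

def posterior(prior, lrs):
    """Combine independent indicators on the log-odds scale."""
    log_odds = math.log(prior / (1 - prior))
    for lr in lrs:
        log_odds += math.log(lr)
    odds = math.exp(log_odds)
    return odds / (1 + odds)

p = posterior(prior, likelihood_ratios.values())
print(f"posterior risk: {p:.3f}")  # ~0.195: elevated, far from certain
```

Even with all three indicators firing, the posterior stays below 20% under these assumed numbers - which is precisely why the sniffer dogs and x-ray machines remain the decisive evidence.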
Glenn Jones
Managing Director
Prediction Systems Pty Ltd
Canberra Australia
> -----Original Message-----
> From: owner-uai@cs.orst.edu [mailto:owner-uai@cs.orst.edu]On Behalf Of
> zadeh
> Sent: Saturday, 12 May 2001 2:11
> To: uai@cs.orst.edu
> Subject: Re:[UAI]how to evaluate approximate algorithms when exact
> solution is not available?
>
>
> The question posed by Haipeng Guo and subsequent comments by Rina
> Dechter, Marek Druzdzel and Eugene Santos touch upon a basic problem.
> What is not widely recognized is that the problem in question is an
> instance of a class of problems which have no crisp solutions within the
> conceptual structure of classical logic and probability theory.
>
> The culprit is what may be called the dilemma of "it is possible
> but not probable." More specifically, to compute an upper bound on the
> error of approximation it is necessary to be able to assess the
> probability of the worst case scenario-- a scenario which, in general,
> is possible but not probable. The problem is that the probability of the
> worst case scenario does not lend itself to crisp assessment.
>
> The same problem arises in dealing with imprecise probabilities,
> especially in the context of Bayesian networks. When imprecisely known
> probabilities are treated as if they were precise, validity of analysis
> is open to question. What this points to is an imperative need to
> develop a better understanding of computing with imprecise
> probabilities-- as most real-world probabilities are.
>
> What is said above does not detract from the usefulness of results
> which are alluded to in the comments. What it means is that in the
> analysis of complex systems it is hard, and frequently impossible, to
> achieve precision, rigor and usefulness at the same time.
>
> --
> Professor in the Graduate School, Computer Science Division
> Department of Electrical Engineering and Computer Sciences
> University of California
> Berkeley, CA 94720-1776
> Director, Berkeley Initiative in Soft Computing (BISC)
>
> Address:
> Computer Science Division
> University of California
> Berkeley, CA 94720-1776
> Tel(office): (510) 642-4959 Fax(office): (510) 642-1712
> Tel(home): (510) 526-2569
> Fax(home): (510) 526-2433, (510) 526-5181
> zadeh@cs.berkeley.edu
> http://www.cs.berkeley.edu/People/Faculty/Homepages/zadeh.html
>
This archive was generated by hypermail 2b29 : Sat May 12 2001 - 19:25:32 PDT