Uncertain Reasoning

What is Uncertainty?

Knowledge that is incomplete, imprecise, vague, probabilistic,
or uncertain in any number of other ways.

"I think the probability of the Mickey Mouse party winning
the next election is between 80-90%" - Spokesperson for the Mickey
Mouse party.

The statement above is uncertain in several ways:

The probability is high, but not 100%

The probability, expressed as a range (80-90%), is vague

Source may be biased or otherwise unreliable
Example of Different Types of Uncertainty in One Context

Written transcripts collected from experts recommending the dosage of a drug
for a hypothetical patient:

Expert   Prescribed Treatment
A        450mg, 3 times a day
B        600mg-800mg, 3 times a day
C        About 650mg, 2-3 times a day
D        Likely to be 500mg twice a day
E        500mg or 800mg (handwriting unclear) twice a day
F        400mg 4 times a day or 200mg once a day
G        500mg
H        At least 500mg, twice a day
I        The usual dose for this drug is 500mg, twice a day
J        10g, twice a day
K        Don't know, have to look it up
L        No idea whatsoever
M        13, Acacia Avenue
What Types of Uncertainty?

A        450mg, 3 times a day

No uncertainty, 'hard and fast'

B        600-800mg, 3 times a day
C        About 650mg, 2-3 times a day

Vague: how many mg? How many times a day? B is imprecise, C is fuzzy

D        Likely to be 500mg twice a day

Statement of confidence

E        500mg or 800mg (handwriting unclear), twice a day

Ambiguous

F        400mg 4 times a day or 200mg once a day

Inconsistent (think about it)

G        500mg
H        At least 500mg, twice a day

Incomplete, doesn't give enough information to act on

I        The usual dose for this drug is 500mg, twice a day

Based on a default rule of doubtful relevance

J        10g, twice a day

Anomalous compared to the other responses

K        Don't know, have to look it up
L        No idea whatsoever

Ignorance

M        13, Acacia Avenue

Completely irrelevant
An AI Classification of Uncertainty

Other classifications exist, this one is particularly suitable for AI

An understanding of the different types of uncertainty is important to
a knowledge engineer (KE):

Choosing shells, analyzing transcripts, etc.
Unary Uncertainty

Ignorance

Indeterminate knowledge: The patient should take 2-3 doses of 200mg per
day

Partial knowledge: We have a 90% chance of winning the election

Conflict

Equivocation: There has been rain in the South Island of New Zealand.
If there was a small amount, it should mean a good wheat crop, but if there
was too much, the crop may be washed out.

Ambiguity: The best jam to use is that sold in Sainsbury's

Anomaly: The patient was 18 metres tall
Set Theoretic Uncertainty

Conflict

Anomaly: Let's see, all these cake recipes use 1-2 teaspoons of baking
powder, and this one uses 2 tablespoons of it!

Inconsistency: The Conservative party will win the next election but
the Labour party will win 90% of the seats

Ignorance

Incompleteness: We, uh, did hear that the patient had chest pains, but,
uh, we didn't do an electrocardiograph on his heart.

Irrelevancy: John Smith is a patient with a severe headache, and he is
wearing a blue suit
Another Typology of Uncertainty

Smithson's typology, used to argue that Western science emphasises the
categorisation of certain knowledge at the expense of uncertain knowledge

Mainly included here to show how vagueness subdivides into fuzziness and
nonspecificity
Fuzziness

Words can have different meanings depending on the speaker:
"Martin is very tall"
"Really, he's not that tall"
"Well, not compared to you, but not everyone is seven feet tall!"

Or, depending on context:
"That full back is really short, he must have something else going
for him to be selected for the national team"

This rugby player may be 6 feet tall

Effectively the element (e.g. the rugby player) may be partially in
the set TALL and partially out of it

Other fuzzy words include: short, big, tasty, cold, late, etc.
Membership in a Fuzzy Set

Can use a number in the range 0 to 1

A membership of 1.0 means completely and unambiguously a member of the
set

A membership of 0.0 means not a member of the set at all

E.g. if someone is only slightly tall, or tall in a few contexts,
we could say:
m_{TALL}(Ringo) = 0.2
If someone is very tall or tall in most contexts, we could
say
m_{TALL}(George) = 0.8
Can represent this 'tallness' as a function, possibly acquired by sampling
opinion
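Such a membership function can be sketched directly in Python. The breakpoints below (160cm and 190cm) are illustrative assumptions, standing in for values that might be acquired by sampling opinion:

```python
def tall_membership(height_cm: float) -> float:
    """Piecewise-linear membership function for the fuzzy set TALL.

    The breakpoints 160cm and 190cm are assumed for illustration,
    standing in for values acquired by sampling opinion.
    """
    if height_cm <= 160.0:
        return 0.0  # not a member of TALL at all
    if height_cm >= 190.0:
        return 1.0  # completely and unambiguously a member of TALL
    # Linear ramp between the two breakpoints
    return (height_cm - 160.0) / (190.0 - 160.0)

print(tall_membership(166))  # only slightly tall: 0.2
print(tall_membership(184))  # very tall: 0.8
```

With these (assumed) breakpoints, a 166cm person gets the membership 0.2 used for Ringo above, and a 184cm person gets George's 0.8.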
Fuzzy Set Membership Values Are Not Probabilities

Can't really say that p(someone is tall) is some number when we
don't know precisely what "tall" means

Also, membership values are used for ordering, but do not follow any
proportions in the real world:
m_{TALL}(Ringo) = 0.2
m_{TALL}(George) = 0.8
George is NOT four times as tall as Ringo
Negation

If SHORT is the opposite of TALL, then we can calculate:
m_{SHORT}(Ringo) = 1.0 - m_{TALL}(Ringo)
= 1.0 - 0.2
= 0.8
Note that SHORT must be the exact opposite of TALL
AND

To find the intersection of two sets take the minimum of the two raw fuzzy
values

E.g. if SPIV means TALL AND WELL_DRESSED:
m_{SPIV}(Ringo) = min(m_{TALL}(Ringo), m_{WELL_DRESSED}(Ringo))
= min( 0.2, 0.6 )
= 0.2
OR

To find the union of two sets, take the maximum of the two raw fuzzy values

E.g. if DANGEROUS means SLOPPY OR POORLY_TRAINED
m_{DANGEROUS}(Pat) = max(m_{SLOPPY}(Pat), m_{POORLY_TRAINED}(Pat))
= max( 0.7, 0.1 )
= 0.7
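The three fuzzy connectives (complement for NOT, minimum for AND, maximum for OR) are trivial to compute. The sketch below reruns the examples from these notes, assuming m_{WELL_DRESSED}(Ringo) = 0.6:

```python
# Membership values from the running examples in the notes;
# m_WELL_DRESSED(Ringo) = 0.6 is an assumed value for illustration.
m_tall = {"Ringo": 0.2, "George": 0.8}
m_well_dressed = {"Ringo": 0.6}
m_sloppy = {"Pat": 0.7}
m_poorly_trained = {"Pat": 0.1}

# NOT: complement of the membership value (0.8, as in the notes)
m_short_ringo = 1.0 - m_tall["Ringo"]

# AND: minimum of the two membership values (0.2)
m_spiv_ringo = min(m_tall["Ringo"], m_well_dressed["Ringo"])

# OR: maximum of the two membership values (0.7)
m_dangerous_pat = max(m_sloppy["Pat"], m_poorly_trained["Pat"])

print(m_short_ringo, m_spiv_ringo, m_dangerous_pat)
```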
Probability Tables

Imagine p(rain) = 0.2, p(¬rain) = 0.8

If it rains, p(rain) becomes 1.0, p(¬rain) becomes 0.0

We know that it is raining, no doubt

Imagine we are considering two variables, rain and thursday

Can represent all possible combinations of events as a probability table
p(rain & thursday)      0.02846
p(rain & ¬thursday)     0.17154
p(¬rain & thursday)     0.1142
p(¬rain & ¬thursday)    0.6858

Can do sophisticated reasoning with this table

Can work out p(rain) easily:
p(rain) = p(rain & thursday) + p(rain & ¬thursday)
= 0.02846 + 0.17154
= 0.2
Can also update the table if we observe one variable, e.g. rain:

p(rain & thursday), p(rain & ¬thursday): together become 1.0 (sum total)
p(¬rain & thursday), p(¬rain & ¬thursday): together become 0.0 (sum total)

Can do this using the ratios:
p*(rain) / p(rain) = (new p(rain)) / (old p(rain)) = 1.0 / 0.2 = 5
p*(¬rain) / p(¬rain) = 0.0 / 0.8 = 0

p(rain & thursday) = 0.02846 * 5 = 0.14230
p(rain & ¬thursday) = 0.17154 * 5 = 0.85770
p(¬rain & thursday) = 0.1142 * 0 = 0
p(¬rain & ¬thursday) = 0.6858 * 0 = 0

Here the table only has two variables

Far more sophisticated reasoning can be done on larger tables

Note that we can refer to the new value p*(thursday) as
p(thursday | rain),
or "the probability that it is Thursday given that we know that it is
raining"

Here the two variables are independent

Much more interesting cases will be discussed in the section on Bayesian
Logic
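Both operations on the table (marginalisation, and updating on an observation) can be sketched in a few lines of Python, using the numbers from the table above:

```python
# Joint probability table from the notes; the tuple keys encode
# (rain?, thursday?) as booleans.
joint = {
    (True, True): 0.02846,    # p(rain & thursday)
    (True, False): 0.17154,   # p(rain & ¬thursday)
    (False, True): 0.1142,    # p(¬rain & thursday)
    (False, False): 0.6858,   # p(¬rain & ¬thursday)
}

def p_rain(table):
    """Marginalise: p(rain) is the sum of the rows where rain is true."""
    return table[(True, True)] + table[(True, False)]

def observe_rain(table):
    """Update the table on observing rain: scale the rain rows by
    1 / p(rain) and set the ¬rain rows to zero."""
    scale = 1.0 / p_rain(table)
    return {
        (rain, thu): p * scale if rain else 0.0
        for (rain, thu), p in table.items()
    }

print(p_rain(joint))          # ≈ 0.2
updated = observe_rain(joint)
print(updated[(True, True)])  # ≈ 0.1423, i.e. p(thursday | rain)
```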
Probability Based Expert Systems?

When should we use probabilistic methods for an expert system? When:

Knowledge in the domain reasonably fits the axioms of probability:
well-defined propositions, additive beliefs

There is an appropriate source of knowledge (e.g. experts), though even
experts may not know the probabilities for their domain of expertise

It is not necessary to reason with other types of uncertainty, such as
ignorance

The domain is not too large
Problem

People don't work with probabilities very well.

Common problems:

Base-rate neglect
People may ignore a high base rate (e.g. most days have clouds in the sky)
when updating their belief in a hypothesis (it will rain), leading to too
high a probability being assigned to the hypothesis.

Anchoring
People often don't reject a hypothesis, or adequately reduce their belief
in it, upon seeing contrary evidence. People often become quite attached
to their hypotheses.

Sub- or super-additivity of elicited probabilities
Simply put, the probabilities elicited may add up to less than or greater
than 1.0. The reasons for this may be quite complex: e.g. overconfidence
in the probabilities, missing hypotheses, implicit open-world assumptions, etc.
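One common, if crude, repair for sub- or super-additive elicitations is to renormalise the stated probabilities so they sum to 1.0. This only rescales the hypotheses the expert stated; it cannot recover missing ones. A minimal sketch with made-up hypothesis names and numbers:

```python
def normalise(elicited):
    """Rescale elicited probabilities so they sum to 1.0.

    A crude repair for sub- or super-additive elicitations: it rescales
    the stated hypotheses but cannot fix missing ones.
    """
    total = sum(elicited.values())
    return {hypothesis: p / total for hypothesis, p in elicited.items()}

# Super-additive elicitation: the expert's probabilities sum to 1.2
beliefs = {"flu": 0.6, "cold": 0.4, "allergy": 0.2}
repaired = normalise(beliefs)
print(repaired["flu"])         # 0.6 / 1.2 = 0.5
print(sum(repaired.values()))  # ≈ 1.0
```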