REASONING WITH UNCERTAIN INFORMATION

INTRODUCTION 
In some cases the information about a task is uncertain; that is, we do not know it exactly. For example, when tossing a coin we do not know the outcome in advance: it can be a head or a tail. This chapter deals with reasoning about such uncertain information, for example, how certain we are about a result.

REVIEW OF PROBABILITY THEORY 
To quantify how certain we are about given information, we use probability. The basic ideas of probability theory are reviewed in the following sections.

Fundamental Ideas 
Before reviewing probability theory, let us look at how variables and their values are represented in different settings.
Let A1, A2, A3, ..., Ak be a collection of random variables.
These variables can be of several types depending on the application. For example,
1) If they represent propositions, their values are true or false.
2) If they represent coin tosses, the values are head or tail, i.e., H or T, and a coin is denoted C. If several coins are tossed they are denoted C1, C2, C3, ..., Ck.
3) The values are numbers when representing quantities such as heights and weights; alternatively, the values may be categorical, that is, drawn from a fixed set of categories.

Joint Probability: If the variables are A1, A2, A3, ..., Ak and their values are a1, a2, ..., ak, then the joint probability is denoted by the expression P(A1 = a1, A2 = a2, ..., Ak = ak), and the expression P(A1, A2, ..., Ak) is referred to as the joint probability function over the variables A1, A2, ..., Ak.
Example: If we toss a fair coin, the probability that it comes up tails is P(T) = 1/2.
Some of the rules or conditions that a probability function must satisfy are:
1) 0 ≤ P(A1 = a1, A2 = a2, ..., Ak = ak) ≤ 1
    That is, every joint probability must lie between 0 and 1.
2) ∑ P(A1, A2, A3, ..., Ak) = 1
    That is, the joint probabilities summed over all combinations of values of the variables must equal 1.
    For example, in the case of coin tossing, if we know that the probability of getting a tail is 1/2, then by property 2 the probability of getting a head must also be 1/2.
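As a concrete illustration of these two conditions, the following minimal Python sketch (the variable names are chosen for illustration, not taken from the text) stores the joint distribution of two fair coin tosses C1 and C2 and checks both properties:

```python
from itertools import product

# Joint probability function over two fair coin tosses C1 and C2:
# each of the four outcomes (H or T for each coin) has probability 1/4.
joint = {outcome: 0.25 for outcome in product("HT", repeat=2)}

# Condition 1: every joint probability lies between 0 and 1.
assert all(0.0 <= p <= 1.0 for p in joint.values())

# Condition 2: the joint probabilities over all value combinations sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-9

# Knowing P(C1 = T) = 1/2, condition 2 forces P(C1 = H) = 1 - 1/2 = 1/2.
p_tail = sum(p for outcome, p in joint.items() if outcome[0] == "T")
print(1.0 - p_tail)  # 0.5
```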
Consider four propositional atoms Q, R, S, T whose values may be true or false. Four binary-valued variables have 2^4 = 16 combinations of values, so there are 16 joint probabilities. A joint probability is written P(Q = q, R = r, S = s, T = t), where q, r, s, t denote the variables' values, i.e., true or false.

Let us take the joint probabilities of these variables to be as listed in Table 3.2.1.
Table 3.2.1: Joint Probabilities of Four Variables Q, R, S, T

(Q, R, S, T)                    Joint Probability
(True, True, True, True)        0.5686
(True, True, True, False)       0.0299
(True, True, False, True)       0.0135
(True, True, False, False)      0.0007

These values are assumed purely for illustration; only four of the sixteen entries are shown.

Marginal Probability: When the values of the joint probability for a collection of random variables are known, the marginal probability of any one of these variables can be calculated.
Example: If the joint probabilities of (Q, R, S, T) are known, then the marginal probability P(Q = true) is obtained from the cases where Q is true, which occur in 8 of the 16 joint probabilities of (Q, R, S, T).
So, the marginal probability P(Q = true) is
                      P(Q = true) = \sum\limits_{Q = true} P(Q, R, S, T)
In general,
                      P(Q = q) = \sum\limits_{Q = q} P(Q, R, S, T)
Summing the entries of the full sixteen-row table with this formula gives the marginal probability P(Q = true) = 0.95.
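A rough Python sketch of this computation follows; only the four rows listed in Table 3.2.1 are entered here, so the printed value is a partial sum rather than the 0.95 obtained from the full sixteen-entry table:

```python
# Full joint probabilities keyed by (q, r, s, t) tuples of booleans.
# Only the four rows shown in Table 3.2.1 are included; the remaining
# twelve entries of the full table would be needed to recover P(Q = true) = 0.95.
joint = {
    (True, True, True, True):   0.5686,
    (True, True, True, False):  0.0299,
    (True, True, False, True):  0.0135,
    (True, True, False, False): 0.0007,
    # remaining twelve entries omitted
}

def marginal_q(joint, q):
    """P(Q = q): sum the joint probabilities of every entry whose Q-value is q."""
    return sum(p for (qv, _r, _s, _t), p in joint.items() if qv == q)

print(marginal_q(joint, True))  # 0.6127 from the four listed rows alone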
    ∴ Lower-order joint probabilities can also be calculated in the same way as marginal probabilities.
Example: The joint probability P(Q = q, T = t) is the sum of all four full joint probabilities in which Q = q and T = t.
                  P(Q = q, T = t) = \sum\limits_{Q = q,\;T = t} P(Q, R, S, T)
If lower-order joint probabilities are known, they can in turn be used to obtain other, still lower-order joint probabilities and marginal probabilities.
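A minimal Python sketch of this summing-out step is given below; the table here is a uniform placeholder (1/16 per entry) purely so the example runs, not the values from Table 3.2.1:

```python
from itertools import product

VARS = ("Q", "R", "S", "T")

# Placeholder full joint table: each of the 16 value combinations gets 1/16.
joint = {values: 1.0 / 16 for values in product([True, False], repeat=4)}

def lower_order(joint, **fixed):
    """Sum the full joint probabilities over all entries matching the fixed values,
    e.g. lower_order(joint, Q=True, T=False) gives P(Q = true, T = false)."""
    return sum(
        p for values, p in joint.items()
        if all(values[VARS.index(name)] == v for name, v in fixed.items())
    )

print(lower_order(joint, Q=True, T=False))  # 0.25 under the uniform placeholder
print(lower_order(joint, Q=True))           # marginals fall out as a special case: 0.5
```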
We can use the notation P(Q, ¬R) instead of P(Q = true, R = false).
If the joint probabilities for a set of random variables are known, as in the table above, then all the marginal and lower-order joint probabilities can be calculated from them.
