Conditional Probabilities in Artificial Intelligence


Conditional Probabilities

In probability theory, the conditional probability of Ai given Aj is the probability of Ai when Aj is known to occur (or to have occurred). 
It is commonly denoted by P(Ai | Aj), and sometimes by P_{A_j}(A_i).
        P(A_i \mid A_j) = \frac{P(A_i, A_j)}{P(A_j)}

Here P(Ai, Aj) is the joint probability of Ai and Aj, also denoted P(Ai ∧ Aj), and P(Aj) is the marginal probability of Aj. 
The joint probability can also be represented in terms of conditional probability as follows, 
        P(A_i, A_j) = P(A_i \mid A_j)\, P(A_j)
Consider an example where we calculate the probability that a student fails given that he does not study well. 
        P(F = True \mid S = False) = \frac{P(F = True,\ S = False)}{P(S = False)}
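As a minimal sketch in Python, the same conditional probability can be read off a joint table. The numbers below are illustrative assumptions, not values from the text; any joint table that sums to 1 would do. 

    # Hypothetical joint distribution over (F = fails, S = studies).
    # The values are made up for illustration and sum to 1.
    joint = {
        (True, True): 0.05,    # fails and studies
        (True, False): 0.25,   # fails and does not study
        (False, True): 0.60,   # passes and studies
        (False, False): 0.10,  # passes and does not study
    }

    # Marginal P(S = False): sum the joint over all values of F.
    p_s_false = sum(p for (_, s), p in joint.items() if not s)

    # P(F = True | S = False) = P(F = True, S = False) / P(S = False)
    p_f_given_not_s = joint[(True, False)] / p_s_false
    print(p_f_given_not_s)  # 0.25 / 0.35 = 0.714...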
A conditional probability is thus a normalized version of a joint probability. 
Conditional and joint probabilities can be explained using Venn diagrams. 
Let us take the same example of whether a student studies or not and whether he fails or not: P(F, ¬S) is represented as the overlapping region of the ellipses in Figure 3.2.1 below. 
Fig 3.2.1: Venn Diagram
It denotes F = True and S = False. The region where S = False denotes that the student does not study, and the region where F = True denotes that the student fails. The region outside both ellipses denotes the case where the student studies and does not fail. 

NOTE 1: The marginal probabilities can be calculated from the joint probabilities in Figure 3.2.1 as follows, 
        P(F) = P(F, S) + P(F, ¬S)
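Continuing the same illustrative joint table, NOTE 1 is a one-line sum in Python: marginalizing out S just adds the joint entries over its values. 

    # Same hypothetical joint table as above, keyed by (F, S) value pairs.
    joint = {(True, True): 0.05, (True, False): 0.25,
             (False, True): 0.60, (False, False): 0.10}

    # P(F = True) = P(F = True, S = True) + P(F = True, S = False)
    p_f = joint[(True, True)] + joint[(True, False)]
    print(p_f)  # 0.05 + 0.25 = 0.30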

NOTE 2: We can have joint conditional probabilities, in which several variables are conditioned on other variables. 
Example: 
        P(\neg Q, R \mid \neg S, T) = \frac{P(\neg Q, R, \neg S, T)}{P(\neg S, T)}
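A sketch of the same computation in Python, assuming a hypothetical joint table over the four Boolean variables Q, R, S, T; the probabilities are randomly generated placeholders rather than data from the text. 

    import itertools
    import random

    # Hypothetical joint distribution over (Q, R, S, T):
    # random weights normalized so the table sums to 1.
    random.seed(0)
    assignments = list(itertools.product([True, False], repeat=4))
    weights = [random.random() for _ in assignments]
    total = sum(weights)
    joint = {a: w / total for a, w in zip(assignments, weights)}

    def prob(pred):
        """Sum the joint over all assignments (q, r, s, t) satisfying pred."""
        return sum(p for a, p in joint.items() if pred(*a))

    # P(not Q, R | not S, T) = P(not Q, R, not S, T) / P(not S, T)
    numerator = prob(lambda q, r, s, t: (not q) and r and (not s) and t)
    denominator = prob(lambda q, r, s, t: (not s) and t)
    print(numerator / denominator)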

NOTE 3: A joint probability can be expressed in terms of a chain of conditional probabilities. 
Example: P(Q, R, S, T) = P(Q | R, S, T) P(R | S, T) P(S | T) P(T) 
The general form of this chain rule is, 
        P(A_1, A_2, \ldots, A_k) = \prod_{i=1}^{k} P(A_i \mid A_{i-1}, \ldots, A_1)
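As a small numerical check of the chain rule, the sketch below (in the same hypothetical random-joint-table style as the previous snippet) factors P(A1, A2, A3) into conditionals and confirms the product recovers the joint entry. 

    import itertools
    import random

    # Hypothetical joint distribution over three Boolean variables (A1, A2, A3).
    random.seed(1)
    assignments = list(itertools.product([True, False], repeat=3))
    weights = [random.random() for _ in assignments]
    total = sum(weights)
    joint = {a: w / total for a, w in zip(assignments, weights)}

    def marginal(fixed):
        """P of a partial assignment, e.g. {0: True}; sums out the other variables."""
        return sum(p for a, p in joint.items()
                   if all(a[i] == v for i, v in fixed.items()))

    # Chain rule at one assignment:
    # P(A1, A2, A3) = P(A3 | A2, A1) P(A2 | A1) P(A1)
    a = (True, False, True)
    p_a1 = marginal({0: a[0]})
    p_a2_given_a1 = marginal({0: a[0], 1: a[1]}) / p_a1
    p_a3_given_a2_a1 = joint[a] / marginal({0: a[0], 1: a[1]})
    chain = p_a3_given_a2_a1 * p_a2_given_a1 * p_a1
    print(abs(chain - joint[a]) < 1e-12)  # True: the factorization recovers the joint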
    The ordering of the variables in a joint probability is not important, so the joint probability can be written as, 
        P(A_i, A_j) = P(A_i \mid A_j)\, P(A_j) = P(A_j \mid A_i)\, P(A_i) = P(A_j, A_i)
Dividing both sides of this equality by P(Aj), it can be noted that, 
        P(A_i \mid A_j) = \frac{P(A_j \mid A_i)\, P(A_i)}{P(A_j)}
The above equation is known as Bayes rule. 
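Applying Bayes rule to the illustrative student numbers used in the earlier sketches (again, assumed values, not data from the text): knowing P(F | ¬S), P(¬S), and P(F), we can invert the conditioning. 

    # Bayes rule: P(S = False | F = True)
    #   = P(F = True | S = False) * P(S = False) / P(F = True)
    p_f_given_not_s = 0.25 / 0.35  # from the hypothetical joint table above
    p_not_s = 0.35                 # P(S = False)
    p_f = 0.30                     # P(F = True)
    p_not_s_given_f = p_f_given_not_s * p_not_s / p_f
    print(round(p_not_s_given_f, 4))  # 0.8333 = P(F, not S) / P(F) = 0.25 / 0.30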
P(v) is used as an abbreviation for P(A1, A2, …, Ak), where v = {A1, A2, …, Ak}.

