CMSC250

Probability

Conditional Probability
Bayes Theorem

Conditional Probability

Recall: $P(A\vert B) = \frac{P(A \cap B)}{P(B)}$

The probability of $A$ given that $B$ has already occurred

Roll 2 distinct dice. Probability that the sum is prime?

Probability that the sum is prime, given the first die is a 1?
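
A worked computation, assuming standard six-sided dice (the prime sums are 2, 3, 5, 7, and 11, with 1, 2, 4, 6, and 2 outcomes respectively):

$P(\text{sum is prime}) = \frac{1 + 2 + 4 + 6 + 2}{36} = \frac{15}{36} = \frac{5}{12}$

Given the first die is a 1, the second die must show 1, 2, 4, or 6 to make the sum 2, 3, 5, or 7:

$P(\text{sum is prime} \mid \text{first die is } 1) = \frac{4}{6} = \frac{2}{3}$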

Recall: $P(A\vert B) = \frac{P(A \cap B)}{P(B)}$

$P(A\vert B)P(B) = P(A \cap B)$

Suppose I have 52 cards. Each card has a number (1-13), with 4 cards of each number

Draw 2 without replacement: what is the probability I get an even number first and then a 7 or an 11?
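
One way to work it out with the rule above: the even numbers are 2, 4, 6, 8, 10, 12, so 24 of the 52 cards are even; since 7 and 11 are odd, all 8 of those cards remain after an even first draw:

$P(\text{even, then 7 or 11}) = \frac{24}{52} \cdot \frac{8}{51} = \frac{16}{221} \approx 0.072$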

Consider: for events $A, B$, an outcome in $A$ is either in $B$ or it isn't.

That is, $A \cap B$ and $A \cap B^c$ are disjoint, and their union is $A$

$P(A) = P(A \cap B) + P(A \cap B^c)$

$P(A) = P(A\vert B)P(B) + P(A\vert B^c)P(B^c)$

Suppose a bag has 5 green and 7 purple balls. You take out one ball and then another. What is the probability the second is green?

$A = $ The second draw is green

$B = $ The first draw is purple
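
A possible worked computation: $P(B) = \frac{7}{12}$, $P(A\vert B) = \frac{5}{11}$, $P(B^c) = \frac{5}{12}$, $P(A\vert B^c) = \frac{4}{11}$, so

$P(A) = \frac{5}{11} \cdot \frac{7}{12} + \frac{4}{11} \cdot \frac{5}{12} = \frac{35 + 20}{132} = \frac{55}{132} = \frac{5}{12}$

(the same as the chance the first draw is green)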

Bayes Theorem

Question: Are $P(A\vert B)$ and $P(B\vert A)$ related?

$P(B\vert A) = \frac{P(A \cap B)}{P(A)}$

$P(A\cap B) = P(B\vert A)P(A)$

$P(A\vert B) = \frac{P(A\cap B)}{P(B)}$

$P(A\vert B) = \frac{P(B\vert A)P(A)}{P(B)}$

Bayes Theorem

Useful when we have prior knowledge: expand $P(B)$ with the law of total probability

$P(A\vert B) = \frac{P(B\vert A)P(A)}{P(B\vert A)P(A)+P(B\vert A^c)P(A^c)}$

Suppose there is a medical test to see if you have a virus

Test has 95% true positive rate

Test has 2% false positive rate

3% of the population has the virus

Probability you have the virus, given you test positive?
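
Writing $V$ for "have the virus" and $+$ for "test positive", Bayes Theorem gives:

$P(V\vert +) = \frac{P(+\vert V)P(V)}{P(+\vert V)P(V) + P(+\vert V^c)P(V^c)} = \frac{(0.95)(0.03)}{(0.95)(0.03) + (0.02)(0.97)} = \frac{0.0285}{0.0479} \approx 0.595$

So even after a positive test, the chance of having the virus is only about 60%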

Suppose we are doing some Natural Language Processing (NLP)

A noun is followed by a verb 40% of the time

A non-noun is followed by a verb 20% of the time

10% of all words are nouns

Given a verb, probability a noun preceded it?
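
Writing $N$ for "the previous word is a noun" and $V$ for "this word is a verb":

$P(N\vert V) = \frac{(0.4)(0.1)}{(0.4)(0.1) + (0.2)(0.9)} = \frac{0.04}{0.22} = \frac{2}{11} \approx 0.18$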

Bayesian Inference

$P(H\vert E) = \frac{P(E\vert H)P(H)}{P(E)}$

$H$ is the hypothesis, $E$ is the evidence

How does $H$ change given more and more evidence?
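
One way to see it: the posterior after evidence $E_1$ becomes the prior when the next piece of evidence $E_2$ arrives:

$P(H\vert E_1, E_2) = \frac{P(E_2\vert H, E_1)\,P(H\vert E_1)}{P(E_2\vert E_1)}$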

Take CMSC421 or CMSC422