Probability for continuous spaces
f(x) = 1/360 for 0 < x <= 360
Example: the date/time you were born
P(x)= 0
f(x)= 0.0166
f(x <= noon) = 2 * f(x > noon)
a = 1/18 ≈ 0.0556 (density before noon)
b = 1/36 ≈ 0.0278 (density after noon, since 12a + 12b = 1)
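The before/after-noon quiz can be checked numerically; a minimal sketch, where `a` and `b` are the densities from the lines above:

```python
# Density before noon (a) is twice the density after noon (b),
# and the density must integrate to 1 over the 24-hour day:
#   12*a + 12*b = 1  with  a = 2*b  =>  b = 1/36, a = 1/18
b = 1 / 36
a = 2 * b
print(a)                # ≈ 0.0556
print(b)                # ≈ 0.0278
print(12 * a + 12 * b)  # sums to 1 (total probability)
```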
P(x) = 0
In a continuous distribution, every single outcome x has probability 0; only intervals of outcomes have nonzero probability.
P(c) = p0 = 0.1, P(¬c) = 0.9
P(Pos|c) = p1 = 0.9, P(Pos|¬c) = 0.1
P(Neg|¬c) = p2 = 0.8, P(Neg|c) = 0.2
P(Pos) = 0.09 + 0.18 = 0.27
def f(p0, p1, p2):
    return p0 * p1 + (1 - p0) * (1 - p2)

print(f(0.1, 0.9, 0.8))  # P(Pos) = 0.27
Programming Bayes Rule:
def f(p0, p1, p2):
    return p0 * p1 / (p0 * p1 + (1 - p0) * (1 - p2))

print(f(0.1, 0.9, 0.8))  # P(c|Pos) = 1/3

def f(p0, p1, p2):
    return p0 * (1 - p1) / (p0 * (1 - p1) + (1 - p0) * p2)

print(f(0.1, 0.9, 0.8))  # P(c|Neg) ≈ 0.0137

def f(p1, p2):
    return p1 * p2

print(f(0.5, 0.8))  # 0.4
Two coins:
c1: P(H|c1) = p1 = 0.5
c2: P(H|c2) = p2 = 0.9
P(c1) = p0 = 0.3
P(c2) = 1 - p0 = 0.7
P(H) = 0.3*0.5 + 0.7*0.9 = 0.15 + 0.63 = 0.78
def f(p0, p1, p2):
    return p0 * p1 + (1 - p0) * p2

print(f(0.3, 0.5, 0.9))  # P(H) = 0.78
P(R)=P(G)=0.5
P(seeR|atR)=0.8 P(seeR|¬R)=0.2
P(seeG|atG)=0.8 P(seeG|¬G)=0.2
P(atR|seeR)=0.8
P(atG|seeR)=0.2
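The localization quiz above is plain Bayes rule; a small sketch (variable names are mine, not the course's):

```python
# Robot in one of two cells, red (R) or green (G), uniform prior.
# The sensor reports the correct color 80% of the time.
prior_R = 0.5
p_seeR_atR = 0.8
p_seeR_atG = 0.2

# Total probability of seeing red, then Bayes rule.
p_seeR = prior_R * p_seeR_atR + (1 - prior_R) * p_seeR_atG
p_atR_seeR = prior_R * p_seeR_atR / p_seeR
print(p_atR_seeR)  # ≈ 0.8
```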
def f(p):
    return 1 - p

print(f(0.3))  # 0.7

def f(p):
    return p * p

print(f(0.5))  # 0.25
P(c) = 0.1, P(¬c) = 0.9
P(Pos|c) = 0.9, P(Neg|c) = 0.1
P(Neg|¬c) = 0.5, P(Pos|¬c) = 0.5
Test = Neg:
P(c, Neg) = 0.01
P(¬c, Neg) = 0.45
P(Neg) = 0.46
P(c|Neg) ≈ 0.0217
P(¬c|Neg) ≈ 0.9783
Test = Pos:
P(c, Pos) = 0.09
P(¬c, Pos) = 0.45
P(Pos) = 0.54
P(c|Pos) ≈ 0.167
P(¬c|Pos) ≈ 0.833
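Both posteriors follow the same prior-times-likelihood-then-normalize pattern; a sketch with a helper `posterior` (my name, not the course's):

```python
def posterior(prior, p_pos_c, p_pos_notc, positive=True):
    """P(c | test result) via joint probabilities and normalization."""
    if positive:
        joint_c = prior * p_pos_c
        joint_notc = (1 - prior) * p_pos_notc
    else:
        joint_c = prior * (1 - p_pos_c)
        joint_notc = (1 - prior) * (1 - p_pos_notc)
    return joint_c / (joint_c + joint_notc)

print(posterior(0.1, 0.9, 0.5, positive=False))  # P(c|Neg) ≈ 0.0217
print(posterior(0.1, 0.9, 0.5, positive=True))   # P(c|Pos) ≈ 0.167
```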
P(c)=0.01, P(¬c)=0.99
P(Pos|c)=0.9, P(Neg|c)=0.1
P(Neg|¬c)=0.9, P(Pos|¬c)=0.1
P(c, Neg) = 0.001
P(¬c,Neg)=0.891
P(Neg)=0.892
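The marginal P(Neg) above is the sum of the two joints; a quick numeric check (names are mine):

```python
prior = 0.01
p_neg_c = 0.1     # P(Neg|c) = 1 - P(Pos|c)
p_neg_notc = 0.9  # P(Neg|¬c)

p_c_and_neg = prior * p_neg_c              # P(c, Neg)  = 0.001
p_notc_and_neg = (1 - prior) * p_neg_notc  # P(¬c, Neg) = 0.891
print(p_c_and_neg + p_notc_and_neg)        # P(Neg) ≈ 0.892
```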
P(gone)=0.6
p(home)=0.4
P(rain|home) = 0.01, P(¬rain|home) = 0.99
P(rain|gone) = 0.3, P(¬rain|gone) = 0.7
P(home|rain) = 0.4*0.01 / (0.4*0.01 + 0.6*0.3) ≈ 0.0217
def f(p):
    return p

print(f(0.3))  # 0.3
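The P(home|rain) calculation above as a runnable sketch (variable names are mine):

```python
p_home, p_gone = 0.4, 0.6
p_rain_home, p_rain_gone = 0.01, 0.3

# Bayes rule: joint for "home and rain" over total probability of rain.
p_home_rain = (p_home * p_rain_home) / (
    p_home * p_rain_home + p_gone * p_rain_gone
)
print(p_home_rain)  # ≈ 0.0217
```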
Bayes Rule
Rev. Thomas Bayes
Example: P(c) = 0.01
The test is positive 90% of the time if you have c.
The test is negative 90% of the time if you don't have c.
test = positive
probability of having cancer: 8 1/3 % ≈ 0.0833
Bayes Rule
prior probability + test evidence -> posterior probability
prior
P(c)=0.01
P(pos|c)=0.9
p(neg|¬c)=0.9
posterior
P(c|Pos) ∝ P(c)·P(Pos|c) = 0.009 (non-normalized)
P(¬c|Pos) ∝ P(¬c)·P(Pos|¬c) = 0.099 (non-normalized)
normalize
P(pos)= P(c,Pos)+P(¬C,Pos)=0.108
posterior
P(c|Pos)=0.0833
P(¬c|Pos)=0.9167
This is the Bayes Rule algorithm: multiply the prior by the test evidence, then normalize.
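The prior → evidence → normalize steps above can be sketched as (the helper name `bayes` is mine):

```python
def bayes(prior, p_pos_c, p_pos_notc):
    """Posterior P(c|Pos), P(¬c|Pos) from prior and test evidence."""
    joint_c = prior * p_pos_c              # non-normalized P(c, Pos)
    joint_notc = (1 - prior) * p_pos_notc  # non-normalized P(¬c, Pos)
    norm = joint_c + joint_notc            # P(Pos)
    return joint_c / norm, joint_notc / norm

post_c, post_notc = bayes(0.01, 0.9, 0.1)
print(post_c)     # ≈ 0.0833
print(post_notc)  # ≈ 0.9167
```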
Dependent things
The outcome depends on the condition, so the conditional probability differs from the prior:
smart 0.5, dumb 0.5
P(P@S) = 0.001
P(P@S|SMART) = 0.002
P(P@S|DUMB) = 0.000
P(cancer)=0.1
P(¬cancer)=0.9
Dependence is conditional probability:
P(positive|cancer)= 0.9
P(negative|cancer)= 0.1
P(positive|¬cancer)=0.2
p(negative|¬cancer)=0.8
Cancer  Test  P( )
Y       P     0.09  (0.1*0.9)
Y       N     0.01  (0.1*0.1)
N       P     0.18  (0.9*0.2)
N       N     0.72  (0.9*0.8)

P(Positive Result) = 0.09 + 0.18 = 0.27
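The truth table above can be rebuilt programmatically; a sketch (the `rows` dict is my own layout):

```python
p_c = 0.1
p_pos_c, p_pos_notc = 0.9, 0.2

# Joint probability of each (cancer, test) combination.
rows = {
    ("Y", "P"): p_c * p_pos_c,                  # 0.09
    ("Y", "N"): p_c * (1 - p_pos_c),            # 0.01
    ("N", "P"): (1 - p_c) * p_pos_notc,         # 0.18
    ("N", "N"): (1 - p_c) * (1 - p_pos_notc),   # 0.72
}
print(sum(rows.values()))                    # ≈ 1.0 (all cases covered)
print(rows[("Y", "P")] + rows[("N", "P")])   # P(Positive) ≈ 0.27
```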
P(c) -> P(¬c) = 1 - P(c)
P(P|c) -> P(N|c) = 1 - P(P|c)
P(P|¬c) -> P(N|¬c) = 1 - P(P|¬c)
P(P) = P(P|c)·P(c) + P(P|¬c)·P(¬c)
Total probability
P(test|disease)
P(test) = P(test|disease)*P(disease) + P(test|¬disease)*P(¬disease)
Probability of event: P
P(A) = 1 – P(¬A)
P(H)= 0.5, P(H,H) = 0.25
P(H)=0.6, P(T)=0.4, P(H,H)= 0.36
P(exactly one H in 2 flips) = 0.5 -> HT, TH
P(H)=0.5, P(Exactly one H) in 3 Flips = 3/8
P(H)=0.6, P(Exactly one H) in 3 Flips = 0.288
(0.6*0.4*0.4) * 3 = 0.288
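The exactly-one-heads results can be brute-forced by enumerating all flip sequences; a sketch (function name is mine):

```python
from itertools import product

def p_exactly_one_head(p_h, flips=3):
    """Sum the probabilities of all sequences with exactly one H."""
    total = 0.0
    for seq in product("HT", repeat=flips):
        if seq.count("H") == 1:
            prob = 1.0
            for s in seq:
                prob *= p_h if s == "H" else (1 - p_h)
            total += prob
    return total

print(p_exactly_one_head(0.5))  # 3/8 = 0.375
print(p_exactly_one_head(0.6))  # ≈ 0.288
```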