Supervised Learning

Supervised learning: given m labeled training examples, each described by n features,

x_11, x_12, x_13, …, x_1n -> y_1
x_21, x_22, x_23, …, x_2n -> y_2
⋮
x_m1, x_m2, x_m3, …, x_mn -> y_m

find a function f with f(x_m) = y_m that also predicts y well for new x.
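
As a toy illustration of "finding f" (the 1-nearest-neighbor rule and the example data below are assumptions for this sketch, not part of the notes):

```python
# Toy supervised learner: m examples x n features -> labels.
# The 1-nearest-neighbor rule is just an illustrative choice of f.

def fit_1nn(X, y):
    """Return a function f with f(x_i) = y_i on the training examples."""
    def f(x):
        # Predict the label of the closest training row (squared distance).
        dists = [sum((a - b) ** 2 for a, b in zip(row, x)) for row in X]
        return y[dists.index(min(dists))]
    return f

# m = 4 examples, n = 2 features each
X = [(1.0, 1.0), (1.2, 0.8), (4.0, 4.2), (3.8, 4.0)]
y = ["ham", "ham", "spam", "spam"]

f = fit_1nn(X, y)
print(f((1.1, 0.9)))  # ham
print(f((4.1, 4.1)))  # spam
```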

OCCAM’S RAZOR
Everything else being equal, choose the less complex hypothesis.
In practice there is a trade-off: fit to the training data <-> low complexity.
An overly complex hypothesis fits the training data well but has a large generalization error, i.e. it overfits.
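
As an illustration of this trade-off, here is a small sketch with synthetic data and polynomial hypotheses of increasing degree (the data, noise level, and degrees are made up for this example): the high-degree fit drives training error toward zero while test error grows.

```python
# Fit <-> complexity trade-off on synthetic data (illustrative only):
# higher-degree polynomial hypotheses fit the training set better but
# generalize worse to fresh samples from the same underlying curve.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)
x_test = np.linspace(0, 1, 100)
y_test = np.sin(2 * np.pi * x_test)

for degree in (1, 3, 9):
    coeffs = np.polyfit(x_train, y_train, degree)            # fit hypothesis
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")
```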

SPAM, HAM
SPAM: "offer is secret", "click secret link", "secret sports link"
HAM: "play sports today", "went play sports", "secret sport event", "sport is today", "sport costs money"

P(spam) = 3/8   (3 of the 8 messages are spam)

MAXIMUM LIKELIHOOD
Observed label sequence: s s s h h h h h.   Let p(s) = P(y_i = spam) = π.
p(y_i) = π if y_i = spam (y_i = 1),  1 - π if y_i = ham (y_i = 0)
p(y_i) = π^{y_i} · (1 - π)^{1 - y_i}
p(data) = ∏_{i=1}^{8} p(y_i) = π^{count(y_i = 1)} · (1 - π)^{count(y_i = 0)} = π^3 · (1 - π)^5
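
Worked maximization step (my own fill-in of the algebra; it shows how the π that maximizes p(data) matches the P(spam) = 3/8 counted above):

```latex
% Maximizing p(data) = \pi^3 (1-\pi)^5 over \pi:
\begin{aligned}
\log p(\text{data}) &= 3\log\pi + 5\log(1-\pi) \\
\frac{d}{d\pi}\log p(\text{data}) &= \frac{3}{\pi} - \frac{5}{1-\pi} = 0
\;\Rightarrow\; 3(1-\pi) = 5\pi
\;\Rightarrow\; \pi = \frac{3}{8}
\end{aligned}
```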

ML solutions for the word likelihoods (the fraction of the words in each class that are "secret"):
P(“secret” | spam) = 3/9 = 1/3
P(“secret” | ham) = 1/15
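
A quick counting check in Python (a sketch that assumes the SPAM/HAM split listed above; the helper name p_word_given_class is mine):

```python
# Maximum-likelihood estimates by counting, using the eight messages above.
from fractions import Fraction

spam = ["offer is secret", "click secret link", "secret sports link"]
ham = ["play sports today", "went play sports", "secret sport event",
       "sport is today", "sport costs money"]

# Prior: fraction of messages that are spam.
print("P(spam) =", Fraction(len(spam), len(spam) + len(ham)))  # 3/8

def p_word_given_class(word, messages):
    """Fraction of all words in the class's messages equal to `word`."""
    words = [w for msg in messages for w in msg.split()]
    return Fraction(words.count(word), len(words))

print('P("secret" | spam) =', p_word_given_class("secret", spam))  # 1/3
print('P("secret" | ham) =', p_word_given_class("secret", ham))    # 1/15
```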