Bayesian Decision Theory

2019-01-25  cccshuang

Bayesian

Bayes's Theorem

P(A|B) = \frac{P(B|A)P(A)}{P(B)}
prior: P(\omega)
likelihood: P(x|\omega)
posterior: P(\omega_i|x) = \frac{P(x|\omega_i)P(\omega_i)}{P(x)} = \frac{P(x|\omega_i)P(\omega_i)}{\sum_{j=1}^k P(x|\omega_j)P(\omega_j)}
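A minimal numeric sketch of these definitions; the priors and likelihoods below are made-up example values, not from the post:

```python
# Posterior via Bayes' theorem:
# P(w_i|x) = P(x|w_i)P(w_i) / sum_j P(x|w_j)P(w_j)
priors = [0.6, 0.4]       # hypothetical P(w_1), P(w_2)
likelihoods = [0.2, 0.7]  # hypothetical P(x|w_1), P(x|w_2) for one observed x

evidence = sum(l * p for l, p in zip(likelihoods, priors))  # P(x)
posteriors = [l * p / evidence for l, p in zip(likelihoods, priors)]
print(posteriors)  # [0.3, 0.7] -- posteriors sum to 1
```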

Optimal Bayes Decision Rule: minimize the probability of error.
    if P(\omega_1|x) > P(\omega_2|x) then true state of nature = \omega_1;
    if P(\omega_1|x) < P(\omega_2|x) then true state of nature = \omega_2.

Proof: For a particular x,
        P(error|x) = P(\omega_1|x) if we decide \omega_2;
        P(error|x) = P(\omega_2|x) if we decide \omega_1.
Bayes Decision Rule: Decide \omega_1 if P(\omega_1|x) > P(\omega_2|x); otherwise decide \omega_2.
Therefore: P(error|x) = min[P(\omega_1|x),P(\omega_2|x)].
The unconditional error P(error) is obtained by integrating P(error|x) over all x: P(error) = \int P(error|x)p(x)dx.
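The minimum-error rule can be sketched directly; the posterior values are hypothetical:

```python
def bayes_decide(posteriors):
    """Minimum-error rule: pick the class with the largest posterior."""
    return max(range(len(posteriors)), key=lambda i: posteriors[i])

posteriors = [0.3, 0.7]             # hypothetical P(w_1|x), P(w_2|x)
decision = bayes_decide(posteriors)  # index 1, i.e. w_2
p_error = min(posteriors)            # P(error|x) = min[P(w_1|x), P(w_2|x)]
```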

Bayesian Decision Theory

c states of nature: \{\omega_1,\omega_2,\cdots,\omega_c\}
a possible actions: \{\alpha_1,\alpha_2,\cdots,\alpha_a\}
the loss for taking action \alpha_i when the true state of nature is \omega_j: \lambda(\alpha_i|\omega_j)
R(\alpha_i|x) = \sum_{j=1}^{c}\lambda(\alpha_i|\omega_j)P(\omega_j|x)
Select the action for which the conditional risk R(\alpha_i|x) is minimum.
Bayes Risk: R = \int R(\alpha(x)|x)p(x)dx, the expected loss incurred when the decision rule \alpha(x) is applied over all x.
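A sketch of risk-minimizing action selection; the loss matrix \lambda(\alpha_i|\omega_j) and the posteriors are hypothetical, including a "reject" action with a small fixed loss:

```python
# Rows: actions a_i; columns: states w_j. lam[i][j] = loss of a_i when w_j is true.
lam = [[0.0, 1.0],   # a_1: classify as w_1
       [1.0, 0.0],   # a_2: classify as w_2
       [0.3, 0.3]]   # a_3: reject (hypothetical fixed loss)
posteriors = [0.55, 0.45]  # hypothetical P(w_1|x), P(w_2|x)

# Conditional risk R(a_i|x) = sum_j lam[i][j] * P(w_j|x)
risks = [sum(l * p for l, p in zip(row, posteriors)) for row in lam]
best = min(range(len(risks)), key=lambda i: risks[i])  # action with minimum risk
```

With these numbers the posteriors are too close to call, so rejecting (risk 0.3) beats either classification (risks 0.45 and 0.55), which is exactly why a reject option is included in loss matrices.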

Binary classification \longrightarrow Multi-class classification

       f1  f2  f3
    c1 -1   1  -1
    c2  1  -1  -1
    c3 -1   1   1
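The table assigns each class c_i a ±1 codeword over the binary classifiers f_1..f_3. The post does not spell out the decoding step, but a common choice (an assumption here) is to pick the class whose codeword is closest in Hamming distance to the classifiers' outputs:

```python
# Codewords from the table: one row of +/-1 per class, one column per classifier.
codes = {"c1": [-1, 1, -1],
         "c2": [1, -1, -1],
         "c3": [-1, 1, 1]}

def decode(outputs):
    """Pick the class whose codeword disagrees with outputs in the fewest positions."""
    return min(codes, key=lambda c: sum(a != b for a, b in zip(codes[c], outputs)))

print(decode([-1, 1, 1]))  # exact match with row c3 -> 'c3'
```

Because decoding takes the nearest codeword, this scheme can still recover the right class when one binary classifier flips its output, provided the codewords are far enough apart.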