The Meaning of Cross Entropy

2018-06-08  美环花子若野

Cross entropy measures how different one probability distribution is from another. Classification models in deep learning typically output a probability for each class, so we can compare the distribution given by the true class label (distribution p, usually one-hot) with the class probabilities predicted by the model (distribution q). The more similar the two distributions are, the smaller the cross entropy.
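Concretely, the cross entropy between p and q is H(p, q) = -Σ p(x) log q(x), summed over the classes x. The minimal Python sketch below illustrates this definition (the function name `cross_entropy`, the `eps` guard, and the example probabilities are illustrative choices, not from the original post):

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """Cross entropy H(p, q) = -sum over x of p(x) * log(q(x)).

    p: true distribution (e.g. a one-hot label vector)
    q: predicted distribution (the model's class probabilities)
    eps guards against log(0) when the model assigns zero probability.
    """
    return -sum(pi * math.log(qi + eps) for pi, qi in zip(p, q))

# True class is class 0, encoded as a one-hot distribution.
p = [1.0, 0.0, 0.0]
q_good = [0.7, 0.2, 0.1]  # most mass on the correct class
q_bad  = [0.1, 0.6, 0.3]  # most mass on a wrong class

print(cross_entropy(p, q_good))  # ~0.357: q is close to p, small cross entropy
print(cross_entropy(p, q_bad))   # ~2.303: q is far from p, large cross entropy
```

Note that with a one-hot p the sum collapses to -log q(true class), which is why confident correct predictions drive the loss toward zero.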
