The Softmax Function Explained

2021-11-27  LabVIEW_Python

The Softmax function, frequently encountered in deep learning, converts the logits layer — the raw outputs of a classification network's final layer, which can take any value in (-∞, +∞) — into a set of probabilities that sum to 1. Its mathematical definition is as follows:

Mathematical definition of Softmax:

    softmax(x)_i = exp(x_i) / Σ_j exp(x_j)
Python implementation (NumPy vs. TensorFlow comparison)
import numpy as np

def softmax(x):
    # Naive softmax: exponentiate, then normalize so the outputs sum to 1
    return np.exp(x) / np.sum(np.exp(x))

x = np.array([1.0, 2.0, 1.0])
y = softmax(x)
print(f"x: {x}; softmax(x): {y}; sum of y: {np.sum(y)}")

import tensorflow as tf

# The Keras Softmax layer computes the same function (in float32 by default)
softmax_layer = tf.keras.layers.Softmax()
Y = softmax_layer(x)
print(f"x: {x}; softmax(x): {Y}; sum of y: {tf.math.reduce_sum(Y)}")

Output:

x: [1. 2. 1.]; softmax(x): [0.21194156 0.57611688 0.21194156]; sum of y: 1.0
x: [1. 2. 1.]; softmax(x): [0.21194157 0.5761169 0.21194157]; sum of y: 1.0000001192092896

(NumPy computes in float64 here, while the TensorFlow layer computes in float32, which is why the TensorFlow sum deviates slightly from 1.)
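One caveat not shown above: the naive NumPy implementation overflows for large logits, since `np.exp` of a big number is `inf`. A common fix (an addition here, not part of the original post) is to subtract the maximum logit before exponentiating, which leaves the result mathematically unchanged:

```python
import numpy as np

def stable_softmax(x):
    # Subtracting the max makes the largest exponent exp(0) = 1,
    # preventing overflow without changing the resulting probabilities
    z = x - np.max(x)
    e = np.exp(z)
    return e / np.sum(e)

x = np.array([1000.0, 1001.0, 1002.0])
probs = stable_softmax(x)
print(probs)           # finite probabilities summing to 1
# The naive form np.exp(x) / np.sum(np.exp(x)) would overflow to inf/inf = nan here
```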

Its main use: turning the logits of a classification network into a probability distribution over classes, typically paired with a cross-entropy loss during training.
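As a minimal sketch of that pairing (the logits and class index below are illustrative, not from the original post):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax
    z = x - np.max(x)
    e = np.exp(z)
    return e / np.sum(e)

# Hypothetical logits from a 3-class classifier, and the true class index
logits = np.array([2.0, 1.0, 0.1])
true_class = 0

probs = softmax(logits)
# Cross-entropy loss: negative log-probability assigned to the true class
loss = -np.log(probs[true_class])
print(f"probs: {probs}; loss: {loss}")
```

The higher the probability the model assigns to the true class, the smaller the loss, which is exactly what gradient descent then pushes toward.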
