Understanding Transposed Convolution

2019-01-16 · 苟且偷生小屁屁

Where are transposed convolutions used?

  1. Transposed convolution is widely used in semantic segmentation: if pooling layers serve to reduce the resolution of feature maps, then after several pooling layers a transposed convolution is needed to restore that resolution.
  2. For example, in fully convolutional networks, up-sampling can be done with bilinear interpolation, but that kind of upsampling is not learnable; doing the upsampling with a deconvolution instead lets the network reach higher accuracy through learning (see the sketch after this list).
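
A minimal sketch of the contrast, assuming PyTorch (the batch size, channel count, feature-map size, and layer hyperparameters are all illustrative):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(1, 16, 8, 8)  # batch 1, 16 channels, 8x8 feature map

# Non-learnable upsampling: bilinear interpolation has fixed weights
up_bilinear = F.interpolate(x, scale_factor=2, mode="bilinear",
                            align_corners=False)

# Learnable upsampling: a transposed convolution with the same 2x scale,
# whose weights are trained together with the rest of the network
up_layer = nn.ConvTranspose2d(in_channels=16, out_channels=16,
                              kernel_size=4, stride=2, padding=1)
up_learned = up_layer(x)

print(up_bilinear.shape)  # torch.Size([1, 16, 16, 16])
print(up_learned.shape)   # torch.Size([1, 16, 16, 16])
```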

Why is transposed convolution (deconvolution) called "transposed"?

Consider a 3×3 kernel w sliding over a 4×4 input with stride 1 and no padding, which produces a 2×2 output. If the input is flattened into a 16-element column vector X and the output into a 4-element vector Y, the kernel can be written as a sparse matrix C:


C = \begin{pmatrix}
w_{0,0} & w_{0,1} & w_{0,2} & 0 & w_{1,0} & w_{1,1} & w_{1,2} & 0 & w_{2,0} & w_{2,1} & w_{2,2} & 0 & 0 & 0 & 0 & 0 \\
0 & w_{0,0} & w_{0,1} & w_{0,2} & 0 & w_{1,0} & w_{1,1} & w_{1,2} & 0 & w_{2,0} & w_{2,1} & w_{2,2} & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & w_{0,0} & w_{0,1} & w_{0,2} & 0 & w_{1,0} & w_{1,1} & w_{1,2} & 0 & w_{2,0} & w_{2,1} & w_{2,2} & 0 \\
0 & 0 & 0 & 0 & 0 & w_{0,0} & w_{0,1} & w_{0,2} & 0 & w_{1,0} & w_{1,1} & w_{1,2} & 0 & w_{2,0} & w_{2,1} & w_{2,2}
\end{pmatrix}
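
A small numpy sketch (the kernel values are arbitrary) that builds this C and checks that the matrix product reproduces a direct sliding-window convolution:

```python
import numpy as np

w = np.arange(1, 10, dtype=float).reshape(3, 3)  # an arbitrary 3x3 kernel

# Build the 4x16 matrix C: one row per pixel of the 2x2 output
C = np.zeros((4, 16))
for out_row in range(2):
    for out_col in range(2):
        for i in range(3):
            for j in range(3):
                C[out_row * 2 + out_col,
                  (out_row + i) * 4 + (out_col + j)] = w[i, j]

x = np.arange(16, dtype=float)  # flattened 4x4 input X
y = C @ x                       # convolution as a matrix product, Y = CX

# Cross-check against a direct sliding-window convolution
X = x.reshape(4, 4)
y_direct = np.array([[(w * X[r:r + 3, c:c + 3]).sum() for c in range(2)]
                     for r in range(2)]).ravel()
assert np.allclose(y, y_direct)
```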

The forward convolution is then just a matrix product, Y = CX. Running the operation in the other direction, mapping the 4-vector Y back to a 16-vector, uses the transpose of the same matrix:
X' = C^T Y
That is where the name "transposed" comes from. Note that C^T is not the inverse of C: the transposed convolution restores the shape of X, not its original values.
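
Continuing the numpy sketch above (reusing C and x), the transposed matrix carries a flattened 2×2 map back onto the 4×4 input grid:

```python
y = np.arange(4, dtype=float)  # a flattened 2x2 feature map Y
x_up = C.T @ y                 # transposed convolution: 4-vector -> 16-vector
print(x_up.reshape(4, 4))      # back on the 4x4 input grid

# C.T is not C's inverse: only the shape comes back, not the values
assert not np.allclose(C.T @ (C @ x), x)
```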

Drawbacks of transposed convolution

  1. As the matrix above shows, the convolution matrix is sparse, so storing it explicitly wastes a large amount of space on zero entries (a quick measurement follows this list);
  2. Explicitly forming the transpose of the convolution matrix and multiplying by it is very computationally expensive, which is why, in practice, frameworks implement transposed convolution as a direct operation rather than through this matrix.
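
How sparse is it? Measured on the toy C above, plus an illustrative count for a realistically sized layer:

```python
print((C == 0).mean())  # 0.4375: even the tiny 4x16 example is ~44% zeros

# For a 3x3 kernel on a 224x224 input (stride 1, no padding), each of the
# 222*222 rows of C has only 9 nonzeros out of 224*224 = 50176 columns:
print(f"{1 - 9 / (224 * 224):.4%} zeros")  # 99.9821% zeros
```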