Dynamic Network Surgery

2017-10-25  信步闲庭v

Approach

Song Han et al. recently proposed to compress DNNs by deleting unimportant parameters and retraining the remaining ones. This approach has two main issues. The first is the risk of irretrievable damage: a connection pruned by mistake can never be recovered. The second is learning inefficiency, since the prune-and-retrain cycle has to be repeated many times.

To address these issues, the paper proposes to sever redundant connections by means of continual network maintenance, which it calls dynamic network surgery. The method involves two key operations, pruning and splicing; a minimal sketch of the resulting mask update is given below.
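As a rough illustration of the two operations, here is a minimal NumPy sketch of a per-layer mask update. The function name `update_mask` and the thresholds `a` and `b` are hypothetical names chosen for this sketch, and the concrete values are illustrative only; the point is that pruning switches weak connections off while splicing can switch them back on, so no pruning decision is final.

```python
import numpy as np

def update_mask(W, T, a, b):
    """One pruning/splicing pass over a layer's weights (sketch).

    W    : weight matrix of the layer
    T    : binary mask of the same shape (1 = connection kept, 0 = pruned)
    a, b : lower/upper magnitude thresholds (a < b); illustrative values,
           not taken from the paper
    """
    mag = np.abs(W)
    T = T.copy()
    # Pruning: connections whose magnitude falls below a are switched off.
    T[mag < a] = 0
    # Splicing: connections whose magnitude grows above b are switched back on,
    # even if they were pruned earlier; values in between keep their old state.
    T[mag > b] = 1
    return T

# Toy usage: the weights W keep being updated by SGD on the masked network
# (forward pass uses W * T), while the mask itself is refreshed periodically.
W = np.random.randn(4, 4) * 0.1
T = np.ones_like(W)
T = update_mask(W, T, a=0.05, b=0.1)
```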

Taking the k-th layer as an example, the paper solves the following optimization problem, in which h(·) is a discriminative function satisfying h(w) = 1 if the parameter w appears to be crucial in the current layer, and h(w) = 0 otherwise.
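Written out (following the formulation of the cited NIPS 2016 paper), the per-layer problem is:

$$
\min_{\mathbf{W}_k,\,\mathbf{T}_k} \; L\big(\mathbf{W}_k \odot \mathbf{T}_k\big)
\quad \text{s.t.} \quad \mathbf{T}_k^{(i,j)} = h_k\big(\mathbf{W}_k^{(i,j)}\big),\ \forall (i,j),
$$

where W_k holds the weights of the k-th layer, T_k is a binary mask of the same shape produced by h_k, ⊙ is the Hadamard (entry-wise) product, and L is the network loss. Pruning and splicing then amount to flipping entries of T_k while W_k continues to be trained on the masked network.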

Experiment


References:
Yiwen Guo et al., "Dynamic Network Surgery for Efficient DNNs", NIPS 2016.
