Soft weight-sharing for DNN
2017-10-26
信步闲庭v
Approach
The network is re-trained under an objective that combines the usual data-fitting term with a complexity penalty from the prior,

$$\mathcal{L}(w) = -\log p(\mathcal{D} \mid w) - \tau \log p(w),$$

where p(w) is the prior over w and p(D|w) is the model likelihood. The prior is a mixture of Gaussians shared across all weights,

$$p(w) = \prod_i \sum_{j=0}^{J} \pi_j \, \mathcal{N}(w_i \mid \mu_j, \sigma_j^2),$$

whose means, variances, and mixing proportions are learned jointly with the weights. One component is fixed at $\mu_0 = 0$ with a large mixing proportion, so many weights are pulled towards zero (pruning) while the rest cluster around the learned component means (quantization).
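A minimal sketch of how the complexity term could be computed in practice, assuming a PyTorch setup; the function name `gmm_neg_log_prior` and the flattened-weights interface are illustrative, not taken from the paper:

```python
import math
import torch

def gmm_neg_log_prior(weights, pi, mu, log_sigma):
    """Return -log p(w) under a shared 1-D Gaussian mixture prior.

    weights   : flat tensor of network weights, shape (N,)
    pi        : mixing proportions, shape (J,), summing to 1
    mu        : component means, shape (J,)
    log_sigma : component log standard deviations, shape (J,)
    """
    w = weights.reshape(-1, 1)                       # (N, 1)
    # log N(w_i | mu_j, sigma_j^2) for every weight/component pair -> (N, J)
    log_norm = (-0.5 * ((w - mu) / log_sigma.exp()) ** 2
                - log_sigma
                - 0.5 * math.log(2.0 * math.pi))
    # log sum_j pi_j N(w_i | mu_j, sigma_j^2), computed stably, summed over weights
    log_mix = torch.logsumexp(torch.log(pi) + log_norm, dim=1)
    return -log_mix.sum()

# Hypothetical re-training step, weighting the complexity term by tau as in
# L(w) = -log p(D|w) - tau * log p(w):
# loss = task_loss + tau * gmm_neg_log_prior(flat_weights, pi, mu, log_sigma)
```

Both the weights and the mixture parameters (pi, mu, log_sigma) would be optimized together during re-training.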
After re-training, each weight is set to the mean of the mixture component that takes the most responsibility for it, so the weights collapse onto at most J+1 distinct values; weights assigned to the zero component can be pruned away.
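A sketch of this post-training quantization step under the same assumptions as above (the function name is illustrative): responsibilities are proportional to pi_j N(w_i | mu_j, sigma_j^2), and each weight is snapped to the mean of its argmax component.

```python
import math
import torch

def quantize_to_component_means(weights, pi, mu, log_sigma):
    """Replace each weight by the mean of the mixture component with the
    highest responsibility for it (post-training quantization step)."""
    w = weights.reshape(-1, 1)                       # (N, 1)
    log_norm = (-0.5 * ((w - mu) / log_sigma.exp()) ** 2
                - log_sigma
                - 0.5 * math.log(2.0 * math.pi))
    # responsibility r_ij ∝ pi_j N(w_i | mu_j, sigma_j^2); take argmax over j
    assignment = torch.argmax(torch.log(pi) + log_norm, dim=1)  # (N,)
    return mu[assignment].reshape(weights.shape)
```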
Experiment
References:
Karen Ullrich, Edward Meeds, Max Welling. Soft Weight-Sharing for Neural Network Compression. ICLR 2017.