@BerivanISIK
Berivan Isik
3 years
We derived the information-theoretic limit of model compression and showed that this limit can only be achieved when the reconstructed model is sparse (pruned).
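The thread does not spell out the formulation, but the "information-theoretic limit" is presumably a rate-distortion bound on compressing the model parameters. As a minimal sketch, the standard rate-distortion function for a weight variable W with reconstruction Ŵ is written below; the choice of source distribution and distortion measure d are assumptions here, not taken from the thread.

```latex
% Standard rate-distortion function: the minimum rate (bits per parameter)
% achievable at expected distortion at most D. The distortion measure d and
% the distribution of W are assumptions, not specified in the thread.
R(D) \;=\; \min_{P_{\hat{W}\mid W}\,:\,\mathbb{E}\!\left[d(W,\hat{W})\right]\le D}\; I\!\left(W;\hat{W}\right)
```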

Replies

@BerivanISIK
Berivan Isik
3 years
I will give a talk on our recent work on information-theoretic model compression at the Sparsity in Neural Networks Workshop @sparsenn on Friday.
@BerivanISIK
Berivan Isik
3 years
We also developed a novel model compression method (called SuRP), guided by this information-theoretic formulation, which indeed outputs a sparse model without an explicit pruning step.
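SuRP itself is not described in the thread, so the snippet below is only a toy Python sketch of the qualitative claim, not the actual algorithm: when a reconstruction is built under a bit budget, low-magnitude weights receive no bits and come out exactly zero, so the result is sparse without a separate pruning step. The function name, bit accounting, and Laplacian weight model are all hypothetical.

```python
import numpy as np

def sparse_reconstruct(weights, bit_budget, bits_per_entry=32):
    """Toy illustration (NOT the SuRP algorithm): reconstruct a weight vector
    under a bit budget by spending bits only on the largest-magnitude entries.
    Entries that receive no bits are reconstructed as exactly zero, so the
    output is sparse even though no explicit pruning step is applied."""
    n_kept = bit_budget // bits_per_entry            # entries we can afford to encode
    recon = np.zeros_like(weights)
    if n_kept > 0:
        idx = np.argsort(np.abs(weights))[-n_kept:]  # largest-magnitude positions
        recon[idx] = weights[idx]                    # "spend" bits on these entries only
    return recon

# Example: a dense, Laplacian-like weight vector reconstructed under a tight budget.
rng = np.random.default_rng(0)
w = rng.laplace(scale=1.0, size=1000)
w_hat = sparse_reconstruct(w, bit_budget=32 * 50)    # budget for ~50 of 1000 entries
print("sparsity of reconstruction:", np.mean(w_hat == 0.0))
```

With a budget covering about 5% of the entries, the reconstruction comes out roughly 95% zeros, which is the qualitative behavior the tweet describes: sparsity emerges from the rate constraint rather than from a pruning pass.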
@BerivanISIK
Berivan Isik
3 years
Check out our preprint for more details:
Registration to the @sparsenn workshop is free: