TY - GEN
T1 - Globally Variance-Constrained Sparse Representation for Rate-Distortion Optimized Image Representation
AU - Zhang, Xiang
AU - Ma, Siwei
AU - Lin, Zhouchen
AU - Zhang, Jian
AU - Wang, Shiqi
AU - Gao, Wen
PY - 2017/4
Y1 - 2017/4
N2 - Sparse representation can efficiently approximate signals as a linear combination of a few bases from an over-complete dictionary. However, in the scenario of data compression, its efficiency and popularity are hindered by the extra overhead of encoding the sparse coefficients. Therefore, establishing an accurate rate model for sparse coding and dictionary learning becomes meaningful, which has not been fully exploited in the context of sparse representation. According to the Shannon entropy inequality, the variance of a data source bounds its entropy and thus reflects the actual coding bits. Therefore, a Globally Variance-Constrained Sparse Representation (GVCSR) model is proposed, where a variance-constrained rate term is introduced into the conventional sparse representation. To solve the non-convex optimization problem, we employ the Alternating Direction Method of Multipliers (ADMM) for both sparse coding and dictionary learning, which have shown state-of-the-art rate-distortion performance in image representation.
AB - Sparse representation can efficiently approximate signals as a linear combination of a few bases from an over-complete dictionary. However, in the scenario of data compression, its efficiency and popularity are hindered by the extra overhead of encoding the sparse coefficients. Therefore, establishing an accurate rate model for sparse coding and dictionary learning becomes meaningful, which has not been fully exploited in the context of sparse representation. According to the Shannon entropy inequality, the variance of a data source bounds its entropy and thus reflects the actual coding bits. Therefore, a Globally Variance-Constrained Sparse Representation (GVCSR) model is proposed, where a variance-constrained rate term is introduced into the conventional sparse representation. To solve the non-convex optimization problem, we employ the Alternating Direction Method of Multipliers (ADMM) for both sparse coding and dictionary learning, which have shown state-of-the-art rate-distortion performance in image representation.
UR - http://www.scopus.com/inward/record.url?scp=85020038585&partnerID=8YFLogxK
UR - https://www.scopus.com/record/pubmetrics.uri?eid=2-s2.0-85020038585&origin=recordpage
U2 - 10.1109/DCC.2017.63
DO - 10.1109/DCC.2017.63
M3 - RGC 32 - Refereed conference paper (with host publication)
SN - 9781509067213
SP - 380
EP - 389
BT - Proceedings - DCC 2017, 2017 Data Compression Conference
A2 - Bilgin, Ali
A2 - Marcellin, Michael W.
A2 - Serra-Sagrista, Joan
A2 - Storer, James A.
PB - IEEE
T2 - 2017 Data Compression Conference (DCC 2017)
Y2 - 4 April 2017 through 7 April 2017
ER -