A novel mode decision for depth map coding in 3D-AVS

Jing Su, Falei Luo, Shanshe Wang, Shiqi Wang, Xiaoqiang Guo, Siwei Ma

Research output: Chapters, Conference Papers, Creative and Literary Works › RGC 32 - Refereed conference paper (with host publication) › peer-review

Abstract

In this paper, a new mode decision scheme is proposed for depth map coding in 3D-AVS. The novelty of the paper lies mainly in the following two points. First, an improved distortion estimation model for synthesized views is proposed. Second, for the mode decision of depth map coding, the distortion is represented as the weighted sum of the depth distortion and the estimated distortion of the synthesized view, and a new scheme is proposed to derive the weighting factors adaptively based on the disparity. This distortion is then used to compute the rate-distortion cost for mode decision. Experimental results demonstrate that the proposed scheme achieves a remarkable performance improvement in 3D-AVS, with an average BD-rate gain of about 12%.
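
The mode decision described in the abstract amounts to evaluating a rate-distortion cost J = D + λR, where D is a disparity-adaptive weighted sum of the depth distortion and an estimate of the synthesized-view distortion. The following Python sketch is only an illustration of that idea under stated assumptions: the function names, the specific synthesized-view distortion estimator, and the particular weight mapping from disparity are hypothetical and are not the authors' actual formulation.

```python
import numpy as np

def estimate_synth_distortion(depth_orig, depth_rec, texture_grad, disparity_scale):
    """Hypothetical estimator: a depth error shifts the warped pixel by roughly
    disparity_scale * error, so the induced synthesized-view distortion is
    approximated by that shift times the local texture gradient
    (a common modelling assumption, not the paper's exact model)."""
    depth_err = np.abs(depth_orig.astype(np.float64) - depth_rec.astype(np.float64))
    return np.sum((disparity_scale * depth_err * texture_grad) ** 2)

def rd_cost(depth_orig, depth_rec, texture_grad, rate, lam, disparity_scale):
    """RD cost with distortion taken as a disparity-adaptive weighted sum of
    depth distortion and estimated synthesized-view distortion."""
    d_depth = np.sum((depth_orig.astype(np.float64) - depth_rec.astype(np.float64)) ** 2)
    d_synth = estimate_synth_distortion(depth_orig, depth_rec, texture_grad, disparity_scale)
    # Assumed weighting: give the synthesized-view term more weight as disparity grows.
    w_synth = disparity_scale / (1.0 + disparity_scale)
    w_depth = 1.0 - w_synth
    distortion = w_depth * d_depth + w_synth * d_synth
    return distortion + lam * rate

# Toy usage: select the coding mode with the smallest RD cost.
modes = {
    "intra": {"rec": np.full((4, 4), 100.0), "rate": 120.0},
    "skip":  {"rec": np.full((4, 4), 98.0),  "rate": 10.0},
}
orig = np.full((4, 4), 99.0)
grad = np.ones((4, 4))
best = min(modes, key=lambda m: rd_cost(orig, modes[m]["rec"], grad,
                                        modes[m]["rate"], lam=50.0,
                                        disparity_scale=0.5))
print("selected mode:", best)
```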
Original language: English
Title of host publication: VCIP 2016: the 30th Anniversary of Visual Communication and Image Processing
Publisher: IEEE
ISBN (Electronic): 978-1-5090-5316-2
ISBN (Print): 978-1-5090-5317-9
Publication status: Published - Nov 2016
Externally published: Yes
Event: VCIP 2016 - International Conference on Visual Communications and Image Processing, Chengdu, China
Duration: 27 Nov 2016 - 30 Nov 2016
http://www.wikicfp.com/cfp/servlet/event.showcfp?eventid=51810&copyownerid=85341

Publication series

Name: Visual Communications and Image Processing

Conference

Conference: VCIP 2016 - International Conference on Visual Communications and Image Processing
Abbreviated title: VCIP 2016
Country/Territory: China
City: Chengdu
Period: 27/11/16 - 30/11/16

Research Keywords

  • 3D-AVS
  • Depth distortion
  • depth map coding
  • disparity
  • view synthesized distortion estimation
