Deep compression of remotely rendered views
Research output: Publication in refereed journal › peer-review
Journal / Publication: IEEE Transactions on Multimedia
Publication status: Published - Jun 2006
Link to Scopus: https://www.scopus.com/record/display.uri?eid=2-s2.0-33646746032&origin=recordpage
Three-dimensional (3-D) models are information-rich and provide compelling visualization effects. However, the cost of downloading and viewing 3-D scenes over a network may be excessive. In addition, low-end devices typically lack the processing power and/or memory to render such scenes interactively in real time. Alternatively, 3-D image warping, an image-based-rendering technique that renders a two-dimensional (2-D) depth view to synthesize new views from different viewpoints and/or orientations, may be employed on a limited device. In a networked 3-D environment, the warped views may be periodically compensated by graphically rendered views transmitted to the clients. Depth views can thus be considered a compact model of a 3-D scene, enabling the remote rendering of complex 3-D environments on relatively low-end devices. The major overhead of the 3-D image warping environment is the transmission of the initial and subsequent reference depth views. This paper addresses this issue by presenting an effective remote rendering environment based on the deep compression of depth views, exploiting the context statistics present in depth views. The effect on warped image quality of reducing the resolution of the depth map is also explored. It is shown that the proposed deep compression of the remotely rendered view significantly outperforms JPEG2000 and enables real-time rendering of remote 3-D scenes, while the degradation of warped image quality is visually imperceptible for the benchmark scenes. © 2006 IEEE.
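The 3-D image warping described in the abstract reprojects each pixel of a reference depth view into a new camera pose. The following is a minimal sketch of that idea, not the paper's actual implementation: it assumes a pinhole camera with hypothetical intrinsics `K` shared by both views and a relative pose `(R, t)`, and uses a simple z-buffered forward splat rather than the epipolar-ordered warping of McMillan-style algorithms.

```python
import numpy as np

def warp_depth_view(color, depth, K, R, t):
    """Forward-warp a reference color+depth view to a new camera pose.

    color: (H, W, 3) reference image
    depth: (H, W) per-pixel depth in the reference camera frame
    K:     (3, 3) pinhole intrinsics (assumed shared by both views)
    R, t:  rotation (3, 3) and translation (3,) of the target view
           relative to the reference view
    """
    H, W = depth.shape
    # Pixel grid in homogeneous coordinates, shape (3, H*W).
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T

    # Back-project to 3-D points in the reference camera frame.
    pts = np.linalg.inv(K) @ pix * depth.reshape(1, -1)

    # Transform into the target camera frame and re-project.
    proj = K @ (R @ pts + t.reshape(3, 1))
    z = proj[2]
    valid = z > 1e-6
    un = np.round(np.divide(proj[0], z, where=valid, out=np.zeros_like(z))).astype(int)
    vn = np.round(np.divide(proj[1], z, where=valid, out=np.zeros_like(z))).astype(int)

    # Forward splat with a z-buffer so nearer points win.
    out = np.zeros_like(color)
    zbuf = np.full((H, W), np.inf)
    src = color.reshape(-1, 3)
    inb = valid & (un >= 0) & (un < W) & (vn >= 0) & (vn < H)
    for i in np.flatnonzero(inb):
        if z[i] < zbuf[vn[i], un[i]]:
            zbuf[vn[i], un[i]] = z[i]
            out[vn[i], un[i]] = src[i]
    return out
```

Because only `depth` (plus pose parameters) must be sent to produce new views, compressing the depth map aggressively, as the paper proposes, directly reduces the transmission overhead that dominates this remote-rendering pipeline.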
Keywords: 3-D image warping, Context modeling, Epipolar geometry, Image compression and streaming, Image-based-rendering, Virtual reality