
Spherical domain rate-distortion optimization for 360-degree video coding

Yiming Li     Jizheng Xu     Zhenzhong Chen    




Abstract

Emerging virtual reality (VR) applications bring great challenges to video coding for 360-degree videos. To compress this kind of video, each picture is first projected onto a 2-D plane (e.g. an equirectangular projection map) to fit the input of existing video coding systems. At the display side, an inverse projection is performed before viewport rendering. However, such a projection introduces very different levels of distortion depending on the location, which makes the rate-distortion optimization process in video coding much less efficient. In this paper, we consider the distortion in the spherical domain and analyze its influence on the rate-distortion optimization process. We then derive the optimal rate-distortion relationship in the spherical domain and present its optimal solution based on HEVC/H.265. Experimental results show that the proposed method can bring up to 11.5% bit savings compared with the current HEVC/H.265 anchor for 360-degree video coding.

 


Our Method

To express and compress these spherical 360-degree videos, it is currently necessary to map the spherical information onto a 2-D image before encoding. A standard video coding framework such as H.264/AVC or HEVC/H.265 can then encode these unfolded 360-degree videos directly. Unfortunately, these projections introduce very different levels of distortion depending on the location; i.e., it is inaccurate to use PSNR in the 2-D plane as an indicator of the distortion in the spherical domain. At a recent meeting of the Joint Video Exploration Team (JVET) of ITU-T SG16 WP3 and ISO/IEC JTC1/SC29/WG11, the committee recommended spherical-domain indicators for evaluating 360-degree video. However, this mismatch remains a problem in the rate-distortion optimization (RDO) process of the encoder.
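One such spherical-domain indicator is WS-PSNR, which down-weights the pixels that the equirectangular projection stretches near the poles. The sketch below is a minimal illustration of that idea (function names `erp_weights` and `ws_psnr` are ours, not from any reference software); it weights each row by the cosine of its latitude before averaging the squared error.

```python
import numpy as np

def erp_weights(height, width):
    """Per-pixel weights for an equirectangular (ERP) map.

    Rows near the poles are oversampled by the projection, so each
    row is weighted by the cosine of its latitude (1 at the equator,
    approaching 0 at the poles).
    """
    j = np.arange(height)
    lat = (j + 0.5 - height / 2) * np.pi / height  # latitude of each row
    return np.tile(np.cos(lat)[:, None], (1, width))

def ws_psnr(ref, dist, max_val=255.0):
    """Weighted-spherical PSNR between two ERP frames (H x W arrays)."""
    ref = np.asarray(ref, dtype=np.float64)
    dist = np.asarray(dist, dtype=np.float64)
    w = erp_weights(*ref.shape)
    wmse = np.sum(w * (ref - dist) ** 2) / np.sum(w)
    return 10.0 * np.log10(max_val ** 2 / wmse)
```

Because the weights cancel when the error is uniform, a constant error of 1 gives the same value as ordinary PSNR; the metrics diverge only when the error is concentrated near the poles or the equator.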

Given this mismatch between RDO and quality evaluation, we propose in this paper a spherical-domain rate-distortion optimization method that improves upon the current HEVC/H.265.
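To give the flavor of such an approach (this is our own simplified sketch, not the exact derivation in the paper): if the spherical distortion of a block is approximately its 2-D SSE scaled by the block's mean ERP weight w, then minimizing w·SSE + λ·R is equivalent to the standard cost SSE + (λ/w)·R, so the encoder can keep its usual mode decision and simply scale the Lagrange multiplier per block row. The helper name `block_lambda` is hypothetical.

```python
import numpy as np

def block_lambda(lmbda, row0, rows, height):
    """Scale the 2-D Lagrange multiplier for a block spanning
    ERP rows [row0, row0 + rows) in a frame of the given height.

    Blocks near the poles have a small mean weight, so lambda grows
    there: the encoder spends fewer bits where projection distortion
    matters less on the sphere.
    """
    j = np.arange(row0, row0 + rows)
    w = np.cos((j + 0.5 - height / 2) * np.pi / height)  # row weights
    return lmbda / w.mean()
```

In a real encoder this scaling would feed into the per-CTU QP/lambda adjustment; the point of the sketch is only the equivalence between weighted distortion and a rescaled multiplier.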

 

Our Results

Fig. 1: 360-degree video testing procedure (recommended by JVET).

 

Fig. 2: Network size effects in City A and B.

Fig. 3: Rate WS-PSNR performance comparison (anchor: HM 16.9 with QP adjustment, RA configuration).

Fig. 4: Subjective visual quality comparison of viewports. (a) shows the viewport image and its enlarged patches rendered from the original map. (b) shows the images from our proposed scheme. (c) shows the images from HM 16.9 with QP adjustment. (Bit rate of the proposed scheme: 5926.38 kbps; bit rate of HM 16.9 with QP adjustment: 6031.92 kbps.)



Copyright © 2016 Institute of Intelligent Sensing and Computing (IISC). All rights reserved.
