VRProj: Delivering 360-degree Video With Viewport-Adaptive Truncation

WPMC 2022


Abstract
Delivering Virtual Reality (VR) content wirelessly involves projecting a 360-degree video onto a 2D format and then encoding it to meet wireless bitrate constraints. However, the popular equirectangular and cubemap projections offer little flexibility to adapt to changing bitrates and headset motion. In this work, we show that the truncated square pyramid projection offers high flexibility for adaptation to both network conditions and headset motion. We adapt by tuning a truncation parameter that controls the video quality of different spatial regions in the 360-degree video. Depending on the video, our scheme improves average video quality by up to 1.1 dB in PSNR and up to 4.6 VMAF points compared to a non-adaptive baseline.
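To give a feel for what "tuning a truncation parameter" can look like, below is a minimal, illustrative Python sketch. It is not VRProj's actual algorithm: the pixel-fraction formula, the linear rate model, and the candidate grid are all assumptions made purely for exposition of the general idea that smaller truncation shrinks the regions away from the viewport to fit a bitrate budget.

# Illustrative sketch only; the rate model and function names are
# assumptions, not the scheme described in the paper.

def tsp_pixel_fraction(t: float) -> float:
    """Rough fraction of full-resolution pixels kept by a truncated square
    pyramid whose back region is scaled by t (t = 1 keeps everything;
    smaller t shrinks the regions behind and beside the viewport)."""
    front = 0.25              # viewport-facing region stays at full quality
    rest = 0.75 * t * t       # remaining regions shrink quadratically with t
    return front + rest

def pick_truncation(bitrate_budget_mbps: float,
                    full_quality_rate_mbps: float,
                    candidates=(1.0, 0.8, 0.6, 0.4, 0.2)) -> float:
    """Choose the mildest truncation whose (assumed linear) rate estimate
    fits the current network budget."""
    for t in candidates:      # ordered from least to most truncation
        if full_quality_rate_mbps * tsp_pixel_fraction(t) <= bitrate_budget_mbps:
            return t
    return candidates[-1]     # fall back to the most aggressive truncation

if __name__ == "__main__":
    # Example: a 100 Mbps full-quality stream over a 60 Mbps link.
    print(pick_truncation(bitrate_budget_mbps=60, full_quality_rate_mbps=100))

In this toy setup the selector returns t = 0.6, i.e. the side and back regions are downscaled just enough for the estimated rate to fit the link, while the viewport region keeps full quality.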


Website template originally made by Phillip Isola and Richard Zhang for a colorful ECCV project; the code can be found here.