Edge-assisted user-centric real-time 3D remote near-eye rendering for AR/MR headsets

Authors: Bishakha Rani Biswas, Xueyu Hou, Yongjie Guan
Status: Final
Date of publication: 25 June 2025
Published in: ITU Journal on Future and Evolving Technologies, Volume 6 (2025), Issue 2, Pages 162-169
Article DOI: https://doi.org/10.52953/CCGV8693
Abstract:
Augmented Reality and Mixed Reality (AR/MR) headsets are transforming computing by enabling immersive 3D experiences, yet inherent size and power limitations prevent them from matching desktop systems in delivering complex graphics. As a result, many graphics-intensive applications cannot run natively on these devices. Remote rendering offers a promising alternative by offloading heavy 3D graphics computations to a server and streaming the rendered results to AR/MR headsets. However, conventional remote rendering approaches often suffer from considerable interaction latency over wireless networks, making them unsuitable for latency-sensitive applications. This paper introduces a novel low-latency remote rendering system that enables real-time 3D graphics on AR/MR headsets. By leveraging image-based rendering with advanced 3D image warping techniques, our system synthesizes headset displays from server-generated depth images. Experimental results demonstrate that our approach significantly reduces interaction latency while maintaining high rendering quality, achieved through the careful optimization of multiple-depth image generation strategies.
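
The abstract's core idea, 3D image warping, reprojects a server-rendered color-plus-depth image into the headset's current viewpoint so the client can synthesize a fresh frame without a server round trip. The following is a minimal sketch of generic forward depth-image warping, assuming a pinhole camera model, shared intrinsics `K`, and camera-to-world poses; all names are illustrative and this is not the paper's actual pipeline, which optimizes over multiple depth images.

```python
# Sketch of forward 3D image warping: reproject a server depth image into the
# headset's view. Illustrative only; the paper's multi-depth-image system is
# more involved. Assumes a pinhole camera and shared intrinsics K.
import numpy as np

def warp_depth_image(color, depth, K, src_pose, dst_pose):
    """color: (H, W, 3) image rendered at the server viewpoint
    depth: (H, W) float depth in meters at the server viewpoint
    K: (3, 3) camera intrinsics; src_pose, dst_pose: (4, 4) camera-to-world
    Returns the warped image and a hole mask marking disoccluded pixels."""
    H, W = depth.shape
    # Back-project every source pixel to a 3D point in camera space.
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3).T  # 3 x N
    cam_pts = np.linalg.inv(K) @ pix * depth.reshape(1, -1)            # 3 x N
    # Lift to world space, then transform into the destination camera.
    cam_h = np.vstack([cam_pts, np.ones((1, cam_pts.shape[1]))])       # 4 x N
    world = src_pose @ cam_h
    dst_cam = np.linalg.inv(dst_pose) @ world
    z = dst_cam[2]
    proj = K @ dst_cam[:3]
    x = np.round(proj[0] / np.maximum(z, 1e-6)).astype(int)
    y = np.round(proj[1] / np.maximum(z, 1e-6)).astype(int)
    # Z-buffered splat: keep the nearest surface where pixels collide.
    out = np.zeros_like(color)
    zbuf = np.full((H, W), np.inf)
    valid = (x >= 0) & (x < W) & (y >= 0) & (y < H) & (z > 0)
    src_colors = color.reshape(-1, 3)
    for i in np.flatnonzero(valid):
        if z[i] < zbuf[y[i], x[i]]:
            zbuf[y[i], x[i]] = z[i]
            out[y[i], x[i]] = src_colors[i]
    holes = np.isinf(zbuf)  # disocclusions never covered by any source pixel
    return out, holes
```

The `holes` mask shows why a single depth image is insufficient: surfaces occluded at the server viewpoint leave gaps after warping, which is the disocclusion problem that motivates the multiple-depth-image generation strategies the abstract refers to.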

Keywords: 3D rendering, augmented reality, edge assisted, human-computer interaction
Rights: © International Telecommunication Union, available under the CC BY-NC-ND 3.0 IGO license.
Electronic file: PDF format (English), free download.