Deep 6-DOF Head Motion Prediction for Latency in Lightweight Augmented Reality Glasses

Seongwook Yoon, Hee Jeong Lim, Jae Hyun Kim, Hong Seok Lee, Yun Tae Kim, Sanghoon Sull

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Computationally expensive rendering of virtual 3D objects in mixed reality applications for lightweight AR glasses can be offloaded to a remotely connected external server. However, the non-negligible 6-DOF pose error introduced by the remote rendering latency produces 3D visual inconsistency that can hardly be removed by 2D image correction using an IMU. In this paper, we propose a novel 6-DOF pose prediction algorithm based on a learnable combination of a consistent motion model and deep prediction. We formulate this combination as controlled residual learning and model ensembling. We build a dataset and demonstrate that our algorithm provides accurate prediction for latencies up to 200 ms.
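The abstract describes blending a classical motion model with a deep network via residual learning. As a rough illustration (not the authors' implementation), the sketch below extrapolates a 6-DOF pose with a constant-velocity model and adds a gated, learned residual correction; `residual_net` and `alpha` are hypothetical stand-ins for a trained network and a learnable blending weight:

```python
import numpy as np

def constant_velocity_predict(poses, dt, horizon):
    """Extrapolate the latest 6-DOF pose (x, y, z, roll, pitch, yaw)
    with a constant-velocity motion model.

    poses:   (N, 6) array of recent pose samples, newest last
    dt:      sampling interval between the last two poses (s)
    horizon: prediction horizon, e.g. the rendering latency (s)
    """
    # Finite-difference velocity from the two most recent samples.
    velocity = (poses[-1] - poses[-2]) / dt
    return poses[-1] + velocity * horizon

def combined_predict(poses, dt, horizon, residual_net, alpha):
    """Residual-style blend of the motion model with a learned term.

    residual_net(poses, horizon) stands in for a trained deep
    predictor; alpha in [0, 1] is a (hypothetical) learnable gate
    controlling how much of the residual is applied.
    """
    base = constant_velocity_predict(poses, dt, horizon)
    correction = residual_net(poses, horizon)
    return base + alpha * correction
```

With a residual network that outputs zeros, the combined prediction reduces to the pure motion model, which is the "controlled" aspect of residual learning: the deep term only corrects the physically consistent baseline rather than replacing it.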

Original language: English
Title of host publication: 2022 IEEE International Conference on Consumer Electronics, ICCE 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665441544
DOIs
Publication status: Published - 2022
Event: 2022 IEEE International Conference on Consumer Electronics, ICCE 2022 - Virtual, Online, United States
Duration: 2022 Jan 7 - 2022 Jan 9

Publication series

Name: Digest of Technical Papers - IEEE International Conference on Consumer Electronics
Volume: 2022-January
ISSN (Print): 0747-668X

Conference

Conference: 2022 IEEE International Conference on Consumer Electronics, ICCE 2022
Country/Territory: United States
City: Virtual, Online
Period: 22/1/7 - 22/1/9

Keywords

  • 6-DOF pose prediction
  • AR glasses
  • augmented reality
  • deep learning
  • mixed reality

ASJC Scopus subject areas

  • Industrial and Manufacturing Engineering
  • Electrical and Electronic Engineering
