Avatar motion adaptation for AR based 3D tele-conference

Dongsik Jo, Ki Hong Kim, Jeonghyun Kim

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

With the advent of inexpensive depth sensors and more viable methods for human tracking, traditional 2D tele-conference systems are evolving into AR-based systems with 3D teleportation. Compared to traditional tele-conference systems, which offer only flat 2D upper-body imagery, a mostly fixed viewpoint, and inconsistent gaze directions, an AR tele-conference with 3D teleported avatars is more natural and realistic and can give an enhanced, immersive communication experience. This paper presents an AR-based 3D tele-conference prototype with a method to adapt the motion of the teleported avatar to the physical configuration of the other site. The adaptation is needed because the physical environments of the two sites differ: the human controller interacts at one site (e.g., sitting on a low chair) while the avatar is displayed at the other (e.g., augmented on a high chair). The adaptation technique preserves a particular spatial relationship between the avatar and its interaction objects across the two sites. The spatial relationship is pre-established between important joint positions of the user/avatar and carefully selected points on the environment's interaction objects. The user's motions transmitted to the other site are then modified in real time, taking the changed environment object into account and preserving the spatial relationship as much as possible. We have developed a test prototype to demonstrate our approach using Kinect-based human tracking and a video see-through head-mounted display.
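The abstract describes the adaptation as preserving a pre-established spatial relationship between user/avatar joint positions and selected points on the environment's interaction objects. The following is a minimal, illustrative sketch of that idea, assuming a single tracked joint and a single reference point per object; all names (adapt_joint, seat_src, seat_dst) are hypothetical and do not reflect the authors' actual implementation.

```python
# Illustrative sketch only: re-target a joint position by preserving its
# offset relative to a reference point on the interaction object.
import numpy as np

def adapt_joint(joint_src, object_point_src, object_point_dst):
    """Map one tracked joint from the sender's site to the receiver's site
    by keeping its offset to a selected object point (e.g. a chair seat)."""
    offset = joint_src - object_point_src   # relationship captured at the sender's site
    return object_point_dst + offset        # relationship re-applied at the receiver's site

# Hypothetical example: the sender sits on a low chair (seat at 0.40 m) while
# the avatar is augmented on a higher chair (seat at 0.55 m) at the other site.
hip_src  = np.array([0.00, 0.42, 0.10])  # tracked hip joint, metres
seat_src = np.array([0.00, 0.40, 0.00])  # reference point on the sender's chair
seat_dst = np.array([0.00, 0.55, 0.00])  # reference point on the receiver's chair

hip_dst = adapt_joint(hip_src, seat_src, seat_dst)
print(hip_dst)  # avatar hip raised to match the higher seat, e.g. [0. 0.57 0.1]
```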

Original language: English
Title of host publication: 2014 International Workshop on Collaborative Virtual Environments, 3DCVE 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 29-32
Number of pages: 4
ISBN (Print): 9781479952175
DOI: 10.1109/3DCVE.2014.7160932
Publication status: Published - 2015 Jul 16
Event: International Workshop on Collaborative Virtual Environments, 3DCVE 2014 - Minneapolis, United States
Duration: 2014 Mar 30 → …



Keywords

  • augmented reality
  • collaborative augmented environments
  • motion adaptation
  • Tele-conference
  • tele-presence

ASJC Scopus subject areas

  • Modelling and Simulation
  • Human-Computer Interaction

Cite this

Jo, D., Kim, K. H., & Kim, J. (2015). Avatar motion adaptation for AR based 3D tele-conference. In 2014 International Workshop on Collaborative Virtual Environments, 3DCVE 2014 (pp. 29-32). [7160932] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/3DCVE.2014.7160932
