First-person activity recognition based on three-stream deep features

Ye Ji Kim, Dong Gyu Lee, Seong Whan Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

In this paper, we present a novel three-stream deep feature fusion technique for recognizing interaction-level human activities from a first-person viewpoint. Specifically, the proposed approach separates human motion from camera ego-motion in order to focus on the human's movements. Features encoding human motion and camera ego-motion are extracted by the three-stream architecture and are fused by considering the relationship between human action and camera ego-motion. To validate the effectiveness of our approach, we perform experiments on the UTKinect-FirstPerson dataset and achieve state-of-the-art performance.
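The abstract does not specify the fusion rule, only that human-motion and ego-motion features from the three streams are combined while accounting for their relationship. As a rough, hedged illustration (the function name, feature dimensions, and the subtraction-then-concatenation scheme are all assumptions, not the authors' method), one plausible fusion step looks like this:

```python
import numpy as np

def fuse_three_streams(appearance, human_motion, ego_motion, ego_weight=0.5):
    """Sketch of a three-stream fusion step (illustrative only).

    Suppresses the component of the human-motion stream that is
    correlated with camera ego-motion, then concatenates all three
    per-frame feature vectors into a single descriptor.
    """
    # Remove part of the camera-induced motion from the human-motion features.
    compensated = human_motion - ego_weight * ego_motion
    # Concatenate appearance, compensated motion, and ego-motion features.
    return np.concatenate([appearance, compensated, ego_motion])

# Toy example with 128-dim features per stream (dimensions are illustrative).
rng = np.random.default_rng(0)
appearance, human_motion, ego_motion = rng.standard_normal((3, 128))
fused = fuse_three_streams(appearance, human_motion, ego_motion)
print(fused.shape)  # (384,)
```

In practice such a fused descriptor would feed a classifier over the interaction-level activity labels; the actual network and fusion weights are described in the paper itself, not this record.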

Original language: English
Title of host publication: International Conference on Control, Automation and Systems
Publisher: IEEE Computer Society
Pages: 297-299
Number of pages: 3
Volume: 2018-October
ISBN (Electronic): 9788993215151
Publication status: Published - 2018 Dec 10
Event: 18th International Conference on Control, Automation and Systems, ICCAS 2018 - PyeongChang, Korea, Republic of
Duration: 2018 Oct 17 - 2018 Oct 20

Other

Other: 18th International Conference on Control, Automation and Systems, ICCAS 2018
Country: Korea, Republic of
City: PyeongChang
Period: 18/10/17 - 18/10/20

Keywords

  • First-person activity recognition
  • Human-robot interaction
  • Robot surveillance
  • Three-stream deep features

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Science Applications
  • Control and Systems Engineering
  • Electrical and Electronic Engineering


Cite this

    Kim, Y. J., Lee, D. G., & Lee, S. W. (2018). First-person activity recognition based on three-stream deep features. In International Conference on Control, Automation and Systems (Vol. 2018-October, pp. 297-299). [8571982] IEEE Computer Society.