Implementation of convergence P2P information retrieval system from captured video frames

Gilsang Yoo, Hyeoncheol Kim, Sungdae Hong

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

With the evolution of information technology, devices such as smart TVs allow users to easily search for broadcast information, browse, participate in broadcasting, shop, and more. However, such additional on-screen information can be an obstructive element for other viewers, and these smart functions cannot be used at all in public places. To remove this viewing disturbance while still delivering the information, we designed and implemented a system in which a user photographs the TV or monitor screen with a smart device and immediately receives the associated content, such as P2P information and advertisements, on that device. The proposed system minimizes interruption of viewing and presents a new user-experience model in which anyone can easily obtain additional information from a screenshot. To search for the additional information efficiently, a region of interest is extracted from a frame of the captured video, and its position, its size, and the projections of its image edges are used as index features. Experiments confirmed that the retrieved additional information correctly matched the video image shot with the smart device.
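The abstract does not give the exact feature definitions, but the index it describes for a region of interest (its position, its size, and the projections of its edges) can be sketched minimally in numpy; the function name and the gradient-based edge approximation below are assumptions for illustration, not the paper's method:

```python
import numpy as np

def edge_projection_signature(frame, roi):
    """Hypothetical sketch of an ROI index like the one the abstract
    describes: position, size, and horizontal/vertical projections of
    the region's edge magnitudes.

    frame : 2-D numpy array (grayscale video frame)
    roi   : (x, y, w, h) bounding box of the region of interest
    """
    x, y, w, h = roi
    patch = frame[y:y + h, x:x + w].astype(float)

    # Approximate edge magnitude with image gradients.
    gy, gx = np.gradient(patch)
    edges = np.hypot(gx, gy)

    # Project edge energy onto the horizontal and vertical axes.
    h_proj = edges.sum(axis=0)   # one value per column
    v_proj = edges.sum(axis=1)   # one value per row

    return {"position": (x, y), "size": (w, h),
            "h_projection": h_proj, "v_projection": v_proj}

# Example: a synthetic frame with a bright square inside the ROI.
frame = np.zeros((120, 160))
frame[40:80, 60:120] = 255.0
sig = edge_projection_signature(frame, roi=(50, 30, 80, 60))
print(sig["position"], sig["size"], sig["h_projection"].shape)
```

A signature like this is compact enough to transmit over a P2P link and compare against server-side indexes, which is presumably why projection features were chosen over full-frame matching.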

Original language: English
Pages (from-to): 1-11
Number of pages: 11
Journal: Peer-to-Peer Networking and Applications
DOIs
Publication status: Accepted/In press - 2017 Aug 24

Keywords

  • Blind code
  • P2P multimedia
  • Real-time video
  • Video index

ASJC Scopus subject areas

  • Software
  • Computer Networks and Communications

