Querying Video Libraries

Een Jun Hwang, V. S. Subrahmanian

Research output: Contribution to journal › Article

23 Citations (Scopus)

Abstract

There is now growing interest in organizing and querying large bodies of video data. In this paper, we will develop a simple SQL-like video language which can be used not only to identify videos in the library that are of interest to the user, but which can also be used to extract, from such a video in a video library, the relevant segments of the video that satisfy the specified query condition. We investigate various types of user requests and show how they are expressed using our query language. We also develop polynomial-time algorithms to process such queries. Furthermore, we show how video presentations may be synthesized in response to a user query. We show how a standard relational database system can be extended in order to handle queries such as those expressed in our language. Based on these principles, we have built a prototype video retrieval system called VIQS. We describe the design and implementation of VIQS and show some sample interactions with VIQS.
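As a rough illustration of the idea the abstract describes (identifying videos of interest in a library and then extracting only the segments that satisfy a query condition), the short Python sketch below models a video library as a relational-style table of annotated segments and evaluates a simple condition over it. The segment schema, the label sets, and the sample data are assumptions made purely for illustration; they are not the paper's actual query syntax, schema, or the VIQS implementation.

from dataclasses import dataclass, field

@dataclass
class Segment:
    """An annotated, contiguous run of frames within one video.

    Illustrative assumption: each segment carries a set of content
    labels (objects/activities) produced by some prior annotation step.
    """
    video_id: str
    start_frame: int
    end_frame: int
    labels: set = field(default_factory=set)

# A toy "video library": one relational-style table of annotated segments.
LIBRARY = [
    Segment("news_0412", 0, 300, {"anchor", "studio"}),
    Segment("news_0412", 301, 900, {"interview", "politician"}),
    Segment("doc_amazon", 0, 1200, {"river", "wildlife"}),
    Segment("doc_amazon", 1201, 2000, {"interview", "scientist"}),
]

def query_segments(library, condition):
    """Return, per video, the segments whose annotations satisfy `condition`.

    This mirrors the two-level behaviour sketched in the abstract:
    identify the videos of interest, then extract only the relevant
    segments from each of them.
    """
    result = {}
    for seg in library:
        if condition(seg):
            result.setdefault(seg.video_id, []).append((seg.start_frame, seg.end_frame))
    return result

if __name__ == "__main__":
    # "Find all interview segments" -- a stand-in for an SQL-like
    # condition over segment annotations.
    matches = query_segments(LIBRARY, lambda s: "interview" in s.labels)
    for video, segments in matches.items():
        print(video, segments)

In the paper itself such conditions are written declaratively in an SQL-like syntax and processed by polynomial-time algorithms; the linear scan above is only the simplest possible stand-in for that evaluation step.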

Original language: English
Pages (from-to): 44-60
Number of pages: 17
Journal: Journal of Visual Communication and Image Representation
Volume: 7
Issue number: 1
DOIs: 10.1006/jvci.1996.0005
Publication status: Published - 1996 Mar 1
Externally published: Yes

Fingerprint

  • Relational database systems
  • Query languages
  • Polynomials

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

Querying Video Libraries. / Hwang, Een Jun; Subrahmanian, V. S.
In: Journal of Visual Communication and Image Representation, Vol. 7, No. 1, 01.03.1996, p. 44-60.
Research output: Contribution to journal › Article

Hwang, Een Jun; Subrahmanian, V. S. / Querying Video Libraries. In: Journal of Visual Communication and Image Representation. 1996; Vol. 7, No. 1, pp. 44-60.
@article{3a87cb451c12400ab94ef2780081bb0d,
title = "Querying Video Libraries",
abstract = "There is now growing interest in organizing and querying large bodies of video data. In this paper, we will develop a simple SQL-like video language which can be used not only to identify videos in the library that are of interest to the user, but which can also be used to extract, from such a video in a video library, the relevant segments of the video that satisfy the specified query condition. We investigate various types of user requests and show how they are expressed using our query language. We also develop polynomial-time algorithms to process such queries. Furthermore, we show how video presentations may be synthesized in response to a user query. We show how a standard relational database system can be extended in order to handle queries such as those expressed in our language. Based on these principles, we have built a prototype video retrieval system called VIQS. We describe the design and implementation of VIQS and show some sample interactions with VIQS.",
author = "Hwang, {Een Jun} and Subrahmanian, {V. S.}",
year = "1996",
month = "3",
day = "1",
doi = "10.1006/jvci.1996.0005",
language = "English",
volume = "7",
pages = "44--60",
journal = "Journal of Visual Communication and Image Representation",
issn = "1047-3203",
publisher = "Academic Press Inc.",
number = "1",

}

TY - JOUR
T1 - Querying Video Libraries
AU - Hwang, Een Jun
AU - Subrahmanian, V. S.
PY - 1996/3/1
Y1 - 1996/3/1
N2 - There is now growing interest in organizing and querying large bodies of video data. In this paper, we will develop a simple SQL-like video language which can be used not only to identify videos in the library that are of interest to the user, but which can also be used to extract, from such a video in a video library, the relevant segments of the video that satisfy the specified query condition. We investigate various types of user requests and show how they are expressed using our query language. We also develop polynomial-time algorithms to process such queries. Furthermore, we show how video presentations may be synthesized in response to a user query. We show how a standard relational database system can be extended in order to handle queries such as those expressed in our language. Based on these principles, we have built a prototype video retrieval system called VIQS. We describe the design and implementation of VIQS and show some sample interactions with VIQS.
AB - There is now growing interest in organizing and querying large bodies of video data. In this paper, we will develop a simple SQL-like video language which can be used not only to identify videos in the library that are of interest to the user, but which can also be used to extract, from such a video in a video library, the relevant segments of the video that satisfy the specified query condition. We investigate various types of user requests and show how they are expressed using our query language. We also develop polynomial-time algorithms to process such queries. Furthermore, we show how video presentations may be synthesized in response to a user query. We show how a standard relational database system can be extended in order to handle queries such as those expressed in our language. Based on these principles, we have built a prototype video retrieval system called VIQS. We describe the design and implementation of VIQS and show some sample interactions with VIQS.
UR - http://www.scopus.com/inward/record.url?scp=0030104825&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0030104825&partnerID=8YFLogxK
U2 - 10.1006/jvci.1996.0005
DO - 10.1006/jvci.1996.0005
M3 - Article
AN - SCOPUS:0030104825
VL - 7
SP - 44
EP - 60
JO - Journal of Visual Communication and Image Representation
JF - Journal of Visual Communication and Image Representation
SN - 1047-3203
IS - 1
ER -