Citation
Affendey, Lilly Suriani and Kouchehbagh, Sara Memar and Mustapha, Norwati and C. Doraisamy, Shyamala
(2012)
Concept-based video retrieval system using integration of similarity measures.
In: The Workshop on Advanced Information Technology 2012 (WIT-A2012), 25-27 June 2012, Sydney, Australia. (pp. 21-25).
Abstract
There is a tremendous need to query and process large amounts of video data that cannot easily be described textually. Although a query can target object, motion, texture, color, and so on, queries expressed in terms of semantic concepts are more intuitive and realistic for end users. However, most videos are not fully annotated; hence, results of queries based solely on the available annotations would not be exhaustive. Furthermore, the annotation of a video shot depends on the annotator's perception: if several people annotate the same videos, each may perceive them differently, resulting in different semantic concept tags. Thus, there is a need to closely match the semantics of the desired query term to the available annotations. This paper describes a concept-based video retrieval model that supports queries using concepts which are not available in the annotated video. The similarity mapping integrates knowledge-based and corpus-based semantic word similarity measures to support semantic queries. The system prototype demonstrated retrieval of the 100 top-ranked video shots that are semantically similar to a new concept.
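The integration described in the abstract can be illustrated with a minimal sketch. The similarity tables below are toy stand-ins, not the paper's actual measures (the knowledge-based score would typically come from a lexical resource such as WordNet, and the corpus-based score from co-occurrence statistics); the weighted average and the shot ranking are likewise illustrative assumptions.

```python
import math

# Hypothetical knowledge-based similarity scores (stand-in for a
# WordNet-style measure); the table is symmetric by lookup.
KB_SIM = {
    ("car", "vehicle"): 0.9,
    ("car", "road"): 0.5,
    ("boat", "vehicle"): 0.7,
}

# Hypothetical corpus co-occurrence vectors (stand-in for a
# corpus-based measure).
CORPUS_VEC = {
    "car":     [1.0, 0.8, 0.1],
    "vehicle": [0.9, 0.9, 0.2],
    "road":    [0.7, 0.2, 0.9],
    "boat":    [0.4, 0.6, 0.3],
}

def kb_similarity(a, b):
    """Knowledge-based score from the symmetric lookup table."""
    return KB_SIM.get((a, b), KB_SIM.get((b, a), 0.0))

def corpus_similarity(a, b):
    """Corpus-based score: cosine similarity of the toy vectors."""
    va, vb = CORPUS_VEC[a], CORPUS_VEC[b]
    dot = sum(x * y for x, y in zip(va, vb))
    na = math.sqrt(sum(x * x for x in va))
    nb = math.sqrt(sum(x * x for x in vb))
    return dot / (na * nb)

def integrated_similarity(a, b, w=0.5):
    """Integrate the two measures with a simple weighted average
    (the weighting scheme is an assumption for illustration)."""
    return w * kb_similarity(a, b) + (1 - w) * corpus_similarity(a, b)

def retrieve(query, shots, top_k=2):
    """Rank annotated shots by the best integrated similarity between
    the query concept and any annotation attached to the shot."""
    scored = [
        (shot_id, max(integrated_similarity(query, c) for c in concepts))
        for shot_id, concepts in shots.items()
    ]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:top_k]

# Query with a concept ("vehicle") that never appears in the annotations.
shots = {"shot1": ["car", "road"], "shot2": ["boat"], "shot3": ["road"]}
ranking = retrieve("vehicle", shots)
```

This mirrors the abstract's scenario: the query concept is absent from the annotations, yet shots tagged with semantically related concepts are still ranked and returned.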