An efficient approach for geo-multimedia cross-modal retrieval
Zhu, Lei, Long, Jun, Zhang, Chengyuan, Yu, Weiren, Yuan, Xinpan and Sun, Longzhi (2019) An efficient approach for geo-multimedia cross-modal retrieval. IEEE Access, 7, pp. 180571-180589. doi:10.1109/ACCESS.2019.2940055. ISSN 2169-3536.
PDF: WRAP-efficient-approach-geo-multimedia-cross-modal-Yu-2019.pdf (Accepted Version, 12 MB)
Official URL: http://dx.doi.org/10.1109/ACCESS.2019.2940055
Abstract
Due to the rapid development of mobile Internet techniques, such as online social networking and location-based services, massive amounts of multimedia data with geographical information are generated and uploaded to the Internet. In this paper, we propose a novel type of cross-modal multimedia retrieval, called geo-multimedia cross-modal retrieval, which aims to find a set of geo-multimedia objects according to geographical distance proximity and semantic concept similarity. Previous studies of cross-modal retrieval and spatial keyword search cannot address this problem effectively because they do not consider multimedia data with geo-tags (geo-multimedia). Firstly, we present the definition of the kNN geo-multimedia cross-modal query and introduce relevant concepts such as spatial distance and semantic similarity measurement. As the key notion of this work, the cross-modal semantic representation space is formulated for the first time. A novel framework for geo-multimedia cross-modal retrieval is proposed, which includes multi-modal feature extraction, cross-modal semantic space mapping, geo-multimedia spatial indexing and cross-modal semantic similarity measurement. To bridge the semantic gap between different modalities, we also propose a method named cross-modal semantic matching (CoSMat for short), which contains two important components, CorrProj and LogsTran, and aims to build a common semantic representation space for cross-modal semantic similarity measurement. In addition, to implement semantic similarity measurement, we employ a deep learning based method to learn multi-modal features that contain more high-level semantic information. Moreover, a novel hybrid index, the GMR-Tree, is carefully designed, which combines signatures of semantic representations with an R-Tree. An efficient GMR-Tree based kNN search algorithm called kGMCMS is developed. Comprehensive experimental evaluations on real and synthetic datasets clearly demonstrate that our approach outperforms the...
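The abstract does not spell out the ranking function, so the sketch below only illustrates the general idea of a kNN geo-multimedia cross-modal query: objects that carry both a geo-tag and an embedding in a common semantic space are ranked by a combination of spatial proximity and semantic similarity. The function names (knn_geo_cross_modal, haversine_km), the weight alpha, the cosine similarity, and the linear combination are illustrative assumptions; the paper's actual kGMCMS algorithm uses a GMR-Tree to prune candidates rather than scanning all objects as this brute-force baseline does.

```python
import math
from dataclasses import dataclass

@dataclass
class GeoMultimediaObject:
    """A geo-tagged multimedia object: a location plus an embedding in the
    common semantic representation space (e.g. produced by a CoSMat-style mapping)."""
    obj_id: str
    lat: float
    lon: float
    embedding: list  # semantic vector in the shared space

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres (one common choice of spatial distance)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def cosine_similarity(u, v):
    """Semantic similarity between two vectors in the common space."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def knn_geo_cross_modal(query_loc, query_embedding, objects, k=5,
                        alpha=0.5, max_dist_km=50.0):
    """Brute-force kNN baseline: score every object by a weighted sum of
    spatial proximity and cross-modal semantic similarity, return the top-k.
    (An assumed combination; the paper's kGMCMS instead searches a GMR-Tree.)"""
    scored = []
    for obj in objects:
        d = haversine_km(query_loc[0], query_loc[1], obj.lat, obj.lon)
        spatial_prox = max(0.0, 1.0 - d / max_dist_km)      # 1 = at the query location
        semantic_sim = cosine_similarity(query_embedding, obj.embedding)
        score = alpha * spatial_prox + (1.0 - alpha) * semantic_sim
        scored.append((score, obj))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [obj for _, obj in scored[:k]]

if __name__ == "__main__":
    db = [
        GeoMultimediaObject("img-1", 52.38, -1.56, [0.9, 0.1, 0.0]),
        GeoMultimediaObject("img-2", 52.41, -1.51, [0.1, 0.8, 0.1]),
        GeoMultimediaObject("vid-3", 52.40, -1.52, [0.7, 0.2, 0.1]),
    ]
    # A text query already mapped into the same semantic space as the objects.
    results = knn_geo_cross_modal((52.40, -1.51), [1.0, 0.0, 0.0], db, k=2)
    print([obj.obj_id for obj in results])
```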
Item Type: Journal Article
Subjects: P Language and Literature > P Philology. Linguistics; T Technology > TK Electrical engineering. Electronics. Nuclear engineering
Divisions: Faculty of Science, Engineering and Medicine > Science > Computer Science
Library of Congress Subject Headings (LCSH): Semantics, Video recordings, Internet, Indexing, Visualization
Journal or Publication Title: IEEE Access
Publisher: IEEE
ISSN: 2169-3536
Official Date: 9 September 2019
Volume: 7
Page Range: pp. 180571-180589
DOI: 10.1109/ACCESS.2019.2940055
Status: Peer Reviewed
Publication Status: Published
Reuse Statement (publisher, data, author rights): © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Access rights to Published version: Restricted or Subscription Access
Date of first compliant deposit: 31 January 2020
Date of first compliant Open Access: 12 February 2020