University of Warwick
Publications service & WRAP
Region-object relation-aware dense captioning via transformer

Shao, Zhuang, Han, Jungong, Marnerides, Demetris and Debattista, Kurt (2022) Region-object relation-aware dense captioning via transformer. IEEE Transactions on Neural Networks and Learning Systems, pp. 1-12. doi:10.1109/tnnls.2022.3152990. ISSN 2162-2388.

PDF (Accepted Version): WRAP-Region-object-relation-aware-dense-captioning-transformer-2022.pdf (1293Kb)
Official URL: https://doi.org/10.1109/tnnls.2022.3152990


Abstract

Dense captioning provides detailed captions of complex visual scenes. While a number of successes have been achieved in recent years, two broad limitations remain: 1) most existing methods adopt an encoder-decoder framework in which contextual information is sequentially encoded using long short-term memory (LSTM), whose forget-gate mechanism makes it vulnerable when dealing with long sequences; and 2) the vast majority of prior works consider all regions of interest (RoIs) equally important, thus failing to focus on the more informative regions. The consequence is that the generated captions cannot highlight the important contents of the image, which does not read naturally. To overcome these limitations, in this article we propose a novel end-to-end transformer-based dense image captioning architecture, termed the transformer-based dense captioner (TDC). TDC learns the mapping between images and their dense captions via a transformer, prioritizing more informative regions. To this end, we present a novel unit, named the region-object correlation score unit (ROCSU), to measure the importance of each region, taking into account both the relationships between the detected objects and the region and the confidence scores of the detected objects within the region. Extensive experimental results and ablation studies on the standard dense-captioning datasets demonstrate the superiority of the proposed method over state-of-the-art methods.
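The abstract does not give the exact formulation of the ROCSU, so the following is only a minimal illustrative sketch in Python of one plausible reading: a region of interest is scored by how strongly it overlaps with confidently detected objects. The function names, the IoU-times-confidence weighting, and the example boxes are assumptions made for illustration, not the authors' published method.

    # Illustrative sketch only (not the authors' code).
    # Assumption: region importance = sum over detected objects of
    # (overlap between object box and region box) * (detection confidence).

    def box_iou(box_a, box_b):
        """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
        ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
        ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
        area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
        union = area_a + area_b - inter
        return inter / union if union > 0 else 0.0

    def region_object_correlation_score(region_box, object_boxes, object_scores):
        """Hypothetical ROCSU-style score: weight each detected object's
        overlap with the region by the detector's confidence."""
        return sum(
            box_iou(region_box, obj_box) * conf
            for obj_box, conf in zip(object_boxes, object_scores)
        )

    # Example: rank candidate regions so that more informative ones
    # can be prioritized by the captioner.
    regions = [(10, 10, 120, 120), (200, 40, 260, 90)]
    objects = [(20, 20, 110, 110), (205, 45, 255, 85)]
    confidences = [0.95, 0.40]

    scores = [region_object_correlation_score(r, objects, confidences) for r in regions]
    ranked = sorted(zip(scores, regions), reverse=True)
    print(ranked)  # regions ordered from most to least informative

In the full model, such scores would presumably weight the region features passed to the transformer so that more informative regions dominate the generated captions; the precise formulation is given in the paper itself.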

Item Type: Journal Article
Divisions: Faculty of Science, Engineering and Medicine > Engineering > WMG (Formerly the Warwick Manufacturing Group)
SWORD Depositor: Library Publications Router
Journal or Publication Title: IEEE Transactions on Neural Networks and Learning Systems
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
ISSN: 2162-2388
Official Date: 11 March 2022
Dates: Published 11 March 2022; Accepted 13 March 2022
Page Range: pp. 1-12
DOI: 10.1109/tnnls.2022.3152990
Status: Peer Reviewed
Publication Status: Published
Reuse Statement (publisher, data, author rights): © 2022 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Access rights to Published version: Restricted or Subscription Access
Date of first compliant deposit: 13 June 2022
Date of first compliant Open Access: 13 June 2022
Related URLs:
  • https://ieeexplore.ieee.org/Xplorehelp/d...
