Deep learning for predictive analytics in reversible steganography
Chang, Ching-Chun, Wang, Xu, Chen, Sisheng, Echizen, Isao, Sanchez, Victor and Li, Chang-Tsun (2023) Deep learning for predictive analytics in reversible steganography. IEEE Access, 11. pp. 3494-3510. doi:10.1109/access.2023.3233976. ISSN 2169-3536.
PDF: WRAP-deep-learning-predictive-analytics-reversible-steganography-2023.pdf - Published Version. Available under License Creative Commons Attribution 4.0.
Official URL: https://doi.org/10.1109/access.2023.3233976
Abstract
Deep learning is regarded as a promising solution for reversible steganography. There is an accelerating trend of representing a reversible stego-system by monolithic neural networks, which bypass the intermediate operations in traditional pipelines of reversible steganography. This end-to-end paradigm, however, suffers from imperfect reversibility. By contrast, the modular paradigm, which incorporates neural networks into modules of traditional pipelines, can stably guarantee reversibility with mathematical explainability. Prediction-error modulation is a well-established reversible steganography pipeline for digital images. It consists of a predictive analytics module and a reversible coding module. Given that reversibility is governed independently by the coding module, we narrow our focus to the incorporation of neural networks into the analytics module, which serves the purpose of predicting pixel intensities and plays a pivotal role in determining capacity and imperceptibility. The objective of this study is to evaluate the impact of different training configurations on the predictive accuracy of neural networks and to provide practical insights. In particular, we investigate how different initialisation strategies for input images may affect the learning process and how different training strategies for dual-layer prediction respond to the problem of distributional shift. Furthermore, we compare the steganographic performance of various model architectures with different loss functions.
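To make the pipeline concrete, the sketch below illustrates classic prediction-error expansion, the kind of reversible coding step that prediction-error modulation builds on. The `predict` function here is a hypothetical placeholder (a trivial previous-pixel predictor, not the paper's neural predictor), and overflow handling and location maps, which real schemes require, are omitted.

```python
def predict(context, i):
    """Toy causal predictor: estimate pixel i from the previous pixel
    (128 at the start). The paper studies neural networks for this role."""
    return context[i - 1] if i > 0 else 128

def embed(pixels, bits):
    """Embed one bit per pixel by expanding the prediction error: e' = 2e + b."""
    stego = []
    for i, (p, b) in enumerate(zip(pixels, bits)):
        pred = predict(pixels, i)
        e = p - pred                 # prediction error
        stego.append(pred + 2 * e + b)  # expanded error carries the bit
    return stego

def extract(stego):
    """Recover both the hidden bits and the original pixels exactly.
    Pixels are recovered in order, so the predictor sees the same
    causal context as it did at embedding time."""
    recovered, bits = [], []
    for i, s in enumerate(stego):
        pred = predict(recovered, i)
        e2 = s - pred                # expanded error e' = 2e + b
        bits.append(e2 & 1)          # least significant bit is the message
        recovered.append(pred + (e2 >> 1))  # undo the expansion
    return recovered, bits
```

A more accurate predictor yields smaller errors, hence smaller distortion after expansion, which is why the analytics module determines capacity and imperceptibility.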
Item Type: Journal Article
Subjects: Q Science > Q Science (General); Q Science > QA Mathematics; Q Science > QA Mathematics > QA76 Electronic computers. Computer science. Computer software
Divisions: Faculty of Science, Engineering and Medicine > Science > Computer Science
SWORD Depositor: Library Publications Router
Library of Congress Subject Headings (LCSH): Deep learning (Machine learning), Cryptography, Neural networks (Computer science), Coding theory
Journal or Publication Title: IEEE Access
Publisher: IEEE
ISSN: 2169-3536
Official Date: January 2023
Volume: 11
Page Range: pp. 3494-3510
DOI: 10.1109/access.2023.3233976
Status: Peer Reviewed
Publication Status: Published
Access rights to Published version: Open Access (Creative Commons)
Date of first compliant deposit: 14 March 2023
Date of first compliant Open Access: 14 March 2023