Open Access · Journal Article (DOI)

Large-Scale JPEG Image Steganalysis Using Hybrid Deep-Learning Framework

TL;DR: A generic hybrid deep-learning framework for JPEG steganalysis incorporating the domain knowledge behind rich steganalytic models is proposed; the framework is shown to be insensitive to JPEG blocking artifact alterations, and the learned model can be easily transferred to a different attacking target and even a different data set.
Abstract
Adoption of deep learning in image steganalysis is still in its initial stage. In this paper, we propose a generic hybrid deep-learning framework for JPEG steganalysis incorporating the domain knowledge behind rich steganalytic models. Our proposed framework involves two main stages. The first stage is hand-crafted, corresponding to the convolution phase and the quantization and truncation phase of the rich models. The second stage is a compound deep neural network containing multiple deep subnets, whose model parameters are learned during training. We provide experimental evidence and theoretical reflections to argue that the introduction of threshold quantizers, though disabling gradient-descent-based learning of the bottom convolution phase, is indeed cost-effective. We have conducted extensive experiments on a large-scale data set extracted from ImageNet. The primary data set used in our experiments contains 500,000 cover images, while our largest data set contains five million cover images. Our experiments show that the integration of quantization and truncation into deep-learning steganalyzers does boost detection performance by a clear margin. Furthermore, we demonstrate that our framework is insensitive to JPEG blocking artifact alterations, and the learned model can be easily transferred to a different attacking target and even a different data set. These properties are of critical importance in practical applications.
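The hand-crafted first stage described in the abstract (convolution residuals passed through quantization and truncation, as in rich steganalytic models) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the quantizer step `q` and truncation threshold `T` below are assumed example values, and the sample residual values are made up.

```python
import numpy as np

def quantize_truncate(residuals, q=1.0, T=4):
    """Quantize convolution residuals with step q, then truncate to [-T, T].

    This mirrors the quantization-and-truncation phase of rich steganalytic
    models: the hard threshold quantizer is non-differentiable, which is why
    this stage is fixed rather than learned by gradient descent.
    """
    quantized = np.round(residuals / q)
    return np.clip(quantized, -T, T)

# Illustrative high-pass residual values, quantized and truncated
residuals = np.array([-7.3, -0.4, 0.6, 2.2, 9.8])
print(quantize_truncate(residuals, q=1.0, T=4))
```

Truncation bounds the dynamic range of the features so that the subsequent deep subnets see a small, fixed alphabet of residual values, which is the same design rationale the rich models use.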


Citations
Journal Article (DOI)

Deep Residual Network for Steganalysis of Digital Images

TL;DR: A deep residual architecture designed to minimize the use of heuristics and externally enforced elements; it is universal in the sense that it provides state-of-the-art detection accuracy for both spatial-domain and JPEG steganography.
Journal Article (DOI)

A Novel Image Steganography Method via Deep Convolutional Generative Adversarial Networks

TL;DR: A novel image steganography without embedding (SWE) method based on deep convolutional generative adversarial networks that achieves highly accurate information extraction and strong resistance to detection by state-of-the-art image steganalysis algorithms.
Journal Article (DOI)

CNN-Based Adversarial Embedding for Image Steganography

TL;DR: This paper presents a steganographic scheme with a novel operation called adversarial embedding (ADV-EMB), which hides a stego message while simultaneously fooling a convolutional neural network (CNN)-based steganalyzer.
Posted Content

Deep Convolutional Neural Network to Detect J-UNIWARD

Guanshuo Xu
- 26 Apr 2017 - 
TL;DR: An empirical study on applying convolutional neural networks (CNNs) to detecting J-UNIWARD, one of the most secure JPEG steganographic methods; the detector generalizes to large-scale databases and to different cover sizes.
Journal Article (DOI)

Robust Detection of Image Operator Chain With Two-Stream Convolutional Neural Network

TL;DR: Experimental results show that the proposed two-stream convolutional neural network framework for detecting image operator chains not only achieves strong detection performance but also distinguishes the order of operators in some cases that previous works were unable to identify.
References
Proceedings Article (DOI)

Going deeper with convolutions

TL;DR: Inception is a deep convolutional neural network architecture that achieved the new state of the art for classification and detection in the ImageNet Large-Scale Visual Recognition Challenge 2014 (ILSVRC14).
Proceedings Article

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

TL;DR: Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin.
Posted Content

Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift

TL;DR: Batch Normalization normalizes layer inputs for each training mini-batch to reduce internal covariate shift in deep neural networks, achieving state-of-the-art performance on ImageNet.
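The two batch-normalization entries above describe the same technique: normalizing each feature over the mini-batch, then applying a learned affine transform. A minimal NumPy sketch, with the learned scale and shift (`gamma`, `beta`) shown as fixed example values and an illustrative `eps`:

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize a mini-batch of activations (axis 0 is the batch
    dimension), then apply the affine transform gamma * x_hat + beta.
    In a real network, gamma and beta are learned parameters."""
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    return gamma * x_hat + beta

# Each column (feature) ends up with approximately zero mean and unit variance
batch = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
out = batch_norm(batch)
print(out.mean(axis=0))
```

At inference time, implementations replace the mini-batch statistics with running averages accumulated during training; the sketch covers only the training-time computation.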
Journal Article (DOI)

Deep learning in neural networks

TL;DR: This historical survey compactly summarizes relevant work, much of it from the previous millennium, reviewing deep supervised learning, unsupervised learning, reinforcement learning, evolutionary computation, and indirect search for short programs encoding deep and large networks.
Posted Content

Caffe: Convolutional Architecture for Fast Feature Embedding

TL;DR: Caffe is a BSD-licensed C++ library with Python and MATLAB bindings for training and deploying general-purpose convolutional neural networks and other deep models efficiently on commodity architectures.