Open Access Journal ArticleDOI

Novel Linguistic Steganography Based on Character-Level Text Generation

Lingyun Xiang, +4 more
Vol. 8, Iss. 9, pp. 1558
TLDR
A character-level linguistic steganographic method that embeds the secret information into characters instead of words by employing a long short-term memory (LSTM) based language model; among similar methods, it achieves the fastest running speed and the highest embedding capacity.
Abstract
With the development of natural language processing, linguistic steganography has become a research hotspot in the field of information security. However, most existing linguistic steganographic methods suffer from low embedding capacity. Therefore, this paper proposes a character-level linguistic steganographic method (CLLS) that embeds the secret information into characters instead of words by employing a long short-term memory (LSTM) based language model. First, the proposed method utilizes the LSTM model and a large-scale corpus to construct and train a character-level text generation model; the best-performing model under evaluation is then used as the prediction model for generating stego text. Next, the secret information serves as control information to select the appropriate character from the predictions of the trained character-level text generation model. The secret information is thus hidden in the generated text, since predicted characters with different prediction probabilities can be encoded into different secret bit values. For the same secret information, the generated stego text varies with the starting string of the text generation model, so we design a selection strategy that, by changing the starting string, produces a number of candidate stego texts and chooses the highest-quality one as the final stego text. The experimental results demonstrate that, compared with other similar methods, the proposed method has the fastest running speed and the highest embedding capacity. Moreover, extensive experiments verify the effect of the number of candidate stego texts on the quality of the final stego text: the quality increases as the number of candidates grows, but the rate of improvement gradually slows.
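The embedding step can be illustrated with a minimal sketch: the trained character-level model ranks candidate next characters by probability, the next k secret bits select which of the top 2^k candidates is emitted, and extraction replays the model and reads back each emitted character's index. Here next_char_distribution is a hypothetical stand-in for the trained LSTM, and BITS_PER_CHAR, embed, and extract are illustrative names, not the paper's implementation.

import string

BITS_PER_CHAR = 2  # embed 2 secret bits per generated character (pool of 2**2 = 4 candidates)

def next_char_distribution(context):
    # Hypothetical stand-in for the trained character-level LSTM: return the
    # candidate characters ranked by predicted probability given the context.
    # A real system would query the trained model here.
    alphabet = string.ascii_lowercase + " "
    shift = len(context) % len(alphabet)  # toy heuristic so the ranking varies per step
    return alphabet[shift:] + alphabet[:shift]

def embed(secret_bits, start, length):
    # At each step, the top 2**BITS_PER_CHAR candidates are indexed 0..2**k - 1
    # and the next k secret bits select which character is appended to the text.
    text = start
    bits = secret_bits
    for _ in range(length):
        pool = next_char_distribution(text)[: 2 ** BITS_PER_CHAR]
        chunk, bits = bits[:BITS_PER_CHAR].ljust(BITS_PER_CHAR, "0"), bits[BITS_PER_CHAR:]
        text += pool[int(chunk, 2)]
    return text

def extract(stego_text, start):
    # Recover the secret bits by replaying the model and reading off the index
    # of each generated character within its candidate pool.
    bits, text = "", start
    for ch in stego_text[len(start):]:
        pool = next_char_distribution(text)[: 2 ** BITS_PER_CHAR]
        bits += format(pool.index(ch), "0{}b".format(BITS_PER_CHAR))
        text += ch
    return bits

secret = "1011010010"
steps = (len(secret) + BITS_PER_CHAR - 1) // BITS_PER_CHAR
stego = embed(secret, start="the ", length=steps)
print(stego)
print(extract(stego, start="the ")[: len(secret)])  # matches the original secret bits

The paper additionally generates several candidate stego texts by varying the starting string and keeps the highest-quality one; that selection step is omitted from this sketch for brevity.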



Citations
Journal ArticleDOI

Image super-resolution reconstruction based on feature map attention mechanism

TL;DR: The Peak Signal-to-Noise Ratio and Structural Similarity Index evaluation metrics are improved to a certain degree, demonstrating that a feature map attention mechanism is effective for image super-resolution reconstruction.
Journal ArticleDOI

Linguistic Generative Steganography With Enhanced Cognitive-Imperceptibility

TL;DR: In this paper, the proposed methods further constrain the semantic expression of the generated steganographic text while ensuring a certain level of perceptual imperceptibility and statistical imperceptibility, thereby enhancing its cognitive imperceptibility.
Journal ArticleDOI

Coverless Steganography Based on Motion Analysis of Video

TL;DR: A coverless steganography scheme based on motion analysis of video that not only obtains a good trade-off between hiding capacity and robustness but also achieves a higher hiding success rate and a lower transmission data load, demonstrating good practicability and feasibility.
Journal ArticleDOI

Entity alignment via knowledge embedding and type matching constraints for knowledge graph inference

TL;DR: This work proposes a new EA framework based on knowledge embeddings (KEs) and type matching constraints that significantly improves the accuracy of EA compared with state-of-the-art methods.
Journal ArticleDOI

A Novel Approach for Linguistic Steganography Evaluation Based on Artificial Neural Networks

TL;DR: In this paper, the authors evaluate an AI-based statistical language model for text steganography and highlight the advantages of an NLP-based Markov chain model for automatically generated cover text.
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
Proceedings Article

Sequence to Sequence Learning with Neural Networks

TL;DR: The authors used a multilayered Long Short-Term Memory (LSTM) to map the input sequence to a vector of a fixed dimensionality, and then another deep LSTM to decode the target sequence from the vector.
Posted Content

Sequence to Sequence Learning with Neural Networks

TL;DR: This paper presents a general end-to-end approach to sequence learning that makes minimal assumptions on the sequence structure, and finds that reversing the order of the words in all source sentences improved the LSTM's performance markedly, because doing so introduced many short term dependencies between the source and the target sentence which made the optimization problem easier.
Proceedings Article

Generating Text with Recurrent Neural Networks

TL;DR: The power of RNNs trained with the new Hessian-Free optimizer by applying them to character-level language modeling tasks is demonstrated, and a new RNN variant that uses multiplicative connections which allow the current input character to determine the transition matrix from one hidden state vector to the next is introduced.
Book ChapterDOI

Hiding the Hidden: A software system for concealing ciphertext as innocuous text

TL;DR: A system for protecting the privacy of cryptograms from detection by censors is presented; it transforms ciphertext into innocuous text that can be transformed back into the original ciphertext.