Open Access Journal Article

Deep Learning Enabled Semantic Communication Systems

TL;DR: This paper proposes DeepSC, a deep learning based semantic communication system for text transmission built on the Transformer, which aims to maximize the system capacity and minimize semantic errors by recovering the meaning of sentences rather than bit or symbol errors as in traditional communications.
Abstract
Recently, deep learning enabled end-to-end communication systems have been developed to merge all physical layer blocks of traditional communication systems, which makes joint transceiver optimization possible. Powered by deep learning, natural language processing has achieved great success in analyzing and understanding large amounts of text. Inspired by research results in both areas, we aim to provide a new view of communication systems from the semantic level. In particular, we propose a deep learning based semantic communication system, named DeepSC, for text transmission. Based on the Transformer, DeepSC aims to maximize the system capacity and minimize semantic errors by recovering the meaning of sentences, rather than bit or symbol errors as in traditional communications. Moreover, transfer learning is used to make DeepSC applicable to different communication environments and to accelerate model training. To accurately assess the performance of semantic communications, we also introduce a new metric, named sentence similarity. Compared with traditional communication systems that do not consider semantic information exchange, the proposed DeepSC is more robust to channel variation and achieves better performance, especially in the low signal-to-noise ratio (SNR) regime, as demonstrated by extensive simulation results.
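To make the pipeline described in the abstract concrete, the following is a minimal sketch of a DeepSC-style semantic transceiver, assuming PyTorch. The class name, layer sizes, and the simple AWGN channel model are illustrative assumptions for this sketch, not the paper's exact architecture (for instance, the paper's decoder uses Transformer decoder layers and the transmitted symbols are power-normalized).

```python
# Illustrative sketch only: a Transformer semantic encoder/decoder wrapped
# around dense channel encoding/decoding and an assumed AWGN channel.
import torch
import torch.nn as nn

class SemanticTransceiver(nn.Module):
    def __init__(self, vocab_size, d_model=128, n_heads=8, n_layers=3,
                 channel_dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Semantic encoder/decoder: Transformer layers extract and recover
        # sentence meaning rather than protecting individual bits.
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.semantic_encoder = nn.TransformerEncoder(enc_layer, n_layers)
        dec_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.semantic_decoder = nn.TransformerEncoder(dec_layer, n_layers)
        # Channel encoder/decoder: dense layers map semantic features to
        # channel symbols and back.
        self.channel_encoder = nn.Linear(d_model, channel_dim)
        self.channel_decoder = nn.Linear(channel_dim, d_model)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, tokens, snr_db=6.0):
        x = self.semantic_encoder(self.embed(tokens))
        tx = self.channel_encoder(x)
        # Assumed AWGN channel: noise power set by the target SNR.
        noise_power = 10 ** (-snr_db / 10)
        rx = tx + noise_power ** 0.5 * torch.randn_like(tx)
        y = self.semantic_decoder(self.channel_decoder(rx))
        return self.out(y)  # per-token logits for sentence recovery
```

Training such a model end to end on the cross-entropy between input and recovered tokens is what lets the whole transceiver optimize for sentence meaning rather than per-bit accuracy.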


Citations
Journal Article

Semantic Communication Systems for Speech Transmission

TL;DR: This paper designs a deep learning (DL)-enabled semantic communication system for speech signals, named DeepSC-S, which is built on an attention mechanism using a squeeze-and-excitation (SE) network and outperforms traditional communication systems in terms of speech signal metrics.
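For reference, a squeeze-and-excitation block of the kind DeepSC-S builds on can be sketched in a few lines, assuming PyTorch; the dimensions and reduction ratio here are illustrative, not taken from the paper.

```python
# Illustrative SE block: "squeeze" channel-wise features by global pooling,
# then "excite" them with learned per-channel gates.
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):            # x: (batch, channels, time)
        w = x.mean(dim=-1)           # squeeze: global average over time
        w = self.fc(w).unsqueeze(-1) # excite: per-channel gates in (0, 1)
        return x * w                 # reweight feature channels
```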
Journal Article

A Lite Distributed Semantic Communication System for Internet of Things

TL;DR: This paper proposes a lite distributed semantic communication system based on DL, named L-DeepSC, for text transmission with low complexity, where data transmission from IoT devices to the cloud/edge works at the semantic level to improve transmission efficiency.
Posted Content

6G Networks: Beyond Shannon Towards Semantic and Goal-Oriented Communications

TL;DR: The goal of this paper is to promote the idea that including semantic and goal-oriented aspects in future 6G networks can produce a significant leap forward in terms of system effectiveness and sustainability.
Journal Article

Semantic Communications: Overview, Open Issues, and Future Research Directions

TL;DR: This paper gives an overview of the latest deep learning (DL) and end-to-end (E2E) communication based semantic communication systems and explicitly discusses open issues that need to be tackled.
Journal Article

A Full Dive Into Realizing the Edge-Enabled Metaverse: Visions, Enabling Technologies, and Challenges

TL;DR: This survey focuses on the edge-enabled Metaverse and how to realize its ultimate vision, exploring how blockchain technologies can aid in the interoperable development of the Metaverse, not only to empower the economic circulation of virtual user-generated content but also to manage physical edge resources in a decentralized, transparent, and immutable manner.
References
Proceedings Article

Attention is All you Need

TL;DR: This paper proposes the Transformer, a network architecture based solely on attention mechanisms, dispensing with recurrence and convolutions entirely, and achieves state-of-the-art performance on English-to-French translation.
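The core operation of the Transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V, which can be sketched in a few lines of PyTorch (masking and multi-head projections omitted for brevity):

```python
# Minimal scaled dot-product attention; q, k, v: (..., seq_len, d_k).
import torch

def scaled_dot_product_attention(q, k, v):
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5  # pairwise similarities
    weights = torch.softmax(scores, dim=-1)        # attention distribution
    return weights @ v                             # weighted sum of values

q = k = v = torch.randn(1, 5, 64)
ctx = scaled_dot_product_attention(q, k, v)        # shape (1, 5, 64)
```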
Proceedings Article

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

TL;DR: BERT pre-trains deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers, and can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.
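The "one additional output layer" recipe looks roughly as follows, assuming the Hugging Face transformers library; the checkpoint name, example text, and label are illustrative.

```python
# Sketch of BERT fine-tuning: a pre-trained bidirectional encoder plus a
# single task-specific classification head, trained end to end.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # the one extra output layer

inputs = tokenizer("the channel is noisy", return_tensors="pt")
outputs = model(**inputs, labels=torch.tensor([1]))
outputs.loss.backward()  # gradients flow through all layers
```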
Proceedings Article

Bleu: a Method for Automatic Evaluation of Machine Translation

TL;DR: This paper proposes BLEU, a method of automatic machine translation evaluation that is quick, inexpensive, and language-independent, that correlates highly with human evaluation, and that has little marginal cost per run.
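BLEU combines clipped n-gram precision with a brevity penalty; a minimal, self-contained single-reference sketch is below (real evaluations should use a standard implementation such as sacrebleu, and the tiny epsilon smoothing here is an assumption to avoid log(0)).

```python
# Illustrative BLEU: geometric mean of clipped 1..4-gram precisions times
# a brevity penalty that punishes candidates shorter than the reference.
import math
from collections import Counter

def bleu(candidate, reference, max_n=4):
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(tuple(candidate[i:i + n])
                       for i in range(len(candidate) - n + 1))
        ref = Counter(tuple(reference[i:i + n])
                      for i in range(len(reference) - n + 1))
        overlap = sum((cand & ref).values())       # clipped n-gram matches
        total = max(sum(cand.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)
    bp = min(1.0, math.exp(1 - len(reference) / len(candidate)))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)

print(bleu("the cat sat on the mat".split(),
           "the cat is on the mat".split()))
```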
Posted Content

Efficient Estimation of Word Representations in Vector Space

TL;DR: This paper proposes two novel model architectures for computing continuous vector representations of words from very large data sets; the quality of these representations is measured on a word similarity task, and the results are compared with previously best performing techniques based on different types of neural networks.
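Training word vectors of this kind takes only a few lines, assuming the gensim library (4.x API); the toy corpus and hyperparameters below are illustrative.

```python
# Usage sketch: train skip-gram (sg=1) word vectors on a toy corpus.
from gensim.models import Word2Vec

corpus = [["semantic", "communication", "recovers", "meaning"],
          ["traditional", "systems", "recover", "bits"]]
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["semantic"]                        # continuous word vector
print(model.wv.most_similar("semantic", topn=2))  # nearest neighbours
```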
Proceedings Article

Neural Machine Translation by Jointly Learning to Align and Translate

TL;DR: It is conjectured that the use of a fixed-length vector is a bottleneck in improving the performance of the basic encoder-decoder architecture, and the paper proposes to extend it by allowing the model to automatically (soft-)search for parts of the source sentence that are relevant to predicting a target word, without having to form these parts as hard segments explicitly.
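This "soft-search" is additive attention: the decoder scores every encoder state against its current state and takes an expected context vector instead of relying on one fixed-length summary. A hedged PyTorch sketch, with illustrative dimensions:

```python
# Illustrative additive (Bahdanau-style) attention for an encoder-decoder.
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, enc_dim, dec_dim, attn_dim=64):
        super().__init__()
        self.w_enc = nn.Linear(enc_dim, attn_dim)
        self.w_dec = nn.Linear(dec_dim, attn_dim)
        self.v = nn.Linear(attn_dim, 1)

    def forward(self, dec_state, enc_states):
        # dec_state: (batch, dec_dim); enc_states: (batch, src_len, enc_dim)
        scores = self.v(torch.tanh(
            self.w_enc(enc_states) + self.w_dec(dec_state).unsqueeze(1)))
        weights = torch.softmax(scores, dim=1)    # soft alignment over source
        return (weights * enc_states).sum(dim=1)  # expected context vector
```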