Journal ArticleDOI

5G SLAM Using the Clustering and Assignment Approach with Diffuse Multipath

18 Aug 2020-Sensors (MDPI AG)-Vol. 20, Iss: 16, pp 4656
TL;DR: This study considers an intermediate approach, which consists of four phases: downlink data transmission, multi-dimensional channel estimation, channel parameter clustering, and simultaneous localization and mapping (SLAM) based on a novel likelihood function.
Abstract: 5G communication systems operating above 24 GHz have promising properties for user localization and environment mapping. Existing studies have either relied on simplified abstract models of the signal propagation and the measurements, or are based on direct positioning approaches, which directly map the received waveform to a position. In this study, we consider an intermediate approach, which consists of four phases: downlink data transmission, multi-dimensional channel estimation, channel parameter clustering, and simultaneous localization and mapping (SLAM) based on a novel likelihood function. This approach decomposes the problem into simpler steps, thus leading to lower complexity. At the same time, by considering an end-to-end processing chain, we account for a wide variety of practical impairments. Simulation results demonstrate the efficacy of the proposed approach.
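As a toy illustration of how the clustering phase might sit in such a chain, the sketch below groups per-path channel estimates (delay, angle-of-arrival) produced by a hypothetical estimator. The simulated data, the greedy threshold rule, and all tolerances are illustrative assumptions, not the paper's clustering-and-assignment algorithm:

```python
import numpy as np

# Illustrative sketch of the clustering phase: the channel estimator's
# output is simulated as two specular paths plus diffuse multipath that
# scatters noisy (delay [ns], angle-of-arrival [rad]) estimates around them.
rng = np.random.default_rng(0)
specular = np.array([[10.0, 0.3], [25.0, -0.8]])             # true paths
paths = np.vstack([s + rng.normal(0.0, [0.5, 0.02], (20, 2))
                   for s in specular])                       # 40 estimates

def cluster(params, tol=np.array([5.0, 0.2])):
    """Greedy threshold clustering: assign each estimate to the first
    cluster whose seed it matches within `tol`, else open a new cluster."""
    centers, labels = [], []
    for p in params:
        for i, c in enumerate(centers):
            if np.all(np.abs(p - c) < tol):
                labels.append(i)
                break
        else:
            centers.append(p.copy())
            labels.append(len(centers) - 1)
    return np.array(centers), np.array(labels)

centers, labels = cluster(paths)
print(len(centers))   # the two specular paths are recovered as two clusters
```

In the paper's pipeline, each recovered cluster would then feed the SLAM stage as one landmark measurement.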


Citations
Posted Content
TL;DR: In this paper, the authors provide a tutorial on the fundamental properties of the RIS technology from a signal processing perspective, to complement the recent surveys of electromagnetic and hardware aspects, and exemplify how they can be utilized for improved communication, localization and sensing.
Abstract: A reconfigurable intelligent surface (RIS) is a two-dimensional surface of engineered material whose properties are reconfigurable rather than static [4]. For example, the scattering, absorption, reflection, and diffraction properties can be changed with time and controlled by software. In principle, the surface can be used to synthesize an arbitrarily-shaped object of the same size, when it comes to how electromagnetic waves interact with it [5]. The long-term vision of the RIS technology is to create smart radio environments [9], where the wireless propagation conditions are co-engineered with the physical-layer signaling, and to investigate how to utilize this new capability. The common protocol stack consists of seven layers and wireless technology is chiefly focused on the first three layers (physical, link, and network) [10]. An RIS operates at what can be referred to as Layer 0, where the traditional design issue is the antennas of the transmitter/receivers; one can think of RIS as extending the antenna design towards the environment, commonly seen as uncontrollable and decided by "nature". This approach can profoundly change the wireless design beyond 5G. This article provides a tutorial on the fundamental properties of the RIS technology from a signal processing perspective, to complement the recent surveys of electromagnetic and hardware aspects [4], [7], communication theory [11], and localization [8]. We will provide the formulas and derivations that are required to understand and analyze RIS-aided systems, and exemplify how they can be utilized for improved communication, localization, and sensing. We will also elaborate on the fundamentally new possibilities enabled by Layer 0 engineering and electromagnetic phenomena that remain to be modeled and utilized for improved signal processing.

98 citations

Journal ArticleDOI
TL;DR: In this paper , the authors provide a tutorial on the fundamental properties of the RIS technology from a signal processing perspective, and exemplify how they can be utilized for improved communication, localization, and sensing.
Abstract: Antenna array technology enables directional transmission and reception of wireless signals, for communications, localization, and sensing purposes. The signal processing algorithms that underpin this technology began to be developed several decades ago [1], but it is only with the ongoing deployment of fifth-generation (5G) wireless mobile networks that the technology has become mainstream [2]. The number of antenna elements in the arrays of 5G base stations and user devices is on the order of 100 and 10, respectively. As the networks shift towards using higher frequency bands, more antennas fit into a given aperture. The 5G developments enhance the transmitter and receiver functionalities, but the wireless channel propagation remains an uncontrollable system. This is illustrated in Fig. 1(a) and its mathematical notation will be introduced later. Transmitted signals with three different frequencies are shown to illustrate the fact that attenuation can vary greatly across frequencies. Looking beyond 5G, the advent of electromagnetic components that can shape how they interact with wireless signals enables partial control of the propagation. A reconfigurable intelligent surface (RIS) is a two-dimensional surface of engineered material whose properties are reconfigurable rather than static [4]. This article provides a tutorial on the fundamental properties of the RIS technology from a signal processing perspective. It is meant as a complement to recent surveys of electromagnetic and hardware aspects [4], [7], [11], acoustics [12], communication theory [13], and localization [8]. We will provide the formulas and derivations that are required to understand and analyze RIS-aided systems using signal processing, and exemplify how they can be utilized for improved communication, localization, and sensing.

71 citations

Proceedings ArticleDOI
13 Sep 2021
TL;DR: The Hexa-X project as mentioned in this paper aims to support high-resolution localization and sensing in the next generation of mobile communication, which will enable novel use cases requiring extreme localization performance and provide a means to support and improve communication functionalities.
Abstract: 6G will likely be the first generation of mobile communication that will feature tight integration of localization and sensing with communication functionalities. Among several worldwide initiatives, the Hexa-X flagship project stands out as it brings together 25 key players from adjacent industries and academia, and has among its explicit goals to research fundamentally new radio access technologies and high-resolution localization and sensing. Such features will not only enable novel use cases requiring extreme localization performance, but also provide a means to support and improve communication functionalities. This paper provides an overview of the Hexa-X vision alongside the envisioned use cases. To close the required performance gap of these use cases with respect to 5G, several technical enablers will be discussed, together with the associated research challenges for the coming years.

71 citations

Journal ArticleDOI
TL;DR: A novel low-complexity method for joint localization and synchronization based on an optimized design of the base station (BS) active precoding and RIS passive phase profiles is proposed, for the challenging case of a single-antenna receiver.
Abstract: Reconfigurable intelligent surfaces (RISs) have attracted enormous interest thanks to their ability to overcome line-of-sight blockages in mmWave systems, enabling in turn accurate localization with minimal infrastructure. Less investigated are however the benefits of exploiting RIS with suitably designed beamforming strategies for optimized localization and synchronization performance. In this paper, a novel low-complexity method for joint localization and synchronization based on an optimized design of the base station (BS) active precoding and RIS passive phase profiles is proposed, for the challenging case of a single-antenna receiver. The theoretical position error bound is first derived and used as metric to jointly optimize the BS-RIS beamforming, assuming a priori knowledge of the user position. By exploiting the low-dimensional structure of the solution, a novel codebook-based robust design strategy with optimized beam power allocation is then proposed, which provides low-complexity while taking into account the uncertainty on the user position. Finally, a reduced-complexity maximum-likelihood based estimation procedure is devised to jointly recover the user position and the synchronization offset. Extensive numerical analysis shows that the proposed joint BS-RIS beamforming scheme provides enhanced localization and synchronization performance compared to existing solutions, with the proposed estimator attaining the theoretical bounds even at low signal-to-noise-ratio and in the presence of additional uncontrollable multipath propagation.

29 citations

Journal ArticleDOI
TL;DR: Terahertz (THz) communications are celebrated as key enablers for converged localization and sensing in future 6G wireless communication systems and beyond as discussed by the authors, and localization in 6G is indispensable for location-aware communications.
Abstract: Terahertz (THz) communications are celebrated as key enablers for converged localization and sensing in future sixth-generation (6G) wireless communication systems and beyond. Instead of being a byproduct of the communication system, localization in 6G is indispensable for location-aware communications. Towards this end, we aim to identify the prospects, challenges, and requirements of THz localization techniques. We first review the history and trends of localization methods and discuss their objectives, constraints, and applications in contemporary communication systems. We then detail the latest advances in THz communications and introduce THz-specific channel and system models. Afterward, we formulate THz-band localization as a 3D position/orientation estimation problem, detailing geometry-based localization techniques and describing potential THz localization and sensing extensions. We further formulate the offline design and online optimization of THz localization systems, provide numerical simulation results, and conclude by providing lessons learned and future research directions. Preliminary results illustrate that under the same transmission power and array footprint, THz-based localization outperforms millimeter wave-based localization. In other words, the same level of localization performance can be achieved at THz-band with less transmission power or a smaller footprint.

25 citations

References
Book
08 Sep 2000
TL;DR: This book presents dozens of algorithms and implementation examples, all in pseudo-code and suitable for use in real-world, large-scale data mining projects, and provides a comprehensive, practical look at the concepts and techniques you need to get the most out of real business data.
Abstract: The increasing volume of data in modern business and science calls for more complex and sophisticated tools. Although advances in data mining technology have made extensive data collection much easier, the field is constantly evolving and there is a continual need for new techniques and tools that can help us transform this data into useful information and knowledge. Since the previous edition's publication, great advances have been made in the field of data mining. Not only does the third edition of Data Mining: Concepts and Techniques continue the tradition of equipping you with an understanding and application of the theory and practice of discovering patterns hidden in large data sets, it also focuses on new, important topics in the field: data warehouses and data cube technology, mining streams, mining social networks, and mining spatial, multimedia, and other complex data. Each chapter is a stand-alone guide to a critical topic, presenting proven algorithms and sound implementations ready to be used directly or with strategic modification against live data. This is the resource you need if you want to apply today's most powerful data mining techniques to meet real business challenges. The book presents dozens of algorithms and implementation examples, all in pseudo-code and suitable for use in real-world, large-scale data mining projects; addresses advanced topics such as mining object-relational databases, spatial databases, multimedia databases, time-series databases, text databases, the World Wide Web, and applications in several fields; and provides a comprehensive, practical look at the concepts and techniques you need to get the most out of real business data.

23,600 citations

Proceedings Article
01 Jan 1996
TL;DR: DBSCAN, a new clustering algorithm relying on a density-based notion of clusters and designed to discover clusters of arbitrary shape, is presented; it requires only one input parameter and supports the user in determining an appropriate value for it.
Abstract: Clustering algorithms are attractive for the task of class identification in spatial databases. However, the application to large spatial databases raises the following requirements for clustering algorithms: minimal requirements of domain knowledge to determine the input parameters, discovery of clusters with arbitrary shape and good efficiency on large databases. The well-known clustering algorithms offer no solution to the combination of these requirements. In this paper, we present the new clustering algorithm DBSCAN relying on a density-based notion of clusters which is designed to discover clusters of arbitrary shape. DBSCAN requires only one input parameter and supports the user in determining an appropriate value for it. We performed an experimental evaluation of the effectiveness and efficiency of DBSCAN using synthetic data and real data of the SEQUOIA 2000 benchmark. The results of our experiments demonstrate that (1) DBSCAN is significantly more effective in discovering clusters of arbitrary shape than the well-known algorithm CLARANS, and that (2) DBSCAN outperforms CLARANS by a factor of more than 100 in terms of efficiency.
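A minimal, illustrative implementation of the DBSCAN idea (core points, density-reachable expansion, noise labeling) might look as follows. Note the paper treats MinPts as fixed by a heuristic, leaving Eps as the single input parameter; here both appear explicitly, and the brute-force neighbor search is a simplification for clarity:

```python
from collections import deque

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN sketch: -1 labels noise, 0, 1, ... label clusters."""
    def neighbors(i):
        # Brute-force range query (a spatial index would replace this).
        return [j for j, q in enumerate(points)
                if sum((a - b) ** 2 for a, b in zip(points[i], q)) <= eps ** 2]

    labels = [None] * len(points)
    cluster_id = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:       # not a core point: provisionally noise
            labels[i] = -1
            continue
        labels[i] = cluster_id         # start a new cluster at this core point
        queue = deque(seeds)
        while queue:
            j = queue.popleft()
            if labels[j] == -1:        # border point, previously marked noise
                labels[j] = cluster_id
            if labels[j] is not None:
                continue
            labels[j] = cluster_id
            nbrs = neighbors(j)
            if len(nbrs) >= min_pts:   # j is also core: keep expanding
                queue.extend(nbrs)
        cluster_id += 1
    return labels

pts = [(0.0, 0.0), (0.2, 0.0), (0.0, 0.2),
       (5.0, 5.0), (5.2, 5.0), (5.0, 5.2), (10.0, 10.0)]
print(dbscan(pts, eps=0.5, min_pts=3))   # -> [0, 0, 0, 1, 1, 1, -1]
```

The isolated point at (10, 10) has too few neighbors to be a core point and is labeled noise, which is the behavior that distinguishes DBSCAN from partitioning methods like CLARANS.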

14,297 citations

Journal ArticleDOI
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Abstract: This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order tensors (i.e., $N$-way arrays with $N \geq 3$) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2 as well as nonnegative variants of all of the above. The N-way Toolbox, Tensor Toolbox, and Multilinear Engine are examples of software packages for working with tensors.
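To make the rank-one intuition behind CP concrete, the snippet below builds a rank-one third-order tensor as an outer product of three vectors and checks that every mode-n unfolding has matrix rank one; the row-major fiber ordering in `unfold` is a convention choice (Kolda and Bader use a column-major variant, which does not affect the rank):

```python
import numpy as np

def unfold(T, mode):
    """Mode-n matricization: bring `mode` to the front, flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# A rank-one third-order tensor is the outer product of three vectors.
a = np.array([1.0, 2.0])
b = np.array([1.0, 0.0, -1.0])
c = np.array([2.0, 3.0])
T = np.einsum('i,j,k->ijk', a, b, c)     # shape (2, 3, 2)

# Every unfolding of a rank-one tensor is a rank-one matrix; CP expresses
# a general tensor as a sum of such rank-one terms.
for mode in range(3):
    M = unfold(T, mode)
    print(mode, M.shape, np.linalg.matrix_rank(M))
```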

9,227 citations

Journal ArticleDOI
01 Jun 2010
TL;DR: A brief overview of clustering is provided, well known clustering methods are summarized, the major challenges and key issues in designing clustering algorithms are discussed, and some of the emerging and useful research directions are pointed out.
Abstract: Organizing data into sensible groupings is one of the most fundamental modes of understanding and learning. As an example, a common scheme of scientific classification puts organisms into a system of ranked taxa: domain, kingdom, phylum, class, etc. Cluster analysis is the formal study of methods and algorithms for grouping, or clustering, objects according to measured or perceived intrinsic characteristics or similarity. Cluster analysis does not use category labels that tag objects with prior identifiers, i.e., class labels. The absence of category information distinguishes data clustering (unsupervised learning) from classification or discriminant analysis (supervised learning). The aim of clustering is to find structure in data and is therefore exploratory in nature. Clustering has a long and rich history in a variety of scientific fields. One of the most popular and simple clustering algorithms, K-means, was first published in 1955. In spite of the fact that K-means was proposed over 50 years ago and thousands of clustering algorithms have been published since then, K-means is still widely used. This speaks to the difficulty in designing a general purpose clustering algorithm and the ill-posed problem of clustering. We provide a brief overview of clustering, summarize well known clustering methods, discuss the major challenges and key issues in designing clustering algorithms, and point out some of the emerging and useful research directions, including semi-supervised clustering, ensemble clustering, simultaneous feature selection during data clustering, and large scale data clustering.
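The K-means algorithm the abstract highlights reduces to Lloyd's assign-then-update loop; a compact sketch on two synthetic, well-separated groups follows (the data, seeds, and fixed iteration budget are illustrative choices):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Lloyd's algorithm: alternate nearest-centroid assignment and
    centroid recomputation for a fixed iteration budget."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: index of the nearest centroid per point.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its points
        # (keep the old centroid if a cluster ends up empty).
        centroids = np.array([X[labels == j].mean(axis=0)
                              if np.any(labels == j) else centroids[j]
                              for j in range(k)])
    return labels, centroids

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)),    # group around (0, 0)
               rng.normal(4.0, 0.3, (20, 2))])   # group around (4, 4)
labels, centroids = kmeans(X, k=2)
```

The simplicity of this loop is part of why K-means remains in wide use; the hard parts the abstract points to (choosing k, sensitivity to initialization) sit outside it.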

6,601 citations

Journal ArticleDOI
16 Feb 2007-Science
TL;DR: A method called "affinity propagation" takes as input measures of similarity between pairs of data points; it found clusters with much lower error than other methods, and did so in less than one-hundredth the amount of time.
Abstract: Clustering data by identifying a subset of representative examples is important for processing sensory signals and detecting patterns in data. Such "exemplars" can be found by randomly choosing an initial subset of data points and then iteratively refining it, but this works well only if that initial choice is close to a good solution. We devised a method called "affinity propagation," which takes as input measures of similarity between pairs of data points. Real-valued messages are exchanged between data points until a high-quality set of exemplars and corresponding clusters gradually emerges. We used affinity propagation to cluster images of faces, detect genes in microarray data, identify representative sentences in this manuscript, and identify cities that are efficiently accessed by airline travel. Affinity propagation found clusters with much lower error than other methods, and it did so in less than one-hundredth the amount of time.
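The real-valued messages the abstract describes are responsibilities and availabilities, updated with damping until exemplars emerge; a numpy sketch of these updates follows. The toy two-group data and the preference value of -10 on the diagonal are illustrative choices, not from the paper:

```python
import numpy as np

def affinity_propagation(S, damping=0.5, iters=200):
    """Sketch of affinity propagation: S[i, k] is the similarity of point i
    to candidate exemplar k; S[k, k] (the "preference") controls how many
    exemplars emerge. Returns exemplar indices and per-point labels."""
    n = S.shape[0]
    R = np.zeros((n, n))                 # responsibilities
    A = np.zeros((n, n))                 # availabilities
    rows = np.arange(n)
    for _ in range(iters):
        # r(i,k) = s(i,k) - max_{k' != k} [a(i,k') + s(i,k')]
        AS = A + S
        idx = AS.argmax(axis=1)
        first = AS[rows, idx]
        AS[rows, idx] = -np.inf
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[rows, idx] = S[rows, idx] - second
        R = damping * R + (1 - damping) * Rnew
        # a(i,k) = min(0, r(k,k) + sum_{i' != i,k} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, R.diagonal())
        col = Rp.sum(axis=0)
        Anew = np.minimum(0, col[None, :] - Rp)
        np.fill_diagonal(Anew, col - R.diagonal())   # a(k,k) has no min(0, .)
        A = damping * A + (1 - damping) * Anew
    exemplars = np.flatnonzero(np.diag(A + R) > 0)
    labels = exemplars[S[:, exemplars].argmax(axis=1)]
    return exemplars, labels

# Two tight groups; similarity = negative squared Euclidean distance.
pts = np.array([[0, 0], [0.2, 0], [0, 0.2],
                [4, 4], [4.2, 4], [4, 4.2]], dtype=float)
S = -((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
np.fill_diagonal(S, -10.0)               # common preference for all points
exemplars, labels = affinity_propagation(S)
```

With this preference, one exemplar emerges per group; raising the preference toward zero would make more points eligible as exemplars, yielding more, smaller clusters.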

6,429 citations