scispace - formally typeset
Author

Ali H. Sayed

Bio: Ali H. Sayed is an academic researcher from École Polytechnique Fédérale de Lausanne. The author has contributed to research in topics: Adaptive filter & Optimization problem. The author has an h-index of 81 and has co-authored 728 publications receiving 36030 citations. Previous affiliations of Ali H. Sayed include Harbin Engineering University & University of California, Los Angeles.


Papers
Proceedings ArticleDOI
14 Mar 2022
TL;DR: This work proposes a technique that addresses questions of explainability and interpretability when the graph is hidden: it aims to infer the underlying graph topology, discover pairwise influences between the agents, and identify significant trajectories in the network.
Abstract: Social learning algorithms provide models for the formation of opinions over social networks resulting from local reasoning and peer-to-peer exchanges. Interactions occur over an underlying graph topology, which describes the flow of information among the agents. To account for drifting conditions in the environment, this work adopts an adaptive social learning strategy, which is able to track variations in the underlying signal statistics. Among other results, we propose a technique that addresses questions of explainability and interpretability of the results when the graph is hidden. Given observations of the evolution of the beliefs over time, we aim to infer the underlying graph topology, discover pairwise influences between the agents, and identify significant trajectories in the network. The proposed framework is online in nature and can adapt dynamically to changes in the graph topology or the true hypothesis.
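The belief dynamics described in this abstract can be illustrated with a minimal log-domain sketch of an adaptive social learning step. This is a generic illustration, not the paper's exact algorithm; the function name, the discount parameter `delta`, and the combination-matrix convention are my own assumptions.

```python
import numpy as np

def asl_step(beliefs, A, loglik, delta=0.1):
    """One illustrative adaptive social learning step.

    beliefs : (N, H) array; row k is agent k's belief over H hypotheses
    A       : (N, N) left-stochastic combination matrix; A[j, k] is the
              weight agent k assigns to neighbor j
    loglik  : (N, H) log-likelihoods of the fresh observations
    delta   : adaptation parameter in (0, 1) that discounts old beliefs,
              which is what lets the strategy track drifting statistics
    """
    # Local adaptation: discount the past belief, absorb the new evidence.
    log_psi = (1.0 - delta) * np.log(beliefs) + delta * loglik
    # Social combination: geometric averaging over the neighborhood.
    log_mu = A.T @ log_psi
    # Normalize each row back into a probability vector.
    log_mu -= log_mu.max(axis=1, keepdims=True)
    mu = np.exp(log_mu)
    return mu / mu.sum(axis=1, keepdims=True)
```

Under this simplified model the log-belief recursion is linear in the combination matrix, which hints at why observing belief trajectories over time can reveal the hidden topology.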

2 citations

Journal ArticleDOI
TL;DR: This paper proposes and studies an adaptive decentralized strategy in which the agents employ differential randomized quantizers to compress their estimates before communicating with their neighbors, revealing that decentralized learning is achievable at the expense of only a few bits.
Abstract: In this article, we consider decentralized optimization problems where agents have individual cost functions to minimize subject to subspace constraints that require the minimizers across the network to lie in low-dimensional subspaces. This constrained formulation includes consensus or single-task optimization as special cases, and allows for more general task relatedness models such as multitask smoothness and coupled optimization. In order to cope with communication constraints, we propose and study an adaptive decentralized strategy where the agents employ differential randomized quantizers to compress their estimates before communicating with their neighbors. The analysis shows that, under some general conditions on the quantization noise, and for sufficiently small step-sizes $\mu$, the strategy is stable both in terms of mean-square error and average bit rate: by reducing $\mu$, it is possible to keep the estimation errors small (on the order of $\mu$) without increasing indefinitely the bit rate as $\mu \rightarrow 0$ when variable-rate quantizers are used. Simulations illustrate the theoretical findings and the effectiveness of the proposed approach, revealing that decentralized learning is achievable at the expense of only a few bits.
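The differential randomized quantization idea, transmitting only a quantized innovation so that the bit budget shrinks as the iterates settle, can be sketched as follows. This is a hand-written illustration under my own assumptions (a subtractive-dither uniform quantizer with a fixed step), not the paper's quantizer design.

```python
import numpy as np

rng = np.random.default_rng(0)

def dithered_quantize(x, step=0.05):
    """Subtractive-dither uniform quantizer: unbiased, with E[q(x)] = x
    and |q(x) - x| <= step / 2."""
    u = rng.uniform(-0.5, 0.5, size=x.shape)
    return step * (np.round(x / step + u) - u)

class DifferentialEncoder:
    """Quantize only the innovation w - w_hat. Sender and receiver keep
    the same reconstruction w_hat, so only the quantized difference
    needs to cross the link."""

    def __init__(self, dim, step=0.05):
        self.w_hat = np.zeros(dim)  # shared state with the neighbors
        self.step = step

    def transmit(self, w):
        q = dithered_quantize(w - self.w_hat, self.step)
        self.w_hat = self.w_hat + q  # the receiver applies the same update
        return self.w_hat
```

As the estimates converge, the innovation w - w_hat shrinks, so a variable-rate code for q needs fewer and fewer bits; this is the mechanism behind the bounded-bit-rate behavior described in the abstract.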

2 citations

Book ChapterDOI
01 Jan 1999
TL;DR: In this chapter, the authors develop array algorithms for H∞ filtering, which can be regarded as natural generalizations of their H2 counterparts, and involve propagating the indefinite square roots of the quantities of interest.
Abstract: Currently, the preferred method for implementing H2 estimation algorithms is what is called the array form, and includes two main families: square-root array algorithms, which are typically more stable than conventional ones, and fast array algorithms, which, when the system is time-invariant, typically offer an order-of-magnitude reduction in the computational effort. Using our recent observation that H∞ filtering coincides with Kalman filtering in Krein space, in this chapter we develop array algorithms for H∞ filtering. These can be regarded as natural generalizations of their H2 counterparts, and involve propagating the indefinite square roots of the quantities of interest. The H∞ square-root and fast array algorithms both have the interesting feature that one does not need to explicitly check for the positivity conditions required for the existence of H∞ filters. These conditions are built into the algorithms themselves, so that an H∞ estimator of the desired level exists if, and only if, the algorithms can be executed. However, since the H∞ square-root algorithms predominantly use J-unitary transformations, rather than the unitary transformations required in the H2 case, further investigation is needed to determine the numerical behavior of such algorithms.
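For the H2 side, the flavor of a square-root array step can be sketched with a standard covariance-form Kalman recursion: build a pre-array from Cholesky factors, triangularize it with one QR factorization, and read the updated factors off the post-array. This is a textbook-style H2 sketch of my own; the H∞ variants discussed above use J-unitary rather than unitary transformations and are not reproduced here.

```python
import numpy as np

def sqrt_kf_step(F, G, H, S, Qc, Rc):
    """One square-root (array) Kalman step for x' = Fx + Gw, y = Hx + v.

    S, Qc, Rc are Cholesky-type factors: P = S S^T, Q = Qc Qc^T,
    R = Rc Rc^T. Instead of running the Riccati recursion on P itself,
    we triangularize a pre-array built from the factors.
    """
    n, m = F.shape[0], H.shape[0]
    pre = np.block([
        [Rc,               H @ S, np.zeros((m, G.shape[1]))],
        [np.zeros((n, m)), F @ S, G @ Qc],
    ])
    # LQ factorization via QR of the transpose: pre = L Qt with L lower
    # triangular, so L L^T = pre pre^T.
    _, r = np.linalg.qr(pre.T)
    L = r.T
    Re_half = L[:m, :m]             # square root of the innovation covariance
    Kp = L[m:, :m]                  # equals F P H^T Re^{-T/2}
    S_next = L[m:, m:]              # square root of the next P
    K = Kp @ np.linalg.inv(Re_half) # predicted-form gain F P H^T Re^{-1}
    return S_next, K, Re_half
```

The appeal mentioned in the abstract is visible even in this H2 sketch: all quantities of interest come straight off the triangularized post-array, and P is never squared up explicitly.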

2 citations

Patent
15 Oct 2003
TL;DR: This patent describes an adaptive multi-bit delta and sigma-delta modulation and demodulation technique, in which a one-bit modulator generates a binary output signal from an analog input signal and a multi-bit adapter generates a scaling signal for scaling the step-size of the sigma-delta modulator.
Abstract: An adaptive multi-bit delta and sigma-delta modulation and demodulation technique, wherein a one-bit modulator generates a binary output signal from an analog input signal, and a multi-bit adapter generates a scaling signal for scaling a step-size of the sigma-delta modulator.
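The patent's mechanism, one-bit modulation with a step size adapted from the bit pattern, can be sketched as follows. The specific adaptation rule (grow the step on repeated bits, shrink it on alternations) and all constants are my own illustrative choices, not the patent's claims.

```python
import numpy as np

def adaptive_delta_mod(x, d0=0.1, up=1.5, down=0.66):
    """Toy one-bit modulator with an adaptive step size.

    Repeated identical bits suggest slope overload, so the step grows;
    alternating bits suggest granular noise, so the step shrinks. The
    decoder can run the same rule from the bit stream alone.
    """
    bits, recon = [], []
    integ, est, d, prev = 0.0, 0.0, d0, 0
    for sample in x:
        integ += sample - est            # accumulate the tracking error
        b = 1 if integ >= 0 else -1      # one-bit quantizer
        d *= up if b == prev else down   # adapt the step size
        est += b * d                     # update the tracked estimate
        bits.append(b)
        recon.append(est)
        prev = b
    return np.array(bits), np.array(recon)
```

Because the step-size rule depends only on the transmitted bits, the demodulator stays synchronized with the modulator without any side information, which is the essential property an adaptive (sigma-)delta scheme needs.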

2 citations

Journal ArticleDOI
TL;DR: In this article, a family of diffusion Bayesian decorrelation least mean squares (DBDLMS) algorithms based on decorrelated observation models is proposed, with variable step-sizes emerging naturally to help address the conflicting requirements of fast convergence rate and small steady-state error.
Abstract: The convergence rate of the diffusion normalized least mean squares (NLMS) algorithm can be sped up by decorrelation of the signals. However, the diffusion decorrelation NLMS algorithm confronts the conflicting requirements of fast convergence rate and small steady-state error. To address this issue, this paper proposes a family of diffusion Bayesian decorrelation least mean squares (DBDLMS) algorithms based on decorrelated observation models. First, the weight update equations of the proposed DBDLMS algorithms are obtained by performing Bayesian inference over the decorrelated observation models, with variable step-sizes emerging naturally to help address the conflicting requirements. Second, the update equations of the decorrelation coefficient vectors are inferred from the Bayesian perspective, with the variable step-sizes emerging again. The performance analysis of the proposed DBDLMS algorithms is then carried out, and a simple and effective approach is derived to estimate the free parameters of the algorithms. Finally, the learning performance of the proposed algorithms is verified by Monte Carlo simulations.
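For orientation, the baseline the DBDLMS family builds on, a diffusion (adapt-then-combine) NLMS iteration, can be sketched as follows. This shows plain diffusion NLMS with a fixed step size, not the Bayesian decorrelation algorithms of the paper.

```python
import numpy as np

def diffusion_nlms(W, U, d, A, mu=0.5, eps=1e-6):
    """One adapt-then-combine (ATC) diffusion NLMS iteration.

    W : (N, M) current weight estimates, one row per agent
    U : (N, M) regression vectors observed by the N agents
    d : (N,)   desired responses
    A : (N, N) left-stochastic combination matrix
    """
    # Adapt: each agent takes a normalized LMS step on its own data.
    err = d - np.einsum('nm,nm->n', U, W)
    norm = eps + np.einsum('nm,nm->n', U, U)
    psi = W + mu * (err / norm)[:, None] * U
    # Combine: convex averaging of the neighbors' intermediate estimates.
    return A.T @ psi
```

In this baseline the fixed mu enforces the convergence-rate versus steady-state-error trade-off; the variable step-sizes that fall out of the Bayesian inference in the paper are exactly what relaxes it.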

2 citations


Cited by
Journal ArticleDOI


08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Abstract: This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order tensors (i.e., $N$-way arrays with $N \geq 3$) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The $N$-way Toolbox, Tensor Toolbox, and Multilinear Engine are examples of software packages for working with tensors.
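As a concrete example of the first decomposition mentioned, here is a minimal CP (CANDECOMP/PARAFAC) fit for a 3-way array via alternating least squares, written directly in NumPy rather than with the toolboxes named above; the conventions (unfolding order, Khatri-Rao layout) are my own.

```python
import numpy as np

def cp_als(T, rank, iters=300, seed=0):
    """Fit T[i,j,k] ≈ sum_r A[i,r] B[j,r] C[k,r] by alternating least
    squares: fix two factors, solve for the third in closed form, repeat."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    # Mode-n unfoldings consistent with C-order reshapes.
    T1 = T.reshape(I, J * K)
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)
    # Column-wise Khatri-Rao product matching the unfoldings above.
    kr = lambda X, Y: (X[:, None, :] * Y[None, :, :]).reshape(-1, X.shape[1])
    for _ in range(iters):
        A = T1 @ np.linalg.pinv(kr(B, C)).T
        B = T2 @ np.linalg.pinv(kr(A, C)).T
        C = T3 @ np.linalg.pinv(kr(A, B)).T
    return A, B, C
```

The recovered factors are determined only up to permutation and scaling, which is why fit quality is usually judged by reconstruction error rather than by comparing factor matrices directly.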

9,227 citations

Proceedings ArticleDOI
22 Jan 2006
TL;DR: This talk reviews some of the major results in random graphs and some of the more challenging open problems, touching on newer models, including those related to the WWW.
Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.

7,116 citations

Journal ArticleDOI

6,278 citations

01 Jan 2016
Table of Integrals, Series, and Products

4,085 citations