SciSpace (formerly Typeset)
Author

Ali H. Sayed

Bio: Ali H. Sayed is an academic researcher at École Polytechnique Fédérale de Lausanne. He has contributed to research on topics including adaptive filters and optimization problems, has an h-index of 81, and has co-authored 728 publications receiving 36,030 citations. His previous affiliations include Harbin Engineering University and the University of California, Los Angeles.


Papers
Proceedings ArticleDOI
07 Jun 1995
TL;DR: In this paper, the authors provide a time-domain feedback analysis of gradient-based adaptive schemes with emphasis on stability and robustness issues, and show that an intrinsic feedback structure, mapping the noise sequence and the initial weight guess to the a priori estimation errors and the final weight estimate, can be associated with such schemes.
Abstract: This paper provides a time-domain feedback analysis of gradient-based adaptive schemes with emphasis on stability and robustness issues. It is shown that an intrinsic feedback structure, mapping the noise sequence and the initial weight guess to the a priori estimation errors and the final weight estimate, can be associated with such schemes. The feedback configuration is motivated via energy arguments and is shown to consist of two major blocks: a time-variant lossless (i.e., energy preserving) feedforward path and a time-variant feedback path. The configuration is further shown to lend itself rather immediately to analysis via a so-called small gain theorem; thus leading to stability conditions that require the contractivity of certain operators.
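
A rough sketch of the energy argument, for an (assumed) LMS-type update w_i = w_{i-1} + \mu u_i^* e(i), with weight error \tilde{w}_i = w^o - w_i, a priori error e_a(i) = u_i \tilde{w}_{i-1}, a posteriori error e_p(i) = u_i \tilde{w}_i, and noise v(i) = e(i) - e_a(i); the notation is illustrative rather than quoted from the paper. Completing the squares yields the exact identity

\[
\|\tilde{w}_i\|^2 + \bar{\mu}(i)\,|e_a(i)|^2 \;=\; \|\tilde{w}_{i-1}\|^2 + \bar{\mu}(i)\,|e_p(i)|^2, \qquad \bar{\mu}(i) = 1/\|u_i\|^2,
\]

which plays the role of the lossless (energy-preserving) feedforward block, while the relation

\[
e_p(i) = \bigl(1 - \mu\|u_i\|^2\bigr)\,e_a(i) - \mu\|u_i\|^2\,v(i)
\]

closes a feedback loop around it. A small-gain argument then requires the feedback gain to be contractive, \sup_i |1 - \mu\|u_i\|^2| < 1, i.e., 0 < \mu < 2/\|u_i\|^2 for all i.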

54 citations

Journal ArticleDOI
TL;DR: The paper establishes robustness, optimality, and convergence properties of the widely used class of instantaneous-gradient adaptive algorithms and employs the Cauchy-Schwarz inequality for vectors in a Euclidean space.
Abstract: The paper establishes robustness, optimality, and convergence properties of the widely used class of instantaneous-gradient adaptive algorithms. The analysis is carried out in a purely deterministic framework and assumes no a priori statistical information. It employs the Cauchy-Schwarz inequality for vectors in a Euclidean space and derives local and global error-energy bounds that are shown to highlight, as well as explain, relevant aspects of the robust performance of adaptive gradient filters (along the lines of H∞ theory).
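
For the same (assumed) LMS-type recursion sketched earlier, bounds of the kind referred to above take the following familiar form from the robustness literature; this is an illustration, not a quotation of the paper's statements. When the step size satisfies 0 < \mu \le 1/\|u_i\|^2, a local error-energy bound holds at every iteration,

\[
\|\tilde{w}_i\|^2 + \mu\,|e_a(i)|^2 \;\le\; \|\tilde{w}_{i-1}\|^2 + \mu\,|v(i)|^2,
\]

and summing over 0 \le i \le N gives the global contraction

\[
\frac{\|\tilde{w}_N\|^2 + \mu \sum_{i=0}^{N} |e_a(i)|^2}{\|\tilde{w}_{-1}\|^2 + \mu \sum_{i=0}^{N} |v(i)|^2} \;\le\; 1,
\]

i.e., the map from the disturbances (initial weight error and noise) to the a priori estimation errors is contractive, irrespective of the noise statistics.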

54 citations

Proceedings ArticleDOI
01 Nov 2009
TL;DR: This work shows how to optimally select the combination weights, proposes an adaptive algorithm that adjusts them using local information at every node, and demonstrates improved performance compared with fixed, non-adaptive weights.
Abstract: We study the problem of distributed Kalman filtering, where a set of nodes are required to collectively estimate the state of a linear dynamic system from their measurements. In diffusion Kalman filtering strategies, neighboring state estimates are linearly combined using a set of scalar weights. In this work we show how to optimally select the weights, and propose an adaptive algorithm to adapt them using local information at every node. The algorithm is fully distributed and runs in real time, with low processing complexity. Our simulation results show performance improvement in comparison to the case where fixed, non-adaptive weights are used.
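
A minimal Python sketch of the diffusion structure described above; it is not the paper's algorithm (the paper's contribution is precisely how to choose and adapt the weights), and the function names, fixed weights, and time-invariant model matrices are assumptions made for illustration.

import numpy as np

def local_kalman_update(x, P, y, F, Q, H, R):
    """Textbook Kalman predict + update at a single node."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (y - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

def diffusion_kalman_step(states, covs, meas, neighbors, weights, F, Q, H, R):
    """One diffusion iteration over all nodes.

    states[k], covs[k] : current estimate and covariance at node k
    meas[k]            : measurement collected by node k
    neighbors[k]       : indices in node k's neighborhood (including k itself)
    weights[k][l]      : combination weight node k assigns to neighbor l
                         (nonnegative, summing to one over neighbors[k])
    """
    # Incremental step: every node filters its own measurement locally.
    intermediate = [local_kalman_update(states[k], covs[k], meas[k], F, Q, H, R)
                    for k in range(len(states))]
    # Diffusion step: convex combination of the neighbors' intermediate estimates.
    new_states = [sum(weights[k][l] * intermediate[l][0] for l in neighbors[k])
                  for k in range(len(states))]
    new_covs = [intermediate[k][1] for k in range(len(states))]
    return new_states, new_covs

The combination step is what distinguishes the diffusion strategy from running isolated Kalman filters at each node; selecting and adapting weights[k][l] from local information is the subject of the paper.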

53 citations

Journal ArticleDOI
TL;DR: The analysis reveals that the proposed algorithm can largely overcome the rank deficiency that arises when space-varying parameters are estimated with distributed approaches, by benefiting from the network stochastic matrices used to combine the information exchanged between nodes.
Abstract: We study the problem of distributed adaptive estimation over networks where nodes cooperate to estimate physical parameters that can vary over both space and time domains. We use a set of basis functions to characterize the space-varying nature of the parameters and propose a diffusion least mean-squares (LMS) strategy to recover these parameters from successive time measurements. We analyze the stability and convergence of the proposed algorithm, and derive closed-form expressions to predict its learning behavior and steady-state performance in terms of mean-square error. We find that in the estimation of the space-varying parameters using distributed approaches, the covariance matrix of the regression data at each node becomes rank-deficient. Our analysis reveals that the proposed algorithm can overcome this difficulty to a large extent by benefiting from the network stochastic matrices that are used to combine exchanged information between nodes. We provide computer experiments to illustrate and support the theoretical findings.
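
A minimal adapt-then-combine sketch of the idea, assuming node k's space-varying parameter is modeled as B[k] @ c for a common coefficient vector c and a node-dependent basis matrix B[k]; the variable names and the specific combination rule are illustrative, not taken from the paper.

import numpy as np

def atc_diffusion_lms_step(w, u, d, neighbors, weights, B, mu):
    """One adapt-then-combine diffusion LMS iteration.

    w[k] : node k's current estimate of the expansion coefficients (length Q)
    u[k] : regression vector observed at node k (length M)
    d[k] : scalar measurement at node k
    B[k] : M x Q basis matrix mapping coefficients to node k's local parameter
    neighbors[k], weights[k][l] : combination neighborhood and weights
    mu   : step size
    """
    N = len(w)
    psi = []
    # Adaptation: a local LMS step against the node's own data.
    for k in range(N):
        h = u[k] @ B[k]                # effective regressor for the coefficients
        err = d[k] - h @ w[k]
        psi.append(w[k] + mu * err * h)
    # Combination: convex combination of the neighbors' intermediate estimates.
    return [sum(weights[k][l] * psi[l] for l in neighbors[k]) for k in range(N)]

Even when the effective regressors u[k] @ B[k] are rank-deficient at an individual node, the combination step mixes information across the neighborhood, which is the mechanism the analysis above credits for recovering the full coefficient vector.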

53 citations

Journal ArticleDOI
TL;DR: This paper describes a model for solving multitask problems over asynchronous networks and carries out a detailed mean and mean-square error analysis, showing that sufficiently small step-sizes can still ensure both stability and performance.
Abstract: The multitask diffusion LMS is an efficient strategy to simultaneously infer, in a collaborative manner, multiple parameter vectors. Existing works on multitask problems assume that all agents respond to data synchronously. In several applications, agents may not be able to act synchronously because networks can be subject to several sources of uncertainties such as changing topology, random link failures, or agents turning on and off for energy conservation. In this paper, we describe a model for the solution of multitask problems over asynchronous networks and carry out a detailed mean and mean-square error analysis. Results show that sufficiently small step-sizes can still ensure both stability and performance. Simulations and illustrative examples are provided to verify the theoretical findings.
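
A sketch of how asynchrony might be modeled on top of a diffusion LMS step, assuming a simple Bernoulli on/off model for agents and links; the paper's asynchronous model is more general, and the probabilities and names below are illustrative only.

import numpy as np

rng = np.random.default_rng(0)

def async_diffusion_lms_step(w, u, d, neighbors, weights, mu, p_agent=0.8, p_link=0.9):
    """One asynchronous diffusion LMS iteration (Bernoulli activation model)."""
    N = len(w)
    psi = []
    for k in range(N):
        if rng.random() < p_agent:            # agent k is awake this iteration
            err = d[k] - u[k] @ w[k]
            psi.append(w[k] + mu * err * u[k])
        else:                                 # sleeping agents keep their estimate
            psi.append(w[k])
    w_new = []
    for k in range(N):
        # only links that happen to be active contribute to the combination
        active = [l for l in neighbors[k] if l == k or rng.random() < p_link]
        norm = sum(weights[k][l] for l in active)
        w_new.append(sum(weights[k][l] / norm * psi[l] for l in active))
    return w_new

The abstract's conclusion is that sufficiently small step-sizes keep such randomly perturbed recursions stable in the mean and mean-square sense.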

52 citations


Cited by
Journal ArticleDOI

08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Abstract: This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order tensors (i.e., $N$-way arrays with $N \geq 3$) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The $N$-way Toolbox, Tensor Toolbox, and Multilinear Engine are examples of software packages for working with tensors.
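
To make the CP idea concrete, here is a small, self-contained alternating-least-squares sketch for a third-order tensor (illustrative only; the toolboxes named above provide far more robust implementations).

import numpy as np

def unfold(T, mode):
    """Matricize T along `mode`, with columns ordered in the Kolda-Bader convention."""
    return np.reshape(np.moveaxis(T, mode, 0), (T.shape[mode], -1), order='F')

def khatri_rao(X, Y):
    """Columnwise Kronecker product of two matrices with the same number of columns."""
    return np.vstack([np.kron(X[:, r], Y[:, r]) for r in range(X.shape[1])]).T

def cp_als(T, rank, n_iter=200, seed=0):
    """Fit T as a sum of `rank` rank-one tensors by alternating least squares."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    for _ in range(n_iter):
        # Solve for each factor in turn while the other two are held fixed.
        A = unfold(T, 0) @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
        B = unfold(T, 1) @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
        C = unfold(T, 2) @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
    return A, B, C

For example, cp_als(np.einsum('i,j,k->ijk', a, b, c), rank=1) should recover the rank-one structure of an outer product of three vectors up to scaling of the factors.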

9,227 citations

Proceedings ArticleDOI
22 Jan 2006
TL;DR: This work reviews some of the major results in random graphs and some of the more challenging open problems, and touches on newer models, including those related to the WWW.
Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.

7,116 citations

Journal ArticleDOI

6,278 citations

01 Jan 2016
TL;DR: This entry corresponds to the reference work Table of Integrals, Series, and Products.

4,085 citations