Author

Ali H. Sayed

Bio: Ali H. Sayed is an academic researcher at École Polytechnique Fédérale de Lausanne. His research spans topics including adaptive filters and optimization problems. He has an h-index of 81 and has co-authored 728 publications receiving 36,030 citations. Previous affiliations of Ali H. Sayed include Harbin Engineering University and the University of California, Los Angeles.


Papers
Proceedings ArticleDOI
07 May 2001
TL;DR: This paper develops a framework for the mean-square analysis of adaptive filters with general data and error nonlinearities and provides closed-form expressions for the steady-state performance and necessary and sufficient conditions for stability.
Abstract: This paper develops a framework for the mean-square analysis of adaptive filters with general data and error nonlinearities. The approach relies on energy conservation arguments and is carried out without restrictions on the probability distribution of the input sequence. In particular, for adaptive filters with diagonal matrix nonlinearities, we provide closed-form expressions for the steady-state performance and necessary and sufficient conditions for stability. We carry out a similar study for long adaptive filters that employ error nonlinearities, relying on a weaker form of the independence assumption. We provide expressions for the steady-state error and bounds on the step-size for stability by exploiting the Cramér-Rao bound of the underlying estimation process.
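As a concrete illustration of the setting studied here (a sketch under assumed parameters, not the paper's derivation), the following NumPy snippet empirically estimates the steady-state mean-square error of plain LMS and of sign-error LMS, an adaptive filter with the error nonlinearity f(e) = sign(e). The filter length, step sizes, and noise level are arbitrary demo choices.

```python
# Illustrative sketch (not the paper's analysis): estimate the steady-state
# mean-square error of plain LMS and of sign-error LMS, i.e. an adaptive
# filter with the error nonlinearity f(e) = sign(e). All parameter values
# below are arbitrary choices for the demo.
import numpy as np

rng = np.random.default_rng(0)
M, N = 8, 100_000                    # filter length, number of iterations
w_true = rng.standard_normal(M)      # unknown system to identify
sigma_v = 0.1                        # measurement-noise standard deviation

def steady_state_mse(f, mu):
    """Run w <- w + mu * u * f(e) and average e^2 over the final quarter."""
    w = np.zeros(M)
    err2 = np.empty(N)
    for i in range(N):
        u = rng.standard_normal(M)                      # regressor
        d = u @ w_true + sigma_v * rng.standard_normal()
        e = d - u @ w                                   # a priori error
        w = w + mu * u * f(e)                           # error-nonlinearity update
        err2[i] = e * e
    return err2[-N // 4:].mean()

print("LMS        steady-state MSE:", steady_state_mse(lambda e: e, mu=0.01))
print("sign-error steady-state MSE:", steady_state_mse(np.sign, mu=0.002))
print("noise floor sigma_v^2      :", sigma_v ** 2)
```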

17 citations

Proceedings ArticleDOI
04 Oct 2012
TL;DR: A least mean-squares (LMS) diffusion strategy for sensor-network applications in which parameters of physical phenomena that vary over space must be estimated; a set of basis functions replaces the space-variant (local) parameters with space-invariant (global) parameters.
Abstract: We develop a least mean-squares (LMS) diffusion strategy for sensor network applications where it is desired to estimate parameters of physical phenomena that vary over space. In particular, we consider a regression model with space-varying parameters that captures the system dynamics over time and space. We use a set of basis functions such as sinusoids or B-spline functions to replace the space-variant (local) parameters with space-invariant (global) parameters, and then apply diffusion adaptation to estimate the global representation. We illustrate the performance of the algorithm via simulations.
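For context, here is a minimal adapt-then-combine (ATC) diffusion LMS sketch over a toy ring network. After the basis expansion described above, every node effectively estimates one common global vector, which is the situation simulated; the topology, combination weights, and data model below are assumptions for the demo, not the paper's setup.

```python
# Minimal adapt-then-combine (ATC) diffusion LMS sketch over a toy network.
# The topology, combination weights, and data model are illustrative only;
# the paper's basis expansion lets all nodes estimate one common (global)
# coefficient vector, which is what is simulated here.
import numpy as np

rng = np.random.default_rng(1)
K, M, mu, N = 5, 4, 0.02, 5000         # nodes, filter length, step size, iters
w_true = rng.standard_normal(M)        # common global parameter vector

# Ring topology; A[l, k] = weight node k gives to neighbor l (columns sum to 1).
A = np.zeros((K, K))
for k in range(K):
    for l in (k - 1, k, k + 1):
        A[l % K, k] = 1 / 3

w = np.zeros((K, M))                   # one estimate per node
for i in range(N):
    psi = np.empty_like(w)
    for k in range(K):                 # adaptation step at each node
        u = rng.standard_normal(M)
        d = u @ w_true + 0.1 * rng.standard_normal()
        psi[k] = w[k] + mu * u * (d - u @ w[k])
    w = A.T @ psi                      # combination: w_k = sum_l a_{lk} psi_l

print("mean squared deviation per node:",
      np.mean((w - w_true) ** 2, axis=1))
```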

16 citations

Journal ArticleDOI
TL;DR: A time-domain feedback analysis of the robustness performance of Gauss-Newton recursive methods that are often used in identification and control is provided, showing that by properly selecting the free parameters, the resulting filters can be made to impose certain bounds on the error quantities, thus resulting in desirable robustness properties.
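For background, the prototypical recursive Gauss-Newton method referred to in this TL;DR is exponentially weighted recursive least-squares (RLS). The sketch below is textbook material shown for context, with an arbitrary forgetting factor and data model; the paper's feedback robustness analysis itself is not reproduced.

```python
# Standard exponentially weighted recursive least-squares (RLS) update, the
# prototypical recursive Gauss-Newton method. Textbook material for context;
# the paper's contribution is the robustness analysis of such recursions.
import numpy as np

rng = np.random.default_rng(2)
M, lam, delta = 4, 0.99, 1e2          # order, forgetting factor, init gain
w_true = rng.standard_normal(M)

w = np.zeros(M)
P = delta * np.eye(M)                  # inverse correlation-matrix estimate
for i in range(2000):
    u = rng.standard_normal(M)
    d = u @ w_true + 0.05 * rng.standard_normal()
    Pu = P @ u
    g = Pu / (lam + u @ Pu)            # gain vector
    w = w + g * (d - u @ w)            # update along the Gauss-Newton direction
    P = (P - np.outer(g, Pu)) / lam    # Riccati-type update of P

print("weight error norm:", np.linalg.norm(w - w_true))
```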

16 citations

Posted Content
TL;DR: In this article, the convergence rate and mean-square-error performance of momentum stochastic gradient methods are analyzed in the constant step-size and slow adaptation regime, for general strongly convex and smooth risk functions rather than only quadratic risks.
Abstract: The article examines in some detail the convergence rate and mean-square-error performance of momentum stochastic gradient methods in the constant step-size and slow adaptation regime. The results establish that momentum methods are equivalent to the standard stochastic gradient method with a re-scaled (larger) step-size value. The size of the re-scaling is determined by the value of the momentum parameter. The equivalence result is established for all time instants and not only in steady-state. The analysis is carried out for general strongly convex and smooth risk functions, and is not limited to quadratic risks. One notable conclusion is that the well-known benefits of momentum constructions for deterministic optimization problems do not necessarily carry over to the adaptive online setting when small constant step-sizes are used to enable continuous adaptation and learning in the presence of persistent gradient noise. From simulations, the equivalence between momentum and standard stochastic gradient methods is also observed for non-differentiable and non-convex problems.
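The equivalence claim is easy to probe numerically. The sketch below (an assumed quadratic model with arbitrary constants, not the paper's experiments) compares heavy-ball stochastic gradient with step size mu and momentum beta against plain stochastic gradient with the rescaled step size mu / (1 - beta).

```python
# Illustrative check of the equivalence claim on a simple quadratic risk:
# heavy-ball stochastic gradient with step mu and momentum beta tracks plain
# stochastic gradient with the rescaled step mu / (1 - beta). The model and
# constants are arbitrary choices for the demo.
import numpy as np

rng = np.random.default_rng(3)
M, mu, beta, N = 5, 0.001, 0.9, 50_000
w_true = rng.standard_normal(M)

def stream():
    u = rng.standard_normal(M)
    d = u @ w_true + 0.1 * rng.standard_normal()
    return u, d

def momentum_sgd():
    w, w_prev = np.zeros(M), np.zeros(M)
    for _ in range(N):
        u, d = stream()
        grad = -u * (d - u @ w)        # stochastic gradient of (d - u.w)^2 / 2
        w, w_prev = w - mu * grad + beta * (w - w_prev), w
    return w

def plain_sgd(step):
    w = np.zeros(M)
    for _ in range(N):
        u, d = stream()
        w = w + step * u * (d - u @ w)
    return w

print("momentum MSD:", np.mean((momentum_sgd() - w_true) ** 2))
print("rescaled MSD:", np.mean((plain_sgd(mu / (1 - beta)) - w_true) ** 2))
```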

16 citations

Journal ArticleDOI
TL;DR: Bounds are established on the step-size parameters in order to guarantee that the resulting algorithms will behave as robust filters and that an intrinsic feedback structure can be associated with the training schemes.
Abstract: This paper provides a time-domain feedback analysis of the perceptron learning algorithm and of training schemes for dynamic networks with output feedback. It studies the robustness performance of the algorithms in the presence of uncertainties that might be due to noisy perturbations in the reference signals or due to modeling mismatch. In particular, bounds are established on the step-size parameters in order to guarantee that the resulting algorithms will behave as robust filters. The paper also establishes that an intrinsic feedback structure can be associated with the training schemes. The feedback configuration is motivated via energy arguments and is shown to consist of two major blocks: a time-variant lossless (i.e., energy preserving) feedforward path and a time-variant feedback path. The stability of the feedback structure is then analyzed via the small gain theorem, and choices for the step-size parameter in order to guarantee faster convergence are deduced by using the mean-value theorem. Simulation results are included to demonstrate the findings.
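As a rough illustration of the kind of update these step-size bounds govern, here is a minimal single-layer perceptron training loop with a data-normalized step size. The specific normalization eta / ||u_i||^2 and the synthetic data are assumptions for the demo, not the paper's exact robustness condition.

```python
# Minimal single-layer perceptron training loop with a normalized step size.
# The normalization mu_i = eta / ||u_i||^2 is a generic data-dependent choice
# of the kind such robustness bounds involve; the paper's exact conditions
# differ and are not reproduced here. Data are synthetic.
import numpy as np

rng = np.random.default_rng(4)
M, N, eta = 10, 5000, 0.5
w_true = rng.standard_normal(M)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

w = np.zeros(M)
for i in range(N):
    u = rng.standard_normal(M)
    d = sigmoid(u @ w_true)                 # noiseless teacher output
    y = sigmoid(u @ w)
    e = d - y                               # output error
    mu = eta / (u @ u)                      # normalized (data-dependent) step
    w = w + mu * e * y * (1 - y) * u        # gradient step through the sigmoid

print("weight error norm:", np.linalg.norm(w - w_true))
```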

16 citations


Cited by
Journal ArticleDOI


08 Dec 2001 - BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Abstract: This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order tensors (i.e., $N$-way arrays with $N \geq 3$) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The N-way Toolbox, Tensor Toolbox, and Multilinear Engine are examples of software packages for working with tensors.
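As a concrete complement to the survey, here is a compact CP decomposition by alternating least squares written directly in NumPy rather than with the toolboxes named above; the tensor sizes, rank, and iteration count are arbitrary demo choices.

```python
# Compact CANDECOMP/PARAFAC (CP) decomposition of a 3-way tensor by
# alternating least squares (ALS), written directly in NumPy. Sizes, rank,
# and iteration count are arbitrary demo choices.
import numpy as np

def khatri_rao(A, B):
    # Column-wise Kronecker product of A (I x R) and B (J x R) -> (I*J x R).
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def unfold(T, mode):
    # Mode-n matricization (C-order flattening of the remaining axes).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_als(T, rank, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
    for _ in range(iters):
        # With C-order unfolding, unfold(T, 0) = A @ khatri_rao(B, C).T, etc.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Build an exactly rank-3 tensor and verify that CP-ALS recovers it.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((s, 3)) for s in (6, 7, 8))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=3)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print("relative reconstruction error:",
      np.linalg.norm(T - T_hat) / np.linalg.norm(T))
```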

9,227 citations

Proceedings ArticleDOI
22 Jan 2006
TL;DR: Some of the major results in random graphs and some of the more challenging open problems are reviewed, including those related to the WWW.
Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.

7,116 citations

Journal ArticleDOI

6,278 citations

01 Jan 2016
TL;DR: Entry for the reference work Table of Integrals, Series, and Products.

4,085 citations