scispace - formally typeset
Author

Ali H. Sayed

Bio: Ali H. Sayed is an academic researcher from École Polytechnique Fédérale de Lausanne. The author has contributed to research in topics such as adaptive filters and optimization problems. The author has an h-index of 81, has co-authored 728 publications, and has received 36,030 citations. Previous affiliations of Ali H. Sayed include Harbin Engineering University and the University of California, Los Angeles.


Papers
Proceedings ArticleDOI
11 Jun 2001
TL;DR: A technique for detecting and estimating the number of overlapping fading multipath components is developed; such information is vital for accurately resolving overlapping multipath components and for avoiding unnecessary computations and errors in single-path propagation cases.
Abstract: The Federal Communications Commission (FCC) mandate for locating the position of wireless 911 callers is fueling research in the area of mobile-positioning technologies. Overlapping multipath propagation is one of the main sources of mobile-positioning errors, especially in fast channel fading situations. In this paper we develop a technique for detecting and providing an estimate of the number of overlapping fading multipath components. Such information is vital for accurate resolution of overlapping multipath components as well as avoiding unnecessary computations and errors in single-path propagation cases. The proposed technique exploits the fact that multipath components fade independently as well as the pulse shape symmetry. The paper also presents supporting simulation results.

18 citations
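The detection idea above can be illustrated with a toy sketch. This is a hedged illustration, not the paper's actual detector: the Gaussian pulse, the delay values, and the `asymmetry` metric are all assumptions. The point it demonstrates is the one the abstract exploits: a symmetric transmit pulse stays symmetric about its peak under single-path propagation, while overlapping delayed copies generally break that symmetry.

```python
# Illustrative sketch: flag overlapping multipath via loss of pulse symmetry.
import numpy as np

t = np.linspace(-5.0, 5.0, 1001)
pulse = lambda tau: np.exp(-((t - tau) ** 2))  # symmetric Gaussian pulse (assumed shape)

def asymmetry(y, half=100):
    """Mean absolute mismatch between samples mirrored around the peak."""
    k = int(np.argmax(y))
    left = y[k - half:k][::-1]
    right = y[k + 1:k + 1 + half]
    return float(np.mean(np.abs(left - right)) / np.max(y))

single = pulse(0.0)                       # one propagation path
overlap = pulse(0.0) + 0.8 * pulse(0.7)   # two closely spaced fading paths

print(asymmetry(single))   # essentially zero: symmetric, likely single path
print(asymmetry(overlap))  # clearly positive: overlapping components present
```

A real detector would also average over independent fades of the components, which is the other property the paper exploits; this sketch only shows the symmetry cue.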

Journal ArticleDOI
TL;DR: A theoretical guarantee of linear convergence under random reshuffling for SAGA in the mean-square sense is provided and a new amortized variance-reduced gradient (AVRG) algorithm with constant storage requirements and balanced gradient computations compared to SVRG is proposed.
Abstract: Several useful variance-reduced stochastic gradient algorithms, such as SVRG, SAGA, Finito, and SAG, have been proposed to minimize empirical risks with linear convergence properties to the exact minimizer. The existing convergence results assume uniform data sampling with replacement. However, it has been observed in related works that random reshuffling can deliver superior performance over uniform sampling and, yet, no formal proofs or guarantees of exact convergence exist for variance-reduced algorithms under random reshuffling. This paper makes two contributions. First, it provides a theoretical guarantee of linear convergence under random reshuffling for SAGA in the mean-square sense; the argument is also adaptable to other variance-reduced algorithms. Second, under random reshuffling, the article proposes a new amortized variance-reduced gradient (AVRG) algorithm with constant storage requirements compared to SAGA and with balanced gradient computations compared to SVRG. AVRG is also shown analytically to converge linearly.

18 citations
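A minimal sketch of SAGA under random reshuffling on a least-squares risk may help make the setting concrete. The problem sizes, step-size, and variable names here are illustrative assumptions, not the paper's setup, and AVRG itself is not implemented; the sketch only shows the behavior the paper analyzes: with stored per-sample gradients and a constant step-size, the iterates reach the exact empirical minimizer.

```python
# Hedged sketch: SAGA with random reshuffling on a least-squares problem.
import numpy as np

rng = np.random.default_rng(0)
N, d = 50, 5
A = rng.standard_normal((N, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(N)
w_star = np.linalg.lstsq(A, b, rcond=None)[0]  # exact empirical minimizer

def grad(w, i):                       # gradient of (a_i^T w - b_i)^2 / 2
    return A[i] * (A[i] @ w - b[i])

w = np.zeros(d)
table = np.array([grad(w, i) for i in range(N)])  # stored per-sample gradients
avg = table.mean(axis=0)
mu = 0.01                                          # constant step-size (assumed)

for epoch in range(1000):
    for i in rng.permutation(N):                   # random reshuffling: sample
        g = grad(w, i)                             # without replacement
        w -= mu * (g - table[i] + avg)             # SAGA variance-reduced update
        avg += (g - table[i]) / N
        table[i] = g

print(np.linalg.norm(w - w_star))  # near zero: exact convergence
```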

Journal ArticleDOI
TL;DR: This letter shows how to blend real-time adaptation with graph filtering and a generalized regularization framework to result in a graph diffusion strategy for distributed learning over multitask networks.
Abstract: This letter proposes a general regularization framework for inference over multitask networks. The optimization approach relies on minimizing a global cost consisting of the aggregate sum of individual costs regularized by a term that incorporates global information about the graph structure and the individual parameter vectors into the solution of the inference problem. An adaptive strategy, which responds to streaming data and employs stochastic approximations in place of actual gradient vectors, is devised and studied. Methods that allow a distributed implementation of the regularization step are also discussed. This letter shows how to blend real-time adaptation with graph filtering and a generalized regularization framework, resulting in a graph diffusion strategy for distributed learning over multitask networks.

18 citations
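The adapt-and-combine structure behind such diffusion strategies can be sketched in its simplest single-task form: classic adapt-then-combine diffusion LMS, a simpler relative of the regularized multitask strategy in the letter. The network size, combination matrix, and step-size below are illustrative assumptions.

```python
# Hedged sketch: adapt-then-combine diffusion LMS over a small fully connected graph.
import numpy as np

rng = np.random.default_rng(1)
K, d = 4, 3                              # K agents, d-dimensional parameter
w_true = rng.standard_normal(d)          # common target (single-task case)
A_comb = np.full((K, K), 1.0 / K)        # doubly stochastic combination weights
W = np.zeros((K, d))                     # one estimate per agent
mu = 0.05                                # small constant step-size

for t in range(4000):
    psi = np.empty_like(W)
    for k in range(K):                    # adapt: local stochastic gradient step
        u = rng.standard_normal(d)        # streaming regression vector
        dk = u @ w_true + 0.01 * rng.standard_normal()
        psi[k] = W[k] + mu * u * (dk - u @ W[k])
    W = A_comb @ psi                      # combine: average neighbors' estimates

print(np.linalg.norm(W - w_true, axis=1))  # every agent close to w_true
```

In the multitask setting of the letter, the combination step is replaced by a graph-regularization (graph-filtering) step that couples distinct per-agent targets; the stochastic adapt step is the same ingredient.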

Journal Article
TL;DR: In this article, the convergence rate and mean-square-error performance of momentum stochastic gradient methods in the constant step-size and slow adaptation regime were examined in the adaptive online setting.
Abstract: The article examines in some detail the convergence rate and mean-square-error performance of momentum stochastic gradient methods in the constant step-size and slow adaptation regime. The results establish that momentum methods are equivalent to the standard stochastic gradient method with a re-scaled (larger) step-size value. The size of the re-scaling is determined by the value of the momentum parameter. The equivalence result is established for all time instants and not only in steady-state. The analysis is carried out for general strongly convex and smooth risk functions, and is not limited to quadratic risks. One notable conclusion is that the well-known benefits of momentum constructions for deterministic optimization problems do not necessarily carry over to the adaptive online setting when small constant step-sizes are used to enable continuous adaptation and learning in the presence of persistent gradient noise. From simulations, the equivalence between momentum and standard stochastic gradient methods is also observed for nondifferentiable and non-convex problems.

18 citations
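The equivalence result can be illustrated with a small simulation. This is a hedged sketch under assumptions (quadratic risk, synthetic Gaussian data, arbitrary step-size and momentum values), not the article's derivation: heavy-ball momentum SGD with step-size mu and momentum beta behaves like plain SGD with the larger step-size mu / (1 - beta).

```python
# Illustrative comparison: momentum SGD vs. plain SGD with a re-scaled step-size.
import numpy as np

def run(mu, beta, seed=2, T=20000, d=4):
    rng = np.random.default_rng(seed)
    w_true = np.ones(d)
    w, prev = np.zeros(d), np.zeros(d)
    for _ in range(T):
        u = rng.standard_normal(d)                       # streaming regressor
        g = u * (u @ w - u @ w_true - 0.05 * rng.standard_normal())
        w, prev = w - mu * g + beta * (w - prev), w      # heavy-ball update
    return np.linalg.norm(w - w_true)

mu, beta = 0.002, 0.5
err_momentum = run(mu, beta)
err_rescaled = run(mu / (1 - beta), 0.0)  # plain SGD, step mu/(1-beta)
print(err_momentum, err_rescaled)         # comparable small steady-state errors
```

With small constant step-sizes, both runs settle near the minimizer with steady-state errors of comparable magnitude, which is the equivalence the article establishes rigorously.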


Cited by
Journal ArticleDOI


08 Dec 2001 - BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: it first seemed an odd beast, an intruder hovering on the edge of reality, and familiarity only intensified the sense of its surreal nature.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Abstract: This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order tensors (i.e., $N$-way arrays with $N \geq 3$) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2 as well as nonnegative variants of all of the above. The N-way Toolbox, Tensor Toolbox, and Multilinear Engine are examples of software packages for working with tensors.

9,227 citations
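The CP decomposition mentioned above can be sketched with a minimal alternating-least-squares (ALS) fit. The tensor dimensions, rank, and helper names below are illustrative assumptions; the standard ingredients are mode-n unfoldings and the Khatri-Rao (columnwise Kronecker) product.

```python
# Hedged sketch: CP-ALS on a small exactly rank-2 third-order tensor.
import numpy as np

rng = np.random.default_rng(3)
I, J, K, R = 6, 5, 4, 2
A0, B0, C0 = (rng.standard_normal((n, R)) for n in (I, J, K))
X = np.einsum('ir,jr,kr->ijk', A0, B0, C0)    # exact rank-2 tensor

def khatri_rao(U, V):                          # columnwise Kronecker product
    return np.einsum('ir,jr->ijr', U, V).reshape(-1, U.shape[1])

def unfold(T, mode):                           # mode-n matricization
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))
for _ in range(50):                            # ALS sweeps: solve one factor
    A = unfold(X, 0) @ np.linalg.pinv(khatri_rao(B, C).T)  # at a time in
    B = unfold(X, 1) @ np.linalg.pinv(khatri_rao(A, C).T)  # least squares
    C = unfold(X, 2) @ np.linalg.pinv(khatri_rao(A, B).T)

X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print(np.linalg.norm(X - X_hat) / np.linalg.norm(X))  # small relative error
```

The recovered factors are only identifiable up to permutation and scaling of the rank-one terms, which is why the check compares the reconstructed tensor rather than the factors themselves.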

Proceedings ArticleDOI
22 Jan 2006
TL;DR: Some of the major results in random graphs and some of the more challenging open problems are reviewed, touching on newer models, including those related to the WWW.
Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.

7,116 citations

Journal ArticleDOI

6,278 citations

01 Jan 2016

4,085 citations