Author

Ali H. Sayed

Bio: Ali H. Sayed is an academic researcher from École Polytechnique Fédérale de Lausanne. The author has contributed to research in the topics of adaptive filters and optimization problems. The author has an h-index of 81 and has co-authored 728 publications receiving 36,030 citations. Previous affiliations of Ali H. Sayed include Harbin Engineering University and the University of California, Los Angeles.


Papers
Proceedings ArticleDOI
11 Jul 2004
TL;DR: The authors derive performance expressions for the throughput and blocking probability of a class of wireless networks with a clustering protocol, in which stationary nodes establish connections with the master node according to a priority scheme based on their distances from the master node.
Abstract: We derive performance expressions for the throughput and blocking probability for a class of wireless networks with a clustering protocol. The nodes are assumed stationary and establish connections with the master node according to a priority scheme that relates to their distances from the master node. The latter is selected randomly in a cell and data are transmitted within predefined sectors.

3 citations
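The paper above derives protocol-specific expressions for throughput and blocking probability. Purely as a hedged, generic illustration of how a blocking probability is computed once an offered load and a channel count are assumed, here is the classical Erlang-B recursion in Python; the distance-based priority scheme and sectorized transmission analyzed in the paper are not modeled, and the traffic numbers are hypothetical.

```python
# Minimal sketch (not the paper's derivation): the classical Erlang-B
# recursion computes a blocking probability from an offered load and a
# number of channels. The paper's expressions additionally account for
# the distance-based priority scheme and sectorized transmission, which
# are not modeled here.

def erlang_b(offered_load: float, channels: int) -> float:
    """Blocking probability of an M/M/c/c system via the stable recursion."""
    b = 1.0
    for c in range(1, channels + 1):
        b = (offered_load * b) / (c + offered_load * b)
    return b

if __name__ == "__main__":
    # Hypothetical numbers: 8 channels per master node, 5 Erlangs of traffic.
    print(f"blocking probability ~ {erlang_b(5.0, 8):.4f}")
```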

Proceedings ArticleDOI
01 Mar 2017
TL;DR: In this paper, the authors characterize the set of beliefs that can be imposed on non-influential agents and how the graph topology of these latter agents helps resist manipulation but only to a certain degree.
Abstract: In diffusion social learning over weakly-connected graphs, it has been shown that influential agents end up shaping the beliefs of non-influential agents. In this paper, we analyse this control mechanism more closely and reveal some critical properties. In particular, we characterize the set of beliefs that can be imposed on non-influential agents (i.e., the set of attainable beliefs) and how the graph topology of these latter agents helps resist manipulation but only to a certain degree. We also derive a design procedure that allows influential agents to drive the beliefs of non-influential agents to desirable attainable states. We illustrate the results with two examples.

3 citations
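As a rough sketch of the weakly-connected effect described in the abstract above (not the paper's actual diffusion social learning recursion), the following Python snippet iterates a combination matrix in which two influential agents assign zero weight to the others; the beliefs of the non-influential agents are driven to the belief fixed by the influential sub-network. The 4-agent graph, the weights, and the two-hypothesis beliefs are all illustrative assumptions.

```python
import numpy as np

# Illustration of influence over a weakly-connected graph: agents 0-1 are
# "influential" (they place zero combination weight on agents 2-3), while
# agents 2-3 also listen to agents 0-1. Repeated local combination pulls the
# beliefs of the non-influential agents toward the influential sub-network.

A = np.array([             # combination matrix, each row sums to 1
    [1.0, 0.0, 0.0, 0.0],  # agent 0 listens only to itself
    [0.0, 1.0, 0.0, 0.0],  # agent 1 listens only to itself
    [0.3, 0.3, 0.4, 0.0],  # agent 2 listens to the influential agents
    [0.2, 0.2, 0.2, 0.4],  # agent 3 listens to everyone
])

# Beliefs over two hypotheses; the influential agents are dogmatic.
beliefs = np.array([
    [1.0, 0.0],
    [1.0, 0.0],
    [0.5, 0.5],
    [0.5, 0.5],
])

for _ in range(200):
    beliefs = A @ beliefs   # repeated combination of neighbors' beliefs

print(np.round(beliefs, 3))  # rows 2-3 end up at the influential belief [1, 0]
```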

Journal ArticleDOI
TL;DR: A new solution to the four-block problem is described using the method of generalized Schur analysis, which parameterizes the unknown entry in terms of a Schur-type matrix function, which is shown to satisfy a finite number of interpolation conditions of the Hermite-Fejer type.
Abstract: We describe a new solution to the four-block problem using the method of generalized Schur analysis. We first reduce the general problem to a simpler one by invoking a coprime factorization with a block-diagonal inner matrix. Then, using convenient spectral factorizations, we are able to parameterize the unknown entry in terms of a Schur-type matrix function, which is shown to satisfy a finite number of interpolation conditions of the Hermite-Fejer type. All possible interpolating functions are then determined via a simple recursive procedure that constructs a transmission-line (or lattice) cascade of elementary J-lossless sections. This also leads to a parameterization of all solutions of the four-block problem in terms of a linear fractional transformation.

3 citations
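The abstract above parameterizes all solutions through a linear fractional transformation of a Schur-type parameter. Purely as a hedged illustration of evaluating such a transformation at a single frequency point (the actual block entries of Theta come from the paper's cascade of J-lossless sections, which is not reconstructed here), a minimal numpy sketch:

```python
import numpy as np

# Hedged sketch: evaluate a linear fractional map of a free Schur parameter S,
#     F(S) = (Theta11 @ S + Theta12) @ inv(Theta21 @ S + Theta22),
# a common form of such parameterizations. The numbers below are hypothetical
# scalar blocks, not values produced by the paper's recursion.

def lft(theta11, theta12, theta21, theta22, S):
    """Evaluate the linear fractional transformation at the parameter S."""
    return (theta11 @ S + theta12) @ np.linalg.inv(theta21 @ S + theta22)

if __name__ == "__main__":
    I = np.eye(1)
    theta = [0.8 * I, 0.3 * I, 0.1 * I, 1.0 * I]   # hypothetical 1x1 blocks
    for s in (0.0, 0.5, -0.9):                     # Schur parameter, |s| <= 1
        print(s, lft(*theta, s * I))
```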

Proceedings ArticleDOI
06 Jun 2021
TL;DR: In this article, the authors propose an alternative scheme, which constructs perturbations according to a particular nullspace condition, allowing them to be invisible (to first order in the step-size) to the network centroid, while preserving privacy.
Abstract: Decentralized algorithms for stochastic optimization and learning rely on the diffusion of information through repeated local exchanges of intermediate estimates. Such structures are particularly appealing in situations where agents may be hesitant to share raw data due to privacy concerns. Nevertheless, in the absence of additional privacy-preserving mechanisms, the exchange of local estimates, which are generated based on private data, can allow for the inference of the data itself. The most common mechanism for guaranteeing privacy is the addition of perturbations to local estimates before broadcasting. These perturbations are generally chosen independently at every agent, resulting in a significant performance loss. We propose an alternative scheme, which constructs perturbations according to a particular nullspace condition, allowing them to be invisible (to first order in the step-size) to the network centroid, while preserving privacy guarantees. The analysis allows for general nonconvex loss functions, and is hence applicable to a large number of machine learning and signal processing problems, including deep learning.

3 citations
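To make the nullspace condition above concrete, here is a small, hedged numpy sketch (an illustration under assumptions, not the paper's construction): if the network centroid is a Perron-weighted average of the agents' iterates, then perturbations whose Perron-weighted sum is zero leave that centroid untouched, so raw Gaussian noise is simply projected onto the nullspace of the weight vector before broadcasting. The weights, dimensions, and Gaussian noise model are assumptions made for the example.

```python
import numpy as np

# Illustration of the nullspace idea: if the centroid is sum_k p_k * w_k with
# Perron weights p, any perturbation matrix whose p-weighted sum is zero leaves
# the centroid unchanged. Below, raw noise is projected onto the nullspace of
# the row vector p^T before being added to the local estimates.

rng = np.random.default_rng(0)

N, M = 5, 3                           # agents, parameter dimension (assumed)
p = rng.random(N); p /= p.sum()       # Perron (centrality) weights, sum to 1

W = rng.standard_normal((N, M))       # local estimates, one row per agent
noise = rng.standard_normal((N, M))   # raw privacy perturbations

# Enforce the nullspace condition p^T @ noise == 0.
noise -= np.outer(p, p @ noise) / (p @ p)

W_private = W + noise                 # what each agent would broadcast

print(np.allclose(p @ W, p @ W_private))   # True: the centroid is preserved
```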


Cited by
Journal ArticleDOI

08 Dec 2001-BMJ
TL;DR: There is, the author suggests, something ethereal about i, the square root of minus one: it seemed an odd beast at first, an intruder hovering on the edge of reality, and familiarity only intensified the sense of its surreal nature.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Abstract: This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order tensors (i.e., $N$-way arrays with $N \geq 3$) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2 as well as nonnegative variants of all of the above. The N-way Toolbox, Tensor Toolbox, and Multilinear Engine are examples of software packages for working with tensors.

9,227 citations
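As a small companion to the survey above, the following numpy sketch shows the CP (CANDECOMP/PARAFAC) structure it describes: a third-order tensor of rank R written as a sum of R rank-one tensors, i.e. outer products of columns of three factor matrices. Only the reconstruction from given factors is shown; fitting the factors (for example by alternating least squares, as in the toolboxes the survey cites) is not.

```python
import numpy as np

# CP structure: a rank-R third-order tensor is the sum of R rank-one tensors,
# each the outer product of one column from each factor matrix A, B, C.

def cp_reconstruct(A, B, C):
    """Rebuild a tensor of shape (I, J, K) from CP factors A (IxR), B (JxR), C (KxR)."""
    return np.einsum("ir,jr,kr->ijk", A, B, C)

rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 3                              # illustrative sizes
A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))

T = cp_reconstruct(A, B, C)
print(T.shape)        # (4, 5, 6): a tensor of CP rank at most 3 by construction
```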

Proceedings ArticleDOI
22 Jan 2006
TL;DR: Some of the major results in random graphs and some of the more challenging open problems are reviewed, covering algorithmic and structural questions and touching on newer models, including those related to the WWW.
Abstract: We will review some of the major results in random graphs and some of the more challenging open problems. We will cover algorithmic and structural questions. We will touch on newer models, including those related to the WWW.

7,116 citations
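For a concrete handle on the basic model underlying the random-graph results reviewed above, here is a minimal numpy sketch of the Erdős–Rényi G(n, p) graph: each of the n(n-1)/2 possible edges is included independently with probability p, so the expected edge count is p n(n-1)/2 and the mean degree concentrates near p(n-1). The particular values of n and p are arbitrary.

```python
import numpy as np

# G(n, p): include each of the n*(n-1)/2 possible edges independently with
# probability p, then symmetrize to obtain an undirected adjacency matrix.

rng = np.random.default_rng(0)

n, p = 1000, 0.01
upper = np.triu(rng.random((n, n)) < p, k=1)   # one independent coin flip per pair
A = upper | upper.T                            # symmetric boolean adjacency matrix

print(A.sum() // 2, "edges; expected about", int(p * n * (n - 1) / 2))
print("mean degree:", A.sum(axis=1).mean())    # concentrates near p * (n - 1)
```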

Journal ArticleDOI

6,278 citations

01 Jan 2016
Table of Integrals, Series, and Products

4,085 citations