Open Access Proceedings Article

High-performance FPGA implementation of equivariant adaptive separation via independence algorithm for Independent Component Analysis

TLDR
In this paper, an FPGA implementation of adaptive independent component analysis (ICA) is presented, which can be used in various machine learning problems that use stochastic gradient descent optimization.
Abstract
Independent Component Analysis (ICA) is a dimensionality reduction technique that can boost the efficiency of machine learning models that deal with probability density functions, e.g. Bayesian neural networks. Algorithms that implement adaptive ICA converge more slowly than their nonadaptive counterparts; however, they can track changes in the underlying distributions of input features. This intrinsically slow convergence of adaptive methods, combined with existing hardware implementations that operate at very low clock frequencies, necessitates fundamental improvements in both algorithm and hardware design. This paper presents an algorithm that allows efficient hardware implementation of ICA. Compared to previous work, our FPGA implementation of adaptive ICA improves clock frequency by at least one order of magnitude and throughput by at least two orders of magnitude. Our proposed algorithm is not limited to ICA and can be used in various machine learning problems that use stochastic gradient descent optimization.
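The adaptive update the title refers to can be sketched in NumPy. This is the standard serial EASI rule from the source-separation literature, not the paper's hardware-friendly variant, and the tanh nonlinearity is an assumption chosen for illustration:

```python
import numpy as np

def easi_step(W, x, lam=0.01):
    """One EASI stochastic-gradient update on a single sample x.
    Sketch of the standard serial EASI rule; the paper's
    hardware-friendly variant may differ in detail."""
    y = W @ x                          # current separated outputs
    g = np.tanh(y)                     # example nonlinearity (an assumption)
    H = (np.outer(y, y) - np.eye(len(y))
         + np.outer(g, y) - np.outer(y, g))
    # Relative (equivariant) gradient: the correction acts on W from the left
    return W - lam * H @ W

W = np.eye(2)                          # start from an identity unmixing matrix
W = easi_step(W, np.array([1.0, -0.5]))
```

Because the correction term multiplies `W` itself, the update's behavior is independent of the mixing matrix's conditioning, which is what "equivariant" refers to.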


Citations
Proceedings Article

FFT-based deep learning deployment in embedded systems

TL;DR: This work proposes a Fast Fourier Transform-based DNN training and inference model suitable for embedded platforms, with reduced asymptotic complexity of both computation and storage, and develops and deploys the FFT-based inference model on embedded platforms, achieving high processing speed.
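The core idea behind FFT-based layers, replacing convolution with elementwise multiplication in the frequency domain, can be illustrated with a 1-D linear convolution. This is a generic sketch, not the cited work's block-circulant DNN formulation:

```python
import numpy as np

def fft_conv(x, h):
    """Linear convolution of x and h via the FFT.
    O(n log n) instead of O(n^2) for direct convolution."""
    n = len(x) + len(h) - 1            # length of the full linear convolution
    X = np.fft.rfft(x, n)              # zero-pad to n to avoid circular wrap
    H = np.fft.rfft(h, n)
    return np.fft.irfft(X * H, n)      # pointwise product back in time domain
```

The same padding trick is what lets frequency-domain methods compute a linear (not circular) convolution exactly.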
Proceedings Article

Opportunities for Machine Learning in Electronic Design Automation

TL;DR: Because some of the existing ML accelerators have used asynchronous design, the state of the art in asynchronous CAD support is reviewed, and opportunities for ML within these flows are identified.
Proceedings Article

A hardware-friendly algorithm for scalable training and deployment of dimensionality reduction models on FPGA

TL;DR: This paper presents a high-performance, scalable, reconfigurable solution for both training and deployment of different dimensionality reduction models in hardware by introducing a hardware-friendly algorithm.
References
Journal Article

Independent component analysis, a new concept?

Pierre Comon, 01 Apr 1994
TL;DR: An efficient algorithm is proposed, which allows the computation of the ICA of a data matrix within a polynomial time and may actually be seen as an extension of the principal component analysis (PCA).
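The link between ICA and PCA that Comon notes is concrete in practice: PCA whitening is the standard preprocessing step applied before the ICA rotation is estimated. A minimal sketch (illustrative, not from the paper):

```python
import numpy as np

def whiten(X):
    """PCA whitening of data X (features x samples): subtract the mean and
    rescale along principal axes so the covariance becomes the identity."""
    Xc = X - X.mean(axis=1, keepdims=True)
    cov = Xc @ Xc.T / Xc.shape[1]
    d, E = np.linalg.eigh(cov)                     # eigendecomposition of covariance
    return E @ np.diag(1.0 / np.sqrt(d)) @ E.T @ Xc
```

After whitening, ICA only needs to find an orthogonal rotation, which is why it can be viewed as an extension of PCA.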
Journal Article

Fast and robust fixed-point algorithms for independent component analysis

TL;DR: Using maximum entropy approximations of differential entropy, a family of new contrast (objective) functions for ICA is introduced; these enable both estimation of the whole decomposition by minimizing mutual information and estimation of individual independent components as projection-pursuit directions.
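The one-unit fixed-point iteration associated with these contrast functions can be sketched as follows, on whitened data and with the tanh contrast. This is a sketch of the well-known FastICA update, not the authors' exact code:

```python
import numpy as np

def fastica_unit(Z, w, iters=50):
    """One-unit fixed-point iteration on whitened data Z (d x n).
    Uses the tanh contrast; w converges to one independent direction."""
    for _ in range(iters):
        wx = w @ Z
        g = np.tanh(wx)                           # contrast derivative g
        gp = 1.0 - g ** 2                         # its derivative g'
        w_new = (Z * g).mean(axis=1) - gp.mean() * w
        w = w_new / np.linalg.norm(w_new)         # renormalize each step
    return w
```

On a whitened two-source mixture of independent uniform signals, the recovered projection correlates strongly with one of the true sources.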
Journal Article

Natural gradient works efficiently in learning

Shun-ichi Amari, 15 Feb 1998
TL;DR: In this paper, the author uses information geometry to calculate natural gradients in the parameter space of perceptrons, the space of matrices (for blind source separation), and the space of linear dynamical systems (for blind source deconvolution), and proves that Fisher-efficient online learning has asymptotically the same performance as optimal batch estimation of the parameters.
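The generic natural-gradient update preconditions the ordinary gradient by the inverse Fisher information matrix. A minimal sketch, assuming the Fisher matrix is given (Amari derives it for specific model spaces):

```python
import numpy as np

def natural_gradient_step(theta, grad, fisher, eta=1.0):
    """Natural-gradient update: theta <- theta - eta * F^{-1} grad.
    Solving the linear system avoids forming the explicit inverse."""
    return theta - eta * np.linalg.solve(fisher, grad)

# On a quadratic loss L(theta) = 0.5 * theta^T A theta, the Fisher/Hessian
# is A, so a full natural-gradient step (eta = 1) lands at the minimum.
A = np.array([[2.0, 0.0], [0.0, 0.5]])
theta = np.array([1.0, -1.0])
theta_new = natural_gradient_step(theta, A @ theta, A)
```

This one-step behavior on a quadratic is the intuition behind "works efficiently": the metric correction removes the ill-conditioning that slows plain gradient descent.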
Journal Article

On the momentum term in gradient descent learning algorithms

TL;DR: The bounds for convergence on learning-rate and momentum parameters are derived, and it is demonstrated that the momentum term can increase the range of learning rate over which the system converges.
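The heavy-ball momentum scheme whose convergence region the paper analyzes can be sketched in a few lines. The specific learning rate and momentum values below are illustrative assumptions, chosen inside the stable region for a simple quadratic:

```python
def momentum_step(theta, v, grad, lr=0.1, mu=0.9):
    """Heavy-ball update: v <- mu*v - lr*grad; theta <- theta + v.
    The paper derives the (lr, mu) region where this converges."""
    v = mu * v - lr * grad
    return theta + v, v

# Minimize f(x) = x^2 (gradient 2x). Momentum lets a larger effective
# learning rate converge than plain gradient descent would tolerate.
x, v = 1.0, 0.0
for _ in range(300):
    x, v = momentum_step(x, v, 2.0 * x)
```

Intuitively, the velocity term `v` accumulates a running average of past gradients, damping oscillations along steep directions while accelerating along shallow ones.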
Journal Article

Equivariant adaptive source separation

TL;DR: A class of adaptive algorithms for source separation is introduced that implements an adaptive version of equivariant estimation; henceforth called EASI, it yields algorithms with a simple structure for both real and complex mixtures.