Author

L. Yang

Bio: L. Yang is an academic researcher from the University of California, Berkeley. The author has contributed to research in the topics of cellular automata and artificial neural networks, has an h-index of 4, and has co-authored 4 publications receiving 5,039 citations.

Papers
Journal ArticleDOI
TL;DR: In this article, a class of information-processing systems called cellular neural networks (CNNs) is proposed, consisting of a massive aggregate of regularly spaced circuit clones, called cells, which communicate with each other directly only through their nearest neighbors.
Abstract: A novel class of information-processing systems called cellular neural networks is proposed. Like neural networks, they are large-scale nonlinear analog circuits that process signals in real time. Like cellular automata, they consist of a massive aggregate of regularly spaced circuit clones, called cells, which communicate with each other directly only through their nearest neighbors. Each cell is made of a linear capacitor, a nonlinear voltage-controlled current source, and a few resistive linear circuit elements. Cellular neural networks share the best features of both worlds: their continuous-time feature allows real-time signal processing, and their local interconnection feature makes them particularly adapted for VLSI implementation. Cellular neural networks are uniquely suited for high-speed parallel signal processing.

4,583 citations
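
The cell dynamics described in this abstract follow the Chua-Yang state equation: each cell's state decays toward zero while being driven by a feedback template A applied to neighboring outputs, a control template B applied to neighboring inputs, and a bias current. The numpy sketch below simulates those dynamics with forward Euler integration; the grid size, time step, and smoothing-style template values are illustrative assumptions, not values taken from the paper.

import numpy as np

def cnn_output(x):
    # Piecewise-linear output nonlinearity: y = 0.5 * (|x + 1| - |x - 1|).
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def neighborhood_sum(z, template):
    # Correlate an array with a 3x3 cloning template, zero-padded at the border.
    padded = np.pad(z, 1, mode="constant")
    rows, cols = z.shape
    total = np.zeros_like(z, dtype=float)
    for di in range(3):
        for dj in range(3):
            total += template[di, dj] * padded[di:di + rows, dj:dj + cols]
    return total

def simulate_cnn(u, A, B, bias, steps=200, dt=0.05):
    # Forward Euler integration of dx/dt = -x + A*y + B*u + I (C = R = 1),
    # starting from the input image as the initial state.
    x = u.copy()
    for _ in range(steps):
        y = cnn_output(x)
        x = x + dt * (-x + neighborhood_sum(y, A) + neighborhood_sum(u, B) + bias)
    return cnn_output(x)

if __name__ == "__main__":
    # Illustrative smoothing-style templates (assumed values, not from the paper).
    A = np.array([[0.0, 1.0, 0.0],
                  [1.0, 2.0, 1.0],
                  [0.0, 1.0, 0.0]])
    B = np.zeros((3, 3))
    u = np.sign(np.random.default_rng(0).standard_normal((16, 16)))  # bipolar "image"
    print(simulate_cnn(u, A, B, bias=0.0))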

Proceedings ArticleDOI
07 Jun 1988
TL;DR: Examples of cellular neural networks which can be designed to recognize the key features of Chinese characters are presented, and some theorems concerning the dynamic range and the steady states of cellular neural networks are proved.
Abstract: A novel class of information-processing systems called cellular neural networks is proposed. Like neural networks, they are large-scale nonlinear circuits which process signals in real time. Like cellular automata, they are made of a massive aggregate of regularly spaced circuit clones, called cells, which communicate with each other directly only through their nearest neighbors. Some applications in such areas as image processing are demonstrated, albeit with only a crude circuit. In particular, examples of cellular neural networks which can be designed to recognize the key features of Chinese characters are presented. Some theorems concerning the dynamic range and the steady states of cellular neural networks are proved. In view of the near-neighbor interactive property of cellular neural networks, they are much more amenable to VLSI implementation than general neural networks.

495 citations

Proceedings ArticleDOI
01 May 1990
TL;DR: A cellular neural network (CNN), which is an example of very-large-scale analog processing or collective analog computation, is presented, and VLSI implementation of these circuits is discussed.
Abstract: A cellular neural network (CNN) which is an example of very-large-scale analog processing or collective analog computation is presented. The CNN architecture combines some features of fully connected analog neural networks with the nearest-neighbor interactions found in cellular automata. VLSI implementation of these circuits is discussed. Though the circuits described have been fabricated for noise removal and connected segment extraction, most of the features of the VLSI circuits are shared by VLSI implementations of other processing functions.

62 citations

Proceedings ArticleDOI
01 May 1990
TL;DR: The cellular neural network (CNN) is an example of very-large-scale analog processing or collective analog computation which combines some features of fully connected analog neural networks with the nearest-neighbor interactions found in cellular automata.
Abstract: The cellular neural network (CNN) is an example of very-large-scale analog processing or collective analog computation. The CNN architecture combines some features of fully connected analog neural networks with the nearest-neighbor interactions found in cellular automata. These networks have numerous advantages both for simulation and for VLSI implementation and can perform (though are not limited to) several important image processing functions. The important features which enable the CNN architecture to perform signal processing functions using standard VLSI technology are discussed. Circuit characteristics are outlined, and examples of cellular neural network signal processing are presented. Connected segment extraction is illustrated by examples, as is histogramming using a two-layer CNN.

39 citations


Cited by
Journal ArticleDOI
TL;DR: A new neural network model, called the graph neural network (GNN) model, extends existing neural network methods for processing data represented in graph domains and implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space.
Abstract: Many underlying relationships among data in several areas of science and engineering, e.g., computer vision, molecular chemistry, molecular biology, pattern recognition, and data mining, can be represented in terms of graphs. In this paper, we propose a new neural network model, called graph neural network (GNN) model, that extends existing neural network methods for processing the data represented in graph domains. This GNN model, which can directly process most of the practically useful types of graphs, e.g., acyclic, cyclic, directed, and undirected, implements a function τ(G, n) ∈ ℝ^m that maps a graph G and one of its nodes n into an m-dimensional Euclidean space. A supervised learning algorithm is derived to estimate the parameters of the proposed GNN model. The computational cost of the proposed algorithm is also considered. Some experimental results are shown to validate the proposed learning algorithm, and to demonstrate its generalization capabilities.

5,701 citations
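
As a rough illustration of the τ(G, n) mapping described above, the untrained numpy sketch below iterates a neighbor-aggregating transition toward an approximate fixed point and then applies a readout that produces an m-dimensional output per node. The state size, output size, weight scaling, and the tanh transition are illustrative assumptions; in the paper these parameters are learned with the supervised algorithm it derives.

import numpy as np

rng = np.random.default_rng(0)

def gnn_forward(adj, labels, state_dim=8, out_dim=4, iters=50):
    n, label_dim = labels.shape
    # Small random weights keep the transition roughly contractive, so the
    # fixed-point iteration settles (mirroring the Banach-style argument in the paper).
    W_state = 0.05 * rng.standard_normal((state_dim, state_dim))
    W_label = rng.standard_normal((state_dim, label_dim))
    W_out = rng.standard_normal((out_dim, state_dim + label_dim))

    x = np.zeros((n, state_dim))
    for _ in range(iters):
        # Each node sums a transformation of its neighbors' current states
        # and its own label, then applies tanh.
        neighbor_sum = adj @ (x @ W_state.T)
        x = np.tanh(neighbor_sum + labels @ W_label.T)

    # Readout: concatenate state and label, project to an m-dimensional output.
    return np.concatenate([x, labels], axis=1) @ W_out.T

if __name__ == "__main__":
    # Tiny undirected graph on 5 nodes with 3-dimensional node labels.
    adj = np.array([[0, 1, 0, 0, 1],
                    [1, 0, 1, 0, 0],
                    [0, 1, 0, 1, 0],
                    [0, 0, 1, 0, 1],
                    [1, 0, 0, 1, 0]], dtype=float)
    labels = rng.standard_normal((5, 3))
    outputs = gnn_forward(adj, labels)   # row n plays the role of tau(G, n)
    print(outputs.shape)                 # (5, 4)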

Journal ArticleDOI
TL;DR: In this article, a class of information-processing systems called cellular neural networks (CNNs) is proposed, consisting of a massive aggregate of regularly spaced circuit clones, called cells, which communicate with each other directly only through their nearest neighbors.
Abstract: A novel class of information-processing systems called cellular neural networks is proposed. Like neural networks, they are large-scale nonlinear analog circuits that process signals in real time. Like cellular automata, they consist of a massive aggregate of regularly spaced circuit clones, called cells, which communicate with each other directly only through their nearest neighbors. Each cell is made of a linear capacitor, a nonlinear voltage-controlled current source, and a few resistive linear circuit elements. Cellular neural networks share the best features of both worlds: their continuous-time feature allows real-time signal processing, and their local interconnection feature makes them particularly adapted for VLSI implementation. Cellular neural networks are uniquely suited for high-speed parallel signal processing.

4,583 citations

Journal ArticleDOI
TL;DR: Examples of cellular neural networks which can be designed to recognize the key features of Chinese characters are presented, and their applications to such areas as image processing and pattern recognition are demonstrated.
Abstract: The theory of a novel class of information-processing systems, called cellular neural networks, which are capable of high-speed parallel signal processing, was presented in a previous paper (see ibid., vol. 35, no. 10, pp. 1257-1272, 1988). A dynamic route approach for analyzing the local dynamics of this class of neural circuits is used to steer the system trajectories into various stable equilibrium configurations which map onto binary patterns to be recognized. Some applications of cellular neural networks to such areas as image processing and pattern recognition are demonstrated, albeit with only a crude circuit. In particular, examples of cellular neural networks which can be designed to recognize the key features of Chinese characters are presented.

2,332 citations

Journal ArticleDOI
TL;DR: A machine learning practitioner seeking guidance for implementing the new augmented LSTM model in software for experimentation and research will find the insights and derivations in this treatise valuable.

1,795 citations
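
For readers who want the reference point the TL;DR alludes to, the numpy sketch below steps the standard LSTM cell (forget, input, and output gates with a tanh candidate) over a toy sequence. It is the textbook formulation with random, untrained weights, not necessarily the exact augmented variant derived in the cited paper.

import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h_prev, c_prev, W, U, b):
    # Gate pre-activations from the current input and previous hidden state.
    z = W @ x + U @ h_prev + b
    hidden = h_prev.size
    f = 1.0 / (1.0 + np.exp(-z[0 * hidden:1 * hidden]))   # forget gate
    i = 1.0 / (1.0 + np.exp(-z[1 * hidden:2 * hidden]))   # input gate
    o = 1.0 / (1.0 + np.exp(-z[2 * hidden:3 * hidden]))   # output gate
    g = np.tanh(z[3 * hidden:4 * hidden])                 # candidate cell state
    c = f * c_prev + i * g        # update the cell (memory) state
    h = o * np.tanh(c)            # expose a gated view of the cell state
    return h, c

if __name__ == "__main__":
    input_dim, hidden = 3, 5
    W = 0.1 * rng.standard_normal((4 * hidden, input_dim))
    U = 0.1 * rng.standard_normal((4 * hidden, hidden))
    b = np.zeros(4 * hidden)
    h = np.zeros(hidden)
    c = np.zeros(hidden)
    for x in rng.standard_normal((10, input_dim)):  # a length-10 input sequence
        h, c = lstm_step(x, h, c, W, U, b)
    print(h)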

Journal ArticleDOI
TL;DR: In this paper, the authors focus on the promise of artificial neural networks in the realm of modelling, identification and control of nonlinear systems and explore the links between the fields of control science and neural networks.

1,721 citations