Patent
Parallelizing neural networks during training
TLDR
In this patent, a parallel convolutional neural network (CNN) is implemented by a plurality of CNNs, each on a respective processing node, and each CNN has a plurality of layers.
Abstract
A parallel convolutional neural network is provided. The CNN is implemented by a plurality of convolutional neural networks, each on a respective processing node. Each CNN has a plurality of layers. A subset of the layers is interconnected between processing nodes such that activations are fed forward across nodes; the remaining subset is not so interconnected.
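The abstract's scheme, independent per-node CNN "columns" that exchange activations only at designated layers, can be sketched in plain NumPy. This is a minimal illustration of the connectivity pattern only, not the patented implementation: the class and parameter names are invented for this sketch, and dense layers stand in for convolutions to keep it short.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class ParallelColumns:
    """Two CNN 'columns', stand-ins for two processing nodes.

    Layers whose index is in `cross_layers` concatenate the activations
    of both columns before applying their weights (activations fed
    forward across nodes); all other layers stay node-local.
    """
    def __init__(self, sizes, cross_layers, seed=0):
        rng = np.random.default_rng(seed)
        self.cross = set(cross_layers)
        self.w = []  # per layer: one weight matrix per column
        for i in range(len(sizes) - 1):
            # interconnected layers see the activations of both nodes
            fan_in = sizes[i] * (2 if i in self.cross else 1)
            self.w.append([rng.standard_normal((fan_in, sizes[i + 1])) * 0.1
                           for _ in range(2)])

    def forward(self, x0, x1):
        acts = [x0, x1]
        for i, (w0, w1) in enumerate(self.w):
            if i in self.cross:   # cross-node layer
                joint = np.concatenate(acts)
                acts = [relu(joint @ w0), relu(joint @ w1)]
            else:                 # node-local layer
                acts = [relu(acts[0] @ w0), relu(acts[1] @ w1)]
        return acts

# Only layer 1 is interconnected; layers 0 and 2 run independently.
net = ParallelColumns(sizes=[8, 16, 16, 4], cross_layers=[1])
out0, out1 = net.forward(np.ones(8), np.ones(8))
print(out0.shape, out1.shape)  # (4,) (4,)
```

Restricting the cross-node links to a subset of layers is what limits inter-node communication: only the interconnected layers incur an activation transfer between nodes.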
Citations
Book Chapter DOI
Determining Optimal Multi-layer Perceptron Structure Using Linear Regression
Mohamed Lafif Tej, Stefan Holban +1 more
TL;DR: A novel method is presented for determining the optimal Multi-layer Perceptron structure using linear regression; it works unsupervised, unlike other methods, and is more flexible across different dataset types.
Patent
Methods and systems for improved transforms in convolutional neural networks
Matveev Alexander, Shavit Nir +1 more
TL;DR: In this paper, an improved convolutional layer in CNNs is presented. The convolution is performed via a transformation that includes relocating input, relocating convolution filters and performing an aggregate matrix multiply.
Patent
Systems and methods for generation of sparse code for convolutional neural networks
TL;DR: In this article, a system and method may generate code to be used when executing neural networks (NNs), for example convolutional neural networks, which may include one or more convolutions.
Patent
System and method of executing neural networks
TL;DR: In this paper, inference of a neural network (NN) is executed on one or more target computing devices using task instruction code that represents at least one computation of a kernel of the NN.
Patent
Data representations and architectures, systems, and methods for multi-sensory fusion, computing, and cross-domain generalization
TL;DR: In this article, existing data on a plurality of semantic concepts, including processed output data from NNBCSs, is received at a computer system, which generates a set of parameterizations of those concepts and a data structure defining a continuous vector space of a digital knowledge graph (DKG).
References
Journal Article DOI
ImageNet classification with deep convolutional neural networks
TL;DR: A large, deep convolutional neural network was trained to classify the 1.2 million high-resolution images of the ImageNet LSVRC-2010 contest into the 1000 different classes; the network employed a recently developed regularization method called "dropout" that proved to be very effective.
Proceedings Article DOI
Multi-column deep neural networks for image classification
TL;DR: In this paper, biologically plausible, wide and deep artificial neural network architectures are proposed that achieve near-human performance on tasks such as the recognition of handwritten digits or traffic signs.
Proceedings Article
Large Scale Distributed Deep Networks
Jeffrey Dean, Greg S. Corrado, Rajat Monga, Kai Chen, Matthieu Devin, Mark Z. Mao, Marc'Aurelio Ranzato, Andrew W. Senior, Paul A. Tucker, Ke Yang, Quoc V. Le, Andrew Y. Ng +11 more
TL;DR: This paper considers the problem of training a deep network with billions of parameters using tens of thousands of CPU cores and develops two algorithms for large-scale distributed training, Downpour SGD and Sandblaster L-BFGS, which increase the scale and speed of deep network training.
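Downpour SGD pairs data parallelism with an asynchronous parameter server: workers train on data shards and push gradients without global synchronization. A toy, single-process simulation of that pattern (asynchrony mocked by round-robin worker turns; the least-squares model, learning rate, and all names are illustrative, not taken from the paper):

```python
import numpy as np

# Hypothetical setup: recover true_w by least squares.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
X = rng.standard_normal((300, 2))
y = X @ true_w

params = np.zeros(2)                         # parameter-server state
shards = np.array_split(np.arange(300), 3)   # one data shard per worker
lr = 0.1

for step in range(200):
    wid = step % 3                           # mock asynchronous scheduling
    idx = rng.choice(shards[wid], size=16)   # worker's minibatch
    # Worker pulls the current params and computes a local gradient...
    grad = 2 * X[idx].T @ (X[idx] @ params - y[idx]) / len(idx)
    # ...then pushes it; the server applies it with no global barrier.
    params -= lr * grad

print(np.round(params, 2))
```

In the real system each worker also runs on a stale copy of the parameters between pulls; tolerating that staleness is what lets training scale to tens of thousands of cores.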
Patent
Method and tool for data mining in automatic decision making systems
TL;DR: In this article, the authors propose qualitative modeling of the interrelations between objects whose attributes are relevant to the score produced by the predictor on which decisions are based; this relevancy is determined by input from a domain expert on the problem at hand.