
Showing papers on "Deep learning" published in 1970


Journal ArticleDOI
01 Jan 1970
TL;DR: Details from a selected variety of works published in recent years are presented to illustrate the versatility of Deep Learning techniques, their potential in current and future research and industry applications, and their state-of-the-art status in vision tasks, where experiments demonstrate accuracy approaching 100%.
Abstract: Deep Learning is used across many fields of application. This paper presents details from a selected variety of works published in recent years to illustrate the versatility of Deep Learning techniques, their potential in current and future research and industry applications, and their state-of-the-art status in vision tasks, where experiments demonstrate accuracy approaching 100%. The presented applications range from navigation and localization to object recognition and more advanced interactions such as grasping.

1 citation


Posted ContentDOI
01 Jan 1970-ChemRxiv
TL;DR: In this paper, a model trained with 1 million structures (0.1% of the database) reproduces 68.9% of the entire database when sampling 2 billion molecules.
Abstract: Recent applications of Recurrent Neural Networks enable training models that sample the chemical space. In this study we train an RNN on molecular string representations (SMILES) from a subset of the enumerated database GDB-13 (975 million molecules). We show that a model trained with 1 million structures (0.1% of the database) reproduces 68.9% of the entire database when sampling 2 billion molecules. We also develop a method to assess the quality of the training process using log-likelihood plots. Furthermore, we use a mathematical model based on the “coupon collector problem” that compares the trained model to an upper bound, which shows that complex molecules with many rings and heteroatoms are more difficult to sample. We also suggest that the metrics obtained from this analysis can be used as a tool to benchmark any molecular generative model.

1 citation
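
The coupon-collector comparison in the abstract above can be made concrete. Under the idealized assumption that every GDB-13 molecule is generated with equal probability, the expected fraction of an N-molecule database recovered after n samples is 1 - (1 - 1/N)^n, well approximated by 1 - e^(-n/N). A minimal Python sketch using the numbers quoted in the abstract (N = 975 million, n = 2 billion; the function name is illustrative, not from the paper):

import math

# Idealized coupon-collector upper bound: if each of the N molecules in
# GDB-13 were sampled with equal probability, the expected fraction of
# the database seen after n draws would be 1 - (1 - 1/N)**n, which is
# well approximated by 1 - exp(-n/N) for large N.
def expected_coverage(n_samples: float, database_size: float) -> float:
    return 1.0 - math.exp(-n_samples / database_size)

N = 975e6  # molecules enumerated in GDB-13
n = 2e9    # molecules sampled from the trained RNN

print(f"Ideal upper bound: {expected_coverage(n, N):.1%}")  # ~87.1%

The gap between this roughly 87% ideal bound and the reported 68.9% coverage is consistent with the abstract's observation that the model samples non-uniformly, under-generating complex molecules with many rings and heteroatoms.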


Journal ArticleDOI
TL;DR: This work proposes to analyze neural networks using an approach more general than neural networks themselves, a Calculus of Self-modifiable Algorithms, which is a universal model for intelligent and parallel systems, integrating different styles of programming.
Abstract: Neural networks provide a powerful tool for new-generation computers. The biggest problem of neural networks is their lack of representational power. We propose to analyze neural networks using an approach that is more general than neural networks. A Calculus of Self-modifiable Algorithms is a universal model for intelligent and parallel systems, integrating different styles of programming and applied in different domains of future-generation computers. Applying this model to neural networks gives some hints on how to increase the representational power of neural networks.