Low Data Drug Discovery with One-Shot Learning
TLDR
Ramsundar et al. propose an iterative refinement long short-term memory architecture that, when combined with graph convolutional neural networks, improves the learning of meaningful distance metrics over small molecules.

Abstract:
Recent advances in machine learning have made significant contributions to drug discovery. Deep neural networks in particular have been demonstrated to provide significant boosts in predictive power when inferring the properties and activities of small-molecule compounds (Ma, J. et al. J. Chem. Inf. Model. 2015, 55, 263–274). However, the applicability of these techniques has been limited by the requirement for large amounts of training data. In this work, we demonstrate how one-shot learning can be used to significantly lower the amounts of data required to make meaningful predictions in drug discovery applications. We introduce a new architecture, the iterative refinement long short-term memory, that, when combined with graph convolutional neural networks, significantly improves learning of meaningful distance metrics over small molecules. We open source all models introduced in this work as part of DeepChem, an open-source framework for deep learning in drug discovery (Ramsundar, B. deepchem.io. https:...)
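The abstract's central idea is predicting properties of a query molecule from only a handful of labeled examples by learning a distance metric over molecular embeddings. A minimal sketch of that style of metric-based one-shot classification, assuming embeddings have already been computed (this is illustrative cosine-attention over a support set, in the spirit of matching networks, not the paper's actual iterative refinement LSTM; all function names here are hypothetical):

```python
import numpy as np

def cosine_similarity(a, b):
    # Pairwise cosine similarity between rows of a and rows of b.
    a_n = a / np.linalg.norm(a, axis=1, keepdims=True)
    b_n = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a_n @ b_n.T

def one_shot_predict(query_emb, support_emb, support_labels):
    # Attend over the support set: softmax of cosine similarities,
    # then return attention-weighted support labels as class probabilities.
    sims = cosine_similarity(query_emb, support_emb)
    attn = np.exp(sims) / np.exp(sims).sum(axis=1, keepdims=True)
    return attn @ support_labels
```

In the paper's setting, the embeddings would come from a graph convolutional network over the molecules, and the iterative refinement LSTM would repeatedly update query and support embeddings in context of each other before this similarity step.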
Citations
Journal Article
Machine learning for molecular and materials science.
TL;DR: A future in which the design, synthesis, characterization and application of molecules and materials is accelerated by artificial intelligence is envisaged.
Journal Article
Opportunities and obstacles for deep learning in biology and medicine.
Travers Ching, Daniel Himmelstein, Brett K. Beaulieu-Jones, Alexandr A. Kalinin, Brian T. Do, Gregory P. Way, Enrico Ferrero, Paul-Michael Agapow, Michael Zietz, Michael M. Hoffman, Wei Xie, Gail L. Rosen, Benjamin J. Lengerich, Johnny Israeli, Jack Lanchantin, Stephen Woloszynek, Anne E. Carpenter, Avanti Shrikumar, Jinbo Xu, Evan M. Cofer, Christopher A. Lavender, Srinivas C. Turaga, Amr Alexandari, Zhiyong Lu, David J. Harris, Dave DeCaprio, Yanjun Qi, Anshul Kundaje, Yifan Peng, Laura K. Wiley, Marwin H. S. Segler, Simina M. Boca, S. Joshua Swamidass, Austin Huang, Anthony Gitter, Casey S. Greene +38 more
TL;DR: It is found that deep learning has yet to revolutionize biomedicine or definitively resolve any of the most pressing challenges in the field, but promising advances have been made over the prior state of the art.
Journal Article
Generalizing from a Few Examples: A Survey on Few-shot Learning
TL;DR: A thorough survey to fully understand Few-shot Learning (FSL), which categorizes FSL methods from three perspectives: data, which uses prior knowledge to augment the supervised experience; model, which uses prior knowledge to reduce the size of the hypothesis space; and algorithm, which uses prior knowledge to alter the search for the best hypothesis in the given hypothesis space.
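Few-shot learning methods like those surveyed above are typically trained and evaluated on episodes: small tasks with an N-way, K-shot support set and a held-out query set. A minimal episode sampler, assuming a dataset organized as a mapping from class label to examples (the function and its parameters are illustrative, not from the survey):

```python
import random

def sample_episode(dataset, n_way=2, k_shot=1, q_queries=1):
    # dataset: dict mapping class label -> list of examples.
    # Pick n_way classes, then split k_shot + q_queries examples per class
    # into a support set (for adaptation) and a query set (for evaluation).
    classes = random.sample(sorted(dataset), n_way)
    support, query = [], []
    for label in classes:
        examples = random.sample(dataset[label], k_shot + q_queries)
        support += [(x, label) for x in examples[:k_shot]]
        query += [(x, label) for x in examples[k_shot:]]
    return support, query
```

Repeatedly sampling such episodes during training is what lets the model acquire the transferable prior knowledge the survey describes.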
Journal Article
The rise of deep learning in drug discovery.
TL;DR: The first wave of applications of deep learning in pharmaceutical research has emerged in recent years, and its utility has gone beyond bioactivity predictions and has shown promise in addressing diverse problems in drug discovery.
Journal Article
Generating Focused Molecule Libraries for Drug Discovery with Recurrent Neural Networks
TL;DR: This work shows that recurrent neural networks can be trained as generative models for molecular structures, similar to statistical language models in natural language processing, and demonstrates that the properties of the generated molecules correlate very well with those of the molecules used to train the model.
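Generative molecular models of this kind emit a molecule one character of its SMILES string at a time, sampling each next character from the network's output distribution. A minimal sketch of the sampling step with a temperature parameter (the function is illustrative; the actual model in the cited work is a trained recurrent network producing these logits):

```python
import numpy as np

def sample_char(logits, temperature=1.0, rng=None):
    # Sample the next character index from model logits.
    # Lower temperature sharpens the distribution toward the argmax;
    # higher temperature yields more diverse (but noisier) molecules.
    rng = rng or np.random.default_rng(0)
    p = np.exp(logits / temperature)
    p /= p.sum()
    return rng.choice(len(logits), p=p)
```

Chaining this step until an end-of-sequence character is produced yields a candidate SMILES string, which is then checked for chemical validity.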
References
Journal Article
Long short-term memory
TL;DR: A novel, efficient, gradient-based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
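The "constant error carousel" in the blurb refers to the additive update of the cell state, which lets gradients flow across many time steps without vanishing. A single LSTM step can be sketched as follows (a standard formulation with input, forget, and output gates; weight shapes are an assumption of this sketch):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b, H):
    # One LSTM step with hidden size H.
    # W: (4H, X) input weights, U: (4H, H) recurrent weights, b: (4H,) bias.
    z = W @ x + U @ h + b
    i = sigmoid(z[:H])          # input gate
    f = sigmoid(z[H:2 * H])     # forget gate
    o = sigmoid(z[2 * H:3 * H]) # output gate
    g = np.tanh(z[3 * H:])      # candidate cell update
    c_new = f * c + i * g       # additive update: the constant error carousel
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

Because `c_new` depends on `c` through elementwise multiplication by the forget gate rather than a squashing nonlinearity, the error signal through the cell state is not repeatedly attenuated.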
Journal Article
ImageNet Large Scale Visual Recognition Challenge
Olga Russakovsky, Jia Deng, Hao Su, Jonathan Krause, Sanjeev Satheesh, Sean Ma, Zhiheng Huang, Andrej Karpathy, Aditya Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei +11 more
TL;DR: The ImageNet Large Scale Visual Recognition Challenge (ILSVRC) is a benchmark in object category classification and detection spanning hundreds of object categories and millions of images; it has been run annually from 2010 to the present, attracting participation from more than fifty institutions.
Journal Article
Mastering the game of Go with deep neural networks and tree search
David Silver, Aja Huang, Chris J. Maddison, Arthur Guez, Laurent Sifre, George van den Driessche, Julian Schrittwieser, Ioannis Antonoglou, Veda Panneershelvam, Marc Lanctot, Sander Dieleman, Dominik Grewe, John Nham, Nal Kalchbrenner, Ilya Sutskever, Timothy P. Lillicrap, Madeleine Leach, Koray Kavukcuoglu, Thore Graepel, Demis Hassabis +19 more
TL;DR: Using this search algorithm, the program AlphaGo achieved a 99.8% winning rate against other Go programs and defeated the human European Go champion by 5 games to 0, the first time that a computer program has defeated a human professional player in the full-sized game of Go.
Proceedings Article
TensorFlow: a system for large-scale machine learning
Martín Abadi, Paul Barham, Jianmin Chen, Zhifeng Chen, Andy Davis, Jeffrey Dean, Matthieu Devin, Sanjay Ghemawat, Geoffrey Irving, Michael Isard, Manjunath Kudlur, Josh Levenberg, Rajat Monga, Sherry Moore, Derek G. Murray, Benoit Steiner, Paul A. Tucker, Vijay K. Vasudevan, Pete Warden, Martin Wicke, Yuan Yu, Xiaoqiang Zheng +21 more
TL;DR: TensorFlow is a machine learning system that operates at large scale and in heterogeneous environments, using dataflow graphs to represent computation, shared state, and the operations that mutate that state.
Book Chapter
Identity Mappings in Deep Residual Networks
TL;DR: Shows that forward and backward signals can be propagated directly from one block to any other block when identity mappings are used as the skip connections and after-addition activation.
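The key point of the identity-mapping result is that the shortcut path adds the block's output to its input unchanged, so signals pass through untouched regardless of what the weight layers compute. A minimal sketch of a pre-activation residual block, with batch normalization omitted for brevity (weight shapes and the function name are assumptions of this sketch):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def preact_residual_block(x, W1, W2):
    # Pre-activation ordering: activation precedes each weight layer,
    # and the skip path is a pure identity with no gate or scaling.
    residual = W2 @ relu(W1 @ relu(x))
    return x + residual  # identity shortcut: gradients flow unimpeded
```

Because the return value is `x` plus a correction, setting the weights to zero recovers the identity function exactly, which is what makes very deep stacks of such blocks trainable.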