Institution
Université de Montréal
Education • Montreal, Quebec, Canada
About: Université de Montréal is an education organization based in Montreal, Quebec, Canada. It is known for its research contributions in the topics of Population & Poison control. The organization has 45,641 authors who have published 100,476 publications receiving 4,004,007 citations. The organization is also known as University of Montreal and UdeM.
Papers
[...]
TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Abstract: Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.
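The layer-by-layer representation learning and backpropagation mechanism summarized above can be made concrete with a small, hedged sketch (illustrative only, not code from the paper): a two-layer NumPy network in which each layer computes its representation from the previous layer's output, and gradients propagated backwards adjust the internal parameters. The XOR task, hidden size, and learning rate are assumptions chosen for brevity.

```python
# Illustrative sketch (not from the paper): a tiny two-layer network trained with
# backpropagation, showing how each layer's representation is computed from the
# previous layer's and how gradients adjust the internal parameters.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a classic problem a single linear layer cannot solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Internal parameters of the two processing layers.
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # Forward pass: each layer transforms the previous layer's representation.
    h = np.tanh(X @ W1 + b1)      # hidden representation
    p = sigmoid(h @ W2 + b2)      # output probability

    # Backward pass: backpropagation indicates how to change each parameter.
    d_out = (p - y) / len(X)                  # cross-entropy gradient at the output
    dW2, db2 = h.T @ d_out, d_out.sum(axis=0)
    d_hidden = (d_out @ W2.T) * (1 - h ** 2)  # chain rule through tanh
    dW1, db1 = X.T @ d_hidden, d_hidden.sum(axis=0)

    # Gradient-descent update of the internal parameters.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(np.round(p, 2))  # should approach [0, 1, 1, 0]
```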
33,931 citations
08 Dec 2014
TL;DR: A new framework for estimating generative models via an adversarial process, in which two models are simultaneously trained: a generative model G that captures the data distribution and a discriminative model D that estimates the probability that a sample came from the training data rather than G.
Abstract: We propose a new framework for estimating generative models via an adversarial process, in which we simultaneously train two models: a generative model G that captures the data distribution, and a discriminative model D that estimates the probability that a sample came from the training data rather than G. The training procedure for G is to maximize the probability of D making a mistake. This framework corresponds to a minimax two-player game. In the space of arbitrary functions G and D, a unique solution exists, with G recovering the training data distribution and D equal to ½ everywhere. In the case where G and D are defined by multilayer perceptrons, the entire system can be trained with backpropagation. There is no need for any Markov chains or unrolled approximate inference networks during either training or generation of samples. Experiments demonstrate the potential of the framework through qualitative and quantitative evaluation of the generated samples.
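As a hedged illustration of the adversarial procedure (a sketch under assumed toy settings, not the authors' implementation), the loop below trains a generator G and a discriminator D, both small multilayer perceptrons, on a one-dimensional Gaussian "data distribution" in PyTorch. The toy distribution, network sizes, and optimizer settings are assumptions.

```python
# Minimal adversarial training sketch: D learns to tell real samples from G's
# samples; G learns to maximise the probability of D making a mistake.
import torch
import torch.nn as nn

torch.manual_seed(0)

G = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))                 # noise -> sample
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())   # sample -> P(real)

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

def real_batch(n=64):
    # Assumed "data distribution": a Gaussian centred at 3.
    return 3.0 + 0.5 * torch.randn(n, 1)

for step in range(2000):
    # Discriminator step: distinguish real samples from G's samples.
    x_real = real_batch()
    x_fake = G(torch.randn(64, 4)).detach()
    loss_d = bce(D(x_real), torch.ones(64, 1)) + bce(D(x_fake), torch.zeros(64, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator step: push D toward labelling G's samples as real.
    x_fake = G(torch.randn(64, 4))
    loss_g = bce(D(x_fake), torch.ones(64, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

print(G(torch.randn(1000, 4)).mean().item())  # should drift toward ~3
```

No Markov chains or unrolled inference networks are needed here; both updates use ordinary backpropagation, mirroring the point made in the abstract.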
29,410 citations
Book
[...]
TL;DR: Deep learning, as presented in this book, is a form of machine learning that enables computers to learn from experience and to understand the world in terms of a hierarchy of concepts; it is used in applications such as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames.
Abstract: Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.
26,972 citations
Proceedings Article
01 Jan 2015
TL;DR: It is conjectured that the use of a fixed-length vector is a bottleneck in improving the performance of this basic encoder-decoder architecture, and it is proposed to extend this by allowing a model to automatically (soft-)search for parts of a source sentence that are relevant to predicting a target word, without having to form these parts as a hard segment explicitly.
Abstract: Neural machine translation is a recently proposed approach to machine translation. Unlike traditional statistical machine translation, neural machine translation aims at building a single neural network that can be jointly tuned to maximize translation performance. The models recently proposed for neural machine translation often belong to a family of encoder-decoders and consist of an encoder that encodes a source sentence into a fixed-length vector from which a decoder generates a translation. In this paper, we conjecture that the use of a fixed-length vector is a bottleneck in improving the performance of this basic encoder-decoder architecture, and propose to extend this by allowing a model to automatically (soft-)search for parts of a source sentence that are relevant to predicting a target word, without having to form these parts as a hard segment explicitly. With this new approach, we achieve a translation performance comparable to the existing state-of-the-art phrase-based system on the task of English-to-French translation. Furthermore, qualitative analysis reveals that the (soft-)alignments found by the model agree well with our intuition.
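A minimal sketch of the (soft-)search step described above, assuming additive (MLP-style) scoring with random toy weights rather than the paper's trained parameters: at each decoding step every encoder annotation is scored against the current decoder state, the scores are normalised into soft alignment weights, and their weighted sum serves as the context vector instead of a single fixed-length sentence vector.

```python
# Hedged illustration of soft alignment for one decoder step (shapes and scoring
# form are assumptions for the sketch, not the paper's exact parameterisation).
import numpy as np

rng = np.random.default_rng(0)

T, d = 6, 8                              # source length, hidden size
enc_states = rng.normal(size=(T, d))     # one annotation per source word
dec_state = rng.normal(size=d)           # decoder state before emitting the next word

# Additive scoring: score each source position against the decoder state.
Wa, Ua, va = rng.normal(size=(d, d)), rng.normal(size=(d, d)), rng.normal(size=d)
scores = np.tanh(dec_state @ Wa + enc_states @ Ua) @ va   # shape (T,)

# Softmax turns the scores into a soft alignment over source positions.
weights = np.exp(scores - scores.max())
weights /= weights.sum()

# Context vector: expected annotation under the alignment distribution.
context = weights @ enc_states           # shape (d,)
print(np.round(weights, 3), context.shape)
```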
15,992 citations
TL;DR: A brief historical review of the development of lithium-based rechargeable batteries is presented, ongoing research strategies are highlighted, and the challenges that remain regarding the synthesis, characterization, electrochemical performance and safety of these systems are discussed.
Abstract: Technological improvements in rechargeable solid-state batteries are being driven by an ever-increasing demand for portable electronic devices. Lithium-ion batteries are the systems of choice, offering high energy density, flexible and lightweight design, and longer lifespan than comparable battery technologies. We present a brief historical review of the development of lithium-based rechargeable batteries, highlight ongoing research strategies, and discuss the challenges that remain regarding the synthesis, characterization, electrochemical performance and safety of these systems.
15,475 citations
Authors
| Name | H-index | Papers | Citations |
|---|---|---|---|
| Yoshua Bengio | 202 | 1033 | 420313 |
| Alan C. Evans | 183 | 866 | 134642 |
| Richard H. Friend | 169 | 1182 | 140032 |
| Anders Björklund | 165 | 769 | 84268 |
| Charles N. Serhan | 158 | 728 | 84810 |
| Fernando Rivadeneira | 146 | 628 | 86582 |
| C. Dallapiccola | 136 | 1717 | 101947 |
| Michael J. Meaney | 136 | 604 | 81128 |
| Claude Leroy | 135 | 1170 | 88604 |
| Georges Azuelos | 134 | 1294 | 90690 |
| Phillip Gutierrez | 133 | 1391 | 96205 |
| Danny Miller | 133 | 512 | 71238 |
| Henry T. Lynch | 133 | 925 | 86270 |
| Stanley Nattel | 132 | 778 | 65700 |
| Lucie Gauthier | 132 | 679 | 64794 |