Institution

École Polytechnique de Montréal

Education
Montreal, Quebec, Canada

About: École Polytechnique de Montréal is an educational institution based in Montreal, Quebec, Canada. It is known for its research contributions in the topics of Finite element method and Population. The organization has 8015 authors who have published 18390 publications, receiving 494372 citations.


Papers
Proceedings Article
07 Dec 2015
TL;DR: Introduces BinaryConnect, a method for training a DNN with binary weights during the forward and backward propagations while retaining the precision of the stored weights in which gradients are accumulated; near state-of-the-art results are obtained with BinaryConnect on the permutation-invariant MNIST, CIFAR-10 and SVHN.
Abstract: Deep Neural Networks (DNN) have achieved state-of-the-art results in a wide range of tasks, with the best results obtained with large training sets and large models. In the past, GPUs enabled these breakthroughs because of their greater computational speed. In the future, faster computation at both training and test time is likely to be crucial for further progress and for consumer applications on low-power devices. As a result, there is much interest in research and development of dedicated hardware for Deep Learning (DL). Binary weights, i.e., weights which are constrained to only two possible values (e.g. -1 or 1), would bring great benefits to specialized DL hardware by replacing many multiply-accumulate operations by simple accumulations, as multipliers are the most space and power-hungry components of the digital implementation of neural networks. We introduce BinaryConnect, a method which consists in training a DNN with binary weights during the forward and backward propagations, while retaining precision of the stored weights in which gradients are accumulated. Like other dropout schemes, we show that BinaryConnect acts as regularizer and we obtain near state-of-the-art results with BinaryConnect on the permutation-invariant MNIST, CIFAR-10 and SVHN.
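The training scheme the abstract describes can be sketched in a few lines. The following is an illustrative toy (a single linear layer with squared error in NumPy, with hypothetical names such as `binarize` and `train_step`), not the authors' implementation: binary weights are used in the forward and backward passes, while gradient updates accumulate in the full-precision stored weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Full-precision stored weights, in which gradients are accumulated.
W_real = rng.normal(scale=0.1, size=(4, 3))

def binarize(w):
    """Deterministic binarization: the sign of the stored real-valued weights."""
    return np.where(w >= 0, 1.0, -1.0)

def train_step(x, y, W_real, lr=0.01):
    Wb = binarize(W_real)           # binary weights used in forward/backward
    y_hat = x @ Wb                  # forward pass with binary weights
    grad_out = 2.0 * (y_hat - y)    # dL/dy_hat for squared error
    grad_W = np.outer(x, grad_out)  # gradient w.r.t. the weights
    W_new = W_real - lr * grad_W    # update accumulates in the real weights
    return np.clip(W_new, -1.0, 1.0)  # keep stored weights bounded in [-1, 1]

x = rng.normal(size=4)
y = rng.normal(size=3)
W_real = train_step(x, y, W_real)
```

At inference or deployment time, only `binarize(W_real)` would be needed, which is where the replacement of multiply-accumulates by simple accumulations comes from.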

1,311 citations

Journal ArticleDOI
TL;DR: USEtox, as discussed by the authors, is a scientific consensus model that contains only the most influential model elements; it has been used to calculate characterisation factors (CFs) for several thousand substances and forms the basis of the UNEP-SETAC Life Cycle Initiative's recommendations regarding characterisation of toxic impacts in life cycle assessment.
Abstract: In 2005, a comprehensive comparison of life cycle impact assessment toxicity characterisation models was initiated by the United Nations Environment Program (UNEP)–Society for Environmental Toxicology and Chemistry (SETAC) Life Cycle Initiative, directly involving the model developers of CalTOX, IMPACT 2002, USES-LCA, BETR, EDIP, WATSON and EcoSense. In this paper, we describe this model comparison process and its results—in particular the scientific consensus model developed by the model developers. The main objectives of this effort were (1) to identify specific sources of differences between the models’ results and structure, (2) to detect the indispensable model components and (3) to build a scientific consensus model from them, representing recommended practice. A chemical test set of 45 organics covering a wide range of property combinations was selected for this purpose. All models used this set. In three workshops, the model comparison participants identified key fate, exposure and effect issues via comparison of the final characterisation factors and selected intermediate outputs for fate, human exposure and toxic effects for the test set applied to all models. Through this process, we were able to reduce inter-model variation from an initial range of up to 13 orders of magnitude down to no more than two orders of magnitude for any substance. This led to the development of USEtox, a scientific consensus model that contains only the most influential model elements. These were, for example, process formulations accounting for intermittent rain, defining a closed or open system environment or nesting an urban box in a continental box. The precision of the new characterisation factors (CFs) is within a factor of 100–1,000 for human health and 10–100 for freshwater ecotoxicity of all other models compared to 12 orders of magnitude variation between the CFs of each model, respectively. 
The achieved reduction of inter-model variability by up to 11 orders of magnitude is a significant improvement. USEtox provides a parsimonious and transparent tool for human health and ecosystem CF estimates. Based on a referenced database, it has now been used to calculate CFs for several thousand substances and forms the basis of the recommendations from UNEP-SETAC’s Life Cycle Initiative regarding characterisation of toxic impacts in life cycle assessment. We provide both recommended and interim (not recommended and to be used with caution) characterisation factors for human health and freshwater ecotoxicity impacts. After a process of consensus building among stakeholders on a broad scale as well as several improvements regarding a wider and easier applicability of the model, USEtox will become available to practitioners for the calculation of further CFs.

1,304 citations

Journal ArticleDOI
TL;DR: The McKean-Vlasov NCE method presented in this paper has a close connection with the statistical physics of large particle systems: both identify a consistency relationship between the individual agent at the microscopic level and the mass of individuals at the macroscopic level.
Abstract: We consider stochastic dynamic games in large population conditions where multiclass agents are weakly coupled via their individual dynamics and costs. We approach this large population game problem by the so-called Nash Certainty Equivalence (NCE) Principle which leads to a decentralized control synthesis. The McKean-Vlasov NCE method presented in this paper has a close connection with the statistical physics of large particle systems: both identify a consistency relationship between the individual agent (or particle) at the microscopic level and the mass of individuals (or particles) at the macroscopic level. The overall game is decomposed into (i) an optimal control problem whose Hamilton-Jacobi-Bellman (HJB) equation determines the optimal control for each individual and which involves a measure corresponding to the mass effect, and (ii) a family of McKean-Vlasov (M-V) equations which also depend upon this measure. We designate the NCE Principle as the property that the resulting scheme is consistent (or soluble), i.e. the prescribed control laws produce sample paths which produce the mass effect measure. By construction, the overall closed-loop behaviour is such that each agent’s behaviour is optimal with respect to all other agents in the game theoretic Nash sense.
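The decomposition described in (i) and (ii) can be written schematically as a coupled backward-forward system; the notation below is a generic mean-field-game form for illustration, not the paper's exact equations (drift f, running cost L, noise intensity σ, and mass measure μ are assumed symbols):

```latex
% (i) Backward HJB equation for the representative agent's value function V,
% given the mass effect measure \mu_t:
-\partial_t V(t,x) =
  \inf_{u}\Big\{ f(x,u,\mu_t)\cdot\nabla_x V(t,x) + L(x,u,\mu_t) \Big\}
  + \tfrac{\sigma^2}{2}\,\Delta_x V(t,x)

% (ii) Forward McKean-Vlasov (Fokker-Planck) equation for the mass \mu_t,
% driven by the optimal feedback u^*(t,x) obtained from the HJB equation:
\partial_t \mu_t =
  -\nabla_x\!\cdot\big( f(x,u^*(t,x),\mu_t)\,\mu_t \big)
  + \tfrac{\sigma^2}{2}\,\Delta_x \mu_t
```

The NCE Principle is then the fixed-point (consistency) requirement: the control laws u* derived assuming the measure μ must generate sample paths whose empirical distribution reproduces that same μ.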

1,195 citations

Journal ArticleDOI
TL;DR: The FactSage computer package, as discussed by the authors, consists of a series of information, calculation and manipulation modules that enable one to access and manipulate compound and solution databases, perform a wide variety of thermochemical calculations, and generate tables, graphs and figures of interest.
Abstract: The FactSage computer package consists of a series of information, calculation and manipulation modules that enable one to access and manipulate compound and solution databases. With the various modules running under Microsoft Windows® one can perform a wide variety of thermochemical calculations and generate tables, graphs and figures of interest to chemical and physical metallurgists, chemical engineers, corrosion engineers, inorganic chemists, geochemists, ceramists, electrochemists, environmentalists, etc. This paper presents a summary of the developments in the FactSage thermochemical software and databases during the last six years. Particular emphasis is placed on the new databases and developments in calculating and manipulating phase diagrams.

1,175 citations

Proceedings ArticleDOI
21 Jul 2017
TL;DR: In this article, the authors extend DenseNets to semantic segmentation and achieve state-of-the-art results on urban scene benchmark datasets such as CamVid and Gatech, without any further post-processing module nor pretraining.
Abstract: State-of-the-art approaches for semantic image segmentation are built on Convolutional Neural Networks (CNNs). The typical segmentation architecture is composed of (a) a downsampling path responsible for extracting coarse semantic features, followed by (b) an upsampling path trained to recover the input image resolution at the output of the model and, optionally, (c) a post-processing module (e.g. Conditional Random Fields) to refine the model predictions.

Recently, a new CNN architecture, Densely Connected Convolutional Networks (DenseNets), has shown excellent results on image classification tasks. The idea of DenseNets is based on the observation that if each layer is directly connected to every other layer in a feed-forward fashion, then the network will be more accurate and easier to train.

In this paper, we extend DenseNets to deal with the problem of semantic segmentation. We achieve state-of-the-art results on urban scene benchmark datasets such as CamVid and Gatech, without any further post-processing module or pretraining. Moreover, due to the smart construction of the model, our approach has far fewer parameters than the currently published best entries for these datasets.
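The dense connectivity pattern the abstract refers to — each layer receiving the concatenation of all preceding feature maps — can be sketched as follows. This is a toy NumPy illustration with a hypothetical `layer` stand-in (real DenseNets use convolution-batchnorm-ReLU layers), not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(features, out_dim=2):
    """Stand-in for one conv layer: a random nonlinear map of its input."""
    w = rng.normal(scale=0.1, size=(features.shape[-1], out_dim))
    return np.tanh(features @ w)

def dense_block(x, num_layers=3):
    # Each layer takes the concatenation of ALL preceding feature maps,
    # and its own output is appended for every subsequent layer to reuse.
    features = [x]
    for _ in range(num_layers):
        out = layer(np.concatenate(features, axis=-1))
        features.append(out)
    # The block's output is the concatenation of input and all layer outputs.
    return np.concatenate(features, axis=-1)

x = rng.normal(size=(5, 4))  # 5 spatial positions, 4 input channels
y = dense_block(x)           # output channels: 4 + 3 layers * 2 = 10
```

Because each layer only adds a small number of new channels while reusing all earlier ones, parameter counts grow slowly, which is the source of the parameter efficiency mentioned in the abstract.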

1,163 citations


Authors

Showing all 8139 results

Name | H-index | Papers | Citations
Yoshua Bengio | 202 | 1,033 | 420,313
Claude Leroy | 135 | 1,170 | 88,604
Lucie Gauthier | 132 | 679 | 64,794
Reyhaneh Rezvani | 120 | 638 | 61,776
M. Giunta | 115 | 608 | 66,189
Alain Dufresne | 111 | 358 | 45,904
David Brown | 105 | 1,257 | 46,827
Pierre Legendre | 98 | 366 | 82,995
Michel Bouvier | 97 | 396 | 31,267
Aharon Gedanken | 96 | 861 | 38,974
Michel Gendreau | 94 | 456 | 36,253
Frederick Dallaire | 93 | 475 | 31,049
Pierre Savard | 93 | 427 | 42,186
Nader Engheta | 89 | 619 | 35,204
Ke Wu | 87 | 1,242 | 33,226
Network Information
Related Institutions (5)
Delft University of Technology
94.4K papers, 2.7M citations

93% related

Royal Institute of Technology
68.4K papers, 1.9M citations

93% related

Georgia Institute of Technology
119K papers, 4.6M citations

93% related

University of Waterloo
93.9K papers, 2.9M citations

93% related

École Polytechnique Fédérale de Lausanne
98.2K papers, 4.3M citations

92% related

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 40
2022 | 276
2021 | 1,275
2020 | 1,207
2019 | 1,140
2018 | 1,102