Institution

École Polytechnique de Montréal

Education | Montreal, Quebec, Canada
About: École Polytechnique de Montréal is an education organization based in Montreal, Quebec, Canada. It is known for its research contributions in the topics of Finite element method and Computer science. The organization has 8015 authors who have published 18390 publications, receiving 494372 citations.


Papers
Journal Article
TL;DR: This study aims to develop a robust and fully automated tissue classification method that uses convolutional neural networks (CNNs) as feature extractors and compares the predictions of three state-of-the-art classifiers: CNN, random forest (RF), and support vector machine (SVM).
Abstract: Kawasaki disease (KD) is an acute childhood disease complicated by coronary artery aneurysms, intima thickening, thrombi, stenosis, lamellar calcifications, and disappearance of the media border. Automatic classification of the coronary artery layers (intima, media, and scar features) is important for analyzing optical coherence tomography (OCT) images recorded in pediatric patients. OCT is an intracoronary imaging modality that uses near-infrared light; it has recently been used to image the inner coronary artery tissues of pediatric patients, providing high spatial resolution (ranging from 10 to 20 μm). This study aims to develop a robust and fully automated tissue classification method by using convolutional neural networks (CNNs) as feature extractors and comparing the predictions of three state-of-the-art classifiers: CNN, random forest (RF), and support vector machine (SVM). The results show the robustness of the CNN as the feature extractor and the random forest as the classifier, with a classification rate of up to 96%, especially in characterizing the second layer of the coronary arteries (media), which is very thin and challenging to recognize and distinguish from other tissues.

105 citations
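
A minimal sketch of the CNN-as-feature-extractor plus random-forest pipeline summarized above, assuming small grayscale patches and three tissue classes; the network architecture, patch size, and forest settings are illustrative stand-ins, not the configuration reported in the paper.

```python
# Illustrative only: a tiny CNN produces feature vectors, and a random forest
# classifies them, mirroring the "CNN as feature extractor + RF" setup above.
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestClassifier

class FeatureExtractor(nn.Module):
    """Small CNN whose flattened convolutional activations serve as features."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )

    def forward(self, x):               # x: (N, 1, 32, 32) image patches
        return self.conv(x).flatten(1)  # (N, 32 * 8 * 8) feature vectors

# Random tensors stand in for labelled OCT patches (intima / media / scar).
patches, labels = torch.randn(200, 1, 32, 32), torch.randint(0, 3, (200,))

extractor = FeatureExtractor().eval()
with torch.no_grad():
    features = extractor(patches).numpy()

# Train the random forest on the extracted features and predict a few patches.
rf = RandomForestClassifier(n_estimators=100).fit(features, labels.numpy())
print(rf.predict(features[:5]))
```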

Proceedings Article
15 Nov 2017
TL;DR: This work unifies successful ideas from recently proposed architectures into a stochastic recurrent model that achieves state-of-the-art results on standard speech benchmarks such as TIMIT and Blizzard and competitive performance on sequential MNIST.
Abstract: Many efforts have been devoted to training generative latent variable models with autoregressive decoders, such as recurrent neural networks (RNNs). Stochastic recurrent models have been successful in capturing the variability observed in natural sequential data such as speech. We unify successful ideas from recently proposed architectures into a stochastic recurrent model: each step in the sequence is associated with a latent variable that is used to condition the recurrent dynamics for future steps. Training is performed with amortised variational inference, where the approximate posterior is augmented with an RNN that runs backward through the sequence. In addition to maximizing the variational lower bound, we ease training of the latent variables by adding an auxiliary cost which forces them to reconstruct the state of the backward recurrent network. This provides the latent variables with a task-independent objective that enhances the performance of the overall model. We found this strategy to perform better than alternative approaches such as KL annealing. Although conceptually simple, our model achieves state-of-the-art results on standard speech benchmarks such as TIMIT and Blizzard and competitive performance on sequential MNIST. Finally, we apply our model to language modeling on the IMDB dataset, where the auxiliary cost helps in learning interpretable latent variables.

105 citations
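
A rough sketch of the training step described above, assuming Gaussian per-step latents, a GRU forward recurrence, and a mean-squared auxiliary loss; the dimensions, output distribution, and equal loss weighting are illustrative choices, not the paper's.

```python
# Illustrative training step: per-step Gaussian latents z_t, a posterior that
# also sees a backward-RNN state b_t, and an auxiliary loss pushing z_t to
# reconstruct b_t (on top of the usual reconstruction + KL terms).
import torch
import torch.nn as nn
import torch.nn.functional as F

x_dim, h_dim, z_dim, T, N = 20, 64, 16, 25, 8

fwd = nn.GRU(x_dim + z_dim, h_dim, batch_first=True)  # forward recurrence, stepped manually
bwd = nn.GRU(x_dim, h_dim, batch_first=True)          # backward RNN over the whole sequence
prior = nn.Linear(h_dim, 2 * z_dim)                   # p(z_t | h_{t-1})
post = nn.Linear(2 * h_dim, 2 * z_dim)                # q(z_t | h_{t-1}, b_t)
decode = nn.Linear(h_dim + z_dim, x_dim)              # reconstruction of x_t
aux = nn.Linear(z_dim, h_dim)                         # auxiliary head: z_t -> b_t

x = torch.randn(N, T, x_dim)                          # toy batch of sequences
b, _ = bwd(x.flip(1))                                 # run backward in time...
b = b.flip(1)                                         # ...so b[:, t] summarises x_t .. x_{T-1}

h = torch.zeros(1, N, h_dim)
recon = kl = aux_loss = 0.0
for t in range(T):
    mu_p, logv_p = prior(h[-1]).chunk(2, -1)
    mu_q, logv_q = post(torch.cat([h[-1], b[:, t]], -1)).chunk(2, -1)
    z = mu_q + torch.randn_like(mu_q) * (0.5 * logv_q).exp()         # reparameterised sample
    recon += F.mse_loss(decode(torch.cat([h[-1], z], -1)), x[:, t])
    kl += 0.5 * (logv_p - logv_q
                 + ((mu_q - mu_p) ** 2 + logv_q.exp()) / logv_p.exp() - 1).sum(-1).mean()
    aux_loss += F.mse_loss(aux(z), b[:, t].detach())                  # z_t must predict b_t
    _, h = fwd(torch.cat([x[:, t], z], -1).unsqueeze(1), h)           # condition future on z_t

loss = recon + kl + aux_loss   # variational lower bound terms plus the auxiliary cost
loss.backward()
```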

Journal Article
TL;DR: The findings indicate that the more reactive cyanide species initially associated with the solid tailings have naturally degraded within the mine tailings impoundment area, primarily through volatilization, leaching, and bacterial degradation.

105 citations

Journal Article
TL;DR: The synthesis of the state feedback controller that quadratically stabilizes the production dynamics and at the same time rejects the external demand fluctuation is cast as a set of linear matrix inequalities (LMIs).

105 citations
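
A hedged sketch of how the "cast as LMIs" step typically works, limited to the quadratic-stabilization part (the demand-rejection condition mentioned in the TL;DR is omitted) and using the textbook change of variables Y = KP rather than the paper's exact formulation: for production dynamics modeled as x' = Ax + Bu with state feedback u = Kx, one seeks

```latex
% Textbook quadratic-stabilization LMI (illustrative, not the paper's exact conditions):
% find a symmetric matrix P and a matrix Y such that
\begin{aligned}
  P &\succ 0, \\
  A P + P A^{\mathsf{T}} + B Y + Y^{\mathsf{T}} B^{\mathsf{T}} &\prec 0,
\end{aligned}
\qquad \text{and recover the gain as } K = Y P^{-1}.
```

Both conditions are linear in the decision variables P and Y, so feasibility can be checked with any semidefinite-programming solver; the demand-rejection requirement described in the TL;DR would add further conditions to this set.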

Journal Article
TL;DR: In this paper, a new local search methodology, called Variable Space Search (VSS), was proposed to solve the k-coloring problem. The method considers several search spaces, with various neighborhoods and objective functions, and moves from one to another when the search is blocked at a local optimum in a given search space.

105 citations
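
A schematic sketch of the variable-space-search idea from the TL;DR above: descend in one space and, when blocked at a local optimum, hand the incumbent to another space before continuing. The two "spaces" here (complete color assignments scored by conflict count, and a partition-level perturbation) are simplified stand-ins, not the search spaces actually used in the paper.

```python
# Schematic variable space search for k-coloring: alternate between a
# conflict-minimising descent and a second, partition-level move when stuck.
import random

def conflicts(coloring, edges):
    """Objective in the complete-assignment space: number of monochromatic edges."""
    return sum(coloring[u] == coloring[v] for u, v in edges)

def descend(coloring, edges, k, iters=500):
    """Simple 1-move descent: recolor a conflicting vertex while that improves the objective."""
    for _ in range(iters):
        bad = [u for u, v in edges if coloring[u] == coloring[v]]
        if not bad:
            return coloring, False                      # proper coloring reached
        u = random.choice(bad)
        best = min(range(k), key=lambda c: conflicts({**coloring, u: c}, edges))
        if conflicts({**coloring, u: best}, edges) >= conflicts(coloring, edges):
            return coloring, True                       # blocked at a local optimum
        coloring[u] = best
    return coloring, True

def switch_space(coloring, k):
    """Stand-in for moving to a second space: break up one colour class
    before handing the solution back to the assignment space."""
    c = random.randrange(k)
    for v in [v for v, col in coloring.items() if col == c]:
        coloring[v] = random.randrange(k)
    return coloring

def variable_space_search(vertices, edges, k, rounds=50):
    coloring = {v: random.randrange(k) for v in vertices}
    for _ in range(rounds):
        coloring, blocked = descend(coloring, edges, k)
        if conflicts(coloring, edges) == 0:
            return coloring                             # found a proper k-coloring
        if blocked:
            coloring = switch_space(coloring, k)
    return coloring

edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4), (4, 0)]
print(variable_space_search(range(5), edges, k=3))
```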


Authors

Showing all 8139 results

Name                 H-index   Papers   Citations
Yoshua Bengio        202       1033     420313
Claude Leroy         135       1170     88604
Lucie Gauthier       132       679      64794
Reyhaneh Rezvani     120       638      61776
M. Giunta            115       608      66189
Alain Dufresne       111       358      45904
David Brown          105       1257     46827
Pierre Legendre      98        366      82995
Michel Bouvier       97        396      31267
Aharon Gedanken      96        861      38974
Michel Gendreau      94        456      36253
Frederick Dallaire   93        475      31049
Pierre Savard        93        427      42186
Nader Engheta        89        619      35204
Ke Wu                87        1242     33226
Network Information
Related Institutions (5)
Delft University of Technology: 94.4K papers, 2.7M citations, 93% related
Royal Institute of Technology: 68.4K papers, 1.9M citations, 93% related
Georgia Institute of Technology: 119K papers, 4.6M citations, 93% related
University of Waterloo: 93.9K papers, 2.9M citations, 93% related
École Polytechnique Fédérale de Lausanne: 98.2K papers, 4.3M citations, 92% related

Performance Metrics
No. of papers from the Institution in previous years
Year   Papers
2023   40
2022   276
2021   1,275
2020   1,207
2019   1,140
2018   1,102