Institution
École Polytechnique de Montréal
Education • Montreal, Quebec, Canada
About: École Polytechnique de Montréal is an education organization based in Montreal, Quebec, Canada. It is known for its research contributions in the topics: Finite element method & Computer science. The organization has 8015 authors who have published 18390 publications receiving 494372 citations.
Papers published on a yearly basis
Papers
TL;DR: This study aims to develop a robust and fully automated tissue classification method by using convolutional neural networks (CNNs) as the feature extractor and comparing the predictions of three state-of-the-art classifiers: CNN, random forest (RF), and support vector machine (SVM).
Abstract: Kawasaki disease (KD) is an acute childhood disease complicated by coronary artery aneurysms, intima thickening, thrombi, stenosis, lamellar calcifications, and disappearance of the media border. Automatic classification of the coronary artery layers (intima, media, and scar features) is important for analyzing optical coherence tomography (OCT) images recorded in pediatric patients. OCT is an intracoronary imaging modality using near-infrared light that has recently been used to image the inner coronary artery tissues of pediatric patients, providing high spatial resolution (ranging from 10 to 20 μm). This study aims to develop a robust and fully automated tissue classification method by using convolutional neural networks (CNNs) as the feature extractor and comparing the predictions of three state-of-the-art classifiers: CNN, random forest (RF), and support vector machine (SVM). The results show the robustness of the CNN as the feature extractor and of the random forest as the classifier, with a classification rate of up to 96%, particularly in characterizing the second layer of the coronary arteries (the media), a very thin layer that is challenging to recognize and distinguish from other tissues.
105 citations
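The two-stage pipeline described above (a network that extracts features, a separate classifier that labels them) can be sketched in miniature. Nothing below comes from the paper: `extract_features` is a hypothetical stand-in for the CNN, and a nearest-centroid rule stands in for the random forest, so the code only illustrates the extractor-plus-classifier structure, not the authors' method.

```python
def extract_features(patch):
    """Stand-in for the CNN: map a raw image patch (a list of pixel
    intensities) to a small feature vector (mean and spread)."""
    mean = sum(patch) / len(patch)
    spread = max(patch) - min(patch)
    return (mean, spread)

class NearestCentroidClassifier:
    """Stand-in for the random forest: predict the label whose
    training-feature centroid is closest to the query features."""
    def fit(self, feature_vectors, labels):
        self.centroids = {}
        for label in set(labels):
            rows = [f for f, l in zip(feature_vectors, labels) if l == label]
            self.centroids[label] = tuple(
                sum(col) / len(col) for col in zip(*rows)
            )
        return self

    def predict(self, features):
        def dist(centroid):
            return sum((a - b) ** 2 for a, b in zip(features, centroid))
        return min(self.centroids, key=lambda lab: dist(self.centroids[lab]))

# Toy "patches": intima-like patches bright, media-like patches dark.
train_patches = [[200, 210, 205], [198, 202, 207], [40, 45, 50], [38, 42, 47]]
train_labels = ["intima", "intima", "media", "media"]

clf = NearestCentroidClassifier().fit(
    [extract_features(p) for p in train_patches], train_labels
)
print(clf.predict(extract_features([199, 204, 206])))  # → intima
```

In the study itself both stages are far richer (learned convolutional features, an ensemble of decision trees), but the division of labor is the same: the extractor turns raw pixels into a compact representation, and the classifier only ever sees that representation.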
15 Nov 2017
TL;DR: This work unifies successful ideas from recently proposed architectures into a stochastic recurrent model that achieves state-of-the-art results on standard speech benchmarks such as TIMIT and Blizzard and competitive performance on sequential MNIST.
Abstract: Many efforts have been devoted to training generative latent variable models with autoregressive decoders, such as recurrent neural networks (RNNs). Stochastic recurrent models have been successful in capturing the variability observed in natural sequential data such as speech. We unify successful ideas from recently proposed architectures into a stochastic recurrent model: each step in the sequence is associated with a latent variable that is used to condition the recurrent dynamics for future steps. Training is performed with amortised variational inference, where the approximate posterior is augmented with an RNN that runs backward through the sequence. In addition to maximizing the variational lower bound, we ease training of the latent variables by adding an auxiliary cost which forces them to reconstruct the state of the backward recurrent network. This provides the latent variables with a task-independent objective that enhances the performance of the overall model. We found this strategy to perform better than alternative approaches such as KL annealing. Although conceptually simple, our model achieves state-of-the-art results on standard speech benchmarks such as TIMIT and Blizzard and competitive performance on sequential MNIST. Finally, we apply our model to language modeling on the IMDB dataset, where the auxiliary cost helps in learning interpretable latent variables.
105 citations
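The training objective in the abstract combines three parts: the reconstruction term and KL term of the variational lower bound, plus an auxiliary cost tying each latent variable to the backward RNN's state at that step. The toy sketch below only shows how those pieces are summed; the vectors, the weight `alpha`, and the squared-error form of the auxiliary term are all hypothetical stand-ins, since the real model trains RNNs rather than fixed lists of numbers.

```python
def squared_error(u, v):
    """Sum of squared differences between two equal-length vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v))

def training_loss(nll_terms, kl_terms, latents, backward_states, alpha=0.1):
    """Negative ELBO (reconstruction NLL + KL) plus a weighted auxiliary
    cost that asks each latent z_t to reconstruct the backward state b_t."""
    neg_elbo = sum(nll_terms) + sum(kl_terms)
    aux = sum(squared_error(z, b) for z, b in zip(latents, backward_states))
    return neg_elbo + alpha * aux

# Per-timestep toy values for a length-3 sequence.
nll = [1.2, 0.9, 1.1]                       # reconstruction NLL terms
kl = [0.3, 0.2, 0.25]                       # KL(q(z_t) || p(z_t)) terms
z = [[0.5, 0.1], [0.2, 0.4], [0.0, 0.3]]    # sampled latents
b = [[0.4, 0.0], [0.2, 0.5], [0.1, 0.3]]    # backward-RNN states

print(round(training_loss(nll, kl, z, b), 4))  # → 3.954
```

The point of the auxiliary term is visible even in this toy: it contributes a gradient signal to the latents that does not depend on the decoder, which is what the abstract means by a "task-independent objective".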
TL;DR: The findings indicate that the more reactive cyanide species initially associated with the solid tailings have naturally degraded within the mine tailings impoundment area, primarily through volatilization, leaching, and bacterial degradation.
105 citations
TL;DR: The synthesis of the state feedback controller that quadratically stabilizes the production dynamics and at the same time rejects the external demand fluctuation is cast as a set of linear matrix inequalities (LMIs).
105 citations
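The quadratic-stability condition behind the LMI formulation above can be checked numerically: with state feedback u = -Kx, the closed loop A_cl = A - BK is quadratically stable if there exists P = Pᵀ ≻ 0 with A_clᵀP + PA_cl ≺ 0. The 2×2 matrices below are a hypothetical example, not taken from the paper, and a real design would solve the LMI with a semidefinite-programming solver rather than verify a known P.

```python
def mat_mul(X, Y):
    """Plain matrix product of two nested-list matrices."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def transpose(X):
    return [list(row) for row in zip(*X)]

def mat_add(X, Y):
    return [[a + b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

def is_pos_def_2x2(S):
    """A symmetric 2x2 matrix is positive definite iff its trace and
    determinant are both positive."""
    trace = S[0][0] + S[1][1]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    return trace > 0 and det > 0

# Open-loop unstable plant and a stabilizing gain (hypothetical numbers).
A = [[0.0, 1.0], [2.0, -1.0]]
B = [[0.0], [1.0]]
K = [[6.0, 4.0]]
A_cl = mat_add(A, [[-x for x in row] for row in mat_mul(B, K)])

# Candidate Lyapunov matrix solving A_cl^T P + P A_cl = -I.
P = [[1.125, 0.125], [0.125, 0.125]]
M = mat_add(mat_mul(transpose(A_cl), P), mat_mul(P, A_cl))
neg_M = [[-m for m in row] for row in M]

print(is_pos_def_2x2(P), is_pos_def_2x2(neg_M))  # → True True
```

Since P ≻ 0 and A_clᵀP + PA_cl = -I ≺ 0, V(x) = xᵀPx is a Lyapunov function for the closed loop; the paper's contribution is to express the search for such P (together with demand-rejection constraints) as LMIs that a convex solver can handle.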
TL;DR: In this paper, a new local search methodology called Variable Space Search (VSS) was proposed for the k-coloring problem. It considers several search spaces, with various neighborhoods and objective functions, and moves from one to another when the search is blocked at a local optimum in a given search space.
105 citations
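One of the search spaces typically used in this setting is the space of complete (possibly conflicting) k-colorings, with the number of conflicting edges as the objective. The sketch below implements only that single space as a min-conflicts local search; the paper's Variable Space Search goes further by switching between several such spaces when one reaches a local optimum, so this is an illustration of the setting, not the authors' method.

```python
import random

def count_conflicts(edges, coloring):
    """Number of edges whose endpoints currently share a color."""
    return sum(1 for u, v in edges if coloring[u] == coloring[v])

def min_conflicts_k_coloring(edges, n_vertices, k, iterations=2000, seed=0):
    """Search complete k-colorings, repeatedly recoloring a conflicting
    vertex to a color that minimizes the total number of conflicts."""
    rng = random.Random(seed)
    coloring = [rng.randrange(k) for _ in range(n_vertices)]
    for _ in range(iterations):
        conflicting = [u for u, v in edges if coloring[u] == coloring[v]]
        if not conflicting:
            break  # a legal k-coloring has been found
        u = rng.choice(conflicting)
        # Score every color for u and move to a (possibly tied) best one;
        # accepting ties lets the search drift across plateaus.
        scores = []
        for c in range(k):
            trial = coloring[:u] + [c] + coloring[u + 1:]
            scores.append(count_conflicts(edges, trial))
        best = min(scores)
        coloring[u] = rng.choice([c for c in range(k) if scores[c] == best])
    return coloring

# A 4-cycle is 2-colorable, so the search should reach zero conflicts.
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
coloring = min_conflicts_k_coloring(cycle, 4, 2)
print(count_conflicts(cycle, coloring))
```

The weakness this single space exhibits (getting trapped on plateaus and local optima) is exactly what motivates switching to a differently shaped search space, which is the core idea of VSS.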
Authors
Showing all 8139 results
Name | H-index | Papers | Citations
---|---|---|---
Yoshua Bengio | 202 | 1033 | 420313 |
Claude Leroy | 135 | 1170 | 88604 |
Lucie Gauthier | 132 | 679 | 64794 |
Reyhaneh Rezvani | 120 | 638 | 61776 |
M. Giunta | 115 | 608 | 66189 |
Alain Dufresne | 111 | 358 | 45904 |
David Brown | 105 | 1257 | 46827 |
Pierre Legendre | 98 | 366 | 82995 |
Michel Bouvier | 97 | 396 | 31267 |
Aharon Gedanken | 96 | 861 | 38974 |
Michel Gendreau | 94 | 456 | 36253 |
Frederick Dallaire | 93 | 475 | 31049 |
Pierre Savard | 93 | 427 | 42186 |
Nader Engheta | 89 | 619 | 35204 |
Ke Wu | 87 | 1242 | 33226 |