
Kyle Kastner

Researcher at Université de Montréal

Publications: 28
Citations: 2873

Kyle Kastner is an academic researcher from Université de Montréal. He has contributed to research on topics including recurrent neural networks and artificial neural networks. He has an h-index of 14 and has co-authored 24 publications receiving 2402 citations. His previous affiliations include the Salk Institute for Biological Studies.

Papers
Posted Content

A Recurrent Latent Variable Model for Sequential Data

TL;DR: In this article, the authors explore the inclusion of latent random variables in the dynamic hidden state of a recurrent neural network (RNN) by combining elements of the variational autoencoder.
Proceedings Article

A recurrent latent variable model for sequential data

TL;DR: It is argued that through the use of high-level latent random variables, the variational RNN (VRNN) can model the kind of variability observed in highly structured sequential data such as natural speech.
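The step structure described here, a VAE-style latent variable injected into each recurrence, can be sketched as follows. This is a minimal, illustrative PyTorch rendering; the layer sizes and module names are assumptions for the sketch, not taken from the paper.

```python
# Illustrative VRNN step: prior from h_{t-1}, posterior from (x_t, h_{t-1}),
# reparameterized sample z_t, decode, then a deterministic GRU update.
import torch
import torch.nn as nn


class VRNNCell(nn.Module):
    def __init__(self, x_dim, z_dim, h_dim):
        super().__init__()
        # Prior over z_t conditioned on the previous hidden state h_{t-1}
        self.prior = nn.Linear(h_dim, 2 * z_dim)
        # Approximate posterior over z_t conditioned on x_t and h_{t-1}
        self.posterior = nn.Linear(x_dim + h_dim, 2 * z_dim)
        # Decoder maps (z_t, h_{t-1}) back to the observation space
        self.decoder = nn.Linear(z_dim + h_dim, x_dim)
        # Deterministic recurrence takes both x_t and z_t as input
        self.rnn = nn.GRUCell(x_dim + z_dim, h_dim)

    def forward(self, x_t, h):
        prior_mu, prior_logvar = self.prior(h).chunk(2, dim=-1)
        post_mu, post_logvar = self.posterior(torch.cat([x_t, h], -1)).chunk(2, dim=-1)
        # Reparameterization trick, as in a VAE
        z_t = post_mu + torch.randn_like(post_mu) * (0.5 * post_logvar).exp()
        x_recon = self.decoder(torch.cat([z_t, h], -1))
        h_next = self.rnn(torch.cat([x_t, z_t], -1), h)
        return x_recon, (prior_mu, prior_logvar), (post_mu, post_logvar), h_next
```

Training would maximize a sequence-level evidence lower bound built from the reconstruction term and the KL divergence between posterior and prior at each step.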
Proceedings Article

Char2Wav: End-to-End Speech Synthesis

TL;DR: Char2Wav is an end-to-end model for speech synthesis that learns to produce audio directly from text; its reader is a bidirectional recurrent neural network with attention that produces vocoder acoustic features.
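As a rough illustration of the reader described above (a bidirectional recurrent encoder over characters plus an attention-based decoder emitting vocoder acoustic features), the following PyTorch sketch uses assumed dimensions and names and omits the neural-vocoder stage that turns the features into audio.

```python
# Illustrative Char2Wav-style "reader": bidirectional character encoder and
# an attentive decoder that emits vocoder acoustic features step by step.
import torch
import torch.nn as nn


class Reader(nn.Module):
    def __init__(self, n_chars, emb_dim=128, enc_dim=256, feat_dim=80):
        super().__init__()
        self.embed = nn.Embedding(n_chars, emb_dim)
        # Bidirectional encoder over the character sequence
        self.encoder = nn.GRU(emb_dim, enc_dim, bidirectional=True, batch_first=True)
        self.decoder = nn.GRUCell(feat_dim + 2 * enc_dim, 2 * enc_dim)
        self.attn = nn.Linear(2 * enc_dim, 2 * enc_dim)
        self.proj = nn.Linear(2 * enc_dim, feat_dim)

    def forward(self, chars, n_steps):
        enc, _ = self.encoder(self.embed(chars))           # (B, T, 2*enc_dim)
        h = enc.new_zeros(chars.size(0), enc.size(-1))
        feat = enc.new_zeros(chars.size(0), self.proj.out_features)
        outputs = []
        for _ in range(n_steps):
            # Content-based attention over encoder states
            scores = torch.bmm(enc, self.attn(h).unsqueeze(-1)).squeeze(-1)
            context = torch.bmm(scores.softmax(-1).unsqueeze(1), enc).squeeze(1)
            h = self.decoder(torch.cat([feat, context], -1), h)
            feat = self.proj(h)
            outputs.append(feat)
        return torch.stack(outputs, dim=1)                  # (B, n_steps, feat_dim)
```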
Posted Content

ReNet: A Recurrent Neural Network Based Alternative to Convolutional Networks

TL;DR: The proposed network, called ReNet, replaces the ubiquitous convolution+pooling layer of the deep convolutional neural network with four recurrent neural networks that sweep horizontally and vertically in both directions across the image.
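A compact PyTorch sketch of one such layer is given below: bidirectional recurrent sweeps over columns and then over rows stand in for a convolution+pooling layer. The real model sweeps over non-overlapping patches rather than single pixels; sizes and names here are assumptions for illustration only.

```python
# Illustrative ReNet-style layer: vertical then horizontal bidirectional sweeps.
import torch
import torch.nn as nn


class ReNetLayer(nn.Module):
    def __init__(self, in_ch, hid):
        super().__init__()
        # Two bidirectional sweeps: up/down over columns, then left/right over rows
        self.vertical = nn.GRU(in_ch, hid, bidirectional=True, batch_first=True)
        self.horizontal = nn.GRU(2 * hid, hid, bidirectional=True, batch_first=True)

    def forward(self, x):                        # x: (B, C, H, W)
        b, c, h, w = x.shape
        # Sweep each column vertically (both directions)
        cols = x.permute(0, 3, 2, 1).reshape(b * w, h, c)
        v, _ = self.vertical(cols)               # (B*W, H, 2*hid)
        v = v.reshape(b, w, h, -1)
        # Sweep each row horizontally (both directions)
        rows = v.permute(0, 2, 1, 3).reshape(b * h, w, -1)
        out, _ = self.horizontal(rows)           # (B*H, W, 2*hid)
        return out.reshape(b, h, w, -1).permute(0, 3, 1, 2)   # (B, 2*hid, H, W)
```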
Proceedings Article

ReSeg: A Recurrent Neural Network-Based Model for Semantic Segmentation

TL;DR: In this article, the authors proposed a structured prediction architecture which exploits the local generic features extracted by Convolutional Neural Networks and the capacity of Recurrent Neural Networks (RNNs) to retrieve distant dependencies.
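The pipeline can be sketched as a small PyTorch module: a convolutional front end for local features, recurrent sweeps (as in the ReNet layer above) to propagate context across the whole feature map, and upsampling to per-pixel class scores. The small conv stack here stands in for the pretrained CNN used in the paper, and all sizes and names are assumptions for the sketch.

```python
# Illustrative ReSeg-style segmentation pipeline: CNN features -> recurrent
# sweeps for long-range context -> upsampling to per-pixel class logits.
import torch
import torch.nn as nn


class ReSegSketch(nn.Module):
    def __init__(self, in_ch=3, feat=64, hid=64, n_classes=21):
        super().__init__()
        # Stand-in for the pretrained convolutional feature extractor
        self.cnn = nn.Sequential(
            nn.Conv2d(in_ch, feat, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(feat, feat, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Bidirectional sweeps over the feature map capture distant dependencies
        self.vertical = nn.GRU(feat, hid, bidirectional=True, batch_first=True)
        self.horizontal = nn.GRU(2 * hid, hid, bidirectional=True, batch_first=True)
        # Upsample back to the input resolution and predict class scores
        self.up = nn.ConvTranspose2d(2 * hid, n_classes, 4, stride=4)

    def forward(self, x):                         # x: (B, 3, H, W), H and W divisible by 4
        f = self.cnn(x)                           # (B, feat, H/4, W/4)
        b, c, h, w = f.shape
        cols = f.permute(0, 3, 2, 1).reshape(b * w, h, c)
        v, _ = self.vertical(cols)
        rows = v.reshape(b, w, h, -1).permute(0, 2, 1, 3).reshape(b * h, w, -1)
        hmap, _ = self.horizontal(rows)
        grid = hmap.reshape(b, h, w, -1).permute(0, 3, 1, 2)   # (B, 2*hid, H/4, W/4)
        return self.up(grid)                      # (B, n_classes, H, W)
```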