Renjie Liao
Researcher at University of Toronto
Publications - 98
Citations - 5891
Renjie Liao is an academic researcher from the University of Toronto. He has contributed to research in the areas of computer science and artificial neural networks. He has an h-index of 35 and has co-authored 83 publications receiving 4067 citations. Previous affiliations of Renjie Liao include Beihang University and The Chinese University of Hong Kong.
Papers
Proceedings ArticleDOI
Detail-Revealing Deep Video Super-Resolution
TL;DR: This paper proposes a sub-pixel motion compensation (SPMC) layer that fuses multiple frames to reveal image details, generating visually and quantitatively high-quality results without the need for parameter tuning.
Proceedings ArticleDOI
3D Graph Neural Networks for RGBD Semantic Segmentation
TL;DR: This paper proposes a 3D graph neural network (3DGNN) that builds a k-nearest-neighbor graph on top of a 3D point cloud and is trained with back-propagation through time.
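As a hedged illustration (not the authors' code), the k-nearest-neighbor graph construction underlying a model like 3DGNN can be sketched with brute-force pairwise distances; the function name and array shapes here are illustrative assumptions:

```python
import numpy as np

def knn_graph(points, k=3):
    """Build a k-nearest-neighbor graph over 3D points.

    points: (N, 3) array of point coordinates.
    Returns an (N, k) array where row i holds the indices of the
    k nearest neighbors of point i (self-edges excluded).
    """
    # Pairwise squared Euclidean distances, shape (N, N).
    diff = points[:, None, :] - points[None, :, :]
    dist2 = (diff ** 2).sum(-1)
    # Exclude self-edges by pushing the diagonal to infinity.
    np.fill_diagonal(dist2, np.inf)
    # Take the k smallest distances per row as neighbor indices.
    return np.argsort(dist2, axis=1)[:, :k]

# Example: 5 random 3D points, 2 neighbors each.
rng = np.random.default_rng(0)
pts = rng.standard_normal((5, 3))
neighbors = knn_graph(pts, k=2)
print(neighbors.shape)  # (5, 2)
```

In practice a k-d tree (e.g. `scipy.spatial.cKDTree`) would replace the O(N^2) distance matrix for large point clouds.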
Proceedings ArticleDOI
UPSNet: A Unified Panoptic Segmentation Network
TL;DR: This paper proposes UPSNet, a unified panoptic segmentation network that resolves conflicts between semantic and instance segmentation by combining a deformable-convolution-based semantic head with a Mask R-CNN style instance head.
Proceedings ArticleDOI
GeoNet: Geometric Neural Network for Joint Depth and Surface Normal Estimation
TL;DR: The proposed Geometric Neural Network (GeoNet) jointly predicts depth and surface normal maps from a single image; it achieves top performance on surface normal estimation and is on par with state-of-the-art depth estimation methods.
Book ChapterDOI
Learning Lane Graph Representations for Motion Forecasting
TL;DR: A motion forecasting model that exploits a novel structured map representation as well as actor-map interactions to predict accurate and realistic multi-modal trajectories, significantly outperforming the state of the art on the large-scale Argoverse motion forecasting benchmark.