Hao Su
Researcher at University of California, San Diego
Publications: 364
Citations: 82,843
Hao Su is an academic researcher at the University of California, San Diego. He has contributed to research in topics including computer science and point clouds. He has an h-index of 57 and has co-authored 302 publications receiving 55,902 citations. Previous affiliations of Hao Su include Philips and Jiangxi University of Science and Technology.
Papers
Book Chapter
Weakly-Supervised 3D Shape Completion in the Wild
Jiayuan Gu, Wei-Chiu Ma, Sivabalan Manivasagam, Wenyuan Zeng, Zihao Wang, Yuwen Xiong, Hao Su, Raquel Urtasun +7 more
TL;DR: In this paper, a weakly supervised method is proposed to estimate both 3D canonical shape and 6-DoF pose for alignment, given multiple partial observations associated with the same instance.
Posted Content
Towards Fast and Energy-Efficient Binarized Neural Network Inference on FPGA
TL;DR: Two fast and energy-efficient architectures for BNN inference are proposed, and analysis and insights are provided for choosing between the two strategies for different datasets and network models.
Proceedings Article
Machine Learning Based Adaptive Gait Phase Estimation Using Inertial Measurement Sensors
TL;DR: This paper presents a portable inertial measurement unit (IMU)-based motion sensing system and proposes an adaptive gait phase detection approach for non-steady-state walking and multi-activity monitoring; experiments demonstrate that the sensing suit can not only detect gait status in any transient state but also generalize to multiple activities.
Journal Article
ComplementMe: Weakly-Supervised Component Suggestions for 3D Modeling
TL;DR: In this article, a neural network architecture for suggesting complementary components and their placement for an incomplete 3D part assembly is proposed, which can be used to design complex shapes with minimal or no user input.
Proceedings Article
DeepMetaHandles: Learning Deformation Meta-Handles of 3D Meshes with Biharmonic Coordinates
TL;DR: DeepMetaHandles as discussed by the authors learns a set of meta-handles for each shape, represented as combinations of the given handles; the disentangled meta-handles factorize all plausible deformations of the shape, with each meta-handle corresponding to an intuitive deformation.