
Kai-Hung Chang

Researcher at Autodesk

Publications: 14
Citations: 366

Kai-Hung Chang is an academic researcher from Autodesk. His research contributions span the topics Floorplan and Graph (abstract data type). He has an h-index of 7 and has co-authored 14 publications receiving 178 citations. Previous affiliations of Kai-Hung Chang include Academia Sinica and Carnegie Mellon University.

Papers
Posted Content

House-GAN: Relational Generative Adversarial Networks for Graph-constrained House Layout Generation

TL;DR: A novel graph-constrained generative adversarial network is proposed, whose generator and discriminator are built upon a relational architecture, encoding the input constraint into the graph structure of its relational networks.
Book Chapter

House-GAN: Relational Generative Adversarial Networks for Graph-Constrained House Layout Generation

TL;DR: In this article, a novel graph-constrained generative adversarial network is proposed, whose generator and discriminator are built upon relational architecture, encoding the constraint into the graph structure of its relational networks.
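The graph constraint described in the two House-GAN entries above can be made concrete with a small sketch: rooms become graph nodes, required adjacencies become edges, and a relational layer updates each room's features from its neighbors. This is an illustrative toy in plain NumPy, not the paper's actual architecture; the feature dimensions, aggregation rule, and names are assumptions.

```python
import numpy as np

def message_pass(node_feats, edges):
    """One round of mean-aggregation message passing over a room-adjacency graph.

    node_feats: (N, D) array of per-room features.
    edges: list of (i, j) undirected adjacency pairs from the input bubble diagram.
    Returns an (N, D) array where each room averages its own features
    with those of its adjacent rooms, so the constraint graph shapes the output.
    """
    n = node_feats.shape[0]
    neighbors = {i: [i] for i in range(n)}  # each node aggregates over itself too
    for i, j in edges:
        neighbors[i].append(j)
        neighbors[j].append(i)
    return np.stack([node_feats[neighbors[i]].mean(axis=0) for i in range(n)])

# Toy bubble diagram: living room (0) adjacent to kitchen (1) and bedroom (2).
feats = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])
edges = [(0, 1), (0, 2)]
out = message_pass(feats, edges)
```

In a relational GAN, layers of this kind appear in both the generator and the discriminator, so that generated room shapes and the realism judgment both respect the adjacency graph supplied as a constraint.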
Journal Article

Interactive design of animated plushies

TL;DR: A computational approach to creating animated plushies is presented: soft robotic plush toys specifically designed to reenact user-authored motions, inspired by the muscular hydrostat structures that drive highly versatile motions in many biological systems.
Proceedings Article

House-GAN++: Generative Adversarial Layout Refinement Network towards Intelligent Computational Agent for Professional Architects

TL;DR: In this paper, a generative adversarial layout refinement network is proposed for automated floorplan generation, where a previously generated layout becomes the next input constraint, enabling iterative refinement.
Proceedings Article

Control of Tendon-Driven Soft Foam Robot Hands

TL;DR: A novel approach to controlling multi-fingered tendon-driven foam hands using a CyberGlove and a simple ridge regression model is presented; the results achieved include complex posing, dexterous grasping, and in-hand manipulation.
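The ridge regression model mentioned above maps glove sensor readings to tendon commands. A minimal sketch of such a fit, using the standard closed-form solution, looks like this; the sensor and tendon dimensions, the toy data, and all names here are illustrative assumptions, not the paper's calibration procedure.

```python
import numpy as np

def ridge_fit(X, Y, lam=1.0):
    """Closed-form ridge regression: W = (X'X + lam*I)^{-1} X'Y.

    X: (n_samples, n_sensors) glove readings; Y: (n_samples, n_tendons) targets.
    The regularizer lam keeps the solve well-conditioned on correlated sensors.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

# Hypothetical toy calibration: 3 glove channels -> 2 tendon targets.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))        # simulated glove sensor readings
W_true = np.array([[1.0, 0.0],
                   [0.0, 2.0],
                   [0.5, -1.0]])        # ground-truth linear map (demo only)
Y = X @ W_true                          # corresponding tendon commands
W = ridge_fit(X, Y, lam=1e-6)           # recovers W_true up to tiny shrinkage
```

At run time, a fitted `W` turns each new glove reading `x` into tendon commands via `x @ W`, which is cheap enough for interactive teleoperation of the foam hand.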