Xiaoliang Dai
Researcher at Princeton University
Publications - 48
Citations - 2938
Xiaoliang Dai is an academic researcher from Princeton University. The author has contributed to research topics including computer science and artificial neural networks. The author has an h-index of 15 and has co-authored 37 publications receiving 1,587 citations. Previous affiliations of Xiaoliang Dai include Peking University and Facebook.
Papers
Proceedings ArticleDOI
FBNet: Hardware-Aware Efficient ConvNet Design via Differentiable Neural Architecture Search
Bichen Wu, Kurt Keutzer, Xiaoliang Dai, Peizhao Zhang, Yanghan Wang, Fei Sun, Yiming Wu, Yuandong Tian, Peter Vajda, Yangqing Jia, +9 more
TL;DR: This work proposes a differentiable neural architecture search (DNAS) framework that uses gradient-based methods to optimize ConvNet architectures, avoiding enumerating and training individual architectures separately as in previous methods.
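The core idea of DNAS can be illustrated with a toy sketch: each supernet layer computes a softmax-weighted mixture of candidate operators, so both the task loss and the expected latency become differentiable in the architecture parameters. The candidate ops, latencies, and parameter values below are invented for illustration and are not FBNet's actual API.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Hypothetical candidate operators for one supernet layer
# (toy stand-ins for real conv blocks).
candidate_ops = [
    lambda x: x * 0.5,   # e.g. a cheap depthwise block
    lambda x: x * 2.0,   # e.g. a wider, more expensive block
    lambda x: x,         # identity / skip
]
op_latency_ms = np.array([1.0, 4.0, 0.1])  # made-up per-op costs

# Architecture parameters, trained by gradient descent alongside the weights.
theta = np.array([0.2, 0.5, -0.1])
probs = softmax(theta)

def layer_output(x):
    # The supernet output is the probability-weighted mix of candidate ops,
    # which makes the architecture choice differentiable w.r.t. theta.
    return sum(p * op(x) for p, op in zip(probs, candidate_ops))

# Expected latency is likewise differentiable in theta; adding it to the
# task loss is what makes the search hardware-aware.
expected_latency = float(probs @ op_latency_ms)
```

After training, the highest-probability op per layer is kept, avoiding separate training of each enumerated architecture.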
Posted Content
Visual Transformers: Token-based Image Representation and Processing for Computer Vision
Bichen Wu, Chenfeng Xu, Xiaoliang Dai, Alvin Wan, Peizhao Zhang, Masayoshi Tomizuka, Kurt Keutzer, Peter Vajda, +7 more
TL;DR: This work represents an image as a small set of visual tokens and applies visual transformers to densely model relationships between visual semantic concepts; this paradigm of token-based image representation and processing substantially outperforms its convolutional counterparts on image classification and semantic segmentation.
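The token idea above can be sketched as spatial-attention pooling: a learned projection scores every pixel for each token, and each token is the attention-weighted sum of pixel features. Shapes, the random projection, and all names below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
H, W, C, L = 8, 8, 16, 4           # 8x8 feature map, 16 channels, 4 tokens

X = rng.normal(size=(H * W, C))     # flattened pixel features, shape (HW, C)
W_a = rng.normal(size=(C, L))       # learned projection scoring pixels per token

# Spatial attention: each token attends over all HW positions.
scores = X @ W_a                                    # (HW, L)
A = np.exp(scores - scores.max(axis=0, keepdims=True))
A = A / A.sum(axis=0, keepdims=True)                # each column sums to 1

# Visual tokens: attention-weighted sums of pixel features.
tokens = A.T @ X                                    # (L, C)
```

A transformer then operates on the handful of tokens rather than on every pixel, which is where the efficiency gain comes from.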
Proceedings ArticleDOI
ChamNet: Towards Efficient Network Design Through Platform-Aware Model Adaptation
Xiaoliang Dai, Yangqing Jia, Peter Vajda, Matthew T. Uyttendaele, Niraj K. Jha, Peizhao Zhang, Bichen Wu, Hongxu Yin, Fei Sun, Yanghan Wang, Marat Dukhan, Yunqing Hu, Yiming Wu, +12 more
TL;DR: This work proposes a novel algorithm that searches for optimal architectures aided by efficient accuracy and resource (latency and/or energy) predictors; the results show that adapting computation resources across building blocks is critical to model performance.
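Predictor-aided search of this kind can be sketched in a few lines: cheap accuracy and latency predictors rank candidate architectures so that none of them needs to be trained or profiled individually during the search. The search space and both predictors below are invented toy stand-ins, not ChamNet's actual models.

```python
import numpy as np

rng = np.random.default_rng(1)

# A candidate architecture = per-stage channel multipliers (toy search space).
candidates = [tuple(rng.choice([0.5, 0.75, 1.0], size=3)) for _ in range(50)]
candidates.append((0.5, 0.5, 0.5))  # ensure at least one cheap candidate

def predict_accuracy(arch):
    # Hypothetical stand-in: wider stages -> higher predicted accuracy.
    return 0.6 + 0.1 * sum(arch)

def predict_latency_ms(arch):
    # Hypothetical stand-in: latency grows with total width.
    return 5.0 * sum(arch)

# Adaptation: keep only candidates that honor the latency budget, then
# pick the one with the highest predicted accuracy.
latency_budget_ms = 12.0
feasible = [a for a in candidates if predict_latency_ms(a) <= latency_budget_ms]
best = max(feasible, key=predict_accuracy)
```

Because the predictors are cheap to evaluate, the constrained search scales to far larger spaces than training-in-the-loop methods.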
Proceedings ArticleDOI
FBNetV2: Differentiable Neural Architecture Search for Spatial and Channel Dimensions
Alvin Wan, Xiaoliang Dai, Peizhao Zhang, Zijian He, Yuandong Tian, Saining Xie, Bichen Wu, Matthew Yu, Tao Xu, Kan Chen, Peter Vajda, Joseph E. Gonzalez, +11 more
TL;DR: DMaskingNAS as mentioned in this paper proposes a masking mechanism for feature map reuse, so that memory and computational costs stay nearly constant as the search space expands, and employs effective shape propagation to maximize per-FLOP or per-parameter accuracy.
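The masking mechanism can be illustrated with a toy channel-search example: one shared max-width feature map is reused for every candidate channel count, with per-width binary masks mixed by softmax architecture weights, so memory stays nearly constant as widths are added. All names and values here are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

max_channels = 32
widths = [8, 16, 32]                      # candidate channel counts

# Binary mask per candidate width: keep the first k channels.
masks = np.stack(
    [(np.arange(max_channels) < k).astype(float) for k in widths]
)

# Architecture weights over the candidate widths (toy values).
g = softmax(np.array([0.0, 1.0, 0.0]))

# Effective mask: weighted sum of candidate masks, applied to a single
# shared feature map instead of instantiating one branch per width.
effective_mask = g @ masks                # shape (max_channels,)
features = np.ones(max_channels)
masked = features * effective_mask
```

Every candidate width shares the same weights and the same feature map; only the tiny mask mixture differs, which is why the search space can grow without a matching growth in memory.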
Posted Content
ChamNet: Towards Efficient Network Design through Platform-Aware Model Adaptation
Xiaoliang Dai, Peizhao Zhang, Bichen Wu, Hongxu Yin, Fei Sun, Yanghan Wang, Marat Dukhan, Yunqing Hu, Yiming Wu, Yangqing Jia, Peter Vajda, Matthew T. Uyttendaele, Niraj K. Jha, +12 more
TL;DR: Chameleon as mentioned in this paper is an efficient neural network (NN) architecture design methodology that honors given resource constraints: instead of developing new building blocks or using computationally intensive reinforcement learning algorithms, it exploits hardware traits and adapts computation resources to fit target latency and/or energy constraints.