Kaisheng Ma
Researcher at Tsinghua University
Publications - 100
Citations - 2629
Kaisheng Ma is an academic researcher at Tsinghua University. His research focuses on computer science topics including pruning (decision trees). He has an h-index of 20, having co-authored 73 publications that have received 1513 citations. His previous affiliations include Pennsylvania State University and Peking University.
Papers
Journal ArticleDOI
Dynamic Power and Energy Management for Energy Harvesting Nonvolatile Processor Systems
Kaisheng Ma,Xueqing Li,Huichu Liu,Xiao Sheng,Yiqun Wang,Karthik Swaminathan,Yongpan Liu,Yuan Xie,Jack Sampson,Vijaykrishnan Narayanan +9 more
TL;DR: This work introduces a unified energy-oriented approach that first optimizes the number of backups by more aggressively using the stored energy available when power failure occurs, and then optimizes forward progress by improving the rate at which input energy is converted into computation, using dynamic voltage and frequency scaling and self-learning techniques.
Proceedings ArticleDOI
NN-baton: DNN workload orchestration and chiplet granularity exploration for multichip accelerators
TL;DR: NN-Baton provides a hierarchical and analytical framework to describe DNN mappings on a multi-chip accelerator and to analyze the communication overhead; the chiplet-based approach can reduce manufacturing cost, improve fabrication yield, and achieve die-level reuse across different system scales.
Proceedings ArticleDOI
Light-weight Calibrator: A Separable Component for Unsupervised Domain Adaptation
Shaokai Ye,Kailu Wu,Mu Zhou,Yunfei Yang,Sia Huat Tan,Kaidi Xu,Jiebo Song,Chenglong Bao,Kaisheng Ma +8 more
TL;DR: In this paper, instead of training a classifier to adapt to the target domain, a separable component called a data calibrator is used to help the source classifier recover discrimination power in the target domain while preserving its performance on the source domain.
Proceedings Article
Task-Oriented Feature Distillation
TL;DR: A novel distillation method named task-oriented feature distillation (TOFD) is proposed, in which the transformation consists of convolutional layers trained in a data-driven manner by the task loss, improving the performance of knowledge distillation.
Proceedings ArticleDOI
Dual Mode Ferroelectric Transistor based Non-Volatile Flip-Flops for Intermittently-Powered Systems
Sandeep Krishna Thirumala,Arnab Raha,Hrishikesh Jayakumar,Kaisheng Ma,Vijaykrishnan Narayanan,Vijay Raghunathan,Sumeet Kumar Gupta +6 more
TL;DR: System-level analysis of the proposed NVFFs in the context of a state-of-the-art intermittently-powered system, using real benchmarks, yielded 5%-33% energy savings.