Jin Young Choi
Researcher at Seoul National University
Publications - 195
Citations - 4675
Jin Young Choi is an academic researcher at Seoul National University. The author has contributed to research on adaptive control and nonlinear systems, has an h-index of 31, and has co-authored 190 publications receiving 3,585 citations. Previous affiliations of Jin Young Choi include the Electronics and Telecommunications Research Institute and the Systems Research Institute.
Papers
Proceedings Article
A Comprehensive Overhaul of Feature Distillation
TL;DR: A novel feature distillation method in which the distillation loss is designed to create synergy among several design aspects (teacher transform, student transform, distillation feature position, and distance function), achieving a significant performance improvement in all tasks.
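As a rough illustration of the feature-distillation setup the summary describes (not the paper's exact formulation), a student transform can map student features into the teacher's channel dimension before a distance is taken between the two feature maps. The names `FeatureDistiller` and `student_transform` here are hypothetical, and the 1x1 convolution and L2 distance are common choices assumed for the sketch:

```python
# Minimal sketch of a feature-distillation loss, assuming a 1x1-conv student
# transform and an L2 distance; not the paper's exact design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureDistiller(nn.Module):
    def __init__(self, s_channels: int, t_channels: int):
        super().__init__()
        # 1x1 conv maps student features into the teacher's channel dimension.
        self.student_transform = nn.Conv2d(s_channels, t_channels, kernel_size=1)

    def forward(self, s_feat: torch.Tensor, t_feat: torch.Tensor) -> torch.Tensor:
        s_mapped = self.student_transform(s_feat)
        # Teacher features are detached so only the student receives gradients.
        return F.mse_loss(s_mapped, t_feat.detach())

s = torch.randn(2, 32, 8, 8)   # student feature map
t = torch.randn(2, 64, 8, 8)   # teacher feature map
loss = FeatureDistiller(32, 64)(s, t)
```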
Journal Article
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons
TL;DR: This paper proposes a knowledge transfer method via distillation of the activation boundaries formed by hidden neurons, together with an activation transfer loss that attains its minimum when the boundaries generated by the student coincide with those of the teacher.
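The boundary-matching idea in the summary can be sketched as follows: the loss penalizes the student's hidden pre-activations whenever their sign disagrees with the teacher's on/off activation state, using a hinge with a margin. This is an illustrative sketch in that spirit, not the paper's exact loss, and the function name and margin value are assumptions:

```python
# Sketch of an activation-boundary transfer loss: push student pre-activations
# above +margin where the teacher neuron fires, below -margin where it does not.
import torch

def activation_transfer_loss(s_pre: torch.Tensor, t_pre: torch.Tensor,
                             margin: float = 1.0) -> torch.Tensor:
    # Teacher activation state: 1 where the neuron fires (pre-activation > 0).
    t_on = (t_pre > 0).float()
    # Hinge terms are zero once the student is on the correct side with margin.
    loss = t_on * torch.relu(margin - s_pre) ** 2 \
         + (1 - t_on) * torch.relu(margin + s_pre) ** 2
    return loss.mean()

s = torch.randn(4, 16)  # student hidden pre-activations
t = torch.randn(4, 16)  # teacher hidden pre-activations
l = activation_transfer_loss(s, t)
```

When the student's pre-activations lie well inside the correct region for every neuron, the loss vanishes, which matches the "minimum when boundaries coincide" property described above.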
Proceedings Article
Visual Tracking Using Attention-Modulated Disintegration and Integration
TL;DR: A novel attention-modulated visual tracking algorithm is presented that decomposes an object into multiple cognitive units and trains multiple elementary trackers in order to modulate the distribution of attention according to various feature and kernel types.
Journal Article
Sensitivity analysis of multilayer perceptron with differentiable activation functions
Jin Young Choi, Chong-Ho Choi +1 more
TL;DR: A sensitivity measure depending on the weight set of a single-output multilayer perceptron (MLP) with differentiable activation functions is proposed; computer simulations show good agreement between theoretical and simulated results for small weight perturbations.
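The agreement for small weight perturbations described in the summary can be checked numerically: for a single-output MLP with a differentiable activation (tanh is assumed here), the output change under a small weight perturbation should match the first-order estimate from the analytic gradient. All names and values below are illustrative:

```python
# Numerical check: for a small perturbation dW of the hidden weights, the
# perturbed output should agree closely with the first-order (gradient) estimate.
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # hidden-layer weights
w2 = rng.standard_normal(4)        # output-layer weights
x = rng.standard_normal(3)         # input vector

def mlp(W1, w2, x):
    # Single-output MLP with a differentiable (tanh) activation.
    return float(w2 @ np.tanh(W1 @ x))

base = mlp(W1, w2, x)
eps = 1e-3
dW = eps * rng.standard_normal(W1.shape)   # small weight perturbation
perturbed = mlp(W1 + dW, w2, x)

# Analytic gradient: dy/dW1[i, j] = w2[i] * (1 - tanh(W1 x)[i]^2) * x[j]
grad = np.outer(w2 * (1 - np.tanh(W1 @ x) ** 2), x)
linear_est = base + float(np.sum(grad * dW))
```

The gap between `perturbed` and `linear_est` shrinks quadratically in `eps`, which is the sense in which theory and simulation agree for small perturbations.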
Posted Content
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons
TL;DR: In this paper, a knowledge transfer method via distillation of activation boundaries formed by hidden neurons is proposed, where the student learns a separating boundary between activation region and deactivation region formed by each neuron in the teacher.