Kai-Tai Song
Researcher at National Chiao Tung University
Publications - 193
Citations - 3317
Kai-Tai Song is an academic researcher from National Chiao Tung University. The author has contributed to research in topics: Mobile robot & Robot. The author has an h-index of 28 and has co-authored 190 publications receiving 3056 citations. Previous affiliations of Kai-Tai Song include Industrial Technology Research Institute & Novatek.
Papers
Journal ArticleDOI
Tracking control of unicycle-modeled mobile robots using a saturation feedback controller
TL;DR: The tracking control problem with a saturation constraint for a class of unicycle-modeled mobile robots is formulated and solved using the backstepping technique and LaSalle's invariance principle; computer simulations confirm the effectiveness of the proposed tracking control law.
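The unicycle model with saturated inputs that this paper builds on can be sketched as a one-step kinematic update. This is a generic illustration of the model class, not the paper's control law; the saturation bounds `v_max` and `w_max` are assumed example values.

```python
import math

def unicycle_step(x, y, theta, v, omega, dt, v_max=1.0, w_max=1.0):
    """One Euler step of the unicycle kinematics with input saturation."""
    # Saturate the control inputs (the saturation constraint in the paper).
    v = max(-v_max, min(v_max, v))
    omega = max(-w_max, min(w_max, omega))
    # Standard unicycle kinematics: heading theta, forward speed v, turn rate omega.
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta
```

A tracking controller for this model outputs `(v, omega)` from the pose error to a reference trajectory; the saturation means the closed-loop analysis cannot assume arbitrarily large commands, which is what motivates the backstepping design.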
Journal ArticleDOI
Real-time image tracking for automatic traffic monitoring and enforcement applications
TL;DR: An image tracking system and its applications for traffic monitoring and accident detection at road intersections using the active contour model approach and a contour initialization method based on the concept of contour growing are presented.
Journal ArticleDOI
A Study on Speech Recognition Control for a Surgical Robot
TL;DR: A design for a speech recognition interface for an HIWIN robotic endoscope holder, together with a voice-to-motion calibration method that measures the degree of change in the endoscope image in response to each voice command.
Journal ArticleDOI
Dynamic Calibration of Pan–Tilt–Zoom Cameras for Traffic Monitoring
Kai-Tai Song, Jen-Chao Tai, et al.
TL;DR: This paper presents a novel calibration method for a PTZ camera overlooking a traffic scene that automatically uses a set of parallel lane markings and the lane width to compute the camera parameters, namely, focal length, tilt angle, and pan angle.
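The first step described in the abstract, locating the vanishing point of the parallel lane markings, can be sketched with homogeneous-coordinate line intersection. The lane-marking coordinates and the focal length below are illustrative assumptions (the paper recovers the focal length from the known lane width); image coordinates are taken with the origin at the principal point.

```python
import numpy as np

def line_through(p, q):
    # Homogeneous line through two image points (cross product of their
    # homogeneous coordinates).
    return np.cross([*p, 1.0], [*q, 1.0])

def vanishing_point(l1, l2):
    # Two parallel world lines intersect in the image at the vanishing point.
    vp = np.cross(l1, l2)
    return vp[:2] / vp[2]

# Two parallel lane markings (illustrative pixel coordinates, y pointing down).
vp = vanishing_point(line_through((-200, 400), (-40, 0)),
                     line_through((220, 400), (60, 0)))

f = 800.0  # focal length in pixels: assumed here, estimated in the paper
tilt = np.arctan2(vp[1], f)
pan = np.arctan2(vp[0] * np.cos(tilt), f)
```

With pan θ and tilt φ, the road-direction vanishing point satisfies v₀ = f·tanφ and u₀ = f·tanθ/cosφ, so the two angles follow directly from the vanishing point once the focal length is fixed.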
Journal ArticleDOI
Robotic Emotional Expression Generation Based on Mood Transition and Personality Model
TL;DR: A 2-D emotional model is proposed that combines robot emotion, mood, and personality to generate emotional expressions, fusing basic robotic emotional behaviors and manifesting the robot's emotional state via continuous facial expressions.