Juan R. Terven
Researcher at Walsh University
Publications - 21
Citations - 239
Juan R. Terven is an academic researcher from Walsh University. The author has contributed to research on topics including computer science and gesture recognition. The author has an h-index of 7 and has co-authored 17 publications receiving 178 citations. Previous affiliations of Juan R. Terven include Instituto Politécnico Nacional.
Papers
Journal ArticleDOI
Kin2. A Kinect 2 toolbox for MATLAB
TL;DR: Kin2 is the first publicly available toolbox to provide Kinect V2 capabilities to MATLAB users; it reduces average code length by an order of magnitude, yielding a significant reduction in development time for prototyping and research.
Journal ArticleDOI
New Opportunities for Computer Vision-Based Assistive Technology Systems for the Visually Impaired
TL;DR: Computing advances and increased smartphone use give technology system designers greater flexibility in exploiting computer vision to support visually impaired users.
Journal ArticleDOI
A multiple camera calibration and point cloud fusion tool for Kinect V2
Diana-Margarita Córdova-Esparza, Juan R. Terven, Hugo Jiménez-Hernández, Ana-Marcela Herrera-Navarro, +3 more
TL;DR: A tool for calibrating multiple Kinect V2 sensors. It uses the Kinect's coordinate-mapping capabilities to register data across the camera, depth, and color spaces, and a novel approach that obtains multiple 3D point matches between adjacent sensors.
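The paper's own calibration pipeline is not reproduced here, but the registration step it describes — aligning one sensor's coordinate frame to a neighbor's from matched 3D points — is commonly solved with the Kabsch/Procrustes algorithm. A minimal sketch (a standard technique, not necessarily the authors' exact method; function name and shapes are illustrative):

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate rotation R and translation t such that dst ≈ R @ src + t,
    from N matched 3D points (Kabsch algorithm). src, dst: (N, 3) arrays."""
    # Center both point sets on their centroids.
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    # Cross-covariance and its SVD.
    H = src_c.T @ dst_c
    U, _, Vt = np.linalg.svd(H)
    # Correct a possible reflection so R is a proper rotation (det = +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Given such (R, t) pairs between adjacent sensors, each sensor's point cloud can be mapped into a common frame for fusion.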
Journal ArticleDOI
Head-gestures mirroring detection in dyadic social interactions with computer vision-based wearable devices
TL;DR: A computer vision-based system that detects mirroring in dyadic social interactions using a wearable platform, achieving a mirroring-detection accuracy of 72% on a custom mirroring dataset.
Journal ArticleDOI
A Social-Aware Assistant to support individuals with visual impairments during social interaction: A systematic requirements analysis
TL;DR: A Social-Aware Assistant that provides visually impaired people with cues to enhance their face-to-face conversations; device usefulness and user satisfaction are assessed quantitatively and qualitatively.