
Samuele Gasparrini

Researcher at Marche Polytechnic University

Publications -  16
Citations -  653

Samuele Gasparrini is an academic researcher from Marche Polytechnic University. He has contributed to research in the topics of sensor fusion and synchronization (computer science), has an h-index of 9, and has co-authored 16 publications receiving 558 citations.

Papers
Journal Article

A depth-based fall detection system using a Kinect® sensor.

TL;DR: An automatic, privacy-preserving fall detection method for indoor environments, based on the Microsoft Kinect® depth sensor in an "on-ceiling" configuration and on the analysis of depth frames; the results show the effectiveness of the proposed solution, even in complex scenarios.
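The paper's exact algorithm is not reproduced here; purely as an illustration of the ceiling-view idea, the sketch below flags a fall when the segmented subject stays close to the floor for a sustained number of depth frames. The floor distance, height threshold, persistence window, and background-subtraction step are assumed values and choices, not parameters taken from the paper.

```python
import numpy as np

# Hypothetical parameters, not taken from the paper.
FLOOR_DEPTH_MM = 3000   # assumed ceiling-to-floor distance seen by the sensor
FALL_HEIGHT_MM = 400    # assumed max height above the floor during a fall
MIN_FALL_FRAMES = 15    # assumed persistence (~0.5 s at 30 fps)


def subject_height(depth_frame_mm: np.ndarray, background_mm: np.ndarray) -> float:
    """Estimate the subject's height above the floor from one depth frame.

    The subject is segmented by background subtraction; the smallest depth
    value inside the foreground mask (the point closest to the ceiling-mounted
    sensor) gives the top of the subject, so height = floor depth - min depth.
    """
    foreground = np.abs(depth_frame_mm.astype(np.int32) - background_mm) > 150
    if not foreground.any():
        return 0.0
    top_depth = depth_frame_mm[foreground].min()
    return float(FLOOR_DEPTH_MM - top_depth)


def detect_fall(frames, background_mm) -> bool:
    """Flag a fall when the subject stays close to the floor long enough."""
    low_frames = 0
    for frame in frames:
        if 0.0 < subject_height(frame, background_mm) < FALL_HEIGHT_MM:
            low_frames += 1
            if low_frames >= MIN_FALL_FRAMES:
                return True
        else:
            low_frames = 0
    return False
```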
Journal Article

A Human Activity Recognition System Using Skeleton Data from RGBD Sensors

TL;DR: This work proposes an activity recognition algorithm that exploits skeleton data extracted by RGBD sensors: key poses are extracted to compose a feature vector, and a multiclass Support Vector Machine performs the classification.
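As an illustration of a key-pose plus multiclass SVM pipeline, the sketch below clusters skeleton frames into key poses, builds a normalized key-pose histogram as the feature vector, and trains a scikit-learn SVC. The k-means clustering step, the histogram feature, and the kernel choice are assumptions standing in for the paper's actual key-pose extraction, not a reimplementation of it.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

N_KEY_POSES = 8  # assumed number of key poses, not taken from the paper


def fit_key_poses(training_skeletons: np.ndarray) -> KMeans:
    """Cluster skeleton frames (n_frames x n_joint_coords) into key poses."""
    return KMeans(n_clusters=N_KEY_POSES, n_init=10, random_state=0).fit(training_skeletons)


def sequence_to_feature(skeleton_seq: np.ndarray, key_poses: KMeans) -> np.ndarray:
    """Describe an activity sequence by the normalized histogram of its
    nearest key poses, giving a fixed-length feature vector."""
    labels = key_poses.predict(skeleton_seq)
    hist = np.bincount(labels, minlength=N_KEY_POSES).astype(float)
    return hist / hist.sum()


def train_classifier(sequences, activity_labels, key_poses) -> SVC:
    """Train a multiclass SVM (one-vs-one by default in scikit-learn)."""
    X = np.vstack([sequence_to_feature(s, key_poses) for s in sequences])
    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(X, activity_labels)
    return clf
```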
Book Chapter

Proposal and Experimental Evaluation of Fall Detection Solution Based on Wearable and Depth Data Fusion

TL;DR: This work studies the joint use of a camera-based system and wearable devices, in the so-called data fusion approach, to design a fall detection solution; the synchronization issues between the heterogeneous data provided by the devices are properly treated.
Journal Article

Kinect as a tool for gait analysis: validation of a real-time joint extraction algorithm working in side view.

TL;DR: Tests show that the trajectories extracted by the proposed algorithm adhere to the reference curves better than the ones obtained from the skeleton generated by the native applications provided within the Microsoft Kinect and OpenNI Software Development Kits.
Proceedings Article

Time synchronization and data fusion for RGB-Depth cameras and inertial sensors in AAL applications

TL;DR: This work presents a technical platform, suited for integration into AAL (Ambient Assisted Living) solutions, for the efficient and accurate synchronization of the data captured from RGB-Depth cameras and wearable inertial sensors.
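As a minimal illustration of the alignment step only, the sketch below resamples a higher-rate inertial stream at the depth-frame timestamps by linear interpolation, assuming both streams already share a common time base; the paper's clock-offset estimation and fusion method are not reproduced.

```python
import numpy as np


def align_inertial_to_depth(depth_ts: np.ndarray,
                            imu_ts: np.ndarray,
                            imu_samples: np.ndarray) -> np.ndarray:
    """Linearly interpolate each IMU channel at the depth-frame timestamps.

    depth_ts:    (N,) depth frame timestamps in seconds
    imu_ts:      (M,) inertial sample timestamps in seconds (same clock,
                 assumed already corrected for any offset between devices)
    imu_samples: (M, C) inertial readings, e.g. 3-axis acceleration
    Returns an (N, C) array of inertial values synchronized to the frames.
    """
    return np.column_stack([
        np.interp(depth_ts, imu_ts, imu_samples[:, c])
        for c in range(imu_samples.shape[1])
    ])
```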