Joseph DelPreto

Researcher at Massachusetts Institute of Technology

Publications - 20
Citations - 1098

Joseph DelPreto is an academic researcher from the Massachusetts Institute of Technology. The author has contributed to research in the topics Robot and Gesture recognition, has an h-index of 9, and has co-authored 17 publications receiving 677 citations. Previous affiliations of Joseph DelPreto include Vassar College and the University of Alberta.

Papers
Journal Article

Exploration of underwater life with an acoustically controlled soft robotic fish

TL;DR: This work presents the design, fabrication, control, and oceanic testing of a soft robotic fish that can swim in three dimensions to continuously record the aquatic life it is following or engaging. A soft robotic actuator design enables its lifelike undulating tail motion.
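
As a loose illustration of the undulating tail motion mentioned above, the sketch below generates a simple sinusoidal deflection command for a soft tail actuator; the function name, amplitude, and frequency values are hypothetical and not taken from the paper.

```python
# Minimal sketch (not the paper's implementation): a sinusoidal drive signal
# for an undulating soft tail. Amplitude and frequency are illustrative values
# that an acoustic command channel might adjust.
import math

def tail_command(t, amplitude_deg=30.0, frequency_hz=1.0):
    """Return a tail deflection (degrees) at time t for a lifelike undulation."""
    return amplitude_deg * math.sin(2.0 * math.pi * frequency_hz * t)

if __name__ == "__main__":
    # Sample the drive signal at 50 Hz for one second.
    samples = [tail_command(i / 50.0) for i in range(50)]
    print(f"peak deflection ~ {max(samples):.1f} deg")
```
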
Proceedings Article

Correcting robot mistakes in real time using EEG signals

TL;DR: The application of EEG-measured error-related potentials (ErrPs) to closed-loop robotic control is explored, demonstrating the potential of EEG-based feedback methods to facilitate seamless robotic control and moving closer to the goal of real-time, intuitive interaction.
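
To make the closed-loop idea concrete, here is a minimal sketch, assuming NumPy, of a loop in which a binary ErrP detection on an EEG epoch flips the robot's choice. The feature window, threshold, and function names (detect_errp, closed_loop_step) are illustrative placeholders, not the authors' pipeline.

```python
# Minimal sketch of ErrP-triggered correction, not the paper's classifier.
import numpy as np

def detect_errp(eeg_epoch, threshold=2.0):
    """Flag an error-related potential if the mean amplitude in a post-feedback
    window exceeds a placeholder threshold (assumes ~500 Hz sampling)."""
    window = eeg_epoch[:, 100:200]          # roughly 200-400 ms after feedback
    return np.abs(window.mean()) > threshold

def closed_loop_step(eeg_epoch, robot_choice):
    """If an ErrP is detected after the robot's choice, switch to the alternative."""
    if detect_errp(eeg_epoch):
        return "right" if robot_choice == "left" else "left"
    return robot_choice

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    epoch = rng.normal(0.0, 1.0, size=(48, 500))   # 48 channels, 500 samples
    print(closed_loop_step(epoch, "left"))
```
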
Journal Article

Learning Object Grasping for Soft Robot Hands

TL;DR: A 3D CNN model is exploited to estimate suitable grasp poses for a soft hand from multiple grasping directions (top and side) and wrist orientations, an approach with great potential for geometry-related robotic tasks.
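
The sketch below, assuming PyTorch, shows the general shape of such an approach: a small 3D CNN that scores candidate grasp poses from a voxelized object. The architecture, input size, and candidate count are assumptions for illustration, not the paper's model.

```python
# Minimal sketch: a 3D CNN scoring candidate grasps (approach direction +
# wrist orientation) from a 32x32x32 voxel grid. Sizes are illustrative.
import torch
import torch.nn as nn

class GraspScorer(nn.Module):
    def __init__(self, num_candidates=16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
            nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool3d(2),
        )
        # One score per candidate grasp pose.
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(16 * 8 * 8 * 8, num_candidates))

    def forward(self, voxels):
        return self.head(self.features(voxels))

if __name__ == "__main__":
    voxels = torch.rand(1, 1, 32, 32, 32)          # one voxelized object
    scores = GraspScorer()(voxels)
    print("best candidate grasp:", scores.argmax(dim=1).item())
```
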
Proceedings Article

A highly-underactuated robotic hand with force and joint angle sensors

TL;DR: A novel underactuated robotic hand design is described that contains three fingers with three joints each, driven by a single motor, together with position and tactile sensors that provide precise angle feedback and binary force feedback.
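
As a rough illustration of how such sensing could be used, the following sketch closes a single-motor finger until its binary tactile sensor reports contact or a joint-angle limit is reached; the sensor-reading callbacks and thresholds are hypothetical, not drawn from the paper.

```python
# Minimal sketch, not the authors' controller: incremental finger closing
# using precise angle feedback and binary force (contact) feedback.
def close_finger(read_joint_angles, read_contact, step_deg=2.0, angle_limit_deg=90.0):
    """Drive the finger closed in small increments until contact or the angle limit."""
    commanded = 0.0
    while commanded < angle_limit_deg:
        if read_contact():                       # binary tactile sensor fired
            return "contact", read_joint_angles()
        commanded += step_deg                    # single motor drives all joints
    return "limit", read_joint_angles()

if __name__ == "__main__":
    # Toy stand-ins for the sensors: contact occurs after ~30 degrees of travel.
    state = {"angle": 0.0}
    def fake_angles():
        return [state["angle"]] * 3              # three joints per finger
    def fake_contact():
        state["angle"] += 2.0
        return state["angle"] > 30.0
    print(close_finger(fake_angles, fake_contact))
```
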
Proceedings Article

Sharing the Load: Human-Robot Team Lifting Using Muscle Activity

TL;DR: Two muscle signals are used to create a control framework for team lifting tasks in which a human and a robot lift an object together; a neural network trained only on previous users classifies biceps and triceps activity to detect up or down gestures on a rolling basis, enabling finer control over the robot and expanding the feasible workspace.
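
The sketch below illustrates rolling-window classification of biceps and triceps EMG into up/down gestures; a simple threshold rule stands in for the neural network trained on previous users, and all names, window sizes, and thresholds are assumptions.

```python
# Minimal sketch of rolling-window EMG gesture detection, not the paper's model.
import numpy as np

def rolling_windows(emg, window=200, stride=50):
    """Yield (start, window) slices over a (samples, 2) EMG stream:
    column 0 = biceps, column 1 = triceps."""
    for start in range(0, len(emg) - window + 1, stride):
        yield start, emg[start:start + window]

def classify_window(window_emg, threshold=0.3):
    """Placeholder rule: compare mean rectified biceps vs. triceps activity."""
    biceps, triceps = np.abs(window_emg).mean(axis=0)
    if biceps > threshold and biceps > triceps:
        return "up"
    if triceps > threshold and triceps > biceps:
        return "down"
    return "none"

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    emg = rng.normal(0.0, 0.1, size=(1000, 2))
    emg[400:600, 0] += 0.8                       # simulated biceps burst -> "up"
    print([classify_window(w) for _, w in rolling_windows(emg)])
```
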