Jean Sebastien Fouillade
Researcher at Microsoft
Publications - 7
Citations - 218
Jean Sebastien Fouillade is an academic researcher at Microsoft. His research focuses on the topics of mobile robots and mobile robot navigation. He has an h-index of 6 and has co-authored 7 publications receiving 218 citations.
Papers
Patent
Finding a called party
Charles F. Olivier, Jean Sebastien Fouillade, Malek M. Chalabi, Nathaniel T. Clinton, Russell Sanchez, Adrien Felon, Graham A. Wheeler, Francois Burianek +7 more
TL;DR: A method for initiating a telepresence session with a person using a robot is described, in which the robot moves to the location given by the person in response to a prompt.
Patent
Tracking and following of moving objects by a mobile robot
Charles F. Olivier, Jean Sebastien Fouillade, Adrien Felon, Jeffrey Cole, Nathaniel T. Clinton, Russell Sanchez, Francois Burianek, Malek M. Chalabi, Harshavardhana Narayana Kikkeri +8 more
TL;DR: The relative positions and orientations of the robot and a tracked object are determined and used to maintain a desired spatial relationship between the object and the robot.
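The follow behavior summarized above can be sketched as a simple proportional controller: given the robot's pose and the tracked object's position, close the error between the current and desired distance while turning toward the object. This is a minimal illustration, not the patented method; the gains, the differential-drive command model, and the function name are all assumptions.

```python
import math

def follow_command(robot_pose, object_pos, desired_dist=1.5,
                   k_lin=0.8, k_ang=1.5):
    """Compute a (linear, angular) velocity command so that a
    differential-drive robot approaches and holds a desired
    distance to a tracked object. Illustrative sketch only."""
    rx, ry, rtheta = robot_pose          # robot x, y, heading (radians)
    ox, oy = object_pos                  # tracked object x, y
    dx, dy = ox - rx, oy - ry
    dist = math.hypot(dx, dy)
    bearing = math.atan2(dy, dx) - rtheta
    # Wrap the bearing error into [-pi, pi]
    bearing = math.atan2(math.sin(bearing), math.cos(bearing))
    linear = k_lin * (dist - desired_dist)   # close the range error
    angular = k_ang * bearing                # turn toward the object
    return linear, angular
```

For example, a robot at the origin facing along x, with the object 3 m ahead and a 1.5 m desired distance, would drive straight forward with no turn.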
Patent
Interactive robot initialization
TL;DR: An initial interaction between a mobile robot and at least one user is described, in which the robot captures images of the user's face upon detecting that the user has followed an instruction.
Patent
Semi-autonomous robot that supports multiple modes of navigation
Jean Sebastien Fouillade, Charles F. Olivier, Malek M. Chalabi, Nathaniel T. Clinton, Russ Sanchez, Chad Aron Voss +5 more
TL;DR: Technologies pertaining to robot navigation are described, in which the robot includes a video camera configured to transmit a live video feed to a remotely located computing device that supports multiple modes of navigation.
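A semi-autonomous robot with multiple navigation modes typically dispatches drive commands according to the active mode: direct teleoperation over the live video feed, driving to a remotely selected waypoint, or fully autonomous planning. The sketch below shows one way such a dispatch could look; the mode names and the `planner` interface are assumptions, not details from the patent.

```python
from enum import Enum

class NavMode(Enum):
    MANUAL = "manual"          # teleoperated via the live video feed
    WAYPOINT = "waypoint"      # drive to a location picked remotely
    AUTONOMOUS = "autonomous"  # robot plans its own path

def next_command(mode, teleop_cmd=None, waypoint=None, planner=None):
    """Return the drive command for the current navigation mode.
    Illustrative sketch; `planner` is a hypothetical path planner."""
    if mode is NavMode.MANUAL:
        return teleop_cmd
    if mode is NavMode.WAYPOINT:
        return planner.plan_to(waypoint)
    return planner.plan_autonomously()
```

In manual mode the remote operator's command passes through unchanged, while the other modes delegate to a planner.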
Patent
Natural human to robot remote control
TL;DR: The subject disclosure is directed toward controlling a robot based on sensing a user's natural, intuitive movements and expressions, which are captured by an image and depth camera; the resulting skeletal and/or image data is used to control the robot's operation.
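One common way to turn skeletal data from a depth camera into robot control is to map body pose to drive commands, e.g. leaning toward the camera drives forward and moving a hand sideways steers. The sketch below illustrates that idea only; the joint names, the neutral distance, and the dead-zone threshold are assumptions, not the patented mapping.

```python
def gesture_to_command(skeleton, dead_zone=0.15, neutral_z=2.0):
    """Map one skeletal frame (joint name -> (x, y, z) in meters,
    camera frame) to a (linear, angular) drive command.
    Hypothetical joint names and thresholds, for illustration."""
    torso_z = skeleton["spine"][2]       # distance of torso from camera
    hand_x = skeleton["right_hand"][0]   # lateral offset of right hand
    lean = neutral_z - torso_z           # positive when leaning forward
    linear = lean if abs(lean) > dead_zone else 0.0
    angular = -hand_x if abs(hand_x) > dead_zone else 0.0
    return linear, angular
```

Small movements inside the dead zone produce no motion, which keeps the robot still while the user stands naturally.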