scispace - formally typeset
Author

Keishi Okuda

Bio: Keishi Okuda is an academic researcher from Osaka Electro-Communication University. The author has contributed to research in the topics of laparoscopic surgery and forceps. The author has an h-index of 1 and has co-authored 1 publication, receiving 2 citations.

Papers
Book ChapterDOI
09 Jul 2017
TL;DR: In this surgery, a surgeon inserts an endoscopic camera, which has a limited viewing volume, and forceps into the patient's body to perform laparoscopic surgery.
Abstract: Laparoscopic surgery is becoming popular as a minimally invasive operation and is widely used as a technique for various kinds of surgical procedures. In this surgery, a surgeon inserts an endoscopic camera, which has a limited viewing volume, and forceps into the patient's body. A support system, such as a navigation system, is therefore needed.

2 citations


Cited by
Book ChapterDOI
15 Jul 2018
TL;DR: This study revealed that using the robotic slider to move the camera up and down did not result in excessive vibration or inconsistent depth measurements before, during, and after the movement.
Abstract: In this study, we constructed and tested the usability of a surgical area-measuring robot-mechanical system, which does not obstruct the movements of doctors, assistants, or nurses during surgery, under two operating lights in an operating room. This study revealed that using the robotic slider to move the camera up and down did not result in excessive vibration or inconsistent depth measurements before, during, and after the movement. For example, if a doctor moves the camera out of the way to move a microscope to the upper part of the surgical area for microsurgery and then brings it back, the system could accurately retain the depth image alignment.
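As a rough illustration of how such a before/after depth-consistency check might be expressed, here is a minimal Python sketch; the function name, tolerances, and sampling scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def slider_roundtrip_is_stable(before_mm, after_mm, tol_mm=2.0, max_spread_mm=1.0):
    """Check depth stability for a raise-and-return slider move (illustrative).

    before_mm / after_mm: depth samples (in millimetres) at one fixed reference
    pixel with the camera at its working height, logged before the slider
    raises the camera and after it brings the camera back.
    """
    before = np.asarray(before_mm, dtype=float)
    after = np.asarray(after_mm, dtype=float)

    # Consistency: the median depth should be restored after the round trip.
    consistent = abs(np.median(after) - np.median(before)) <= tol_mm
    # Vibration proxy: the spread of samples in each phase should stay small.
    steady = max(np.std(before), np.std(after)) <= max_spread_mm
    return consistent and steady

# Example: readings that agree to within the tolerances pass the check.
ok = slider_roundtrip_is_stable([412.0, 411.5, 412.2], [412.4, 412.1, 411.8])
```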

2 citations

Book ChapterDOI
26 Jul 2019
TL;DR: A surgical navigation system is created that takes stable measurements of a surgical area using RGB/depth cameras without obstructions, such as the surgeon’s head or hands.
Abstract: In this study, our goal is to create a surgical navigation system that takes stable measurements of a surgical area using RGB/depth cameras without obstructions, such as the surgeon’s head or hands. We mounted three cutting-edge D435 Intel RealSense Depth Cameras onto a ring and photographed the surgical area from three directions. We also installed a robotic mechanical system that can move the camera ring up and down so that the surgery can proceed smoothly. First, we calibrated the coordinate systems so that the coordinate systems of the three cameras (three-dimensional XYZ coordinate systems) align and their Y-axis (vertical axis) aligns with the moving axis of the robot slider. Next, we captured an ArUco marker with each camera and visualized its position within the camera coordinate system. After verifying that the initial positions of the ArUco marker captured by the three cameras match, we moved the robot slider up by 50 mm three times and investigated how the position of the ArUco marker measured by each of the three cameras changed. The results show that as the ArUco marker moves farther away, the error in its measured position increases. Additionally, the measured positions from all three cameras varied owing to digital pixel error in the two-dimensional images (the pixel in which the ArUco marker is detected shifts between neighboring pixels). Future work includes checking that the calibration pattern and ArUco marker are parallel to the camera ring and perpendicular to the robot slider using a level instrument; packing the calibration pattern and ArUco marker horizontally with vinyl; and implementing a moving average in the program to reduce the pixel digital error in the calculations to the sub-pixel level.
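As a rough sketch of the measurement chain described above (ArUco detection, deprojection of the marker center to a camera-frame XYZ position, and the moving average proposed as future work for sub-pixel smoothing), the following Python example assumes OpenCV 4.7 or later and uses illustrative intrinsics, dictionary, depth units, and window size that are not taken from the paper.

```python
import cv2
import numpy as np
from collections import deque

# Illustrative pinhole intrinsics (assumed values, not taken from the paper).
FX, FY = 615.0, 615.0   # focal lengths in pixels
CX, CY = 320.0, 240.0   # principal point in pixels

# ArUco detector API for OpenCV >= 4.7; older releases use cv2.aruco.detectMarkers.
ARUCO_DICT = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
DETECTOR = cv2.aruco.ArucoDetector(ARUCO_DICT, cv2.aruco.DetectorParameters())

def marker_center_xyz(color_img, depth_img_mm):
    """Detect one ArUco marker and deproject its center to camera-frame XYZ (metres)."""
    gray = cv2.cvtColor(color_img, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = DETECTOR.detectMarkers(gray)
    if ids is None:
        return None
    u, v = corners[0][0].mean(axis=0)                 # marker center in pixels
    z = float(depth_img_mm[int(v), int(u)]) / 1000.0  # depth in metres (assumed mm input)
    if z == 0.0:
        return None                                   # no valid depth at this pixel
    # Pinhole deprojection: pixel coordinates plus depth -> XYZ in the camera frame.
    return np.array([(u - CX) * z / FX, (v - CY) * z / FY, z])

class PositionSmoother:
    """Moving average over recent marker positions to damp pixel-level jitter."""
    def __init__(self, window=10):
        self.buffer = deque(maxlen=window)

    def update(self, xyz):
        self.buffer.append(xyz)
        return np.mean(self.buffer, axis=0)
```

In this sketch, PositionSmoother.update would be called on each frame's marker position; averaging over recent frames damps the neighboring-pixel jitter that the abstract attributes to digital pixel error, which is the effect the proposed moving average is meant to reduce.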