Journal ISSN: 2576-3202

IEEE Transactions on Medical Robotics and Bionics

Institute of Electrical and Electronics Engineers
About: IEEE Transactions on Medical Robotics and Bionics is an academic journal published by the Institute of Electrical and Electronics Engineers. The journal publishes mainly in the areas of computer science and medicine. It has the ISSN identifier 2576-3202. Over its lifetime, 152 publications have been published, receiving 240 citations. The journal is also known as: Institute of Electrical and Electronics Engineers transactions on medical robotics and bionics & Medical robotics and bionics.

Papers published on a yearly basis

Papers
Journal ArticleDOI
TL;DR: In this paper, a subject-independent hip moment estimator using a temporal convolutional network (TCN) was proposed to estimate biological joint moments from wearable sensors, and its generalizability was validated during multimodal ambulation.
Abstract: Estimating biological joint moments using wearable sensors could enable out-of-lab biomechanical analyses and exoskeletons that assist throughout daily life. To realize these possibilities, this study introduced a subject-independent hip moment estimator using a temporal convolutional network (TCN) and validated its performance and generalizability during multimodal ambulation. Electrogoniometer and simulated IMU data from sixteen participants walking on level ground, ramps, and stairs were used to evaluate our approach when benchmarked against a fully-connected neural network, a long short-term memory network, and a baseline method (i.e., using subject-average moment curves based on ambulation mode and gait phase). Additionally, the generalizability of our approach was evaluated by testing on ground slopes, stair heights, and gait transitions withheld during model training. The TCN outperformed the benchmark approaches on the hold-out data (p < 0.05), with an average RMSE of 0.131±0.018 Nm/kg and R2 of 0.880±0.030 during steady-state ambulation. When tested on the 20 leave-one-out slope and stair height conditions, the TCN significantly increased RMSE only on the steepest (+18°) incline (p < 0.05). Finally, the TCN RMSE and R2 were 0.152±0.027 Nm/kg and 0.786±0.055, respectively, during mode transitions. Thus, our approach accurately estimated hip moments and generalized to unseen gait contexts using data from three wearable sensors.
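The core building block of a TCN is a causal, dilated 1-D convolution, which lets the estimator look far back into the sensor stream without ever using future samples. A minimal NumPy sketch of such a layer (the kernel weights and dilation here are illustrative assumptions, not the paper's trained model):

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation=1):
    """Causal dilated 1-D convolution: the output at time t depends only
    on inputs at t, t-d, t-2d, ... so no future samples leak in."""
    k = len(w)
    pad = (k - 1) * dilation  # left-pad so every output is causal
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    return np.array([
        sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

# An impulse input exposes the causal taps: the response appears
# at t = 0 and t = dilation, never at negative lags.
response = causal_dilated_conv1d([1.0, 0.0, 0.0, 0.0, 0.0], [1.0, 1.0], dilation=2)
```

Stacking such layers with dilations 1, 2, 4, ... grows the receptive field exponentially, which is what lets a TCN span a full gait cycle with only a few layers.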

22 citations

Journal ArticleDOI
TL;DR: In this paper, a novel, high-accuracy Fiber Bragg Grating (FBG)-enabled tri-axial distal force sensor for minimally invasive surgical palpation is presented.
Abstract: The current study highlights the development of a novel, high-accuracy Fiber Bragg Grating (FBG)-enabled tri-axial distal force sensor for minimally invasive surgical palpation. The sensor's primary flexure/elastomer consists of axial and radial force-sensitive structures in a serial configuration, designed from a mechanism perspective using the method of freedom and constraint topology (FACT). This method provides general guidelines for multi-dimensional sensor design: achieving excellent sensitivity and a large measurement range, suppressing crosstalk and coupling among the axes, and keeping the sensitivities of the three directions at the same order of magnitude. Five tightly suspended optical fibers, each embedded with one FBG element, have been assembled with the proposed flexure: one arranged along the axial flexure's central line and four distributed along the circumference of the radial flexure. This tight suspension generates a uniform, constant strain distribution on the FBG element, improving resolution and repeatability while avoiding FBG chirping. Finite element modeling (FEM)-based simulation has been conducted for performance investigation and design optimization to improve the sensor's sensitivity. Static calibration experiments and in-vitro and ex-vivo palpation experiments have been performed to investigate the proposed design's performance and feasibility. The optimized sensor prototype achieves excellent resolution values of 1.18 mN and 1.81 mN in the x- and y-directions within [−5 N, 5 N], and 2.61 mN in the z-direction within [0, 5 N], realizing the same order of sensitivity magnitude on each axis.
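A tri-axial FBG sensor like this is typically calibrated by fitting a linear map from the five wavelength shifts to the three force components. A hedged NumPy sketch with synthetic data (the linear model, channel count, and loading set are assumptions for illustration; the paper's actual calibration procedure may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
C_true = rng.normal(size=(3, 5))  # unknown 3x5 sensitivity matrix: (Fx,Fy,Fz) x 5 FBGs
dlam = rng.normal(size=(40, 5))   # 40 calibration loadings: FBG wavelength shifts
F = dlam @ C_true.T               # ideal (noise-free) linear sensor response

# Least-squares fit of the calibration matrix from the loading data: F ≈ dlam @ C.T
C_fit, *_ = np.linalg.lstsq(dlam, F, rcond=None)
C_fit = C_fit.T

force = C_fit @ dlam[0]           # decode one reading into (Fx, Fy, Fz)
```

With real measurements one would add noise terms and possibly temperature compensation, but the least-squares structure stays the same.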

16 citations

Journal ArticleDOI
TL;DR: A novel dense residual recurrent convolutional network, called DRR-Net, is proposed in this paper for automatic and accurate surgical instrument segmentation from endoscopic images, achieving excellent performance compared with other advanced segmentation models.
Abstract: The precise segmentation of surgical instruments is key to the stable and reliable operation of surgical robots. However, accurate surgical instrument segmentation remains a challenging task due to the complex surgical environment in endoscopic images, the low contrast between surgical instruments and tissues, and the diversity of surgical instruments and their morphological variability. In recent years, deep learning has been widely applied to medical image segmentation with considerable success, especially U-Net and its variants. However, existing surgical instrument segmentation networks still suffer from shortcomings such as insufficient processing of local feature maps and a lack of temporal modeling information. To address these issues and effectively improve segmentation accuracy, a novel dense residual recurrent convolutional network built on an encoder-decoder structure, called DRR-Net, is proposed in this paper for automatic and accurate surgical instrument segmentation from endoscopic images. To supply temporal modeling information, an attention dense-connected recurrent convolutional block (ADRCB), inspired by recurrent neural networks (RNNs), is proposed to optimize the backbone network, capturing temporal information and learning the pixel correspondence between frames. To address the insufficient processing of local feature maps, a residual path is proposed to replace the simple skip connections, enhancing context feature representation while also reducing the semantic gap. Further, to improve segmentation accuracy on objects of different sizes, a context fusion block (CFB) is embedded into the bottleneck layer to extract multi-scale attention context features.
Multiple public datasets on surgical instrument segmentation are adopted for model evaluation and comparison, including the Kvasir-Instrument and UW-Sinus-Surgery-C/L sets. Experimental results demonstrate that the proposed DRR-Net achieves excellent performance compared with other advanced segmentation models.
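Segmentation models like DRR-Net are commonly scored with the Dice coefficient, which measures the overlap between predicted and ground-truth instrument masks. A minimal NumPy version (the abstract does not state which metrics were used; this is the standard one for the task):

```python
import numpy as np

def dice(pred, target, eps=1e-7):
    """Dice similarity between two binary masks: 2|A∩B| / (|A| + |B|).
    `eps` keeps the ratio defined when both masks are empty."""
    pred, target = np.asarray(pred, bool), np.asarray(target, bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

# One of the two predicted pixels overlaps the single ground-truth pixel.
score = dice([[1, 1], [0, 0]], [[1, 0], [0, 0]])  # ≈ 2/3
```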

12 citations

DOI
TL;DR: Inspired by the human arm configuration in boxing maneuvers, an optimized anthropomorphic coordinated control strategy based on a dual-step optimization approach is proposed in this paper; it performs dexterous bimanual manipulation more effectively, with less instrument interference and no singularities, thereby improving the safety and efficiency of SPS operations.
Abstract: Effective teleoperation of small-scale, highly integrated robots for single-port surgery (SPS) imposes unique control and human-robot interaction challenges. Traditional isometric teleoperation schemes mainly focus on end-to-end trajectory mapping, which is problematic when applied to SPS robotic control, especially for dual-arm coordinated operation. Inspired by the human arm configuration in boxing maneuvers, an optimized anthropomorphic coordinated control strategy based on a dual-step optimization approach is proposed. Theoretical derivation and solvability of the problem are addressed, and the effectiveness of the method is demonstrated in detailed simulation and in-vitro experiments. The proposed control strategy has been shown to perform dexterous SPS bimanual manipulation more effectively, with less instrument interference and freedom from singularities, thereby improving the safety and efficiency of SPS operations.
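The "dual-step" idea, first satisfy the task-space goal, then resolve the remaining freedom toward a human-like posture, can be illustrated on the simplest redundant-choice case: a planar 2-link arm with two inverse-kinematics branches. This toy sketch is an assumption for illustration only, not the paper's SPS formulation (whose robot, constraints, and optimization are far richer):

```python
import numpy as np

def two_step_ik(target, l1, l2, q_ref):
    """Step 1: enumerate the joint configurations that reach `target`
    (elbow-up / elbow-down branches of a planar 2-link arm).
    Step 2: pick the branch closest to the anthropomorphic reference q_ref."""
    x, y = target
    c2 = np.clip((x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2), -1.0, 1.0)
    solutions = []
    for s2 in (np.sqrt(1 - c2**2), -np.sqrt(1 - c2**2)):
        q2 = np.arctan2(s2, c2)
        q1 = np.arctan2(y, x) - np.arctan2(l2 * s2, l1 + l2 * c2)
        solutions.append(np.array([q1, q2]))
    return min(solutions, key=lambda q: np.linalg.norm(q - q_ref))

# A reference near the elbow-up "guard" posture selects that branch.
q = two_step_ik((1.0, 1.0), 1.0, 1.0, q_ref=np.array([0.0, 1.0]))
```

The same pattern scales to redundant arms, where step 2 becomes a continuous optimization over the null space rather than a choice between two branches.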

10 citations

Journal ArticleDOI
TL;DR: Wang et al. used dynamic time warping (DTW) to match multiple manipulation data sequences, then calculated the intra-similarity to evaluate consistency among each subject's different manipulations, while the inter-similarity was further analyzed to find skill differences among different subjects.
Abstract: Percutaneous coronary intervention (PCI) has become a popular treatment for coronary artery disease. Highly dexterous skills are necessary for procedural success. However, few effective methods can be applied to PCI skill assessment. In this study, ten interventional cardiologists (four experts and six novices) were recruited. In vivo studies were performed by delivering a medical guidewire into the distal left circumflex artery (target vessel I) and obtuse marginal artery (target vessel II) of a porcine model. Regarded as a type of manipulation data, the guidewire motion is acquired simultaneously with an electromagnetic (EM) sensor attached to the guidewire tail. To address the deficiency of conventional dynamic time warping (DTW), which is limited to two-sequence matching, a novel warping algorithm is proposed to match multiple manipulation data sequences. The intra-similarity is then calculated to evaluate consistencies among each subject's different manipulations, while the inter-similarity is further analyzed to find skill differences among different subjects. Extensive statistical analysis demonstrates that the proposed algorithm can effectively distinguish between manipulations made by subjects of different skill levels, with significant differences on target vessel I ( $P = 3.25\times 10^{-4}$ ) and II ( $P = 7.30\times 10^{-3}$ ). These promising results show the proposed technique's great potential to facilitate skill assessment in clinical practice and skill learning in surgical robotics.
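The classic two-sequence DTW that the paper generalizes aligns two time series by minimizing cumulative pointwise cost over monotone warping paths. A compact reference implementation of that baseline (the paper's multi-sequence extension is not reproduced here):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic two-sequence dynamic time warping with absolute-difference cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)  # cumulative cost matrix
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[n, m]

# A repeated sample warps away at zero cost: these align perfectly.
d = dtw_distance([1, 2, 3], [1, 2, 2, 3])  # → 0.0
```

Intra- and inter-similarity scores can then be built from pairwise DTW distances; extending the alignment beyond two sequences at once is the paper's contribution.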

9 citations

Performance
Metrics
No. of papers from the Journal in previous years
Year    Papers
2023    98
2022    128