Author

Masoud Dorrikhteh

Bio: Masoud Dorrikhteh is an academic researcher from Ajman University of Science and Technology. The author has contributed to research on the topics of augmented reality and fiducial markers. The author has an h-index of 1 and has co-authored 1 publication, receiving 5 citations.

Papers
Proceedings ArticleDOI
01 Apr 2019
TL;DR: This research reviewed and evaluated various fiducial marker systems by developing an Android mobile application for real-time biomechanical measurement and selected AprilTag2 as the best fiducial marker option for this application.
Abstract: Marker-based measurement has been used to assess human body positioning, but human marker tracking has yet to make the transition from the laboratory to personal computing devices, such as smartphones. A novel smartphone-based approach could use a fiducial marker system. Fiducial markers are applicable to augmented reality, robotics, and other applications where a camera-object pose must be estimated and tracked. However, few fiducial systems can be implemented on a mobile phone because of the processing requirements for identifying and tracking the tags in real time. In augmented reality, virtual information is overlaid on the real world to enhance a person's view of the environment; the quality of this illusion therefore depends directly on good registration between the virtual and real worlds. Measurement applications also require accurate and fast registration so that real objects remain aligned with virtual objects in real time. Our research reviewed and evaluated various fiducial marker systems by developing an Android mobile application for real-time biomechanical measurement. A test was designed for two nominated fiducial systems to compare their speed and robustness on the mobile phone. AprilTag2 was selected as the best fiducial marker option for this application.
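For illustration only, a minimal Python sketch of the tag-detection step described above, using the pupil-apriltags bindings. The paper's application is an Android app; the library choice, camera intrinsics, and tag size below are assumptions, not the authors' code or calibration data.

```python
# Minimal AprilTag detection sketch (Python, pupil-apriltags bindings).
# Illustrative only; not the authors' Android implementation.
import cv2
from pupil_apriltags import Detector

detector = Detector(families="tag36h11")  # AprilTag2's default tag family

frame = cv2.imread("frame.png")                     # one camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)      # detector expects grayscale

# Optional pose estimation needs camera intrinsics (fx, fy, cx, cy) and the tag
# size in metres; the values below are placeholders, not calibration from the paper.
detections = detector.detect(
    gray,
    estimate_tag_pose=True,
    camera_params=(600.0, 600.0, 320.0, 240.0),
    tag_size=0.05,
)

for d in detections:
    print(d.tag_id, d.center, d.pose_t.ravel())  # tag id, pixel center, translation (m)
```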

7 citations


Cited by
Proceedings ArticleDOI
01 Jul 2020
TL;DR: A novel Android smartphone augmented-reality-based application was developed using the AprilTag2 fiducial marker system and obtained valid and reliable angle and distance measurements with smartphone positions and cameras that would be expected in practice.
Abstract: Marker tracking for postural and range of motion (ROM) measurements transcends multiple disciplines (e.g., healthcare, ergonomics, engineering). A viable real-time mobile application is currently lacking for measuring limb angles and body posture. To address this need, a novel Android smartphone augmented-reality-based application was developed using the AprilTag2 fiducial marker system. To evaluate the app, two markers were printed on paper and attached to a wall. A Samsung S6 mobile phone was fixed on a tripod, parallel to the wall. The smartphone app tracked and recorded marker orientation and 2D position data in the camera frame, from front and rear cameras, for different smartphone placements. The average error between mobile phone and measured angles was less than 1 degree for all test settings (back camera=0.29°, front camera=0.33°, yaw rotation=0.75°, tilt rotation=0.22°). The average error between mobile phone and measured distance was less than 4 mm for all test settings (back camera=1.8 mm, front camera=2.5 mm, yaw rotation=3 mm, tilt rotation=3.8 mm). Overall, the app obtained valid and reliable angle and distance measurements with smartphone positions and cameras that would be expected in practice. Thus, this app is viable for clinical ROM and posture assessments.
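A sketch of the geometric step this abstract describes: the orientation of the line joining two detected tag centers and the distance between them. The centers, and the millimetres-per-pixel scale used to convert pixel distance, are assumed illustrative values, not data from the study.

```python
# Orientation of the line joining two tag centers and their separation.
# Centers are in pixels; mm_per_pixel is an assumed calibration value.
import math

def line_angle_and_distance(center_a, center_b, mm_per_pixel=0.5):
    """Return (angle in degrees relative to the image x-axis, distance in mm)."""
    dx = center_b[0] - center_a[0]
    dy = center_b[1] - center_a[1]
    angle_deg = math.degrees(math.atan2(dy, dx))
    distance_mm = math.hypot(dx, dy) * mm_per_pixel
    return angle_deg, distance_mm

# Example with two hypothetical tag centers (pixels):
print(line_angle_and_distance((120.0, 340.0), (480.0, 330.0)))
```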

7 citations

Journal ArticleDOI
28 May 2022 - BioMed
TL;DR: A novel Android smartphone augmented-reality-based application was developed and evaluated to enable real-time AprilTag2 marker measurement at the point of patient contact and obtained valid and reliable angle measurements for postural and ROM assessments using the smartphone’s front camera.
Abstract: Human posture and range of motion (ROM) measurements are important health indicators for identifying abnormalities from various disorders (e.g., scoliosis, musculoskeletal disorders, pain syndromes). A viable real-time mobile application for measuring body posture and ROM is currently lacking. To address this need, a novel Android smartphone augmented-reality-based application was developed and evaluated to enable real-time AprilTag2 marker measurement at the point of patient contact (Biomechanical Augmented Reality-Marker, BAR-M). Mobile app performance was evaluated on a body opponent bag (BOB) and 15 healthy participants by comparing smartphone app and Vicon motion analysis output (pelvis, shoulder, arm, torso angles). A Samsung Galaxy smartphone recorded live video, calculated AprilTag orientations and the angle of a line connecting the centers of two tags, and displayed outcomes in real time. For the BOB test, the absolute differences between Vicon and smartphone angles were 0.09° ± 0.05° for the hip, 0.09° ± 0.06° for the shoulder, and 0.69° for arm abduction. For the participant test, the absolute mean angle differences were 1.70° ± 0.23° for the hip, 1.34° ± 0.27° for the shoulder, and 11.18° ± 3.68° for arm abduction. Overall, the app obtained valid and reliable angle measurements for postural and ROM assessments using the smartphone's front camera. Arm abduction results were affected by clothing movement that caused Vicon markers to move differently from AprilTag markers. Thus, with appropriate measurement methods, this real-time smartphone app is a viable tool to facilitate immediate clinical decision making based on human posture and ROM assessments.
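The validation statistic reported above (mean absolute angle difference ± standard deviation between paired Vicon and smartphone readings) can be computed as in the short sketch below. The sample values are invented for illustration; they are not data from the study.

```python
# Mean absolute angle difference ± SD between paired reference (Vicon) and
# smartphone measurements. Sample values are invented, not study data.
import statistics

vicon_deg = [42.1, 43.0, 41.7, 42.6, 42.9]
app_deg   = [42.3, 42.7, 41.9, 42.2, 43.1]

abs_diff = [abs(v - a) for v, a in zip(vicon_deg, app_deg)]
print(f"{statistics.mean(abs_diff):.2f} deg +/- {statistics.stdev(abs_diff):.2f} deg")
```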

3 citations

Proceedings ArticleDOI
01 Oct 2019
TL;DR: The experimental results show that the navigation accuracy of the proposed system with AprilTags2 auxiliary positioning is significantly improved, and the system uses the Robot Operating System (ROS) as a platform to develop AGV navigation functions.
Abstract: To address the problems of poor flexibility, complicated path maintenance, and poor positioning performance in current automated guided vehicle (AGV) guidance technology, this paper designs and implements a new AGV navigation system based on AprilTags2 auxiliary positioning. The system uses the Robot Operating System (ROS) as a platform to develop the AGV navigation functions. The navigation system comprises two parts: the hardware and software layers. Firstly, hardware selection is performed in the hardware layer after considering the actual requirements, performance, cost, and other factors. Simultaneously, the AGV chassis and single-steering-wheel walking mechanism are built to provide a stable and flexible operating platform for the software layer. Secondly, the software layer design includes two parts, namely the ROS navigation planning end and AprilTags2 detection. The ROS navigation planning end comprises four functional modules (i.e., AGV map construction, autonomous positioning, path planning, and path tracking), while the AprilTags2 detection part obtains the AGV's visual positioning pose by placing AprilTags2 markers at each site and detecting them with a Kinect1 camera. A more accurate AGV pose is then obtained by fusing this visual pose with the kinematically estimated pose. Finally, the positioning errors of the system before and after adding AprilTags2 auxiliary positioning are measured and compared. The experimental results show that the navigation accuracy of the proposed system with AprilTags2 auxiliary positioning is significantly improved.
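A minimal sketch of what "fusing the visual pose with the kinematically estimated pose" could look like for a planar pose, assuming a fixed-weight blend. The paper does not specify this scheme (a Kalman-style filter would be a more typical choice in ROS), so weights and values here are purely illustrative.

```python
# Fixed-weight fusion of two planar poses (x, y, theta). Illustrative only;
# not the filtering scheme used in the paper.
import math

def fuse_pose(odom_pose, tag_pose, w_tag=0.7):
    """Blend an odometry pose with a tag-based visual pose."""
    xo, yo, to = odom_pose
    xt, yt, tt = tag_pose
    x = (1 - w_tag) * xo + w_tag * xt
    y = (1 - w_tag) * yo + w_tag * yt
    # Average headings via sin/cos components to avoid angle wrap-around errors.
    s = (1 - w_tag) * math.sin(to) + w_tag * math.sin(tt)
    c = (1 - w_tag) * math.cos(to) + w_tag * math.cos(tt)
    return x, y, math.atan2(s, c)

print(fuse_pose((1.02, 0.48, 0.10), (1.00, 0.50, 0.12)))
```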

1 citation

Proceedings ArticleDOI
01 Nov 2019
TL;DR: A new picture-based localization service, PicPose, is presented that extracts feature points from a camera-captured image and matches them with the original wall picture to compute the pose, allowing even partially visible pictures to be used for localization, which is impossible for ArPico and ArUco.
Abstract: Device self-localization is an important capability for many IoT applications that require mobility in service capabilities. In our previous work, we designed the ArPico method for robot indoor localization. By placing and recognizing pre-installed pictures on walls, robots can use low-cost cameras to identify their positions by referencing the pictures' precise locations. However, with ArPico, all pictures must have clear rectangular borders for the pose computation, and some real-world pictures do not have clear, thick borders. Moreover, some pictures may have odd shapes or be only partially visible. To address these problems, a new picture-based localization service, PicPose, is presented. PicPose extracts feature points from a camera-captured image and matches them with the original wall picture to compute the pose. Using PicPose, even partially visible pictures can be used for localization, which is impossible for ArPico and ArUco. We present our implementation and experimental results in this paper.
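A generic OpenCV sketch of the pipeline the abstract describes (feature extraction, matching against the original picture, homography estimation). This is standard library usage shown for orientation, not the authors' PicPose code; file names are placeholders.

```python
# Feature matching between a camera frame and the original wall picture,
# followed by RANSAC homography estimation. Generic OpenCV usage, not PicPose itself.
import cv2
import numpy as np

reference = cv2.imread("wall_picture.png", cv2.IMREAD_GRAYSCALE)  # original picture
captured  = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)  # camera view

orb = cv2.ORB_create(nfeatures=1000)
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_cap, des_cap = orb.detectAndCompute(captured, None)

matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_ref, des_cap)
matches = sorted(matches, key=lambda m: m.distance)[:200]  # keep the best matches

src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_cap[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC rejects outlier matches; H maps reference-picture points into the frame,
# which is what lets a partially visible picture still yield a pose.
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print(H)
```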

1 citation

Journal ArticleDOI
TL;DR: An autonomous moving robot that can localize itself using its on-board camera and the PicPose technology is built, and the results show that the localization methods are practical, have very good accuracy, and can be used for real-time robot navigation.
Abstract: Localization is an important technology for smart services such as autonomous surveillance, disinfection, or delivery robots in future distributed indoor IoT applications. Visual-based localization (VBL) is a promising self-localization approach that identifies a robot's location in an indoor or underground 3D space by using its camera to scan and match the robot's surrounding objects and scenes. In this study, we present a pictorial planar surface based 3D object localization framework. We have designed two object detection methods for localization, ArPico and PicPose. ArPico detects and recognizes framed pictures by converting them into binary marker codes for matching with known codes in the library. It then uses the corner points on a picture's border to identify the camera's pose in 3D space. PicPose detects the pictorial planar surface of an object in a camera view and produces the pose output by matching the feature points in the view with those in the original picture and computing the homography that maps the object to its actual location in the 3D real-world map. We have built an autonomous moving robot that can localize itself using its on-board camera and the PicPose technology. The experimental study shows that our localization methods are practical, have very good accuracy, and can be used for real-time robot navigation.
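The ArPico step described above (camera pose from the corner points of a framed picture of known physical size) corresponds to a standard planar PnP problem; a hedged OpenCV sketch follows. The picture size, camera intrinsics, and pixel corners are placeholders, not values from the paper.

```python
# Camera pose from the four border corners of a picture of known size (planar PnP).
# All numeric values are placeholders; this is not the authors' ArPico code.
import cv2
import numpy as np

W, H = 0.40, 0.30  # assumed picture width and height in metres

# 3D corners of the picture in its own plane (z = 0), ordered to match image_corners.
object_corners = np.array(
    [[0, 0, 0], [W, 0, 0], [W, H, 0], [0, H, 0]], dtype=np.float64
)
# Detected corner pixels of the picture border in the camera frame (illustrative).
image_corners = np.array(
    [[210, 150], [430, 160], [425, 330], [205, 320]], dtype=np.float64
)

camera_matrix = np.array(
    [[600.0, 0.0, 320.0], [0.0, 600.0, 240.0], [0.0, 0.0, 1.0]]
)
dist_coeffs = np.zeros(5)  # assume an undistorted image

ok, rvec, tvec = cv2.solvePnP(object_corners, image_corners, camera_matrix, dist_coeffs)
print(ok, rvec.ravel(), tvec.ravel())  # camera pose relative to the picture
```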

1 citation