A cooperative mobile robot and manipulator system (Co-MRMS) for transport and lay-up of fibre plies in modern composite material manufacture
TL;DR: In the proposed system, marker-based and Fourier-transform-based machine vision approaches are used to achieve high accuracy in localisation and fibre orientation detection, respectively, and a particle-based approach is adopted to model material deformation during manipulation within robotic simulations.
Abstract: Composite materials are widely used in industry due to their light weight and high specific strength. Currently, composite manufacturing mainly relies on manual labour and individual skills, especially in transport and lay-up processes, which are time consuming and prone to errors. As part of a preliminary investigation into the feasibility of deploying autonomous robotics for composite manufacturing, this paper presents a case study that investigates a cooperative mobile robot and manipulator system (Co-MRMS) for material transport and composite lay-up, which mainly comprises a mobile robot, a fixed-base manipulator and a machine vision sub-system. In the proposed system, marker-based and Fourier-transform-based machine vision approaches are used to achieve high accuracy in localisation and fibre orientation detection, respectively. Moreover, a particle-based approach is adopted to model material deformation during manipulation within robotic simulations. As a case study, a vacuum suction-based end-effector model is developed to deal with sagging effects and to quickly evaluate different gripper designs comprising an array of multiple suction cups. Comprehensive simulations and physical experiments, conducted with a 6-DOF serial manipulator and a two-wheeled differential drive mobile robot, demonstrate the efficient interaction and high performance of the Co-MRMS for autonomous material transportation, material localisation, fibre orientation detection and grasping of deformable material. Additionally, the experimental results verify that the presented machine vision approach achieves high accuracy in localisation (the root mean square error is 4.04 mm) and fibre orientation detection (the root mean square error is 1.84°) and enables dealing with uncertainties such as the shape and size of fibre plies.
Summary (3 min read)
- Due to their attractive properties and high strength-to-weight ratio, the use of composite materials has risen considerably over the last decades [1, 2].
- A previously presented method used a fibre reflection model to measure fibre orientation from an image and achieved good accuracy and robustness for different types of surfaces.
- Compared to previous works, this research addresses specific challenges that arise from the introduction of different robots that must be coordinated along with the complex set of tasks covering transport, detection, grasping and placement of deformable material for composite manufacturing applications.
- Another issue in the automated handling of composite material is end-effector design.
- To date, a number of grippers, such as grid grippers and suction-cup grippers, have been designed.
2 The proposed system and approach
2.1 Framework of the cooperative mobile robot and manipulator system (Co-MRMS)
- From a hardware perspective, the proposed Co-MRMS involves four components: a mobile robot, a fixed-base robotic manipulator, a vision system and a host PC.
- Aided by the vision system, the estimated position and orientation of the raw material are sent to the fixed-base robot manipulator via the host PC.
- The method presented above for modelling deformable objects in CoppeliaSim enables the stiffness of the overall material to be adjusted by tweaking two types of model parameters: the principal moments of inertia and the dimensions of the individual primitive cuboids.
- The modified suction-cup gripper with four suction cups provides a useful simulation component for quickly evaluating different gripper designs comprising an arrangement of multiple suction cups.
- This provides the Co-MRMS with a higher-accuracy estimate of the position of the fibre material, one that does not accumulate error over time.
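The particle-based deformation modelling described above can be illustrated, outside CoppeliaSim, with a minimal position-based particle sketch: a chain of particles with distance constraints and pinned ends sags under gravity, which is the same qualitative behaviour (sagging of a grasped ply) the linked-cuboid model captures. All numbers, the damping factor and the function name are illustrative choices, not the paper's parameters.

```python
import numpy as np

# Minimal sketch of particle-based deformable-material simulation:
# a chain of particles under gravity, integrated with damped Verlet
# steps and iteratively enforced distance constraints, with both ends
# pinned (e.g. two grasp points). Not the paper's CoppeliaSim model.
def simulate_chain(n=11, length=1.0, span=0.8, steps=200, dt=0.01, g=9.81):
    rest = length / (n - 1)                      # rest length per segment
    pos = np.stack([np.linspace(0.0, span, n), np.zeros(n)], axis=1)
    prev = pos.copy()
    for _ in range(steps):
        # Damped Verlet integration with gravity acting on y
        new = pos + 0.99 * (pos - prev)
        new[:, 1] -= g * dt * dt
        prev, pos = pos, new
        # Project positions onto the distance constraints, then re-pin ends
        for _ in range(10):
            for i in range(n - 1):
                d = pos[i + 1] - pos[i]
                dist = np.linalg.norm(d)
                corr = 0.5 * (dist - rest) / dist * d
                pos[i] += corr
                pos[i + 1] -= corr
            pos[0] = (0.0, 0.0)
            pos[-1] = (span, 0.0)
    return pos
```

Because the chain (length 1.0) is longer than the span between the pinned ends (0.8), the midpoint settles well below the grasp points, reproducing the sagging effect the suction-cup array is designed to mitigate.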
2.3.1 Localisation approach
- As shown in Fig. 4, this work uses a single ArUco vision marker for material localisation.
- The Suzuki algorithm is then used to extract the contours, which are approximated as polygons by the Douglas-Peucker algorithm.
- Cells lying on the marker border carry a value of 0, while all inner cells are analysed to obtain the internal encoding, which corresponds to a 6×6 internal grid area.
- To improve the accuracy of the marker detection, the corners of the marker are refined through subpixel interpolation.
- Using this approach, the position of the material can be determined robustly regardless of the size and shape of the material.
2.3.2 Fibre orientation detection approach
- This is because the material is anisotropic, meaning it provides varying strength along different directions.
- To ensure that the plies are layered as designed, strict requirements are imposed on the orientation of each layer of fibres to obtain the expected composite parts.
- The Fourier transform is applied for fibre orientation analysis: the image is converted into the frequency domain to obtain its spatial frequency components.
- Curve fitting is then achieved using the least-squares line fitting method.
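The frequency-domain idea above can be sketched as follows: parallel fibres produce a band of high-magnitude frequency components perpendicular to the fibre direction, and a line fitted through those components recovers the angle. This sketch uses a total-least-squares fit (principal eigenvector of the scatter matrix) as a stand-in for the paper's least-squares line fit; the threshold and all names are illustrative assumptions.

```python
import numpy as np

# Sketch of Fourier-transform-based fibre orientation detection.
# Returns an angle in degrees in [0, 180), in image (row-down) coordinates.
def fibre_orientation_deg(gray):
    f = np.fft.fftshift(np.fft.fft2(gray - gray.mean()))
    mag = np.abs(f)
    h, w = mag.shape
    mag[h // 2, w // 2] = 0.0                    # suppress the DC term
    # Keep only the strongest spatial-frequency components
    ys, xs = np.nonzero(mag > 0.2 * mag.max())
    u, v = xs - w / 2.0, ys - h / 2.0            # centre on DC
    # Fit the dominant direction of the band: principal eigenvector
    # of the 2x2 scatter matrix of the selected frequency points
    cov = np.cov(np.vstack([u, v]))
    eigvals, eigvecs = np.linalg.eigh(cov)
    du, dv = eigvecs[:, np.argmax(eigvals)]
    band_angle = np.degrees(np.arctan2(dv, du))
    # Fibres run perpendicular to the frequency-domain band
    return (band_angle + 90.0) % 180.0
```

For a synthetic image of stripes at 45°, the function recovers 45° to within a degree; real prepreg images would add noise that the thresholding step is meant to reject.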
3 Experimental Setup
- The composite material used here is a small sheet of fabric prepreg.
- Localisation of the material and fibre orientation detection are achieved using a spotlight, mounted together with the camera, to produce strong reflections from the fibres of the material.
- In addition, the deformable object was modelled in CoppeliaSim by leveraging its support for the simulation of dynamic behaviours, which is achieved through the Bullet 2.78 physics engine.
- For the physical implementation, the ITRA toolbox, developed for the control of KUKA robots, provided the interface for directly sending actuation commands from Matlab to the KUKA robot controller unit for manipulator control, while ROS provided the interface for the actuation of the Turtlebot3 Burger.
- The vision system relied upon images captured by a webcam mounted on the end-effector of the manipulator to observe the environment.
4 Performance Evaluation
- To validate the developed system, several experiments were conducted to test the capabilities of the Co-MRMS.
- Initially, simulation-based experiments were carried out according to the proposed approaches in Section 2 and the accuracy of the vision system was assessed.
- The Co-MRMS, which employs a KUKA KR90 R3100 industrial fixed-base manipulator and a Turtlebot3 Burger differential drive mobile robot, was first modelled in CoppeliaSim to verify the performance of the proposed system in fulfilling the transportation and lay-up task.
- With the material position data obtained from the machine vision system and wheel odometry, the localisation accuracy could be evaluated through the Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE).
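The two error metrics used above are standard; for clarity, a minimal sketch of how they would be computed from paired estimates and ground-truth values (function and variable names are illustrative):

```python
import numpy as np

# MAE and RMSE between estimated and ground-truth positions.
# MAE  = mean(|error|);  RMSE = sqrt(mean(error^2)).
def mae_rmse(estimates, ground_truth):
    err = np.asarray(estimates, dtype=float) - np.asarray(ground_truth, dtype=float)
    mae = float(np.mean(np.abs(err)))
    rmse = float(np.sqrt(np.mean(err ** 2)))
    return mae, rmse
```

RMSE penalises large outliers more heavily than MAE, which is why both are typically reported together when characterising localisation accuracy.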
- In the second experiment, the fibre orientation detection algorithm was evaluated by comparing the output of the algorithm against the ground truth.
4.2.1 System interaction behaviour evaluation
- The cooperative system interaction behaviour was evaluated through physical experiments, from which a set of execution routines consisting of five active phases and two idle phases was obtained.
- The duration of this phase varies according to the start point, goal point and the subsequent path to move between these two points.
- After a brief pause where all systems remain idle to indicate that the mobile robot has reached its destination, the host PC sends the wheel odometry estimation of the mobile robot’s position as a target command to drive the manipulator towards the approximate location of the material (phase 2).
- This facilitates placing the material in a controlled orientation during grasping operations by ensuring that the fibre direction is always aligned with the rotation of the end-effector about its z-axis.
- This experiment demonstrated the capability of the integrated system to correct any manipulator positional offset error that arises from wheel slippage of the mobile robot through higher accuracy estimation provided by machine vision.
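The fibre-alignment step described above (keeping the fibre direction aligned with the end-effector's z-axis rotation) amounts to a small angle computation. A hypothetical sketch, assuming the detected fibre angle and the current end-effector yaw are expressed in the same frame in degrees; the function name and frame convention are assumptions, not the paper's interface:

```python
# Compute the z-axis rotation (degrees) that aligns the end-effector
# with the detected fibre direction. Fibre orientation is only defined
# modulo 180 degrees, so the correction is wrapped into (-90, 90] to
# take the shortest rotation.
def yaw_correction_deg(fibre_angle_deg, current_yaw_deg):
    delta = (fibre_angle_deg - current_yaw_deg) % 180.0
    if delta > 90.0:
        delta -= 180.0
    return delta
```

The modulo-180 wrap matters in practice: a fibre detected at 170° should be reached by rotating −10°, not +170°.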
4.2.2 Machine vision system accuracy evaluation
- To measure the accuracy of the machine vision algorithms in the real world, two additional experiments were conducted.
- The first experiment was used to quantify the errors in the measured position of the mobile robot using the vision-based localisation algorithm and wheel odometry.
- This experiment was conducted 20 times for statistical significance.
- A sample piece of composite material was placed in a fixed position in the workspace of the manipulator while the end-effector was positioned directly above the centre of the material with their rotation axes aligned at 0°.
- The investigation shows that the closer the true fibre orientation is to 0°, the higher the accuracy in fibre orientation detection.
- This section discusses the obtained experimental results.
- Firstly, it should be noted that the simulated trials incorporating the manipulation actions for grasping the material have so far not been replicated in the physical trials due to the lack of vacuum suction hardware.
- Instead, the educational mobile robot platform Turtlebot 3 Burger was adopted for the investigations conducted.
- Thus, additional development work is necessary to implement the proposed system framework onto an industrial standard set of hardware to validate the proposed system.
- A cooperative mobile robot and manipulator system (Co-MRMS), which comprises a fixed-base manipulator, an autonomous mobile robot and a machine vision sub-system, was developed as a promising strategy for autonomous material transfer and handling tasks to advance composite manufacturing.
- To demonstrate the feasibility and effectiveness of the proposed Co-MRMS, comprehensive simulations and physical experiments have been conducted.
- In conclusion, by exploiting the availability of wheel odometry and integrating this with machine vision algorithms within the proposed Co-MRMS, it is possible to implement a flexible system that provides autonomous material transportation and sufficiently-accurate material handling capabilities that extend beyond what is currently adopted in the industry.
- This research was funded by the Route to Impact Program 2019–2020 [grant no.: AFRC CATP 1469 R2I-Academy] and supported by the Advanced Forming Research Centre (University of Strathclyde), Lightweight Manufacturing Centre (University of Strathclyde) and Control Robotics Intelligence Group (Nanyang Technological University, Singapore).
- The datasets generated during and/or analysed during the current study are available from the corresponding author on reasonable request.
Frequently Asked Questions (2)
Q1. What are the contributions mentioned in the paper "A cooperative mobile robot and manipulator system (co-mrms) for transport and lay-up of fibre plies in modern composite material manufacture" ?
As part of a preliminary investigation into the feasibility of deploying autonomous robotics for composite manufacturing, this paper investigates a cooperative mobile robot and manipulator system (Co-MRMS) for material transport and composite lay-up, which mainly comprises a mobile robot, a fixed-base manipulator and a machine vision sub-system. As a case study, a vacuum suction-based end-effector model is developed to deal with sagging effects and to quickly evaluate different gripper designs comprising an array of multiple suction cups.
Q2. What are the future works in "A cooperative mobile robot and manipulator system (co-mrms) for transport and lay-up of fibre plies in modern composite material manufacture" ?
Future work will focus on validating the proposed system on industrial-standard platforms and improving the system, e.g. by integrating a vacuum gripper, quantifying system efficiency, extending the work to multiple plies and developing a method for draping correction. In conclusion, by exploiting the availability of wheel odometry and integrating this with machine vision algorithms within the proposed Co-MRMS, it is possible to implement a flexible system that provides autonomous material transportation and sufficiently-accurate material handling capabilities that extend beyond what is currently adopted in the industry.