
Symbiotic human-robot collaborative assembly

TL;DR: An overview of symbiotic human-robot collaborative assembly is provided and future research directions for voice processing, gesture recognition, haptic interaction, and brainwave perception are highlighted.
About: This article was published in CIRP Annals on 2019-01-01 and is currently open access. It has received 273 citations to date. The article focuses on the topics: Robot & Workspace.

Summary (10 min read)

Jump to: [1. Introduction][2.1. Classification of human‐robot relationships][2.2. Definition of human‐robot collaboration][2.3. Symbiotic human‐robot collaboration][2.4. Characteristics of HRC assembly][3.1. Sensor][3.1.1. Contact‐based sensing][3.1.2. Contact‐less sensing][3.2. Smart sensor network and sensor data fusion][3.2.1. Localisation, mapping and tracking in HRC][3.2.2. Human‐robot collaborative assembly][4.1. Safety standards and systems][4.2. HR collision detection][4.3. Active collision avoidance][5.1. Context awareness and resource monitoring][5.2.1. Task planning and scheduling][5.2.2. Robot motion planning][5.3. On‐demand job dispatching][6.1. Multimodal programming][6.2. Smart algorithms embedding][6.3. Programming‐free robot control][6.4. Brainwave‐driven robot control][7.1. Mobile worker tracking and identification][7.2. AR‐based in‐situ decision support][7.3.1. Aspects of ergonomics][7.3.2. Psychological challenges in human‐robot collaboration][8.1. Towards new industrial standards of HRC][8.2. Modelling the human worker][8.3. Digital twin for symbiotic HRC][8.4. Optimising and adapting plans of redundant robots][8.5. Shared tasks and team formation][8.6. Handling of exceptions, emergency and recovery][8.8. Adaptive work instructions][8.9. Programming‐free robot control][8.10. HRC as social activity with trust][8.11. Social responsibility of a new type of automation][8.12. Limitations and challenges] and [9. Conclusions]

1. Introduction

  • Human‐robot collaboration (HRC) in a manufacturing context aims to realise an environment where humans can work side by side with robots in close proximity.
  • Using HRC, higher overall productivity and better product quality can be achieved.
  • Varying approaches to facilitating multimodal communication, dynamic assembly planning and task assignment, adaptive robot control, and in-situ support to operators have been reported in the literature.
  • Nevertheless, confusions exist in the relationships between robots and humans: coexistence, interaction, cooperation, and collaboration.
  • A systematic review and analysis on this very subject is needed, which is the motivation and objective of this keynote paper.

2.1. Classification of human‐robot relationships

  • Schmidtler et al. [220] analysed a human-robot cell in terms of working time, workspace, aim and contact.
  • Interaction happens if a human and a robot sharing the same workspace are communicating with each other.
  • Both the human and the robot can work on the same task but complete the task step by step in a sequential order.
  • It typically requires a coordinated, synchronous activity from all parties [177], where physical contact is also allowed.

2.2. Definition of human‐robot collaboration

  • In the context of production and according to the standard terminology, HRC is a ‘state in which a purposely designed robot system and an operator work on simultaneous tasks within a collaborative workspace’, i.e., where the robot system and a human can perform tasks concurrently or even jointly [84].
  • HRC is also in demand in distributed manufacturing work environments and systems due to the limitation of automation and the maturation of agent technologies [170].
  • Due to its specific constraints, industrial production usually occupies a subset of possibilities.
  • Agent autonomy and closely related leader–follower relationships express how much of robot action is directly determined by human agents, and vice versa.
  • Task execution scenarios can be partitioned along the autonomy of participating agents (see Fig. 2).

2.3. Symbiotic human‐robot collaboration

  • Symbiotic cognitive computing takes place when human and machine agents co-exist in a physical space to interact with each other so as to solve hard tasks requiring large amounts of data along with significant mental and computational effort.
  • The main emphasis is on directly interacting with data as easily, directly, and naturally as possible.
  • By now, state-of-the-art sensor and communication technology, together with almost unlimited data storage and computing power for data analytics and reasoning, has made this vision a reality [57].
  • The agents apply at least partially shared representations of the environment they are operating in, which is the prerequisite for aligning their goals, roles, plans and activities.
  • All in all, a symbiotic HRC system possesses the skills and ability of perception, processing, reasoning, decision making, adaptive execution, mutual support and self-learning through real-time multimodal communication for context-aware human-robot collaboration.

2.4. Characteristics of HRC assembly

  • During HRC assembly, objects are arranged in space by actions in time so that products specified by design can be realised.
  • The space is densely populated not only by parts of the product but also by the applied technological resources and humans, whereas key objectives require the execution of actions within as short a timeframe as possible.
  • Objects and actions involved in HRC assembly are strongly related and constrain each other in many ways, due to technology, product structure, and geometry [100,101].
  • In assembly, a workplace design that allows efficient and dynamic human-robot task allocation is characteristic of safe, ergonomic and symbiotic HRC assembly [163].
  • This characteristic of human-robot task assignment was investigated and modelled as a search problem [233].

3.1. Sensor

  • The need for improving the effectiveness and efficiency as well as reducing the safety risks in HRC has led to increased interest in sensor-related research and development for HRC.
  • Sensors deployed in the HRC environment can be categorised into two families: contact‐based and contact‐less.

3.1.1. Contact‐based sensing

  • The main application of physical contact-based sensing is human gesture recognition using wearable sensors (e.g., gloves), as opposed to camera-based gesture recognition methods.
  • Thin, elastic materials, on the other hand, can undergo a wide range of reversible deformation and therefore have become the leading candidate in fabricating wearable sensors that are both reliable and comfortable.
  • The sensors proposed by Hong et al. [79] have demonstrated the capability of accurately recognising wrist movement such as stretching and compressing.
  • Being able to sense pressure/force, hardness and texture at the points of contact is essential both for effective/delicate robot reaction when normal contact occurs and for compliance to safety requirements during incidental contact.
  • Unlike the sensor by Cirillo et al. [43], this sensor is self-powered and can not only detect tiny pressure/force but also distinguish the hardness of the contact material by quantifying the shape change at the current peak.

3.1.2. Contact‐less sensing

  • Contact-less sensing, such as a laser, radar or vision system, helps reconstruct the geometric information of the surroundings in an HRC environment, guiding the robots to move around the workspace avoiding obstacles and work collaboratively with humans by identifying and locating the working parts.
  • In recent years, development of computer vision techniques has enabled context-aware interpretation of the environment, allowing robots to acquire complex skills.
  • In passive sensing, the measurement system does not illuminate the target; instead, the light from the target is either reflected ambient light or the light produced by the target itself.
  • It requires the reference point on the target to be captured by multiple cameras from different projection angles, and the views to be fused to reconstruct the point in 3D space through suitable transformations.
  • Common active sensing techniques include structured light, time of flight (ToF) and triangulation, which can natively capture the depth information that has to be inferred in passive techniques [197].
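The multi-camera reconstruction above rests on the standard triangulation relation for a rectified stereo pair, Z = f·B/d. A minimal sketch (all numbers below are hypothetical, not from the paper):

```python
# Illustrative sketch: recovering depth by triangulation from a rectified
# stereo pair. Focal length, baseline and disparity are hypothetical values.

def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth of a point seen in both cameras of a rectified stereo rig.

    focal_px     -- focal length in pixels
    baseline_m   -- distance between the camera centres in metres
    disparity_px -- horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("point must be visible with positive disparity")
    return focal_px * baseline_m / disparity_px

# A point shifted by 40 px, seen with a 600 px focal length and a 10 cm
# baseline, lies 1.5 m from the rig; smaller disparities mean larger depth.
depth = stereo_depth(600.0, 0.10, 40.0)
```

Active techniques such as structured light and ToF capture this depth natively, whereas a passive rig must first solve the correspondence problem to obtain the disparity.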

3.2. Smart sensor network and sensor data fusion

  • Different sensing techniques provide different aspects for varying interests [130].
  • Fig. 8 summarises the sensing techniques for human hand gesture recognition.
  • This section provides an overview of the techniques and a wide range of applications of sensor data fusion and integration reported for HRC, where measurement accuracy and robustness are improved and complex assembly procedures are coordinated.

3.2.1. Localisation, mapping and tracking in HRC

  • For robots to be truly interactive, they must have the ability to navigate the physical world autonomously to assist human operators.
  • Probabilistic data fusion methods are generally based on Bayes’ rule for combining the prior and the observed information.
  • The performance of different combinations of sensors was evaluated.
  • Specifically, three cascaded EKFs have been used to estimate the joint angles by the fusion of the outputs of tri-axial gyroscopes and accelerometers.
  • This method is marker-less and gives complementary information about the tracked body, enabling not only tracking of depth motions but also turning movements.
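As a simplified stand-in for the cascaded-EKF fusion described above, a complementary filter illustrates why fusing gyroscopes with accelerometers pays off: the integrated gyro rate is smooth but drifts, the accelerometer angle is noisy but drift-free, and blending the two bounds the error. The blend weight is a hypothetical tuning parameter, not a value from the cited work:

```python
# Simplified stand-in (not the paper's cascaded EKF): a complementary filter
# blending an integrated gyroscope rate with an accelerometer tilt angle.

ALPHA = 0.98  # hypothetical trust placed in the gyro path

def fuse_step(angle_prev, gyro_rate, accel_angle, dt):
    """One fusion update of the estimated joint angle (radians)."""
    gyro_angle = angle_prev + gyro_rate * dt               # propagate with the gyro
    return ALPHA * gyro_angle + (1 - ALPHA) * accel_angle  # correct the drift

# A stationary joint at 0.5 rad, observed for 10 s: a biased gyro
# (0.01 rad/s) alone would drift to 0.6 rad, but the accelerometer
# correction keeps the estimate pinned near the true angle.
angle = 0.5
for _ in range(1000):
    angle = fuse_step(angle, gyro_rate=0.01, accel_angle=0.5, dt=0.01)
```

An EKF generalises this idea by weighting the two sources with their estimated covariances instead of a fixed ALPHA.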

3.2.2. Human‐robot collaborative assembly

  • The complementary nature of different sensing modalities, such as vision, voice and pressure/force, motivates the consideration of synergistically integrating them for improved effectiveness and efficiency in the advancement of symbiotic HRC assembly towards human-like capabilities.
  • On the other hand, the perception of pressure and force enables compliance to the local constraints, required by specific tasks.
  • One application of sensor integration is the screwing task proposed by Shauri et al. [221], where the trajectory of the robot arm is controlled based on the measurement from the vision system and the robot hand configuration is adjusted based on the pressure/force data.
  • De Gea Fernández et al. [62] extended sensor data integration from IMU, RGB-D (red, green, blue, depth) camera and laser scanner to robot whole-body control.
  • The results indicate that the approach can significantly enhance the operators’ integration in an HRC assembly system.

4.1. Safety standards and systems

  • Fig. 10 summarises the causes of potential accidents in HRC into three categories: (1) engineering failures, (2) human errors, and (3) poor environmental conditions [33,239].
  • All of these failures can lead to incorrect response by both the robot and human operator.
  • Relevant standards include EN 62061, Safety of machinery – Functional safety of safety-related electrical, electronic and programmable electronic control systems; human–robot interaction (HRI) has also been classified into four levels.
  • The level of danger was estimated based on factors influencing the impact force during a human-robot collision, such as the effective robot inertia, the relative velocity and the distance between the robot and the human.
  • Based on the control method, Heinzmann and Zelinsky [77] described the formulation and implementation of a control strategy for robot manipulators which provides quantitative safety guarantees for the user of assistive robots.
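One illustrative way to combine the factors named above (effective inertia, relative velocity, separation distance) into a scalar danger estimate; the functional form and constants here are hypothetical, not taken from the cited work:

```python
# Hypothetical danger index: grows with effective robot inertia and relative
# velocity, shrinks with separation distance. Purely illustrative.

def danger_index(inertia_kgm2, rel_velocity_ms, distance_m, d_min=0.05):
    """Scalar danger estimate for a human-robot pair.

    d_min clamps the distance so the index stays finite at contact.
    """
    return inertia_kgm2 * rel_velocity_ms / max(distance_m, d_min)

# A 2 kg·m² effective inertia approaching at 1 m/s from 0.5 m away.
risk = danger_index(2.0, 1.0, 0.5)
```

A controller could compare such an index against a threshold to trigger slow-down or stop, which is the spirit of the impact-force-based estimation described above.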

4.2. HR collision detection

  • It is crucial to detect any collision between a robot and a human operator before any severe accident occurs.
  • The Institute of Robotics and Mechatronics of the German Aerospace Center (DLR) developed a light-weight robot based on an integrated torque-controlled mechanism [26,137].
  • Geravand et al. [64] developed a closed control architecture to detect the collision based on the outer joint velocity reference to the robot manufacturer’s controller, together with the available measurements of motor currents and joint positions.
  • In 2002, Ebert and Henrich [55] presented a collision-detection method based on images taken from several stationary cameras in a work cell.
  • Flacco et al. [59] developed a fast method to evaluate distances between the robot and possibly moving obstacles (including humans), based on the depth data.
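In the spirit of such depth-based monitoring, the core computation is the minimum separation between robot control points and the observed obstacle point cloud. A minimal sketch with hypothetical coordinates and a hypothetical protective threshold:

```python
import math

# Illustrative distance monitoring: the minimum Euclidean distance between a
# set of robot control points and an obstacle point cloud (e.g. a human,
# segmented from depth data) is compared against a protective threshold.

def min_separation(robot_points, obstacle_points):
    """Smallest distance between any robot point and any obstacle point."""
    return min(
        math.dist(r, o) for r in robot_points for o in obstacle_points
    )

def collision_risk(robot_points, obstacle_points, threshold_m=0.3):
    """True if any obstacle is within the (hypothetical) protective distance."""
    return min_separation(robot_points, obstacle_points) < threshold_m

# Two control points along a vertical link, one obstacle point 1 m away.
robot = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.0)]
safe = not collision_risk(robot, [(1.0, 0.0, 0.0)])
```

Real systems replace the brute-force loop with spatial indexing or direct operations in depth space for real-time rates; the comparison logic stays the same.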

4.3. Active collision avoidance

  • An HRC environment requires the coexistence of both humans and robots.
  • The consistent safety of humans in such an environment is paramount, covering both passive collision detection and active collision avoidance, by monitoring human movements and controlling the robots, respectively, to achieve human safety at all times [218].
  • Several recent approaches for HRC have also been reported.
  • Augustsson et al. [11,12] presented an approach to transferring data to the robot communicating the human’s position and movements, forcing the robot to respond to the triggers, and visualising the information about the settings and assembly order to the human.
  • The approach is able to dynamically and visually detect any safety interruption.

5.1. Context awareness and resource monitoring

  • Assembly tasks shared by humans and robots in HRC assembly are dynamic in nature [253].
  • This requires constant resource monitoring for better context awareness.
  • It facilitates reasoning and execution of those contexts for providing context-aware services.
  • Since the operator’s (work-related) motions are limited and repetitive, an assembly task can be modelled as a sequence of human motions.
  • In parallel, Liu and Wang [133] modelled the recognised human motions in a Hidden Markov model (HMM).
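The HMM idea above can be sketched concretely: score an observed motion sequence against a model whose hidden states are motion primitives. All states, observations and probabilities below are hypothetical, chosen only to illustrate the forward algorithm:

```python
# Minimal sketch (hypothetical numbers) of scoring an observed motion
# sequence against a Hidden Markov model, as used to recognise assembly
# tasks from sequences of human motions.

STATES = ["reach", "grasp"]
INIT  = {"reach": 0.8, "grasp": 0.2}
TRANS = {"reach": {"reach": 0.6, "grasp": 0.4},
         "grasp": {"reach": 0.3, "grasp": 0.7}}
EMIT  = {"reach": {"move": 0.7, "hold": 0.3},
         "grasp": {"move": 0.2, "hold": 0.8}}

def sequence_likelihood(observations):
    """Forward algorithm: P(observations | model)."""
    alpha = {s: INIT[s] * EMIT[s][observations[0]] for s in STATES}
    for obs in observations[1:]:
        alpha = {s: EMIT[s][obs] * sum(alpha[p] * TRANS[p][s] for p in STATES)
                 for s in STATES}
    return sum(alpha.values())
```

Recognition then amounts to training one such model per assembly task and assigning a new motion sequence to the model with the highest likelihood.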

5.2.1. Task planning and scheduling

  • In HRC assembly like ROBO-PARTNER [162], the focus is given to combining robot strength, velocity, predictability, repeatability and precision with human intelligence and skills to achieve a hybrid solution that facilitates the safe cooperation of operators with adaptive robotic systems.
  • The aim of task planning and scheduling is to allocate and dispatch the tasks to be performed and required by the assembly process to the available resources (e.g., workers, machines and robots), so that the assembly operations are optimised according to a given criterion (e.g., time and energy consumption [167]).
  • In order to cope with the presence of a human in the loop, the task plan is generally constructed at an abstract, high and discrete level and continuously evaluated to decide how and when to execute a planned task, considering temporal/causal constraints, spatial/geometric constraints and controllable/uncontrollable activities.
  • Each timeline is characterised by a sequence of states, i.e., temporally extended predicates consisting of a proposition and a list of parameters, i.e., start, end, and duration times.
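The timeline representation described above maps directly onto a small data structure; the propositions and times below are hypothetical examples:

```python
from dataclasses import dataclass

# Illustrative encoding of a timeline state: a temporally extended predicate
# with a proposition, its parameters, and start/end times.

@dataclass
class TimelineState:
    proposition: str
    parameters: tuple
    start: float
    end: float

    @property
    def duration(self):
        return self.end - self.start

# A robot timeline: pick part P1, then place it, in strict sequence.
timeline = [
    TimelineState("pick",  ("P1",), start=0.0, end=4.0),
    TimelineState("place", ("P1",), start=4.0, end=9.0),
]

# A temporal-consistency check: consecutive states must not overlap.
assert all(a.end <= b.start for a, b in zip(timeline, timeline[1:]))
```

A scheduler continuously re-evaluates such timelines, shifting start/end times within the temporal constraints as uncontrollable (human) activities unfold.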

5.2.2. Robot motion planning

  • In the last decade, a large part of the literature has focused on how to generate robot trajectories for human-robot collaborative tasks.
  • Among the online planning methodologies, Lasota et al. [120] proposed the use of a Markov Decision Process (MDP), where human actions and the process of human decision making are modelled as a stochastic transition function influencing real-time robot actions and states.
  • Similarly, McGhan et al. [156] adopted an MDP to model unsynchronised human-robot collaborative but independent tasks (i.e., the human and the robot have to work on different pre-allocated tasks sharing the same workspace).
  • On one hand, online generated trajectories are more flexible than offline generated trajectories and able to easily take into account variations in the environment and in the human behaviour.
  • Moreover, the estimation of the robot execution time represents a critical issue currently limiting the applicability of these techniques in real industrial contexts for which the task time is a relevant constraint.
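A toy illustration of the MDP framing used in these works (the model, rewards and states below are hypothetical, not from the cited papers): the human's stochastic movement plays the role of the transition function, and value iteration yields a policy that slows the robot only when the human is close.

```python
# Toy MDP sketch: robot chooses a speed, human position evolves stochastically.
STATES  = ["human_far", "human_near"]
ACTIONS = ["fast", "slow"]
GAMMA   = 0.9

# P[s][a] -> list of (next_state, probability): the human's movement.
P = {"human_far":  {"fast": [("human_far", 0.8), ("human_near", 0.2)],
                    "slow": [("human_far", 0.8), ("human_near", 0.2)]},
     "human_near": {"fast": [("human_far", 0.5), ("human_near", 0.5)],
                    "slow": [("human_far", 0.5), ("human_near", 0.5)]}}

# Fast motion earns throughput reward but is penalised near the human.
R = {"human_far":  {"fast": 2.0,  "slow": 1.0},
     "human_near": {"fast": -5.0, "slow": 1.0}}

def value_iteration(n_iters=200):
    """Standard value iteration; returns the greedy policy."""
    V = {s: 0.0 for s in STATES}
    for _ in range(n_iters):
        V = {s: max(R[s][a] + GAMMA * sum(p * V[s2] for s2, p in P[s][a])
                    for a in ACTIONS)
             for s in STATES}
    return {s: max(ACTIONS,
                   key=lambda a: R[s][a] + GAMMA * sum(p * V[s2]
                                                       for s2, p in P[s][a]))
            for s in STATES}

policy = value_iteration()
```

The resulting policy moves fast when the human is far and slow when near, which is exactly the kind of real-time speed modulation these planners aim at.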

5.3. On‐demand job dispatching

  • Tsarouchi et al. [233] proposed a method for task planning in a hybrid assembly cell which includes both humans and robots.
  • The sequence of the human and robot tasks is structured in three levels (Fig. 19).
  • Job rotation is a common practice in industry.
  • This identification triggers the automatic transmission of assembly instructions and multimedia materials to handheld or stationary terminals, reducing the time required to retrieve and assimilate the information.
  • They also discussed the architecture design and the implementation aspects of a pushlet-based wireless information environment.

6.1. Multimodal programming

  • The high effort for conventional (re)programming of industrial robots (online and offline) in relation to the decreasing lot sizes of customised production motivates the development of robotic programming and control with higher degree of adaptability.
  • According to Pavlovic et al. [192], gestures describe intended movements, usually the arms and hands, with a manipulative or communicative character (Fig. 22).
  • The presented programming system is in principle mobile and can therefore be used as a process-oriented programming method.
  • This approach describes an intuitive, natural way of defining trajectories, albeit using the marker as an artificial tool.
  • In the coming years, robot programming software tools are expected to be more intuitive and user friendly.

6.2. Smart algorithms embedding

  • Intuitive programming of robots based on modalities, such as gestures, speech and haptics, requires embedding of complex algorithms with different levels of abstraction.
  • A general overview about robotic frameworks and architectures was given by Kortenkamp and Simmons [106].
  • According to IEC 61499, FBs may exist in different types.
  • Thus, a complex assembly process can be decomposed into the assembly features and handled by FBs for the ease of robot control.
  • The ECC is a finite state machine, which specifies the transitions from the events to their corresponding algorithms for event-driven execution.
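The ECC described above can be sketched as a plain finite state machine mapping (state, event) pairs to a successor state and the algorithm to run; the states, events and algorithm names below are hypothetical, not taken from IEC 61499 itself:

```python
# Illustrative IEC 61499-style Execution Control Chart (ECC): a finite state
# machine mapping incoming events to the algorithms of a function block.

ECC = {  # (state, event) -> (next_state, algorithm)
    ("IDLE",    "INIT"): ("READY",   "initialise"),
    ("READY",   "REQ"):  ("RUNNING", "execute_feature"),
    ("RUNNING", "DONE"): ("READY",   "report_result"),
}

class FunctionBlock:
    def __init__(self):
        self.state = "IDLE"
        self.log = []

    def fire(self, event):
        key = (self.state, event)
        if key not in ECC:
            return  # event not accepted in the current state
        self.state, algorithm = ECC[key]
        self.log.append(algorithm)  # stand-in for executing the algorithm

fb = FunctionBlock()
for ev in ["INIT", "REQ", "DONE"]:
    fb.fire(ev)
```

Decomposing an assembly process into features, each handled by such an event-driven block, is what makes the resulting robot control modular and reconfigurable.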

6.3. Programming‐free robot control

  • While a robot is mostly programmed for repetitive tasks, there are various applications in assembly where high flexibility requires either control without programming or auto-adaptation of tasks by the robot without explicit reprogramming by a human.
  • This mode can be used not only for online programming scenarios such as teaching or programming by demonstration [8], but also for direct collaborative workpiece handling in assembly.
  • The basis is the control algorithms, which ensure the safety of the human in the haptic connection with the robot while at the same time allowing efficient support of the human by the robot [109,225].
  • Levine et al. [123] demonstrated the high potential of deep learning algorithms for automated flexible robotic grasping of different objects in undefined poses.
  • Their method (Fig. 28) allows the progress, or the phase of movement, to be estimated even when observations of the human are partial and occluded; a problem typically found when using motion capture systems in cluttered environments.

6.4. Brainwave‐driven robot control

  • Recently, Mohammed and Wang [168] proposed using human brainwaves as a means for robot control in HRC assembly, where an Emotiv EPOC+ device was chosen as the EEG measuring headset to record brain activities (Fig. 29).
  • After proper training, the mental command in the form of multichannel brainwave patterns can be recorded, which is then used for robot control.
  • Two major advantages were reported in their research work: (1) it frees the hands of an operator, allowing the operator to control a robot while performing a related task shared with the robot, and (2) it provides an auxiliary channel for multimodal symbiotic HRC assembly in addition to voice, gesture and haptic commands.
  • Using mental commands can overcome the difficulties in noisy environments when voice commands are used alone.

7.1. Mobile worker tracking and identification

  • In HRC assembly, the environment is mostly structured, except for humans and mobile objects.
  • To sustain the rapid processing, the closest range between the 3D model of the robot and the point cloud of the human is used to detect any collision in an augmented environment.
  • For worker identification, a quick response (QR) code is used, which is affixed to the worker’s uniform or helmet.
  • The system tracks the human using two Kinect sensors.
  • The amount of potential energy that is allowed to be generated in the HRC system during physical contact is used to modulate the contact forces.

7.2. AR‐based in‐situ decision support

  • Existing and evolving trends and paradigms in manufacturing, such as mass customisation and personalisation, call for better communication between product design and production execution.
  • Fig. 34 shows the system architecture of one AR-based in-situ decision-support system [134], consisting of four sub-systems: AR-based instruction system, task sequence planning & replanning system, worker monitoring system, and industrial robot control system.
  • Assembly instructions are placed at the correct location at the right time when the corresponding assembly parts are detected in the real-world coordinate system.
  • The tool was developed in the form of a software application for wearable devices, such as smartwatches in Fig. 36.
  • The technical implementation to be realised, including the specific software frameworks to be used, was also described.

7.3.1. Aspects of ergonomics

  • Collaborative robots are currently used in assembly as human companions and helpers, making the human and the robot a team [76,18,39,193,198].
  • Specifically, the robot(s) should change its/their behaviours to make the human comfortable and to increase his/her ergonomics performance, while keeping the time the team needs to perform all the tasks low.
  • An essential aspect of HRC is how to cope with human ergonomics, process time, emotions and reaction during collaboration, and safety aspects.
  • The ErgoToolkit system [4] implements ergonomic analysis methods, already available in the literature or in industrial practice, into a state-of-the-art virtual manufacturing software.
  • The study concludes with presenting the application of the tools in an automotive case study (see Fig. 42).

7.3.2. Psychological challenges in human‐robot collaboration

  • In order to understand how to improve the efficiency of a human-robot team, several papers focused on identifying a set of parameters of the robot trajectory that influence human behaviour.
  • They proved that trajectories are more legible if the probability of reaching a certain goal given the current position is higher.
  • A similar study [189] was aimed at optimising human visibility, reachability and ergonomics during handing-over tasks.
  • This leads to the development of the concept of mutual adaptation, where the human trust is considered during the definition of the robot strategy.
  • Jimenez and Dunkl [94] summarised the psychological aspects of work situations, pointing out that the vital degrees of freedom (degree of autonomy, buffered resources, etc.) and comprehensible relations to the working and social environment (team alignment, goals, rewards, fairness, etc.) must lie within a preferable range.

8.1. Towards new industrial standards of HRC

  • The potential of collaborative work of teams of multiple human and robotic agents creates such a new scenario of industrial production which requires not only novel scientific insights and technological solutions but also a new regulatory framework.
  • Current standards for collaborative robotic environments already take an integrated approach with the compound of control, robot, end-effector, workpiece and task being simultaneously subject to assessment.
  • This is a favourable starting point for further enhancement towards a complex multi-agent team.
  • While ISO 10075 does not specifically address such views, the perspectives of the standard are a consistent starting point for further synthesis.

8.2. Modelling the human worker

  • Continuous observation of behaviours and models of human disposition and emotion at the workplace have been investigated intensively; however, industry-ripe applications to work execution control at assembly workstations are still missing.
  • There is a need to elaborate and populate models of the human workforce and develop task execution control and HRC adaptation approaches that can establish individual worker preference profiles, pick up transient changes in the state of the individual worker, and tune both communication and acquired models accordingly.

8.3. Digital twin for symbiotic HRC

  • Designing, planning, operating, supervising and controlling assembly workcells where humans and robots collaborate requires accurate digitalised models that are updated almost in real time.
  • Here more is needed than a kind of replica of the devices, equipment, processes or even human operators.
  • A so-called digital twin should combine and align all relevant aspects of modelling the function, structure and behaviour of the robotic cell including the worker, together with capturing the symbiotic interplay of the human and robotic agents.
  • This includes representing the multimodal and bidirectional channels of communication and control as well.
  • Time and again, the digital twin should be efficiently tuned – i.e. calibrated – to the real environment by using the actual measured data, so that both robot programming and operator instructions can automatically adapt to the actual situation.

8.4. Optimising and adapting plans of redundant robots

  • Redundancy—when the robot has more degrees of freedom than needed for performing a task—provides much opportunity for optimisation and adaptation, because, at least in principle, infinitely many joint configurations may result in executing the same task.
  • The solution requires a bi-directional transition between the task and configuration spaces for generating relevant and collision-free configurations only [56].
  • The computational complexity of such an approach is still too demanding for supporting real-time adaptive robot control.
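The redundancy argument above can be made concrete with a planar arm that has one more joint than needed to position its tip in the plane: distinct joint configurations reach the same point. The link lengths and target below are hypothetical:

```python
import math

# Numeric illustration of kinematic redundancy: a planar 3R arm with
# unit-length links reaches the point (2, 0) with two distinct joint
# configurations, so a planner is free to choose among them.

def forward_kinematics(joints, link=1.0):
    """Tip position of a planar serial arm with equal link lengths."""
    x = y = phi = 0.0
    for q in joints:
        phi += q                   # accumulate the absolute link angle
        x += link * math.cos(phi)
        y += link * math.sin(phi)
    return x, y

# Elbow-up and elbow-down solutions for the same target point.
config_a = (math.radians(60),  math.radians(-120), math.radians(60))
config_b = (math.radians(-60), math.radians(120),  math.radians(-60))
```

Optimisation over this continuum of equivalent configurations (e.g., for collision avoidance or ergonomics) is exactly where the computational cost noted above arises.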

8.5. Shared tasks and team formation

  • In today’s industrial practice of assembly, collaborative workstations exist already.
  • However, they do not exhibit all the characteristics of a working environment hosting multiple collaborating human and robotic agents.
  • Being more autonomous but less predictable and reliable actors, humans need more support for being aware of the current situation.
  • All these are prerequisites of true shared task execution and adaptive team formation in collaborative environments.

8.6. Handling of exceptions, emergency and recovery

  • Exceptions occur if the flow of the processes involved is close to or has passed the nominal limits; however, the processes can be brought back to normal without interruption, and the situation does not endanger the integrity or safety of humans.
  • Emergencies occur when danger of irrevocable damage is imminent.
  • The latter case requires fast and guaranteed mitigation.
  • Therefore, certain modes of communication do not apply to emergency signals.
  • In addition, emergency mechanisms may have direct effect (e.g., emergency power-down by a single circuit breaker) that usually does not occur with communication in the usual sense.

8.8. Adaptive work instructions

  • Human–robot interfaces, especially those conveying work instructions, often provide adaptability to the given worker in discrete steps, namely, by skill level categories.
  • Adaptation to the worker’s current (and changing) fitness for the current assembly task is, however, not part of industrial practice.
  • The devices keeping track of the worker’s awareness primarily serve safety purposes only.
  • AR-based in-situ decision support to workers in dynamic HRC assembly environments deserves more attention to be both intuitive and mental stress-free.
  • Work instructions need to be adaptive not only to the changing competence level of individual workers but also to declining focus and concentration during the day or within the week.

8.9. Programming‐free robot control

  • One potential application of data fusion in HRC assembly that should be actively explored is the multimodal fusion of human commands, for example, integration of both hand gestures and voice, to further improve the human command recognition rate and contribute to programming-free robot control.
  • Instead of defined gestures and voice commands, recognition and prediction of human motions through deep learning provides better context awareness and less interruption to normal performance induced by signalling gestures.
  • Haptic interaction with collaborating robots, especially for legacy industrial robots with few embedded sensors, is another way to achieve programming-free robot control.
  • ‘Think ahead’ combined with human motion prediction contributes to timely cognition and assistance from the robots, and adds value to the intelligence of an HRC system.
  • This unique combination paves the way towards multimodal symbiotic HRC assembly.

8.10. HRC as social activity with trust

  • Research has already elaborated approaches for the modelling of emotional and social processes, as well as acquisition of individual or cultural profiles and their tuning to newly perceived changes.
  • Existing results, therefore, provide a solid starting point for the elaboration of similar solutions for industrial applications in the HRC context.
  • When adding humans to shared robotic environments, this aspect is unavoidably important and deserves attention.

8.11. Social responsibility of a new type of automation

  • An HRC team forms a meta society where each team member bears certain social responsibilities for others such as safety and work ethics.
  • The social stability of the team contributes to the level of automation for seamless HRC assembly.
  • In such a new automation environment, mental stress and even psychological discomfort leading to any potential accident can be monitored and diagnosed via the brainwaves of human workers, which can be collected by sensors embedded in a safety helmet.

8.12. Limitations and challenges

  • The wide interest of research and industry in HRC-related topics reflects the increased productivity and flexibility of production lines that HRC offers by combining human and robot capabilities [9,234].
  • The hardware utilised consists of prototype-level devices that cannot be transferred to industry directly.
  • Moreover, while the feasibility of the HRC solutions has been well evaluated, their safety performance still needs to be assessed systematically.
  • The psychological reactions may change if the operator works with the robot for a long period.

9. Conclusions

  • This paper presents the state-of-the-art of symbiotic human-robot collaborative assembly.
  • Research on HRC has been active for many years.
  • There is a need to classify these relationships, and identify their unique features and characteristics with clear definitions.
  • This paper aims to address these issues together with existing challenges and recent technological advancements.
  • Within the context of symbiotic HRC assembly, topics covered include sensing and communication techniques, human safety assurance, dynamic assembly planning, programming-free robot control, and in-situ decision support to operators.


Citations
Journal ArticleDOI
06 Dec 2019-Robotics
TL;DR: This paper provides an overview of collaborative robotics towards manufacturing applications, presenting the related standards and modes of operation and an analysis of the future trends in human–robot collaboration as determined by the authors.

234 citations


Cites background or result from "Symbiotic human-robot collaborative..."

  • ...Other reviews, such as [8], identify similar trends, namely those of improved modeling and understanding, better task planning, and adaptive learning....

    [...]

  • ...For a more complete overview, we refer to [8,13]....

    [...]

Journal ArticleDOI
TL;DR: Fundamental components and techniques necessary to make welding systems intelligent, including sensing and signal processing, feature extraction and selection, modeling, decision-making, and learning are examined.

137 citations

Journal ArticleDOI
TL;DR: A depth-sensor based model for workspace monitoring and an interactive Augmented Reality (AR) User Interface (UI) for safe HRC are proposed and evaluated in a realistic diesel engine assembly task.
Abstract: Industrial standards define safety requirements for Human-Robot Collaboration (HRC) in industrial manufacturing. The standards particularly require real-time monitoring and securing of the minimum protective distance between a robot and an operator. This paper proposes a depth-sensor based model for workspace monitoring and an interactive Augmented Reality (AR) User Interface (UI) for safe HRC. The AR UI is implemented on two different hardware: a projector-mirror setup and a wearable AR gear (HoloLens). The workspace model and UIs are evaluated in a realistic diesel engine assembly task. The AR-based interactive UIs provide 21–24% and 57–64% reduction in the task completion and robot idle time, respectively, as compared to a baseline without interaction and workspace sharing. However, user experience assessment reveal that HoloLens based AR is not yet suitable for industrial manufacturing while the projector-mirror setup shows clear improvements in safety and work ergonomics.
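The minimum-protective-distance monitoring described in this abstract can be sketched as a speed-and-separation policy: compute the closest distance between robot and operator point clouds (e.g. from a depth sensor) and scale the robot's speed accordingly. The code below is an illustrative assumption, not the cited paper's implementation; the threshold values and function names are hypothetical.

```python
import numpy as np

def min_separation(robot_points, human_points):
    """Minimum Euclidean distance between two 3-D point sets."""
    # Pairwise differences via broadcasting: (R, 1, 3) - (1, H, 3) -> (R, H, 3)
    diff = robot_points[:, None, :] - human_points[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1)).min()

def speed_command(distance, stop_dist=0.5, slow_dist=1.2, full_speed=1.0):
    """Hypothetical policy: stop inside the protective distance, scale speed
    linearly in the warning zone, run at full speed beyond it."""
    if distance <= stop_dist:
        return 0.0
    if distance < slow_dist:
        return full_speed * (distance - stop_dist) / (slow_dist - stop_dist)
    return full_speed

# Two sampled robot surface points and one operator point, in metres.
robot = np.array([[0.0, 0.0, 1.0], [0.2, 0.0, 1.0]])
human = np.array([[1.0, 0.0, 1.0]])
d = min_separation(robot, human)   # 0.8 m
print(speed_command(d))            # ≈ 0.43: scaled speed in the warning zone
```

A production system would run this loop at sensor frame rate and derive the stop and warning distances from the applicable safety standard rather than fixed constants.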

121 citations


Cites background from "Symbiotic human-robot collaborative..."

  • ...HRC has been active in the past to realize the future manufacturing expectations and made possible by several research results obtained during the past five to ten years within the robotics and automation scientific communities [3]....

    [...]

Journal ArticleDOI
TL;DR: The focus of this paper is to provide a systematic view for analyzing data and process dependencies at multiple levels that AI must comprehend, and identify challenges and opportunities to not only further leverage AI for manufacturing, but also influence the future development of AI to better meet the needs of manufacturing.
Abstract: Today’s manufacturing systems are becoming increasingly complex, dynamic, and connected. The factory operations face challenges of highly nonlinear and stochastic activity due to the countless uncertainties and interdependencies that exist. Recent developments in artificial intelligence (AI), especially Machine Learning (ML) have shown great potential to transform the manufacturing domain through advanced analytics tools for processing the vast amounts of manufacturing data generated, known as Big Data. The focus of this paper is threefold: (1) review the state-of-the-art applications of AI to representative manufacturing problems, (2) provide a systematic view for analyzing data and process dependencies at multiple levels that AI must comprehend, and (3) identify challenges and opportunities to not only further leverage AI for manufacturing, but also influence the future development of AI to better meet the needs of manufacturing. To satisfy these objectives, the paper adopts the hierarchical organization widely practiced in manufacturing plants in examining the interdependencies from the overall system level to the more detailed granular level of incoming material process streams. In doing so, the paper considers a wide range of topics from throughput and quality, supervisory control in human–robotic collaboration, process monitoring, diagnosis, and prognosis, finally to advances in materials engineering to achieve desired material property in process modeling and control.

115 citations

Journal ArticleDOI
TL;DR: In this paper, a virtual counterpart of a physical human-robot assembly system is built as a "front-runner" for validation and control throughout its design, build and operation.
Abstract: Human-robot collaboration (HRC) can expand the level of automation in areas that have conventionally been difficult to automate such as assembly. However, the need of adaptability and the dynamics of human presence are keeping the full potential of human-robot collaborative systems difficult to achieve. This paper explores the opportunities of using a digital twin to address the complexity of collaborative production systems through an industrial case and a demonstrator. A digital twin, as a virtual counterpart of a physical human-robot assembly system, is built as a ‘front-runner’ for validation and control throughout its design, build and operation. The forms of digital twins along system's life cycle, its building blocks and the potential advantages are presented and discussed. Recommendations for future research and practice in the use of digital twins in the field of cobotics are given.

110 citations

References
Proceedings Article
01 Jan 2009
TL;DR: This paper discusses how ROS relates to existing robot software frameworks, and briefly overview some of the available application software which uses ROS.
Abstract: This paper gives an overview of ROS, an opensource robot operating system. ROS is not an operating system in the traditional sense of process management and scheduling; rather, it provides a structured communications layer above the host operating systems of a heterogenous compute cluster. In this paper, we discuss how ROS relates to existing robot software frameworks, and briefly overview some of the available application software which uses ROS.

8,387 citations

Journal ArticleDOI
Ronald Azuma1
TL;DR: The characteristics of augmented reality systems are described, including a detailed discussion of the tradeoffs between optical and video blending approaches, and current efforts to overcome these problems are summarized.
Abstract: This paper surveys the field of augmented reality AR, in which 3D virtual objects are integrated into a 3D real environment in real time. It describes the medical, manufacturing, visualization, path planning, entertainment, and military applications that have been explored. This paper describes the characteristics of augmented reality systems, including a detailed discussion of the tradeoffs between optical and video blending approaches. Registration and sensing errors are two of the biggest problems in building effective augmented reality systems, so this paper summarizes current efforts to overcome these problems. Future directions and areas requiring further research are discussed. This survey provides a starting point for anyone interested in researching or using augmented reality.

8,053 citations

Journal ArticleDOI
TL;DR: This paper describes the simultaneous localization and mapping (SLAM) problem and the essential methods for solving the SLAM problem and summarizes key implementations and demonstrations of the method.
Abstract: This paper describes the simultaneous localization and mapping (SLAM) problem and the essential methods for solving the SLAM problem and summarizes key implementations and demonstrations of the method. While there are still many practical issues to overcome, especially in more complex outdoor environments, the general SLAM method is now a well understood and established part of robotics. Another part of the tutorial summarized more recent works in addressing some of the remaining issues in SLAM, including computation, feature representation, and data association
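The EKF-based SLAM solutions this tutorial covers all share the same predict/update cycle. A minimal sketch of that cycle, reduced to a one-dimensional linear Kalman filter for illustration (real EKF-SLAM linearises nonlinear motion and observation models and maintains a joint robot-landmark state), is shown below; the toy scenario and noise values are assumptions for demonstration only.

```python
def kf_predict(x, P, u, Q):
    """Predict step: propagate the state by odometry input u, inflate variance."""
    return x + u, P + Q

def kf_update(x, P, z, R):
    """Update step: fuse a direct noisy observation z of the scalar state."""
    K = P / (P + R)                      # Kalman gain
    return x + K * (z - x), (1.0 - K) * P

# Toy 1-D localisation: the robot moves +1 m per step; a noisy range
# sensor observes its position near 1, 2 and 3 metres.
x, P = 0.0, 1.0          # initial estimate and variance
Q, R = 0.01, 0.25        # process and measurement noise variances
for z in [1.05, 2.02, 2.95]:
    x, P = kf_predict(x, P, u=1.0, Q=Q)
    x, P = kf_update(x, P, z, R)

print(round(x, 2))   # close to 3.0
print(P < 0.25)      # True: posterior variance falls below sensor variance
```

The same structure scales to the full SLAM problem by stacking robot pose and landmark positions into one state vector, which is what makes the EKF formulation's quadratic covariance growth a central computational issue in the tutorial.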

3,760 citations


"Symbiotic human-robot collaborative..." refers background or methods in this paper

  • ...Durrant-Whyte and Bailey [52] reported the SLAM solutions based on extended Kalman filter (EKF) and PF using vision system data....

    [...]

  • ...Based on this assumption, the object state update process enables asynchronous sensor data fusion [52]....

    [...]

Journal ArticleDOI
TL;DR: A comprehensive survey of robot Learning from Demonstration (LfD), a technique that develops policies from example state to action mappings, which analyzes and categorizes the multiple ways in which examples are gathered, as well as the various techniques for policy derivation.

3,343 citations


"Symbiotic human-robot collaborative..." refers methods in this paper

  • ...This mode can be used not only for online programming scenarios such as teaching or programming by demonstration [8], but also for direct collaborative workpiece handling in assembly....

    [...]

BookDOI
01 Nov 2007
TL;DR: The contents have been restructured to achieve four main objectives: the enlargement of foundational topics for robotics, the enlightenment of design of various types of robotic systems, the extension of the treatment on robots moving in the environment, and the enrichment of advanced robotics applications.
Abstract: The second edition of this handbook provides a state-of-the-art cover view on the various aspects in the rapidly developing field of robotics. Reaching for the human frontier, robotics is vigorously engaged in the growing challenges of new emerging domains. Interacting, exploring, and working with humans, the new generation of robots will increasingly touch people and their lives. The credible prospect of practical robots among humans is the result of the scientific endeavour of a half a century of robotic developments that established robotics as a modern scientific discipline. The ongoing vibrant expansion and strong growth of the field during the last decade has fueled this second edition of the Springer Handbook of Robotics. The first edition of the handbook soon became a landmark in robotics publishing and won the American Association of Publishers PROSE Award for Excellence in Physical Sciences & Mathematics as well as the organizations Award for Engineering & Technology. The second edition of the handbook, edited by two internationally renowned scientists with the support of an outstanding team of seven part editors and more than 200 authors, continues to be an authoritative reference for robotics researchers, newcomers to the field, and scholars from related disciplines. The contents have been restructured to achieve four main objectives: the enlargement of foundational topics for robotics, the enlightenment of design of various types of robotic systems, the extension of the treatment on robots moving in the environment, and the enrichment of advanced robotics applications. Further to an extensive update, fifteen new chapters have been introduced on emerging topics, and a new generation of authors have joined the handbooks team. A novel addition to the second edition is a comprehensive collection of multimedia references to more than 700 videos, which bring valuable insight into the contents. 
The videos can be viewed directly augmented into the text with a smartphone or tablet using a unique and specially designed app.

3,174 citations

Frequently Asked Questions (2)
Q1. What have the authors contributed in "Symbiotic human-robot collaborative assembly" ?

In the context of human-robot collaborative assembly (HRC), the main objective of the collaboration is to integrate the best of two worlds: the strength, endurance, repeatability and accuracy of robots with the intuition, flexibility, and versatile problem-solving and sensory skills of humans.

Twelve future research directions and challenges are also identified, in the hope of shedding some light on further advancement in the years to come. With the support of the latest technologies in sensing, communication, AI, AR and robot control, HRC will find its way to practical applications on shop floors in the factories of the future.