GB2595289A - Collaborative robot system - Google Patents

Collaborative robot system

Info

Publication number
GB2595289A
GB2595289A GB2007583.4A GB202007583A
Authority
GB
United Kingdom
Prior art keywords
robot
human
workspace
mode
move
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2007583.4A
Other versions
GB202007583D0 (en)
Inventor
David Story Matthew
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BAE Systems PLC
Original Assignee
BAE Systems PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BAE Systems PLC filed Critical BAE Systems PLC
Priority to GB2007583.4A priority Critical patent/GB2595289A/en
Publication of GB202007583D0 publication Critical patent/GB202007583D0/en
Publication of GB2595289A publication Critical patent/GB2595289A/en
Pending legal-status Critical Current

Classifications

    • B25J9/1676 Programme-controlled manipulators; programme controls characterised by safety, monitoring, diagnostic; avoiding collision or forbidden zones
    • B25J9/1666 Programme controls characterised by programming, planning systems for manipulators; motion, path, trajectory planning; avoiding collision or forbidden zones
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors; perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • G05B2219/40201 Detect contact, collision with human
    • G05B2219/40202 Human robot coexistence

Abstract

A system for robot and human collaboration comprises a robot 102 in a workspace 106, a sensor system 110, and a controller 108 configured to detect a human 104 in the workspace 106 using the sensor system 110 and to operate the robot 102 in a first mode if the human 104 is not in the workspace 106. If the controller 108 detects that the human 104 is in the workspace 106, it determines, using the sensor system 110, the relative positions of the human 104 and the robot 102 and determines, using the determined positions, whether the human 104 is positioned on a route for the robot 102, i.e. whether a collision will occur. If the human 104 is not on the route, the controller 108 controls the robot 102 to move along the route in a second mode of operation. If the human 104 is on the route, the controller 108 controls the robot 102 to either stop moving or move in the second mode of operation along a different route to avoid a collision. In the first mode of operation the robot 102 may move at a higher speed and along curved or sweeping paths, which are more efficient. In the second mode of operation the robot 102 may move at a slower speed and only along substantially straight paths, motion which is easier for a human 104 to predict.

Description

COLLABORATIVE ROBOT SYSTEM
FIELD OF THE INVENTION
The present invention relates to collaborative robot systems, and more particularly to systems for robot and human collaboration.
BACKGROUND
Typically, robots, e.g. robot arms, operate in workspaces that are remote from the workspaces of humans. The workspaces of robot and human are typically separated by, for example, safety barriers or fences.
In recent years, the use of collaborative robots that share workspaces with humans has increased. Collaborative robots are designed to work with or near humans to enable humans and robots to collaborate so as to complete tasks. Such tasks include, but are not limited to, vehicle (e.g. aircraft) manufacturing and assembly tasks. Humans may work within or near the working space of the robot, for example in accordance with ISO 10218 and ISO/TS 15066. In other words, humans and a collaborative robot may share a common workspace.
In collaborative robot systems, safety tends to be important in order to prevent injury to a human.
SUMMARY OF THE INVENTION
In a first aspect, the present invention provides a system for robot and human collaboration. The system comprises: a multi-axis robot located within a workspace; a sensor system configured to detect objects within the workspace; and a controller operatively coupled to the robot and the sensor system, the controller being configured to: control operation of the robot; detect, using an output of the sensor system, a presence of the human in the workspace; responsive to detecting that the human is not located in the workspace, operate the robot in a first mode of operation; responsive to detecting that the human is located in the workspace, operate the robot in a second mode of operation, the second mode of operation being different to the first mode of operation; determine, using the output of the sensor system, relative positions in the workspace of the human and the robot; determine, using the determined relative positions in the workspace of the human and the robot, whether the human is positioned on a route along which the robot is to be controlled to move; responsive to determining that the human is not positioned on the route along which the robot is to be controlled to move, control the robot to move along the route; and, responsive to determining that the human is positioned on the route along which the robot is to be controlled to move, control the robot to either stop moving or move along a different route so as to avoid collision with the human.
The first mode of operation may comprise the robot being controlled to move at less than or equal to a first speed. The second mode of operation may comprise the robot being controlled to move at less than or equal to a second speed, the second speed being different to the first speed. The first speed may be greater than the second speed. In the first mode of operation, movement of the robot may be restricted such that the robot can rotate around only a limited, proper subset of its axes. In the first mode of operation, the robot may be controlled such that an end effector of the robot moves along one or more curved paths. In the second mode of operation, the robot may be permitted to be rotated around all of its axes. In the second mode of operation, the robot may be controlled such that an end effector of the robot moves along only one or more substantially straight paths.
The controller may be further configured to: detect, using the output of the sensor system, that the human has entered the workspace; and, responsive to detecting that the human has entered the workspace, switch operation of the robot from the first mode to the second mode.
The system may further comprise a memory, and a program stored in the memory. The program may be for controlling the robot to perform a task. The program may specify a plurality of different routes for the robot. The controller may be configured to control the robot using the program. The controller may be configured to: responsive to determining that the human is positioned on the route along which the robot is to be controlled to move, determine, using the program, whether an alternative route for the robot is available, the alternative route avoiding collision with the human and such that the robot completes the task; responsive to determining that the alternative route for the robot is available, control, using the program, the robot to move along the alternative route; and, responsive to determining that the alternative route for the robot is not available, control the robot to either stop moving or to move away from the human.
The controller may be configured to, responsive to determining that the human is positioned on the route along which the robot is to be controlled to move, control the robot to move away from the human in a single plane of motion.
The sensor system may comprise one or more sensors selected from the group of sensors consisting of: one or more cameras, one or more visible light detecting cameras, one or more infrared cameras, and one or more Light Detection and Ranging (LIDAR) sensors.
In a further aspect, the present invention provides a method for controlling a robot in a robot-human collaborative system. The system comprises a multi-axis robot located within a workspace, a sensor system configured to detect objects within the workspace, and a controller operatively coupled to the robot and the sensor system. The method comprises: detecting, by the controller, using an output of the sensor system, a presence of the human in the workspace; responsive to detecting that the human is not located in the workspace, operating, by the controller, the robot in a first mode of operation; responsive to detecting that the human is located in the workspace, operating, by the controller, the robot in a second mode of operation, the second mode of operation being different to the first mode of operation; determining, by the controller, using the output of the sensor system, relative positions in the workspace of the human and the robot; determining, by the controller, using the determined relative positions in the workspace of the human and the robot, whether the human is positioned on a route along which the robot is to be controlled to move; responsive to determining that the human is not positioned on the route along which the robot is to be controlled to move, controlling, by the controller, the robot to move along the route; and, responsive to determining that the human is positioned on the route along which the robot is to be controlled to move, controlling, by the controller, the robot to either stop moving or move along a different route so as to avoid collision with the human.
In a further aspect, the present invention provides a program or plurality of programs arranged such that when executed by a computer system or one or more processors comprised within a controller it/they cause the computer system or the one or more processors to operate in accordance with any preceding aspect.
In a further aspect, the present invention provides a machine-readable storage medium storing a program or at least one of the plurality of programs according to any preceding aspect.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a schematic illustration (not to scale) showing a human-collaborative robot system; Figure 2 is a schematic illustration (not to scale) showing a controller of a collaborative robot; Figure 3 is a process flow chart showing certain steps of a process of controlling the robot 102; and Figure 4 is a schematic illustration representative of a program corresponding to the predetermined task.
DETAILED DESCRIPTION
Figure 1 is a schematic illustration (not to scale) showing an embodiment of a system 100 for robot-human collaboration. The system 100 comprises a robot 102 (which may also be referred to as a robot arm) and a human 104 that share a common workspace 106. Both the robot 102 and the human 104 may move within the common workspace 106.
The system 100 further comprises a controller 108 operatively coupled to the robot 102 and configured to control operation of the robot 102. The controller 108 is described in more detail later below with reference to Figure 2.
The system 100 further comprises a camera system 110 operatively coupled to the controller 108. The camera system 110 includes one or more cameras, and preferably multiple cameras. In this embodiment, the one or more cameras are visible light detecting cameras. The camera system 110 is configured to capture images of the workspace and entities therein, including, in this embodiment, the robot 102 and the human 104. The camera system 110 is further configured to send the captured images to the controller 108. The controller 108 is further configured to process the received images, as described in more detail later below with reference to Figure 3.
The robot 102 is a six-axis articulated robot. More specifically, the robot 102 comprises six arm portions, namely a first portion 111, a second portion 112, a third portion 113, a fourth portion 114, a fifth portion 115, and a sixth portion 116.
The first portion 111 provides a base of the robot. The first portion 111 (i.e. the base) may be mounted to a floor, or may be mounted to a fixed structure, a piece of moving equipment, or any other suitable mounting surface or structure.
The robot 102 further comprises an end effector 120 located at the distal end of the robot 102. In this embodiment the end effector 120 comprises a gripper 140. The gripper 140 comprises a plurality of gripper arms. The gripper 140 is configured to be controlled by the controller 108 to allow the robot 102 to grip, move, and release a workpiece. The gripper 140 may allow the robot 102 to perform tasks including, but not limited to, holding parts, loading parts, unloading parts, assembling parts together, inspecting parts, and adjusting parts. In some embodiments, the end effector 120 may comprise one or more sensors, such as one or more tactile sensors which may be configured to measure information arising from physical interaction of that tactile sensor with its environment. For example, a tactile sensor may be configured to detect contact of the gripper arms with an object that is being gripped by the gripper 140, for example by measuring a force or pressure exerted on the tactile sensor, and/or may be configured to measure parameters of an object gripped by the gripper 140, including one or more parameters selected from the group of parameters consisting of shape, size, mass, weight, texture, stiffness, centre of mass, coefficient of friction, and thermal conductivity.
The robot 102 includes a first rotary axis 121, a second rotary axis 122, a third rotary axis 123, a fourth rotary axis 124, a fifth rotary axis 125, and a sixth rotary axis 126. The robot 102 further comprises a first motor 131, a second motor 132, a third motor 133, a fourth motor 134, a fifth motor 135, and a sixth motor 136, which separately operate a respective axis 121-126, i.e. the first motor 131 controls rotation about the first axis 121, the second motor 132 controls rotation about the second axis 122, and so on.
The operation of the individual axes 121-126 allows the robot's end effector 120 to be repeatably and accurately positioned, for example, with respect to a workpiece. Roll, pitch and yaw of the end effector 120 also tend to be controllable.
Figure 2 is a schematic illustration (not to scale) showing further details of the controller 108.
The controller 108 may be provided by configuring or adapting any suitable apparatus, for example one or more computers or other processing apparatus or processors, and/or providing additional modules. The apparatus may comprise a computer, a network of computers, or one or more processors, for implementing instructions and using data, including instructions and data in the form of a computer program or plurality of computer programs stored in or on a machine-readable storage medium such as computer memory, a computer disk, ROM, PROM etc., or any combination of these or other storage media.
In this embodiment, the controller 108 comprises a sensor interface 200, a collision detection module 202, a motor control module 204, and a memory 206.
In this embodiment, the sensor interface 200 is operatively coupled, e.g. via wired or wireless links, to the camera system 110, and is configured to receive image data from the camera system 110, the image data corresponding to images of the workspace 106 captured by the camera system 110.
The sensor interface 200 is further coupled to the collision detection module 202 such that the sensor interface 200 may send the received image data to the collision detection module 202.
The memory 206 stores a program, hereinafter referred to as the "robot program" and indicated in Figure 2 by the reference numeral 208, for controlling the robot 102 to perform a predetermined task.
The memory 206 is operatively coupled, e.g. via wired or wireless links, to the collision detection module 202 such that the program 208 is accessible or may be retrieved from the memory 206 by the collision detection module 202, and/or such that the memory 206 may send the program 208 to the collision detection module 202.
The memory 206 is operatively coupled, e.g. via wired or wireless links, to the motor control module 204 such that the program 208 is accessible or may be retrieved from the memory 206 by the motor control module 204, and/or such that the memory 206 may send the program 208 to the motor control module 204.
The collision detection module 202 is configured to process the program 208 and the image data received from the sensor interface 200 as described in more detail later below with reference to Figure 3. In this embodiment, the collision detection module 202 is configured to process the program 208 and the image data received from the sensor interface 200 to determine whether or not the robot 102 would have collided or impacted with an object, for example the human 104, if the robot 102 were to be controlled as specified by the program 208. In this embodiment, the collision detection module 202 is configured to process the image data received from the sensor interface 200 to detect the presence of the robot 102 and/or the human 104 in the workspace 106.
In addition to being operatively coupled to the sensor interface 200 and the memory 206, the collision detection module 202 is operatively coupled, e.g. via wired or wireless links, to the motor control module 204. The collision detection module 202 is operatively coupled to the motor control module 204 such that an output of the collision detection module 202, e.g. a determination as to whether a collision between the robot 102 and the human 104 would occur, may be sent from the collision detection module 202 to the motor control module 204.
The motor control module 204 is configured to control the robot 102 using the program 208 and the output of the collision detection module 202 received from the collision detection module 202, as described in more detail later below with reference to Figure 3. In particular, the motor control module 204 is operatively coupled, e.g. via wired or wireless links, to the motors 131-136. The motor control module 204 is configured to control operation of the motors 131-136 to move the robot 102 based on the program 208 and the output of the collision detection module 202. For example, in operation, the motor control module 204 may generate command signals for the robot 102, and send those command signals to the motors 131-136 of the robot 102 so as to control the robot 102 to perform the predetermined task specified by the program 208. Also, the motor control module 204 is configured to control the end effector 120, e.g. in accordance with the program 208. Control of the end effector may be based on the output of the collision detection module 202.
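By way of a non-authoritative illustration, the division of the controller 108 into a sensor interface, a collision detection module and a motor control module could be sketched as follows (Python; all class, method and attribute names are hypothetical and not taken from the patent):

```python
class SensorInterface:
    """Sketch of sensor interface 200: receives images from the camera system
    and passes them on to the collision detection module."""

    def __init__(self, camera_system):
        self._camera_system = camera_system

    def latest_images(self):
        return self._camera_system.capture()


class CollisionDetectionModule:
    """Sketch of collision detection module 202."""

    def __init__(self, program):
        self._program = program  # the stored robot program 208

    def human_present(self, images) -> bool:
        raise NotImplementedError  # e.g. a person-detection model on the images

    def human_on_route(self, images, route) -> bool:
        raise NotImplementedError  # compare the human's position with the route


class MotorControlModule:
    """Sketch of motor control module 204: drives the six motors 131-136."""

    def __init__(self, program, motors):
        self._program = program
        self._motors = motors

    def move_along(self, route, mode):
        # Issue one command per motor for each waypoint on the chosen route.
        for waypoint in route:
            targets = self._program.joint_targets(waypoint)  # hypothetical helper
            limit = self._program.speed_limit(mode)          # hypothetical helper
            for motor, target in zip(self._motors, targets):
                motor.move_to(target, speed_limit=limit)
```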
Figure 3 is a process flow chart showing certain steps of a process of controlling the robot 102.
It should be noted that certain of the process steps depicted in the flowchart of Figure 3 and described below may be omitted or such process steps may be performed in differing order to that presented below and shown in Figure 3. Furthermore, although all the process steps have, for convenience and ease of understanding, been depicted as discrete temporally-sequential steps, nevertheless some of the process steps may in fact be performed simultaneously or at least overlapping to some extent temporally.
At step s2, a program 208 for controlling the robot 102 to perform a predetermined task is specified. The program 208 is stored in the memory 206.
In this embodiment, the predetermined task includes the robot 102 moving within the workspace 106 from an initial position within the workspace 106 to a final position within the workspace 106. However, it will be appreciated by those skilled in the art that in practice, the predetermined task may be a different task and may be any appropriate task capable of being performed by the robot.
Figure 4 is a schematic illustration representative of an example of a program 208 corresponding to the predetermined task.
The program 208 specifies a plurality of positions, locations, or waypoints for the robot 102 within the workspace 106, namely the initial position 401 and the final position 405, a first intermediate position 402, a second intermediate position 403, and a third intermediate position 404.
The program 208 further specifies a plurality of paths between the positions 401-405. In particular, the program specifies: a first path 411 between the initial position 401 and the first intermediate position 402; a second path 412 between the first intermediate position 402 and the final position 405; a third path 413 between the first intermediate position 402 and the third intermediate position 404; a fourth path 414 between the third intermediate position 404 and the final position 405; a fifth path 415 between the initial position 401 and the second intermediate position 403; and a sixth path 416 between the second intermediate position 403 and the third intermediate position 404.
As such, the program 208 specifies a plurality of routes for the robot 102 within the workspace 106 that the robot 102 may be controlled to follow in order to complete the predetermined task, i.e. to move from the initial position 401 to the final position 405. For example, a first route is defined by moving from the initial position 401 to the first intermediate position 402 via the first path 411 and then from the first intermediate position 402 to the final position 405 via the second path 412 (i.e. 401 → 402 → 405). Also, a second route is defined by moving from the initial position 401 to the first intermediate position 402 via the first path 411, then from the first intermediate position 402 to the third intermediate position 404 via the third path 413, and then from the third intermediate position 404 to the final position 405 via the fourth path 414 (i.e. 401 → 402 → 404 → 405). Also, a third route is defined by moving from the initial position 401 to the second intermediate position 403 via the fifth path 415, then from the second intermediate position 403 to the third intermediate position 404 via the sixth path 416, and then from the third intermediate position 404 to the final position 405 via the fourth path 414 (i.e. 401 → 403 → 404 → 405).
The routes for the robot specified by the program may be ranked. For example, the first route (i.e. 401 → 402 → 405) may be a preferred route for the robot 102, for example because it represents the shortest path between the initial position 401 and the final position 405.
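As an illustration only, the waypoints 401-405, paths 411-416 and ranked routes of Figure 4 could be held as a small graph; the path lengths below are made-up values chosen solely so that the preferred route is the shortest (the patent does not prescribe any particular data structure or lengths):

```python
# Waypoints 401-405 and the paths 411-416 connecting them, as an
# adjacency map with illustrative path lengths (arbitrary units).
PATHS = {
    (401, 402): 1.0,   # first path 411
    (402, 405): 1.0,   # second path 412
    (402, 404): 1.2,   # third path 413
    (404, 405): 1.0,   # fourth path 414
    (401, 403): 1.5,   # fifth path 415
    (403, 404): 1.5,   # sixth path 416
}


def neighbours(node):
    """All waypoints directly reachable from `node` via a single path."""
    out = {}
    for (a, b), length in PATHS.items():
        if a == node:
            out[b] = length
        elif b == node:
            out[a] = length
    return out


def all_routes(start, goal, visited=()):
    """Enumerate simple routes (no repeated waypoints) from start to goal."""
    if start == goal:
        yield [goal]
        return
    for nxt in neighbours(start):
        if nxt not in visited:
            for rest in all_routes(nxt, goal, visited + (start,)):
                yield [start] + rest


def route_length(route):
    """Total length of a route, summed over its constituent paths."""
    return sum(neighbours(a)[b] for a, b in zip(route, route[1:]))


# Rank candidate routes by total length; with the lengths above, the
# preferred route 401 -> 402 -> 405 comes first.
ranked = sorted(all_routes(401, 405), key=route_length)
```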
Returning now to the description of Figure 3, at step s4, the camera system 110 captures images of the workspace 106 and objects located therein.
At step s6, the camera system 110 sends image data corresponding to the captured images to the sensor interface 200. The sensor interface then forwards the received image data to the collision detection module 202.
At step s8, the collision detection module 202 processes the received image data to detect whether the human 104 is present in the workspace 106.
If, at step s8, the collision detection module 202 determines that the human 104 is not present in the workspace 106, the collision detection module 202 sends an indication to the motor control module 204 informing the motor control module 204 that the human 104 is not present in the workspace 106. The method then proceeds to step s10.
However, if, at step s8, the collision detection module 202 determines that the human 104 is present in the workspace 106, the method proceeds to step s12. Step s12 will be described in more detail later below, after a description of step s10.
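Step s8 is not tied to any particular detection technique. A minimal sketch, assuming the off-the-shelf OpenCV HOG people detector is run on each workspace image (an illustrative choice, not one specified by the patent):

```python
import cv2  # OpenCV; an illustrative detector choice, not mandated by the patent

_hog = cv2.HOGDescriptor()
_hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())


def human_present(image) -> bool:
    """Step s8 (sketch): return True if a person is detected in a workspace image."""
    rects, _weights = _hog.detectMultiScale(image, winStride=(8, 8))
    return len(rects) > 0
```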
At step s10, in response to receiving the indication that the human 104 is not present in the workspace 106, the motor control module 204 uses the program 208 to control the robot 102 to perform the predetermined task specified by the program 208. For example, the robot 102 may be controlled by the motor control module 204 to move from the initial position 401 to the final position 405 as specified by the program 208, preferably via the preferred first route (i.e. 401 → 402 → 405).
In this embodiment, because no human is detected in the workspace 106, at step s10 the motor control module 204 controls the robot 102 to move in accordance with a first mode of operation. In the first mode of operation, the robot 102 is controlled to move at a relatively high, first speed (i.e. higher than the below-mentioned, relatively low second speed). In some embodiments, in the first mode, the speed of the robot 102 may be limited to a relatively high threshold speed. In the first mode of operation, the robot 102 is controlled to move with so-called "joint movement". In particular, the first mode of operation is such that only a limited, proper subset of the robot's joints are allowed to move, either absolutely or at any given time. Thus, in the first mode of operation, fewer of the robot's joints move (compared to the below-described second mode of operation). Thus, the robot tends to be moved in a swinging manner along curved or sweeping paths; the robot's end effector 120 tends to be moved along curved paths. Such movement of the robot 102 tends to be more efficient compared to operation of the robot in the below-described second mode. Operation of the robot in the first mode tends to result in completion of the predetermined task more quickly than if the robot had been controlled in the second mode.
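A minimal sketch of how first-mode motion commands might be parameterised, assuming illustrative values for the speed setting and for which axes remain driven (neither is specified by the patent):

```python
# First mode of operation (step s10), sketched with hypothetical values.
# Only a proper subset of the six axes is driven ("joint movement"), and
# the speed setting is relatively high compared with the second mode.
FIRST_MODE = {
    "speed_fraction": 1.0,      # illustrative: full rated joint speed
    "movable_axes": (1, 2, 3),  # illustrative proper subset of the six axes
    "interpolation": "joint",   # joint-space motion -> curved end-effector paths
}


def first_mode_command(target_joint_angles_deg):
    """Build a joint-space move command, dropping axes that are locked in this mode."""
    return {
        "interpolation": FIRST_MODE["interpolation"],
        "speed_fraction": FIRST_MODE["speed_fraction"],
        "targets_deg": {
            axis: angle
            for axis, angle in enumerate(target_joint_angles_deg, start=1)
            if axis in FIRST_MODE["movable_axes"]
        },
    }
```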
Thus, the predetermined task is completed by the robot 102. After step s10, the process of Figure 3 may end.
Returning now to the case where, at step s8, the collision detection module 202 determines that the human 104 is present in the workspace 106, at step s12, the collision detection module 202 uses the received image data to determine relative positions of the robot's end effector 120 and the human 104 in the workspace 106. For example, respective positions of the robot's end effector 120 and the human 104 within a common coordinate system or reference frame may be determined using the image data. In some embodiments, the collision detection module 202 may use other sensor data instead of or in addition to the image data to determine the relative positions of the robot's end effector 120 and the human 104 in the workspace 106. For example, in some embodiments, the robot 102 may comprise one or more position sensors that measure a position of one or more of its portions 111-116 and/or the end effector 120. Data from such position sensors may be used to determine a position of the robot 102 within the workspace 106. Similarly, the human 104 may wear or carry a position sensor, data from which may be used to determine the position of the human 104.
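However the individual positions are obtained, step s12 reduces to expressing both in a common workspace frame. A small sketch, assuming the two positions are already available as Cartesian coordinates in that frame:

```python
import numpy as np


def relative_offset(end_effector_xyz, human_xyz):
    """Step s12 (sketch): vector from the end effector 120 to the human 104, with
    both positions already expressed in the common workspace frame. How each
    position is obtained (camera data, joint position sensors, a worn sensor)
    is left open, as in the description above."""
    return np.asarray(human_xyz, dtype=float) - np.asarray(end_effector_xyz, dtype=float)


def separation(end_effector_xyz, human_xyz) -> float:
    """Straight-line distance between end effector and human (same units as inputs)."""
    return float(np.linalg.norm(relative_offset(end_effector_xyz, human_xyz)))
```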
At step s14, the collision detection module 202 determines whether a collision between the robot's end effector 120 and the human 104 will occur at a future time. The collision detection module 202 uses the program 208 and the determined relative positions of the robot's end effector 120 and the human 104 in the workspace 106 to determine if a collision will occur.
More specifically, in this embodiment, the collision detection module 202 determines whether a collision between the robot's end effector 120 and the human 104 will occur if the human 104 maintains his/her current position and the robot's end effector 120 is controlled in accordance with the program 208 to move along a current path. For example, if the robot's end effector 120 is currently being controlled, or is due to be controlled, to move from the initial position 401 to the final position 405 along the preferred route, via the first intermediate position 402 (i.e. 401 → 402 → 405, as shown in Figure 4), and if the human 104 is located along this route (e.g. the human 104 is located at the first intermediate position 402), the collision detection module 202 determines that a collision between the robot's end effector 120 and the human 104 will occur.
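One way step s14 could be realised, sketched under the assumption that the route is approximated by straight segments between waypoints and that a fixed clearance radius around the human counts as a predicted collision (the clearance value is illustrative, not taken from the patent):

```python
import numpy as np


def point_to_segment_distance(p, a, b) -> float:
    """Shortest distance from point p to the straight segment a-b (3D coordinates)."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    denom = float(ab.dot(ab))
    t = 0.0 if denom == 0.0 else float(np.clip((p - a).dot(ab) / denom, 0.0, 1.0))
    return float(np.linalg.norm(p - (a + t * ab)))


def human_on_route(human_xyz, route_xyz, clearance_m=0.5) -> bool:
    """Step s14 (sketch): treat the route as straight segments between waypoint
    coordinates and flag a collision if the human is within `clearance_m` of
    any segment."""
    return any(
        point_to_segment_distance(human_xyz, a, b) < clearance_m
        for a, b in zip(route_xyz, route_xyz[1:])
    )
```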
If, at step s14, the collision detection module 202 determines that a collision between the robot's end effector 120 and the human 104 will not occur, the collision detection module 202 sends an indication to the motor control module 204 informing the motor control module 204 that the human 104 is present in the workspace 106 but that a collision will not occur. The method then proceeds to step s16.
However, if, at step s14, the collision detection module 202 determines that a collision between the robot's end effector 120 and the human 104 will occur, the collision detection module 202 sends an indication to the motor control module 204 informing the motor control module 204 that the human 104 is present in the workspace 106 and that a collision will occur. The method then proceeds to step s18. Step s18 will be described in more detail later below after a description of step s16.
In this embodiment, because no collision is anticipated but the human 104 is nevertheless present in the workspace 106, at step s16 the motor control module 204 controls the robot 102 to move in accordance with a second mode of operation. In the second mode of operation, the robot 102 is controlled to move at a relatively low, second speed (i.e. lower than the above-mentioned, relatively high first speed). In some embodiments, in the second mode the speed of the robot 102 may be limited to a relatively low threshold speed (e.g. less than or equal to 250 mm/s). In the second mode of operation, the robot 102 is controlled to move with so-called "linear movement". In particular, the second mode of operation is such that all of the robot's joints are allowed to move, either absolutely or at any given time (or at least more of the robot's joints are permitted to move compared to the number of joints that are permitted to move in the first mode of operation). Thus, in the second mode of operation, more of the robot's joints move (compared to the above-described first mode of operation). In the second mode of operation, the robot is controlled such that the end effector 120 moves along straight lines between positions/waypoints 401-405, with the orientation of the end effector locked. Such movement of the robot 102 tends to be more predictable for the human 104 in the workspace 106, thus making it easier for the human 104 to avoid colliding with the robot when both are moving around the workspace. Operation of the robot in the second mode may result in completion of the predetermined task in a longer time than if the robot had been controlled in the first mode.
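A minimal sketch of second-mode motion, assuming waypoint coordinates in millimetres, using the 250 mm/s figure mentioned above as the speed cap, and with an illustrative interpolation step size (the orientation lock is simply assumed):

```python
import numpy as np

SECOND_MODE_SPEED_LIMIT_MM_S = 250.0  # example threshold given in the description


def linear_moves(waypoints_xyz_mm, step_mm=10.0):
    """Second mode (sketch): interpolate the end effector 120 along straight
    lines between successive waypoints, yielding (position, speed_limit) pairs.
    `step_mm` is an illustrative interpolation step, not a value from the patent."""
    for a, b in zip(waypoints_xyz_mm, waypoints_xyz_mm[1:]):
        a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
        length = float(np.linalg.norm(b - a))
        steps = max(1, int(length / step_mm))
        for i in range(1, steps + 1):
            yield a + (b - a) * (i / steps), SECOND_MODE_SPEED_LIMIT_MM_S
```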
Thus, the predetermined task is completed by the robot 102. After step s16, the process of Figure 3 may end.
Returning now to the case where, at step s14, the collision detection module 202 determines that a collision between the robot 102 and the human 104 will occur, at step s18, the collision detection module 202 determines whether an alternative route for the robot 102 is available. More specifically, the collision detection module 202 uses the program 208 and the determined relative positions of the robot's end effector 120 and the human 104 in the workspace 106 to determine whether the program 208 specifies a route for the robot 102 that would allow the robot 102 to complete the predetermined task and which avoids collision with the human 104. For example, if the robot's current route is the preferred route (i.e. 401 → 402 → 405), and if the human 104 is located along this route at the first intermediate position 402, the collision detection module 202 may determine that an alternative route is available, namely the third route (i.e. 401 → 403 → 404 → 405). This third route avoids the first intermediate position 402 occupied by the human 104. On the other hand, if the robot's current route is the preferred route (i.e. 401 → 402 → 405), and if the human 104 is located along this route at the final position 405, the collision detection module 202 may determine that no alternative route is available.
It will be appreciated by those skilled in the art that the robot 102 may be controlled to move back along a path that it previously travelled in order to follow an unimpeded route. For example, consider the case where the robot's current route is the third route (i.e. 401 → 403 → 404 → 405) and the robot is currently located at the second intermediate position 403. If the human 104 is detected at the third intermediate position 404 or along either of the fourth or sixth paths 414, 416, the collision detection module 202 may determine that the robot 102 cannot continue following the third route or else a collision would occur. In this case, the collision detection module 202 may identify an alternative route that includes returning the robot 102 to a previous waypoint/position. For example, in this example case, the collision detection module 202 may determine that an alternative route 403 → 401 → 402 → 405 is available.
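Step s18 can be viewed as a search over the waypoint graph from the robot's current position, skipping waypoints occupied by the human. A breadth-first sketch (reusing the illustrative PATHS map from the earlier sketch) that naturally produces backtracking routes such as 403 → 401 → 402 → 405:

```python
from collections import deque


def alternative_route(paths, current, goal, blocked_waypoints):
    """Step s18 (sketch): fewest-hops route from the robot's current waypoint to
    the goal that avoids waypoints occupied by the human. Returns None if no
    such route exists, in which case the robot is stopped (step s20)."""
    adjacency = {}
    for a, b in paths:
        adjacency.setdefault(a, set()).add(b)
        adjacency.setdefault(b, set()).add(a)

    queue = deque([[current]])
    seen = {current}
    while queue:
        route = queue.popleft()
        node = route[-1]
        if node == goal:
            return route
        for nxt in adjacency.get(node, ()):
            if nxt not in seen and nxt not in blocked_waypoints:
                seen.add(nxt)
                queue.append(route + [nxt])
    return None


# With the illustrative PATHS map and the human detected at waypoint 404:
#   alternative_route(PATHS, current=403, goal=405, blocked_waypoints={404})
#   -> [403, 401, 402, 405], i.e. the robot backtracks through waypoint 401.
```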
If, at step s18, the collision detection module 202 determines that an alternative route is not available, the collision detection module 202 sends an indication to the motor control module 204 informing the motor control module 204 that no route for the robot 102 is available. The method then proceeds to step s20.
However, if, at step s18, the collision detection module 202 determines that an alternative route is available, the collision detection module 202 sends the alternative route for the robot 102 to the motor control module 204. The method then proceeds to step s22. Step s22 will be described in more detail later below, after a description of step s20.
At step s20, the motor control module 204 controls the robot 102 to stop moving. In this way, the movement of the robot 102 may be paused or frozen until an unimpeded route for the robot 102 becomes available, for example until the human 104 moves into a position that allows the robot 102 to complete the predetermined task. Once the human 104 is no longer impeding movement of the robot 102, movement of the robot 102 may be restarted and the robot 102 may continue to move along an unimpeded route.
In some embodiments, if the robot is unable to complete or continue with its current task, a new task may be started, and the previous task returned to at a later time.
After step s20, the process of Figure 3 may end.
At step s22, the motor control module 204 controls the robot 102 to move along the alternative route in accordance with the second mode of operation.
In some embodiments, if a potential collision is detected or determined, the robot 102 will be controlled to move away from the human 104 in one plane of motion, whilst continuing towards the goal. The avoidance path may be determined by calculating a set distance in one plane away from the detected human 104, and moving the robot by that distance in that plane. An advantage of this is that the number of potential avoidance movements the robot arm can execute tends to be limited, making the movements of the robot 102 easily predictable by the human 104.
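A sketch of such a single-plane avoidance move, assuming the plane is the horizontal x-y plane and the set distance is 0.5 m (both are illustrative assumptions, not values from the patent):

```python
import numpy as np


def single_plane_avoidance(end_effector_xyz, human_xyz, offset_m=0.5):
    """Sketch of the single-plane avoidance move: shift the end effector a set
    distance directly away from the human, constrained to the horizontal (x, y)
    plane so the avoidance motion stays in one plane."""
    ee = np.asarray(end_effector_xyz, dtype=float)
    human = np.asarray(human_xyz, dtype=float)
    away = ee[:2] - human[:2]                   # direction away from the human in x-y
    norm = float(np.linalg.norm(away))
    if norm == 0.0:
        away, norm = np.array([1.0, 0.0]), 1.0  # arbitrary direction if coincident
    target = ee.copy()
    target[:2] += (away / norm) * offset_m      # z unchanged: one plane of motion
    return target
```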
Thus, the predetermined task is completed by the robot 102. After step s22, the process of Figure 3 may end.
Thus, a process of controlling the robot 102 is provided.
Advantageously, the above-described system and method tends to improve safety in the workspace in which both the collaborative robot and the human are working. Risk of injury to the human and damage to the robot tends to be reduced.
Advantageously, the above-described system and method tends to provide that the behaviour of the robot, e.g. how it moves and the speed at which it moves, changes when a human is proximate. In particular, when a human enters the workspace, the robot changes from moving at high speed in a swinging, sweeping manner to moving at a relatively lower speed and such that the end effector is moved along linear paths. The present inventors have found that this tends to benefit the psychological well-being of the human. For example, human operators tend to be reassured by the change of behaviour of the robot when they move into proximity with the robot. The human's trust in the robotic system tends to be improved. The human tends to be reassured that they will not be injured or impeded by the robot. The mental workload and stress of the human working in the common workspace with the robot tends to be reduced as a result, which tends to result in greater productivity of the human. Also, the slower, linear movement of the robot tends to be easier for the human to predict, facilitating the human in predicting the robot's route and moving to a position that does not impede the robot, if necessary.
The above-described system and method tends to be advantageously simple. The system also allows for rapid processing with a low demand on the processors (i.e. the controller). Although the above-described system and method tends to be less complex compared to conventional collision avoidance systems and methods, safety of the operator tends not to be compromised.
Furthermore, the lower complexity tends to allow for integration of additional variables to improve the psychological safety of the operator (such as the speed of the robot and the proximity of the robot to the human).
In the above embodiments, the system comprises a single robot. However, in other embodiments the system comprises multiple robots.
In the above embodiments, the system comprises a single human. However, in other embodiments the system comprises multiple humans.
In the above embodiments, the robot is a six-axis robot. However, in other embodiments the robot has a different number of rotary axes about which it may be controlled to move. The robot may also include a different number of motors, i.e. other than six motors, for moving the robot. Also, in some embodiments, the robot may include one or more linear axes along which the robot may be moved. For example, the robot may be mounted to a rail or track along which it may be slid.
In the above embodiments, the end effector comprises a gripper. However, in other embodiments, the robot may comprise a different type of end effector which may include a tool or device other than a gripper.
In the above embodiments, the system comprises a camera system comprising visible light detecting cameras, image data from which is used to detect the presence of the human in the workspace and determine relative positions of the human and robot. However, in other embodiments, the system may comprise a different type of sensor system instead of or in addition to the camera system. Also, a different type of sensor data may be used instead of or in addition to the image data. For example, the system may comprise a Light Detection and Ranging (LIDAR) system, and LIDAR data may be used. In some embodiments, a different type of camera, e.g. one or more infrared cameras, may be used.
In the above embodiments, the first mode of operation corresponds to the robot moving at relatively high speed with "joint motion", i.e. such that the end effector is swept along curved paths. However, in other embodiments, the first mode of operation corresponds to the robot moving in a different manner (e.g. with linear motion).
In the above embodiments, the second mode of operation corresponds to the robot moving at relatively low speed with "linear motion", i.e. such that the end effector follows substantially straight-line paths. However, in other embodiments, the second mode of operation corresponds to the robot moving in a different manner (e.g. with joint motion). In some embodiments, the second mode of operation includes the robot being controlled to move at a speed that is some function of (e.g. proportional to, such as linearly proportional to) the distance between the robot and the human in the workspace. This distance may be determined from the image data, e.g. from the relative positions of the robot and human.
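A sketch of such a distance-dependent speed law, assuming a linear scaling clamped to the second-mode speed cap; both parameter values are illustrative assumptions:

```python
def second_mode_speed_mm_s(distance_to_human_m,
                           max_speed_mm_s=250.0,
                           full_speed_distance_m=2.0):
    """Sketch of a linearly proportional speed law for the second mode: the
    closer the human, the slower the robot, up to the mode's speed cap."""
    scale = max(0.0, min(1.0, distance_to_human_m / full_speed_distance_m))
    return max_speed_mm_s * scale
```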
In the above embodiments, the program is as described above with reference to Figure 4. However, in other embodiments the program is a different type of program.

Claims (15)

  1. CLAIMS 1. A system for robot and human collaboration, the system comprising: a multi-axis robot located within a workspace; a sensor system configured to detect objects within the workspace; and a controller operatively coupled to the robot and the sensor system, the controller being configured to: control operation of the robot; detect, using an output of the sensor system, a presence of the human in the workspace; responsive to detecting that the human is not located in the workspace, operate the robot in a first mode of operation; responsive to detecting that the human is located in the workspace, operate the robot in a second mode of operation, the second mode of operation being different to the first mode of operation; determine, using the output of the sensor system, relative positions in the workspace of the human and the robot; determine, using the determined relative positions in the workspace of the human and the robot, whether the human is positioned on a route along which the robot is to be controlled to move; responsive to determining that the human is not positioned on the route along which the robot is to be controlled to move, control the robot to move along the route; and, responsive to determining that the human is positioned on the route along which the robot is to be controlled to move, control the robot to either stop moving or move along a different route so as to avoid collision with the human.
  2. 2. The system of claim 1, wherein: the first mode of operation comprises the robot being controlled to move at less than or equal to a first speed; the second mode of operation comprises the robot being controlled to move at less than or equal to a second speed, the second speed being different to the first speed.
  3. 3. The system of claim 2, wherein the first speed is greater than the second speed.
  4. 4. The system of any of claims 1 to 3, wherein, in the first mode of operation, movement of the robot is restricted such that the robot can rotate around only a limited, proper subset of its axes.
  5. 5. The system of any of claims 1 to 4, wherein, in the first mode of operation, the robot is controlled such that an end effector of the robot moves along one or more curved paths.
  6. 6. The system of any of claims 1 to 5, wherein, in the second mode of operation, the robot is permitted to be rotated around all of its axes.
  7. 7. The system of any of claims 1 to 6, wherein, in the second mode of operation, the robot is controlled such that an end effector of the robot moves along only one or more substantially straight paths.
  8. 8. The system of any of claims 1 to 7, wherein the controller is further configured to: detect, using the output of the sensor system, that the human has entered the workspace; and, responsive to detecting that the human has entered the workspace, switch operation of the robot from the first mode to the second mode.
  9. 9. The system of any of claims 1 to 8, further comprising: a memory and a program stored in the memory; wherein the program is for controlling the robot to perform a task; the program specifies a plurality of different routes for the robot; and the controller is configured to control the robot using the program.
  10. 10. The system of claim 9, wherein the controller is configured to: responsive to determining that the human is positioned on the route along which the robot is to be controlled to move, determine, using the program, whether an alternative route for the robot is available, the alternative route avoiding collision with the human and such that the robot completes the task; responsive to determining that the alternative route for the robot is available, control, using the program, the robot to move along the alternative route; and, responsive to determining that the alternative route for the robot is not available, control the robot to either stop moving or to move away from the human.
  11. 11. The system of any of claims 1 to 10, wherein the controller is configured to, responsive to determining that the human is positioned on the route along which the robot is to be controlled to move, control the robot to move away from the human in a single plane of motion.
  12. 12. The system of any of claims 1 to 11, wherein the sensor system comprises one or more sensors selected from the group of sensors consisting of: one or more cameras, one or more visible light detecting cameras, one or more infrared cameras, and one or more Light Detection and Ranging (LIDAR) sensors.
  13. 13. A method for controlling a robot in a robot-human collaborative system, the system comprising a multi-axis robot located within a workspace, a sensor system configured to detect objects within the workspace, and a controller operatively coupled to the robot and the sensor system, the method comprising: detecting, by the controller, using an output of the sensor system, a presence of the human in the workspace; responsive to detecting that the human is not located in the workspace, operating, by the controller, the robot in a first mode of operation; responsive to detecting that the human is located in the workspace, operating, by the controller, the robot in a second mode of operation, the second mode of operation being different to the first mode of operation; determining, by the controller, using the output of the sensor system, relative positions in the workspace of the human and the robot; determining, by the controller, using the determined relative positions in the workspace of the human and the robot, whether the human is positioned on a route along which the robot is to be controlled to move; responsive to determining that the human is not positioned on the route along which the robot is to be controlled to move, controlling, by the controller, the robot to move along the route; and, responsive to determining that the human is positioned on the route along which the robot is to be controlled to move, controlling, by the controller, the robot to either stop moving or move along a different route so as to avoid collision with the human.
  14. 14. A program or plurality of programs arranged such that when executed by a computer system or one or more processors comprised within a controller it/they cause the computer system or the one or more processors to operate in accordance with the method of claim 13.
  15. 15. A machine-readable storage medium storing a program or at least one of the plurality of programs according to claim 14.
GB2007583.4A 2020-05-21 2020-05-21 Collaborative robot system Pending GB2595289A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2007583.4A GB2595289A (en) 2020-05-21 2020-05-21 Collaborative robot system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2007583.4A GB2595289A (en) 2020-05-21 2020-05-21 Collaborative robot system

Publications (2)

Publication Number Publication Date
GB202007583D0 GB202007583D0 (en) 2020-07-08
GB2595289A true GB2595289A (en) 2021-11-24

Family

ID=71406363

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2007583.4A Pending GB2595289A (en) 2020-05-21 2020-05-21 Collaborative robot system

Country Status (1)

Country Link
GB (1) GB2595289A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009155947A1 (en) * 2008-06-26 2009-12-30 Abb Ag Control system and method for control
US20140244037A1 (en) * 2013-02-27 2014-08-28 Rockwell Automation Technologies, Inc. Recognition-based industrial automation control with person and object discrimination
US20180243920A1 (en) * 2015-08-25 2018-08-30 Kawasaki Jukogyo Kabushiki Kaisha Industrial remote control robot system
US20190070730A1 (en) * 2017-09-07 2019-03-07 Fanuc Corporation Robot system
EP3473387A1 (en) * 2017-10-20 2019-04-24 ABB Schweiz AG Robot supervision system
US20190351554A1 (en) * 2018-05-15 2019-11-21 Fanuc Corporation Robot system
US20200086487A1 (en) * 2018-09-13 2020-03-19 The Charles Stark Draper Laboratory, Inc. Robot Interaction With Human Co-Workers

Also Published As

Publication number Publication date
GB202007583D0 (en) 2020-07-08
