WO2022090895A1 - Computer-implemented method for the real-time control of an antropomorphic robot and related system for the real-time control - Google Patents

Info

Publication number
WO2022090895A1
Authority
WO
WIPO (PCT)
Application number
PCT/IB2021/059822
Other languages
French (fr)
Inventor
Davide PASSONI
Original Assignee
Sir S.P.A.
Application filed by Sir S.P.A. filed Critical Sir S.P.A.
Priority to EP21807253.6A (published as EP4237210A1)
Publication of WO2022090895A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1689 Teleoperation
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/35 Nc in input of data, input till input file format
    • G05B2219/35444 Gesture interface, controlled machine observes operator, executes commands
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/37 Measurements
    • G05B2219/37567 3-D vision, stereo vision, with two cameras

Abstract

The computer-implemented method (100) for the real-time control of an anthropomorphic robot (2) comprises the following steps: by means of at least one camera (8) arranged at the point where the grippers (7) of an anthropomorphic robot (2) are located, detecting images of at least one object (5) to be handled and of the surrounding area; by means of graphical user interface means (13) accessible by an operator (O), displaying at least the images detected by the camera (8); by means of at least one three-dimensional vision sensor (14), detecting the movements of at least the hands of the operator (O); moving the anthropomorphic robot (2) depending on the movements of the operator (O) detected for the real-time handling of the object (5).

Description

COMPUTER-IMPLEMENTED METHOD FOR THE REAL-TIME CONTROL OF AN ANTROPOMORPHIC ROBOT AND RELATED SYSTEM FOR THE REAL-TIME CONTROL
Technical Field
The present invention relates to a computer-implemented method for the real-time control of an anthropomorphic robot and to a related system for the real-time control.
Background Art
With reference to robotics for industrial applications, there are known applications, particularly involving complex handling operations, in which a robot cannot effectively perform the expected functions by means of a preset automatic program.
By way of example, one application which is extremely difficult from the robotic automation point of view is waste sorting. In fact, a robot dedicated to sorting waste inside a bin cannot operate autonomously, as it cannot know in advance which elements it will have to handle and what material each element is made of. In this case, even the application of advanced vision systems does not solve the problem.
Description of the Invention
The main aim of the present invention is to devise a computer-implemented method for the real-time control of an anthropomorphic robot and a related system for the real-time control which allows effective real-time control of a robot.
The aforementioned objects are achieved by the present computer-implemented method for the real-time control of an anthropomorphic robot according to the characteristics described in claim 1.
The aforementioned objects are also achieved by the present system for the real-time control of an anthropomorphic robot according to the characteristics described in claim 11.
Brief Description of the Drawings
Other characteristics and advantages of the present invention will become more apparent from the description of a preferred, but not exclusive, embodiment of a computer-implemented method for the real-time control of an anthropomorphic robot and a related system for the real-time control, illustrated by way of an indicative, yet non-limiting example, in the accompanying tables of drawings wherein:
Figure 1 is a schematic view of a robotic cell of the system according to the invention;
Figure 2 is a schematic view of a real-time control station of the system according to the invention;
Figure 3 schematically illustrates a possible image displayed inside the control station and collected by a camera on the anthropomorphic robot inside the robotic cell;
Figure 4 is a general block diagram of the computer-implemented method according to the invention.
Embodiments of the Invention
With particular reference to such figures, reference numeral 1 globally relates to a system for the real-time control of an anthropomorphic robot 2.
The system 1 comprises a robotic cell 3, schematically shown in Figure 1.
The robotic cell 3 comprises the anthropomorphic robot 2 and at least one area 4 intended to house at least one object 5 to be handled by means of the anthropomorphic robot 2.
In particular, if the system 1 is used for sorting objects 5 which are heterogeneous with respect to each other, the robotic cell 3 comprises a main area 4 intended to house a plurality of objects 5 which are heterogeneous with respect to each other to be sorted and at least one secondary area 6 intended to house the sorted objects 5 of the same type.
For example, the main area 4 inside the robotic cell 3 may comprise at least one bin containing a plurality of objects 5 which are heterogeneous with respect to each other to be sorted by means of the anthropomorphic robot 2, while the secondary area 6 may comprise a plurality (two or more) of other bins intended to house the sorted objects 5.
According to a possible embodiment, wherein the system 1 is employed for the handling of objects, the anthropomorphic robot 2 comprises grippers 7 which are adapted to grasp and handle the object 5.
For example, the grippers 7 of the anthropomorphic robot 2 may have two or three jaws. In the case of a sorting application, the grippers 7 are preferably made in a claw fashion (i.e. with a compass closure). The use of grippers 7 of different types cannot however be ruled out.
Furthermore, the anthropomorphic robot 2 comprises at least one camera 8 arranged at the point where the grippers 7 are located.
According to a preferred embodiment, the camera 8 is arranged in a substantially central position between the grippers 7.
Preferably, the camera 8 is built into the wrist of the anthropomorphic robot 2 in a central position between the grippers 7.
According to a possible embodiment of the system 1, the robotic cell 3 comprises an additional camera 9 associated with the anthropomorphic robot 2 in a different position from the camera 8 and/or arranged in a fixed position inside the robotic cell itself.
For example, the additional camera 9 may be positioned on the wrist of the anthropomorphic robot 2 but externally to the grippers 7.
Alternatively, the additional camera 9 may be positioned inside the robotic cell 3, at the top and in a fixed position, so as to frame the entire working area of the anthropomorphic robot 2.
Furthermore, the system 1 comprises at least one distance sensor 10 associated with the grippers 7 of the anthropomorphic robot 2.
The distance sensor 10 is used to detect the height (vertical distance) of the grippers 7 with respect to an object 5 to be picked up.
The distance sensor 10 may comprise at least one of an ultrasonic sensor, a laser sensor, or a contact sensor.
The system 1 also comprises a real-time control station 11 of the anthropomorphic robot 2, schematically shown in Figure 2.
The real-time control station 11 comprises at least one processing unit 12 operationally connected to the anthropomorphic robot 2.
The processing unit 12 may be composed of one or more processors, such as personal computers or the like.
Furthermore, the real-time control station 11 comprises graphical user interface means 13 accessible by an operator O, operationally connected to the processing unit 12 and configured to display the images detected by said camera 8 and/or the images detected by the additional camera 9.
The real-time control station 11 also comprises at least one three-dimensional vision sensor 14 operationally connected to the processing unit 12 and configured to detect the movements of at least the hands of the operator O.
Preferably, the three-dimensional vision sensor 14 is a high-speed stereoscopic sensor.
The three-dimensional vision sensor 14 is preferably positioned in front of the operator O and frames the operator’s hands to detect the movements thereof.
The three-dimensional vision sensor 14 sends the acquired images to the processing unit 12 in a continuous mode.
The processing unit 12 comprises processing means 15 for the execution of the computer-implemented method described below for the real-time control of the anthropomorphic robot 2.
Such processing means 15 are implemented by means of one or more dedicated software programs.
In particular, the processing means 15 for the execution of the computer- implemented method comprise a software program for the recognition of human gestures (and in particular all joints, or axes of movement, of human limbs).
Therefore, starting from the image coming from the three-dimensional vision sensor 14 which frames the operator O, the human-gesture recognition software is able to recognize the elbow joint, the wrist joint, the knuckles of the fingers, the phalanges, and the fingertips of the right and left hands.
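By way of a purely illustrative example, this recognition stage might be sketched as follows, using the open-source MediaPipe Hands library as a stand-in for the gesture recognition software (the patent does not name any specific library, and the camera index is an assumption):

```python
# Illustrative sketch only: hand-joint recognition with MediaPipe Hands,
# standing in for the (unnamed) gesture recognition software of the patent.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def track_hands():
    cap = cv2.VideoCapture(0)  # assumed camera index; the patent uses a stereoscopic sensor
    with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.7) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB frames; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                for hand in results.multi_hand_landmarks:
                    # 21 landmarks per hand: wrist, knuckles (MCP), phalanx
                    # joints (PIP/DIP) and fingertips, as normalized x, y, z.
                    wrist = hand.landmark[mp_hands.HandLandmark.WRIST]
                    knuckle = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_MCP]
                    print(f"wrist=({wrist.x:.2f}, {wrist.y:.2f}, {wrist.z:.2f}), "
                          f"index knuckle=({knuckle.x:.2f}, {knuckle.y:.2f}, {knuckle.z:.2f})")
    cap.release()

if __name__ == "__main__":
    track_hands()
```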
In addition, the processing means 15 comprise a software program for the management of the robotic cell 3 configured to receive, at input, commands from the software for the recognition of gestures, images from the camera 8 and from one or more additional cameras 9, and distance data from the distance sensor 10.
The software program for the management of the robotic cell 3 also receives input data from the control of the anthropomorphic robot 2, such as linear and reorientation speed, motion status, and status of the grippers 7.
In addition, the software program for the management of the robotic cell 3 sends predefined (linear or reorientation) movement instructions and commands to the anthropomorphic robot 2, as well as data regarding the speed to be set (linear and reorientation).
The computer-implemented method 100 for the real-time control of an anthropomorphic robot is described in detail below and is schematically shown in Figure 4.
The computer-implemented method 100 comprises at least the following steps:
- by means of the camera 8 arranged at the point where the grippers 7 of the anthropomorphic robot 2 are located, detecting images of at least one object 5 to be handled and of the surrounding area (step 110);
- by means of the graphical user interface means 13 accessible by an operator O, displaying at least the images detected by the camera 8 (step 120);
- by means of the three-dimensional vision sensor 14, detecting the movements of at least the hands of the operator O (step 130);
- moving the anthropomorphic robot 2 depending on the detected movements of the operator O for the real-time handling of the object 5 (step 140).
In particular, the step 130 of detecting the movements of at least the hands of the operator O comprises at least the following steps:
- acquiring, by means of the three-dimensional vision sensor 14, at least one image of the hands and/or of other parts of the body of the operator O;
- starting from the at least one acquired image, processing at least one point cloud of the hands and/or of other parts of the body of the operator O;
- starting from such processed point cloud, identifying at least one command for the real-time movement of the anthropomorphic robot 2.
According to a possible embodiment of the method 100, the step of identifying at least one command for the real-time movement of the anthropomorphic robot 2 comprises at least the following steps:
- starting from the processed points cloud, determining the coordinates (X, Y, Z) of axes of movement relating to the hands and/or to other parts of the body of the operator O;
- calculating a rotary-translation matrix (RX, RY, RZ) of the hands and/or of other parts of the body of the operator O in the space starting from the coordinates (X, Y, Z);
- determining at least one command for the real-time movement of the anthropomorphic robot 2 starting from the coordinates (X, Y, Z) of the axes of movement and from the succession of the angles of the rotary-translation matrix (RX, RY, RZ).
In particular, the coordinates (X, Y, Z) of the axes of movement comprise coordinates relating to the knuckles of the index, middle, ring and little fingers of one hand of the operator O.
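The text does not spell out how the rotary-translation matrix is computed. One standard way to recover a rigid rotation and translation from two sets of corresponding 3-D points, for instance the four knuckle coordinates in a reference pose and in the current frame, is the Kabsch (SVD-based) alignment sketched below; the Euler-angle convention is an assumption:

```python
# Illustrative sketch: recovering a rigid rotation R and translation t from
# corresponding 3-D points (e.g. the four knuckles in a reference pose and in
# the current frame) with the Kabsch/SVD algorithm. Not taken from the patent.
import numpy as np

def rigid_transform(ref_pts: np.ndarray, cur_pts: np.ndarray):
    """ref_pts, cur_pts: (N, 3) corresponding points; returns (R, t) with
    cur ~= R @ ref + t."""
    ref_c, cur_c = ref_pts.mean(axis=0), cur_pts.mean(axis=0)
    H = (ref_pts - ref_c).T @ (cur_pts - cur_c)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cur_c - R @ ref_c
    return R, t

def euler_angles(R: np.ndarray):
    """(RX, RY, RZ) assuming R = Rz @ Ry @ Rx (extrinsic X-Y-Z convention)."""
    ry = np.arcsin(-R[2, 0])
    rx = np.arctan2(R[2, 1], R[2, 2])
    rz = np.arctan2(R[1, 0], R[0, 0])
    return rx, ry, rz
```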
Conveniently, the method 100 comprises at least one step of selecting a (linear and reorientation) speed of movement of said anthropomorphic robot.
In particular, by means of a command of the software program for the management of the robotic cell 3, the anthropomorphic robot 2 is first switched to the automatic mode (motors on and automatic status).
Through the management interface of the software program for the management of the robotic cell 3, it is also possible to select (via text-box) the (linear and reorientation) speed of the subsequent movements.
Conveniently, the computer-implemented method 100 comprises at least one preliminary step of positioning the anthropomorphic robot 2 in an initial home position which is retracted at the center of the robotic cell 3, as a result of a predefined command which is detected using the three-dimensional vision sensor 14.
For example, according to one possible embodiment, the initial home position command may be given by the operator O with his/her left hand (closed fist and thumb down). As a result of such a command, the anthropomorphic robot 2 moves until it reaches the home position (out of the way of all bins and in a retracted position at the center of the cell). If the operator O intends to stop the movement, he/she can raise his/her left open hand (stop). Such commands are detected by the three-dimensional vision sensor 14 and interpreted by the software program for the gesture control implemented on the processing unit 12.
According to a preferred embodiment, the method 100 comprises, as a result of a predefined command, which is detected by means of the three-dimensional vision sensor 14, at least one positioning step of the anthropomorphic robot 2 in a start picking position, wherein the grippers 7 are open and arranged above the object 5 (or objects) to be handled.
For example, according to a possible embodiment, the operator O may give the command for the start picking position with his/her left hand (closed fist and thumb upwards). As a result of such a command, the anthropomorphic robot 2 moves until it reaches the start picking position above and at the center of the bin of the objects 5 to be picked up, with the grippers 7 open. If the operator O wishes to stop the movement, he/she may raise his/her left open hand (stop). Such commands are detected by the three-dimensional vision sensor 14 and interpreted by the software program for the gesture control implemented on the processing unit 12.
Furthermore, the method 100 also comprises, as a result of a predefined command which is detected by means of said three-dimensional vision sensor, at least one step of starting a real-time operating mode of the anthropomorphic robot 2, wherein the anthropomorphic robot 2 is ready to receive and execute real-time commands.
Therefore, after positioning in the start picking position, the anthropomorphic robot 2 is switched to the real-time run mode (e.g., by means of a closed fist command, without thumb up or down). In this mode, the anthropomorphic robot 2 is ready to receive movement commands in the various directions by means of the relevant gestures of the operator O. If the operator wishes to stop the movement, he/she will raise his/her left open hand (stop). Such commands are detected by the three-dimensional vision sensor 14 and interpreted by the software program for the gesture control implemented on the processing unit 12.
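A minimal sketch of the left-hand gesture-to-mode mapping described in the preceding paragraphs is given below; the upstream gesture classifier is assumed to exist, and only the mapping itself follows the text:

```python
# Illustrative sketch of the left-hand command mapping; the gesture classifier
# upstream is assumed, only the mode mapping follows the patent text.
from enum import Enum, auto

class Gesture(Enum):
    FIST_THUMB_DOWN = auto()   # go to the initial home position
    FIST_THUMB_UP = auto()     # go to the start picking position
    FIST = auto()              # enter the real-time run mode
    OPEN_HAND = auto()         # stop

class RobotMode(Enum):
    HOME = auto()
    START_PICKING = auto()
    RUN = auto()
    STOP = auto()

GESTURE_TO_MODE = {
    Gesture.FIST_THUMB_DOWN: RobotMode.HOME,
    Gesture.FIST_THUMB_UP: RobotMode.START_PICKING,
    Gesture.FIST: RobotMode.RUN,
    Gesture.OPEN_HAND: RobotMode.STOP,
}

def update_mode(current: RobotMode, gesture: Gesture) -> RobotMode:
    # An open left hand always stops the robot; the other gestures select
    # the corresponding mode. Unknown gestures leave the mode unchanged.
    return GESTURE_TO_MODE.get(gesture, current)
```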
In the start picking position, the operator O sees the image of the objects 5 to be picked up inside the bin by means of the camera 8 on the wrist of the anthropomorphic robot 2, thus from the “point of view” of the grippers 7. The position and rotation of the grippers 7 (and consequently of the jaws) is always correctly visible, since the rotational data of the wrist axis are continuously sent by the anthropomorphic robot 2 to the management software program, which will update the graphics displayed by the graphical user interface means 13 accordingly.
Preferably, the jaws are displayed superimposed on the displayed image and are scaled in size depending on the distance between the grippers 7 and the objects 5 in the bin, detected by the distance sensor 10.
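A possible implementation of this distance-dependent scaling of the overlay, under a simple pinhole-camera assumption (the constants are illustrative and not taken from the text):

```python
# Illustrative sketch: scaling the superimposed jaw graphics with the distance
# measured by the sensor, under a pinhole-camera assumption. Constants are
# assumed values, not taken from the patent.
def jaw_overlay_size(distance_m: float,
                     size_px_at_1m: float = 80.0,
                     min_px: float = 10.0,
                     max_px: float = 400.0) -> float:
    """On-screen size of the jaw overlay: inversely proportional to distance."""
    distance_m = max(distance_m, 1e-3)   # avoid division by zero near contact
    return min(max(size_px_at_1m / distance_m, min_px), max_px)
```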
The aforementioned steps 130 and 140 of detecting the movements of the operator O and of moving the anthropomorphic robot 2 comprise, respectively, at least the following steps:
- detecting by means of the three-dimensional vision sensor 14 at least one movement of a hand of the operator O along at least one direction in a horizontal and/or vertical plane (step 131);
- in case of movement in the detected horizontal and/or vertical plane, moving the grippers 7 along such at least one direction in the horizontal and/or vertical plane (step 141).
Therefore, at this point the operator O, by means of gestures, may position himself/herself on the object 5 he/she wishes to pick up.
If he/she wishes to move in the horizontal plane, he/she must move his/her hand forward, backward, right, left in the horizontal plane. Such movements are detected by the three-dimensional vision sensor 14 and interpreted by the software program for the gesture control implemented on the processing unit 12. Corresponding linear movement instructions (forward, backward, right, left relative to the current position, incrementally) with the previously selected linear speed are sent to the anthropomorphic robot 2.
According to a preferred embodiment, the movement of the anthropomorphic robot 2 in the selected direction continues until the operator O carries out the opposite movement (if he/she has moved his/her hand forward, he/she must move it slightly backward): the change of direction is interpreted as a stop. Forward-right, backward-right, forward-left, backward-left diagonal movements are also possible.
Similarly, the operator O, by means of the gestures, can move the grippers 7 of the anthropomorphic robot 2 also in the vertical plane, varying the distance with respect to the object to be picked up. If he/she wishes to move in the vertical plane, he/she has to move his/her hand up, down, right, left in the vertical plane. These movements are detected by the three-dimensional vision sensor 14 and interpreted by the software program for the gesture control implemented on the processing unit 12. Linear movement instructions with the previously selected linear speed are sent to the anthropomorphic robot 2. The movement of the robot in the selected direction continues until the operator O carries out the opposite movement (if he/she has moved his/her hand up, he/she must move it slightly down): the change of direction is interpreted as a stop. Up-right, down-right, up-left, down-left diagonal movements are also possible.
It is possible to make joint movements in both the horizontal and vertical planes by moving the hand along a diagonal in space.
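The jog logic described above, with the change of direction interpreted as a stop, might be sketched as follows; the deadband threshold is an assumed value:

```python
# Illustrative sketch of the jog logic: a hand displacement starts an
# incremental movement in that direction, and a reversal is a stop.
from typing import Optional, Tuple
import numpy as np

DEADBAND = 0.02  # metres of hand motion ignored as noise (assumed value)

def jog_command(prev_dir: Optional[np.ndarray],
                hand_delta: np.ndarray) -> Tuple[Optional[np.ndarray], bool]:
    """Return (direction_to_jog, stop_flag) from the latest hand displacement.

    prev_dir: unit direction currently being jogged, or None when idle.
    hand_delta: (x, y) hand displacement in the plane since the last frame.
    """
    norm = float(np.linalg.norm(hand_delta))
    if norm < DEADBAND:
        return prev_dir, False            # no significant motion: keep going
    new_dir = hand_delta / norm
    if prev_dir is not None and float(new_dir @ prev_dir) < 0.0:
        return None, True                 # reversal of direction: stop
    return new_dir, False                 # start or continue jogging this way
```

Diagonal movements fall out naturally, since the jog direction is a free unit vector in the plane.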
The method 100 also comprises at least the following steps:
- by means of the distance sensor 10 associated with the grippers 7 of the anthropomorphic robot 2, detecting the distance of the grippers 7 from the object 5 to be handled (step 150);
- by means of the graphical user interface means 13, visually signaling to the operator O the distance of the grippers 7 from the object 5 to be handled (step 160).
In particular, the step 160 of visually signaling the distance of the grippers 7 from the object 5 to be handled comprises displaying signals of different colors according to the distance. For example, when the grippers 7, during the descent phase, start approaching the object 5 to be picked up, the graphical user interface means 13 display a signal of different color according to the distance: green in the event of the grippers 7 being out of the way, yellow in the event of these being at a short distance from the element to be picked up, red in the event of these being in a near collision position with the object 5 to be picked up (or in general with the elements inside a bin).
The descent movement of the grippers 7 is inhibited for safety reasons after entering the red area.
In addition, the step 160 of visually signaling the distance of the grippers 7 from the object 5 to be handled comprises displaying the approximate distance between the grippers and the object.
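A minimal sketch of the three-color signaling and of the safety inhibition of the descent in the red zone follows; the text fixes only the color scheme, so the threshold values are assumptions:

```python
# Illustrative sketch of the green/yellow/red distance signal and of the
# descent inhibition in the red zone. Thresholds are assumed values.
YELLOW_BELOW_M = 0.15   # assumed: "short distance" threshold
RED_BELOW_M = 0.03      # assumed: near-collision threshold

def distance_signal(distance_m: float) -> str:
    if distance_m < RED_BELOW_M:
        return "red"     # near collision: descent inhibited
    if distance_m < YELLOW_BELOW_M:
        return "yellow"  # close to the element to be picked up
    return "green"       # grippers out of the way

def descent_allowed(distance_m: float) -> bool:
    return distance_signal(distance_m) != "red"
```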
By means of the user interface means 13, additional text boxes are also displayed in which the current position of the anthropomorphic robot 2 is indicated in Cartesian coordinates (X, Y, Z and Euler angles RX, RY, RZ) and in absolute angles of the axes (Axis 1, Axis 2, Axis 3, Axis 4, Axis 5, Axis 6).
Furthermore, the aforementioned steps 130 and 140 of detecting the movements of the operator O and of moving the anthropomorphic robot 2 comprise, respectively, at least the following steps:
- detecting, by means of the three-dimensional vision sensor 14, at least one rotation of a hand of the operator O around at least one axis of rotation (step 132);
- in case of rotation around at least one detected axis, rotating the grippers 7 around said at least one axis of rotation (step 142).
Therefore, the method 100 according to the invention also allows interpolated and not only linear movements to be carried out with the anthropomorphic robot 2.
For example, the position of the right hand with the knuckles more or less horizontal (with the forearm in a horizontal and forward position) may be assumed as “angle 0” in the three directions of space. If the hand (and consequently the knuckles) is rotated in a clockwise direction, the anthropomorphic robot 2 will orient the wrist axis, and consequently the grippers 7, in a clockwise direction. If the hand (and consequently the knuckles) is rotated counterclockwise, the anthropomorphic robot 2 will orient the wrist axis, and consequently the grippers 7, counterclockwise. This is necessary to align the jaws of the grippers 7 with the object 5 in order to perform an optimal grip. The anthropomorphic robot 2 will be sent wrist reorientation movement instructions (clockwise and counterclockwise, respectively, for the wrist axis relative to the current angular position, incrementally) with the previously selected reorientation speed. The reorientation movement of the wrist axis is substantially a rotational movement around the Z axis of the robot tool, whose origin triad is positioned at the end of the grippers 7 (Z exiting the grippers and parallel to the vertical axis thereof). The reorientation movement of the anthropomorphic robot 2 in the selected direction continues until the operator carries out the opposite movement (if he/she has rotated his/her hand clockwise, he/she must bring it back to the zero or home position or carry out the opposite movement): the change of direction is interpreted as a stop.
The operator O can also carry out more complex interpolated movements with the anthropomorphic robot 2, in order to manage the possible inclined grip of an object 5. Such movements substantially represent a rotation around the X and Y axes of the robot tool, whose origin triad is positioned at the end of the grippers 7 (Z exiting the grippers and parallel to the vertical axis thereof).
Given the definition of “angle 0” above, if the knuckles are oriented (by rotating the hand) to the right, left, forward or backward, the anthropomorphic robot 2 will carry out the corresponding tool rotation around the X axis (identified by the left/right hand rotation) or the Y axis (identified by the forward/backward hand rotation). The tool origin triad, positioned at the end of the grippers 7, remains the fixed point of rotation.
The method 100 according to the invention comprises at least one step of managing two-axis interpolated rotation movements as a result of the detection of the forward-right, forward-left, backward-right, backward-left inclination of the operator hand knuckles. The anthropomorphic robot 2 is sent respective rotational movement instructions around the axes X and Y of the tool and with respect to the current angular position, incrementally, with the previously selected reorientation speed.
The reorientation movement of the anthropomorphic robot 2 in the selected direction continues until the operator O carries out the opposite movement (if he/she has rotated his/her hand in one direction, he/she must bring it back to the zero position or carry out the opposite movement): the change of direction is interpreted as a stop.
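The incremental tool-frame reorientation described in the last few paragraphs might be sketched as follows, with the hand angles (RX, RY, RZ) taken from the rotary-translation matrix; the gain and deadband are assumed values:

```python
# Illustrative sketch: mapping hand angles relative to "angle 0" to
# incremental rotations of the tool frame. Gain and deadband are assumed.
ANGLE_DEADBAND_RAD = 0.05   # ignore small hand tremors (assumed value)

def reorientation_increments(rx: float, ry: float, rz: float,
                             gain: float = 0.25):
    """rx, ry, rz: hand angles from the rotary-translation matrix.
    Returns incremental rotations about the tool X, Y and Z axes:
    left/right tilt -> tool X, forward/backward tilt -> tool Y,
    clockwise/counterclockwise roll of the knuckles -> tool Z."""
    def step(angle: float) -> float:
        return gain * angle if abs(angle) > ANGLE_DEADBAND_RAD else 0.0
    return step(rx), step(ry), step(rz)
```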
The aforementioned steps 130 and 140 of detecting the movements of the operator O and of moving the anthropomorphic robot 2 comprise, respectively, at least the following steps:
- by means of the three-dimensional vision sensor 14, detecting at least one closing command of the grippers or one opening command of the grippers (step 133);
- in case of detected grippers closing command or grippers opening command, closing/opening said grippers 7 to grasp/release said object O to be handled (step 143).
In particular, according to a possible embodiment, once in the gripping position, the anthropomorphic robot 2 is brought to the stop mode by opening the left hand in the air. Once the movement of the anthropomorphic robot 2 has stopped, the operator O can bring the thumb and the index finger of the right hand together: this operation is interpreted by the anthropomorphic robot 2 as a gripper closing command. The grippers can only be closed with the robot in the stop mode: finger closing and opening gestures during the movement phase will not be taken over by the management software.
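The pinch-based gripper command, gated on the robot being in stop mode, might be sketched as follows; the fingertip source and the threshold are assumptions:

```python
# Illustrative sketch of the pinch-based open/close command, accepted only
# in stop mode as required by the text. Threshold is an assumed value.
import math

PINCH_THRESHOLD_M = 0.02   # assumed: thumb and index tips nearly touching

def is_pinch(thumb_tip, index_tip) -> bool:
    """thumb_tip, index_tip: (x, y, z) fingertip positions in metres."""
    return math.dist(thumb_tip, index_tip) < PINCH_THRESHOLD_M

def gripper_command(robot_mode: str, thumb_tip, index_tip,
                    fingers_fully_extended: bool):
    # Per the text, open/close gestures are only accepted with the robot in
    # stop mode; during the movement phase they are ignored.
    if robot_mode != "stop":
        return None
    if is_pinch(thumb_tip, index_tip):
        return "close_grippers"
    if fingers_fully_extended:
        return "open_grippers"
    return None
```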
Once the object 5 to be handled has been grasped, the operator O, after having set the anthropomorphic robot 2 back in the movement condition with his/her left hand (closed fist without thumb up or down), can bring the anthropomorphic robot 2 back to the out of the way position above the bin, with the object 5 held in the grippers, using, for the ascent, the movements previously described for the right hand. Alternatively, if the conditions and the current position of the robot allow it, he/she can give with his/her left hand the start picking position command (closed fist and thumb upwards, like the OK gesture), which automatically brings the anthropomorphic robot 2 back to the out of the way position above the picking bin.
Furthermore, the aforementioned steps 130 and 140 of detecting the movements of the operator O and of moving the anthropomorphic robot 2 comprise, respectively, at least the following steps:
- by means of said three-dimensional vision sensor 14, detecting at least one movement of the forearm of the operator O (step 134);
- in case of detected movement of the forearm, moving the anthropomorphic robot 2 between a picking position and a storage position (step 144).
In particular, according to a possible embodiment, in order to rotate the arm of the anthropomorphic robot 2 in the storage position, e.g. on three different bins in which to sort the objects 5, the operator O moves the forearm completely clockwise (the robot makes a clockwise reorientation movement on the axis 1, at its base) or counterclockwise (the robot makes a counterclockwise reorientation movement on the axis 1, at its base). The movement of the anthropomorphic robot 2 in the selected direction continues until the operator O carries out the opposite movement (if he/she has rotated his/her forearm clockwise, he/she must rotate it again to the zero position, straight ahead, therefore with a counterclockwise movement): the change of direction is interpreted as a stop. The speed used for the movement of the axis 1 of the anthropomorphic robot 2 is the previously selected reorientation speed. The operator O can also use all the other movements of the linear type previously described for the correct positioning over the selected storage bin.
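A minimal sketch of the forearm-driven rotation of axis 1 follows; the deadband is an assumed value:

```python
# Illustrative sketch: a clockwise forearm rotation jogs axis 1 clockwise at
# the selected reorientation speed until the operator rotates the forearm
# back toward the zero (straight ahead) position. Threshold is assumed.
FOREARM_DEADBAND_RAD = 0.3  # assumed: forearm must rotate well past centre

def axis1_jog(forearm_yaw_rad: float):
    """Return 'cw', 'ccw' or None from the forearm yaw relative to straight ahead."""
    if forearm_yaw_rad > FOREARM_DEADBAND_RAD:
        return "cw"
    if forearm_yaw_rad < -FOREARM_DEADBAND_RAD:
        return "ccw"
    return None  # near zero: interpreted as a stop of the base rotation
```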
Subsequent to the picking up of the object 5 to be handled, the view of the first camera 8 is obscured by the object 5 being gripped. To view the movements of the anthropomorphic robot 2 during positioning towards the storage bins, the operator O will therefore use as a reference the additional camera 9, located on the robot wrist (but externally to the grippers) or high above the cell. When the operator O judges that he/she is correctly positioned in the storage position, above the desired bin, he/she may stop the movement, bringing the anthropomorphic robot 2 to the stop mode.
Once the movement of the anthropomorphic robot 2 has stopped, the operator O can open the fingers of his/her right hand, by stretching them completely: this operation is interpreted as a grippers opening command, thus depositing the object 5 in the storage bin. The opening of the grippers 7 can only occur with the anthropomorphic robot 2 in the stop mode: closing and opening gestures of the fingers during the movement phase will not be taken over by the management software.
By means of a run command and the various movements described above, the robot can be placed back in the out of the way position above the rough parts picking bin to start a new cycle. Alternatively, if conditions and the current position of the robot allow, the start picking position command can be given with the left hand, which automatically returns the robot to the same position.
It has in practice been ascertained that the described invention achieves the intended objects.

Claims

1) Computer-implemented method (100) for the real-time control of an anthropomorphic robot (2), characterized by the fact that it comprises at least the following steps:
- by means of at least one camera (8) arranged at the point where the grippers (7) of an anthropomorphic robot (2) are located, detecting images of at least one object (5) to be handled and of the surrounding area;
- by means of graphical user interface means (13) accessible by an operator (O), displaying at least said images detected by said camera (8);
- by means of at least one three-dimensional vision sensor (14), detecting the movements of at least the hands of said operator (O);
- moving said anthropomorphic robot (2) depending on said movements of the operator (O) detected for the real-time handling of said at least one object (5).
2) Computer-implemented method (100) according to claim 1, characterized by the fact that said step of detecting the movements of at least the hands of said operator (O) comprises at least the following steps:
- acquiring by means of said three-dimensional vision sensor (14) at least one image of the hands and/or of the other parts of the body of said operator (O);
- starting from said at least one acquired image, processing at least one point cloud of the hands and/or of other parts of the body of said operator (O);
- starting from said processed point cloud, identifying at least one command for the real-time movement of said anthropomorphic robot (2).
3) Computer-implemented method (100) according to claim 2, characterized by the fact that said step of identifying at least one command for the real-time movement of said anthropomorphic robot (2) comprises at least the following steps:
- starting from said point cloud, determining the coordinates (X, Y, Z) of axes of movement relating to said hands and/or to other parts of the body of the operator (O);
- calculating a rotary-translation matrix (RX, RY, RZ) of said hands and/or of other parts of the body of the operator (O) in the space starting from said coordinates (X, Y, Z);
- determining said command for the real-time movement of said anthropomorphic robot (2) starting from said coordinates (X, Y, Z) of the axes of movement and from the succession of the angles of said rotary-translation matrix (RX, RY, RZ).
4) Computer-implemented method (100) according to claim 3, characterized by the fact that said coordinates (X, Y, Z) of the axes of movement comprise coordinates relating to the knuckles of the index, middle, ring and little finger of the same hand.
5) Computer-implemented method (100) according to one or more of the preceding claims, characterized by the fact that it comprises, as a result of a predefined command, detected by means of said three-dimensional vision sensor (14), at least one positioning step of said anthropomorphic robot (2) in a start picking position, wherein said grippers of the anthropomorphic robot (2) are open and arranged above said at least one object (5) to be handled.
6) Computer-implemented method (100) according to one or more of the preceding claims, characterized by the fact that it comprises at least the following steps:
- detecting by means of said three-dimensional vision sensor (14) at least one movement of a hand of said operator (O) along at least one direction in a horizontal and/or vertical plane (step 131);
- in case of movement in the detected horizontal and/or vertical plane, moving said grippers (7) along said at least one direction in the horizontal and/or vertical plane (step 141).
7) Computer-implemented method (100) according to one or more of the preceding claims, characterized by the fact that it comprises at least the following steps:
- by means of at least one distance sensor (10) associated with said grippers (7) of the anthropomorphic robot (2), detecting the distance of said grippers (7) from said object (5) to be handled (step 150);
- by means of said graphical user interface means (13), visually signaling to said operator (O) the distance of said grippers (7) from said object (5) to be handled (step 160).
8) Computer-implemented method (100) according to one or more of the preceding claims, characterized by the fact that it comprises at least the following steps:
- detecting by means of said three-dimensional vision sensor (14) at least one rotation of a hand of said operator (O) around at least one axis of rotation (step 132);
- in case of rotation around at least one detected axis, rotating said grippers (7) around said at least one detected axis (step 142).
9) Computer-implemented method (100) according to one or more of the preceding claims, characterized by the fact that it comprises at least the following steps:
- by means of said three-dimensional vision sensor (14) detecting at least one closing command of the grippers or one opening command of the grippers (step 133);
- in case of detected grippers closing command or grippers opening command, closing/opening said grippers (7) to grasp/release said object (5) to be handled (step 143).
10) Computer-implemented method (100) according to one or more of the preceding claims, characterized by the fact that it comprises at least the following steps:
- by means of said three-dimensional vision sensor (14) detecting at least one movement of the forearm of said operator (O) (step 134);
- in case of detected movement of the forearm, moving said anthropomorphic robot (2) between a picking position and a storage position (step 144).
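Claim 10 uses a coarser body cue (the forearm) for a discrete transition. One assumed realization is a latched toggle on a large lateral sweep, so small forearm tremor cannot bounce the robot between the two positions:

```python
SWEEP_THRESHOLD_M = 0.3  # assumed: lateral forearm travel needed to trigger the move

def next_position(forearm_dx: float, current: str) -> str:
    """Toggle between 'picking' and 'storage' on a large forearm sweep (steps 134/144)."""
    if current == "picking" and forearm_dx > SWEEP_THRESHOLD_M:
        return "storage"
    if current == "storage" and forearm_dx < -SWEEP_THRESHOLD_M:
        return "picking"
    return current  # below threshold: stay put
```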
11) System (1) for the real-time control of an anthropomorphic robot (2), characterized by the fact that it comprises:
- a robotic cell (3) comprising said anthropomorphic robot (2) and at least one area (4, 6) intended to house at least one object (5) to be handled by means of said anthropomorphic robot (2), wherein said anthropomorphic robot (2) comprises grippers (7) adapted to grasp and handle said at least one object (5) and wherein said anthropomorphic robot (2) comprises at least one camera (8) arranged at the point where said grippers (7) are located;
- a real-time control station (11) of said anthropomorphic robot (2), comprising at least one processing unit (12) operationally connected to said anthropomorphic robot (2), graphical user interface means (13) operationally connected to said processing unit (12) and configured to display at least said images detected by said camera (8), and at least one three-dimensional vision sensor (14) operationally connected to said processing unit (12) and configured to detect the movements of at least the hands of said operator (O), wherein said processing unit (12) comprises processing means (15) for the execution of the computer- implemented method (100) according to one or more of the preceding claims.
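Taken together, the control station of claim 11 amounts to a sense-display-command cycle. The sketch below strings the earlier fragments into one loop; every interface (`sensor.read()`, `robot.jog()`, and so on) is a placeholder assumption, not an API from the disclosure.

```python
def control_loop(sensor, robot, gui, camera, distance_sensor):
    """One illustrative cycle of the real-time control station (all interfaces assumed)."""
    prev = sensor.read()                           # 3D vision sensor -> hand/forearm landmarks
    while gui.running():
        curr = sensor.read()
        gui.show(camera.frame())                   # live view from the camera at the grippers
        gui.show_distance(distance_sensor.read())  # e.g. the traffic-light indicator above
        move = jog_command(prev.hand, curr.hand)   # planar translation (claim 6)
        if move is not None:
            robot.jog(move)
        grip = gripper_command(curr.thumb_tip, curr.index_tip)  # open/close (claim 9)
        if grip is not None:
            robot.set_gripper(grip)
        prev = curr
```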
12) System (1) according to claim 11, characterized by the fact that said camera (8) is arranged in a substantially central position between said grippers (7).
13) System (1) according to one or more of claims 11 and 12, characterized by the fact that it comprises at least one additional camera (9) associated with said anthropomorphic robot (2) in a different position from said camera (8) and/or arranged in a fixed position inside said robotic cell (3).
14) System (1) according to one or more of claims 11 to 13, characterized by the fact that it comprises at least one distance sensor (10) associated with said grippers (7) of the anthropomorphic robot (2).
15) System (1) according to one or more of claims 11 to 14, characterized by the fact that said processing means (15) for the execution of the computer-implemented method (100) comprise a software program for the recognition of human gestures.
16) System (1) according to one or more of claims 11 to 15, characterized by the fact that said processing means (15) for the execution of the computer-implemented method (100) comprise a software program for the management of the robotic cell (3) configured to receive, at input, commands coming from the software program for the recognition of gestures, images coming from said camera (8) and said additional camera (9), and distance data coming from said distance sensor (10).
PCT/IB2021/059822 2020-10-28 2021-10-25 Computer-implemented method for the real-time control of an antropomorphic robot and related system for the real-time control WO2022090895A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21807253.6A EP4237210A1 (en) 2020-10-28 2021-10-25 Computer-implemented method for the real-time control of an antropomorphic robot and related system for the real-time control

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IT102020000025567 2020-10-28
IT102020000025567A IT202000025567A1 (en) 2020-10-28 2020-10-28 METHOD IMPLEMENTED BY COMPUTER FOR THE REAL-TIME CONTROL OF AN ANTHROPOMORPHIC ROBOT AND RELATED REAL-TIME CONTROL SYSTEM

Publications (1)

Publication Number Publication Date
WO2022090895A1 true WO2022090895A1 (en) 2022-05-05

Family

ID=74184764

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2021/059822 WO2022090895A1 (en) 2020-10-28 2021-10-25 Computer-implemented method for the real-time control of an antropomorphic robot and related system for the real-time control

Country Status (3)

Country Link
EP (1) EP4237210A1 (en)
IT (1) IT202000025567A1 (en)
WO (1) WO2022090895A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011110620A (en) * 2009-11-24 2011-06-09 Toyota Industries Corp Method of controlling action of robot, and robot system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
KOFMAN JONATHAN ET AL: "Robot-Manipulator Teleoperation by Markerless Vision-Based Hand-Arm Tracking", INTERNATIONAL JOURNAL OF OPTOMECHATRONICS, vol. 1, no. 3, 7 September 2007 (2007-09-07), US, pages 331 - 357, XP055813514, ISSN: 1559-9612, Retrieved from the Internet <URL:https://www.tandfonline.com/doi/pdf/10.1080/15599610701580467?needAccess=true> [retrieved on 20210614], DOI: 10.1080/15599610701580467 *
LEONIDAS DELIGIANNIDIS ET AL: "Designing a Lightweight Gesture Recognizer Based on the Kinect Version 2", PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, COMPUTER VISION, AND PATTERN RECOGNITION (IPCV), 1 January 2015 (2015-01-01), Athens, pages 10 - 16, XP055813622, Retrieved from the Internet <URL:http://worldcomp-proceedings.com/proc/p2015/IPC2434.pdf> [retrieved on 20210614] *
PARK TAE MUN ET AL: "Force feedback based gripper control on a robotic arm", 2016 IEEE 20TH JUBILEE INTERNATIONAL CONFERENCE ON INTELLIGENT ENGINEERING SYSTEMS (INES), IEEE, 30 June 2016 (2016-06-30), pages 107 - 112, XP032953295, DOI: 10.1109/INES.2016.7555102 *
YANG YUANRUI ET AL: "Real-time human-robot interaction in complex environment using kinect v2 image recognition", 2015 IEEE 7TH INTERNATIONAL CONFERENCE ON CYBERNETICS AND INTELLIGENT SYSTEMS (CIS) AND IEEE CONFERENCE ON ROBOTICS, AUTOMATION AND MECHATRONICS (RAM), IEEE, 15 July 2015 (2015-07-15), pages 112 - 117, XP033206516, ISBN: 978-1-4673-7337-1, [retrieved on 20150923], DOI: 10.1109/ICCIS.2015.7274606 *

Also Published As

Publication number Publication date
IT202000025567A1 (en) 2022-04-28
EP4237210A1 (en) 2023-09-06

Similar Documents

Publication Publication Date Title
US11724388B2 (en) Robot controller and display device using augmented reality and mixed reality
US8965580B2 (en) Training and operating industrial robots
EP3222393B1 (en) Automated guidance system and method for a coordinated movement machine
Kofman et al. Teleoperation of a robot manipulator using a vision-based human-robot interface
US8155787B2 (en) Intelligent interface device for grasping of an object by a manipulating robot and method of implementing this device
EP3342562A1 (en) Remote control robot system
US20190202058A1 (en) Method of programming an industrial robot
US9878446B2 (en) Determination of object-related gripping regions using a robot
Shirwalkar et al. Telemanipulation of an industrial robotic arm using gesture recognition with Kinect
US11833697B2 (en) Method of programming an industrial robot
CN113538459B (en) Multimode grabbing obstacle avoidance detection optimization method based on drop point area detection
US11478932B2 (en) Handling assembly comprising a handling device for carrying out at least one work step, method, and computer program
Watanabe et al. Cooking behavior with handling general cooking tools based on a system integration for a life-sized humanoid robot
WO2022090895A1 (en) Computer-implemented method for the real-time control of an antropomorphic robot and related system for the real-time control
Wu et al. Kinect-based robotic manipulation: From human hand to end-effector
TWI649169B (en) Holding position and posture teaching device, holding position and posture teaching method, and robot system
Cserteg et al. Assisted assembly process by gesture controlled robots
Bolano et al. Towards a vision-based concept for gesture control of a robot providing visual feedback
Kofman et al. Teleoperation of a robot manipulator from 3D human hand-arm motion
CN111002295A (en) Teaching glove and teaching system of two-finger grabbing robot
JPH04182710A (en) Relative positioning system
US20230120598A1 (en) Robot program generation method from human demonstration
Sun et al. The System Design of Avionics Coordinated Test Based on Dual-arm Cooperative Robot
JP7493816B2 (en) ROBOT, SYSTEM, METHOD, AND PROGRAM
WO2024023934A1 (en) Workpiece removal device, workpiece removal method, and control device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21807253

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021807253

Country of ref document: EP

Effective date: 20230530