WO2012076038A1 - Method for calibrating a robot unit, a computer unit, a robot unit and use of a robot unit


Info

Publication number
WO2012076038A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
target point
unit
camera unit
pose
Prior art date
Application number
PCT/EP2010/068997
Other languages
English (en)
Inventor
Soenke Kock
Mikael Hedelind
Original Assignee
Abb Research Ltd.
Priority date
Filing date
Publication date
Application filed by Abb Research Ltd. filed Critical Abb Research Ltd.
Priority to PCT/EP2010/068997 priority Critical patent/WO2012076038A1/fr
Publication of WO2012076038A1 publication Critical patent/WO2012076038A1/fr

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1694 Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
    • B25J9/1697 Vision controlled systems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • B25J9/1692 Calibration of manipulator
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/39 Robotics, robotics to robotics hand
    • G05B2219/39016 Simultaneous calibration of manipulator and camera

Definitions

  • METHOD FOR CALIBRATING A ROBOT UNIT, A COMPUTER UNIT, A ROBOT UNIT AND USE OF A ROBOT UNIT
  • The present invention relates to a method for calibrating a first coordinate system of a robot unit with a second coordinate system of an object identification unit.
  • The robot unit comprises a robot arm with a calibration tool, and the object identification unit comprises a camera unit.
  • The present invention furthermore relates to a computer unit adapted to execute the method, a robot unit comprising an object identification unit, which robot unit is adapted to execute the method, and use of the robot unit.
  • A robot unit may use an object identification unit for identifying certain objects in a work station of the robot unit in order to perform work on the objects, such as picking and sorting objects, welding, assembling, etc.
  • The robot unit is arranged with the first coordinate system and the object identification unit is arranged with the second coordinate system.
  • The object identification unit comprises the camera unit and a computer unit adapted to identify, based on the information from the camera, an object in relation to the second coordinate system.
  • The first and the second coordinate system need to be essentially the same in order for the robot unit to perform work on the object. During the calibration procedure, the first and the second coordinate system are adjusted to be the same.
  • The robot unit is calibrated by an operator manually operating the robot arm, which comprises the calibration tool, so that the calibration tool is moved to a plurality of target points in which the robot unit is calibrated.
  • The calibration tool is held in a static position for a period of time during which the object identification unit determines the coordinates of the calibration tool in the second coordinate system.
  • The second coordinate system is adjusted to the coordinate of the target point according to the first coordinate system of the robot unit.
  • A problem with the above-described calibration procedure is that the result of the calibration depends on the operator, because different operators have different ways of setting the calibration tool in the target points. Furthermore, the manual calibration procedure is time consuming. The result of the calibration is improved by increasing the number of target points; this, however, further increases the calibration time. The quality of the calibration is also influenced by how the target points are selected. In order to obtain a high quality, the target points shall be distributed over the area of the work station. A problem with the prior art calibration procedure is that the target points are not utilized optimally.
  • Another problem is that the operator must visually verify that the calibration tool is in a target point that can be used for calibration. For example, some target points cannot be used for calibration because the calibration tool is not visible to the camera. Furthermore, a target point may involve a collision with objects at the work station of the robot unit. There is a trend to design robot units with a less rigid structure, and in some cases the robot unit is not firmly attached at the work station. Such robot units need to be calibrated frequently, and thus it is desired to reduce the duration of the calibration procedure.
  • EP1468792 discloses a method for calibrating a coordinate system of a camera with a coordinate system of a robot unit, wherein the method may be executed automatically.
  • A problem with the method is that target points which are not suitable for the calibration may be used. Accordingly, an operator is required to visually confirm that the target points are valid and collision free. The quality of the calibration would, however, not be optimal, because not all possible target points are used for the calibration.
  • The object of the present invention is to provide an improved method for calibrating a first coordinate system of a robot unit with a second coordinate system of an object identification unit.
  • A first object of the method is to reduce the duration of the calibration procedure.
  • A second object of the method is to utilize the target points in order to obtain an optimal quality of the calibration.
  • A third object of the method is to enable calibration without visual confirmation by an operator.
  • The method comprises
  • The method further comprises, for each target point, evaluating the target point by executing the following steps until all target points are either maintained or rejected: a) determining whether the target point is visible to the camera unit,
  • The plurality of target points is generated so that the target points are separated from each other.
  • The method steps a-g regard an evaluation of the validity of the generated target point.
  • The target point cannot be used for calibration, and the target point is moved towards the camera unit by the certain increment.
  • The lack of visibility may for example occur because an object blocks the visibility.
  • The object may for example be a fixture, a shelf, a machine, etc.
  • Non-visible target points are moved towards the camera unit until the target points become visible or until the target points reach the limit of the certain range of distance from the camera unit. Outside the certain range of distance from the camera unit, the quality of the target points is insufficient for the calibration, wherein such target points are rejected.
  • Each target point is evaluated and moved until the target point is either maintained or rejected.
  • The method step of generating and evaluating the target point is performed without the necessity of physical movement of the robot arm. Accordingly, the term "move" refers to the position of a target point being changed in a direction by the certain increment without any physical movement being involved.
  • The term "increment" refers to a distance that a target point is to be moved.
  • The evaluation time depends on the size of the increment. Small increments result in high quality of the target points at the cost of a long evaluation time, involving a large number of repetitions for evaluating and moving the target points, and vice versa.
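  The purely numerical "move" of a target point described above can be sketched as follows (a minimal illustration; the function name and the use of plain 3-tuples are assumptions for the sketch, not part of the patent):

```python
import math

def move_towards(point, reference, increment):
    """Shift a 3-D target point a fixed increment towards a reference
    position (e.g. the camera unit), without any physical robot motion."""
    direction = tuple(r - p for p, r in zip(point, reference))
    distance = math.sqrt(sum(d * d for d in direction))
    if distance <= increment:      # would overshoot: snap to the reference
        return tuple(reference)
    scale = increment / distance
    return tuple(p + d * scale for p, d in zip(point, direction))
```

  Repeatedly applying this step with the camera position as reference implements the "+1 camera" correction; with the robot base as reference, the "+1 robot" correction.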
  • The size of the increment shall involve a balance between quality of the target points and evaluation time.
  • The term "target point" refers to a position and an orientation of the calibration tool to which the calibration tool is adapted to be moved by the robot arm for calibrating the first coordinate system with the second coordinate system.
  • The target point comprises six degrees of freedom: three degrees of freedom in regard to the position of the calibration tool and three degrees of freedom in regard to the orientation of the calibration tool.
  • The term "maintained target point" refers to a target point that has been evaluated and is stored in a memory unit for generating the robot program or for further evaluation.
  • The term "rejected target point" refers to a target point that cannot be used for the calibration. A target point that is not visible to the camera unit cannot be used for the calibration. Likewise, a target point that is outside the certain range of distance from the camera unit will not provide sufficient quality for the calibration procedure, and thus the target point cannot be used for calibration.
  • Each target point is evaluated and moved until it is either maintained or rejected. Thereafter, the robot program is generated based on the maintained target points, and the robot program is executed, wherein the robot arm physically moves the calibration tool to the target points while calibrating the first coordinate system with the second coordinate system.
  • The evaluation and correction of the target points are performed prior to executing the calibration.
  • The calibration is performed automatically, without the operator having to visually confirm that all the target points can be used.
  • The actual physical calibration procedure, where the calibration tool is moved to the maintained target points by means of the robot program, is performed in a time-efficient manner.
  • The calibration tool comprises a calibration feature that is adapted to be recognized by the object identification unit.
  • The calibration feature comprises a specific form in order to enable recognition of the calibration tool for calibration.
  • The calibration feature comprises, for example, an L-shaped element, a rectangular element, etc.
  • The steps a-g are simulated based on a virtual model of the robot arm, characteristics and position of the camera unit, and characteristics of a work station of the robot unit. Accordingly, the simulation is performed without moving the robot arm prior to the execution of the generated robot program. Thereby, the evaluation of the target points is performed prior to physical movement of the calibration tool to the maintained target points. Accordingly, the duration of the calibration procedure that requires physical movement of the robot unit is reduced.
  • The method further comprises:
  • The quality of the target point is optimal in the focal plane. As the target point is moved towards the camera unit, the quality of the target point gradually decreases. If the target point is outside the certain range of distance from the camera unit, the quality of the target point is not sufficient for the calibration.
  • A target point in the focal plane of the camera unit provides the highest quality picture for determining the position of the target point in the second coordinate system.
  • The target points are generated in a position providing the best quality.
  • Target points that are moved towards the camera unit due to lack of visibility will thus have lower quality compared to the generated position of the target points.
  • The target points are generated so that they are distributed evenly from each other in a plane within the range of distance from the focal plane of the camera unit. Thereby, the generated target points will provide an optimal calibration quality if all the target points are maintained.
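  Generating an even grid of target points in a plane at the focus distance might look like the following sketch (the function, its parameters, and the camera-looks-along-+z convention are illustrative assumptions):

```python
def generate_target_points(focus_distance, width, height, spacing):
    """Generate target point positions on an even grid in a plane parallel
    to the camera image plane, at the focus distance along the optical
    axis (the camera is assumed to look along +z from the origin)."""
    nx = int(round(width / spacing)) + 1   # grid columns
    ny = int(round(height / spacing)) + 1  # grid rows
    return [(-width / 2 + i * spacing, -height / 2 + j * spacing, focus_distance)
            for j in range(ny) for i in range(nx)]
```

  Each generated position would still be paired with an orientation to form the full six-degree-of-freedom target point.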
  • The target points are generated in the focal plane of the camera unit.
  • The method further comprises, for each of the target points, evaluating the target point by executing the following steps until all target points are either maintained or rejected:
  • If the moved target point is beyond said distance from the robot base, the moved target point is evaluated according to step a; n) repeat the steps h-m until the moved target point is either maintained or rejected, if the moved target point is visible to the camera unit,
  • The method steps h-o regard an evaluation of whether it is possible for the robot unit to move the calibration tool to the maintained target points.
  • The target point cannot be used for calibration, and the target point is moved towards the robot base by a certain increment.
  • The lack of reach may be because the target point is at a distance too far away from the robot base, or because the robot arm must be stretched around an object to the target point and therefore cannot reach the target point.
  • By moving the target point towards the robot base, the target point may be reached by the robot arm.
  • The target points are moved towards the robot base until the target points reach the certain distance from the robot base.
  • The robot arm cannot reach target points within the certain distance from the robot base. Accordingly, such target points are rejected.
  • The method further comprises for each of the target points:
  • The method further comprises, for each robot pose, evaluating the robot pose by executing the following steps until all robot poses are either maintained or rejected:
  • The robot pose is generated for each maintained target point.
  • The robot pose defines a certain orientation of the robot arm.
  • The method steps p-t regard an evaluation of possible collisions of the robot poses.
  • If a robot pose for a target point involves a collision, the robot pose cannot be used for calibration, and one or more alternative robot poses for the same target point are evaluated for collision. If an alternative robot pose can reach the target point without collision, the generated robot pose is changed to the alternative robot pose; otherwise the target point and the robot pose are rejected.
  • The term "maintained robot pose" refers to a robot pose that has been evaluated and is stored in a memory unit for generating the robot program or for further evaluation.
  • The term "rejected robot pose" refers to a robot pose that cannot reach its target point without collision. The robot pose is therefore not stored, and the target point associated with it is rejected.
  • Each generated robot pose is evaluated for collision and is maintained or changed to an alternative robot pose; otherwise the robot pose is rejected. Thereafter, the robot program is generated based on the maintained target points and the maintained robot poses. Thereafter, the robot program is executed while performing the calibration.
  • The steps a-t are simulated based on a virtual model of the robot arm, characteristics and position of the camera unit, and characteristics of a work station of the robot unit.
  • The method comprises for each of the target points:
  • In step p, determining whether the calibration tool is visible to the camera unit using the robot pose,
  • in step q, maintaining the robot pose if the calibration tool is visible to the camera unit using the robot pose,
  • in step r, if the calibration tool is not visible to the camera unit using the robot pose, determining whether at least one alternative robot pose for the target point makes the calibration tool visible to the camera unit; in step s, changing the robot pose to the alternative robot pose if the alternative robot pose is visible to the camera unit,
  • in step t, rejecting the alternative robot pose and the target point if the alternative robot pose is not visible to the camera unit.
  • The method steps p-t furthermore regard an evaluation of the visibility of the calibration tool in the generated robot pose.
  • The robot pose may for example be oriented so that the robot arm itself obstructs the camera unit's view of the target point.
  • If the robot pose cannot be used for calibration, one or more alternative robot poses for the same target point are evaluated for visibility. If an alternative robot pose is visible in the target point, the generated robot pose is changed to the alternative robot pose; otherwise the target point and the robot pose are rejected.
  • The method further comprises, in step j, determining whether at least one alternative robot pose is within reach of the target point,
  • in step k, changing the robot pose to an alternative robot pose if the alternative robot pose enables reach of the target point, and otherwise rejecting the alternative robot pose.
  • The reach of a target point may depend on the robot pose.
  • The target point may then be within reach without moving the target point.
  • The robot pose comprises a first orientation of the robot arm, and the at least one alternative robot pose comprises a second orientation of the robot arm.
  • The robot pose comprises a first orientation of the calibration tool, and the at least one alternative robot pose comprises a second orientation of the calibration tool.
  • The steps a-o are simulated without moving the robot arm prior to the execution of the generated robot program.
  • The determination of whether the target point is visible to the camera unit is based on information on the geometry of a work station of the robot unit and the position of the camera unit.
  • The determination of whether the target point is within reach of the robot arm is based on the position of the robot base and the geometry of the robot unit.
  • The determination of whether the robot poses are collision free is based on information on the geometry of the robot unit and a work station of the robot unit.
  • The determination of whether the calibration tool is visible to the camera unit is based on information on the geometry of the robot unit, the calibration tool and the camera unit.
  • Said range of distance from the camera unit in step e depends on a focus distance of the camera unit.
  • The target points must be within the range of distance to provide sufficient quality for the calibration.
  • Said range of distance from the camera unit in step e depends on the depth of field of the camera unit.
  • The term "depth of field" refers to the distance around the focal plane within which the target points are sufficiently sharp for the calibration.
  • The determination of whether the target point is visible is based on information on the camera unit's field of view.
  • The term "field of view" refers to the extent to which target points are visible to the camera unit. The term is related to the angle of view of the camera unit.
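  A basic field-of-view test of the kind described, modelling the camera's view cone by its optical axis and half the angle of view, might be sketched like this (the function name and the cone model are illustrative assumptions; a full simulation would also test for occluding objects):

```python
import math

def in_field_of_view(point, camera_pos, camera_axis, half_angle_deg):
    """True if a point lies inside the camera's view cone, given the
    optical axis direction and half the angle of view."""
    v = [p - c for p, c in zip(point, camera_pos)]
    dist = math.sqrt(sum(x * x for x in v))
    if dist == 0.0:
        return False                      # point coincides with the camera
    axis_len = math.sqrt(sum(a * a for a in camera_axis))
    # angle between the point direction and the optical axis
    cos_angle = sum(x * a for x, a in zip(v, camera_axis)) / (dist * axis_len)
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```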
  • Said distance from the robot base in step k depends on the geometry of the robot arm.
  • The method further comprises:
  • Steps a-t are simulated based on a model of the characteristics of the robot unit and the object identification unit. Accordingly, the evaluation is performed prior to the actual calibration that is performed according to the robot program. Accordingly, the duration of the actual calibration, where the calibration tool is moved to the plurality of target points, is reduced.
  • Fig. 1 shows an example of a robot unit with a first coordinate system and an object identification unit with a second coordinate system, which first and second coordinate systems are adapted to be calibrated by an embodiment of the invention.
  • Fig. 2 shows examples of parameters of the camera unit used in the method.
  • Fig. 3 shows an example of a first evaluation of the target points according to an embodiment of the invention.
  • Fig. 4 shows an example of a second evaluation of the target points.
  • Fig. 5 shows an example of a first evaluation of the robot poses.
  • Fig. 6 shows an example of a second evaluation of the robot poses.
  • Fig. 7 shows a general flow chart of the method.
  • Figure 1 shows an example of a robot unit 1 with a first coordinate system and an object identification unit 2 with a second coordinate system.
  • The robot unit 1 comprises a robot arm 3 with a calibration tool 4.
  • The robot unit 1 and the object identification unit 2 are located at a work station 7.
  • The robot unit 1 is adapted to perform work at the work station 7.
  • The robot unit 1 comprises a robot controller 9 adapted to control the movements of the robot arm 3 by means of controlling a plurality of electric motors on the robot arm 3.
  • The robot controller 9 comprises a central processing unit (CPU) 10, a memory unit 11 and a drive unit 12.
  • The CPU 10 is adapted to execute a robot program located on the memory unit, wherein the robot arm is moved to a plurality of positions using a plurality of robot poses.
  • The drive unit 12 is adapted to control the electric motors of the robot arm 3 in dependency of the executed robot program.
  • The object identification unit 2 comprises a camera unit 20 and an information processing unit 22. The camera unit is adapted to be directed towards the work station 7 of the robot unit 1.
  • The information processing unit 22 comprises a central processing unit (CPU) 24 and a memory unit 26.
  • The information processing unit 22 is adapted to receive information from the camera unit 20 in the form of a depiction of one or more objects at the work station 7.
  • The information processing unit 22 is adapted to process the information so that the object is recognized and the position of the object in the second coordinate system is determined by means of certain object recognition algorithms.
  • The object identification unit 2 is adapted to recognize a calibration feature of the calibration tool 4 on the robot arm 3.
  • The robot unit 1 is adapted to move the robot arm 3 to the position of the object and perform work on the object, such as picking, welding, painting, assembly, etcetera. Accordingly, the robot unit 1 and the object identification unit 2 co-operate in the work at the work station 7.
  • The first coordinate system of the robot unit 1 and the second coordinate system of the object identification unit 2 must be essentially the same. Therefore, the first and the second coordinate system must be calibrated with each other by means of a calibration method prior to performing work at the work station 7. It shall be understood that the calibration comprises correcting one of the first and the second coordinate system with the other of the first and the second coordinate system. The calibration is, in some working conditions, repeated frequently in order to assure the accuracy of the work performed by the robot unit 1.
  • The robot unit 1 further comprises a computer unit 30 comprising a central processing unit (CPU) 32 and a memory unit 34.
  • The computer unit 30 is adapted to generate a plurality of target points to which the calibration tool 4 is to be moved by the robot arm 3, and a plurality of robot poses in which the robot arm 3 is adapted to be oriented.
  • The computer unit 30 is further adapted to evaluate the target points and robot poses prior to executing the calibration. After the evaluation, a set of maintained target points and maintained robot poses is obtained.
  • After receiving the information, the robot controller 9 is adapted to generate a robot program based on the maintained target points and maintained robot poses, and the robot program is adapted to be executed while calibrating the first coordinate system with the second coordinate system.
  • Figure 2 shows an example of parameters used in an embodiment of the method for calibrating the first coordinate system of the robot unit 1 with the second coordinate system of the object identification unit 2.
  • The parameters are determined by the characteristics of the camera unit 20.
  • The camera unit 20 comprises a view cone, in fig. 2 illustrated as a triangle. Only target points within the view cone are visible and thus useful for the calibration.
  • The camera unit 20 comprises a focus distance.
  • The focus distance is the distance from the camera unit 20 to a focal plane of the camera unit 20.
  • The focal plane is the plane where objects are depicted with the highest quality.
  • The object identification unit 2 is arranged to determine the position of objects in the focal plane with the highest accuracy.
  • The camera unit 20 is also characterized by the term "depth of field".
  • The depth of field is a range of distance around the focal plane where the quality of the depicted object is sufficient for determining the position of the objects.
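  The depth-of-field criterion reduces to a range check on the point's distance from the camera. A minimal sketch (assuming, for simplicity, a band that is symmetric around the focal plane, although optically the depth of field extends unevenly):

```python
def within_depth_of_field(distance, focus_distance, depth_of_field):
    """True if a target point's distance from the camera lies in the band
    around the focal plane where the depiction is sharp enough for
    calibration. A symmetric band around the focal plane is assumed."""
    near = focus_distance - depth_of_field / 2
    far = focus_distance + depth_of_field / 2
    return near <= distance <= far
```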
  • Figure 7 shows a general flow chart of an embodiment of the method for calibrating the first coordinate system of a robot unit 1 with the second coordinate system of the object identification unit 2.
  • The first step of the method comprises generating a plurality of target points to which the calibration tool 4 is to be moved by the robot arm 3 for calibration.
  • The target points are preferably generated so that they are separated from each other by an equal distance.
  • The target points are preferably generated in the focal plane of the camera unit 20.
  • Each target point is a point in space to which the robot arm 3 is adapted to move the calibration tool 4 for calibration of the first coordinate system with the second coordinate system.
  • The target point further comprises an orientation. Accordingly, each target point comprises six degrees of freedom: three degrees regarding the position of the target point and three degrees regarding the orientation of the target point.
  • The number of generated target points depends on the desired quality of the calibration. Accordingly, a large number of target points is necessary in order to obtain a high-quality calibration.
  • The target points shall preferably be located in the focal plane of the camera unit 20.
  • In order to enable calibration of the first coordinate system with the second coordinate system, the target points are required to be located within a certain range of distance from the camera unit 20.
  • The certain range of distance from the camera unit 20 provides an acceptable quality of the target points in order to perform the calibration.
  • The certain range of distance from the camera unit 20 is the distance from the camera unit 20 to and within the depth of field of the camera unit 20, see fig. 2.
  • Target points that are located outside the certain range of distance from the camera unit 20 will not provide sufficient quality in order to calibrate the first coordinate system with the second coordinate system.
  • The evaluation of a target point comprises a first evaluation and a second evaluation.
  • The first evaluation comprises determination and correction in order to assure that the target points are visible to the camera unit 20 and within the certain range of distance from the camera unit 20 in order to enable the calibration.
  • The first evaluation is performed for each target point.
  • The first evaluation is initiated by determining whether the target point is visible to the camera unit 20. If a target point is visible to the camera unit 20, the target point is maintained. If a target point is not visible to the camera unit 20, the target point is corrected by moving the target point by a certain increment in a direction towards the camera unit 20, illustrated as "+1 camera" in fig. 3.
  • The moved target point is rejected if the moved target point is not within the certain range of distance from the camera unit 20. If the moved target point is within the certain range of distance from the camera unit 20, the process is repeated until the moved target point is either maintained or rejected.
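  The first-evaluation loop just described can be sketched as follows (a minimal illustration; the function name and the caller-supplied `is_visible` and `in_range` predicates stand in for the simulation's own visibility and depth-of-field checks, which are not specified in code form in the patent):

```python
import math

def first_evaluation(point, camera_pos, is_visible, in_range, increment):
    """First evaluation of one target point: maintain it if visible,
    otherwise move it towards the camera in fixed increments ("+1 camera")
    until it becomes visible (maintained) or leaves the usable range of
    distance from the camera (rejected)."""
    p = tuple(float(x) for x in point)
    while True:
        if is_visible(p):
            return "maintained", p
        # move the point one increment towards the camera unit
        d = tuple(c - x for x, c in zip(p, camera_pos))
        dist = math.sqrt(sum(v * v for v in d))
        p = tuple(x + v / dist * increment for x, v in zip(p, d))
        if not in_range(p):
            return "rejected", None
```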
  • A maintained target point is a target point that has been evaluated and possibly corrected by moving the target point towards the camera unit 20.
  • The maintained target point is stored in the memory unit 26 of the object identification unit.
  • The maintained target point is adapted to be used for generating a robot program or for further evaluation.
  • A rejected target point is a target point that cannot be used for the calibration. In regard to the first evaluation, the rejected target point is either not visible to the camera unit 20 or is outside the certain range of distance from the camera unit 20.
  • The second evaluation of the target point is shown in figure 4.
  • The second evaluation is performed on each of the target points and is initiated by determining whether the target point is within reach of the robot arm 3. If a target point is within reach of the robot arm 3, the target point is maintained. If a target point is not within reach of the robot arm 3, the target point is moved by a certain increment in a direction towards a robot base 28 of the robot arm 3, illustrated as "+1 robot" in fig. 4. By moving the target point towards the robot base, the target point is moved closer to the robot base 28.
  • The robot unit 1 is adapted to reach target points that are located at a distance beyond the certain distance from the robot base 28.
  • The certain distance from the robot base 28 regards a minimum distance beyond which the target points shall be located in order for the robot unit 1 to be able to move the calibration tool 4 to the target point.
  • If a target point is not beyond the certain distance from the robot base 28, the target point is rejected. If a target point is beyond the certain distance from the robot base 28, the target point is evaluated according to the first evaluation in figure 3, after which it is either re-evaluated regarding whether the robot arm 3 can reach the target point, or rejected. The second evaluation is performed for each target point until the target point is either maintained or rejected.
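  The second-evaluation loop can be sketched in the same spirit (an illustrative assumption, not the patent's code: `within_reach` stands in for the reachability check, and `first_eval` re-runs the first evaluation on each moved point):

```python
import math

def second_evaluation(point, base_pos, within_reach, min_base_distance,
                      increment, first_eval):
    """Second evaluation of one target point: maintain it if the robot arm
    can reach it, otherwise move it towards the robot base ("+1 robot") in
    fixed increments. A moved point must stay beyond the minimum distance
    from the base and is re-run through the first evaluation."""
    p = tuple(float(x) for x in point)
    while True:
        if within_reach(p):
            return "maintained", p
        d = tuple(b - x for x, b in zip(p, base_pos))
        dist = math.sqrt(sum(v * v for v in d))
        p = tuple(x + v / dist * increment for x, v in zip(p, d))
        if math.sqrt(sum((x - b) ** 2 for x, b in zip(p, base_pos))) < min_base_distance:
            return "rejected", None        # too close to the robot base
        status, p = first_eval(p)          # re-check visibility and range
        if status == "rejected":
            return "rejected", None
```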
  • A set of maintained target points is stored in the memory unit 34 of the computer unit 30.
  • The first evaluation and second evaluation of the target points are performed by means of a simulation software program on the computer unit.
  • The software program uses a model of the characteristics of the work station 7, the robot unit 1 and the camera unit 20.
  • a robot pose associated to each of the mai ntained target points is generated by the computer unit 30.
  • Each robot pose defines a certain orientation of the robot arm 3 that positions the cali bration tool 4 i n the spe- cific target point.
  • a specific target point can often be reached by a plurality of different robot poses.
  • each robot pose is evaluated.
  • the evaluation of the robot poses comprises a first evaluation and a second evaluation.
  • the first evaluation and second evaluation of the robot poses are performed by means of a simulation software program on the computer unit 30.
  • the software program checks whether the robot pose involves a collision and whether the calibration tool is visible when using a certain robot pose.
  • the first evaluation of the robot poses is shown in figure 5.
  • the evaluation comprises, for each robot pose, determining whether the robot pose is collision free. If the robot pose is collision free, the robot pose is maintained. If a robot pose is not collision free, it is determined whether at least one alternative robot pose for the same target point is collision free. If such an alternative robot pose is available for the target point, the robot pose is changed to the alternative robot pose and the alternative robot pose is maintained. If no alternative robot pose for the target point is collision free, the robot pose is rejected and the target point associated with the robot pose is also rejected.
  • the target point that is to be rejected according to the above may be moved within the certain range of distance from the camera unit 20 in order to find an alternative target point that may be assigned a robot pose without collision.
  • the moved target point is also re-evaluated according to the first and second evaluation of the target point according to figs. 3 and 4.
  • the maintained collision-free robot poses are stored in the memory unit 11 of the robot controller 9 for generating the robot program or for further evaluation.
  • rejection of a robot pose means that the robot pose and its associated target point are deleted and thus not used for the calibration.
  • the second evaluation is shown in figure 6.
  • the evaluation comprises, for each robot pose, determining whether the calibration tool 4 is visible to the camera unit 20 using the robot pose. If the calibration tool 4 is visible to the camera unit 20 using the robot pose, the robot pose is maintained. If the calibration tool 4 is not visible to the camera unit 20 using the robot pose, it is determined whether at least one alternative robot pose for the same target point makes the calibration tool 4 visible to the camera unit 20. If such an alternative robot pose is available for the target point, the robot pose is changed to the alternative robot pose and the alternative robot pose is maintained. If no alternative robot pose for the target point enables the calibration tool 4 to be visible to the camera unit 20, the robot pose is rejected and the target point associated with the robot pose is also rejected.
  • the target point that is to be rejected according to the above may be moved within the certain range of distance from the camera unit 20 in order to find an alternative target point that may be assigned a robot pose which enables the calibration tool 4 to be visible to the camera unit 20.
  • the moved target point is also re-evaluated according to the first and second evaluation of the target point according to figs. 3 and 4.
  • a set of maintained robot poses is stored in the memory unit 34 of the computer unit 30.
  • the first evaluation and second evaluation of the robot poses are performed by means of a simulation software program on the computer unit.
  • The software program uses a model of the characteristics of the work space 7, the robot unit 1 and the camera unit 20.
  • the rejected robot poses and their associated target points are deleted and are thus not used for the calibration.
  • a set of maintained target points and a set of maintained robot poses are stored in the memory unit 34 of the computer unit 30 and the information is transferred to the robot controller 12.
  • a robot program is generated by the robot controller 12 based on the set of maintained target points and the set of maintained robot poses.
  • the robot program is executed while calibrating the first coordinate system with the second coordinate system. The method has the benefit that the target points and the robot poses are generated and evaluated before the actual calibration procedure.
  • the duration of the calibration, during which the robot unit 1 by means of the robot arm 3 moves the calibration tool 4 to the plurality of target points while calibrating the first coordinate system with the second coordinate system, is reduced in comparison to prior-art calibration methods.
  • a further benefit is that the operator is not required to visually confirm the target points and the robot poses while the robot arm 3 moves the calibration tool 4 to the plurality of target points by means of the plurality of robot poses.
  • the robot controller 9, the information processing unit 22 and the computer unit 30 may be the same unit. It shall also be understood that a single CPU 10 and a single memory unit 11 of the robot controller 9 may be used for achieving the method. It shall furthermore be understood that the evaluation steps for the target points and robot poses are performed "off-line" without requiring movement of the robot arm 3 of the robot unit 1.
  • the first and the second evaluation in fig. 7 also involve the corrective measure of moving the target point.
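The off-line evaluation flow described above (figs. 3 to 6) can be sketched in simplified form. The following Python sketch is illustrative only and is not part of the disclosed method: the `Robot` class and the functions `second_evaluation`, `pose_evaluation` and `plan_calibration`, as well as the collision and visibility predicates, are hypothetical two-dimensional stand-ins for the simulation software program and its models of the work space 7, the robot unit 1 and the camera unit 20.

```python
import math
from dataclasses import dataclass

@dataclass
class Robot:
    base: tuple           # position of the robot base 28 (assumed 2-D)
    max_reach: float      # outer reach of the robot arm 3
    min_base_dist: float  # certain minimum distance from the robot base

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def second_evaluation(point, robot, step=0.05, max_steps=100):
    """Second target-point evaluation (fig. 4): a point that is out of reach
    is moved in increments towards the robot base ("+1 robot"); it is
    rejected if it lies closer than the minimum base distance."""
    for _ in range(max_steps):
        d = dist(point, robot.base)
        if d < robot.min_base_dist:
            return None                       # rejected
        if d <= robot.max_reach:
            return point                      # maintained
        f = step / d                          # one increment towards the base
        point = (point[0] + f * (robot.base[0] - point[0]),
                 point[1] + f * (robot.base[1] - point[1]))
    return None                               # rejected

def pose_evaluation(point, candidate_poses, collision_free, tool_visible):
    """Pose evaluations (figs. 5 and 6): try alternative robot poses for the
    same target point until one is collision free and leaves the calibration
    tool 4 visible to the camera unit 20; otherwise reject pose and point."""
    for pose in candidate_poses(point):
        if collision_free(pose) and tool_visible(pose):
            return pose                       # maintained
    return None                               # rejected with its target point

def plan_calibration(points, robot, candidate_poses, collision_free, tool_visible):
    """Off-line planning: only maintained (point, pose) pairs survive and
    would later be turned into the robot program."""
    maintained = []
    for p in points:
        p = second_evaluation(p, robot)
        if p is None:
            continue
        pose = pose_evaluation(p, candidate_poses, collision_free, tool_visible)
        if pose is not None:
            maintained.append((p, pose))
    return maintained
```

Because every check runs against the simulation models, the whole loop executes without moving the robot arm, which is the stated benefit of performing the evaluation before the actual calibration procedure.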

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

The present invention relates to a method for calibrating a first coordinate system of a robot unit (1) with a second coordinate system of an object identification unit (2). The method comprises the steps of generating a plurality of target points to which a calibration tool (4) is to be moved by the robot unit for the purpose of calibration, evaluating the target points with respect to visibility for the camera unit and a range of distance from the camera unit, moving the target points towards the camera unit until the target points are either maintained or rejected, generating a robot program based on the maintained target points, and executing the robot program while calibrating the first coordinate system with the second coordinate system.
PCT/EP2010/068997 2010-12-06 2010-12-06 Procédé permettant d'étalonner une unité de robot, ordinateur, unité de robot et utilisation d'une unité de robot WO2012076038A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/068997 WO2012076038A1 (fr) 2010-12-06 2010-12-06 Procédé permettant d'étalonner une unité de robot, ordinateur, unité de robot et utilisation d'une unité de robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/EP2010/068997 WO2012076038A1 (fr) 2010-12-06 2010-12-06 Procédé permettant d'étalonner une unité de robot, ordinateur, unité de robot et utilisation d'une unité de robot

Publications (1)

Publication Number Publication Date
WO2012076038A1 true WO2012076038A1 (fr) 2012-06-14

Family

ID=43909967

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2010/068997 WO2012076038A1 (fr) 2010-12-06 2010-12-06 Procédé permettant d'étalonner une unité de robot, ordinateur, unité de robot et utilisation d'une unité de robot

Country Status (1)

Country Link
WO (1) WO2012076038A1 (fr)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103363899A (zh) * 2013-07-05 2013-10-23 科瑞自动化技术(深圳)有限公司 一种用于标定机器手坐标系的标定装置及标定方法
WO2014065744A1 (fr) 2012-10-23 2014-05-01 Cognibotics Ab Procédé et système pour la détermination d'au moins une caractéristique d'un joint
WO2014161603A1 (fr) * 2013-04-05 2014-10-09 Abb Technology Ltd Système robot et méthode d'étalonnage
JP2015062991A (ja) * 2013-08-28 2015-04-09 キヤノン株式会社 座標系校正方法、ロボットシステム、プログラム及び記録媒体
CN105773613A (zh) * 2016-03-30 2016-07-20 东莞市速美达自动化有限公司 一种水平机器人相机坐标系校定方法
WO2017167687A2 (fr) 2016-03-29 2017-10-05 Cognibotics Ab Procédé, dispositif de contrainte et système de détermination des propriétés géométriques d'un manipulateur
CN108463313A (zh) * 2016-02-02 2018-08-28 Abb瑞士股份有限公司 机器人系统校准
CN108942927A (zh) * 2018-06-29 2018-12-07 齐鲁工业大学 一种基于机器视觉的像素坐标与机械臂坐标统一的方法
CN111531547A (zh) * 2020-05-26 2020-08-14 华中科技大学 一种基于视觉测量的机器人标定及检测方法
US10926414B2 (en) 2017-09-29 2021-02-23 Industrial Technology Research Institute System and method for calibrating tool center point of robot
CN112802122A (zh) * 2021-01-21 2021-05-14 珠海市运泰利自动化设备有限公司 机器人视觉引导组装方法
CN113246128A (zh) * 2021-05-20 2021-08-13 菲烁易维(重庆)科技有限公司 一种基于视觉测量技术的机器人示教方法
CN113635311A (zh) * 2021-10-18 2021-11-12 杭州灵西机器人智能科技有限公司 固定标定板的眼在手外标定方法和系统

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1468792A2 (fr) 2003-04-16 2004-10-20 VMT Bildverarbeitungssysteme GmbH Méthode pour calibrer un robot
US20090062960A1 (en) * 2007-08-30 2009-03-05 Sekar Krishnasamy Method and system for robot calibrations with a camera
US20090118864A1 (en) * 2007-11-01 2009-05-07 Bryce Eldridge Method and system for finding a tool center point for a robot using an external camera
EP2070664A1 (fr) * 2007-12-14 2009-06-17 Montanuniversität Leoben Système de traitement d'objets

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1468792A2 (fr) 2003-04-16 2004-10-20 VMT Bildverarbeitungssysteme GmbH Méthode pour calibrer un robot
US20090062960A1 (en) * 2007-08-30 2009-03-05 Sekar Krishnasamy Method and system for robot calibrations with a camera
US20090118864A1 (en) * 2007-11-01 2009-05-07 Bryce Eldridge Method and system for finding a tool center point for a robot using an external camera
EP2070664A1 (fr) * 2007-12-14 2009-06-17 Montanuniversität Leoben Système de traitement d'objets

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HANQI ZHUANG ET AL: "Camera-Assisted Calibration of SCARA Arms", IEEE ROBOTICS & AUTOMATION MAGAZINE, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 3, no. 4, 1 December 1996 (1996-12-01), pages 46 - 53, XP011089687, ISSN: 1070-9932, DOI: DOI:10.1109/100.556482 *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104736307A (zh) * 2012-10-23 2015-06-24 康格尼博提克斯股份公司 用于确定接头的至少一个特性的方法和系统
WO2014065744A1 (fr) 2012-10-23 2014-05-01 Cognibotics Ab Procédé et système pour la détermination d'au moins une caractéristique d'un joint
US9645565B2 (en) 2012-10-23 2017-05-09 Cognibotics Ab Method and system for determination of at least one property of a joint
CN104736307B (zh) * 2012-10-23 2017-03-08 康格尼博提克斯股份公司 用于确定接头的至少一个特性的方法和系统
US9457470B2 (en) 2013-04-05 2016-10-04 Abb Technology Ltd Robot system and method for calibration
CN105073348B (zh) * 2013-04-05 2016-11-09 Abb技术有限公司 用于校准的机器人系统和方法
CN105073348A (zh) * 2013-04-05 2015-11-18 Abb技术有限公司 用于校准的机器人系统和方法
WO2014161603A1 (fr) * 2013-04-05 2014-10-09 Abb Technology Ltd Système robot et méthode d'étalonnage
CN103363899A (zh) * 2013-07-05 2013-10-23 科瑞自动化技术(深圳)有限公司 一种用于标定机器手坐标系的标定装置及标定方法
CN103363899B (zh) * 2013-07-05 2016-09-14 深圳科瑞技术股份有限公司 一种用于标定机器手坐标系的标定装置及标定方法
JP2015062991A (ja) * 2013-08-28 2015-04-09 キヤノン株式会社 座標系校正方法、ロボットシステム、プログラム及び記録媒体
CN108463313A (zh) * 2016-02-02 2018-08-28 Abb瑞士股份有限公司 机器人系统校准
US11230011B2 (en) 2016-02-02 2022-01-25 Abb Schweiz Ag Robot system calibration
CN109196429B (zh) * 2016-03-29 2021-10-15 康格尼博提克斯股份公司 用于确定操纵器的几何特性的方法、约束装置和系统
WO2017167687A2 (fr) 2016-03-29 2017-10-05 Cognibotics Ab Procédé, dispositif de contrainte et système de détermination des propriétés géométriques d'un manipulateur
CN109196429A (zh) * 2016-03-29 2019-01-11 康格尼博提克斯股份公司 用于确定操纵器的几何特性的方法、约束装置和系统
US11192243B2 (en) 2016-03-29 2021-12-07 Cognibotics Ab Method, constraining device and system for determining geometric properties of a manipulator
CN105773613A (zh) * 2016-03-30 2016-07-20 东莞市速美达自动化有限公司 一种水平机器人相机坐标系校定方法
US10926414B2 (en) 2017-09-29 2021-02-23 Industrial Technology Research Institute System and method for calibrating tool center point of robot
CN108942927A (zh) * 2018-06-29 2018-12-07 齐鲁工业大学 一种基于机器视觉的像素坐标与机械臂坐标统一的方法
CN108942927B (zh) * 2018-06-29 2022-04-26 齐鲁工业大学 一种基于机器视觉的像素坐标与机械臂坐标统一的方法
CN111531547B (zh) * 2020-05-26 2021-10-26 华中科技大学 一种基于视觉测量的机器人标定及检测方法
CN111531547A (zh) * 2020-05-26 2020-08-14 华中科技大学 一种基于视觉测量的机器人标定及检测方法
CN112802122A (zh) * 2021-01-21 2021-05-14 珠海市运泰利自动化设备有限公司 机器人视觉引导组装方法
CN112802122B (zh) * 2021-01-21 2023-08-29 珠海市运泰利自动化设备有限公司 机器人视觉引导组装方法
CN113246128A (zh) * 2021-05-20 2021-08-13 菲烁易维(重庆)科技有限公司 一种基于视觉测量技术的机器人示教方法
CN113246128B (zh) * 2021-05-20 2022-06-21 菲烁易维(重庆)科技有限公司 一种基于视觉测量技术的机器人示教方法
CN113635311A (zh) * 2021-10-18 2021-11-12 杭州灵西机器人智能科技有限公司 固定标定板的眼在手外标定方法和系统

Similar Documents

Publication Publication Date Title
WO2012076038A1 (fr) Procédé permettant d'étalonner une unité de robot, ordinateur, unité de robot et utilisation d'une unité de robot
US10525597B2 (en) Robot and robot system
EP3705239B1 (fr) Système et procédé d'étalonnage pour cellules robotiques
US11396100B2 (en) Robot calibration for AR and digital twin
KR102276259B1 (ko) 비전-기반 조작 시스템들의 교정 및 동작
US8989897B2 (en) Robot-cell calibration
US9519736B2 (en) Data generation device for vision sensor and detection simulation system
US9043024B2 (en) Vision correction method for tool center point of a robot manipulator
EP2350750B1 (fr) Procédé et appareil d'étalonnage d'un système robotique industriel
US20200298411A1 (en) Method for the orientation of an industrial robot, and industrial robot
JP7153085B2 (ja) ロボットキャリブレーションシステム及びロボットキャリブレーション方法
US20190022867A1 (en) Automatic Calibration Method For Robot System
US20090234502A1 (en) Apparatus for determining pickup pose of robot arm with camera
US20140229005A1 (en) Robot system and method for controlling the same
JP2018012184A (ja) 制御装置、ロボットおよびロボットシステム
JP6235664B2 (ja) ロボットの機構パラメータを校正するために使用される計測装置
WO2012004232A2 (fr) Procédé pour étalonner un robot positionné sur une plateforme mobile
CN103442858A (zh) 机器人工作对象单元校准装置、系统、和方法
JP6900290B2 (ja) ロボットシステム
Mustafa et al. A geometrical approach for online error compensation of industrial manipulators
JP2006110705A5 (fr)
JP6565175B2 (ja) ロボットおよびロボットシステム
JP6869159B2 (ja) ロボットシステム
WO2014206787A1 (fr) Procédé d'étalonnage de robot
US20190030722A1 (en) Control device, robot system, and control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10784327

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10784327

Country of ref document: EP

Kind code of ref document: A1