WO2018142582A1 - Conversion coefficient calculation device, conversion coefficient calculation method, and conversion coefficient calculation program - Google Patents

Conversion coefficient calculation device, conversion coefficient calculation method, and conversion coefficient calculation program

Info

Publication number
WO2018142582A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
conversion coefficient
sensor
coefficient calculation
robot
Prior art date
Application number
PCT/JP2017/004006
Other languages
English (en)
Japanese (ja)
Inventor
衛 三浦
Original Assignee
三菱電機株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三菱電機株式会社
Priority to PCT/JP2017/004006 priority Critical patent/WO2018142582A1/fr
Priority to JP2017536596A priority patent/JP6275345B1/ja
Priority to CN201780084839.XA priority patent/CN110267771A/zh
Priority to TW106117839A priority patent/TW201830272A/zh
Publication of WO2018142582A1 publication Critical patent/WO2018142582A1/fr

Links

Images

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/10 Programme-controlled manipulators characterised by positioning means for manipulator elements

Definitions

  • the present invention relates to a conversion coefficient calculation device, a conversion coefficient calculation method, and a conversion coefficient calculation program for calculating a conversion coefficient used when converting a position in a sensor coordinate system to a position in a robot coordinate system.
  • the robot system is, for example, a system including a robot arm to which a robot hand that holds an object such as a processing target is attached.
  • the robot system also includes a sensor that measures the periphery of the object and the robot hand, and controls the position and orientation of the robot arm and the robot hand using sensor information measured by the sensor.
  • the sensor information measured by the sensor is information indicating a three-dimensional position of the object or the like in the sensor coordinate system. Therefore, in order for the robot system to be able to use the sensor information when controlling the position and orientation of the robot arm or the like, the three-dimensional position indicated by the sensor information needs to be converted from the sensor coordinate system to the robot coordinate system.
  • the sensor coordinate system is a coordinate system centered on the sensor, and the robot coordinate system is a coordinate system centered on the robot.
  • Patent Document 1 discloses a conversion coefficient calculation apparatus that calculates a coordinate conversion coefficient by having a robot hand hold a marker for image recognition and photographing the marker with a camera serving as the sensor.
  • this conversion coefficient calculation apparatus calculates a parameter indicating the relative positional relationship between the camera serving as the sensor and the robot, based on the position of the marker in the image captured by the camera, expressed in the sensor coordinate system (camera coordinate system), and the position of the marker held by the robot hand, expressed in the robot coordinate system.
  • the parameter indicating the relative positional relationship between the camera as a sensor and the robot corresponds to a coordinate conversion coefficient used when converting a three-dimensional position in the sensor coordinate system to a three-dimensional position in the robot coordinate system.
  • the conversion coefficient calculation apparatus uses the parameter indicating the relative positional relationship to convert the three-dimensional position of the object or the like, indicated by the sensor information measured by the sensor, into a three-dimensional position in the robot coordinate system.
  • since the conventional conversion coefficient calculation apparatus is configured as described above, it can calculate the parameter indicating the relative positional relationship between the camera serving as the sensor and the robot, provided that the sensor is a camera and the robot hand grasps the marker. However, when the sensor is a three-dimensional sensor such as a laser scanner, the marker cannot be photographed, so there has been a problem that the parameter indicating the relative positional relationship between the sensor and the robot cannot be calculated.
  • the present invention has been made to solve the above-described problem, and an object of the present invention is to obtain a conversion coefficient calculation device, a conversion coefficient calculation method, and a conversion coefficient calculation program that can calculate the conversion coefficient used when converting a position in the sensor coordinate system to a position in the robot coordinate system, even when the sensor is a three-dimensional sensor such as a laser scanner.
  • the conversion coefficient calculation apparatus according to the invention includes: a position/orientation information acquisition unit that acquires position and orientation information indicating the position and orientation of a robot hand to which an object is fixed, and identifies, from the position and orientation information, the position of the object in the robot coordinate system or a plane parameter of the object in the robot coordinate system; a sensor information acquisition unit that acquires, from a sensor measuring the object fixed to the robot hand, sensor information indicating the position of the object in the sensor coordinate system or a plane parameter of the object in the sensor coordinate system; and a conversion coefficient calculation unit that, every time the position or orientation of the robot hand is switched, holds a set of the position or plane parameter in the robot coordinate system identified by the position/orientation information acquisition unit and the position or plane parameter in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit, and calculates, from the plurality of held sets, the conversion coefficient used for converting a position in the sensor coordinate system to a position in the robot coordinate system.
  • according to the invention, every time the position or orientation of the robot hand is switched, the conversion coefficient calculation unit holds a set of the position or plane parameter in the robot coordinate system identified by the position/orientation information acquisition unit and the position or plane parameter in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit, and calculates the conversion coefficient from the plurality of held sets. This has the effect that the conversion coefficient used when converting a position in the sensor coordinate system to a position in the robot coordinate system can be calculated even when the sensor is a three-dimensional sensor such as a laser scanner.
  • FIG. 1 is a block diagram showing a robot system in which a conversion coefficient calculation device 4 according to Embodiment 1 of the present invention is mounted.
  • a robot hand 1 is attached to the tip of a robot arm or the like, and an object 10 such as a processing target is fixed to the robot hand 1.
  • the robot arm includes at least one joint, and the position or posture of the robot hand 1 is switched by controlling the one or more joints.
  • the robot hand 1 has a function of gripping the object 10.
  • the robot hand 1 only needs to be able to fix the object 10, and any gripping method may be used. For example, the object 10 may be clamped by opening and closing a plurality of fingers, or gripped by magnetic force or suction.
  • the robot control unit 2 is a control device that switches the position or posture of the robot hand 1 by controlling the joints of the robot arm or the robot hand 1.
  • the posture of the robot hand 1 corresponds to, for example, an angle with respect to the surface on which the robot is placed.
  • the sensor 3 is a three-dimensional sensor such as a laser scanner that measures the object 10 held by the robot hand 1, and outputs sensor information indicating the measurement result of the object 10.
  • the sensor information output from the sensor 3 is a set of three-dimensional coordinates (X, Y, Z) of the surface of the object 10 in the space.
  • as methods for the sensor 3 to measure the object 10, a method based on stereo vision, a Time-of-Flight method, and the like are known.
  • the conversion coefficient calculation device 4 includes a position / orientation information acquisition unit 11, a sensor information acquisition unit 12, a conversion coefficient calculation unit 13, and a coordinate conversion unit 14.
  • FIG. 2 is a hardware configuration diagram showing the conversion coefficient calculation device 4 according to Embodiment 1 of the present invention.
  • the position/orientation information acquisition unit 11 is realized by, for example, the position/orientation information acquisition circuit 21 illustrated in FIG. 2, and performs a process of acquiring position and orientation information indicating the position and orientation of the robot hand 1 and specifying, from the position and orientation information, the three-dimensional position of the object 10 in the robot coordinate system.
  • the sensor information acquisition unit 12 is realized by, for example, the sensor information acquisition circuit 22 illustrated in FIG. 2, and performs a process of acquiring, from the sensor 3, sensor information indicating the three-dimensional position of the object 10 in the sensor coordinate system.
  • the sensor coordinate system is a coordinate system centered on the sensor 3
  • the robot coordinate system is a coordinate system centered on the robot hand 1.
  • the conversion coefficient calculation unit 13 is realized by, for example, the conversion coefficient calculation circuit 23 illustrated in FIG. 2.
  • every time the position or posture of the robot hand 1 is switched by the robot control unit 2, the conversion coefficient calculation unit 13 holds, in the storage unit 13a, a set of the three-dimensional position in the robot coordinate system specified by the position/orientation information acquisition unit 11 and the three-dimensional position in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit 12.
  • from the plurality of sets held in the storage unit 13a, the conversion coefficient calculation unit 13 performs a process of calculating a rotation matrix R and a translation vector t as the conversion coefficients used when converting a three-dimensional position in the sensor coordinate system into a three-dimensional position in the robot coordinate system.
  • every time it holds a set of a three-dimensional position in the robot coordinate system and a three-dimensional position in the sensor coordinate system, the conversion coefficient calculation unit 13 performs a process of outputting, to the robot control unit 2, a control signal instructing switching of the position or orientation of the robot hand 1.
  • the coordinate conversion unit 14 is realized by, for example, a coordinate conversion circuit 24 shown in FIG.
  • using the rotation matrix R and the translation vector t, which are the conversion coefficients calculated by the conversion coefficient calculation unit 13, the coordinate conversion unit 14 performs a process of converting the three-dimensional position in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit 12 into a three-dimensional position in the robot coordinate system.
  • here, it is assumed that each of the position/orientation information acquisition unit 11, the sensor information acquisition unit 12, the conversion coefficient calculation unit 13, and the coordinate conversion unit 14, which are the components of the conversion coefficient calculation device 4, is realized by dedicated hardware as illustrated in FIG. 2, namely the position/orientation information acquisition circuit 21, the sensor information acquisition circuit 22, the conversion coefficient calculation circuit 23, and the coordinate conversion circuit 24.
  • the storage unit 13a included in the conversion coefficient calculation unit 13 is realized by, for example, a storage processing circuit.
  • this storage processing circuit is, for example, a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory).
  • the position/orientation information acquisition circuit 21, the sensor information acquisition circuit 22, the conversion coefficient calculation circuit 23, and the coordinate conversion circuit 24 correspond to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • the components of the conversion coefficient calculation device 4 are not limited to those realized by dedicated hardware; the conversion coefficient calculation device 4 may be realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is stored as a program in the memory of a computer.
  • the computer means hardware that executes a program, and corresponds to, for example, a CPU (Central Processing Unit), a processing unit, an arithmetic unit, a microprocessor, a microcomputer, a processor, or a DSP (Digital Signal Processor).
  • FIG. 3 is a hardware configuration diagram of a computer when the conversion coefficient calculation device 4 is realized by software or firmware.
  • when the conversion coefficient calculation device 4 is realized by software or firmware, the storage unit 13a included in the conversion coefficient calculation unit 13 is configured on the memory 31 of the computer, and a conversion coefficient calculation program for causing the computer to execute the position/orientation information acquisition processing procedure of the position/orientation information acquisition unit 11, the sensor information acquisition processing procedure of the sensor information acquisition unit 12, the conversion coefficient calculation processing procedure of the conversion coefficient calculation unit 13, and the coordinate conversion processing procedure of the coordinate conversion unit 14 is stored in the memory 31. The processor 32 of the computer then executes the program stored in the memory 31.
  • a communication circuit 33 is a circuit that receives position and orientation information from the robot hand 1, receives sensor information from the sensor 3, and outputs a control signal to the robot control unit 2.
  • the display device 34 is realized by, for example, a liquid crystal display, and displays the processing state of the processor 32 and the like.
  • FIG. 4 is a flowchart showing a conversion coefficient calculation method corresponding to a processing procedure when the conversion coefficient calculation device 4 is realized by software or firmware.
  • the robot hand 1 holds the object 10.
  • the robot control unit 2 switches the position or posture of the robot hand 1 by changing the angles of the joints of the robot arm or the robot hand 1 in accordance with the control signal output from the conversion coefficient calculation unit 13.
  • in the first embodiment, for convenience of explanation, it is assumed that the joints of the robot arm or the robot hand 1 are controlled N times, where N is an integer of 3 or more.
  • the position P_r^n is represented by the coordinate X_r on the X axis, the coordinate Y_r on the Y axis, and the coordinate Z_r on the Z axis of the robot coordinate system, as shown in the following equation (1): P_r^n = (X_r, Y_r, Z_r)^T (1)
  • here, T is the symbol denoting transposition; for example, x^T is the transpose of the vector x.
  • the position/orientation information acquisition unit 11 acquires the position and orientation information indicating the position and orientation of the robot hand 1, and identifies the three-dimensional position P_r^n of the object 10 in the robot coordinate system from the position and orientation information (step ST2 in FIG. 4). The process of specifying the three-dimensional position P_r^n in the robot coordinate system from the position and orientation information is a known technique, so a detailed description is omitted; for example, the three-dimensional position P_r^n can be identified by the process shown below.
  • a table showing the correspondence between the position and orientation of the robot hand 1 and the three-dimensional position P_r^n of the object 10 gripped by the robot hand 1 is prepared in advance, and the position/orientation information acquisition unit 11 refers to the table to identify the three-dimensional position P_r^n corresponding to the position and orientation indicated by the position and orientation information, as illustrated by the sketch below.
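  • as an illustration of this table-based identification, the following Python sketch shows one possible lookup; the table contents, the key format, and the helper name are hypothetical, since the patent does not specify how the table is organized.

```python
# Hypothetical sketch of the correspondence table described above.
# Keys are discretized poses of robot hand 1 (position + Euler angles);
# values are the corresponding positions P_r^n of object 10 in the
# robot coordinate system. Neither the key format nor the values are
# taken from the patent.
pose_to_position = {
    (0.40, 0.00, 0.30, 0.0, 0.0, 0.0): (0.40, 0.00, 0.25),
    (0.40, 0.10, 0.30, 0.0, 0.0, 0.0): (0.40, 0.10, 0.25),
}

def identify_object_position(pose):
    """Return P_r^n for the pose indicated by the position/orientation info."""
    key = tuple(round(v, 2) for v in pose)
    return pose_to_position[key]
```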
  • the sensor information output from the sensor 3 is three-dimensional point cloud data p_1^n, p_2^n, ..., p_M^n indicating the positions of M points on the surface of the object 10.
  • using shape data indicating the shape of the object 10, the sensor information acquisition unit 12 calculates the three-dimensional position P_s^n of the object 10 in the sensor coordinate system from the three-dimensional point cloud data p_1^n, p_2^n, ..., p_M^n.
  • in the first embodiment, since the robot hand 1 holds the spherical object 10, the three-dimensional point cloud data p_1^n, p_2^n, ..., p_M^n are all points on the surface of the sphere.
  • therefore, the sensor information acquisition unit 12 uses the equation of a sphere as the shape data indicating the shape of the object 10, and from the three-dimensional point cloud data p_1^n, p_2^n, ..., p_M^n obtains the three-dimensional position P_s^n as the center position (X_0, Y_0, Z_0) of the object 10.
  • the equation of the sphere is represented by the following equation (3): (X - X_0)^2 + (Y - Y_0)^2 + (Z - Z_0)^2 = r^2 (3)
  • r is the radius of the sphere.
  • the squared error between the sphere of equation (3) and the M points of three-dimensional point cloud data p_1^n, p_2^n, ..., p_M^n is expressed by the following equation (4).
  • here, ||x|| is a symbol representing the norm of the vector x.
  • the sensor information acquisition unit 12 calculates, as the three-dimensional position P_s^n in the sensor coordinate system, the P_s^n that minimizes the squared error expressed by equation (4).
  • as a method for calculating the P_s^n that minimizes the squared error represented by equation (4), a method such as the "Direct Least Squares Algorithm" disclosed in Non-Patent Document 1 can be used.
  • the object 10 may be an ellipsoid rather than a true sphere; in that case, an equation representing the shape of the ellipsoid may be used instead of the equation of the sphere.
  • Non-Patent Document 1: David Eberly, "Least Squares Fitting of Data," July 15, 1999, https://www.geometrictools.com/Documentation/LeastSquaresFitting.pdf
  • Non-Patent Document 2 describes a method for automatically extracting, from the M points of three-dimensional point cloud data p_1^n, p_2^n, ..., p_M^n, the measurement results of the region corresponding to the object 10 gripped by the robot hand 1.
  • Non-Patent Document 2: Anas Abuzaina et al., "Sphere Detection in Kinect Point Clouds via the 3D Hough Transform," Computer Analysis of Images and Patterns, vol. 8048, pp. 290-297, Springer, 2013
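  • as a concrete illustration of the sphere fit described above, the following is a minimal numpy sketch of an algebraic least-squares fit that recovers the center position (X_0, Y_0, Z_0) and the radius r from the point cloud. It linearizes the sphere equation rather than reproducing the exact algorithm of Non-Patent Document 1, so it should be read as an assumption-laden example, not the patent's implementation.

```python
import numpy as np

def fit_sphere_center(points):
    """Least-squares sphere fit: returns (center, radius).

    points: (M, 3) array of point cloud data p_1^n, ..., p_M^n on the
    sphere surface. Rewriting ||p - c||^2 = r^2 as
    ||p||^2 = 2 c . p + (r^2 - ||c||^2) makes the problem linear in the
    unknowns (c, k), with k = r^2 - ||c||^2.
    """
    p = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])  # unknowns (c, k)
    b = np.sum(p * p, axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, k = x[:3], x[3]
    radius = float(np.sqrt(k + center @ center))
    return center, radius  # center is used as P_s^n
```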
  • the conversion coefficient calculation unit 13 acquires the three-dimensional position P_r^n of the object 10 in the robot coordinate system specified by the position/orientation information acquisition unit 11 and the three-dimensional position P_s^n of the object 10 in the sensor coordinate system identified by the sensor information acquisition unit 12, and stores the set of the three-dimensional position P_r^n in the robot coordinate system and the three-dimensional position P_s^n in the sensor coordinate system in the storage unit 13a (step ST4 in FIG. 4).
  • after storing the set of the three-dimensional position P_r^n in the robot coordinate system and the three-dimensional position P_s^n in the sensor coordinate system in the storage unit 13a, the conversion coefficient calculation unit 13 determines whether the number of times n that the joints of the robot arm or the robot hand 1 have been controlled has reached N (step ST5 in FIG. 4).
  • N is an integer equal to or greater than 3.
  • at this point n = 1, so it is determined that the number of times n that the joints of the robot arm or the robot hand 1 have been controlled has not reached N (step ST5 in FIG. 4: NO). In this case, the conversion coefficient calculation unit 13 outputs a control signal instructing switching of the position or posture of the robot hand 1 to the robot control unit 2 (step ST6 in FIG. 4), and steps ST1 to ST5 are repeated, as summarized by the sketch below.
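  • the loop of steps ST1 to ST6 and the final calculation in step ST7 can be summarized by the following sketch; robot, sensor, and poses are hypothetical stand-ins (the patent defines no such interfaces), identify_object_position and fit_sphere_center are the helpers sketched above, and estimate_rigid_transform is sketched after the description of equations (7) to (13) below.

```python
def collect_pairs_and_calibrate(robot, sensor, poses):
    """Sketch of steps ST1 to ST7 in FIG. 4.

    robot and sensor are hypothetical interfaces standing in for the
    robot control unit 2 and the sensor 3: robot.move_to(pose) switches
    the hand's position/posture, robot.get_pose() returns the position
    and orientation information, and sensor.scan() returns an (M, 3)
    point cloud. poses must contain N >= 3 distinct hand poses.
    """
    pairs = []                                            # storage unit 13a
    for pose in poses:
        robot.move_to(pose)                               # ST1
        P_r = identify_object_position(robot.get_pose())  # ST2
        P_s, _ = fit_sphere_center(sensor.scan())         # ST3
        pairs.append((P_r, P_s))                          # ST4: hold the set
    return estimate_rigid_transform(pairs)                # ST7, sketched later
```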
  • when the robot control unit 2 receives a control signal from the conversion coefficient calculation unit 13, it performs control to switch the position or posture of the robot hand 1 by changing the angles of the joints of the robot arm or the robot hand 1 in accordance with the control signal.
  • as a result, the object 10 gripped by the robot hand 1 is moved to the position P_r^n in the robot coordinate system (step ST1 in FIG. 4).
  • for example, when the number of times n that the joints of the robot arm or the robot hand 1 have been controlled is 2, the object 10 is moved to the position P_r^2 in the robot coordinate system, and when n is 3, it is moved to the position P_r^3.
  • the position/orientation information acquisition unit 11 acquires the position and orientation information indicating the position and orientation of the robot hand 1, and identifies the three-dimensional position P_r^n of the object 10 in the robot coordinate system from the position and orientation information (step ST2 in FIG. 4).
  • the sensor 3 measures the object 10 that has been moved to the position P_r^n in the robot coordinate system by the robot control unit 2, and outputs sensor information indicating the three-dimensional position P_s^n of the object 10 in the sensor coordinate system as the measurement result of the object 10.
  • the sensor information acquisition unit 12 acquires the sensor information output from the sensor 3, and identifies the three-dimensional position P_s^n of the object 10 in the sensor coordinate system from the sensor information (step ST3 in FIG. 4).
  • the conversion coefficient calculation unit 13 again acquires the three-dimensional position P_r^n of the object 10 in the robot coordinate system specified by the position/orientation information acquisition unit 11 and the three-dimensional position P_s^n of the object 10 in the sensor coordinate system identified by the sensor information acquisition unit 12, and stores the set of the two positions in the storage unit 13a (step ST4 in FIG. 4).
  • after storing the set in the storage unit 13a, the conversion coefficient calculation unit 13 again determines whether the number of times n that the joints of the robot arm or the robot hand 1 have been controlled has reached N (step ST5 in FIG. 4).
  • when the conversion coefficient calculation unit 13 determines that the number of times n that the joints of the robot arm or the robot hand 1 have been controlled has reached N (step ST5 in FIG. 4: YES), it calculates the rotation matrix R and the translation vector t (step ST7 in FIG. 4). The rigid transformation that converts the three-dimensional position P_s^n in the sensor coordinate system into the three-dimensional position P_r^n in the robot coordinate system is expressed, using the rotation matrix R and the translation vector t, by the following equation (5): P_r^n = R P_s^n + t (5)
  • the calculation process of the rotation matrix R and the translation vector t by the conversion coefficient calculation unit 13 will be specifically described.
  • the conversion coefficient calculation unit 13 acquires the N sets stored in the storage unit 13a; that is, as shown below, it acquires the N sets of the three-dimensional position P_r^n in the robot coordinate system and the three-dimensional position P_s^n in the sensor coordinate system.
  • here, n = 1, 2, ..., N.
  • when the conversion coefficient calculation unit 13 has acquired the N sets of the three-dimensional position P_r^n in the robot coordinate system and the three-dimensional position P_s^n in the sensor coordinate system, it calculates the rotation matrix R and the translation vector t from the N sets as shown below.
  • as a method for calculating the rotation matrix R from the N sets, the method described in Non-Patent Document 3 can be used; it brings the three-dimensional positions P_s^n as close as possible to the three-dimensional positions P_r^n over the N sets. That is, the method described in Non-Patent Document 3 calculates the rotation matrix R that minimizes the following equation (6).
  • Non-Patent Document 3: Yasushi Yagi et al. (eds.), "Computer Vision Cutting-Edge Guide 3," Adcom Media Co., Ltd., December 8, 2010, pp. 36-37
  • the conversion coefficient calculation unit 13 first calculates a covariance matrix A represented by the following equation (7).
  • μ_r is the barycentric coordinate of the observation points of the object 10 in the robot coordinate system, and is expressed by the following equation (8): μ_r = (1/N) Σ_{n=1}^{N} P_r^n (8)
  • μ_s is the barycentric coordinate of the observation points of the object 10 in the sensor coordinate system, and is expressed by the following equation (9): μ_s = (1/N) Σ_{n=1}^{N} P_s^n (9)
  • after calculating the covariance matrix A, the conversion coefficient calculation unit 13 performs singular value decomposition on the covariance matrix A as shown in the following equation (10).
  • U and V are unitary matrices, and the unitary matrices U and V are obtained by singular value decomposition of the covariance matrix A.
  • the conversion coefficient calculation unit 13 calculates the rotation matrix R using the unitary matrices U and V as shown in the following equation (11).
  • det (X) represents the determinant of the matrix X.
  • next, the conversion coefficient calculation unit 13 calculates the translation vector t using the rotation matrix R and the barycentric coordinates μ_r and μ_s of the observation points of the object 10, as shown in the following equation (12): t = μ_r - R μ_s (12)
  • the coordinate conversion unit 14 acquires the rotation matrix R and the translation vector t calculated by the conversion coefficient calculation unit 13, and stores the rotation matrix R and the translation vector t. Thereafter, upon receiving sensor information from the sensor information acquisition unit 12, the coordinate conversion unit 14 substitutes the stored rotation matrix R and translation vector t and the three-dimensional position P_s of the object 10 in the sensor coordinate system indicated by the sensor information into the following equation (13), thereby converting the three-dimensional position P_s of the object 10 in the sensor coordinate system into the three-dimensional position P_r in the robot coordinate system (step ST8 in FIG. 4): P_r = R P_s + t (13)
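  • the computation of equations (7) to (13) can be sketched as follows. The equation images are not reproduced in this text, so the covariance ordering and sign conventions below are the standard ones for this SVD-based method (cf. Non-Patent Document 3), assumed rather than copied from the patent.

```python
import numpy as np

def estimate_rigid_transform(pairs):
    """Estimate (R, t) such that P_r is approximately R @ P_s + t.

    pairs: N >= 3 sets of (P_r^n, P_s^n) from storage unit 13a.
    Standard SVD method: centroids, covariance of the centered points,
    singular value decomposition, and a determinant correction so that
    R is a proper rotation (det R = +1).
    """
    Pr = np.array([pr for pr, _ in pairs], dtype=float)  # (N, 3)
    Ps = np.array([ps for _, ps in pairs], dtype=float)
    mu_r, mu_s = Pr.mean(axis=0), Ps.mean(axis=0)        # cf. eqs. (8), (9)
    A = (Pr - mu_r).T @ (Ps - mu_s)                      # cf. eq. (7)
    U, _, Vt = np.linalg.svd(A)                          # cf. eq. (10)
    D = np.diag([1.0, 1.0, np.linalg.det(U @ Vt)])       # reflection guard
    R = U @ D @ Vt                                       # cf. eq. (11)
    t = mu_r - R @ mu_s                                  # cf. eq. (12)
    return R, t

def sensor_to_robot(R, t, P_s):
    """Coordinate conversion unit 14: apply P_r = R P_s + t (eq. (13))."""
    return R @ np.asarray(P_s, dtype=float) + t
```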
  • as is apparent from the above, according to the first embodiment, every time the position or orientation of the robot hand 1 is switched, the conversion coefficient calculation unit 13 holds a set of the position in the robot coordinate system specified by the position/orientation information acquisition unit 11 and the position in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit 12, and calculates, from the plurality of held sets, the conversion coefficient used when converting a position in the sensor coordinate system to a position in the robot coordinate system. Therefore, there is an effect that the conversion coefficient can be calculated even when the sensor 3 is a three-dimensional sensor such as a laser scanner.
  • Embodiment 2. In the first embodiment, an example in which the object 10 gripped by the robot hand 1 is a sphere has been described; in the second embodiment, an example in which the object 10 gripped by the robot hand 1 is a flat plate will be described.
  • FIG. 5 is a block diagram showing a robot system in which the conversion coefficient calculation apparatus 6 according to Embodiment 2 of the present invention is mounted.
  • the sensor 5 is a three-dimensional sensor such as a laser scanner that measures the object 10 held by the robot hand 1, and outputs sensor information indicating the measurement result of the object 10.
  • the sensor information output from the sensor 5 is a set of parameters (hereinafter referred to as "plane parameters") representing the plane that is the surface of the object 10 in the space.
  • as methods for the sensor 5 to measure the object 10, a method based on stereo vision, a Time-of-Flight method, and the like are known.
  • the sensor 5 outputs sensor information that is a set of plane parameters, and also outputs sensor information that is a set of three-dimensional coordinates (X, Y, Z) of the surface of the object 10, in the same manner as the sensor 3 in FIG. 1.
  • the conversion coefficient calculation device 6 includes a position / orientation information acquisition unit 41, a sensor information acquisition unit 42, a conversion coefficient calculation unit 43, and a coordinate conversion unit 44.
  • FIG. 6 is a hardware configuration diagram showing a conversion coefficient calculation apparatus 6 according to Embodiment 2 of the present invention.
  • the position/orientation information acquisition unit 41 is realized by, for example, the position/orientation information acquisition circuit 51 illustrated in FIG. 6, and performs a process of acquiring position and orientation information indicating the position and orientation of the robot hand 1 and specifying, from the position and orientation information, the plane parameter of the object 10 in the robot coordinate system.
  • the sensor information acquisition unit 42 is realized by, for example, the sensor information acquisition circuit 52 illustrated in FIG. 6, and performs a process of acquiring sensor information indicating a plane parameter in the sensor coordinate system of the object 10 from the sensor 5.
  • the conversion coefficient calculation unit 43 is realized by, for example, the conversion coefficient calculation circuit 53 illustrated in FIG. 6.
  • every time the position or posture of the robot hand 1 is switched by the robot control unit 2, the conversion coefficient calculation unit 43 holds, in the storage unit 43a, a set of the plane parameter in the robot coordinate system specified by the position/orientation information acquisition unit 41 and the plane parameter in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit 42.
  • from the plurality of sets held in the storage unit 43a, the conversion coefficient calculation unit 43 performs a process of calculating a rotation matrix R and a translation vector t as the conversion coefficients used when converting a three-dimensional position in the sensor coordinate system into a three-dimensional position in the robot coordinate system.
  • in addition, every time it holds a set of plane parameters, the conversion coefficient calculation unit 43 performs a process of outputting, to the robot control unit 2, a control signal instructing switching of the position or orientation of the robot hand 1.
  • the coordinate conversion unit 44 is realized by, for example, a coordinate conversion circuit 54 shown in FIG.
  • using the rotation matrix R and the translation vector t, which are the conversion coefficients calculated by the conversion coefficient calculation unit 43, the coordinate conversion unit 44 performs a process of converting the three-dimensional position in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit 42 into a three-dimensional position in the robot coordinate system.
  • the components of the conversion coefficient calculation device 6 are realized by dedicated hardware.
  • the components of the conversion coefficient calculation device 6 are not limited to those realized by dedicated hardware; the conversion coefficient calculation device 6 may be realized by software, firmware, or a combination of software and firmware.
  • when the conversion coefficient calculation device 6 is realized by software or firmware, the storage unit 43a included in the conversion coefficient calculation unit 43 is configured on the memory 31 of the computer shown in FIG. 3, and the processor 32 of the computer executes a conversion coefficient calculation program stored in the memory 31.
  • FIG. 7 is a flowchart showing a conversion coefficient calculation method corresponding to the processing procedure when the conversion coefficient calculation device 6 is realized by software or firmware.
  • in the second embodiment, the flat plate that is the object 10 is treated as a plane extending infinitely in the robot coordinate system, and the plane of the flat plate is represented by the plane parameter π_r shown in the following equation (14): π_r = (ν_r^T, d_r)^T (14)
  • here, ν_r is the normal vector of the plane, and d_r is the distance from the origin of the robot coordinate system to the plane.
  • the origin of the robot coordinate system is, for example, the center position of the robot hand 1.
  • the robot hand 1 holds a flat object 10.
  • the robot control unit 2 switches the position or posture of the robot hand 1 by controlling the joints of the robot arm or the robot hand 1. In the second embodiment, as in the first embodiment, the robot control unit 2 switches the position or posture of the robot hand 1 by changing the angles of the joints of the robot arm or the robot hand 1 in accordance with the control signal. Also in the second embodiment, for convenience of explanation, it is assumed that the joints of the robot arm or the robot hand 1 are controlled N times, where N is an integer of 3 or more.
  • when the joints have been controlled for the n-th time, the plane of the object 10 gripped by the robot hand 1 is represented by the plane parameter π_r^n shown in the following equation (15) (step ST11 in FIG. 7): π_r^n = ((ν_r^n)^T, d_r^n)^T (15)
  • the position/orientation information acquisition unit 41 acquires the position and orientation information indicating the position and orientation of the robot hand 1, and specifies the plane parameter π_r^n of the object 10 in the robot coordinate system from the position and orientation information (step ST12 in FIG. 7). The process of specifying the plane parameter π_r^n in the robot coordinate system from the position and orientation information is a known technique, so a detailed description is omitted; for example, the plane parameter π_r^n can be identified by the process shown below.
  • a table showing the correspondence between the position and orientation of the robot hand 1 and the plane parameter π_r^n of the object 10 gripped by the robot hand 1 is prepared in advance, and the position/orientation information acquisition unit 41 refers to the table to identify the plane parameter π_r^n corresponding to the position and orientation indicated by the position and orientation information.
  • the sensor information output from the sensor 5 is three-dimensional point cloud data p_1^n, p_2^n, ..., p_M^n indicating the positions of M points on the surface of the object 10.
  • using shape data indicating the shape of the object 10, the sensor information acquisition unit 42 calculates the plane parameter π_s^n of the object 10 in the sensor coordinate system from the three-dimensional point cloud data p_1^n, p_2^n, ..., p_M^n.
  • in the second embodiment, since the robot hand 1 grips the flat-plate object 10, the three-dimensional point cloud data p_1^n, p_2^n, ..., p_M^n are all points on the surface of the flat plate.
  • therefore, the sensor information acquisition unit 42 uses the equation of a plane as the shape data indicating the shape of the object 10, and from the three-dimensional point cloud data p_1^n, p_2^n, ..., p_M^n obtains the plane parameter π_s^n of the object 10 in the sensor coordinate system, as illustrated by the sketch below.
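  • a least-squares plane fit that recovers the plane parameter from the point cloud can be sketched as follows; since the patent's plane-equation images are not reproduced, the convention ν · x = d with a unit normal ν is an assumption here.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (nu, d) with nu . x = d, ||nu|| = 1.

    points: (M, 3) array of point cloud data on the flat plate. The unit
    normal nu is the singular vector of the centered points with the
    smallest singular value; d is then the plane's distance from the
    origin (sign fixed so that d >= 0).
    """
    p = np.asarray(points, dtype=float)
    centroid = p.mean(axis=0)
    _, _, Vt = np.linalg.svd(p - centroid, full_matrices=False)
    nu = Vt[-1]
    d = float(nu @ centroid)
    if d < 0.0:
        nu, d = -nu, -d
    return nu, d  # plane parameter pi_s^n = (nu^T, d)^T
```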
  • the conversion coefficient calculation unit 43 acquires the plane parameter π_r^n of the object 10 in the robot coordinate system specified by the position/orientation information acquisition unit 41 and the plane parameter π_s^n of the object 10 in the sensor coordinate system identified by the sensor information acquisition unit 42, and stores the set of the plane parameter π_r^n in the robot coordinate system and the plane parameter π_s^n in the sensor coordinate system in the storage unit 43a (step ST14 in FIG. 7).
  • after storing the set of the plane parameter π_r^n in the robot coordinate system and the plane parameter π_s^n in the sensor coordinate system in the storage unit 43a, the conversion coefficient calculation unit 43 determines whether the number of times n that the joints of the robot arm or the robot hand 1 have been controlled has reached N (step ST15 in FIG. 7).
  • when the number of times n has not reached N (step ST15 in FIG. 7: NO), the conversion coefficient calculation unit 43 outputs a control signal instructing switching of the position or posture of the robot hand 1 to the robot control unit 2 (step ST16 in FIG. 7), and steps ST11 to ST15 are repeated.
  • upon receiving a control signal from the conversion coefficient calculation unit 43, the robot control unit 2 performs control to switch the position or posture of the robot hand 1 by changing the angles of the joints of the robot arm or the robot hand 1 in accordance with the control signal. As a result, the plane of the object 10 held by the robot hand 1 becomes the plane represented by the plane parameter π_r^n (step ST11 in FIG. 7).
  • the position/orientation information acquisition unit 41 acquires the position and orientation information indicating the position and orientation of the robot hand 1, and specifies the plane parameter π_r^n of the object 10 in the robot coordinate system from the position and orientation information (step ST12 in FIG. 7).
  • the sensor information acquisition unit 42 acquires the sensor information output from the sensor 5, and specifies the plane parameter π_s^n of the object 10 in the sensor coordinate system from the sensor information (step ST13 in FIG. 7).
  • the conversion coefficient calculation unit 43 again acquires the plane parameter π_r^n of the object 10 in the robot coordinate system specified by the position/orientation information acquisition unit 41 and the plane parameter π_s^n of the object 10 in the sensor coordinate system identified by the sensor information acquisition unit 42, and stores the set of the two plane parameters in the storage unit 43a (step ST14 in FIG. 7).
  • after storing the set in the storage unit 43a, the conversion coefficient calculation unit 43 again determines whether the number of times n that the joints of the robot arm or the robot hand 1 have been controlled has reached N (step ST15 in FIG. 7).
  • when the conversion coefficient calculation unit 43 determines that the number of times n that the joints of the robot arm or the robot hand 1 have been controlled has reached N (step ST15 in FIG. 7: YES), it calculates the rotation matrix R and the translation vector t (step ST17 in FIG. 7). The relationship between the plane parameter π_r^n in the robot coordinate system and the plane parameter π_s^n in the sensor coordinate system is expressed, using the rotation matrix R and the translation vector t, by the following equation (18). Hereinafter, the calculation process of the rotation matrix R and the translation vector t by the conversion coefficient calculation unit 43 will be specifically described.
  • the conversion coefficient calculation unit 43 acquires the N sets held in the storage unit 43a; that is, as shown below, it acquires the N sets of the plane parameter π_r^n in the robot coordinate system and the plane parameter π_s^n in the sensor coordinate system.
  • here, n = 1, 2, ..., N.
  • the set of π_r^1 and π_s^1, the set of π_r^2 and π_s^2, ..., the set of π_r^N and π_s^N
  • when the conversion coefficient calculation unit 43 has acquired the N sets of the plane parameter π_r^n in the robot coordinate system and the plane parameter π_s^n in the sensor coordinate system, it calculates the rotation matrix R and the translation vector t from the N sets as shown below.
  • the method described in Non-Patent Document 4 can be used as a method for calculating the rotation matrix R from the N sets.
  • the method described in Non-Patent Document 4 is a method for calculating a rotation matrix R that minimizes the following equation (19).
  • the conversion coefficient calculation unit 43 calculates a translation vector t by solving simultaneous linear equations shown in the following equation (20).
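  • the calculations referred to in equations (19) and (20) can be sketched as follows, assuming the same conventions as in the plane-fit sketch above (ν · x = d and x_r = R x_s + t, which give ν_r = R ν_s and ν_r · t = d_r - d_s); the published equations are not reproduced here, so these relations are inferred from the rigid transformation rather than copied from the patent.

```python
import numpy as np

def estimate_from_planes(pairs):
    """Estimate (R, t) from N >= 3 pairs of plane parameters.

    pairs: sets ((nu_r, d_r), (nu_s, d_s)) from storage unit 43a, under
    the assumed conventions nu . x = d and x_r = R x_s + t, so that
    nu_r = R nu_s and nu_r . t = d_r - d_s.
    """
    Nr = np.array([r[0] for r, _ in pairs], dtype=float)  # robot normals
    Ns = np.array([s[0] for _, s in pairs], dtype=float)  # sensor normals
    dr = np.array([r[1] for r, _ in pairs], dtype=float)
    ds = np.array([s[1] for _, s in pairs], dtype=float)
    # Rotation minimizing sum_n ||nu_r^n - R nu_s^n||^2 (cf. eq. (19)):
    A = Nr.T @ Ns
    U, _, Vt = np.linalg.svd(A)
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    # Translation from the linear system nu_r^n . t = d_r^n - d_s^n
    # (cf. eq. (20)), solved here in the least-squares sense:
    t, *_ = np.linalg.lstsq(Nr, dr - ds, rcond=None)
    return R, t
```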
  • the coordinate conversion unit 44 acquires the rotation matrix R and the translation vector t calculated by the conversion coefficient calculation unit 43, and stores the rotation matrix R and the translation vector t. Thereafter, upon receiving from the sensor information acquisition unit 42 sensor information indicating the three-dimensional position P_s of the object 10 in the sensor coordinate system, the coordinate conversion unit 44 substitutes the stored rotation matrix R and translation vector t and the three-dimensional position P_s indicated by the sensor information into the following equation (21), thereby converting the three-dimensional position P_s of the object 10 in the sensor coordinate system into the three-dimensional position P_r in the robot coordinate system (step ST18 in FIG. 7): P_r = R P_s + t (21)
  • as is apparent from the above, according to the second embodiment, every time the position or orientation of the robot hand 1 is switched, the conversion coefficient calculation unit 43 holds a set of the plane parameter in the robot coordinate system specified by the position/orientation information acquisition unit 41 and the plane parameter in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit 42, and calculates, from the plurality of held sets, the conversion coefficient used for converting a position in the sensor coordinate system into a position in the robot coordinate system. Therefore, there is an effect that the conversion coefficient can be calculated even when the sensor 5 is a three-dimensional sensor such as a laser scanner.
  • Embodiment 3. In the first embodiment, an example in which the object 10 gripped by the robot hand 1 is a sphere has been described, and in the second embodiment, an example in which the object 10 gripped by the robot hand 1 is a flat plate has been described.
  • the object 10 held by the robot hand 1 is not limited to a sphere or a flat plate.
  • in the third embodiment, the shape of the object 10 is a shape represented by a three-dimensional model, and the three-dimensional model is used as the shape data indicating the shape of the object 10.
  • the object 10 having a feature such as a cube having corners can be represented by a three-dimensional model.
  • points having features such as corners can be extracted from the three-dimensional point cloud data indicated by the sensor information.
  • as the three-dimensional model, a CAD model or polygon data of the object 10 held by the robot hand 1 can be used.
  • in the first embodiment, when calculating the three-dimensional position P_s^n of the object 10 in the sensor coordinate system, the sensor information acquisition unit 12 uses the equation of a sphere as the shape data representing the shape of the object 10; the third embodiment differs in that a three-dimensional model of the object 10 is used instead. That is, in the third embodiment, the sensor information acquisition unit 12 uses the three-dimensional model of the object 10 as the shape data indicating the shape of the object 10, and obtains the three-dimensional position P_s^n from the positions of points having features such as the corners of the object 10. The rest is the same as in the first embodiment, so a detailed description is omitted.
  • Embodiment 4. In the first to third embodiments, the conversion coefficient calculation unit 13 or the conversion coefficient calculation unit 43 calculates the conversion coefficients used when converting the three-dimensional position in the sensor coordinate system to the three-dimensional position in the robot coordinate system.
  • depending on the circumstances, a large error may be included in the measurement result of the sensor 3 or the sensor 5. If the measurement result of the sensor 3 or the sensor 5 includes a large error, sensor information including that large error is output from the sensor 3 or the sensor 5 to the conversion coefficient calculation unit 13 or the conversion coefficient calculation unit 43, and the accuracy of the conversion coefficient calculated by the conversion coefficient calculation unit 13 or the conversion coefficient calculation unit 43 deteriorates.
  • for example, when the object 10 is a flat plate, comparing the situation where the flat plate faces the sensor 5 with the situation where it does not, the sensor information measured in the situation where the flat plate does not face the sensor 5 contains a larger error.
  • the error included in the sensor information may not change even if the posture of the object 10 changes.
  • in the fourth embodiment, an example will be described in which the situation where the accuracy of the conversion coefficient calculated by the conversion coefficient calculation unit 13 or the conversion coefficient calculation unit 43 deteriorates depending on the posture of the object 10 is avoided.
  • FIG. 8 is a block diagram showing a robot system in which the conversion coefficient calculation device 6 according to Embodiment 4 of the present invention is mounted. In FIG. 8, the same reference numerals as those in FIG. 5 denote the same or corresponding parts.
  • FIG. 9 is a hardware configuration diagram showing the conversion coefficient calculation device 6 according to Embodiment 4 of the present invention. In FIG. 9, the same reference numerals as those in FIG. 6 denote the same or corresponding parts.
  • the conversion coefficient calculation device 6 includes a position / orientation information acquisition unit 41, a sensor information acquisition unit 42, a conversion coefficient calculation unit 45, a coordinate conversion unit 44, and a position / orientation adjustment unit 46.
  • the conversion coefficient calculation unit 45 is realized by, for example, the conversion coefficient calculation circuit 55 shown in FIG. 9. Like the conversion coefficient calculation unit 43 in FIG. 5, every time the position or posture of the robot hand 1 is switched by the robot control unit 2, the conversion coefficient calculation unit 45 holds, in the storage unit 45a, a set of the plane parameter in the robot coordinate system specified by the position/orientation information acquisition unit 41 and the plane parameter in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit 42.
  • from the plurality of sets held in the storage unit 45a, the conversion coefficient calculation unit 45 calculates the conversion coefficients used when converting a three-dimensional position in the sensor coordinate system into a three-dimensional position in the robot coordinate system.
  • in addition, every time it holds a set of a plane parameter in the robot coordinate system and a plane parameter in the sensor coordinate system, the conversion coefficient calculation unit 45 performs a process of outputting, to the robot control unit 2, a control signal instructing switching of the position or posture of the robot hand 1.
  • after the position or posture of the robot hand 1 is adjusted by the position/orientation adjustment unit 46, the conversion coefficient calculation unit 45 again holds, in the storage unit 45a, a set of the plane parameter in the robot coordinate system and the plane parameter in the sensor coordinate system every time the position or posture of the robot hand 1 is switched by the robot control unit 2.
  • then, from the plurality of sets held in the storage unit 45a, the conversion coefficient calculation unit 45 performs a process of calculating again the rotation matrix R and the translation vector t of the rigid transformation as the conversion coefficients used when converting a three-dimensional position in the sensor coordinate system into a three-dimensional position in the robot coordinate system.
  • the position/orientation adjustment unit 46 is realized by, for example, the position/orientation adjustment circuit 56 shown in FIG. 9.
  • the position / orientation adjustment unit 46 adjusts the position or orientation of the robot hand 1 by controlling the robot control unit 2 after the conversion coefficient is first calculated by the conversion coefficient calculation unit 45.
  • note that, like the conversion coefficient calculation unit 13 in FIG. 1, the conversion coefficient calculation unit 45 may also hold sets of the three-dimensional position in the robot coordinate system specified by the position/orientation information acquisition unit 11 and the three-dimensional position in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit 12, and calculate, from the plurality of sets held in the storage unit 45a, the rotation matrix R and the translation vector t of the rigid transformation as the conversion coefficients used when converting a three-dimensional position in the sensor coordinate system into a three-dimensional position in the robot coordinate system.
  • here, it is assumed that the components of the conversion coefficient calculation device 6 are realized by dedicated hardware.
  • the components of the conversion coefficient calculation device 6 are not limited to those realized by dedicated hardware; the conversion coefficient calculation device 6 may be realized by software, firmware, or a combination of software and firmware.
  • the storage unit 45a included in the conversion coefficient calculation unit 45 is configured on the memory 31 of the computer shown in FIG.
  • a conversion coefficient calculation program for causing the computer to execute the position/orientation information acquisition processing procedure of the position/orientation information acquisition unit 41, the sensor information acquisition processing procedure of the sensor information acquisition unit 42, the conversion coefficient calculation processing procedure of the conversion coefficient calculation unit 45, the coordinate conversion processing procedure of the coordinate conversion unit 44, and the processing procedure of the position/orientation adjustment unit 46 may be stored in the memory 31, and the processor 32 of the computer may execute the program stored in the memory 31.
  • first, like the conversion coefficient calculation unit 43 in the second embodiment, the conversion coefficient calculation unit 45 executes steps ST11 to ST17 in FIG. 7 to calculate the rotation matrix R and the translation vector t as the conversion coefficients.
  • after the position or posture of the robot hand 1 is adjusted by the position/orientation adjustment unit 46, the conversion coefficient calculation unit 45 executes steps ST11 to ST15 in FIG. 7 N times, thereby holding, in the storage unit 45a, N sets of the plane parameter π_r^n in the robot coordinate system and the plane parameter π_s^n in the sensor coordinate system. The conversion coefficient calculation unit 45 then reads the N sets from the storage unit 45a and calculates the rotation matrix R again from the N sets. When it has calculated the rotation matrix R again, the conversion coefficient calculation unit 45 calculates the translation vector t again using the rotation matrix R, as summarized by the sketch below.
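  • the two-stage procedure of the fourth embodiment can be summarized by the following sketch; the robot and sensor interfaces, the adjust callback standing in for the position/orientation adjustment unit 46, and the table-lookup helper identify_plane_parameter (analogous to the one in the second embodiment) are all hypothetical, while fit_plane and estimate_from_planes are the helpers sketched earlier.

```python
def calibrate_from_planes(robot, sensor, poses):
    """Plane-based calibration loop (steps ST11 to ST17 in FIG. 7)."""
    pairs = []                                            # storage unit 45a
    for pose in poses:
        robot.move_to(pose)                               # ST11
        pi_r = identify_plane_parameter(robot.get_pose()) # ST12
        pi_s = fit_plane(sensor.scan())                   # ST13
        pairs.append((pi_r, pi_s))                        # ST14: hold the set
    return estimate_from_planes(pairs)                    # ST17

def calibrate_with_adjustment(robot, sensor, poses, adjust):
    """Calibrate, let the adjustment unit 46 (here the hypothetical
    callback adjust) pick better hand poses using the first estimate,
    then calculate the conversion coefficients again."""
    R, t = calibrate_from_planes(robot, sensor, poses)    # first estimate
    better_poses = adjust(R, t, poses)                    # unit 46
    return calibrate_from_planes(robot, sensor, better_poses)
```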
  • as is apparent from the above, according to the fourth embodiment, the position/orientation adjustment unit 46 that adjusts the position or posture of the robot hand 1 is provided, and after the position or posture of the robot hand 1 is adjusted by the position/orientation adjustment unit 46, every time the position or posture of the robot hand 1 is switched again, the conversion coefficient calculation unit 45 holds a set of the plane parameter in the robot coordinate system specified by the position/orientation information acquisition unit 41 and the plane parameter in the sensor coordinate system indicated by the sensor information acquired by the sensor information acquisition unit 42, and calculates, from the plurality of held sets, the conversion coefficient used when converting a position in the sensor coordinate system into a position in the robot coordinate system. Therefore, there is an effect that the situation where the accuracy of the conversion coefficient deteriorates depending on the posture of the object 10 can be avoided.
  • the present invention is suitable for a conversion coefficient calculation device, a conversion coefficient calculation method, and a conversion coefficient calculation program for calculating a conversion coefficient used when converting a position in the sensor coordinate system to a position in the robot coordinate system.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

Every time the position or posture of a robot hand (1) is switched, a conversion coefficient calculation unit (13) holds a pair of positions, one of which is a position in a robot coordinate system specified by a position/orientation information acquisition unit (11), and the other of which is a position in a sensor coordinate system indicated by sensor information acquired by a sensor information acquisition unit (12); and it calculates, from the multiple held pairs, a conversion coefficient that is used when the position in the sensor coordinate system is converted into the position in the robot coordinate system.
PCT/JP2017/004006 2017-02-03 2017-02-03 Conversion coefficient calculation device, conversion coefficient calculation method, and conversion coefficient calculation program WO2018142582A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
PCT/JP2017/004006 WO2018142582A1 (fr) 2017-02-03 2017-02-03 Conversion coefficient calculation device, conversion coefficient calculation method, and conversion coefficient calculation program
JP2017536596A JP6275345B1 (ja) 2017-02-03 2017-02-03 Conversion coefficient calculation device, conversion coefficient calculation method, and conversion coefficient calculation program
CN201780084839.XA CN110267771A (zh) 2017-02-03 2017-02-03 Conversion coefficient calculation device, conversion coefficient calculation method, and conversion coefficient calculation program
TW106117839A TW201830272A (zh) 2017-02-03 2017-05-31 Conversion coefficient calculation device, conversion coefficient calculation method, and conversion coefficient calculation program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/004006 WO2018142582A1 (fr) 2017-02-03 2017-02-03 Conversion coefficient calculation device, conversion coefficient calculation method, and conversion coefficient calculation program

Publications (1)

Publication Number Publication Date
WO2018142582A1 true WO2018142582A1 (fr) 2018-08-09

Family

ID=61158391

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/004006 WO2018142582A1 (fr) 2017-02-03 2017-02-03 Conversion coefficient calculation device, conversion coefficient calculation method, and conversion coefficient calculation program

Country Status (4)

Country Link
JP (1) JP6275345B1 (fr)
CN (1) CN110267771A (fr)
TW (1) TW201830272A (fr)
WO (1) WO2018142582A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06190756A (ja) * 1992-12-25 1994-07-12 Mitsubishi Electric Corp 視覚センサ付きロボットの座標系の設定方法
JP2007061979A (ja) * 2005-09-01 2007-03-15 Sharp Corp ロボットアームの視覚センサ補正方法およびコンピュータプログラム
JP2009006452A (ja) * 2007-06-29 2009-01-15 Nissan Motor Co Ltd カメラとロボット間のキャリブレーション方法及びその装置
JP2014180720A (ja) * 2013-03-19 2014-09-29 Yaskawa Electric Corp ロボットシステム及びキャリブレーション方法
JP2016052695A (ja) * 2014-09-03 2016-04-14 キヤノン株式会社 ロボット装置、およびロボット装置の制御方法
JP2016103230A (ja) * 2014-11-28 2016-06-02 キヤノン株式会社 画像処理装置、画像処理方法、及びプログラム

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4218952B2 (ja) * 2003-09-30 2009-02-04 キヤノン株式会社 Data conversion method and apparatus
CN101419070B (zh) * 2008-12-03 2010-11-10 南京航空航天大学 Relative pose determination method based on laser ranging imager
WO2012140770A1 (fr) * 2011-04-14 2012-10-18 三菱電機株式会社 Robot control apparatus
JP6108860B2 (ja) * 2013-02-14 2017-04-05 キヤノン株式会社 Robot system and method for controlling robot system
CN103337066B (zh) * 2013-05-27 2016-05-18 清华大学 Calibration method for 3D acquisition system


Also Published As

Publication number Publication date
TW201830272A (zh) 2018-08-16
JP6275345B1 (ja) 2018-02-07
JPWO2018142582A1 (ja) 2019-02-07
CN110267771A (zh) 2019-09-20

Similar Documents

Publication Publication Date Title
KR102532072B1 (ko) System and method for automatic hand-eye calibration of a vision system for robot motion
EP2981397B1 (fr) Robot system and calibration method
JP5850962B2 (ja) Robot system using visual feedback
EP2543482B1 (fr) Information processing apparatus and information processing method
JP7027299B2 (ja) Calibration and operation of a vision-based manipulation system
JP6815309B2 (ja) Actuation system and program
JP6324025B2 (ja) Information processing apparatus and information processing method
US20190015989A1 (en) Robot Control Device, Robot, Robot System, And Calibration Method Of Camera
KR101857472B1 (ko) Camera calibration method and system therefor
CN111801198A (zh) Hand-eye calibration method and system, and computer storage medium
CN113910219A (zh) Motion arm system and control method
WO2023134237A1 (fr) Coordinate system calibration method, apparatus and system for a robot, and medium
Melchiorre et al. Collison avoidance using point cloud data fusion from multiple depth sensors: a practical approach
JP2019155556A (ja) Robot control device, robot, robot system, and camera calibration method
CN111612794A (zh) High-precision three-dimensional pose estimation method and system for components based on multiple 2D vision
Farag et al. Grasping and positioning tasks for selective compliant articulated robotic arm using object detection and localization: Preliminary results
CN116766194A (zh) Disc-type workpiece positioning and grasping system and method based on binocular vision
CN115401685A (zh) Hand-eye calibration of a camera-guided apparatus
JP6275345B1 (ja) Conversion coefficient calculation device, conversion coefficient calculation method, and conversion coefficient calculation program
US20230123629A1 (en) 3d computer-vision system with variable spatial resolution
US20230100182A1 (en) Alignment Of A Radar Measurement System With A Test Target
EP3985347A1 (fr) Cloud observation device, cloud observation method, and program
JP2005186193A (ja) Robot calibration method and three-dimensional position measurement method
Korak et al. Optical tracking system
Hu et al. Simultaneous hand-eye-workspace and camera calibration using laser beam projection

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2017536596

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17894784

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17894784

Country of ref document: EP

Kind code of ref document: A1