WO2022208963A1 - Calibration device for controlling a robot - Google Patents

Calibration device for controlling a robot

Info

Publication number
WO2022208963A1
Authority
WO
WIPO (PCT)
Prior art keywords
posture
unit
robot
marker
calibration
Prior art date
Application number
PCT/JP2021/040394
Other languages
English (en)
Japanese (ja)
Inventor
毅 北村
聡 笹谷
大輔 浅井
信博 知原
Original Assignee
Hitachi, Ltd. (株式会社日立製作所)
Priority date
Filing date
Publication date
Application filed by Hitachi, Ltd. (株式会社日立製作所)
Publication of WO2022208963A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 - Controls for manipulators
    • B25J13/08 - Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B25 - HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J - MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J19/00 - Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
    • B25J19/02 - Sensing devices
    • B25J19/04 - Viewing devices

Definitions

  • the present invention relates to a calibration device for robot control.
  • In calibration, a target marker with a known shape (hereinafter also referred to as a "marker") is attached to the robot, the robot is photographed in various postures by an imaging unit, and the captured images are used together with the corresponding postures of the robot.
  • Because many postures must be taught and photographed, the man-hours required for calibration can become enormous.
  • Mechanisms for reducing the calibration man-hours have therefore attracted increasing attention.
  • Japanese Patent Laid-Open No. 2002-200000 describes, as a solution for "easily performing calibration", a robot control system comprising: a measurement unit that measures, based on an image captured by an imaging unit, the three-dimensional coordinates of an arbitrary object existing within the field of view; a command generation unit that generates a command for positioning the action part of the robot according to a pre-calculated correspondence relationship between the measured three-dimensional coordinates and the position and orientation of the action part of the robot; a calibration execution unit that executes calibration for calculating the correspondence relationship; and a setting reception unit that receives the setting of a calibration area, which is the area where a reference object associated with the action part of the robot is placed during the calibration.
  • In this technique, a space (calibration area) in which the marker is moved during calibration is defined based on information about the robot captured by the imaging unit, and the marker is automatically arranged in that space, which reduces the man-hours required for calibration.
  • In the technique described in Patent Document 1, since the marker is moved within the calibration area whose setting is received by the setting reception unit, the man-hours required for teaching the posture set of the robot can be reduced and calibration can be executed.
  • However, Patent Document 1 does not take into account the occlusion of the marker (hereinafter also referred to as "shielding") that may occur when the robot is moved to various postures, which causes the following problem.
  • The present application includes a plurality of means for solving the above problems.
  • One of them is a calibration device for controlling a robot, comprising: an imaging unit that captures images from a predetermined position; a marker that is attached to the robot and is displaced according to the movement of the robot; a posture set generation unit that generates a posture set including a plurality of postures of the robot that enable the imaging unit to capture the marker; an unshielded posture extraction unit that extracts, from the posture set, a plurality of postures in which the marker is not shielded; and a calibration execution unit that estimates coordinate transformation parameters for transforming between the coordinate system of the robot and the coordinate system of the imaging unit based on the plurality of postures extracted by the unshielded posture extraction unit.
  • FIG. 2 is a diagram showing a coordinate system and coordinate transformation of a physical device in this embodiment
  • FIG. 11 is a functional block diagram of a posture set generator
  • FIG. 7 is a flow chart showing a procedure for obtaining parameters by a parameter obtaining unit
  • FIG. 4 is a flow chart showing a procedure for generating and saving a posture set
  • FIG. 5 is a diagram for explaining a method of generating a teaching space by a teaching space generation unit;
  • FIG. 5 is a diagram illustrating a method of dividing a teaching space by a teaching space generation unit;
  • FIG. 5 is a diagram for explaining a teaching position generation method by a teaching position generation unit;
  • FIG. 10 is a diagram showing how a marker looks when the marker is placed at the teaching position in each small space in the initial posture;
  • FIG. 10 is a diagram showing a teaching posture of a marker generated by a teaching posture generation unit
  • A diagram showing an example of the simulator constructed by the simulator construction unit
  • FIG. 9 is a flow chart showing a processing procedure of a non-physical interference attitude extraction unit
  • FIG. 4 is a top view of the posture of the robot when it actually performs picking work.
  • FIG. 3 is a side view of the posture of the robot when it actually performs picking work.
  • FIG. 9 is a flow chart showing a processing procedure of a non-shielding posture extraction unit
  • FIG. 10 is a schematic diagram showing an operation example of a non-shielding posture extraction unit
  • FIG. 4 is a functional block diagram of a posture set evaluation unit
  • FIG. 5 is a flow chart showing a processing procedure of a calibration execution unit
  • FIG. 10 is a diagram showing a configuration example of a calibration device for robot control according to a second embodiment
  • FIG. 1 is a diagram showing a configuration example of a calibration device for robot control according to the first embodiment.
  • As the work performed by the robot, picking work for moving a workpiece to a predetermined position is assumed.
  • As the robot, a robot arm that is an articulated robot (hereinafter also simply referred to as the "robot") is assumed.
  • As the marker, a calibration board in which a pattern with known dimensions is printed on a planar jig is assumed.
  • a monocular camera is assumed as an example of the imaging unit.
  • a wall and a conveyor are assumed as examples of environmental installation objects installed around the robot and around the imaging unit.
  • the content assumed in this embodiment is merely an example, and the configuration of the calibration device for robot control is not limited to the example assumed in this embodiment.
  • the calibration device 2 includes a control device 1, a real device 2A, and a calibration control device 2B.
  • the control device 1 centrally controls the entire calibration device 2 . Further, the control device 1 controls the motion of the robot 30 and the motion of the imaging unit 32 in the physical device 2A.
  • Although the control device 1 and the calibration control device 2B are shown separately in FIG. 1, the functions of the calibration control device 2B can be realized by the computer hardware resources of the control device 1.
  • the real device 2A has a robot 30 configured by a robot arm, a marker 31 attached to the robot 30, and an imaging unit 32 that measures the work space of the robot 30 and its surroundings. Also, the physical device 2A has a wall 33A and a conveyor 33B as an example of the environmental installation objects 33 installed around the robot 30 and around the imaging unit 32 .
  • The calibration control device 2B includes: prior knowledge 21; a generation parameter 22; a posture set generation unit 23 that generates a posture set of the robot 30 using the prior knowledge 21 and the generation parameter 22; a simulator construction unit 24 that constructs a three-dimensional simulator (hereinafter also simply referred to as the "simulator") using the prior knowledge 21; a non-physical interference posture extraction unit 25 that extracts, from the posture set generated by the posture set generation unit 23, a plurality of postures (posture set) in which the robot 30 and the marker 31 do not physically interfere with any part of the physical device 2A; a non-shielding posture extraction unit 26 that extracts, from those postures, a plurality of postures in which the marker 31 is not shielded; a posture set evaluation unit 27 that evaluates the posture set extracted by the non-shielding posture extraction unit 26 from the viewpoint of calibration accuracy; a parameter update unit 28 that updates the generation parameter 22; and a calibration execution unit 29 that executes calibration when the posture set evaluation unit 27 gives a high evaluation.
  • the posture set of the robot 30 is a set of multiple postures (posture data) of the robot 30 .
  • The prior knowledge 21 is a database having design information including the shape of the physical device 2A, information capable of specifying the space in which the robot 30 works (hereinafter also referred to as the "work space"), an evaluation value of the imaging unit 32, and parameter information such as threshold values used in each functional unit.
  • the design information of the physical device 2A is information including the shape, specifications and layout of the three-dimensional model of the physical device 2A.
  • the evaluation value of the image pickup unit 32 is an evaluation value obtained by an inspection performed when the image pickup unit 32 is received or shipped, and represents the actual performance of the image pickup unit 32 with respect to product specifications and the degree of image distortion.
  • the generation parameter 22 is a database having parameters that determine the number and density of poses included in the pose set of the robot 30 .
  • the posture set generation unit 23 receives the prior knowledge 21 and the generation parameters 22 as input, generates a posture set of the robot 30 , and stores the generated posture set in the control device 1 .
  • the posture of the robot 30 included in the posture set generated by the posture set generation unit 23 is a posture that enables the imaging unit 32 to image (capture) the marker 31 .
  • the simulator construction unit 24 uses the prior knowledge 21 to construct a three-dimensional simulator that enables virtual motion of the robot 30 and imaging using the imaging unit 32 .
  • The non-physical interference posture extraction unit 25 reads the posture set generated by the posture set generation unit 23 from the control device 1 and moves the robot to each posture on the three-dimensional simulator constructed by the simulator construction unit 24, thereby extracting a plurality of postures in which the robot 30 and the marker 31 do not physically interfere with either the robot 30 itself or the environmental installation objects 33.
  • the non-physical interference posture extraction unit 25 stores the plurality of extracted postures in the control device 1 as a set of postures without physical interference.
  • The non-shielding posture extraction unit 26 reads the posture set extracted by the non-physical interference posture extraction unit 25 from the control device 1 and moves the robot to each posture on the three-dimensional simulator constructed by the simulator construction unit 24, thereby extracting a plurality of postures in which the marker 31 is not shielded from the imaging unit 32.
  • the non-shielding posture extraction unit 26 stores the plurality of extracted postures in the control device 1 as a posture set.
  • In this embodiment, the storage destination of the posture set generated by the posture set generation unit 23 and of the posture sets extracted by the non-physical interference posture extraction unit 25 and the non-shielding posture extraction unit 26 is the control device 1.
  • the storage destination of the posture set is not limited to the control device 1, and may be, for example, a storage unit provided in the calibration control device 2B.
  • The posture set evaluation unit 27 reads the posture set extracted by the non-shielding posture extraction unit 26 from the control device 1, determines whether or not the estimation accuracy of calibration using the posture set is equal to or greater than a predetermined value set in advance, and evaluates the posture set based on this determination result.
  • The parameter update unit 28 updates the generation parameter 22 when the posture set evaluation unit 27 determines that calibration accuracy equal to or higher than the predetermined value cannot be expected.
  • The calibration execution unit 29 controls the robot 30 and the imaging unit 32 based on the posture set (a plurality of postures) for which the posture set evaluation unit 27 has determined that calibration accuracy equal to or higher than the predetermined value can be expected, and estimates coordinate transformation parameters for transforming between the coordinate system of the robot 30 and the coordinate system of the imaging unit 32.
  • Hereinafter, the control device 1, the real device 2A, the prior knowledge 21, the posture set generation unit 23, the simulator construction unit 24, the non-physical interference posture extraction unit 25, the non-shielding posture extraction unit 26, the posture set evaluation unit 27, the parameter update unit 28, and the calibration execution unit 29 will be described in detail.
  • FIG. 2 is a block diagram showing a configuration example of the control device 1 in this embodiment.
  • As computer hardware resources for overall control of the calibration device 2, the control device 1 includes a CPU 11, a bus 12 for transmitting commands from the CPU 11, a ROM 13, a RAM 14, a storage device 15, a network I/F (I/F is an abbreviation for interface; the same applies hereinafter) 16, an imaging I/F 17 for connecting the imaging unit 32, a screen display I/F 18 for screen output, and an input I/F 19 for external input. That is, the control device 1 can be configured by a general computer device.
  • the storage device 15 stores a program 15A for executing each function, an OS (operating system) 15B, a three-dimensional model 15C, and parameters 15D such as a database.
  • a robot controller 16A is connected to the network I/F 16 .
  • the robot control device 16A is a device that controls and operates the robot 30 .
  • the robot 30 is a picking robot that performs picking work.
  • a picking robot is composed of an articulated robot.
  • the robot 30 may be of any type as long as it has a plurality of joints.
  • the robot 30 is installed on a single-axis slider, and by increasing the degree of freedom of movement of the robot 30 with this single-axis slider, it is possible to perform work in a wide space.
  • As the marker 31, a calibration board on which a pattern with known dimensions is printed on a planar jig is used. When such a marker 31 is used, the position of the marker 31 in three-dimensional space can be identified by analyzing the data measured by the imaging unit 32.
  • As long as the three-dimensional position of the marker 31 can be identified from the data measured by the imaging unit 32, a spherical jig or a jig with a special shape, for example, can also be used as the marker 31.
  • the marker 31 is attached to the arm tip of the robot 30 which is a robot arm.
  • the attachment position of the marker 31 is not limited to the tip of the arm, and may be on the robot arm as long as the position satisfies the condition that the marker 31 is displaced according to the motion of the robot 30 .
  • a monocular camera is used as the imaging unit 32 .
  • the imaging unit 32 is not limited to a monocular camera, and may be configured by, for example, a ToF (Time of Flight) camera, a stereo camera, or the like.
  • the data obtained by the imaging unit 32 is data such as images and point clouds that can specify the three-dimensional position of the marker 31 .
  • the imaging unit 32 photographs the work space of the robot 30 and the like from a predetermined position.
  • the mounting position of the imaging unit 32 can be set at any location, such as the ceiling or wall of the building where the robot 30 works.
  • FIG. 3 is a diagram showing the coordinate system and coordinate conversion of the physical device 2A in this embodiment.
  • The coordinate systems of the physical device 2A include a base coordinate system C1 whose origin is the base of the arm of the robot 30, an arm coordinate system C2 whose origin is the tip of the arm of the robot 30, a marker coordinate system C3 whose origin is the center of the marker 31, and a camera coordinate system C4 whose origin is the optical center of the imaging unit 32.
  • Each coordinate system is a three-dimensional coordinate system.
  • The coordinate transformations of the physical device 2A include a base/camera transformation matrix M1 representing the coordinate transformation between the base coordinate system C1 and the camera coordinate system C4, a marker/camera transformation matrix M2 representing the coordinate transformation between the marker coordinate system C3 and the camera coordinate system C4, an arm/base transformation matrix M3 representing the coordinate transformation between the arm coordinate system C2 and the base coordinate system C1, and an arm/marker transformation matrix M4 representing the coordinate transformation between the arm coordinate system C2 and the marker coordinate system C3.
  • The base coordinate system C1 corresponds to the coordinate system of the robot 30, and the camera coordinate system C4 corresponds to the coordinate system of the imaging unit 32.
  • the base/camera transformation matrix M1 corresponds to a coordinate transformation parameter for transforming the coordinate system of the robot 30 and the coordinate system of the imaging unit 32 .
  • Regarding this coordinate transformation, when the rotation matrix of the base/camera transformation matrix M1 is Rca and its translation is tca, a point C(Xc, Yc, Zc) in the camera coordinate system C4 can be transformed into a point P(Xr, Yr, Zr) in the base coordinate system C1 as shown in the formula below.
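The formula itself is not reproduced in this extract. From the stated definitions of the rotation matrix Rca and the translation tca, it can be reconstructed as:

$$
\begin{pmatrix} X_r \\ Y_r \\ Z_r \end{pmatrix}
= R_{ca}\begin{pmatrix} X_c \\ Y_c \\ Z_c \end{pmatrix} + t_{ca}
$$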
  • the base-camera conversion matrix M1 can be obtained as one of the design values in the physical device 2A.
  • However, if the design value differs from the actual installation, an error may occur in the position of the robot 30.
  • One possible cause of such an error is, for example, misalignment of the mounting position of the imaging unit 32 in the physical device 2A.
  • For example, even if the mounting position of the imaging unit 32 is designed so that the optical axis (z-axis) of the imaging unit 32 is parallel to the vertical direction, a slight positional deviation may occur during actual mounting.
  • Calibration is performed to accurately obtain the base/camera transformation matrix M1 so that such a positional deviation does not cause an error in the position of the robot 30.
  • the design value of the base/camera transformation matrix M1 is included in the prior knowledge 21. Therefore, the design value of the base-camera conversion matrix M1 can be obtained from the prior knowledge 21 . Also, by executing the calibration, it is possible to obtain the actual values of the base/camera transformation matrix M1 that reflects the positional deviation and the like.
  • Calibration in this embodiment means estimating the base/camera transformation matrix M1 from a plurality of data pairs of the marker/camera transformation matrix M2 and the arm/base transformation matrix M3 obtained when the robot 30 is moved to various postures. The estimation of the base/camera transformation matrix M1 is performed computationally.
  • the marker-camera conversion matrix M2 can be obtained by analyzing an image obtained by capturing the marker 31 by the imaging unit 32 .
  • the arm-base conversion matrix M3 can be obtained by calculation from the encoder values of the robot 30 .
  • the joint portion of the robot 30 is provided with a motor (not shown) for driving the joint and an encoder for detecting the rotation angle of the motor, and the encoder value means the output value of the encoder.
  • The arm/marker transformation matrix M4 is either unknown or its design value can be obtained from the prior knowledge 21. In this embodiment, since the design value of the arm/marker transformation matrix M4 is included in the prior knowledge 21, it can be obtained from the prior knowledge 21.
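As a rough illustration of how M1 could be estimated from the quantities above, the following Python sketch assumes 4x4 homogeneous matrices oriented so that M1 · M2_i = M3_i · M4 holds for each posture i (M1: camera to base, M2: marker to camera, M3: arm to base, M4: marker to arm), which is one consistent reading of the conventions in this extract. The patent estimates M1 by optimization; this sketch simply averages per-posture closed-form estimates, and all function and variable names are illustrative.

```python
import numpy as np

def estimate_base_camera_matrix(M2_list, M3_list, M4):
    """Rough per-posture estimate of the base/camera matrix M1 (sketch only).

    Assumes 4x4 homogeneous transforms such that, for every posture i,
    M1 @ M2_i = M3_i @ M4, i.e. M1 = M3_i @ M4 @ inv(M2_i).
    """
    estimates = [M3 @ M4 @ np.linalg.inv(M2) for M2, M3 in zip(M2_list, M3_list)]

    # Average the translations and project the averaged rotation back onto SO(3).
    t_mean = np.mean([E[:3, 3] for E in estimates], axis=0)
    R_mean = np.mean([E[:3, :3] for E in estimates], axis=0)
    U, _, Vt = np.linalg.svd(R_mean)
    R_proj = U @ np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))]) @ Vt

    M1 = np.eye(4)
    M1[:3, :3] = R_proj
    M1[:3, 3] = t_mean
    return M1
```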
  • In this embodiment, the origin of each coordinate system, the orientation of the coordinate systems, and the coordinate transformations are set as described above, but the present invention is not limited to this example.
  • the prior knowledge 21 is a database having parameters (1) to (11) below.
  • the function using the above prior knowledge 21 will be described, but the function may be implemented using only part of it.
  • the posture set generator 23 generates a posture set of the robot 30 based on the parameters obtained from the prior knowledge 21 and the parameters obtained from the generated parameters 22 . Also, the posture set generator 23 stores the generated posture set in the control device 1 .
  • FIG. 4 is a functional block diagram of the posture set generator 23.
  • The posture set generation unit 23 includes a parameter acquisition unit 231, a teaching space generation unit 232, a teaching position generation unit 233, a teaching posture generation unit 234, a coordinate system conversion unit 235, and a posture set storage unit 236.
  • the parameter acquisition unit 231 acquires parameters necessary for generating a set of postures from the prior knowledge 21 and the generation parameters 22 .
  • the teaching space generation unit 232 determines a space in which the marker 31 is placed during calibration, and divides this space into a plurality of small spaces. Also, the teaching space generator 232 assigns an index to each small space.
  • the teaching position generation unit 233 generates positions at which the markers 31 are arranged in each small space, that is, teaching positions, based on the camera coordinate system C4.
  • the teaching posture generation unit 234 generates the posture of the marker 31 at each teaching position generated by the teaching position generation unit 233, that is, the teaching posture.
  • The coordinate system conversion unit 235 transforms the set of poses of the marker 31 expressed in the camera coordinate system C4 into a set of poses of the robot 30 expressed in the base coordinate system C1, based on the design values of the base/camera transformation matrix M1 included in the prior knowledge 21.
  • the posture set storage unit 236 stores the posture set generated by the teaching posture generation unit 234 and the posture set transformed by the coordinate system transformation unit 235 in the control device 1 . The configuration of each part of the posture set generator 23 will be described in more detail below.
  • FIG. 5 is a diagram showing a list of parameters acquired by the parameter acquisition unit 231.
  • The parameters acquired by the parameter acquisition unit 231 include the design values of the base/camera transformation matrix M1 (rotation matrix Rca, translation matrix tca), the measurement range of the imaging unit 32 (angle of view, resolution, optical blur), the distortion information of the imaging unit 32 (distortion evaluation), the work space of the robot 30 (three-dimensional space information), the shape of the marker 31 (board, sphere, size), and the resolution in each axis direction of the posture set (X1, Y1, Z1).
  • FIG. 6 is a flowchart showing a parameter acquisition procedure by the parameter acquisition unit 231.
  • the parameter acquisition unit 231 acquires the design values of the base-camera conversion matrix M1 from the prior knowledge 21 (step S1).
  • the parameter acquisition unit 231 acquires the rotation matrix Rca and the translation matrix tca as the design values of the base-camera conversion matrix M1.
  • the parameter acquisition unit 231 acquires the measurement range of the imaging unit 32 and the distortion information of the imaging unit 32 from the prior knowledge 21 (step S2). At this time, the parameter acquisition unit 231 acquires the angle of view, resolution, and optical blur as the measurement range of the imaging unit 32 and acquires the distortion evaluation value as the distortion information of the imaging unit 32 .
  • the parameter acquisition unit 231 acquires information defining the workspace of the robot 30 from the prior knowledge 21 (step S3). At this time, the parameter acquisition unit 231 acquires three-dimensional space information indicating the work space as information defining the work space of the robot 30 . Next, the parameter acquisition unit 231 acquires shape data of the marker 31 (step S4). At this time, the parameter acquisition unit 231 acquires the type (board, sphere) and size data of the marker 31 as the shape data of the marker 31 . Next, the parameter acquisition unit 231 acquires the resolution (X1, Y1, Z1) in each axial direction of the posture set from the generation parameter 22 (step S5).
  • the number of poses (X1, Y1, Z1) on each axis is acquired as a parameter that determines the resolution in each axis direction.
  • The greater the values of X1, Y1, and Z1, that is, the greater the number of postures to be created along each axis, the higher the resolution in each axis direction.
  • the parameters X1, Y1, and Z1 correspond to parameters that determine the number and density of poses included in the pose set.
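Purely for illustration, the parameters acquired in steps S1 to S5 could be grouped into a structure like the following; the field names are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class GenerationParameters:
    """Parameters gathered in steps S1-S5 (illustrative grouping only)."""
    R_ca: np.ndarray        # design rotation of the base/camera matrix M1
    t_ca: np.ndarray        # design translation of the base/camera matrix M1
    angle_of_view: float    # measurement range of the imaging unit 32
    resolution: tuple       # sensor resolution (width, height) in pixels
    optical_blur: float     # optical blur value/limit in pixels
    distortion_eval: float  # distortion evaluation value of the imaging unit 32
    workspace: dict         # three-dimensional work space of the robot 30
    marker_type: str        # 'board' or 'sphere'
    marker_size: float      # marker size
    X1: int                 # number of postures along the camera X axis
    Y1: int                 # number of postures along the camera Y axis
    Z1: int                 # number of postures along the camera Z axis
```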
  • FIG. 7 is a flow chart showing the procedure for generating a posture set by the teaching space generation unit 232, the teaching position generation unit 233, the teaching posture generation unit 234, and the coordinate system conversion unit 235, and for storing the posture set by the posture set storage unit 236.
  • the teaching space generator 232 generates a teaching space in which the markers 31 are arranged (step S6).
  • the teaching space generated by the teaching space generation unit 232 is a three-dimensional space based on the camera coordinate system C4.
  • the teaching position generating section 233 sets the position where the marker 31 is arranged in each small space as a teaching position (step S8). As a result, the teaching positions of the marker 31 are generated for the number of small spaces.
  • the teaching posture generation unit 234 initially sets the teaching posture of the marker 31 at each teaching position so that the marker 31 faces the imaging unit 32 (step S9). In the following description, the initially set teaching orientation of the marker 31 will be referred to as the initial orientation.
  • Next, the teaching posture generation unit 234 generates the teaching posture of the marker 31 by rotating the posture of the marker 31 from the initial posture about each axis direction of the camera coordinate system C4 by an angle between 0 degrees and θ (threshold) degrees, determined based on the index assigned to each small space or randomly (step S10). As a result, the teaching postures of the marker 31 are generated for the number of small spaces.
  • Next, the coordinate system conversion unit 235 converts the coordinates of the teaching postures of the marker 31 in the camera coordinate system C4, generated by the teaching posture generation unit 234 in step S10, into coordinates in the base coordinate system C1 using the design values of the base/camera transformation matrix M1 (step S11).
  • the teaching posture of the marker 31 based on the camera coordinate system C4 is converted into the teaching posture of the robot 30 based on the base coordinate system C1.
  • the teaching postures of the robot 30 are generated for the number of small spaces.
  • Finally, the posture set storage unit 236 saves, in the control device 1, the posture set (a plurality of teaching postures) of the marker 31 generated by the teaching posture generation unit 234 in step S10 and the posture set (a plurality of teaching postures) of the robot 30 converted by the coordinate system conversion unit 235 in step S11 (step S12).
  • the controller 1 stores a set of orientations of the marker 31 based on the camera coordinate system C4 and a set of orientations of the robot 30 based on the base coordinate system C1.
  • FIG. 8 is a diagram for explaining the teaching space generation method by the teaching space generation unit 232. The illustrated generation method is applied to step S6 in FIG. 7 above.
  • The teaching space is defined as the common area where the space in which the robot 30 works and the space that the imaging unit 32 can accurately capture (measure) overlap.
  • The work space of the robot 30 is determined based on the camera coordinate system C4. As a method of determining the work space, for example, when the work performed by the robot 30 is picking work, the space in which the marker 31 is positioned when the robot 30 performs the work of gripping a workpiece (not shown) can be determined in advance.
  • the reason why the work space of the robot 30 is used as the teaching space T1 is as follows.
  • movement errors systematically occur due to manufacturing errors of the robot 30 or the like.
  • the appearance of the movement error differs depending on the position in space. Therefore, by setting the posture taken by the robot 30 during calibration to be the same as the posture during work, it is possible to use data with the same movement error as during work for calibration.
  • the space that the imaging unit 32 can shoot is determined by the angle of view of the imaging unit 32, but the space that the imaging unit 32 can shoot with accuracy is limited to a narrower range than the shootable space.
  • the size of the space in the Z-axis direction in the camera coordinate system C4 is specified so that the value of the optical blur of the imaging unit 32 is equal to or less than the threshold.
  • the spatial dimension in the Z-axis direction is specified within a range in which the optical blur value is 1 px or less.
  • The size of the space in the X-axis and Y-axis directions of the camera coordinate system C4 is specified as the range in which, when distortion correction is applied to the image captured by the imaging unit 32 (hereinafter also referred to as the "captured image"), the luminance difference between the image before distortion correction and the image after distortion correction is equal to or less than a threshold. As a result, it is possible to ensure that the marker 31 is placed in a space where the captured image is less likely to be distorted, which reduces errors in the calibration described later.
  • In the illustrated example, the size of the space in the Z-axis direction of the camera coordinate system C4 is defined by the work space of the robot 30. This is because, when the sizes of the spaces in the Z-axis direction are compared, the work space of the robot 30 is narrower than the space in which the optical blur value is 1 px or less.
  • On the other hand, the size of the space in the X-axis and Y-axis directions of the camera coordinate system C4 is defined by the space that the imaging unit 32 can accurately capture. This is because the space that the imaging unit 32 can accurately capture, that is, the space in which the luminance difference based on the distortion evaluation is equal to or less than the threshold, is narrower than the work space of the robot 30.
  • the teaching space T1 is generated by the above method.
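A minimal sketch of how the teaching space T1 could be derived as the intersection of the robot work space and the accurately measurable region, assuming a per-depth blur model and a precomputed low-distortion X/Y region are available; the interfaces are illustrative, not the patent's.

```python
import numpy as np

def teaching_space(workspace, accurate_xy, blur_px, blur_limit=1.0):
    """Teaching space T1 as the intersection of the robot work space and the
    region the camera images accurately, all in camera coordinates C4.

    workspace   : dict with 'x', 'y', 'z' -> (min, max)
    accurate_xy : (x_range, y_range) where the distortion-based luminance
                  difference stays below its threshold (assumed precomputed)
    blur_px     : callable z -> optical blur in pixels (hypothetical model)
    """
    # Z: depths inside the work space where the optical blur is <= blur_limit.
    zmin, zmax = workspace['z']
    zs = [z for z in np.linspace(zmin, zmax, 200) if blur_px(z) <= blur_limit]
    z_range = (min(zs), max(zs)) if zs else None

    # X, Y: intersect the work space with the low-distortion region.
    def intersect(a, b):
        lo, hi = max(a[0], b[0]), min(a[1], b[1])
        return (lo, hi) if lo < hi else None

    x_range = intersect(workspace['x'], accurate_xy[0])
    y_range = intersect(workspace['y'], accurate_xy[1])
    return x_range, y_range, z_range
```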
  • FIG. 9 is a diagram for explaining how the teaching space generation unit 232 divides the teaching space.
  • the illustrated division method is applied to step S7 in FIG. 7 above.
  • parameters X1, Y1, and Z1 given as the resolution in each axial direction of the posture set are used.
  • an index (X, Y, Z) is assigned to each small space.
  • FIG. 10 is a diagram for explaining the teaching position generation method by the teaching position generation unit 233. The illustrated generation method is applied to step S8 in FIG. 7 above.
  • For example, for the small space with index (X, Y, Z) = (1, 1, 3), a position moved by half the size of the small space, that is, the center position of the small space with index (1, 1, 3), can be generated as the teaching position. The same applies to the small spaces to which the other indexes are assigned.
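The division into small spaces and the choice of their centers as teaching positions (steps S7 and S8) could be sketched as follows; the interfaces and 1-based index convention are illustrative only.

```python
import numpy as np

def teaching_positions(x_range, y_range, z_range, X1, Y1, Z1):
    """Divide the teaching space T1 into X1*Y1*Z1 small spaces (camera frame C4),
    index each one, and use each center as the teaching position of the marker."""
    xs = np.linspace(x_range[0], x_range[1], X1 + 1)
    ys = np.linspace(y_range[0], y_range[1], Y1 + 1)
    zs = np.linspace(z_range[0], z_range[1], Z1 + 1)

    positions = {}
    for ix in range(X1):
        for iy in range(Y1):
            for iz in range(Z1):
                center = np.array([
                    (xs[ix] + xs[ix + 1]) / 2,
                    (ys[iy] + ys[iy + 1]) / 2,
                    (zs[iz] + zs[iz + 1]) / 2,
                ])
                # Index (X, Y, Z) is kept 1-based, as in the figures.
                positions[(ix + 1, iy + 1, iz + 1)] = center
    return positions
```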
  • FIG. 11 is a diagram showing how the marker looks when the marker is placed at the teaching position in each small space in the initial posture.
  • For the small spaces with indexes (X, Y, Z) = (1, 1, 1), (1, 2, 1), (1, 3, 1), and so on, it shows how the marker 31 looks when the marker 31 is provisionally placed at the teaching position generated by the teaching position generation unit 233 and oriented so as to face the imaging unit 32.
  • the marker 31 is arranged in any small space such that the Z axis of the camera coordinate system C4 is oriented exactly opposite to the Z axis of the marker coordinate system C3. Therefore, the appearance of the marker 31 is the same in all small spaces.
  • the teaching posture generation unit 234 generates a teaching posture by rotating the posture of the marker 31 in each axis direction of the marker coordinate system C3 from the initial posture set in step S9.
  • FIG. 12 is a diagram showing the teaching posture of the marker 31 generated by the teaching posture generation unit 234.
  • the indexes (X, Y, Z) of each small space in FIG. 12 correspond to the indexes (X, Y, Z) of each small space in FIG.
  • the taught orientation of the marker 31 in the small space with index (3,3,1) is generated by rotating the marker 31 about the x-axis of the marker coordinate system C3 with respect to the initial orientation.
  • the magnitude of the rotation angle of each axis is from 0 degrees to ⁇ (threshold) degrees, and is determined based on an index, rule-based or randomly.
  • The posture set, which is the plurality of postures generated by the teaching posture generation unit 234, is made up of combinations of postures in which the marker 31 is rotated in various directions and postures in which the orientation of the marker 31 is the same and only the translation differs.
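A sketch of the teaching posture generation in steps S9 and S10, continuing the illustrative interfaces above: each marker starts facing the camera and is rotated about each axis by an angle between 0 and θ degrees (here drawn randomly; an index-based rule would work equally well). Representing "facing the camera" as a 180-degree rotation about the x axis is an assumption, not taken from the patent.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def teaching_postures(positions, theta_deg, rng=None):
    """Generate one teaching posture per small space (sketch only)."""
    rng = rng or np.random.default_rng(0)
    postures = {}
    for index, center in positions.items():
        # Initial posture: marker Z axis opposed to the camera Z axis.
        initial = R.from_euler('x', 180, degrees=True)
        # Rotation angles in [0, theta] per axis, here drawn at random.
        angles = rng.uniform(0.0, theta_deg, size=3)
        taught = initial * R.from_euler('xyz', angles, degrees=True)
        postures[index] = {'position': center, 'rotation': taught.as_matrix()}
    return postures
```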
  • In this way, the posture set can be generated automatically in the teaching space that satisfies both the work space of the robot 30 and the space that the imaging unit 32 can accurately capture.
  • the rotation of the base/camera conversion matrix M1 is obtained in the form of the rotation matrix Rca, but a quaternion or the like may be used.
  • The shape of the teaching space T1 generated by the teaching space generation unit 232 is not limited to the illustrated shape; for example, the teaching space T1 may have a shape such as a rectangular parallelepiped.
  • If the distortion evaluation value of the imaging unit 32 used in this embodiment cannot be obtained as a parameter for generating the teaching space T1, the user may specify the shape and size of the teaching space in advance.
  • the teaching space T1 is divided based on the camera coordinate system C4 as a method of dividing the teaching space, but the teaching space T1 may be divided based on an arbitrary coordinate system.
  • the teaching position generated by the teaching position generation unit 233 may be set to an arbitrary position within the small space, for example, the vertex of the small space.
  • In addition, although the teaching posture generation unit 234 generates the teaching posture by setting the rotation angle of the marker 31 about each axis within the threshold, the threshold does not necessarily have to be set.
  • The posture set storage unit 236 stores the posture set in the control device 1; however, the present invention is not limited to this, and the posture set may be stored in a memory (not shown), from which it can also be loaded.
  • the simulator construction unit 24 has a function of virtually operating the robot 30 by constructing a simulator based on the prior knowledge 21 and generating an image (captured image) captured by the imaging unit 32 .
  • FIG. 13 is a diagram showing an example of a simulator constructed by the simulator construction unit 24. In FIG. 13, information such as a three-dimensional model including the design information, specifications, and layout of the real device 2A is acquired from the prior knowledge 21, and based on the acquired information, the robot 30, the marker 31, the imaging unit 32, and the environmental installation objects 33 are virtually placed on the simulator.
  • By using the joint information, such as the length of each link and the specifications, obtained from the prior knowledge 21 as the shape information of the robot 30, the robot 30 can be operated virtually. In addition, by operating the robot 30 virtually, it is possible to apply the same trajectory planning algorithm used on the actual machine.
  • It is also possible to generate the pixel values of each image captured by the imaging unit 32 on the simulator.
  • the color information of the robot 30 and the environment installation object 33 can be easily changed on the simulator. For example, it is possible to change the RGB values representing the colors of the robot 30 and the environment installation object 33 to (255, 0, 0), etc., and generate an image taken on a simulator.
  • In this embodiment, the layout information including the design values of the base/camera transformation matrix M1 is acquired from the prior knowledge 21, and the objects are placed on the simulator based on the acquired layout information; however, this does not have to be the case.
  • rough values indicating the size and arrangement of objects in the physical device 2A may be estimated from images captured by the imaging unit 32, and the objects may be arranged on the simulator based on the estimation results.
  • In this embodiment, the shape is reproduced using the three-dimensional model of the physical device 2A, but the simulator may instead hold only the design value information of each coordinate system.
  • When the imaging unit 32 is a stereo camera or a ToF camera capable of three-dimensional measurement, or when a three-dimensional shape can be obtained by applying machine learning to an image from a monocular camera, the three-dimensional shape of the real device 2A acquired by the imaging unit 32 may be used to construct the simulator.
  • FIG. 14 is a flow chart showing the processing procedure of the non-physical interference attitude extraction unit 25.
  • the non-physical interference posture extraction unit 25 extracts postures (non-physical interference postures) in which the units of the physical device 2A do not physically interfere with each other from the posture set generated by the posture set generation unit 23 .
  • the non-physical interference posture extraction unit 25 extracts the posture that does not cause physical interference by utilizing the simulator constructed by the simulator construction unit 24 .
  • Physical interference means that the robot 30 and the marker 31 come into contact with the robot 30, the imaging unit 32, the environmental installation object 33, and the like in the physical device 2A.
  • The non-physical interference posture extraction unit 25 determines that the trajectory plan has succeeded if the robot 30 and the marker 31 do not physically interfere with any part of the real device 2A, and that the trajectory plan has failed if there is physical interference.
  • When the non-physical interference posture extraction unit 25 determines in step S17 that the trajectory planning is successful, it holds the i-th posture in memory as a success pattern (step S18), and then holds success/failure information for the index of the small space in which the i-th posture was generated (step S19).
  • the success flag is held in association with the index of the small space in which the i-th posture is generated.
  • the index of the small space in which the i-th posture is generated is held in association with the failure flag.
  • When the non-physical interference posture extraction unit 25 determines in step S17 that the trajectory planning has failed, the processes in steps S18 and S19 are skipped.
  • When the trajectory planning fails, this corresponds to the case where it is determined that there is physical interference; when the trajectory planning succeeds, this corresponds to the case where it is determined that there is no physical interference.
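The extraction loop of FIG. 14 could be sketched as follows; plan_trajectory and has_collision are hypothetical stand-ins for the simulator's trajectory planner and collision check, and the 'index' key is an illustrative way to carry the small-space index with each posture.

```python
def extract_non_interfering_postures(posture_set, plan_trajectory, has_collision):
    """Keep only postures whose trajectory plan succeeds without collision.

    plan_trajectory(posture) -> trajectory or None   (simulator planner)
    has_collision(trajectory) -> True if the robot/marker overlaps any object
    """
    success_patterns = []
    index_results = {}
    for posture in posture_set:
        trajectory = plan_trajectory(posture)
        ok = trajectory is not None and not has_collision(trajectory)
        if ok:
            success_patterns.append(posture)       # step S18: success pattern
        index_results[posture['index']] = ok       # step S19: success/failure flag
    return success_patterns, index_results
```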
  • FIG. 15 is a bird's-eye view of the posture when the robot 30 actually performs the picking work.
  • When moving the marker 31 to destination 1, the robot 30 does not physically interfere with any part of the physical device 2A. Therefore, when the trajectory planning is instructed to move the marker 31 to destination 1, it succeeds and a success-pattern posture is obtained.
  • On the other hand, for the other destination shown in FIG. 15, the robot 30 physically interferes with the environmental installation object 33 (wall 33A), and the trajectory planning fails.
  • As a method of determining the presence or absence of physical interference, for example, for a trajectory candidate generated by the trajectory planning, it can be determined whether or not the three-dimensional positions of the robot 30 and of the environmental installation objects 33 and the like overlap when the robot 30 moves along that trajectory.
  • FIG. 16 is a side view of the posture when the robot 30 actually performs the picking work.
  • When moving the marker 31 to destination 3, the robot 30 does not physically interfere with any part of the physical device 2A. Therefore, when the trajectory planning is instructed to move the marker 31 to destination 3, it succeeds and a success-pattern posture is obtained.
  • On the other hand, when moving the marker 31 to destination 4, the robot 30 physically interferes with the environmental installation object 33 (conveyor 33B), and the trajectory planning fails.
  • In this embodiment, whether or not the trajectory planning is successful is determined by the presence or absence of physical interference between the robot 30 and the environmental installation objects 33, but the present invention is not limited to this.
  • For example, the user may add restrictions on the joint angles of the robot 30 and use them in the determination. Specifically, when moving the robot 30 to the destination, the trajectory planning may be determined to be successful if the joint angles of the robot 30 are equal to or less than an upper limit, and unsuccessful if they exceed the upper limit.
  • FIG. 17 is a flow chart showing the processing procedure of the non-shielding posture extraction unit 26.
  • The non-shielding posture extraction unit 26 extracts, from the posture set generated by the posture set generation unit 23 and extracted by the non-physical interference posture extraction unit 25, postures in which no shielding of the marker 31 occurs between the marker 31 and the imaging unit 32 (non-shielding postures).
  • the non-shielding posture extraction unit 26 extracts a posture in which the shielding of the marker 31 does not occur by utilizing the simulator constructed by the simulator construction unit 24 .
  • the non-shielding posture extraction unit 26 loads the posture set generated by the posture set generation unit 23 and extracted by the non-physical interference posture extraction unit 25 onto the memory of the control device 1 (step S23).
  • the non-shielding posture extraction unit 26 acquires the number N2 of postures included in the posture set (step S24).
  • the non-shielding posture extraction unit 26 instructs the trajectory plan to move the robot 30 to the i-th posture on the simulator (step S26).
  • the non-shielding posture extraction unit 26 generates a virtual captured image by the imaging unit 32 on the simulator (step S27). That is, the non-shielding posture extraction unit 26 virtually generates a photographed image obtained by the imaging unit 32 when the robot 30 is moved to the i-th posture.
  • the non-shielding posture extraction unit 26 estimates the marker-camera conversion matrix M2 by analyzing the generated captured image (step S28).
  • In analyzing the captured image, for example, when a pattern of a plurality of black circles with known sizes and positions is printed on the marker 31, the marker/camera transformation matrix M2 can be estimated by associating the known sizes and positions of the black circles with the sizes and positions of the black circles appearing in the above-mentioned virtual captured image. Estimating the marker/camera transformation matrix M2 essentially means estimating the position and orientation of the marker 31, that is, the three-dimensional position of the marker 31 in the three-dimensional camera coordinate system C4.
  • the non-shielding posture extraction unit 26 determines whether or not the estimation of the marker-camera conversion matrix M2 has succeeded (step S29). Then, when the estimation of the marker-camera conversion matrix M2 is successful, the non-shielding posture extraction unit 26 calculates the reliability of the estimation (step S30), and then determines whether or not the reliability is equal to or greater than the threshold. (step S31). Further, when the calculated reliability of the estimation is equal to or higher than the threshold, the non-shielding posture extraction unit 26 stores the i-th posture as a successful pattern in the memory (step S32), and then generates the i-th posture. Success/failure information for the index of the created small space is retained (step S33).
  • When holding success information for the index of the small space, the success flag is held in association with the index of the small space in which the i-th posture was generated. Likewise, when holding failure information, the index of the small space in which the i-th posture was generated is held in association with the failure flag.
  • When the non-shielding posture extraction unit 26 determines in step S29 that the estimation of the marker/camera transformation matrix M2 has failed, it skips the processing of steps S30 to S34; when it determines in step S31 that the reliability is not equal to or greater than the threshold, it skips the processing of steps S32 and S33.
  • When the estimation of the marker/camera transformation matrix M2 fails, or when the reliability of the estimation is below the threshold, this corresponds to the case where it is determined that the marker 31 is shielded. Conversely, when the marker/camera transformation matrix M2 is successfully estimated and the reliability of the estimation is equal to or higher than the threshold, this corresponds to the case where it is determined that the marker 31 is not shielded.
  • In step S28, there are cases where the marker/camera transformation matrix M2 can be estimated even when part of the marker 31 is hidden by shielding. In such cases, however, the accuracy of estimating the marker/camera transformation matrix M2 may be lower than when the entire marker 31 appears in the captured image, and it is not desirable to use a posture with low estimation accuracy of the marker/camera transformation matrix M2 for calibration.
  • Therefore, the non-shielding posture extraction unit 26 uses the fact that the relative posture between the marker 31 and the imaging unit 32 in each teaching posture is known on the simulator, and calculates, as the reliability of the estimation, the ratio between the area the marker 31 would occupy if it appeared in the captured image without shielding and the area of the marker 31 actually appearing in the captured image of the imaging unit 32 on the simulator. Alternatively, for example, the ratio between the number of feature points on the marker 31 obtained by analyzing the captured image and the total number of feature points may be calculated as the reliability. The non-shielding posture extraction unit 26 then determines whether or not the reliability calculated in this way is higher than a threshold determined in advance according to the type of the marker 31.
  • FIG. 18 is a schematic diagram showing an operation example of the non-shielding posture extraction unit 26.
  • A portion of the posture set loaded into the memory of the control device 1 in step S23 is shown inside the frame indicated by the dashed line on the left side of FIG. 18, and a portion of the posture set after the occlusion determination, extracted by the non-shielding posture extraction unit 26, is shown inside the frame indicated by the dashed-dotted line on the right side of FIG. 18.
  • the environmental installation object 33 is arranged so as to cover the upper, lower, and left sides of the image captured by the imaging unit 32 on the simulator.
  • In the posture shown in the captured image I1, the robot 30 partially shields the marker 31, and estimation of the position and posture of the marker 31 may fail. If it were possible to estimate the position and orientation of the marker 31 from the part of the marker 31 appearing in the captured image I1, the non-shielding posture extraction unit 26 would, as described above, calculate as the reliability of the estimation the ratio between the area the marker 31 would occupy if it appeared in the image without shielding and the area of the marker 31 actually appearing in the captured image of the imaging unit 32 on the simulator. In calculating this reliability, the positions and orientations of the camera coordinate system C4 and the marker 31 in a given posture of the robot 30 are known from the posture set generation unit 23.
  • Therefore, the set of pixel locations of the marker 31 on the image when the marker 31 appears in the image without obstruction (hereinafter referred to as the "pixel set") can be specified.
  • The non-shielding posture extraction unit 26 changes the color information of the robot 30 and the environmental installation objects 33 in the simulator to a color not included in the marker 31, for example a color with an RGB value of (255, 0, 0), then moves the robot 30 to the posture in question and virtually generates an image captured by the imaging unit 32.
  • In this image, the position of the marker 31 matches the pixel set, and the pixels where the marker is blocked have the color (255, 0, 0). Therefore, the non-shielding posture extraction unit 26 calculates, as the reliability of the estimation, the ratio of pixels in the pixel set whose color is not (255, 0, 0).
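A sketch of this pixel-set reliability computation; the image layout and helper names are assumptions, not the patent's interfaces.

```python
import numpy as np

OCCLUDER_RGB = (255, 0, 0)  # color assigned to the robot and environment objects

def occlusion_reliability(rendered_image, pixel_set):
    """Fraction of the marker's expected pixel locations not covered by an occluder.

    rendered_image : HxWx3 uint8 image rendered on the simulator with the robot
                     and environment recolored to OCCLUDER_RGB
    pixel_set      : list of (row, col) positions the marker would occupy if it
                     appeared in the image without any obstruction
    """
    visible = 0
    for r, c in pixel_set:
        if tuple(rendered_image[r, c]) != OCCLUDER_RGB:
            visible += 1
    return visible / len(pixel_set)

# Usage: a posture is kept as "non-shielded" only if the reliability clears the
# threshold chosen for the marker type (e.g. 0.9 in the example above).
```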
  • In the case of the captured image I1, the estimation reliability is calculated to be approximately 50%. When the threshold to be compared with the reliability is set to 90% in advance, the reliability for the captured image I1 is therefore below the threshold.
  • In the posture shown in the captured image I2, the marker 31 is not blocked and the entire marker 31 appears in the image. Therefore, in the case of the captured image I2, the position and orientation of the marker 31 can be estimated, the reliability of the estimation is higher than the threshold, and this becomes a success pattern. In the posture shown in the captured image I3 of FIG. 18, the marker 31 is placed within the angle of view of the imaging unit 32, but is partly blocked by the environmental installation object 33, so estimation of the position and orientation of the marker 31 may fail. Further, since 9 out of the 12 black dots on the marker 31 are captured in the captured image I3, the reliability is calculated if the position and orientation of the marker 31 can be estimated.
  • In this case, the non-shielding posture extraction unit 26 calculates, as described above, the ratio between the number of feature points on the marker 31 obtained by analyzing the captured image and the total number of feature points as the reliability of the estimation.
  • the reliability of the estimation is calculated as 75%.
  • the threshold value to be compared with the reliability is set to 90% in advance, the reliability in the photographed image I3 is less than the threshold.
  • In this way, the non-shielding posture extraction unit 26 determines whether or not the marker 31 is shielded, and generates the posture set (the postures shown in the captured images I2, I4, I5, ...) determined to be unshielded. As a result, in the example of FIG. 18, only postures in which the marker 31 appears in the right and center portions of the captured image are generated.
  • As described above, the non-shielding posture extraction unit 26 obtains the reliability of the estimation and determines whether or not the marker 31 is shielded depending on whether the reliability is equal to or greater than the threshold. Specifically, if the reliability of the estimation is equal to or higher than the threshold, it is determined that the marker 31 is not shielded; if the reliability is less than the threshold, it is determined that the marker 31 is shielded. As a result, among the postures for which the three-dimensional position of the marker 31 has been successfully estimated, only postures whose estimation reliability is higher than the threshold can be extracted as postures in which the marker 31 is not shielded.
  • the non-shielding posture extraction unit 26 calculates the reliability of the estimation, but depending on the shape or pattern of the marker 31, the reliability of the estimation may not be calculated. That is, the reliability of estimation may be calculated as necessary. Further, calculation of the reliability is not limited to the area ratio and the feature point ratio described above, and when the position and orientation of the marker 31 are estimated using machine learning or the like, the reliability of the estimation itself may be estimated.
  • The posture set evaluation unit 27 determines whether or not calibration with an accuracy equal to or higher than a predetermined value can be performed using the posture set generated by the posture set generation unit 23 and extracted by the non-physical interference posture extraction unit 25 and the non-shielding posture extraction unit 26. When calibration with an accuracy equal to or higher than the predetermined value can be expected, the posture set evaluation unit 27 instructs the calibration execution unit 29 to execute the calibration. When such accuracy cannot be expected, the posture set evaluation unit 27 instructs the posture set generation unit 23 to add postures to the posture set and/or to change the value of the generation parameter and regenerate the posture set.
  • FIG. 19 is a functional block diagram of the posture set evaluation unit 27. As shown in FIG. 19, the posture set evaluation unit 27 includes a posture number evaluation unit 271, a posture set extraction unit 272, a calibration evaluation unit 273, and an index evaluation unit 274.
  • The posture number evaluation unit 271 evaluates the number of postures included in the posture set generated by the posture set generation unit 23 and extracted by the non-physical interference posture extraction unit 25 and the non-shielding posture extraction unit 26. Specifically, the posture number evaluation unit 271 determines whether or not the number of postures included in the posture set is within a predetermined number. When the posture number evaluation unit 271 determines that the number of postures is not within the predetermined number, the posture set extraction unit 272 extracts a subset consisting of some of the multiple postures included in the posture set. For example, if a total of 100 postures are included in the posture set, 20 of the 100 postures are extracted as a subset.
  • The calibration evaluation unit 273 uses the above-described posture set to generate, on the simulator, data such as the marker/camera transformation matrix M2 and the arm/base transformation matrix M3 that would be used for calibration, and evaluates the estimation accuracy of the base/camera transformation matrix M1 estimated by that calibration. That is, the calibration evaluation unit 273 evaluates the accuracy of the calibration.
  • When the calibration evaluation unit 273 determines that calibration accuracy equal to or higher than the predetermined value cannot be expected, for example because of an insufficient number of postures, the index evaluation unit 274 instructs the posture set generation unit 23 to add postures or to generate the posture set again.
  • each unit of the posture set evaluation unit 27 will be described in detail below.
  • When the number of postures exceeds the predetermined number, the posture number evaluation unit 271 instructs the posture set extraction unit 272 to extract a portion of the posture set, that is, a subset. The posture number evaluation unit 271 then causes the calibration evaluation unit 273 to determine whether the subset extracted by the posture set extraction unit 272 can be expected to yield calibration accuracy equal to or higher than the predetermined value. In this way, by extracting a part of the postures from the posture set as a subset, the time required for calibration can be kept short even when the posture set contains a large number of postures, and the calibration accuracy can be evaluated efficiently.
  • The posture set extraction unit 272 extracts some postures from the posture set, which is a set of multiple postures, based on a predetermined rule. As an example of such a rule, postures may be extracted so that the indices assigned to the small spaces in which the postures were generated appear evenly across the extracted subset, as sketched below.
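One way such an index-balanced extraction could look is sketched here; the assumption that each posture carries the (X, Y, Z) index of the small space it was generated in, and the round-robin strategy itself, are illustrative choices rather than details from the specification.

```python
from collections import defaultdict

def extract_subset(postures, target_count):
    """Pick `target_count` postures so that the small-space indices attached to
    the postures appear as evenly as possible in the extracted subset."""
    by_index = defaultdict(list)
    for p in postures:
        by_index[p["index"]].append(p)      # p["index"] is e.g. an (X, Y, Z) tuple
    buckets = list(by_index.values())

    subset = []
    i = 0
    while len(subset) < target_count and any(buckets):
        bucket = buckets[i % len(buckets)]  # round-robin over the small spaces
        if bucket:
            subset.append(bucket.pop())
        i += 1
    return subset

# e.g. reduce a 100-posture set to the 20-posture subset mentioned above:
# subset = extract_subset(pose_set, 20)
```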
  • The calibration evaluation unit 273 generates the marker-camera conversion matrix M2 and the arm-base conversion matrix M3 used for virtual calibration on the simulator, and obtains an estimated value of the base-camera conversion matrix M1 by optimization processing (described later). The calibration evaluation unit 273 then compares the estimated value of the base-camera conversion matrix M1 with the correct value used in the simulator design, and determines whether the calibration accuracy required for executing the robot's work, that is, accuracy equal to or higher than the predetermined value, has been obtained. The method for obtaining the estimated value of the base-camera conversion matrix M1 will be described later together with the processing of the calibration execution unit 29.
  • The calibration evaluation unit 273 may also obtain the estimated value of the base-camera conversion matrix M1 after adding errors to the generated marker-camera conversion matrix M2 and arm-base conversion matrix M3. The reason is as follows: when calibration is actually performed on the physical device 2A, errors due to noise and the like are added to the observed data, so adding errors to the marker-camera conversion matrix M2 and the arm-base conversion matrix M3 allows the calibration to be evaluated under conditions close to those of the actual machine. The added errors may, for example, follow a normal distribution, and the standard deviation of that distribution may be specified by the prior knowledge 21 or the like, or set to a fixed ratio of the values of the marker-camera conversion matrix M2 and the arm-base conversion matrix M3. A rough sketch of such a perturbation-based evaluation follows.
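The sketch below perturbs a simulated transform with normally distributed noise and compares the resulting estimate against the simulator's ground truth; the noise model, the sigma values, and the helper names are assumptions, not values from the specification.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def perturb_transform(T, trans_sigma=0.001, rot_sigma_deg=0.1, rng=None):
    """Add normally distributed noise to a 4x4 homogeneous transform:
    Gaussian translation noise and a small random rotation."""
    rng = rng or np.random.default_rng()
    noise = np.eye(4)
    noise[:3, 3] = rng.normal(0.0, trans_sigma, size=3)
    rotvec = rng.normal(0.0, np.deg2rad(rot_sigma_deg), size=3)
    noise[:3, :3] = R.from_rotvec(rotvec).as_matrix()
    return T @ noise

def calibration_error(M1_estimated, M1_ground_truth):
    """Translation and rotation error between the estimated and true base-camera transforms."""
    delta = np.linalg.inv(M1_ground_truth) @ M1_estimated
    trans_err = np.linalg.norm(delta[:3, 3])
    rot_err = np.rad2deg(np.linalg.norm(R.from_matrix(delta[:3, :3]).as_rotvec()))
    return trans_err, rot_err
```

A simulated evaluation run would perturb every generated M2 and M3 with `perturb_transform`, estimate M1 from the noisy set, and accept the posture set only if `calibration_error` against the simulator's ground truth falls below the accuracy required for the robot's task.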
  • Here, the observed data are, for example, the encoder values of the robot 30, and may also be the position and orientation of the marker 31 obtained by analyzing the image captured by the imaging unit 32. The noise added to the observed data includes noise caused by errors in the robot mechanism and noise caused by image distortion.
  • When the calibration evaluation unit 273 determines that calibration with accuracy equal to or higher than the predetermined value can be expected, it instructs the calibration execution unit 29 to perform the calibration. The calibration execution unit 29 receives this instruction from the calibration evaluation unit 273 and executes calibration to estimate the base-camera conversion matrix M1. Conversely, when the calibration evaluation unit 273 determines that calibration with accuracy equal to or higher than the predetermined value cannot be expected, it instructs the index evaluation unit 274 to perform index evaluation.
  • When the index evaluation unit 274 receives an instruction to execute index evaluation from the calibration evaluation unit 273 (that is, when calibration with accuracy equal to or higher than the predetermined value cannot be expected), it instructs the posture set generation unit 23 to generate additional postures and/or instructs the parameter update unit 28 to change the value of the generation parameters 22. Through this feedback, the posture set generation unit 23 can generate a posture set again, including a posture set with additional postures or a posture set based on the updated generation parameters 22. As a result, when the posture set generation unit 23 regenerates the posture set, the possibility that calibration with accuracy equal to or higher than the predetermined value can be expected is increased.
  • Specifically, the index evaluation unit 274 acquires the posture set evaluated by the calibration evaluation unit 273. If the number of postures included in that posture set is less than a predetermined threshold, the index evaluation unit 274 searches the indices (X, Y, Z) over the set of all postures and instructs the parameter update unit 28 to increment the index with the lowest frequency of appearance. If the number of postures included in the acquired posture set is greater than or equal to the predetermined threshold, the index evaluation unit 274 instructs the posture set generation unit 23 to generate additional postures.
  • As a generation method, for example, a plurality of postures may be randomly extracted from the acquired posture set and a teaching position and teaching posture may be generated in the small space corresponding to each posture's index, although the generation method is not limited to this. A sketch of the branching logic of the index evaluation is shown below.
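The sketch assumes each posture carries its (X, Y, Z) small-space index and that the threshold is supplied by the caller; both are illustrative assumptions.

```python
from collections import Counter

def evaluate_indices(postures, min_posture_count):
    """Decide the feedback target: ("increment_index", (X, Y, Z)) for the parameter
    update unit, or ("add_postures", None) for the posture set generation unit."""
    if postures and len(postures) < min_posture_count:
        counts = Counter(p["index"] for p in postures)   # frequency of each (X, Y, Z) index
        rarest_index = min(counts, key=counts.get)       # index with the lowest frequency
        return "increment_index", rarest_index
    return "add_postures", None
```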
  • Through the processing of the posture set evaluation unit 27 described above, the evaluation result of the posture set can be fed back to the posture set generation unit 23 and the parameter update unit 28. Furthermore, the calibration execution unit 29 is made to execute the calibration only when calibration with accuracy equal to or higher than the predetermined value can be expected, which prevents the calibration accuracy from deteriorating due to an insufficient number of postures or the like. When such accuracy cannot be expected, the posture set generation unit 23 can be made to generate a posture set again.
  • In the present embodiment, the posture set evaluation unit 27 has the posture number evaluation unit 271 and the posture set extraction unit 272, but it is not limited to this configuration and may be configured with only the calibration evaluation unit 273 and the index evaluation unit 274. The functions of the calibration evaluation unit 273 and the index evaluation unit 274 may also be integrated into a single functional unit. Further, although the rule that the indices assigned to the small spaces appear evenly was exemplified as the rule applied by the posture set extraction unit 272, a part of the postures may instead be extracted at random.
  • The parameter update unit 28 increments the values of the parameters X1, Y1, and Z1, which determine the resolution of the posture set, based on the instruction from the posture set evaluation unit 27, thereby updating the generation parameters 22. The posture set generation unit 23 then uses the updated generation parameters 22 to generate a posture set. A trivial sketch of this update follows.
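The sketch assumes the generation parameters 22 are held in a simple mapping keyed by parameter name; this structure is an assumption for illustration only.

```python
def update_generation_parameters(generation_params, axes=("X1", "Y1", "Z1")):
    """Increment the resolution parameters that control how finely the
    teaching space is divided into small spaces."""
    for axis in axes:
        generation_params[axis] += 1
    return generation_params

params = {"X1": 3, "Y1": 3, "Z1": 2}
update_generation_parameters(params)   # -> {"X1": 4, "Y1": 4, "Z1": 3}
```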
  • FIG. 20 is a flowchart showing the processing procedure of the calibration execution unit 29.
  • The calibration execution unit 29 controls the robot 30 and the imaging unit 32 of the physical device 2A, and calculates and acquires the marker-camera conversion matrix M2 and the arm-base conversion matrix M3 each time the robot 30 is moved to a posture in the posture set. The calibration execution unit 29 then estimates the base-camera conversion matrix M1 from the acquired coordinate conversion parameters, that is, the marker-camera conversion matrix M2 and the arm-base conversion matrix M3.
  • First, the calibration execution unit 29 loads, onto the memory of the control device 1, the posture set for which the posture set evaluation unit 27 has determined (evaluated) that calibration with accuracy equal to or higher than the predetermined value can be expected (step S37).
  • the calibration execution unit 29 acquires the number N3 of postures included in the posture set (step S38).
  • The calibration execution unit 29 actually moves the robot 30 of the physical device 2A to the i-th posture (step S40).
  • The calibration execution unit 29 then captures an image of the work space of the robot 30 using the imaging unit 32 (step S41).
  • The calibration execution unit 29 estimates the marker-camera conversion matrix M2 by analyzing the captured image obtained by the imaging unit 32 (step S42).
  • The calibration execution unit 29 determines whether the estimation of the marker-camera conversion matrix M2 has succeeded (step S43).
  • In step S43, the estimation of the marker-camera conversion matrix M2 may fail because of the actual lighting environment of the physical device 2A or because the specifications or arrangement of the imaging unit 32 of the physical device 2A deviate from the design values.
  • When the calibration execution unit 29 succeeds in estimating the marker-camera conversion matrix M2, it stores in the memory of the control device 1 the arm-base conversion matrix M3 indicating the i-th posture of the robot 30 and the marker-camera conversion matrix M2 indicating the position and orientation of the marker 31 corresponding to that posture (step S44).
  • When the calibration execution unit 29 determines in step S43 that the estimation of the marker-camera conversion matrix M2 has failed, it skips the process of step S44.
  • Finally, the calibration execution unit 29 solves an optimization problem using the data stored in the memory, estimates the base-camera conversion matrix M1 (step S47), and then ends the series of processing. Solving the optimization problem in step S47 corresponds to the optimization processing. A sketch of the data-collection part of this procedure is given below.
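The following is a minimal sketch of the loop through steps S37 to S44, under the assumption that the robot, the camera, and the marker-pose estimator are available through interfaces like the ones named here; those names, and the use of `None` to signal a failed estimation, are hypothetical.

```python
def collect_calibration_data(posture_set, robot, camera, estimate_marker_camera_transform):
    """Move the robot to each posture, observe the marker, and keep the
    (arm-base M3, marker-camera M2) pairs for which estimation succeeded."""
    pairs = []
    for posture in posture_set:                        # the N3 postures (step S38)
        robot.move_robot_to(posture)                   # step S40
        image = camera.capture_image()                 # step S41
        M2 = estimate_marker_camera_transform(image)   # step S42: 4x4 transform or None
        if M2 is None:                                 # step S43: estimation failed
            continue                                   # skip step S44
        M3 = robot.arm_base_transform()                # encoder-based arm-base transform
        pairs.append((M3, M2))                         # step S44
    return pairs
```

The collected pairs are then fed to the optimization of step S47, for which a sketch appears after the description of the transformation parameters below.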
  • As the method for estimating the base-camera conversion matrix M1 by this optimization processing, a known technique shown in the following references can be used.
  • In this formulation, the coordinate transformation parameter AA is a known parameter obtained from the robot's encoder values, while the coordinate transformation parameters BB and CC are unknown parameters; by solving for the unknowns, the base-camera conversion matrix M1 in the physical device 2A can be estimated. One possible formulation is sketched below.
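As a hedged illustration of one way such an optimization could be set up (a generic nonlinear least-squares formulation, not the specific technique of the cited references): assuming M2 is the marker pose in the camera frame, M3 the arm pose in the base frame, and M1 the base pose in the camera frame, each observation should satisfy M2 ≈ M1 · M3 · X, where X is the fixed marker pose in the arm frame; M1 and X are the two unknowns.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R

def to_matrix(params):
    """6-vector (rotation vector, translation) -> 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R.from_rotvec(params[:3]).as_matrix()
    T[:3, 3] = params[3:]
    return T

def estimate_base_camera_transform(pairs):
    """pairs: list of (M3, M2) collected above. Returns the estimated M1."""
    def residuals(x):
        M1, X = to_matrix(x[:6]), to_matrix(x[6:])
        res = []
        for M3, M2 in pairs:
            delta = np.linalg.inv(M2) @ (M1 @ M3 @ X)              # identity if consistent
            res.extend(delta[:3, 3])                               # translation residual
            res.extend(R.from_matrix(delta[:3, :3]).as_rotvec())   # rotation residual
        return np.asarray(res)

    x0 = np.zeros(12)                     # identity initial guess for M1 and X
    sol = least_squares(residuals, x0)
    return to_matrix(sol.x[:6])           # estimated base-camera conversion matrix M1
```

In practice a closed-form robot-world/hand-eye method (for example the algorithms behind OpenCV's `calibrateRobotWorldHandEye`) would typically provide the initial guess or replace this sketch entirely; the least-squares form is only meant to make the roles of the known parameter (M3) and the two unknowns (M1 and the marker-arm transform) concrete.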
  • As described above, in the present embodiment, the posture set generation unit 23 generates a posture set of the robot 30 that enables the imaging unit 32 to image the marker 31, and the non-shielding posture extraction unit 26 extracts from that posture set the postures in which the marker 31 is not shielded. A set of postures in which the marker 31 is not occluded can therefore be generated automatically. The posture set of the robot 30 can thus be taught while taking the shielding of the marker 31 into account, reducing the number of man-hours required for teaching the posture set of the robot 30.
  • In addition, the non-physical interference posture extraction unit 25 extracts, from the posture set generated by the posture set generation unit 23, postures in which the robot 30 and the marker 31 do not physically interfere. A set of postures free of physical interference can therefore be generated automatically, so the posture set of the robot 30 can be taught while taking physical interference in the physical device 2A into account, further reducing the number of man-hours required for teaching the posture set of the robot 30.
  • In the present embodiment, the non-physical interference posture extraction unit 25 extracts the non-physical-interference postures from the posture set generated by the posture set generation unit 23, and the non-shielding posture extraction unit 26 then extracts the non-shielding postures from the extracted set of non-physical-interference postures; however, the present invention is not limited to this. For example, the non-shielding posture extraction unit 26 may first extract the non-shielding postures from the posture set generated by the posture set generation unit 23, and the non-physical interference posture extraction unit 25 may then extract the non-physical-interference postures from that extracted set. Alternatively, the non-physical interference posture extraction unit 25 and the non-shielding posture extraction unit 26 may each extract postures independently from the posture set generated by the posture set generation unit 23, and the posture set evaluation unit 27 may evaluate the postures common to the two extracted sets. It is also possible to provide only the non-shielding posture extraction unit 26 of the two extraction units. The above points also apply to the second embodiment described later.
  • FIG. 21 is a diagram showing a configuration example of a robot control calibration device according to the second embodiment.
  • As shown in FIG. 21, the calibration device 2-1 includes a control device 1, a physical device 2C, and a calibration control device 2D.
  • The physical device 2C has a robot 30 consisting of a robot arm, a marker 31 attached to the robot 30, and an imaging unit 32-1 that measures the work space of the robot 30 and its surroundings.
  • the configuration of the physical device 2C is the same as that of the first embodiment except for the imaging unit 32-1.
  • The imaging unit 32-1 is composed of a plurality of (two in the figure) imaging devices 32A and 32B.
  • Each imaging device 32A, 32B is configured by, for example, a monocular camera, a ToF (Time of Flight) camera, a stereo camera, or the like.
  • The calibration control device 2D includes prior knowledge 21-1, generation parameters 22-1, a posture set generation unit 23-1, a simulator construction unit 24-1, a non-physical interference posture extraction unit 25-1, a non-shielding posture extraction unit 26-1, a posture set evaluation unit 27-1, a parameter update unit 28-1, and a calibration execution unit 29-1, and additionally includes a common posture extraction unit 201. The configuration other than the common posture extraction unit 201 is basically the same as in the first embodiment.
  • However, since the imaging unit 32-1 is composed of the plurality of imaging devices 32A and 32B, the prior knowledge 21-1, generation parameters 22-1, posture set generation unit 23-1, simulator construction unit 24-1, non-physical interference posture extraction unit 25-1, non-shielding posture extraction unit 26-1, posture set evaluation unit 27-1, parameter update unit 28-1, and calibration execution unit 29-1 each perform, for each of the imaging devices 32A and 32B, the same processing operations as in the first embodiment.
  • For example, the posture set generation unit 23-1 separately generates a posture set of the robot 30 that enables the imaging device 32A to photograph the marker 31 and a posture set of the robot 30 that enables the imaging device 32B to photograph the marker 31.
  • Such per-device processing is performed not only by the posture set generation unit 23-1 but also by the simulator construction unit 24-1, the non-physical interference posture extraction unit 25-1, the non-shielding posture extraction unit 26-1, the posture set evaluation unit 27-1, the parameter update unit 28-1, and the calibration execution unit 29-1.
  • The common posture extraction unit 201 extracts, from the posture sets generated by the posture set generation unit 23-1 for each of the imaging devices 32A and 32B and then extracted by the non-physical interference posture extraction unit 25-1 and the non-shielding posture extraction unit 26-1 for each of the imaging devices 32A and 32B, a plurality of postures that can be used in common by the plurality of imaging devices 32A and 32B.
  • The common posture extraction unit 201 then modifies the posture set of each imaging device 32A, 32B so that the extracted common postures are retained. A detailed description is given below.
  • Specifically, the common posture extraction unit 201 determines whether each posture included in the posture set generated for one imaging device 32A is also applicable to the other imaging device 32B, and performs this determination for all combinations of the imaging devices 32A and 32B. As an example, whether a posture generated for one imaging device 32A is applicable to another imaging device 32B can be determined by controlling the robot 30 on the simulator constructed by the simulator construction unit 24-1, moving the marker 31 to that posture, and checking whether the following conditions (1) to (3) are all satisfied. (1) The marker 31 is included within the teaching space of the imaging device 32B. (2) The rotation matrices of the marker 31 and the imaging device 32B are within a preset threshold range. (3) The function of the non-shielding posture extraction unit 26-1 determines that no occlusion occurs. A sketch of this check is shown below.
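The following sketch shows how such a per-camera applicability check might be organized; the condition functions stand in for the simulator-based checks (1) to (3) above, and their names and signatures are assumptions rather than part of the specification.

```python
def is_applicable(posture, camera, simulator):
    """Conditions (1)-(3) for using a posture with another imaging device."""
    marker_pose = simulator.marker_pose_for(posture)                  # marker pose on the simulator
    return (camera.teaching_space_contains(marker_pose)               # (1) inside the teaching space
            and camera.rotation_within_threshold(marker_pose)         # (2) rotation within threshold
            and not simulator.marker_occluded(marker_pose, camera))   # (3) no occlusion

def extract_common_postures(posture_sets, cameras, simulator):
    """posture_sets: {camera_id: [postures]}. A posture generated for one camera is
    treated as common if it is also applicable to every other camera."""
    common = []
    for cam_id, postures in posture_sets.items():
        others = [c for c in cameras if c.id != cam_id]
        for posture in postures:
            if all(is_applicable(posture, other, simulator) for other in others):
                common.append(posture)
    return common
```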
  • A posture that can be used in common by the plurality of imaging devices 32A and 32B is added as one of the postures constituting the posture set of every imaging device 32A, 32B to which it is applicable.
  • Conversely, postures that can be used by only one of the imaging devices (32A or 32B) are removed from the posture sets of the imaging devices 32A and 32B, thereby adjusting (reducing) the number of postures applied to the calibration by the calibration execution unit 29-1. As for the priority when removing postures, one method is, for example, to preferentially remove postures located in the same small space as the small space containing an added common posture; a sketch of this heuristic follows.
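The sketch assumes each posture again carries a small-space index and that an upper limit on the posture count is given; both are illustrative assumptions.

```python
def prune_postures(camera_postures, added_common_postures, max_postures):
    """Trim the camera-specific postures so that, together with the added common
    postures, at most `max_postures` remain (common postures are always kept)."""
    occupied = {p["index"] for p in added_common_postures}
    # Postures in already-covered small spaces sort to the back, so they are dropped first.
    keep = sorted(camera_postures, key=lambda p: p["index"] in occupied)
    excess = len(keep) + len(added_common_postures) - max_postures
    if excess > 0:
        keep = keep[:len(keep) - excess] if excess < len(keep) else []
    return keep + added_common_postures
```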
  • As described above, in the second embodiment, the common posture extraction unit 201 extracts a plurality of postures that can be used in common by the plurality of imaging devices 32A and 32B, and the calibration execution unit 29-1 performs the calibration using the extracted postures. Therefore, even when the imaging unit is configured with a plurality of imaging devices 32A and 32B, the calibration can be performed while reducing the man-hours required for teaching the posture set of the robot 30 and for the calibration itself.
  • In the above description, the common posture extraction unit 201 extracts common postures from the posture sets that were generated by the posture set generation unit 23-1 for each of the imaging devices 32A and 32B and then extracted by the non-physical interference posture extraction unit 25-1 and the non-shielding posture extraction unit 26-1; however, the present invention is not limited to this. For example, the common posture extraction unit 201 may extract common postures directly from the posture sets generated for each of the imaging devices 32A and 32B by the posture set generation unit 23-1. The common posture extraction unit 201 may also extract common postures from the posture sets extracted for each of the imaging devices 32A and 32B by the non-physical interference posture extraction unit 25-1, or from the posture sets extracted for each of the imaging devices 32A and 32B by the non-shielding posture extraction unit 26-1.
  • In the above description, the common posture extraction unit 201 determines commonly usable postures by controlling the robot 30 on the simulator; alternatively, the determination may be made only from the teaching position and teaching orientation of the marker 31. Furthermore, increasing the number of postures used during calibration, without setting an upper limit on the number of postures included in the posture set of each imaging device 32A, 32B, can in some cases be expected to improve accuracy; in that case, the upper limit of the number of postures need not be determined in advance.
  • The present invention is not limited to the above-described embodiments and includes various modifications. The embodiments have been described in detail for ease of understanding, and the present invention is not necessarily limited to configurations having all of the described components. A part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)

Abstract

This calibration device comprises: an imaging unit that captures an image from a predetermined position; a marker attached to a robot; a posture set generation unit that generates a posture set comprising a plurality of postures of the robot; a non-shielding posture extraction unit that extracts from the posture set a plurality of postures in which the marker is not shielded; and a calibration execution unit that estimates coordinate transformation parameters for transforming between the coordinate system of the robot and the coordinate system of the imaging unit on the basis of the extracted plurality of postures.
PCT/JP2021/040394 2021-03-29 2021-11-02 Dispositif d'étalonnage pour commander un robot WO2022208963A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-055774 2021-03-29
JP2021055774A JP7437343B2 (ja) 2021-03-29 2021-03-29 ロボット制御用のキャリブレーション装置

Publications (1)

Publication Number Publication Date
WO2022208963A1 true WO2022208963A1 (fr) 2022-10-06

Family

ID=83458339

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/040394 WO2022208963A1 (fr) 2021-03-29 2021-11-02 Dispositif d'étalonnage pour commander un robot

Country Status (2)

Country Link
JP (1) JP7437343B2 (fr)
WO (1) WO2022208963A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115641384A (zh) * 2022-11-01 2023-01-24 桂林理工大学 一种基于神经架构搜索标定板检测的自动手眼标定方法
KR20240071533A (ko) * 2022-11-15 2024-05-23 주식회사 브레인봇 다중 연산을 이용한 핸드-아이 캘리브레이션 방법 및 장치

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014161603A1 (fr) * 2013-04-05 2014-10-09 Abb Technology Ltd Système robot et méthode d'étalonnage
JP2015182144A (ja) * 2014-03-20 2015-10-22 キヤノン株式会社 ロボットシステムおよびロボットシステムの校正方法
JP2019217571A (ja) * 2018-06-15 2019-12-26 オムロン株式会社 ロボット制御システム
JP2020172017A (ja) * 2019-03-05 2020-10-22 ザ・ボーイング・カンパニーThe Boeing Company ロボット光センサのオートキャリブレーション
JP2021000678A (ja) * 2019-06-20 2021-01-07 オムロン株式会社 制御システムおよび制御方法

Also Published As

Publication number Publication date
JP2022152845A (ja) 2022-10-12
JP7437343B2 (ja) 2024-02-22

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21935119

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21935119

Country of ref document: EP

Kind code of ref document: A1