WO2022172471A1 - Support system, image processing device, support method, and program - Google Patents
Support system, image processing device, support method, and program
- Publication number
- WO2022172471A1 (PCT/JP2021/009562)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- measurement
- unit
- orientation
- imaging device
- measurement positions
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 35
- 238000012545 processing Methods 0.000 title claims description 30
- 238000005259 measurement Methods 0.000 claims abstract description 296
- 238000011156 evaluation Methods 0.000 claims abstract description 105
- 230000033001 locomotion Effects 0.000 claims abstract description 102
- 238000003384 imaging method Methods 0.000 claims abstract description 63
- 230000008859 change Effects 0.000 claims description 19
- 230000004044 response Effects 0.000 claims description 17
- 230000000007 visual effect Effects 0.000 claims description 8
- 239000011159 matrix material Substances 0.000 description 37
- 230000009466 transformation Effects 0.000 description 37
- 238000010586 diagram Methods 0.000 description 18
- 238000004364 calculation method Methods 0.000 description 17
- 238000001514 detection method Methods 0.000 description 15
- 238000003860 storage Methods 0.000 description 14
- 230000036544 posture Effects 0.000 description 11
- 239000000203 mixture Substances 0.000 description 8
- 239000013598 vector Substances 0.000 description 8
- 238000005286 illumination Methods 0.000 description 7
- 230000001131 transforming effect Effects 0.000 description 6
- 238000009826 distribution Methods 0.000 description 5
- 230000003287 optical effect Effects 0.000 description 5
- 230000005540 biological transmission Effects 0.000 description 4
- 238000004891 communication Methods 0.000 description 4
- 230000008569 process Effects 0.000 description 4
- 239000012636 effector Substances 0.000 description 3
- 239000004065 semiconductor Substances 0.000 description 3
- 230000009471 action Effects 0.000 description 2
- 238000011960 computer-aided design Methods 0.000 description 2
- 238000005516 engineering process Methods 0.000 description 2
- 230000005484 gravity Effects 0.000 description 2
- 230000010363 phase shift Effects 0.000 description 2
- 238000004088 simulation Methods 0.000 description 2
- 238000004422 calculation algorithm Methods 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 239000002131 composite material Substances 0.000 description 1
- 238000012790 confirmation Methods 0.000 description 1
- 230000006866 deterioration Effects 0.000 description 1
- 230000006870 function Effects 0.000 description 1
- 229910052736 halogen Inorganic materials 0.000 description 1
- 150000002367 halogens Chemical class 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 239000003550 marker Substances 0.000 description 1
- 229910044991 metal oxide Inorganic materials 0.000 description 1
- 150000004706 metal oxides Chemical class 0.000 description 1
- 230000002194 synthesizing effect Effects 0.000 description 1
- 238000002834 transmittance Methods 0.000 description 1
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
- B25J19/021—Optical sensing devices
- B25J19/023—Optical sensing devices including video camera means
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
- B25J9/1697—Vision controlled systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40564—Recognize shape, contour of object, extract position and orientation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40613—Camera, laser scanner on end effector, hand eye manipulator, local
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
Definitions
- the present disclosure relates to a support system, an image processing device, a support method and a program.
- There is known a technology in which a robot picks, one by one, a plurality of bulk-loaded objects (hereinafter also referred to as "workpieces").
- the position and orientation of the workpiece are measured, and the motion of the robot is determined according to the measurement results.
- Patent Document 1 discloses a workpiece picking device that includes a sensor that measures the position and orientation of a workpiece, a hand that grips the workpiece, a robot that moves the hand to and from the gripping position, and a control device. The control device determines the status of the workpiece based on the measurement result of the position and orientation of the workpiece and the calculated number of workpieces whose position and orientation were detected, and changes the measurement parameters when the determination result satisfies a predetermined condition.
- The technique of Patent Document 1 sets measurement parameters so as to improve the picking success rate while the picking device is operating. That is, the technique described in Patent Document 1 does not assume support for condition setting for measuring the position and orientation of a workpiece when the picking device is started up.
- Bulk-loaded workpieces can assume various postures when viewed from the sensor. Therefore, in order to appropriately set the conditions for measuring the position and orientation of the workpiece when starting up the picking device, the user must manually change the orientation of the workpiece as appropriate while confirming the measurement results. As a result, setting the conditions for measuring the position and orientation of the workpiece takes time and effort. A system that supports condition setting in consideration of the possible postures of the object is therefore desired.
- The present disclosure has been made in view of the above problems, and aims to provide a support system, an image processing device, a support method, and a program capable of supporting condition setting for measuring the position and orientation of an object in consideration of the possible orientations of the object.
- the support system includes an imaging device, a setting unit, a robot, a measurement unit, an evaluation unit, and a presentation unit.
- An imaging device images an object.
- the setting unit sets the movement range of the imaging device.
- the robot sequentially moves the imaging device to a plurality of measurement positions within the movement range.
- the measurement unit measures the position and orientation of the target object using captured images obtained by imaging with the imaging device for each of the plurality of measurement positions.
- the evaluation unit evaluates the result of measurement by the measurement unit.
- the presenting unit presents a map indicating the correspondence between each of the plurality of measurement positions and the evaluation result by the evaluating unit.
- The posture of the target object differs among the captured images obtained from the plurality of measurement positions. Therefore, by checking the map, the user can easily grasp the correspondence between the position and orientation of the object shown in each captured image and the evaluation result of the measured position and orientation of the object. Based on this correspondence, the user can judge whether the evaluation result for each measurement position is good or bad, and can appropriately set the conditions for measuring the position and orientation of the object using an image of the object. In this way, the support system can support condition setting for measuring the position and orientation of the target object in consideration of the possible postures of the target object.
- the evaluation unit calculates at least one evaluation value representing evaluation results.
- the presenting unit receives designation of one of the plurality of measurement positions on the map and, in response to the designation, presents at least one evaluation value corresponding to the designated measurement position.
- the setting unit sets a reference position within the movement range, and registers the position and orientation of a model of the object so as to match the position and orientation of the object measured from the captured image corresponding to the reference position.
- the presentation unit generates a virtual image of the model viewed from the one measurement position in response to the specification of one measurement position, and presents the virtual image.
- the measurement unit generates 3D point cloud data of the field of view area of the imaging device based on the captured image, and measures the position and orientation based on the 3D point cloud data and the template data.
- At least one evaluation value includes a correlation value between the 3D point cloud data and the template data.
- the user can determine whether or not there is a problem in searching for the object using the template data. If the correlation value is low, the user can take action (eg, adjusting illumination light intensity, shutter speed, camera gain, etc.) to increase the correlation value.
- the measurement unit performs multiple measurements for each of the multiple measurement positions.
- At least one evaluation value includes a value indicative of repeatability of multiple measurements.
- the user can confirm the stability of the measurement by confirming the value indicating the repeatability. If the repeatability is low, the user can take action (eg, adjust illumination light intensity, shutter speed, camera gain, etc.) to improve repeatability.
- the plurality of measurement positions include the reference position.
- the measurement unit measures the position and orientation of an object in a camera coordinate system based on the imaging device.
- The at least one evaluation value includes, for each of the plurality of measurement positions, a value indicating the difference between a first amount of change in the position and orientation of the object in the camera coordinate system, estimated based on the amount of movement from the reference position, and a second amount of change from the position and orientation measured at the reference position to the position and orientation measured at that measurement position.
- By checking the value indicating the difference between the first amount of change and the second amount of change, the user can confirm whether or not there is a problem in the process of searching for the object in the captured image. If such a problem occurs, the user should change the conditions for that search process.
- the user can easily grasp each of the plurality of measurement positions arranged on the spherical surface by checking the map.
- According to another aspect, an image processing device includes: a setting unit that sets a movement range of an imaging device that captures an image of an object; a measurement unit that measures the position and orientation of the object using the captured image obtained by the imaging device for each of a plurality of measurement positions within the movement range to which the imaging device is sequentially moved by a robot; an evaluation unit that evaluates the measurement results of the measurement unit; and a presentation unit that presents a map showing the correspondence between each of the plurality of measurement positions and the evaluation results.
- the support method includes first to fifth steps.
- a first step is a step of setting a movement range of an image pickup device that picks up an image of an object.
- the second step is to use the robot to sequentially move the imaging device to a plurality of measurement positions within the movement range.
- a third step is a step of measuring the position and orientation of the object using the captured image obtained by the imaging device for each of the plurality of measurement positions.
- the fourth step is a step of evaluating the position and orientation measurement results.
- a fifth step is a step of presenting a map showing the correspondence between each of the plurality of measurement positions and the evaluation result.
- the program causes the computer to execute the above support method.
- FIG. 1 is a schematic diagram showing the overall configuration of a support system according to an embodiment. FIG. 2 is a diagram showing an example of bulk-loaded workpieces.
- FIG. 3 is a schematic diagram showing an example of the hardware configuration of the image sensor controller shown in FIG. 1. FIG. 4 is a diagram showing an example of the coordinate systems used in the support system.
- FIG. 5 is a block diagram showing an example of the functional configuration of the image sensor controller shown in FIG. 1.
- FIG. 7 is a diagram showing an example of a screen (setting screen) for assisting setting of the movement range. FIG. 8 is a diagram showing an example of a performance report screen.
- FIG. 1 is a schematic diagram showing the overall configuration of a support system according to an embodiment.
- a support system 1 illustrated in FIG. 1 supports condition setting when measuring the position and orientation of a work W using an image of the work W, which is an object.
- the support system 1 is, for example, incorporated in a production line or the like and used for picking control of works W loaded in bulk.
- the support system 1 illustrated in FIG. 1 includes an image sensor controller 100, a measurement head 200, a robot 300, a robot controller 400, and a display 500.
- the measurement head 200 images the subject including the work W.
- the measurement head 200 includes a projection section 201 and an imaging section 202 .
- a projection unit 201 projects arbitrary projection pattern light onto a subject in accordance with an instruction from the image sensor controller 100 .
- the projection pattern is, for example, a pattern whose brightness periodically changes along a predetermined direction within the irradiation plane.
- the image capturing unit 202 captures an image of the subject on which the projection pattern light is projected according to an instruction from the image sensor controller 100 .
- the projection unit 201 includes, as main components, a light source such as an LED (Light Emitting Diode) or a halogen lamp, and a filter arranged on the irradiation surface side of the projection unit 201 .
- the filter generates projection pattern light necessary for measuring a three-dimensional shape as described later, and can arbitrarily change the in-plane light transmittance according to a command from the image sensor controller 100 .
- the projection unit 201 may project fixed projection pattern light that does not change with time, or may project projection pattern light that changes with time.
- the projection unit 201 may be configured using a liquid crystal device or a DMD (Digital Micromirror Device) together with a light source (LED, laser light source, etc.).
- the imaging unit 202 includes, as main components, for example, an optical system such as a lens, and an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
- the robot 300 moves the measurement head 200. Furthermore, the robot 300 may pick the workpiece W.
- the support system 1 may use a robot that picks up bulk works W to support condition setting when measuring the position and orientation of the work W.
- the support system 1 may use a robot different from the robot that picks up the bulk works W to support condition setting when measuring the position and orientation of the work W.
- the robot 300 includes an articulated arm 301 and a base 302 that supports the articulated arm 301 .
- a flange plate 303 is attached to the tip of the articulated arm 301 .
- a support member 305 that supports the measuring head 200 and the end effector 304 is attached to the flange plate 303 .
- the operation of robot 300 is controlled by robot controller 400 .
- the end effector 304 is configured to hold the work W and includes, for example, a suction pad or a two-fingered hand.
- the robot controller 400 controls the articulated arm 301 according to movement commands from the image sensor controller 100.
- the movement command indicates the position and orientation of the flange plate 303, for example.
- the robot controller 400 controls the articulated arm 301 so that the position and orientation of the flange plate 303 match the movement command.
- the image sensor controller 100 provides a screen that helps set conditions for measuring the position and orientation of the workpiece W. As shown in FIG. 1, the image sensor controller 100 includes a setting section 11, a measurement section 12, an evaluation section 13, and a presentation section 14.
- the setting unit 11 sets the movement range of the measuring head 200 .
- the setting unit 11 may set the movement range according to user input.
- the setting unit 11 generates a movement command to sequentially move the measuring head 200 to a plurality of measurement positions within the movement range, and outputs the generated movement command to the robot controller 400 .
- the robot 300 sequentially moves the measuring head 200 to a plurality of measurement positions within the movement range. For example, measuring head 200 moves along arrows 50 and 51 .
- the measurement unit 12 measures the position and orientation of the workpiece W using the captured image obtained by imaging with the measurement head 200 for each of the plurality of measurement positions. Specifically, the measurement unit 12 measures the three-dimensional shape of the visual field region of the measurement head 200 based on the captured image, and searches the measured three-dimensional shape for a shape corresponding to the workpiece W. The measurement unit 12 measures the position and orientation of the workpiece W according to the search result.
- the evaluation unit 13 evaluates the measurement results of the position and orientation of the workpiece W by the measurement unit 12 for each of the plurality of measurement positions.
- the presentation unit 14 presents a map 60 showing the correspondence relationship between each of the plurality of measurement positions and the evaluation result by the evaluation unit 13.
- The posture of the work W differs among the captured images obtained from the plurality of measurement positions. Therefore, by checking the map 60, the user can easily grasp the correspondence between the position and orientation of the work W shown in each captured image and the evaluation result of the measured position and orientation of the work W. Based on this correspondence, the user can judge whether the evaluation result for each measurement position is good or bad, and can appropriately set the conditions for measuring the position and orientation of the work W.
- the support system 1 can support setting of conditions when measuring the position and orientation of the work W, taking into account possible orientations of the work W.
- FIG. 2 is a diagram showing an example of bulk-loaded workpieces.
- FIG. 2 shows a plurality of works W bulk-loaded inside the container 2 .
- a plurality of works W bulk-loaded in the container 2 are sequentially picked one by one.
- the program according to the present embodiment may be provided by being incorporated into a part of another program. Even in that case, the program itself does not include the modules included in the other program to be combined as described above, and the processing is executed in cooperation with the other program. That is, the program according to the present embodiment may be incorporated in such other program. A part or all of the functions provided by executing the program may be implemented as a dedicated hardware circuit.
- FIG. 3 is a schematic diagram showing an example of the hardware configuration of the image sensor controller shown in FIG.
- the image sensor controller 100 includes a CPU (Central Processing Unit) 101 as an arithmetic processing unit, a main memory 102 and a hard disk 103 as storage units, a measurement head interface 104, an input interface 105, a display controller 106, a communication interface 107, and a data reader/writer 108. These units are connected to each other via a bus 109 so as to be able to communicate with each other.
- the CPU 101 expands the programs (codes) installed in the hard disk 103 into the main memory 102 and executes them in a predetermined order to perform various calculations.
- the main memory 102 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds programs read from the hard disk 103 as well as images acquired by the measurement head 200. The hard disk 103 also stores various data as described later. In addition to the hard disk 103, or instead of the hard disk 103, a semiconductor storage device such as a flash memory may be employed.
- the measuring head interface 104 mediates data transmission between the CPU 101 and the measuring head 200 . That is, the measurement head interface 104 is connected with the measurement head 200 .
- the measuring head interface 104 gives a projection command and an imaging command to the measuring head 200 according to internal commands generated by the CPU 101 .
- the measurement head interface 104 includes an image buffer 104a for temporarily storing images from the measurement head 200. When a predetermined number of frames of images are accumulated in the image buffer 104a, the measurement head interface 104 transfers the accumulated images to the main memory 102.
- the input interface 105 mediates data transmission between the CPU 101 and the input device 600 . That is, the input interface 105 accepts input information input to the input device 600 by the user.
- the input device 600 includes a keyboard, mouse, touch panel, and the like.
- the display controller 106 is connected to the display 500 and controls the screen of the display 500 so as to notify the user of the processing results of the CPU 101 and the like.
- the communication interface 107 mediates data transmission between the CPU 101 and an external device such as the robot controller 400 .
- the communication interface 107 is typically made up of Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
- the data reader/writer 108 mediates data transmission between the CPU 101 and the memory card 700, which is a recording medium. That is, memory card 700 is distributed in a state in which a program to be executed by image sensor controller 100 is stored, and data reader/writer 108 reads the program from memory card 700 . In addition, data reader/writer 108 writes an image captured by measurement head 200 and/or a processing result in image sensor controller 100 to memory card 700 in response to an internal command from CPU 101 .
- the memory card 700 is, for example, a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic storage medium such as a flexible disk, or an optical storage medium such as a CD-ROM (Compact Disc Read Only Memory).
- the CPU 101 calculates the positions and orientations of the workpiece W, the measuring head 200 and the robot 300 using coordinate values of a plurality of coordinate systems.
- FIG. 4 is a diagram showing an example of a coordinate system used in the support system.
- the support system 1 uses a robot base coordinate system, a flange coordinate system (also called a tool coordinate system), a camera coordinate system, and a work coordinate system.
- the robot base coordinate system is a coordinate system with the base 302 of the robot 300 as a reference.
- a robot base coordinate system is defined by an origin Ob set on the base 302 and base vectors Xb, Yb, and Zb.
- the measuring head 200 is fixed to the flange plate 303 via the support member 305 . Therefore, the relative positional relationship between the measurement head 200 and the flange plate 303 is constant.
- the relative positional relationship is indicated, for example, by a coordinate transformation matrix FHC that transforms the flange coordinate system into the camera coordinate system.
- the coordinate transformation matrix FHC represents the position of the origin Oc of the camera coordinate system and the basis vectors Xc, Yc, and Zc in the flange coordinate system.
- the coordinate transformation matrix FHC is represented by fixed values and obtained by calibration performed in advance.
- known hand-eye calibration can be employed in which the imaging unit 202 of the measurement head 200 captures an image of a marker installed at a fixed position while operating the robot 300 .
- a movement command output from the image sensor controller 100 to the robot controller 400 is represented by, for example, a coordinate transformation matrix BHF that transforms the base coordinate system into the flange coordinate system.
- the coordinate transformation matrix BHF represents the origin Of of the flange coordinate system and the basis vectors Xf , Yf, and Zf in the base coordinate system.
- the position and orientation of the work W in the robot base coordinate system are represented by a coordinate transformation matrix BHW that transforms the robot base coordinate system into the work coordinate system, for example.
- the coordinate transformation matrix C H W indicates the position and orientation of the workpiece W in the camera coordinate system, and is detected based on the captured image received from the measuring head 200 . Therefore, the CPU 101 can calculate the position and orientation of the workpiece W in the robot base coordinate system using the movement command output to the robot controller 400 and the measurement result based on the captured image.
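The chain of transformations described here can be written as a product of 4x4 homogeneous matrices. Below is a minimal NumPy sketch, assuming the convention that X_H_Y denotes the pose of frame Y expressed in frame X; the numeric poses are placeholders, not values from the patent.

```python
import numpy as np

def make_pose(R, p):
    """Build a 4x4 homogeneous transform from a rotation matrix R and translation p."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = p
    return T

def compose(*transforms):
    """Chain homogeneous transforms left to right: compose(A, B, C) = A @ B @ C."""
    result = np.eye(4)
    for t in transforms:
        result = result @ t
    return result

# Placeholder poses (identity rotations, arbitrary translations) -- illustrative only.
B_H_F = make_pose(np.eye(3), [0.4, 0.0, 0.6])      # flange pose in the robot base frame (movement command)
F_H_C = make_pose(np.eye(3), [0.0, 0.05, 0.10])    # camera pose in the flange frame (hand-eye calibration)
C_H_W = make_pose(np.eye(3), [0.02, -0.01, 0.35])  # workpiece pose measured in the camera frame

# Workpiece pose in the robot base coordinate system: B_H_W = B_H_F @ F_H_C @ C_H_W
B_H_W = compose(B_H_F, F_H_C, C_H_W)
print(B_H_W)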
- FIG. 5 is a block diagram showing an example of the functional configuration of the image sensor controller shown in FIG. 1;
- the image sensor controller 100 includes a setting section 11 , a measurement section 12 , an evaluation section 13 , a presentation section 14 and a storage section 15 .
- the setting unit 11, the measurement unit 12, the evaluation unit 13, and the presentation unit 14 are realized by executing a program stored in the hard disk 103 by the CPU 101 shown in FIG.
- Storage unit 15 is implemented by main memory 102 and hard disk 103 shown in FIG.
- the storage unit 15 stores calibration data 151 , workpiece shape data 152 , a plurality of template data 153 , measurement position list 154 , and model position/orientation data 155 .
- the work shape data 152 indicates the three-dimensional shape of the work W.
- Work shape data 152 is, for example, CAD (Computer-Aided Design) data.
- the model position and orientation data 155 indicates the position and orientation of the work model in the robot base coordinate system.
- the model position and orientation data 155 is generated by the setting unit 11 as described later.
- Setting unit 11 includes movement range determination unit 111 , registration unit 112 , and movement command unit 113 .
- the movement range determination unit 111 sets the movement range of the measurement head 200 according to the input to the input device 600 and the captured image received from the measurement head 200 at the designated position and orientation.
- the movement range of the measurement head 200 indicates, for example, an area on a spherical surface centered at a reference point set in the visual field area of the measurement head 200 and having a radius equal to the distance between the reference point and the imaging unit 202 of the measurement head 200.
- the reference point is, for example, the center of the visual field area of the measurement head 200 and a point having the same height (Z coordinate value) as the center of gravity of the workpiece W.
- the movement range determination unit 111 sets each of a plurality of positions within the movement range as measurement positions. Note that the movement range determination unit 111 sets the specified position as the reference position. A reference position is included in a plurality of measurement positions.
- the registration unit 112 registers information on the movement range determined by the movement range determination unit 111 in the storage unit 15 .
- the registration unit 112 generates a measurement position list 154 and model position/orientation data 155 as information about the movement range.
- the registration unit 112 generates a measurement position list 154 showing a list of each measurement position and the orientation of the measurement head 200 at each measurement position for the plurality of measurement positions determined by the movement range determination unit 111 .
- the posture of the measurement head 200 at each measurement position is set so that the reference point is positioned on the optical axis of the imaging unit 202 .
- the orientation of the measurement head 200 at each measurement position is set so that the orientation of the measurement head 200 when viewed from the reference point is constant. That is, in the camera coordinate system, the reference point is a fixed position (for example, the central position of the field of view). For example, when the work W is positioned at the reference point, the work W appears in the center of the captured image at each measurement position.
- the position and orientation of the measuring head 200 at each measurement position are represented by, for example, a coordinate transformation matrix BHC that transforms the robot base coordinate system into the camera coordinate system corresponding to the measuring head 200 at the relevant position and orientation.
- the measurement position list 154 includes a number that identifies each measurement position in order to distinguish a plurality of measurement positions. In the measurement position list 154, number "0" corresponds to the reference position.
- the registration unit 112 determines the position and orientation of the work W measured using the captured image received from the measurement head 200 at the specified position and orientation as the position and orientation of the work model.
- a work model is a model having the same shape as the work W.
- The registration unit 112 generates model position/orientation data 155 indicating the determined position and orientation of the work model, and registers it in the storage unit 15.
- the model position and orientation data 155 indicates the position and orientation of the work model in the robot base coordinate system.
- the movement command unit 113 generates a movement command for sequentially moving the measuring head 200 to a plurality of measurement positions in response to a start instruction input to the input device 600, and outputs the generated movement command to the robot controller 400.
- the robot controller 400 receives a movement command indicating the position and orientation of the flange plate 303 and controls the articulated arm 301 so that the position and orientation of the flange plate 303 match the movement command. Therefore, the movement command unit 113 reads the coordinate transformation matrix BHC representing the position and orientation of the measuring head 200 corresponding to each measurement position from the measurement position list 154 .
- the coordinate transformation matrix BHC, the coordinate transformation matrix FHC for transforming the flange coordinate system into the camera coordinate system, and the coordinate transformation matrix BHF for transforming the robot base coordinate system into the flange coordinate system satisfy the relationship BHC = BHF · FHC.
- the movement command unit 113 uses the read coordinate transformation matrix BHC and the coordinate transformation matrix FHC indicated by the calibration data 151 to calculate the coordinate transformation matrix BHF .
- the movement command unit 113 may generate a movement command indicating the coordinate transformation matrix BHF obtained by the calculation.
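Because BHC = BHF · FHC, the flange pose to command can be obtained by right-multiplying the desired camera pose by the inverse of the hand-eye calibration matrix. A hedged sketch (the function name is illustrative):

```python
import numpy as np

def flange_pose_for_camera_pose(B_H_C, F_H_C):
    """Given a desired camera pose B_H_C (camera in the robot base frame) and the
    hand-eye calibration F_H_C (camera in the flange frame), return the flange pose
    B_H_F to send as a movement command: B_H_F = B_H_C @ inv(F_H_C)."""
    return B_H_C @ np.linalg.inv(F_H_C)
```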
- the point cloud data generation unit 121 performs three-dimensional measurement of the visual field area of the measurement head 200 placed at each measurement position based on the captured image corresponding to each measurement position, and generates three-dimensional point cloud data.
- the 3D point cloud data indicates the 3D coordinates of each point on the object surface (measurement surface) existing in the visual field area of the measurement head 200 .
- a triangulation method or a coaxial method can be adopted as the three-dimensional measurement processing.
- the triangulation method is a method in which the optical axes of imaging and light projection are spaced apart by the length of the base line, and parallax is converted into distance.
- Triangulation methods include active methods and passive methods. Active methods include structured illumination methods, phase shift methods, and spatial code methods.
- the coaxial method is a method in which the optical axes of the imaging and distance measuring means are set to be the same.
- a focusing method is included as a distance measuring means.
- a TOF (Time of Flight) method is also included as a method close to coaxial.
- the point cloud data generation unit 121 may use any one of these methods to execute the three-dimensional measurement process.
- Three-dimensional measurement processing is performed using, for example, a phase shift method.
- the workpiece detection unit 122 detects the position and orientation of the workpiece W by three-dimensional search. Specifically, the workpiece detection unit 122 collates a plurality of template data 153 stored in the storage unit 15 with the three-dimensional point cloud data, and searches the three-dimensional point cloud data for data similar to a template. The searched data corresponds to the data of the portion where the workpiece W exists. The workpiece detection unit 122 detects the position and orientation of the workpiece W based on the data similar to the template searched from the three-dimensional point cloud data. The position and orientation of the work W detected by the work detection unit 122 are indicated in the camera coordinate system.
- the workpiece detection unit 122 may detect the position and orientation of the workpiece W using a known detection algorithm. Specifically, the workpiece detection unit 122 calculates a correlation value between the three-dimensional point cloud data and the template data 153, and determines that the workpiece W is present if the calculated correlation value is equal to or greater than a first threshold. The workpiece detection unit 122 then detects the position and orientation of the workpiece according to the template data 153 with the highest correlation value.
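A simplified sketch of this search step is shown below. The feature representation, the normalized correlation measure, and the threshold value are stand-in assumptions; the patent does not specify the actual matching algorithm beyond the correlation test.

```python
import numpy as np

FIRST_THRESHOLD = 0.7  # illustrative value for the first threshold

def correlation(point_features, template_features):
    """Normalized correlation between feature vectors extracted from the 3D point
    cloud region and from a template (a simplified stand-in measure)."""
    a = np.asarray(point_features, float) - np.mean(point_features)
    b = np.asarray(template_features, float) - np.mean(template_features)
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def detect_workpiece(point_features, templates):
    """Return (index of best template, correlation value) when the best correlation
    reaches the first threshold; otherwise return None (no workpiece found)."""
    scores = [correlation(point_features, t["features"]) for t in templates]
    best = int(np.argmax(scores))
    if scores[best] >= FIRST_THRESHOLD:
        return best, scores[best]  # the pose would then be taken from templates[best]
    return None
```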
- the measurement unit 12 may measure the position and orientation of the workpiece W multiple times for each measurement position.
- the evaluation unit 13 includes a correlation value calculation unit 131 , a repeatability calculation unit 132 , a movement error calculation unit 133 and a determination unit 134 .
- the correlation value calculation unit 131 calculates a correlation value between the three-dimensional point cloud data and the template data 153 as an evaluation value for evaluating the measurement result of the measurement unit 12 for each measurement position.
- the correlation value calculator 131 calculates the correlation value for the template data 153 that is most similar to the three-dimensional point cloud data among the plurality of template data 153 .
- the correlation value calculation unit 131 calculates, for example, a correlation value (correlation value of a surface) indicating the degree of similarity of surface elements that are characteristic portions.
- the correlation value calculation unit 131 may calculate a correlation value (contour correlation value) indicating the degree of similarity of contour elements that are characteristic portions.
- the correlation value calculation unit 131 calculates a representative value (e.g., average value, median value, maximum value, minimum value, etc.) of the correlation values obtained from the multiple measurements.
- the repeatability calculation unit 132 calculates, for each measurement position, as an evaluation value for evaluating the measurement result of the measurement unit 12, repeatability indicating variations in a plurality of positions and orientations obtained by a plurality of measurements.
- the repeatability is represented, for example, by a value (for example, standard deviation, difference between maximum and minimum values, etc.) that indicates variation in parameter values of the coordinate transformation matrix CHW that indicates the position and orientation of the workpiece W.
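One possible way to compute such a repeatability value from repeated measurements at a single position is sketched below; the (x, y, z, roll, pitch, yaw) parameterization is an assumption.

```python
import numpy as np

def repeatability(poses_xyzrpy):
    """Repeatability indicators for repeated measurements at one measurement position.
    poses_xyzrpy: (N, 6) array of (x, y, z, roll, pitch, yaw) of the workpiece in the
    camera coordinate system. Returns per-parameter standard deviation and range."""
    poses = np.asarray(poses_xyzrpy, dtype=float)
    return {
        "std": poses.std(axis=0),                        # e.g. standard deviation
        "range": poses.max(axis=0) - poses.min(axis=0),  # e.g. max - min
    }
```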
- the movement error calculation unit 133 calculates a movement error (hereinafter also referred to as "linearity") as an evaluation value for evaluating the measurement result of the measurement unit 12 for each measurement position.
- the movement error indicates, for each measurement position, the difference between a first change amount of the position and orientation of the workpiece W in the camera coordinate system, estimated based on the amount of movement of the measurement head 200 from the reference position to that measurement position, and a second change amount from the position and orientation measured at the reference position to the position and orientation measured at that measurement position.
- the movement error calculator 133 arranges the work model at the position and orientation indicated by the model position and orientation data 155 in the virtual space, and arranges the first virtual viewpoint at the reference position.
- the movement error calculator 133 calculates the position and orientation of the work model when the work model is viewed from the first virtual viewpoint (hereinafter referred to as “first position and orientation”).
- Further, the movement error calculation unit 133 arranges a second virtual viewpoint at each measurement position in the virtual space.
- the movement error calculator 133 calculates the position and orientation of the work model when the work model is viewed from the second virtual viewpoint (hereinafter referred to as “second position and orientation”).
- the movement error calculation unit 133 calculates the difference between the first position/posture and the second position/posture as the first change amount.
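Putting the two change amounts together, the movement error (linearity) for one measurement position could be computed roughly as follows. The relative-pose convention and the reduction to a translation/rotation error are assumptions for illustration.

```python
import numpy as np

def relative_pose(T_a, T_b):
    """Change from pose T_a to pose T_b as a 4x4 transform: inv(T_a) @ T_b."""
    return np.linalg.inv(T_a) @ T_b

def pose_difference(T):
    """Reduce a relative transform to (translation error, rotation angle error [rad])."""
    trans = float(np.linalg.norm(T[:3, 3]))
    angle = float(np.arccos(np.clip((np.trace(T[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)))
    return trans, angle

def movement_error(predicted_ref, predicted_pos, measured_ref, measured_pos):
    """Compare the change predicted from the robot motion (work model seen from the
    first/second virtual viewpoints) with the change actually measured from the images."""
    first_change = relative_pose(predicted_ref, predicted_pos)   # first change amount
    second_change = relative_pose(measured_ref, measured_pos)    # second change amount
    return pose_difference(relative_pose(first_change, second_change))
```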
- the determination unit 134 determines the rank of the overall evaluation result of the measurement results of the position and orientation of the work W at each measurement position using the correlation value, the repeatability, and the movement error.
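The patent excerpt does not disclose the exact ranking rule, but one plausible mapping from the three evaluation values to ranks "A"/"B"/"C", using the conditions (a) to (c) mentioned later, might look like this (both the thresholds and the rule are illustrative assumptions):

```python
def overall_rank(correlation_value, repeatability_value, linearity_value,
                 first_threshold=0.7, second_threshold=0.5, third_threshold=1.0):
    """One possible mapping from the three evaluation values to rank A/B/C."""
    ok_a = correlation_value >= first_threshold     # condition (a): correlation high enough
    ok_b = repeatability_value <= second_threshold  # condition (b): variation small enough
    ok_c = linearity_value <= third_threshold       # condition (c): movement error small enough
    if ok_a and ok_b and ok_c:
        return "A"
    if ok_a:
        return "B"  # workpiece found, but repeatability or linearity needs attention
    return "C"
```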
- Presentation unit 14 includes map generation unit 141 , virtual image generation unit 142 , and screen generation unit 143 .
- the map generation unit 141 uses the measurement position list 154 and the evaluation result by the evaluation unit 13 to generate a map showing the correspondence between each of the plurality of measurement positions and the evaluation result.
- the virtual image generation unit 142 arranges the work model in the position and orientation indicated by the model position and orientation data 155 in the virtual space, and also arranges the virtual viewpoint at one designated measurement position. Furthermore, the virtual image generator 142 arranges the virtual reference point at the same position as the reference point in the virtual space. Then, the virtual image generator 142 generates a virtual image when the virtual reference point is viewed from the virtual viewpoint.
- Screen generation unit 143 generates a screen including a map (hereinafter referred to as a “performance report screen”), and displays the performance report screen on display 500.
- The screen generation unit 143 accepts designation of one of the plurality of measurement positions on the map.
- The screen generation unit 143 may include, in the performance report screen, a virtual image corresponding to the designated measurement position. The screen generation unit 143 may also include, in the performance report screen, a diagram generated from the three-dimensional point cloud data corresponding to the designated measurement position (hereinafter referred to as a "point cloud state diagram").
- the point cloud state diagram may be, for example, a range image.
- the CPU 101 sets the movement range of the measurement head 200 according to the input to the input device 600 (step S1).
- the CPU 101 selects one measurement position from among the plurality of measurement positions within the movement range (step S2).
- the CPU 101 uses the robot 300 to move the measurement head 200 to the selected measurement position (step S3).
- the CPU 101 generates a movement command for moving the measuring head 200 to the selected measurement position, and outputs the generated movement command to the robot controller 400 .
- the CPU 101 measures the position and orientation of the workpiece W using the image captured by the measurement head 200 positioned at the measurement position (step S4).
- In step S5, the CPU 101 determines whether there is an unselected measurement position among the plurality of measurement positions. If there is an unselected measurement position (YES in step S5), the process of the CPU 101 returns to step S2. If there is no unselected measurement position (NO in step S5), the process of the CPU 101 proceeds to step S6. In this way, the CPU 101 uses the robot 300 to sequentially move the measurement head 200 to the plurality of measurement positions, and measures the position and orientation of the workpiece W using the captured image for each of the plurality of measurement positions.
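The loop over steps S2 to S5 could be sketched as follows; move_head, measure_pose, and evaluate are hypothetical placeholders for the robot controller and the measurement/evaluation interfaces, not names defined in the patent.

```python
def run_measurement_sequence(measurement_positions, move_head, measure_pose, evaluate):
    """Sketch of the loop over steps S2-S5: visit every measurement position, measure
    the workpiece pose from the captured image, and collect the evaluation results.
    measurement_positions: mapping from position number to the camera pose B_H_C."""
    results = {}
    for number, pose_B_H_C in measurement_positions.items():  # S2: select a position
        move_head(pose_B_H_C)                                 # S3: move via the robot controller
        measurement = measure_pose()                          # S4: measure position and orientation
        results[number] = evaluate(measurement)               # evaluate per position
    return results                                            # S5 ends when no position is left
```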
- CPU 101 displays a screen for assisting setting of the movement range (hereinafter referred to as a "setting screen") on display 500, and registers (sets) information regarding the movement range according to the input to the setting screen.
- FIG. 7 is a diagram showing an example of a screen (setting screen) for supporting setting of the movement range.
- the setting screen 70 includes areas 71 , 76 and 77 and a button 78 .
- Area 71 includes sub-areas 72-75 for obtaining and registering various types of information.
- In the sub-region 72, a button 72a for acquiring the calibration data 151 and a button 72b for registering the calibration data 151 are displayed. Further, in the sub-region 72, values indicated by the acquired calibration data 151 (parameter values of the coordinate transformation matrix FHC for transforming the flange coordinate system into the camera coordinate system) are displayed.
- the CPU 101 registers the generated calibration data 151.
- the CPU 101 acquires information indicating the current state of the robot 300 from the robot controller 400 .
- the information indicating the current state of the robot 300 includes, for example, the amount of rotation of each axis of the robot 300, parameter values indicating the position and orientation of the flange plate 303, and the like.
- the CPU 101 includes parameter values indicating the position and orientation of the flange plate 303 in the sub-region 73 based on the acquired information.
- the CPU 101 calculates the current position and orientation of the measuring head 200 based on the information indicating the current state of the robot 300 and the calibration data 151 .
- the position and orientation are represented by a coordinate transformation matrix BHC that transforms the robot base coordinate system into the camera coordinate system.
- the CPU 101 registers the calculated position and orientation as the position and orientation of number “0” (reference position number) in the measurement position list 154 .
- a button 74a for measuring the position and orientation of the work W and a button 74b for registering the position and orientation of the work model are displayed.
- parameter values indicating the measured position and orientation of the workpiece W are displayed.
- the sub-area 74 displays the parameter values of the coordinate transformation matrix C H W for transforming the camera coordinate system into the work coordinate system.
- the sub-region 74 may display parameter values of the coordinate transformation matrix BHW that transforms the robot base coordinate system to the work coordinate system.
- In response to the button 74a being clicked, the CPU 101 outputs a projection instruction and an imaging instruction to the measurement head 200 and acquires a captured image from the measurement head 200.
- the CPU 101 measures the position and orientation of the workpiece W based on the acquired captured image.
- the measured position and orientation of the workpiece W are represented by a coordinate transformation matrix C H W that transforms the camera coordinate system into the workpiece coordinate system.
- The CPU 101 includes the parameter values of the coordinate transformation matrix C H W in the sub-region 74. Note that the CPU 101 may calculate the coordinate transformation matrix BHW for transforming the robot base coordinate system into the work coordinate system using the coordinate transformation matrix C H W, the coordinate transformation matrix B H F obtained by clicking the button 73a, and the coordinate transformation matrix F H C obtained by clicking the button 72a. The CPU 101 may then include the parameter values of the coordinate transformation matrix BHW in the sub-region 74.
- the CPU 101 determines the measured position and orientation of the work W as the position and orientation of the work model.
- the CPU 101 generates model position and orientation data 155 indicating the determined position and orientation of the work model, and registers the model position and orientation data 155 .
- the model position and orientation data 155 is data based on the robot base coordinate system.
- the CPU 101 determines, as the reference point, a point that is at the center of the captured image and has the same height (Z coordinate value) as the center of gravity of the workpiece W. Alternatively, the CPU 101 may determine, as the reference point, a point that is at the center of the captured image and has the same height (Z coordinate value) as the origin of the work model when the work model is superimposed at the measured position and orientation.
- the sub-region 75 displays input fields 75a to 75c for setting the movement range of the measuring head 200 and a button 75d for registering the movement range.
- a longitude range and a latitude range on a spherical surface centered on the reference point and having a radius corresponding to the distance between the reference point and the reference position registered by clicking the button 73b are entered.
- the CPU 101 determines the longitude range and the latitude range entered in the input fields 75a and 75b as the movement range on the spherical surface.
- the CPU 101 includes a list of each measurement position and the orientation of the measurement head 200 at that measurement position in the measurement position list 154 .
- the orientation of the measurement head 200 at each measurement position is set such that the orientation of the measurement head 200 when viewed from the reference point is constant, as described above. That is, the orientation of the measurement head 200 at each measurement position is set to be the same as the orientation of the measurement head 200 at the reference position when viewed from the reference point.
- the position and orientation of the measuring head 200 at each measurement position is represented by a coordinate transformation matrix BHC that transforms the robot base coordinate system into the camera coordinate system.
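Expanding the registered longitude/latitude ranges into concrete camera poses on the sphere might look like the sketch below. The angular step size, the up-vector convention, and the assumption that the camera +Z axis points at the reference point are illustrative choices, not details taken from the patent.

```python
import numpy as np

def look_at(eye, target, up=(0.0, 0.0, 1.0)):
    """4x4 pose whose origin is 'eye' and whose +Z axis points toward 'target'."""
    z = np.asarray(target, float) - np.asarray(eye, float)
    z /= np.linalg.norm(z)
    x = np.cross(np.asarray(up, float), z)
    if np.linalg.norm(x) < 1e-9:               # looking straight along the up axis
        x = np.array([1.0, 0.0, 0.0])
    else:
        x /= np.linalg.norm(x)
    y = np.cross(z, x)
    T = np.eye(4)
    T[:3, 0], T[:3, 1], T[:3, 2], T[:3, 3] = x, y, z, eye
    return T

def sphere_measurement_poses(reference_point, radius, lon_range_deg, lat_range_deg, step_deg=15.0):
    """Camera poses (robot base -> camera) on the sphere around the reference point."""
    poses = []
    for lat in np.arange(lat_range_deg[0], lat_range_deg[1] + 1e-9, step_deg):
        for lon in np.arange(lon_range_deg[0], lon_range_deg[1] + 1e-9, step_deg):
            la, lo = np.radians(lat), np.radians(lon)
            eye = np.asarray(reference_point, float) + radius * np.array(
                [np.cos(la) * np.cos(lo), np.cos(la) * np.sin(lo), np.sin(la)])
            poses.append(look_at(eye, reference_point))
    return poses
```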
- In the area 76, a simulation image is displayed in which models of the robot 300, the measuring head 200, and the workpiece W are arranged in virtual space so that the positional relationship between the robot 300, the measuring head 200, and the workpiece W can be easily grasped.
- the CPU 101 arranges the model of the robot 300 (robot model 300M) in the virtual space based on the information acquired in response to the click of the button 73a.
- the CPU 101 arranges a model of the measuring head 200 (measuring head model 200M) based on the calibration data 151 in the virtual space.
- the CPU 101 arranges the work model WM in the virtual space based on the position and orientation measured in response to the click of the button 74a.
- the CPU 101 generates a simulation image when viewing each model from a preset viewpoint in the virtual space.
- the area 77 displays the captured image obtained by clicking the button 74a.
- a captured image captured under uniform illumination conditions is displayed.
- the user operates the robot controller 400 to adjust the position and orientation of the measuring head 200 and the position of the workpiece W while checking the areas 76 and 77 .
- the user can click buttons 73b, 74b, and 75d.
- When the button 78 is clicked, the CPU 101 generates a movement command to sequentially move the measuring head 200 to the plurality of measurement positions according to the measurement position list 154, and outputs the generated movement command to the robot controller 400.
- the CPU 101 measures the position and orientation of the workpiece W using captured images acquired from the measuring heads 200 at each measurement position.
- FIG. 8 is a diagram showing an example of a performance report screen. As shown in FIG. 8, performance report screen 80 includes areas 81, 83-85.
- In the area 81, a map 60 showing the correspondence between each of the plurality of measurement positions and the evaluation results is displayed. Furthermore, radio buttons 82 for selecting evaluation items are displayed in the area 81.
- the CPU 101 generates a map 60 showing the correspondence between each of the plurality of measurement positions and the evaluation result of the evaluation item selected by the radio button 82, and displays the generated map 60 in the area 81.
- Marks 61 indicating evaluation results are arranged at a plurality of measurement positions on the map 60 .
- For example, the CPU 101 sets the shape of the mark 61 to "○" for a measurement position whose comprehensive evaluation result belongs to rank "A", to "△" for a measurement position whose comprehensive evaluation result belongs to rank "B", and to "×" for a measurement position whose comprehensive evaluation result belongs to rank "C". Accordingly, by checking the map 60, the user can easily grasp the rank of the comprehensive evaluation result for each measurement position.
- a plurality of measurement positions are located on a spherical surface with a reference point as the center and a radius corresponding to the distance between the reference point and the imaging unit 202 of the measurement head 200 . Therefore, as shown in FIG. 8, map 60 is hemispherical. In other words, the map 60 is expressed in a polar coordinate system with the origin at the center of the sphere.
- Each measurement position is represented by a radius vector and two declination angles in a polar coordinate system.
- a radius is the distance between a reference point and a reference position.
- the two declinations are represented by latitude and longitude on the sphere.
- The CPU 101 may include, in the map 60, a plurality of meridians 65 and a plurality of latitude lines 66 on the spherical surface so that the user understands that the map 60 is hemispherical. This makes it easier for the user to grasp the positional relationship between the plurality of measurement positions in the three-dimensional space.
- the map 60 is displayed so that the reference position is the center.
- the CPU 101 may move/rotate the map 60 according to the user's operation.
- the CPU 101 rotates or moves the map 60 in the drag direction in response to a mouse drag operation.
- the CPU 101 may rotate the map 60 about a designated rotation axis according to user's operation.
- the axis of rotation is specified from three mutually orthogonal axes that pass through the reference point.
- For example, the CPU 101 presents the map 60 as viewed from a horizontal direction (a direction perpendicular to the line connecting the reference point and the reference position) or from an oblique direction of 45° (a direction at an angle of 45° to the line connecting the reference point and the reference position). This allows the user to observe the map 60 from a desired viewpoint. As a result, user convenience is improved.
- the CPU 101 may include the work model WM in the position and orientation indicated by the model position and orientation data 155 in the map 60 . This allows the user to easily grasp the positional relationship between the workpiece W and each measurement position.
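As a rough sketch, such a hemispherical map with rank-dependent marks could be rendered by projecting each measurement position's longitude and latitude onto the view plane; the orthographic projection and the matplotlib rendering below are assumptions for illustration, not the actual implementation of the map generation unit 141.

```python
import numpy as np
import matplotlib.pyplot as plt

MARKERS = {"A": "o", "B": "^", "C": "x"}  # stand-ins for the circle / triangle / cross marks

def plot_map(evaluated_positions):
    """evaluated_positions: iterable of (longitude_deg, latitude_deg, rank).
    Plots each measurement position of the hemisphere as seen from above the zenith."""
    fig, ax = plt.subplots()
    for lon, lat, rank in evaluated_positions:
        lo, la = np.radians(lon), np.radians(lat)
        x, y = np.cos(la) * np.cos(lo), np.cos(la) * np.sin(lo)  # orthographic projection
        ax.scatter(x, y, marker=MARKERS[rank], c="k")
    ax.set_aspect("equal")
    ax.set_title("Measurement positions and overall evaluation rank")
    plt.show()
```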
- the CPU 101 changes the shape of the mark 61 according to the value of the selected evaluation item.
- For example, the CPU 101 sets the shape of the mark 61 to "○" for a measurement position where the correlation value is equal to or greater than the first threshold, to "△" for a measurement position where the correlation value is greater than or equal to a fourth threshold and less than the first threshold, and to "×" for a measurement position where the correlation value is less than the fourth threshold.
- the fourth threshold is smaller than the first threshold.
- Thresholds are also set in advance for "repeatability” and “linearity”, and the ranks of “repeatability” and “linearity” are evaluated according to the results of comparison with the thresholds.
- each evaluation item may be classified into two ranks, or may be classified into four or more ranks (for example, five ranks).
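The threshold-based assignment of mark shapes can be illustrated with a short sketch. The threshold values below are placeholders; the specification only states that the fourth threshold is smaller than the first threshold.

```python
def mark_for_correlation(correlation, first_threshold=0.90, fourth_threshold=0.70):
    """Map a correlation value to a mark shape using the three-level scheme
    described above. Threshold values are placeholders."""
    if correlation >= first_threshold:
        return "○"   # highest rank
    if correlation >= fourth_threshold:
        return "△"   # middle rank
    return "×"       # lowest rank

print([mark_for_correlation(c) for c in (0.95, 0.80, 0.55)])  # ['○', '△', '×']
```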
- The CPU 101 accepts selection of any mark 61 on the map 60.
- Suppose that the mark 61a is selected.
- In the area 83, the evaluation result corresponding to the measurement position of the mark 61a selected on the map 60 is displayed. Specifically, the repeatability, linearity, and correlation value are displayed. Furthermore, the determination result of whether or not each of the above conditions (a) to (c) is satisfied is also displayed in the area 83.
- If the condition (a) that the correlation value is equal to or greater than the predetermined first threshold is satisfied, "correlation value: OK" is displayed, as shown in FIG. 8. If the condition (c) that the movement error (linearity) is equal to or less than the predetermined third threshold is not satisfied, "linearity: NG" is displayed, as shown in FIG. 8.
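The per-item OK/NG judgment shown in the area 83 can be sketched as follows. The thresholds are placeholders, and the display strings simply follow the examples given above.

```python
def judge_conditions(correlation, repeatability, linearity,
                     first_threshold=0.90, second_threshold=0.05, third_threshold=0.10):
    """Evaluate conditions (a)-(c) and return display strings such as
    'correlation value: OK' or 'linearity: NG'. Thresholds are placeholders."""
    results = {
        "correlation value": correlation >= first_threshold,  # condition (a)
        "repeatability": repeatability <= second_threshold,   # condition (b)
        "linearity": linearity <= third_threshold,            # condition (c)
    }
    return [f"{name}: {'OK' if ok else 'NG'}" for name, ok in results.items()]

print(judge_conditions(correlation=0.93, repeatability=0.03, linearity=0.25))
# ['correlation value: OK', 'repeatability: OK', 'linearity: NG']
```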
- In the area 84, a point cloud state diagram generated from the three-dimensional point cloud data corresponding to the measurement position of the mark 61a selected on the map 60 is displayed.
- A virtual image corresponding to the measurement position of the mark 61a selected on the map 60 is displayed in the area 85.
- The virtual image shows the workpiece model as viewed from the measurement position. Therefore, the virtual image corresponds to an ideal image as viewed from that measurement position.
- By comparing the point cloud state diagram with the virtual image, the user can visually check the quality of the measurement result of the position and orientation of the workpiece W.
- The conditions for measuring the position and orientation of the workpiece W include imaging conditions and search conditions.
- The imaging conditions include, for example, the intensity of the illumination light projected onto the workpiece W from the projection unit 201, the shutter speed of the imaging unit 202, and the camera gain.
- The search conditions include detection conditions of the workpiece detection unit 122 (for example, a threshold set for the correlation value between the three-dimensional point cloud data and the template data 153).
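For illustration, the imaging and search conditions could be collected in a structure such as the one below. All keys and values are hypothetical; the document only names the kinds of parameters involved.

```python
# Hypothetical parameter names and values; the specification only states that
# imaging conditions (illumination intensity, shutter speed, camera gain) and
# search conditions (e.g. a correlation threshold) are set before measurement.
measurement_conditions = {
    "imaging": {
        "illumination_intensity": 80,   # percent of maximum projector output
        "shutter_speed_us": 10_000,     # exposure time in microseconds
        "camera_gain_db": 6.0,
    },
    "search": {
        "correlation_threshold": 0.80,  # minimum correlation for a valid detection
        "search_range": "full_image",
    },
}
```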
- The user sets the conditions for measuring the position and orientation of the workpiece W while checking the performance report screen 80 shown in FIG. 8.
- For example, the user selects the comprehensive evaluation using the radio button 82 and checks the map 60.
- If the comprehensive evaluation result of the measurement positions over a wide range centered on the zenith of the hemispherical map 60 is rank "A", the user determines that the measurement conditions do not need to be changed.
- If the distribution range of the measurement positions whose comprehensive evaluation result is rank "A" is narrow, or if the distribution range is wide but part of it contains measurement positions of rank "B" or "C", the user determines that the measurement conditions need to be changed.
- In that case, the user designates on the map 60 a measurement position whose comprehensive evaluation result is rank "B" or "C". The user then checks the details of the evaluation result displayed in the area 83 and narrows down the cause of the degraded measurement performance. The user tries the following adjustments (A) to (C) according to the identified cause.
- For example, the user examines whether other conditions related to the matching between the three-dimensional point cloud data and the template data 153 (for example, the search range within the captured image, or the type and number of template data 153) can be improved so that correct candidates are measured.
- The user can easily set the conditions for measuring the position and orientation of the workpiece W by trying adjustments (A) to (C) while checking the performance report screen 80.
- The CPU 101 may vary the color of the mark 61 displayed at each measurement position on the map 60 according to the evaluation result. In this case, the shape of the mark 61 may be the same regardless of the evaluation result.
- The method by which the determination unit 134 determines the rank of the comprehensive evaluation result is not limited to the method described above.
- For example, the determination unit 134 may determine that the comprehensive evaluation result is rank "A" when condition (a) is satisfied and one of conditions (b) and (c) is satisfied.
- Alternatively, the determination unit 134 may determine that the comprehensive evaluation result is rank "A" when a condition (d) that the correlation value is equal to or greater than a predetermined fifth threshold is satisfied.
- The fifth threshold is greater than the first threshold.
- The CPU 101 may generate a list that associates the measurement positions with the evaluation results.
- The CPU 101 may create a file that internally represents the map 60 in list format, and may save the file or output it to an external device.
- The CPU 101 can present the map 60 again by reading the saved file.
- An external device that receives the file can present the map 60 by reading the file.
- The file is data in which the measurement positions and the evaluation results are associated with each other.
- The CPU 101 may use a plurality of saved files to generate a combined map by combining the plurality of maps 60 corresponding to those files. For example, by combining a plurality of maps 60 with different movement ranges of the measurement head 200, a wide-range combined map covering all of the movement ranges can be generated. Even when a plurality of files result from a plurality of evaluations, such as an initial evaluation and an additional evaluation, the plurality of maps 60 are aggregated into a single combined map. This improves user convenience.
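Combining several saved maps could look like the following sketch, assuming each file is a JSON list of position/evaluation records; the actual file format is not specified in the document.

```python
import json

def merge_map_files(paths):
    """Merge several saved map files into one combined map.
    Assumes each file is a JSON list of {"position": [...], "evaluation": ...}
    records, which is an assumption, not the documented format."""
    combined = {}
    for path in paths:
        with open(path, encoding="utf-8") as f:
            for entry in json.load(f):
                # A later file overwrites an earlier one at an identical position.
                combined[tuple(entry["position"])] = entry["evaluation"]
    return combined

# combined = merge_map_files(["initial_evaluation.json", "additional_evaluation.json"])
```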
- (Configuration 1) A support system (1) comprising: an imaging device (200) that images an object (W); a setting unit (11, 101) that sets a movement range of the imaging device; a robot (300) that sequentially moves the imaging device to a plurality of measurement positions within the movement range; a measurement unit (12, 101) that, for each of the plurality of measurement positions, measures the position and orientation of the object using a captured image obtained by imaging with the imaging device; an evaluation unit (13, 101) that evaluates the measurement result by the measurement unit; and a presentation unit (14, 101) that presents a map (60) indicating the correspondence between each of the plurality of measurement positions and the evaluation result by the evaluation unit.
- (Configuration 2) The support system according to configuration 1, wherein the evaluation unit calculates at least one evaluation value representing the evaluation result, and the presentation unit receives designation of one measurement position among the plurality of measurement positions on the map and, in response to the designation of the one measurement position, presents the at least one evaluation value corresponding to the one measurement position.
- (Configuration 4) The support system according to configuration 3, wherein the setting unit sets a reference position within the movement range and registers the position and orientation of the model of the object so as to match the position and orientation of the object measured from the captured image corresponding to the reference position, and the presentation unit generates, in response to the designation of the one measurement position, a virtual image of the model as viewed from the one measurement position and presents the virtual image.
- (Configuration 5) The support system according to configuration 2, wherein the measurement unit generates three-dimensional point cloud data of the visual field area of the imaging device based on the captured image and measures the position and orientation based on matching between the three-dimensional point cloud data and template data, and the at least one evaluation value includes a correlation value between the three-dimensional point cloud data and the template data.
- (Configuration 8) The support system according to any one of configurations 1 to 7, wherein the plurality of measurement positions exist on the same spherical surface, and the map is expressed in a polar coordinate system whose origin is the center of the sphere.
- (Configuration 9) An image processing device (100) comprising: a setting unit (11, 101) that sets a movement range of an imaging device that images an object (W); a measurement unit (12, 101) that, for each of a plurality of measurement positions within the movement range, measures the position and orientation of the object using a captured image obtained from the imaging device (200) located at that measurement position; an evaluation unit (13, 101) that evaluates the measurement result by the measurement unit; and a presentation unit (14, 101) that presents a map (60) showing the correspondence between each of the plurality of measurement positions and the evaluation result by the evaluation unit.
- (Configuration 10) A support method comprising: setting a movement range of an imaging device (200) that images an object (W); sequentially moving the imaging device to a plurality of measurement positions within the movement range using a robot; measuring, for each of the plurality of measurement positions, the position and orientation of the object using a captured image obtained by imaging with the imaging device; evaluating the measurement result of the position and orientation; and presenting a map showing the correspondence between each of the plurality of measurement positions and the evaluation result.
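The overall flow of the support method can be summarized in the following sketch. The unit interfaces (method names and arguments) are hypothetical; only the sequence of steps comes from configuration 10.

```python
def run_support(setting_unit, robot, measurement_unit, evaluation_unit, presentation_unit):
    """Sequence of the support method: set the movement range, move the imaging
    device, measure, evaluate, and present the map. APIs are hypothetical."""
    movement_range = setting_unit.set_movement_range()
    results = {}
    for position in movement_range.measurement_positions():
        robot.move_imaging_device_to(position)        # move the measurement head
        image = measurement_unit.capture()            # image the object
        pose = measurement_unit.measure_pose(image)   # measure position and orientation
        results[position] = evaluation_unit.evaluate(pose)
    presentation_unit.present_map(results)            # map of positions vs. evaluation results
```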
- 1 support system, 11 setting unit, 12 measurement unit, 13 evaluation unit, 14 presentation unit, 15 storage unit, 60 map, 61, 61a mark, 65 longitude line, 66 latitude line, 70 setting screen, 71, 76, 77, 81, 83-85 areas, 72-75 sub-areas, 72a, 72b, 73a, 73b, 74a, 74b, 75d, 78 buttons, 75a-75c input fields, 80 performance report screen, 82 radio buttons, 100 image sensor controller, 101 CPU, 102 main memory, 103 hard disk, 104 measurement head interface, 104a image buffer, 105 input interface, 106 display controller, 107 communication interface, 108 data reader/writer, 109 bus, 111 movement range determination unit, 112 registration unit, 113 movement command unit, 121 point cloud data generation unit, 122 workpiece detection unit, 131 correlation value calculation unit, 132 repeatability calculation unit, 133 movement error calculation unit, 134 determination unit, 141 map generation unit, 142 virtual image generation unit,
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mechanical Engineering (AREA)
- Robotics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Software Systems (AREA)
- Geometry (AREA)
- Computer Graphics (AREA)
- Multimedia (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Manipulator (AREA)
- Image Analysis (AREA)
Abstract
Description
An example of a situation to which the present invention is applied will be described with reference to FIG. 1. FIG. 1 is a schematic diagram showing the overall configuration of the support system according to the embodiment. The support system 1 illustrated in FIG. 1 supports the setting of conditions for measuring the position and orientation of a workpiece W, which is the target object, using images in which the workpiece W appears. The support system 1 is incorporated into, for example, a production line and is used to control picking of workpieces W loaded in bulk.
<A. Example of Workpieces Loaded in Bulk>
FIG. 2 is a diagram showing an example of workpieces loaded in bulk. FIG. 2 shows a plurality of workpieces W loaded in bulk in a container 2. The workpieces W loaded in bulk in the container 2 are picked one by one in sequence.
The image sensor controller 100 is typically a computer having a general-purpose architecture, and performs the support processing according to the present embodiment by executing a program (instruction codes) installed in advance. Such a program is typically distributed in a state stored in various recording media, or is installed in the image sensor controller 100 via a network or the like.
The CPU 101 calculates the positions and orientations of the workpiece W, the measurement head 200, and the robot 300 using coordinate values in a plurality of coordinate systems.
The coordinate transformation matrices satisfy
BHC = BHF · FHC
As described above, the coordinate transformation matrix FHC is represented by a fixed value. Therefore, the CPU 101 can calculate, from the position and orientation of the measurement head 200 at the movement destination, the movement command to be output to the robot controller 400.
The coordinate transformation matrices also satisfy
BHW = BHF · FHC · CHW
As described above, the coordinate transformation matrix CHW indicates the position and orientation of the workpiece W in the camera coordinate system and is detected based on the captured image received from the measurement head 200. Therefore, the CPU 101 can calculate the position and orientation of the workpiece W in the robot base coordinate system using the movement command output to the robot controller 400 and the measurement result based on the captured image.
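With 4x4 homogeneous transformation matrices, the relation BHW = BHF · FHC · CHW can be evaluated directly, for example with NumPy. The numerical values below are placeholders used only to show the matrix chain.

```python
import numpy as np

def homogeneous(rotation, translation):
    """Build a 4x4 homogeneous transformation matrix from a rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Placeholder values: flange pose in the robot base frame (BHF), hand-eye
# calibration result (FHC) and the workpiece pose measured in the camera frame (CHW).
BHF = homogeneous(np.eye(3), [400.0, 0.0, 500.0])
FHC = homogeneous(np.eye(3), [0.0, 50.0, 80.0])
CHW = homogeneous(np.eye(3), [10.0, -5.0, 300.0])

BHW = BHF @ FHC @ CHW   # workpiece pose in the robot base frame
```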
FIG. 5 is a block diagram showing an example of the functional configuration of the image sensor controller shown in FIG. 1. As shown in FIG. 5, the image sensor controller 100 includes a setting unit 11, a measurement unit 12, an evaluation unit 13, a presentation unit 14, and a storage unit 15. The setting unit 11, the measurement unit 12, the evaluation unit 13, and the presentation unit 14 are realized by the CPU 101 shown in FIG. 3 executing a program stored in the hard disk 103. The storage unit 15 is realized by the main memory 102 and the hard disk 103 shown in FIG. 3.
The storage unit 15 stores calibration data 151, workpiece shape data 152, a plurality of pieces of template data 153, a measurement position list 154, and model position and orientation data 155.
The setting unit 11 includes a movement range determination unit 111, a registration unit 112, and a movement command unit 113.
As described above, the coordinate transformation matrices satisfy
BHC = BHF · FHC
Therefore, the movement command unit 113 calculates the coordinate transformation matrix BHF using the read coordinate transformation matrix BHC and the coordinate transformation matrix FHC indicated by the calibration data 151. The movement command unit 113 then generates a movement command indicating the coordinate transformation matrix BHF obtained by this calculation.
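Since BHC = BHF · FHC and FHC is fixed, the flange pose to command can be obtained by multiplying by the inverse of FHC. A minimal sketch with placeholder values:

```python
import numpy as np

# BHC: target camera pose in the base frame (e.g. read from the measurement position list).
# FHC: fixed hand-eye transform from the calibration data. Values are placeholders.
BHC = np.eye(4); BHC[:3, 3] = [400.0, 50.0, 580.0]
FHC = np.eye(4); FHC[:3, 3] = [0.0, 50.0, 80.0]

BHF = BHC @ np.linalg.inv(FHC)   # flange pose implied by BHC = BHF @ FHC
```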
The measurement unit 12 measures the position and orientation of the workpiece W by performing three-dimensional measurement processing, which measures the three-dimensional shape of the subject from the captured image acquired from the measurement head 200, and search processing, which searches the measured three-dimensional shape for the shape of the workpiece W. To this end, the measurement unit 12 includes a point cloud data generation unit 121 that performs the three-dimensional measurement processing and a workpiece detection unit 122 that performs the search processing.
The evaluation unit 13 includes a correlation value calculation unit 131, a repeatability calculation unit 132, a movement error calculation unit 133, and a determination unit 134.
(a) The correlation value is equal to or greater than a predetermined first threshold.
(b) The repeatability is equal to or less than a predetermined second threshold.
(c) The movement error is equal to or less than a predetermined third threshold.
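One possible reading of how these conditions could be combined into a comprehensive rank is sketched below; the determination unit 134 is explicitly not limited to a single rule, and the thresholds here are placeholders.

```python
def comprehensive_rank(correlation, repeatability, movement_error,
                       first_threshold=0.90, second_threshold=0.05, third_threshold=0.10):
    """Example rank rule: 'A' when all of (a)-(c) hold, 'B' when only some hold,
    'C' when none hold. This is an illustrative assumption, not the fixed rule."""
    satisfied = [
        correlation >= first_threshold,      # condition (a)
        repeatability <= second_threshold,   # condition (b)
        movement_error <= third_threshold,   # condition (c)
    ]
    if all(satisfied):
        return "A"
    if any(satisfied):
        return "B"
    return "C"

print(comprehensive_rank(0.93, 0.02, 0.25))  # -> 'B' with these placeholder values
```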
The presentation unit 14 includes a map generation unit 141, a virtual image generation unit 142, and a screen generation unit 143.
FIG. 6 is a flowchart showing the flow of the support processing in the support system. FIG. 6 shows the flow of processing performed by the CPU 101 of the image sensor controller 100.
The CPU 101 displays a screen for supporting the setting of the movement range (hereinafter referred to as the "setting screen") on the display 500, and registers (sets) information on the movement range in response to input to the setting screen.
FIG. 8 is a diagram showing an example of the performance report screen. As shown in FIG. 8, the performance report screen 80 includes areas 81 and 83 to 85.
The conditions for measuring the position and orientation of the workpiece W include imaging conditions and search conditions. The imaging conditions include, for example, the intensity of the illumination light projected onto the workpiece W from the projection unit 201, the shutter speed of the imaging unit 202, and the camera gain. The search conditions include detection conditions of the workpiece detection unit 122 (for example, a threshold set for the correlation value between the three-dimensional point cloud data and the template data 153).
The CPU 101 may vary the color of the mark 61 displayed at each measurement position on the map 60 according to the evaluation result. In this case, the shape of the mark 61 may be the same regardless of the evaluation result.
As described above, the present embodiment includes the following disclosures.
(Configuration 1)
A support system (1) comprising:
an imaging device (200) that images an object (W);
a setting unit (11, 101) that sets a movement range of the imaging device;
a robot (300) that sequentially moves the imaging device to a plurality of measurement positions within the movement range;
a measurement unit (12, 101) that, for each of the plurality of measurement positions, measures the position and orientation of the object using a captured image obtained by imaging with the imaging device;
an evaluation unit (13, 101) that evaluates the measurement result by the measurement unit; and
a presentation unit (14, 101) that presents a map (60) indicating the correspondence between each of the plurality of measurement positions and the evaluation result by the evaluation unit.
(Configuration 2)
The support system according to configuration 1, wherein
the evaluation unit calculates at least one evaluation value representing the evaluation result, and
the presentation unit
accepts, on the map, designation of one measurement position among the plurality of measurement positions, and
presents, in response to the designation of the one measurement position, the at least one evaluation value corresponding to the one measurement position.
(Configuration 3)
The support system according to configuration 1, wherein
the measurement unit generates three-dimensional point cloud data of the visual field area of the imaging device based on the captured image corresponding to each of the plurality of measurement positions, and
the presentation unit
accepts, on the map, designation of one measurement position among the plurality of measurement positions, and
presents, in response to the designation of the one measurement position, an image generated from the three-dimensional point cloud data corresponding to the one measurement position.
(Configuration 4)
The support system according to configuration 3, wherein
the setting unit
sets a reference position within the movement range, and
registers the position and orientation of a model of the object so as to match the position and orientation of the object measured from the captured image corresponding to the reference position, and
the presentation unit
generates, in response to the designation of the one measurement position, a virtual image of the model as viewed from the one measurement position, and
presents the virtual image.
(Configuration 5)
The support system according to configuration 2, wherein
the measurement unit
generates three-dimensional point cloud data of the visual field area of the imaging device based on the captured image, and
measures the position and orientation based on matching between the three-dimensional point cloud data and template data, and
the at least one evaluation value includes a correlation value between the three-dimensional point cloud data and the template data.
(Configuration 6)
The support system according to configuration 2 or 5, wherein
the measurement unit performs measurement a plurality of times for each of the plurality of measurement positions, and
the at least one evaluation value includes a value indicating the repeatability of the plurality of measurements.
(Configuration 7)
The support system according to any one of configurations 2, 5, and 6, wherein
the plurality of measurement positions include a reference position,
the measurement unit measures the position and orientation of the object in a camera coordinate system based on the imaging device, and
the at least one evaluation value includes, for each of the plurality of measurement positions, a value indicating the difference between a first change amount of the position and orientation of the object in the camera coordinate system, estimated based on the amount of movement from the reference position, and a second change amount from the position and orientation measured at the reference position to the position and orientation measured at that measurement position.
(Configuration 8)
The support system according to any one of configurations 1 to 7, wherein
the plurality of measurement positions exist on the same spherical surface, and
the map is expressed in a polar coordinate system whose origin is the center of the sphere.
(Configuration 9)
An image processing device (100) comprising:
a setting unit (11, 101) that sets a movement range of an imaging device that images an object (W);
a measurement unit (12, 101) that, for each of a plurality of measurement positions within the movement range, measures the position and orientation of the object using a captured image obtained from the imaging device (200) located at that measurement position;
an evaluation unit (13, 101) that evaluates the measurement result by the measurement unit; and
a presentation unit (14, 101) that presents a map (60) indicating the correspondence between each of the plurality of measurement positions and the evaluation result by the evaluation unit.
(Configuration 10)
A support method comprising:
setting a movement range of an imaging device (200) that images an object (W);
sequentially moving the imaging device to a plurality of measurement positions within the movement range using a robot;
measuring, for each of the plurality of measurement positions, the position and orientation of the object using a captured image obtained by imaging with the imaging device;
evaluating the measurement result of the position and orientation; and
presenting a map indicating the correspondence between each of the plurality of measurement positions and the evaluation result.
(Configuration 11)
A program that causes a computer to execute the support method according to configuration 10.
Claims (11)
- A support system comprising:
an imaging device that images an object;
a setting unit that sets a movement range of the imaging device;
a robot that sequentially moves the imaging device to a plurality of measurement positions within the movement range;
a measurement unit that, for each of the plurality of measurement positions, measures the position and orientation of the object using a captured image obtained by imaging with the imaging device;
an evaluation unit that evaluates the measurement result by the measurement unit; and
a presentation unit that presents a map indicating the correspondence between each of the plurality of measurement positions and the evaluation result by the evaluation unit.
- The support system according to claim 1, wherein
the evaluation unit calculates at least one evaluation value representing the evaluation result, and
the presentation unit
accepts, on the map, designation of one measurement position among the plurality of measurement positions, and
presents, in response to the designation of the one measurement position, the at least one evaluation value corresponding to the one measurement position.
- The support system according to claim 1, wherein
the measurement unit generates three-dimensional point cloud data of the visual field area of the imaging device based on the captured image corresponding to each of the plurality of measurement positions, and
the presentation unit
accepts, on the map, designation of one measurement position among the plurality of measurement positions, and
presents, in response to the designation of the one measurement position, an image generated from the three-dimensional point cloud data corresponding to the one measurement position.
- The support system according to claim 3, wherein
the setting unit
sets a reference position within the movement range, and
registers the position and orientation of a model of the object so as to match the position and orientation of the object measured from the captured image corresponding to the reference position, and
the presentation unit
generates, in response to the designation of the one measurement position, a virtual image of the model as viewed from the one measurement position, and
presents the virtual image.
- The support system according to claim 2, wherein
the measurement unit
generates three-dimensional point cloud data of the visual field area of the imaging device based on the captured image, and
measures the position and orientation based on the three-dimensional point cloud data and template data, and
the at least one evaluation value includes a correlation value between the three-dimensional point cloud data and the template data.
- The support system according to claim 2 or 5, wherein
the measurement unit performs measurement a plurality of times for each of the plurality of measurement positions, and
the at least one evaluation value includes a value indicating the repeatability of the plurality of measurements.
- The support system according to any one of claims 2, 5, and 6, wherein
the plurality of measurement positions include a reference position,
the measurement unit measures the position and orientation of the object in a camera coordinate system based on the imaging device, and
the at least one evaluation value includes, for each of the plurality of measurement positions, a value indicating the difference between a first change amount of the position and orientation of the object in the camera coordinate system, estimated based on the amount of movement from the reference position, and a second change amount from the position and orientation measured at the reference position to the position and orientation measured at that measurement position.
- The support system according to any one of claims 1 to 7, wherein
the plurality of measurement positions exist on the same spherical surface, and
the map is expressed in a polar coordinate system whose origin is the center of the sphere.
- An image processing device comprising:
a setting unit that sets a movement range of an imaging device that images an object;
a measurement unit that, for each of a plurality of measurement positions within the movement range, measures the position and orientation of the object using a captured image obtained from the imaging device located at that measurement position;
an evaluation unit that evaluates the measurement result by the measurement unit; and
a presentation unit that presents a map indicating the correspondence between each of the plurality of measurement positions and the evaluation result by the evaluation unit.
- A support method comprising:
setting a movement range of an imaging device that images an object;
sequentially moving the imaging device to a plurality of measurement positions within the movement range using a robot;
measuring, for each of the plurality of measurement positions, the position and orientation of the object using a captured image obtained by imaging with the imaging device;
evaluating the measurement result of the position and orientation; and
presenting a map indicating the correspondence between each of the plurality of measurement positions and the evaluation result.
- A program that causes a computer to execute the support method according to claim 10.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP21925720.1A EP4292777A1 (en) | 2021-02-12 | 2021-03-10 | Assistance system, image processing device, assistance method and program |
US18/274,357 US20240083038A1 (en) | 2021-02-12 | 2021-03-10 | Assistance system, image processing device, assistance method and non-transitory computer-readable storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-020864 | 2021-02-12 | ||
JP2021020864A JP7533265B2 (ja) | 2021-02-12 | 2021-02-12 | 支援システム、画像処理装置、支援方法およびプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022172471A1 true WO2022172471A1 (ja) | 2022-08-18 |
Family
ID=82837631
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/009562 WO2022172471A1 (ja) | 2021-02-12 | 2021-03-10 | 支援システム、画像処理装置、支援方法およびプログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240083038A1 (ja) |
EP (1) | EP4292777A1 (ja) |
JP (1) | JP7533265B2 (ja) |
WO (1) | WO2022172471A1 (ja) |
-
2021
- 2021-02-12 JP JP2021020864A patent/JP7533265B2/ja active Active
- 2021-03-10 WO PCT/JP2021/009562 patent/WO2022172471A1/ja active Application Filing
- 2021-03-10 US US18/274,357 patent/US20240083038A1/en active Pending
- 2021-03-10 EP EP21925720.1A patent/EP4292777A1/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08313225A (ja) * | 1995-03-16 | 1996-11-29 | Nippondenso Co Ltd | 外観検査装置 |
JP2019155535A (ja) | 2018-03-13 | 2019-09-19 | オムロン株式会社 | ワークピッキング装置及びワークピッキング方法 |
JP2019185239A (ja) * | 2018-04-05 | 2019-10-24 | オムロン株式会社 | 物体認識処理装置及び方法、並びに、物体ピッキング装置及び方法 |
JP2020163502A (ja) * | 2019-03-28 | 2020-10-08 | セイコーエプソン株式会社 | 物体検出方法、物体検出装置およびロボットシステム |
Also Published As
Publication number | Publication date |
---|---|
JP7533265B2 (ja) | 2024-08-14 |
JP2022123510A (ja) | 2022-08-24 |
EP4292777A1 (en) | 2023-12-20 |
US20240083038A1 (en) | 2024-03-14 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21925720 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18274357 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2021925720 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021925720 Country of ref document: EP Effective date: 20230912 |