WO2019239848A1 - Robot control system - Google Patents

Robot control system

Info

Publication number
WO2019239848A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
calibration
unit
camera
control system
Prior art date
Application number
PCT/JP2019/020623
Other languages
French (fr)
Japanese (ja)
Inventor
中塚 均
豊男 飯田
Original Assignee
オムロン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by オムロン株式会社
Publication of WO2019239848A1 publication Critical patent/WO2019239848A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 13/00 Controls for manipulators
    • B25J 13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 9/00 Programme-controlled manipulators
    • B25J 9/10 Programme-controlled manipulators characterised by positioning means for manipulator elements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • the present technology relates to a robot control system for controlling a robot using three-dimensional coordinates calculated based on an image captured by an imaging unit.
  • Patent Document 1 (Japanese Patent Application Laid-Open No. 2011-112402) discloses a configuration that improves the convenience of a three-dimensional visual sensor by easily displaying the three-dimensional shape of a set effective area together with its surroundings.
  • the configuration disclosed in Patent Document 1 considers calibration between a three-dimensional coordinate system (measurement coordinate system) for stereo measurement and a camera coordinate system, but does not consider calibration with respect to the coordinate system used for controlling the robot.
  • a robot control system includes: an imaging unit arranged so as to include an action unit of the robot in its field of view; a measurement unit that measures, based on an image captured by the imaging unit, the three-dimensional coordinates of an arbitrary object present in the field of view; a command generation unit that generates a command for positioning the action unit of the robot according to a pre-calculated correspondence between the measured three-dimensional coordinates and the position and orientation of the action unit; a calibration execution unit that executes a calibration for calculating the correspondence; and a setting reception unit that receives the setting of a calibration area, which is an area in which a reference object associated with the action unit of the robot is to be arranged in the calibration.
  • the setting reception unit may indicate the range of the calibration area that is set with the imaging unit as a reference.
  • because the calibration area is set with the imaging unit as a reference, the user can easily understand the range of the calibration area.
  • the setting reception unit may display the field of view range of the imaging unit.
  • the range in which the calibration area is set can be grasped at a glance with respect to the visual field range of the imaging unit.
  • the calibration area may be set as a rectangular parallelepiped based on the optical axis of the imaging unit.
  • the position where the reference object is arranged can be easily determined, and the operation range can be determined so as not to be out of the measurement range even when moved in the optical axis direction.
  • the size of the cross section of the rectangular parallelepiped set as the calibration area may be determined according to the field of view of the imaging unit and the distance from the imaging unit to the end face of the calibration area.
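
The geometry behind this determination can be sketched with a simple pinhole-model calculation. The following Python snippet is a minimal illustration only; the angular field of view and the distances are hypothetical values, not parameters taken from this disclosure:

```python
import math

def cross_section_size(fov_deg_x: float, fov_deg_y: float, distance: float):
    """Width and height of the imaging unit's view cross section at a given
    distance along the optical axis, assuming a simple pinhole model."""
    w = 2.0 * distance * math.tan(math.radians(fov_deg_x) / 2.0)
    h = 2.0 * distance * math.tan(math.radians(fov_deg_y) / 2.0)
    return w, h

# Hypothetical 40 deg x 30 deg field of view; a rectangular-parallelepiped
# calibration area must fit inside the smaller (nearer) cross section.
near_face, far_face = 500.0, 900.0  # distances to the end faces in mm
w, h = cross_section_size(40.0, 30.0, near_face)
print(f"maximum cuboid cross section: {w:.0f} mm x {h:.0f} mm")
```

Because the cross section grows linearly with distance, the admissible cross section of the cuboid is bounded by the end face nearer the imaging unit.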
  • the setting reception unit may receive a range setting on the optical axis of the imaging unit as the calibration area setting.
  • the area actually used in the optical axis direction of the imaging unit can be set as the calibration area.
  • the calibration execution unit may include an arrangement control unit that sequentially gives commands to the robot so that the reference object is sequentially arranged in the calibration area, and a calculation unit that calculates the correspondence based on the sets of the three-dimensional coordinates of the reference object and the position and orientation of the action unit of the robot obtained when the reference object is sequentially arranged in the calibration area.
  • the robot control system can be operated even by a user who has no specialized knowledge.
  • the calibration execution unit may include a position determination unit that determines a plurality of positions at which the reference object is to be arranged in the calibration area, a position display unit that indicates the relationship between one of the determined positions and the current position of the reference object, and a calculation unit that calculates the correspondence based on the sets of the three-dimensional coordinates of the reference object and the position and orientation of the action unit of the robot obtained when the reference object is sequentially arranged at the plurality of determined positions.
  • the user can perform the calibration simply by sequentially arranging the reference object at the indicated positions in accordance with the notifications from the position display unit, so the robot control system can be operated even by a user who has no specialized knowledge.
  • FIG. 1 is a schematic diagram showing an application example of the robot control system 1 according to the present embodiment.
  • robot control system 1 according to the present embodiment includes imaging unit 50 arranged to include action unit 8 of robot 2 in the field of view as a configuration for controlling robot 2.
  • the robot control system 1 further includes a measurement unit 52 that measures the three-dimensional coordinates of an arbitrary object existing in the field of view of the imaging unit 50 based on the image captured by the imaging unit 50.
  • the three-dimensional coordinates measured by the measurement unit 52 are output to the command generation unit 54 and the calibration execution unit 58.
  • the three-dimensional coordinates measured by the measurement unit 52 can include a position (a position on each axis) and a posture (a rotation angle about each axis) on a three-dimensional coordinate of a specific object.
  • the command generation unit 54 generates a command for positioning the action unit 8 of the robot 2. Specifically, the command generation unit 54 compares the designated target position with the three-dimensional coordinates from the measurement unit 52, and calculates a command according to the correspondence calculated in advance between the measured three-dimensional coordinates and the position and orientation of the action unit 8 of the robot 2 (typically, the calibration parameter 56).
  • the calibration parameter 56 is determined by the calibration execution unit 58. More specifically, the calibration execution unit 58 performs a calibration that calculates the calibration parameter 56 based on the three-dimensional coordinates from the measurement unit 52 and the setting of the spatial region used for calibration (the calibration region) received from the setting reception unit 60.
  • This calibration area is set by the user via the setting receiving unit 60. That is, the setting receiving unit 60 receives the setting of a calibration region, which is a region in which a reference object (for example, the reference plate 20) associated with the action unit 8 of the robot 2 is to be arranged.
  • As the calibration area, an arbitrary area within the effective field of view of the imaging unit 50 can be set.
  • the calibration for calculating the calibration parameter 56 can be executed efficiently.
  • FIG. 2 is a schematic diagram showing the overall configuration of the robot control system 1 according to the present embodiment.
  • the robot control system 1 measures, based on an image captured by an imaging unit arranged so as to include the action unit of the robot in its field of view, the three-dimensional coordinates of an arbitrary object present in the field of view of the imaging unit, and controls the robot based on the measured three-dimensional coordinates.
  • the robot control system 1 includes a robot 2, a 3D camera 10, an image measurement device 100, a control device 200, and a robot control device 300.
  • the robot 2 is a mechanism that performs an operation at an arbitrary position in accordance with a command from the robot control device 300.
  • a multi-joint robot is illustrated as a typical example of the robot 2, but it may be a SCARA robot or a parallel robot.
  • the robot 2 shown in FIG. 2 has one or a plurality of arms 4, and a hand piece 6 is attached to the tips of the one or the plurality of arms 4 (corresponding to the action part of the robot 2).
  • FIG. 2 schematically illustrates a state at the time of executing calibration as described later, and a reference plate 20 is mounted on the handpiece 6 as an example of a reference object.
  • the reference plate 20 is a reference object for determining the correspondence between the measured three-dimensional coordinates and position information indicating the position of the action part of the robot 2 in the calibration.
  • One or more markers 22 are drawn on the surface of the reference plate 20.
  • the 3D camera 10 is disposed so as to include the action part of the robot 2 (the tip of the arm 4 and the handpiece 6) in its field of view, and outputs an image captured of the field of view to the image measurement device 100 at every predetermined period or every predetermined event.
  • the robot control system 1 employs a configuration that can optically measure the three-dimensional coordinates of each part including the action part of the robot 2.
  • three-dimensional measurement may be realized using a technique called structured illumination.
  • in the structured illumination method, the subject is irradiated with measurement light, and the distance to the subject is measured based on an image captured while the measurement light is projected.
  • as the structured illumination method, a spatial encoding method, a phase shift method, a light-section method, or the like can be used.
  • the 3D camera 10 includes a light projecting unit that emits measurement light and an imaging unit that captures an image of the subject in a state where the measurement light is projected.
  • three-dimensional measurement may be realized using a multi-viewpoint camera.
  • the 3D camera 10 includes a plurality of cameras arranged so that the viewpoints are different from each other.
  • the 3D camera 10 is a stereo camera including a pair of cameras.
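
For the stereo configuration, the depth of a matched feature follows from its parallax between the paired images. Below is a minimal sketch of the standard rectified-stereo relationship Z = f * B / d; the focal length, baseline, and disparity are assumed example values, not parameters of the 3D camera 10:

```python
def depth_from_disparity(focal_px: float, baseline_mm: float,
                         disparity_px: float) -> float:
    """Depth along the optical axis for a rectified stereo pair (pinhole
    model): Z = f * B / d, valid when the feature is matched in both views."""
    if disparity_px <= 0.0:
        raise ValueError("disparity must be positive for a valid match")
    return focal_px * baseline_mm / disparity_px

# Hypothetical camera: 1400 px focal length, 60 mm baseline; a feature
# observed with 12 px of disparity lies at 1400 * 60 / 12 = 7000 mm.
print(depth_from_disparity(1400.0, 60.0, 12.0))
```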
  • the image measuring device 100 measures the three-dimensional coordinates of an arbitrary object existing in the field of view of the 3D camera 10 based on the image captured by the 3D camera 10.
  • with the structured illumination configuration, the image measurement apparatus 100 analyzes the position and displacement of the light-and-shade pattern included in the image output from the 3D camera 10, and thereby calculates the three-dimensional coordinates of the subject in the field of view (or a coordinate group indicating its three-dimensional shape).
  • with the multi-viewpoint configuration, the image measurement device 100 calculates the three-dimensional coordinates of the subject in the field of view (or a coordinate group indicating its three-dimensional shape) based on the parallax at each target position calculated by matching between the images.
  • the image measuring device 100 can also search for the three-dimensional coordinates of a specific object in the calculated three-dimensional coordinate group.
  • the image measuring apparatus 100 can search for the marker 22 drawn on the reference plate 20 by pattern matching and output the three-dimensional coordinates of each marker 22. Calibration is performed based on the three-dimensional coordinates of these markers 22.
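
The disclosure does not fix a particular matching algorithm; one common realization of such a pattern-matching search is normalized cross-correlation template matching, sketched below with OpenCV. The template image and acceptance threshold are assumptions for illustration; the returned 2D location would then be combined with the measured depth data to yield each marker's three-dimensional coordinates:

```python
import cv2
import numpy as np

def find_marker_2d(image: np.ndarray, template: np.ndarray,
                   threshold: float = 0.8):
    """Locate a marker in a grayscale image by normalized cross-correlation;
    returns the pixel coordinates of the best match's center, or None."""
    scores = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)
    if best_score < threshold:
        return None  # marker not found with sufficient confidence
    th, tw = template.shape[:2]
    return (best_loc[0] + tw // 2, best_loc[1] + th // 2)
```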
  • the control device 200 is typically composed of a PLC (programmable controller) or the like, and, based on the three-dimensional coordinates measured by the image measurement device 100, executes calibration and gives commands to the robot control device 300.
  • the robot control device 300 controls the robot 2 according to a command from the control device 200. More specifically, the robot control device 300 drives a servo motor that drives each axis of the robot 2.
  • FIG. 3 is a schematic diagram showing a configuration example of the 3D camera 10 included in the robot control system 1 according to the present embodiment.
  • the 3D camera 10 includes a processing unit 11, a light projecting unit 12, an imaging unit 13, a display unit 14, and a storage unit 15.
  • the processing unit 11 manages the entire process in the 3D camera 10.
  • the processing unit 11 typically includes a processor, a storage that stores an instruction code executed by the processor, and a memory that expands the instruction code.
  • the processor realizes various processes by expanding and executing the instruction code on the memory. All or part of the processing unit 11 may be implemented using a dedicated hardware circuit (for example, ASIC (Application Specific Integrated Circuit) or FPGA (Field-Programmable Gate Array)).
  • the display unit 14 notifies various information acquired or calculated by the 3D camera 10 to the outside.
  • the storage unit 15 stores an image captured by the image capturing unit 13, a preset calibration parameter, and the like.
  • the communication interface (I / F) unit 16 is in charge of data exchange between the 3D camera 10 and the image measurement apparatus 100.
  • FIG. 4 is a schematic diagram illustrating a configuration example of the image measurement device 100 included in the robot control system 1 according to the present embodiment.
  • the image measuring apparatus 100 is typically realized using a general-purpose computer.
  • the image measurement apparatus 100 includes a processor 102, a main memory 104, a storage 106, an input unit 108, a display unit 110, an optical drive 112, and a communication interface (I/F) unit 114. These components are connected via a processor bus 116.
  • the processor 102 is composed of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or the like, and realizes the various processes described later by reading programs stored in the storage 106 (for example, an OS (Operating System) 1060 and a three-dimensional measurement program 1062), developing them in the main memory 104, and executing them.
  • the main memory 104 includes a volatile storage device such as DRAM (Dynamic Random Access Memory) and SRAM (Static Random Access Memory).
  • the storage 106 includes, for example, a nonvolatile storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
  • the setting reception program 1066 executes processing for receiving a setting of a calibration area, which is an area in which a reference object associated with the action unit of the robot 2 is to be arranged, in camera / robot calibration as described later.
  • the input unit 108 is composed of a keyboard, a mouse, and the like, and accepts user operations.
  • the display unit 110 includes a display, various indicators, a printer, and the like, and outputs a processing result from the processor 102.
  • the communication interface unit 114 is in charge of data exchange between the 3D camera 10 and the image measurement device 100 and is in charge of data exchange between the image measurement device 100 and the control device 200.
  • the image measuring apparatus 100 includes an optical drive 112, which reads a computer-readable program from a recording medium 113 that stores it (for example, an optical recording medium such as a DVD (Digital Versatile Disc)) so that the program can be installed in the storage 106 or the like.
  • the three-dimensional measurement program 1062 and the setting reception program 1066 executed by the image measurement apparatus 100 may be installed via the computer-readable recording medium 113, or may be installed by downloading from a server apparatus on a network. Further, the functions provided by the three-dimensional measurement program 1062 and the setting reception program 1066 according to the present embodiment may be realized by using some of the modules provided by the OS.
  • FIG. 4 shows a configuration example in which the functions necessary for the image measurement apparatus 100 are provided by the processor 102 executing programs; however, some or all of these functions may be implemented using dedicated hardware circuits (for example, an ASIC or an FPGA).
  • FIG. 5 is a schematic diagram showing a configuration example of control device 200 included in robot control system 1 according to the present embodiment.
  • the control device 200 is typically realized using a PLC (programmable controller).
  • the control device 200 includes a processor 202, a main memory 204, a storage 206, a communication interface (I/F) unit 208, field network controllers 210 and 212, a USB (Universal Serial Bus) controller 214, a memory card interface 216, and a local bus controller 220. These components are connected via a processor bus 230.
  • the processor 202 corresponds to a calculation processing unit that executes control calculations and the like, and includes a CPU, a GPU, and the like. Specifically, the processor 202 reads out programs stored in the storage 206, develops them in the main memory 204, and executes them, thereby realizing control according to the control target and the various processes described later.
  • the main memory 204 is configured by a volatile storage device such as DRAM or SRAM.
  • the storage 206 corresponds to a storage unit, and includes, for example, a nonvolatile storage device such as an HDD or an SSD.
  • the storage 206 stores a calibration parameter 2060, a command generation program 2062, a calibration execution program 2064, and the like in addition to a system program for realizing basic functions.
  • the calibration parameter 2060 corresponds to a correspondence relationship calculated in advance between the measured three-dimensional coordinates and the position and orientation of the action part of the robot 2.
  • the command generation program 2062 executes a process for generating a command for positioning the action part of the robot 2 in accordance with the previously calculated correspondence (calibration parameter 2060) between the measured three-dimensional coordinates and the position and orientation of the action part of the robot 2.
  • the calibration execution program 2064 executes the calibration (the camera/robot calibration described later) for calculating the correspondence (calibration parameter 2060) between the measured three-dimensional coordinates and the position and orientation of the action part of the robot 2.
  • the communication interface unit 208 is in charge of data exchange between the image measuring device 100 and the control device 200.
  • the field network controllers 210 and 212 exchange data with an arbitrary device such as the robot controller 300 (see FIG. 2) via the field network. Although two field network controllers 210 and 212 are illustrated in FIG. 5, a single field network controller may be employed.
  • the USB controller 214 exchanges data with an arbitrary external device via a USB connection.
  • the memory card interface 216 receives a memory card 218 which is an example of a removable recording medium.
  • the memory card interface 216 can write data to the memory card 218 and read various data (such as logs and trace data) from the memory card 218.
  • the local bus controller 220 exchanges data with an arbitrary local IO unit via the local bus 222.
  • FIG. 5 illustrates a configuration example in which the necessary functions are provided by the processor 202 executing programs; however, some or all of these functions may be implemented using dedicated hardware circuits (for example, an ASIC or an FPGA).
  • FIG. 6 is a schematic diagram showing a configuration example of the robot control apparatus 300 included in the robot control system 1 according to the present embodiment.
  • the robot control device 300 calculates an operation pattern of each axis of the robot 2 in accordance with a command from the control device 200 (typically, a three-dimensional coordinate of a target position, a posture (orientation), a moving speed, etc.).
  • the robot control apparatus 300 includes a processor 302, a main memory 304, a storage 306, a communication interface (I / F) unit 308, and a drive controller 310. These components are connected via a processor bus 312.
  • the robot controller 300 further includes one or more servo drivers 320-1, 320-2, 320-3, ..., 320-n (hereinafter also collectively referred to as "servo drivers 320") connected to the drive controller 310.
  • the processor 302 corresponds to a calculation processing unit that executes control calculations and the like, and includes a CPU, a GPU, and the like. Specifically, the processor 302 implements various processes by reading out a program stored in the storage 306, developing it in the main memory 304, and executing it.
  • the main memory 304 is configured by a volatile storage device such as DRAM or SRAM.
  • the storage 306 corresponds to a storage unit, and includes, for example, a nonvolatile storage device such as an HDD or an SSD.
  • the storage 306 stores system programs related to robot control.
  • the communication interface unit 308 is in charge of data exchange between the control device 200 and the robot control device 300.
  • the drive controller 310 controls each of the servo drivers 320 so as to drive the motors 330-1, 330-2, 330-3, ..., 330-n (hereinafter also collectively referred to as "motors 330") that operate the respective axes of the robot 2.
  • the servo driver 320 drives the connected motor 330 with torque, acceleration, and rotation speed in a specified direction in accordance with a command from the drive controller 310.
  • FIG. 6 shows a configuration example in which the necessary functions are provided by the processor 302 executing programs; however, some or all of these functions may be implemented using dedicated hardware circuits (for example, an ASIC or an FPGA).
  • One of the multiple types of calibration is calibration of a coordinate system (hereinafter also referred to as “camera coordinate system”) defined based on the field of view of the 3D camera 10.
  • in the calibration of the camera coordinate system, a reference object on which a known pattern is drawn is sequentially arranged at a plurality of known positions, and the positions and orientations measured by the image measuring apparatus 100 corresponding to those known positions and orientations are acquired.
  • a correspondence relationship between the position and orientation of the reference object and the position and orientation on the three-dimensional coordinate system (typically, a calibration parameter that defines a conversion equation) is then calculated. By using this calibration parameter, the three-dimensional coordinates of an arbitrary subject arranged in the camera coordinate system can be accurately measured.
  • the position on the three-dimensional coordinate system means the coordinate specified by the value on each axis of the three-dimensional coordinate system, and the posture on the three-dimensional coordinate system means the orientation defined by the rotation about each axis of the three-dimensional coordinate system. That is, the position is a three-dimensional value indicating at which coordinate an object exists on the three-dimensional coordinate system, and the posture is likewise a three-dimensional value indicating in which direction the object is facing on the three-dimensional coordinate system.
  • Another one of the plurality of types of calibration is calibration of a coordinate system (hereinafter also referred to as “robot reference coordinate system”) defined based on the base position of the robot 2.
  • the calibration of the robot reference coordinate system is a process that guarantees that, when the three-dimensional coordinates of a target position and a target posture are given to the robot control device 300 as a command from the control device 200, the tip of the robot 2 moves to the designated target position.
  • An arithmetic expression for driving the motor 330 of the robot 2 in the robot control apparatus 300 is corrected by the calibration of the robot reference coordinate system.
  • in addition, the robot control system 1 can execute a calibration (also referred to as "camera/robot calibration") for determining the correspondence between the camera coordinate system and the robot reference coordinate system.
  • FIG. 7 is a diagram for explaining one aspect of camera / robot calibration in robot control system 1 according to the present embodiment.
  • in the robot control system 1, three-dimensional coordinates are measured based on an image captured by the 3D camera 10, and the robot 2 is controlled based on the measured values. It is therefore efficient to perform the camera/robot calibration within the field of view of the 3D camera 10. Furthermore, even within the field of view of the 3D camera 10, depending on the target application, not all of the field of view may be used; only a part of the spatial region may be used. In such a case, camera/robot calibration may be executed only for the spatial region that is actually used (that is, the spatial region where the action part of the robot 2 may exist).
  • a user can arbitrarily set a space area (hereinafter also referred to as “calibration area”) used for camera / robot calibration.
  • Camera / robot calibration is executed in accordance with a calibration area arbitrarily set by the user.
  • FIG. 8 is a schematic diagram showing an example of a user interface screen provided by robot control system 1 according to the present embodiment.
  • the setting of the calibration area 418 is received by the setting reception program 1066 of the image measurement apparatus 100, and the user interface screen 400 shown in FIG. 8 is displayed on the display unit 110 (FIG. 4) of the image measuring device 100.
  • the screen is not limited to the image measuring device 100, and may instead be displayed on a display device (not shown) connected to the control device 200; as long as the settings are given to the image measuring apparatus 100 and/or the control apparatus 200, any display and input form may be used.
  • the 3D camera 10 is optically designed so that the cross section of its effective field of view is rectangular. Therefore, in the user interface screen 400 shown in FIG. 8, the effective field of view of the 3D camera 10 is defined by two orthogonal directions corresponding to the rectangular cross section (referred to as the "X-axis direction" and the "Y-axis direction" for convenience). That is, the user interface screen 400 includes an X cross-section setting object 401 and a Y cross-section setting object 402.
  • the X section setting object 401 and the Y section setting object 402 have effective visual field displays 403 and 404, respectively.
  • the effective visual field displays 403 and 404 indicate a range in which the 3D camera 10 can measure the three-dimensional coordinates. That is, the visual field range 416 of the 3D camera 10 that is the imaging unit is also displayed.
  • the user sets an area actually used by the application, that is, a calibration area 418 on the user interface screen 400.
  • a measurement bottom surface setting bar 410 and a measurement top surface setting bar 411 are provided so as to extend over both the X cross section setting object 401 and the Y cross section setting object 402.
  • the user operates the measurement bottom surface setting bar 410 and the measurement top surface setting bar 411 to set the top surface and the bottom surface of the calibration area 418.
  • the uppermost surface and the lowermost surface set an effective range (effective range on the Z axis) along the optical axis AX of the 3D camera 10. That is, as a setting of the calibration area 418, a range setting on the optical axis AX of the 3D camera 10 that is the imaging unit can be set.
  • the lowermost surface set by the measurement lowermost surface setting bar 410 is set in consideration of the floor surface on which the robot 2 is arranged.
  • the top surface set by the measurement top surface setting bar 411 is set in consideration of the range in which the robot 2 grips and transports the workpiece.
  • the X section setting object 401 is provided with X axis direction width setting bars 412 and 413.
  • the user operates the X-axis direction width setting bars 412 and 413 to set the width of the calibration area 418 in the X-axis cross section. That is, the X-axis direction width setting bars 412 and 413 are for setting an effective range (effective range on the X-axis) along the direction orthogonal to the optical axis AX of the 3D camera 10.
  • Y-axis direction width setting bars 414 and 415 are provided for the Y cross-section setting object 402.
  • the user operates the Y-axis direction width setting bars 414 and 415 to set the width of the calibration area 418 in the Y-axis cross section. That is, the Y-axis direction width setting bars 414 and 415 are for setting an effective range (effective range on the Y axis) along a direction orthogonal to the optical axis AX of the 3D camera 10.
  • the user can set an arbitrary calibration area 418 on the user interface screen 400. That is, the robot control system 1 has a setting reception function for receiving the setting of the calibration area 418 that is an area in which the reference plate 20 associated with the action part of the robot 2 is to be arranged in the camera / robot calibration. .
  • the range in the height direction of the calibration area 418 is defined by the measurement bottom surface setting bar 410 and the measurement top surface setting bar 411.
  • the cross section of the calibration area 418 is defined by X-axis direction width setting bars 412 and 413 and Y-axis direction width setting bars 414 and 415.
  • the calibration area 418 is set as a rectangular parallelepiped (including a cube) based on the optical axis AX of the 3D camera 10 that is the imaging unit.
  • the cross-sectional width of the calibration area 418 is determined in accordance with the visual field range 416 from the 3D camera 10. That is, the size of the cross section of the rectangular parallelepiped set as the calibration area 418 is determined in accordance with the visual field range 416 of the 3D camera 10 and the distance from the 3D camera 10 to the end face of the calibration area 418.
  • the calibration area 418 does not necessarily have to be set as a rectangular parallelepiped; a shape whose cross section becomes smaller toward the 3D camera 10, matching the shape of the effective field of view of the 3D camera 10 (for example, a quadrangular pyramid with its apex cut off), may be set instead. However, for an actual application, setting a rectangular parallelepiped is preferable from the viewpoint of measurement stability and the like.
  • the calibration area 418 is arbitrarily set according to the target application. For example, it may be set according to the shape of the target workpiece. Specifically, in an application such as pick-and-place, which grips a placed workpiece and transports it to a specified position, the calibration area 418 may be set based on the shape of the container holding the workpieces.
  • the user sets the calibration area 418 (that is, an area where measurement of three-dimensional coordinates is necessary) on the user interface screen 400.
  • each setting bar displayed in the user interface screen 400 is set by a mouse operation or the like.
  • the image measurement device 100 and/or the control device 200 determines, based on the set calibration area 418, the relative movement positions of the reference plate 20 (FIG. 2) so that they are neither excessive nor insufficient. Thereafter, the user places the reference plate 20 at the center of the visual field range 416 of the 3D camera 10 and instructs execution of camera/robot calibration, whereupon commands are given to the robot 2 and the camera/robot calibration is executed.
  • the camera/robot calibration typically includes a process of calculating matrix coefficients (parameters) for mutual conversion between the position and orientation on the camera coordinate system and the position and orientation on the robot reference coordinate system.
  • a hand piece 6 is attached to the tip of the robot 2, and a coordinate system that defines the position of the hand piece 6 (hereinafter also referred to as “robot tip coordinate system”) is introduced. Furthermore, a coordinate system (hereinafter also referred to as “marker coordinate system”) that defines the position of the marker 22 on the reference plate 20 mounted on the handpiece 6 is introduced.
  • let A be a matrix indicating the relationship between the camera coordinate system of the 3D camera 10 and the marker coordinate system.
  • This matrix A can be estimated by recognizing the marker 22 by the 3D camera 10.
  • let B be a matrix indicating the relationship between the robot tip coordinate system and the robot reference coordinate system. This matrix B corresponds to a command from the control device 200. Then, a matrix X and a matrix Z that satisfy the following relationship are estimated:
  • AX = ZB
  • the measurement values by the image measurement device 100 and the commands from the control device 200 at each position are known, and the matrix X and the matrix Z are estimated based on a set of these values.
  • Such matrix estimation requires a three-dimensional coordinate data set (for example, 10 to 20 points) based on the camera coordinate system and the robot reference coordinate system, respectively. Therefore, the reference plate 20 having one or more markers 22 drawn on the surface is attached to the tip of the robot 2 and the following operations a) and b) are repeated 10 to 20 times.
  • a) the tip of the robot 2 is positioned at a predetermined position such that the marker 22 is included in the field of view of the 3D camera 10, and the position and posture of the tip of the robot 2 on the robot reference coordinate system at that time are acquired; and b) the position and posture of the marker 22 on the camera coordinate system are measured by the image measurement device 100 and acquired.
  • the matrices described above are estimated by using the obtained sets of positions and orientations (10 to 20 data sets). By using the estimated matrices, it is possible to calculate the position and orientation of an arbitrary subject in the robot reference coordinate system based on the positional relationship between the robot 2 and the 3D camera 10 determined in the camera/robot calibration.
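
As a concrete reference, OpenCV (4.5 and later) ships a solver for exactly this robot-world/hand-eye problem AX = ZB. The sketch below assumes the 10 to 20 paired poses have already been collected as lists of rotation matrices and translation vectors; mapping the patent's A and B onto OpenCV's frame conventions (world-to-camera and base-to-gripper) is an assumption that should be checked against the library documentation:

```python
import cv2

def solve_ax_zb(R_world2cam, t_world2cam, R_base2gripper, t_base2gripper):
    """Estimate the two unknown transforms of the robot-world/hand-eye
    problem AX = ZB from 10-20 paired pose observations.

    Inputs are parallel lists: marker ("world") poses in the camera frame
    measured by the 3D camera (the A's), and robot-base poses in the tip
    ("gripper") frame derived from the controller commands (the B's)."""
    R_base2world, t_base2world, R_gripper2cam, t_gripper2cam = \
        cv2.calibrateRobotWorldHandEye(R_world2cam, t_world2cam,
                                       R_base2gripper, t_base2gripper)
    # The four outputs are the rotation and translation parts of the two
    # estimated unknowns (the roles of the patent's X and Z).
    return (R_base2world, t_base2world), (R_gripper2cam, t_gripper2cam)
```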
  • FIG. 9 is a sequence diagram showing an example of an automatic processing procedure of camera / robot calibration in the robot control system 1 according to the present embodiment. As shown in FIG. 9, the camera / robot calibration automatic processing is mainly executed by the image measuring device 100 and the control device 200.
  • the image measuring apparatus 100 displays the user interface screen 400 for accepting the setting of the calibration area on the display unit 110 (sequence SQ100).
  • the user operates input unit 108 such as a mouse to set a calibration area on user interface screen 400 (sequence SQ102).
  • Image measuring apparatus 100 accepts the calibration area set by the user in sequence SQ102 (sequence SQ104).
  • the image measuring device 100 determines whether or not the reference plate 20 exists within the field of view of the 3D camera 10 based on the image captured by the 3D camera 10 (sequence SQ106).
  • when the reference plate 20 does not exist in the field of view of the 3D camera 10 (NO in sequence SQ106), the image measurement device 100 notifies the user to operate the robot 2 with a teaching pendant or the like so that the reference plate 20 is placed within the field of view of the 3D camera 10 (sequence SQ108). Then, the process of sequence SQ106 is repeated.
  • the image measurement apparatus 100 then receives a user operation (sequence SQ110) and transmits a camera/robot calibration start command, together with the information of the calibration area received in sequence SQ104, to the control device 200 (sequence SQ112).
  • the information of the calibration area includes positions on the camera coordinate system indicating the positions of the vertices of the calibration area (eight positions if a rectangular parallelepiped).
  • Control device 200 first acquires a position on the robot reference coordinate system corresponding to the calibration area defined in the camera coordinate system (sequences SQ200 to SQ206).
  • control device 200 selects one position among positions on the camera coordinate system indicating each vertex of the calibration area acquired from image measurement apparatus 100 (sequence SQ200).
  • the image measuring device 100 acquires the position on the camera coordinate system of the marker 22 on the reference plate 20 measured from the image captured by the 3D camera 10 and gives it to the control device 200 (sequence SQ120).
  • based on the difference between the position selected in sequence SQ200 and the position on the camera coordinate system of the marker 22 on the reference plate 20, the control device 200 calculates a command (a position on the robot reference coordinate system) for arranging the marker 22 on the reference plate 20 at the selected position (sequence SQ202), and gives the calculated command to the robot control device 300 (sequence SQ204).
  • the processing of sequences SQ120, SQ202, and SQ204 is repeated until the position selected in sequence SQ200 substantially matches the position on the camera coordinate system of the marker 22 on the reference plate 20.
  • when they substantially match, the control device 200 stores the command at that point in time, that is, the position on the robot reference coordinate system, as the position corresponding to one vertex of the calibration area (sequence SQ206).
  • the processing of the sequences SQ200, SQ120, SQ202, SQ204, and SQ206 (the processing of * 2 in FIG. 9) is repeated by the number of vertices in the calibration area acquired from the image measuring device 100.
  • a set of positions on the robot reference coordinate system corresponding to each vertex of the calibration area (eight positions if a rectangular parallelepiped) is acquired. If necessary, the posture on the robot reference coordinate system corresponding to each vertex of the calibration area is also acquired.
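
A compact way to view sequences SQ120, SQ202, and SQ204 is as a feedback loop iterated until convergence. The sketch below is schematic, under assumed callable interfaces: the measurement, the command transmission, and the camera-to-robot increment conversion are placeholders, not APIs from this disclosure:

```python
import numpy as np

def drive_marker_to(target_cam, measure_marker_cam, send_robot_command,
                    residual_to_robot_move, tol=1.0, max_iter=50):
    """Iterate until the marker's measured camera-frame position
    substantially matches the selected vertex position (SQ120/SQ202/SQ204),
    then report convergence so the robot pose can be stored (SQ206)."""
    target = np.asarray(target_cam, dtype=float)
    for _ in range(max_iter):
        current = np.asarray(measure_marker_cam())   # SQ120: 3D measurement
        residual = target - current
        if np.linalg.norm(residual) < tol:           # substantially matched
            return True
        send_robot_command(residual_to_robot_move(residual))  # SQ202/SQ204
    return False
```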
  • control device 200 determines a set of positions where the reference plate 20 should be arranged based on the set of positions on the robot reference coordinate system acquired in the sequence SQ206 (sequence SQ208). In sequence SQ208, a plurality of positions on the robot reference coordinate system are determined. The control device 200 may determine a trajectory or the like for moving the reference plate 20 in addition to a set of positions where the reference plate 20 should be arranged.
  • the set of positions where the reference plate 20 should be arranged may be determined, for example, by equally dividing the space between the vertices of the calibration area into a predetermined number. Alternatively, more positions may be set as the calibration area becomes larger.
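
As an illustration of such an equal division, the following sketch generates candidate placement positions inside a rectangular-parallelepiped calibration area from two opposite vertices; the division counts are an assumed density and could be increased for a larger area:

```python
import itertools
import numpy as np

def placement_grid(p_min, p_max, divisions=(3, 3, 2)):
    """Equally spaced reference-plate positions inside the cuboid spanned by
    the two opposite vertices p_min and p_max (robot reference frame)."""
    axes = [np.linspace(lo, hi, n)
            for lo, hi, n in zip(p_min, p_max, divisions)]
    return [np.array(p) for p in itertools.product(*axes)]

grid = placement_grid((100.0, -200.0, 50.0), (400.0, 200.0, 350.0))
print(len(grid))  # 3 * 3 * 2 = 18 positions, within the 10-20 sets suggested
```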
  • control device 200 selects one position among a plurality of positions where reference plate 20 determined in sequence SQ208 is to be arranged (sequence SQ210). Then, control device 200 gives a command to robot control device 300 to instruct the placement of reference plate 20 at the selected position (position on the robot reference coordinate system) (sequence SQ212).
  • the processing of sequences SQ210 and SQ212 executed by the control device 200 corresponds to an arrangement control function for sequentially giving commands to the robot 2 and sequentially arranging the reference plate 20 (reference object) in the calibration area.
  • the image measurement device 100 acquires the position on the camera coordinate system of the marker 22 on the reference plate 20 measured from the image captured by the 3D camera 10, and provides it to the control device 200 (sequence SQ122).
  • the control device 200 stores, in association with each other, the position (and posture) on the robot reference coordinate system indicating where the reference plate 20 is currently arranged and the position (and posture) on the camera coordinate system measured from the image captured by the 3D camera 10 (sequence SQ214).
  • the control apparatus 200 estimates a matrix that defines the correspondence between the camera coordinate system and the robot reference coordinate system by executing a predetermined calculation process based on the acquired data sets (sequence SQ216). This estimated matrix is stored as the result of the camera/robot calibration.
  • that is, the control device 200 calculates the calibration parameter (correspondence) based on the sets of the three-dimensional coordinates of the reference plate 20 and the position and orientation of the action part of the robot 2 acquired when the reference plate 20 (reference object) is sequentially arranged in the calibration area.
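
Once the correspondence is stored, using it at run time reduces to a homogeneous-coordinate multiplication. A minimal sketch follows, assuming the calibration result has been packaged as a single 4x4 transform from the camera frame to the robot reference frame (an assumed packaging, not a structure specified by this disclosure):

```python
import numpy as np

def cam_to_robot(T_cam2base: np.ndarray, p_cam) -> np.ndarray:
    """Map a point measured in the camera coordinate system into the robot
    reference coordinate system via a 4x4 homogeneous transform."""
    p_h = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous point
    return (T_cam2base @ p_h)[:3]

# Example with an identity calibration (placeholder): a workpiece measured
# at (120, -40, 610) in the camera frame maps unchanged.
print(cam_to_robot(np.eye(4), (120.0, -40.0, 610.0)))
```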
  • FIG. 10 is a schematic diagram showing an example of a user interface screen provided in the automatic camera / robot calibration process in the robot control system 1 according to the present embodiment.
  • the user interface screen 420 shown in FIG. 10 simultaneously displays the three-dimensional measurement result of the reference plate 20 obtained by imaging with the 3D camera 10 and the two-dimensional image of the reference plate 20 obtained by the same imaging. The user interface screen 420 also displays the calibration area preset by the user and the height of the reference plate 20.
  • an effective area frame 422 indicating a preset calibration area is displayed for an image captured by the 3D camera 10.
  • the user interface screen 420 includes a height display bar 424.
  • the height display bar 424 displays the measurable range of the Z axis. It is assumed that the distance between the 3D camera 10 and the base of the robot 2 is input as an initial setting. Then, an indicator 426 indicating the height direction position of the reference plate 20 measured by the image measuring device 100 is displayed in association with the height display bar 424.
  • on the user interface screen 420, the displayed size of the effective area frame 422 also changes according to the height of the reference plate 20.
  • when the user refers to the user interface screen 420 and the reference plate 20 does not exist within the set calibration area, the user operates the teaching pendant or the like to adjust the robot so that the reference plate 20 is arranged within the calibration area.
  • in the automatic processing, the range in which the robot 2 should move is automatically determined according to the set calibration area as described above. At this time, a movement amount command for the robot 2 is sequentially determined by using the two-dimensional image captured by the 3D camera 10 and the three-dimensional measurement result as feedback information.
  • the movement step amount is determined by equally dividing the determined movement range of the robot 2 between its upper limit and lower limit. Then, by acquiring the two-dimensional image and performing the three-dimensional measurement while moving the reference plate 20 by the determined step amount, the correspondence between the position on the camera coordinate system and the position and orientation on the robot reference coordinate system is acquired. Based on the acquired correspondences, camera/robot calibration is executed to calculate the necessary parameters.
  • FIG. 11 is a sequence diagram showing an example of a manual processing procedure of camera / robot calibration in the robot control system 1 according to the present embodiment. As shown in FIG. 11, manual processing of camera / robot calibration is mainly executed by the image measurement device 100 and the control device 200.
  • the image measuring apparatus 100 displays a user interface screen 400 for accepting the setting of the calibration area on the display unit 110 (sequence SQ150).
  • the user operates input unit 108 such as a mouse to set a calibration area on user interface screen 400 (sequence SQ152).
  • Image measuring apparatus 100 accepts the calibration area set by the user in sequence SQ152 (sequence SQ154).
  • the image measuring apparatus 100 determines a set of positions where the reference plate 20 is to be arranged based on the set calibration area (sequence SQ156). That is, the image measuring apparatus 100 has a position determination function that determines a plurality of positions where the reference plate 20 (reference object) is to be arranged in the calibration area as a calibration execution function. In sequence SQ156, a plurality of positions on the camera coordinate system are determined.
  • the control device 200 may determine a trajectory or the like for moving the reference plate 20 in addition to a set of positions where the reference plate 20 should be arranged.
  • the set of positions where the reference plate 20 should be arranged may be determined, for example, by equally dividing the space between the vertices of the calibration area into a predetermined number. Alternatively, more positions may be set as the calibration area becomes larger.
  • the image measuring apparatus 100 selects one position as a target position among a plurality of positions where the reference plate 20 determined in the sequence SQ156 is to be arranged (sequence SQ158).
  • the image measuring apparatus 100 acquires the current position on the camera coordinate system of the marker 22 on the reference plate 20 based on the image captured by the 3D camera 10 (sequence SQ160), and the selected target position and Based on the difference from the acquired current position, it is determined whether or not the reference plate 20 is disposed at the target position (sequence SQ162).
  • when the reference plate 20 is not arranged at the target position (NO in sequence SQ162), the image measuring device 100 notifies the user of information for arranging the reference plate 20 at the target position (sequence SQ164). The user refers to this notification and operates the teaching pendant or the like to adjust the position and posture of the robot 2 so that the reference plate 20 is arranged at the target position. Then, the processes from sequence SQ160 onward are repeated.
  • when the reference plate 20 is arranged at the target position (YES in sequence SQ162), the image measurement apparatus 100 notifies the user of that fact (sequence SQ166) and transmits the position (and orientation) on the camera coordinate system of the marker 22 on the reference plate 20 to the control device 200 (sequence SQ168).
  • the control device 200 acquires from the robot control device 300 the position (and posture) on the robot reference coordinate system indicating where the reference plate 20 is currently arranged (sequence SQ250). Then, the control device 200 stores, in association with each other, that position (and posture) on the robot reference coordinate system and the position (and orientation) on the camera coordinate system received from the image measuring device 100 (sequence SQ252).
  • the control apparatus 200 estimates a matrix that defines the correspondence between the camera coordinate system and the robot reference coordinate system by executing a predetermined calculation process based on the acquired data sets (sequence SQ254). This estimated matrix is stored as the result of the camera/robot calibration. In other words, the control device 200 calculates the calibration parameter (correspondence) based on the sets of the three-dimensional coordinates of the reference plate 20 and the position and orientation of the action part of the robot 2 acquired when the reference plate 20 (reference object) is sequentially arranged in the calibration area.
  • FIG. 12 is a schematic diagram showing an example of a user interface screen provided in the camera / robot calibration manual process in the robot control system 1 according to the present embodiment.
  • the user interface screen 430 shown in FIG. 12 simultaneously displays the three-dimensional measurement result of the reference plate 20 obtained by imaging with the 3D camera 10 and the two-dimensional image of the reference plate 20 obtained by the same imaging. The user interface screen 430 also displays the calibration area preset by the user and the height of the reference plate 20.
  • an effective area frame 422 indicating a preset calibration area is displayed for an image captured by the 3D camera 10.
  • an indicator 432 indicating the two-dimensional position of the selected target position is displayed.
  • the size of the effective area frame 422 also changes on the user interface screen 430 according to the height of the reference plate 20.
  • the user interface screen 420 includes a height display bar 424.
  • the height display bar 424 displays the measurable range of the Z axis. It is assumed that the distance between the 3D camera 10 and the base of the robot 2 is input as an initial setting.
  • An indicator 426 indicating the position in the height direction of the reference plate 20 measured by the image measuring apparatus 100 and an indicator 428 indicating the height of the selected target position are displayed in association with the height display bar 424.
  • the image measuring apparatus 100 thus provides a position display function that indicates the relationship between one of the determined positions where the reference plate 20 should be placed and the current position of the reference plate 20 (reference object).
  • the user can place the reference plate 20 at an appropriate position by referring to the user interface screen 420. That is, the user operates the robot 2 so that the arrangement height of the reference plate 20 matches the indicated height and the center of the reference plate 20 coincides with the indicator 432 indicating the two-dimensional target position. Even during the operation of the robot 2, the acquisition of the two-dimensional image and the three-dimensional measurement by the 3D camera 10 are repeatedly executed.
  • the position on the camera coordinate system and the position and orientation on the robot reference coordinate system at that time are stored in association with each other. This series of operations and processes is repeated the number of times required to execute the camera/robot calibration. Finally, based on the correspondences between the acquired positions on the camera coordinate system and the positions and orientations on the robot reference coordinate system, camera/robot calibration is executed to calculate the necessary parameters.
  • as a notification form, a change in display color, display of a warning message, generation of an alarm sound, or the like may be employed.
  • by giving a command from the control device 200 to the robot control device 300, a protection function implemented in the robot control device 300 may be invoked so that the robot 2 is forcibly stopped.
  • the functions provided by the image measurement device 100 and the control device 200 may be interchanged, or the image measurement device 100 and the control device 200 may be configured as an integrated device.
  • similarly, the control device 200 and the robot control apparatus 300 can be configured as an integrated apparatus.
  • any implementation may be adopted as long as it can provide the processing and functions as described above.
  • the position at which the reference plate 20 is arranged for acquiring a data set necessary for executing the camera / robot calibration can be arbitrarily set. Further, a route for sequentially arranging the reference plates 20 at a plurality of set positions can be arbitrarily set. However, with respect to the path where the reference plates 20 are sequentially arranged, the shortest path may be set according to a predetermined optimization algorithm in order to improve work efficiency.
  • a robot control system (1) including: an imaging unit (10) disposed so as to include the action unit (8) of the robot (2) in its field of view; a measurement unit (52; 100) for measuring the three-dimensional coordinates of an arbitrary object existing in the field of view of the imaging unit based on an image captured by the imaging unit; a command generation unit (54; 200) that generates a command for positioning the action part of the robot according to a pre-calculated correspondence between the measured three-dimensional coordinates and the position and posture of the action part of the robot; a calibration execution unit (58; 100; 200) for executing a calibration for calculating the correspondence; and a setting reception unit (60; 400) that receives, in the calibration, the setting of a calibration region (418), which is a region where a reference object (20) associated with the action unit of the robot is to be arranged.
  • the robot control system according to any one of configurations 1 to 6, wherein the calibration execution unit includes: a position determination unit (100; SQ156) for determining a plurality of positions where the reference object is to be arranged in the calibration area; a position display unit (100; SQ164) showing the relationship between one of the determined positions and the current position of the reference object; and a calculation unit (200; SQ254) for calculating the correspondence based on the sets of the three-dimensional coordinates of the reference object and the position and orientation of the action part of the robot obtained when the reference object is sequentially arranged at the determined positions.
  • the reference object needs to be arranged at a predetermined position by the robot, and it is not easy to specify the position and orientation of the reference object.
  • an area in which the camera/robot calibration should be executed (the calibration area) can be set arbitrarily with the 3D camera as a reference.
  • an appropriate calibration area can be set. Then, camera/robot calibration can be executed automatically or manually for the set calibration area.
  • because the cross section of the calibration area has substantially the same size at any position in the optical axis direction, the positions at which the reference object is arranged can be determined automatically, and setting suited to the application is facilitated.
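As a minimal illustration of the shortest-route idea referenced in the list above, the following Python sketch orders candidate reference-plate positions with a greedy nearest-neighbor heuristic. The function name, the plain (x, y, z) tuples, and the example grid are assumptions for illustration only; the optimization algorithm itself is not specified here.

```python
import math

def order_positions_nearest_neighbor(positions, start):
    """Greedy nearest-neighbor ordering of 3-D placement positions.

    positions: list of (x, y, z) tuples inside the calibration area.
    start: (x, y, z) of the robot's current position.
    Returns the positions in a visiting order that keeps each hop short.
    """
    remaining = list(positions)
    ordered = []
    current = start
    while remaining:
        # Pick the still-unvisited position closest to the current one.
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        ordered.append(nxt)
        current = nxt
    return ordered

# Example: order nine grid positions starting from the optical-axis center.
grid = [(x, y, 0.0) for x in (-100.0, 0.0, 100.0) for y in (-100.0, 0.0, 100.0)]
print(order_positions_nearest_neighbor(grid, start=(0.0, 0.0, 0.0)))
```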

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Manipulator (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

This robot control system includes: a measurement unit that, on the basis of an image captured by an imaging unit, measures the three-dimensional coordinates of a given object present in the field of view of the imaging unit; a command generation unit that, in accordance with a previously calculated correspondence between the measured three-dimensional coordinates and the position and attitude of an action part of the robot, generates a command for positioning the action part of the robot; a calibration execution unit that executes a calibration for calculating the correspondence; and a setting reception unit that, during the calibration, receives a setting for a calibration region, which is a region in which a reference object associated with the action part of the robot is to be disposed.

Description

Robot control system
The present technology relates to a robot control system for controlling a robot using three-dimensional coordinates calculated based on an image captured by an imaging unit.
Systems that combine a robot with a sensor capable of three-dimensional measurement have been known. Such systems have many setting items, and it can be difficult to grasp the configured contents intuitively. For example, Japanese Patent Application Laid-Open No. 2011-112402 (Patent Document 1) discloses a configuration that improves the convenience of a three-dimensional visual sensor by simply displaying the three-dimensional shape of a set effective area together with its surroundings.
JP 2011-112402 A
Although the configuration disclosed in Patent Document 1 considers calibration between the three-dimensional coordinate system for stereo measurement (the measurement coordinate system) and the camera coordinate system, it does not consider calibration with respect to the coordinate system for controlling the robot.
In practice, the relationship between the three-dimensional coordinates measured from captured images and the position and posture of the robot needs to be calibrated in advance. There is a need to simplify the settings required for executing such calibration.
A robot control system according to one embodiment of the present technology includes: an imaging unit arranged so as to include the action part of a robot in its field of view; a measurement unit that measures, based on an image captured by the imaging unit, the three-dimensional coordinates of an arbitrary object present in the field of view of the imaging unit; a command generation unit that generates a command for positioning the action part of the robot according to a pre-calculated correspondence between the measured three-dimensional coordinates and the position and posture of the action part of the robot; a calibration execution unit that executes a calibration for calculating the correspondence; and a setting reception unit that, in the calibration, receives a setting of a calibration region, which is a region in which a reference object associated with the action part of the robot is to be arranged.
According to this disclosure, calibration can be executed within the range of the calibration region set via the setting reception unit, so the calibration can be made more efficient.
In the above disclosure, the setting reception unit may indicate the range of the currently set calibration region with the imaging unit as a reference.
According to this disclosure, since the calibration region is set with the imaging unit as a reference, the user can easily grasp the range of the calibration region.
In the above disclosure, the setting reception unit may also display the field-of-view range of the imaging unit.
According to this disclosure, the range in which the calibration region is set can be grasped at a glance relative to the field-of-view range of the imaging unit.
In the above disclosure, the calibration region may be set as a rectangular parallelepiped referenced to the optical axis of the imaging unit.
According to this disclosure, the positions at which the reference object is arranged can be determined easily, and the operation range can be determined so that the reference object does not leave the measurement range even when it moves in the optical axis direction.
In the above disclosure, the size of the cross section of the rectangular parallelepiped set as the calibration region may be determined according to the field of view of the imaging unit and the distance from the imaging unit to the end face of the calibration region.
According to this disclosure, an optimal calibration region corresponding to the distance from the imaging unit can be set.
In the above disclosure, the setting reception unit may accept, as the calibration region setting, a range setting along the optical axis of the imaging unit.
According to this disclosure, only the region actually used in the optical axis direction of the imaging unit can be set as the calibration region.
In the above disclosure, the calibration execution unit may include: an arrangement control unit that sequentially gives commands to the robot so as to sequentially arrange the reference object within the calibration region; and a calculation unit that calculates the correspondence based on sets of the three-dimensional coordinates of the reference object and the position and orientation of the action part of the robot, acquired each time the reference object is arranged within the calibration region.
According to this disclosure, the calibration is executed merely by the user setting the calibration region, so even a user without specialized knowledge can operate the robot control system.
In the above disclosure, the calibration execution unit may include: a position determination unit that determines a plurality of positions at which the reference object is to be arranged within the calibration region; a position display unit that shows the relationship between one of the determined positions and the current position of the reference object; and a calculation unit that calculates the correspondence based on sets of the three-dimensional coordinates of the reference object and the position and orientation of the action part of the robot, acquired when the reference object is sequentially arranged at the determined positions.
According to this disclosure, the calibration is executed merely by the user setting the calibration region and sequentially arranging the reference object at the prescribed positions in accordance with notifications from the position display unit, so even a user with little specialized knowledge can operate the robot control system.
According to the present technology, the settings required when calibrating in advance the relationship between the three-dimensional coordinates measured from captured images and the position and posture of the robot can be simplified.
FIG. 1 is a schematic diagram showing an application example of the robot control system according to the present embodiment.
FIG. 2 is a schematic diagram showing the overall configuration of the robot control system according to the present embodiment.
FIG. 3 is a schematic diagram showing a configuration example of the 3D camera included in the robot control system according to the present embodiment.
FIG. 4 is a schematic diagram showing a configuration example of the image measurement device included in the robot control system according to the present embodiment.
FIG. 5 is a schematic diagram showing a configuration example of the control device included in the robot control system according to the present embodiment.
FIG. 6 is a schematic diagram showing a configuration example of the robot control device included in the robot control system according to the present embodiment.
FIG. 7 is a diagram for explaining one aspect of camera/robot calibration in the robot control system according to the present embodiment.
FIG. 8 is a schematic diagram showing an example of a user interface screen provided by the robot control system according to the present embodiment.
FIG. 9 is a sequence diagram showing an example of the automatic processing procedure of camera/robot calibration in the robot control system according to the present embodiment.
FIG. 10 is a schematic diagram showing an example of a user interface screen provided in the automatic processing of camera/robot calibration in the robot control system according to the present embodiment.
FIG. 11 is a sequence diagram showing an example of the manual processing procedure of camera/robot calibration in the robot control system according to the present embodiment.
FIG. 12 is a schematic diagram showing an example of a user interface screen provided in the manual processing of camera/robot calibration in the robot control system according to the present embodiment.
Embodiments of the present invention will be described in detail with reference to the drawings. The same or corresponding parts in the drawings are denoted by the same reference numerals, and description thereof will not be repeated.
<A. Application example>
First, an example of a scene to which the present invention is applied will be described.
FIG. 1 is a schematic diagram showing an application example of the robot control system 1 according to the present embodiment. Referring to FIG. 1, the robot control system 1 includes, as a configuration for controlling the robot 2, an imaging unit 50 arranged so as to include the action part 8 of the robot 2 in its field of view. The robot control system 1 further includes a measurement unit 52 that measures, based on the image captured by the imaging unit 50, the three-dimensional coordinates of an arbitrary object present in the field of view of the imaging unit 50. The three-dimensional coordinates measured by the measurement unit 52 are output to the command generation unit 54 and the calibration execution unit 58. The three-dimensional coordinates measured by the measurement unit 52 can include the position of a specific object in the three-dimensional coordinate system (the value on each axis) and its posture (the rotation angle about each axis).
The command generation unit 54 generates a command for positioning the action part 8 of the robot 2. Specifically, the command generation unit 54 compares the designated target position with the three-dimensional coordinates from the measurement unit 52, and calculates the command according to the pre-calculated correspondence with the position and posture of the action part 8 of the robot 2 (typically, the calibration parameters 56).
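As a rough illustration of this step, the following Python sketch converts a point measured in the camera coordinate system into a positioning target in the robot reference coordinate system. It assumes, purely for illustration, that the calibration parameters 56 are held as a single 4x4 homogeneous transform; the present technology does not restrict the form of the correspondence, and the names and numbers below are placeholders.

```python
import numpy as np

# Assumed calibration parameter: camera frame -> robot base frame (4x4),
# with illustrative values only.
T_base_from_cam = np.array([
    [0.0, -1.0, 0.0, 250.0],
    [1.0,  0.0, 0.0, -80.0],
    [0.0,  0.0, 1.0, 400.0],
    [0.0,  0.0, 0.0,   1.0],
])

def to_robot_target(p_cam):
    """Convert a 3-D point measured in the camera frame into a
    positioning target expressed in the robot reference frame."""
    p = np.array([p_cam[0], p_cam[1], p_cam[2], 1.0])
    return (T_base_from_cam @ p)[:3]

# A point measured by the imaging unit becomes a target for the robot.
print(to_robot_target((12.5, -3.0, 510.0)))
```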
The calibration parameters 56 are determined by the calibration execution unit 58. More specifically, the calibration execution unit 58 executes calibration for calculating the calibration parameters 56 based on the three-dimensional coordinates from the measurement unit 52 and on the setting, received by the setting reception unit 60, of the spatial region used for the calibration (the calibration region). This calibration region is set by the user via the setting reception unit 60. That is, in the calibration, the setting reception unit 60 receives the setting of the calibration region, which is the region in which a reference object associated with the action part 8 of the robot 2 (the reference plate 20 is shown as an example) is to be arranged. An arbitrary region within the effective field of view of the imaging unit 50 can be set as the calibration region.
As described above, in the present embodiment, an arbitrary calibration region referenced to the imaging unit 50 can be set, so the calibration for calculating the calibration parameters 56 can be executed efficiently.
<B. Overall Configuration of Robot Control System 1>
First, the overall configuration of robot control system 1 according to the present embodiment will be described.
FIG. 2 is a schematic diagram showing the overall configuration of the robot control system 1 according to the present embodiment. Referring to FIG. 2, the robot control system 1 measures, based on an image captured by an imaging unit arranged so as to include the action part of the robot in its field of view, the three-dimensional coordinates of the robot and of an arbitrary object present in the field of view of the imaging unit, and controls the robot based on the measured three-dimensional coordinates.
More specifically, the robot control system 1 includes the robot 2, the 3D camera 10, the image measurement device 100, the control device 200, and the robot control device 300.
The robot 2 is a mechanism that performs operations at arbitrary positions in accordance with commands from the robot control device 300. Although FIG. 2 illustrates an articulated robot as a typical example of the robot 2, it may instead be a SCARA robot or a parallel robot.
The robot 2 shown in FIG. 2 has one or more arms 4, and a handpiece 6 is attached to the tip of the arms 4 (corresponding to the action part of the robot 2). FIG. 2 schematically depicts the state during execution of calibration as described later; the reference plate 20 is mounted on the handpiece 6 as an example of a reference object.
The reference plate 20 is a reference object for determining, in the calibration, the correspondence between the measured three-dimensional coordinates and position information indicating the position of the action part of the robot 2. One or more markers 22 are drawn on the surface of the reference plate 20.
The 3D camera 10 is arranged so as to include the action part of the robot 2 (the tip of the arm 4 and the handpiece 6) in its field of view, and outputs to the image measurement device 100 an image of the field of view captured at every predetermined period or every predetermined event.
The robot control system 1 according to the present embodiment adopts a configuration capable of optically measuring the three-dimensional coordinates of each part, including the action part, of the robot 2.
As one example, three-dimensional measurement may be realized using a technique called structured illumination. In the structured illumination technique, measurement light is projected onto the subject, and the distance to the subject is measured based on an image obtained by imaging the subject while the measurement light is projected on it. Spatial coding methods, phase shift methods, light-section methods, and the like can be used as such structured illumination techniques. When such structured illumination is adopted, the 3D camera 10 has a light projecting unit that emits the measurement light and an imaging unit that images the subject in a state where the measurement light is projected.
As another example, three-dimensional measurement may be realized using a multi-view camera. In this case, the 3D camera 10 includes a plurality of cameras arranged with mutually different viewpoints. Typically, the 3D camera 10 is a stereo camera consisting of a pair of cameras.
The image measurement device 100 measures, based on the image captured by the 3D camera 10, a three-dimensional image of an arbitrary object present in the field of view of the 3D camera 10.
For example, when structured illumination is adopted, the image measurement device 100 calculates the three-dimensional coordinates (or a coordinate group representing the three-dimensional shape) of the subject in the field of view by analyzing the positions and shifts of the light-and-shade pattern contained in the image output from the 3D camera 10. When a multi-view camera is adopted as the 3D camera 10, the image measurement device 100 calculates the three-dimensional coordinates (or a coordinate group representing the three-dimensional shape) of the subject in the field of view based on the parallax at each position of interest calculated by matching between the images.
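For the multi-view (stereo) case, the depth of each matched point follows from its parallax. The following is a minimal Python sketch assuming an ideally rectified parallel stereo pair; the focal length, baseline, and principal point values are illustrative only.

```python
def triangulate_point(u_left, u_right, v, f_px, baseline, cx, cy):
    """Back-project one matched pixel pair from a rectified stereo pair.

    u_left, u_right: horizontal pixel coordinates of the same feature in
    the left and right images; v: shared vertical pixel coordinate.
    Returns (X, Y, Z) in the left camera frame (units of the baseline).
    """
    disparity = u_left - u_right          # larger disparity = closer subject
    if disparity <= 0:
        raise ValueError("non-positive disparity: no valid depth")
    z = f_px * baseline / disparity       # depth along the optical axis
    x = (u_left - cx) * z / f_px
    y = (v - cy) * z / f_px
    return (x, y, z)

# Example: f = 1200 px, baseline = 60 mm, principal point (640, 480).
print(triangulate_point(700.0, 660.0, 500.0, 1200.0, 60.0, 640.0, 480.0))
```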
The image measurement device 100 can further search the calculated three-dimensional coordinate group for the three-dimensional coordinates of a specific object. For example, the image measurement device 100 can search for the markers 22 drawn on the reference plate 20 by pattern matching and output the three-dimensional coordinates of each marker 22. Calibration is executed based on the three-dimensional coordinates of these markers 22.
The control device 200 is typically configured as a PLC (programmable controller) or the like, and, based on the three-dimensional coordinates measured by the image measurement device 100, executes calibration and gives commands to the robot control device 300.
The robot control device 300 controls the robot 2 in accordance with commands from the control device 200. More specifically, the robot control device 300 drives the servo motors that drive the respective axes of the robot 2.
<C. Configuration example of each device constituting robot control system 1>
Next, a configuration example of each device constituting robot control system 1 according to the present embodiment will be described.
(c1: 3D camera 10)
FIG. 3 is a schematic diagram showing a configuration example of the 3D camera 10 included in the robot control system 1 according to the present embodiment. Referring to FIG. 3, the 3D camera 10 includes a processing unit 11, a light projecting unit 12, an imaging unit 13, a display unit 14, and a storage unit 15.
The processing unit 11 is responsible for the overall processing in the 3D camera 10. The processing unit 11 typically includes a processor, storage that stores the instruction codes executed by the processor, and memory into which the instruction codes are expanded. In this case, the processing unit 11 realizes various kinds of processing by the processor expanding the instruction codes into the memory and executing them. All or part of the processing unit 11 may be implemented using dedicated hardware circuits (for example, an ASIC (Application Specific Integrated Circuit) or an FPGA (Field-Programmable Gate Array)).
The display unit 14 notifies the outside of various kinds of information acquired or calculated by the 3D camera 10.
The storage unit 15 stores images captured by the imaging unit 13, preset calibration parameters, and the like.
The communication interface (I/F) unit 16 handles the exchange of data between the 3D camera 10 and the image measurement device 100.
(c2: Image measurement device 100)
FIG. 4 is a schematic diagram showing a configuration example of the image measurement device 100 included in the robot control system 1 according to the present embodiment. The image measurement device 100 is typically realized using a general-purpose computer. Referring to FIG. 4, the image measurement device 100 includes a processor 102, a main memory 104, storage 106, an input unit 108, a display unit 110, an optical drive 112, and a communication interface (I/F) unit 114. These components are connected via a processor bus 116.
The processor 102 is configured of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or the like, and realizes the various kinds of processing described later by reading the programs stored in the storage 106 (as an example, an OS (Operating System) 1060 and a three-dimensional measurement program 1062), expanding them into the main memory 104, and executing them.
The main memory 104 is configured of a volatile storage device such as DRAM (Dynamic Random Access Memory) or SRAM (Static Random Access Memory). The storage 106 is configured of, for example, a nonvolatile storage device such as an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
In addition to the OS 1060 for realizing basic functions, the storage 106 stores the three-dimensional measurement program 1062 for providing the functions of the image measurement device 100, model data 1064 used for recognizing objects such as the markers 22 and workpieces, and a setting reception program 1066.
The setting reception program 1066 executes processing for receiving, in the camera/robot calibration described later, the setting of the calibration region, which is the region in which the reference object associated with the action part of the robot 2 is to be arranged.
The input unit 108 is configured of a keyboard, a mouse, and the like, and accepts user operations. The display unit 110 is configured of a display, various indicators, a printer, and the like, and outputs processing results from the processor 102.
The communication interface unit 114 handles the exchange of data between the 3D camera 10 and the image measurement device 100, as well as the exchange of data between the image measurement device 100 and the control device 200.
The image measurement device 100 has an optical drive 112; programs are read from a recording medium 113 that non-transitorily stores computer-readable programs (for example, an optical recording medium such as a DVD (Digital Versatile Disc)) and installed in the storage 106 or the like.
The three-dimensional measurement program 1062, the setting reception program 1066, and the like executed by the image measurement device 100 may be installed via the computer-readable recording medium 113, or may be installed by downloading from a server device on a network. The functions provided by the three-dimensional measurement program 1062 and the setting reception program 1066 according to the present embodiment may also be realized by using some of the modules provided by the OS.
FIG. 4 shows a configuration example in which the functions necessary for the image measurement device 100 are provided by the processor 102 executing programs; some or all of these provided functions may instead be implemented using dedicated hardware circuits (for example, an ASIC or an FPGA).
(c3: Control device 200)
FIG. 5 is a schematic diagram showing a configuration example of the control device 200 included in the robot control system 1 according to the present embodiment. The control device 200 is typically realized using a PLC (programmable controller). Referring to FIG. 5, the control device 200 includes a processor 202, a main memory 204, storage 206, a communication interface (I/F) unit 208, field network controllers 210 and 212, a USB (Universal Serial Bus) controller 214, a memory card interface 216, and a local bus controller 220. These components are connected via a processor bus 230.
The processor 202 corresponds to an arithmetic processing unit that executes control operations and the like, and is configured of a CPU, a GPU, or the like. Specifically, the processor 202 reads the programs stored in the storage 206, expands them into the main memory 204, and executes them, thereby realizing control according to the controlled object as well as the various kinds of processing described later.
The main memory 204 is configured of a volatile storage device such as DRAM or SRAM. The storage 206 corresponds to a storage unit and is configured of, for example, a nonvolatile storage device such as an HDD or an SSD. In addition to a system program for realizing basic functions, the storage 206 stores calibration parameters 2060, a command generation program 2062, a calibration execution program 2064, and the like.
The calibration parameters 2060 correspond to the pre-calculated correspondence between the measured three-dimensional coordinates and the position and posture of the action part of the robot 2.
The command generation program 2062 executes processing for generating a command for positioning the action part of the robot 2 according to the pre-calculated correspondence (the calibration parameters 2060) between the measured three-dimensional coordinates and the position and posture of the action part of the robot 2.
The calibration execution program 2064 executes the calibration (the camera/robot calibration described later) for calculating the correspondence (the calibration parameters 2060) between the measured three-dimensional coordinates and the position and posture of the action part of the robot 2.
The communication interface unit 208 handles the exchange of data between the image measurement device 100 and the control device 200. The field network controllers 210 and 212 exchange data with arbitrary devices such as the robot control device 300 (see FIG. 2) via field networks. Although FIG. 5 shows two field network controllers 210 and 212, a single field network controller may be adopted. The USB controller 214 exchanges data with arbitrary external devices and the like via a USB connection.
The memory card interface 216 accepts a memory card 218, which is an example of a removable recording medium. The memory card interface 216 can write data to the memory card 218 and read various kinds of data (such as logs and trace data) from the memory card 218.
The local bus controller 220 exchanges data with arbitrary local IO units via the local bus 222.
FIG. 5 shows a configuration example in which the necessary functions are provided by the processor 202 executing programs; some or all of these provided functions may instead be implemented using dedicated hardware circuits (for example, an ASIC or an FPGA). Alternatively, the main part of the control device 200 may be realized using hardware that conforms to a general-purpose architecture (for example, an industrial personal computer based on a general-purpose personal computer). In that case, virtualization technology may be used to execute a plurality of OSes with different uses in parallel and to execute the necessary applications on each OS.
(c4: Robot control device)
FIG. 6 is a schematic diagram showing a configuration example of the robot control device 300 included in the robot control system 1 according to the present embodiment. The robot control device 300 calculates the operation pattern of each axis of the robot 2 and the like in accordance with commands from the control device 200 (typically, the three-dimensional coordinates of a target position, a posture (orientation), a moving speed, and so on).
Referring to FIG. 6, the robot control device 300 includes a processor 302, a main memory 304, storage 306, a communication interface (I/F) unit 308, and a drive controller 310. These components are connected via a processor bus 312. The robot control device 300 further includes one or more servo drivers 320-1, 320-2, 320-3, ..., 320-n (hereinafter also collectively referred to as the "servo drivers 320") connected to the drive controller 310.
The processor 302 corresponds to an arithmetic processing unit that executes control operations and the like, and is configured of a CPU, a GPU, or the like. Specifically, the processor 302 realizes various kinds of processing by reading the programs stored in the storage 306, expanding them into the main memory 304, and executing them.
The main memory 304 is configured of a volatile storage device such as DRAM or SRAM. The storage 306 corresponds to a storage unit and is configured of, for example, a nonvolatile storage device such as an HDD or an SSD. The storage 306 stores system programs and the like related to robot control.
The communication interface unit 308 handles the exchange of data between the control device 200 and the robot control device 300.
The drive controller 310 controls each of the servo drivers 320 for driving the motors 330-1, 330-2, 330-3, ..., 330-n (hereinafter also collectively referred to as the "motors 330") that operate the respective axes of the robot 2. In accordance with commands from the drive controller 310, each servo driver 320 drives the connected motor 330 with the specified torque, acceleration, and rotation speed in the specified direction.
FIG. 6 shows a configuration example in which the necessary functions are provided by the processor 302 executing programs; some or all of these provided functions may instead be implemented using dedicated hardware circuits (for example, an ASIC or an FPGA). Alternatively, the main part of the robot control device 300 may be realized using hardware that conforms to a general-purpose architecture (for example, an industrial personal computer based on a general-purpose personal computer). In that case, virtualization technology may be used to execute a plurality of OSes with different uses in parallel and to execute the necessary applications on each OS.
<D. Calibration>
Next, calibration in robot control system 1 according to the present embodiment will be described.
(d1: Overview)
In the robot control system 1, it is necessary to execute a plurality of types of calibration.
One of the plurality of types of calibration is calibration of the coordinate system defined with the field of view of the 3D camera 10 as a reference (hereinafter also referred to as the "camera coordinate system"). In this calibration of the camera coordinate system, a reference object on which a known pattern is drawn is sequentially placed at a plurality of known positions, and the positions and postures corresponding to the positions and postures measured by the image measurement device 100 are acquired. Then, the correspondence between the position and posture of the reference object and the position and posture in the three-dimensional coordinate system (typically, calibration parameters that define a conversion formula) is determined. Using these calibration parameters, the three-dimensional coordinates of an arbitrary subject placed in the camera coordinate system can be measured accurately.
Here, a position in the three-dimensional coordinate system means the coordinates defined by the values on the respective axes of the three-dimensional coordinate system, and a posture in the three-dimensional coordinate system is defined by the rotation directions about the respective axes of the three-dimensional coordinate system. That is, a position in the three-dimensional coordinate system is a three-dimensional value indicating at which coordinates an object exists in the three-dimensional coordinate system, and a posture in the three-dimensional coordinate system is a three-dimensional value indicating in which direction the object is oriented in the three-dimensional coordinate system.
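Such a three-value position and three-value posture are commonly packed into a single 4x4 homogeneous pose matrix. The Python sketch below assumes a Z-Y-X rotation order, which is one common convention and not something prescribed by the present technology.

```python
import numpy as np

def pose_matrix(x, y, z, roll, pitch, yaw):
    """Pack a 3-D position and three rotation angles (radians, about the
    X, Y, and Z axes) into one 4x4 homogeneous pose, Z-Y-X order."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cw, sw = np.cos(yaw), np.sin(yaw)
    rot_z = np.array([[cw, -sw, 0.0], [sw, cw, 0.0], [0.0, 0.0, 1.0]])
    rot_y = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    rot_x = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    pose = np.eye(4)
    pose[:3, :3] = rot_z @ rot_y @ rot_x   # combined orientation
    pose[:3, 3] = (x, y, z)                # position on each axis
    return pose

# Example: 90-degree rotation about the Z axis at position (100, 0, 250).
print(pose_matrix(100.0, 0.0, 250.0, 0.0, 0.0, np.pi / 2))
```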
Another of the plurality of types of calibration is calibration of the coordinate system defined with the base position of the robot 2 as a reference (hereinafter also referred to as the "robot reference coordinate system"). The calibration of the robot reference coordinate system is processing that guarantees that, when the three-dimensional coordinates of a target position and a target posture are given as a command from the control device 200 to the robot control device 300, the tip of the robot 2 moves to the designated target position. Through the calibration of the robot reference coordinate system, the arithmetic expressions and the like used in the robot control device 300 to drive the motors 330 of the robot 2 are corrected.
The robot control system 1 according to the present embodiment is capable of executing calibration for determining the correspondence between the camera coordinate system and the robot reference coordinate system (also referred to as "camera/robot calibration").
The following description mainly concerns the camera/robot calibration.
FIG. 7 is a diagram for explaining one aspect of the camera/robot calibration in the robot control system 1 according to the present embodiment. Referring to FIG. 7, the robot control system 1 measures three-dimensional coordinates based on images captured by the 3D camera 10 and controls the robot 2 based on the measured values. It is therefore efficient to perform the camera/robot calibration in the robot control system 1 within the field of view of the 3D camera 10. Furthermore, even within the field of view of the 3D camera 10, depending on the target application, only a part of the spatial region may be used rather than the entire field of view. In such a case, the camera/robot calibration need only be executed for the spatial region actually used (that is, the spatial region in which the action part of the robot 2 can exist).
In the robot control system 1 according to the present embodiment, the user can arbitrarily set the spatial region used for the camera/robot calibration (hereinafter also referred to as the "calibration region"). The camera/robot calibration is executed in accordance with the calibration region arbitrarily set by the user.
(d2: Setting the calibration region)
FIG. 8 is a schematic diagram showing an example of a user interface screen provided by the robot control system 1 according to the present embodiment. In the present embodiment, as one example, the setting of the calibration region 418 is received by the setting reception program 1066 of the image measurement device 100, and the user interface screen 400 shown in FIG. 8 is typically displayed on the display unit 110 (FIG. 4) of the image measurement device 100. However, the screen is not limited to the image measurement device 100 and may instead be displayed on a display device (not shown) connected to the control device 200. Any form of display and input may be used, as long as the settings are given to the image measurement device 100 and/or the control device 200.
In the present embodiment, it is assumed that the 3D camera 10 is optically designed so that the cross section of its effective field of view is rectangular. Therefore, on the user interface screen 400 shown in FIG. 8, the effective field of view of the 3D camera 10 is defined by two orthogonal directions corresponding to the rectangular cross section (referred to as the "X-axis direction" and the "Y-axis direction" for convenience). That is, the user interface screen 400 includes an X cross-section setting object 401 and a Y cross-section setting object 402.
The X cross-section setting object 401 and the Y cross-section setting object 402 have effective field-of-view displays 403 and 404, respectively. The effective field-of-view displays 403 and 404 indicate the range in which the 3D camera 10 can measure three-dimensional coordinates. That is, the field-of-view range 416 of the 3D camera 10, which is the imaging unit, is also displayed.
On the user interface screen 400, the user sets the region actually used by the application, that is, the calibration region 418.
More specifically, a measurement bottom surface setting bar 410 and a measurement top surface setting bar 411 are provided so as to span both the X cross-section setting object 401 and the Y cross-section setting object 402. The user operates the measurement bottom surface setting bar 410 and the measurement top surface setting bar 411 to set the topmost and bottommost surfaces of the calibration region 418. The topmost and bottommost surfaces set the effective range along the optical axis AX of the 3D camera 10 (the effective range on the Z axis). That is, as part of the setting of the calibration region 418, a range along the optical axis AX of the 3D camera 10, which is the imaging unit, can be set.
Typically, the bottommost surface set by the measurement bottom surface setting bar 410 is set in consideration of, for example, the floor surface on which the robot 2 is placed. The topmost surface set by the measurement top surface setting bar 411 is set in consideration of, for example, the range in which the robot 2 grips and transports workpieces.
The X cross-section setting object 401 is provided with X-axis direction width setting bars 412 and 413. The user operates the X-axis direction width setting bars 412 and 413 to set the width of the calibration region 418 in the X-axis cross section. That is, the X-axis direction width setting bars 412 and 413 set the effective range along the direction orthogonal to the optical axis AX of the 3D camera 10 (the effective range on the X axis).
Similarly, the Y cross-section setting object 402 is provided with Y-axis direction width setting bars 414 and 415. The user operates the Y-axis direction width setting bars 414 and 415 to set the width of the calibration region 418 in the Y-axis cross section. That is, the Y-axis direction width setting bars 414 and 415 set the effective range along the direction orthogonal to the optical axis AX of the 3D camera 10 (the effective range on the Y axis).
As shown in FIG. 8, the user can set an arbitrary calibration region 418 on the user interface screen 400. That is, the robot control system 1 has a setting reception function that receives, for the camera/robot calibration, the setting of the calibration region 418, which is the region in which the reference plate 20 associated with the action part of the robot 2 is to be arranged.
The camera/robot calibration is executed based on the set calibration region 418. That is, the range of the calibration region 418 in the height direction is defined by the measurement bottom surface setting bar 410 and the measurement top surface setting bar 411, and the cross section of the calibration region 418 is defined by the X-axis direction width setting bars 412 and 413 and the Y-axis direction width setting bars 414 and 415. As a result, the calibration region 418 is set as a rectangular parallelepiped (including a cube) referenced to the optical axis AX of the 3D camera 10, which is the imaging unit. The cross-sectional width of the calibration region 418 is then determined according to the field-of-view range 416 of the 3D camera 10. That is, the size of the cross section of the rectangular parallelepiped set as the calibration region 418 is determined according to the field-of-view range 416 of the 3D camera 10 and the distance from the 3D camera 10 to the end face of the calibration region 418.
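The relationship between the camera's field of view and the largest cuboid cross section it can still cover is simple trigonometry. The following Python sketch assumes a symmetric pyramidal field of view specified by full angles; the actual optics of the 3D camera 10 are not specified here, so the numbers are illustrative.

```python
import math

def max_cuboid_cross_section(fov_x_deg, fov_y_deg, near_distance):
    """Width and height of the largest rectangular cross section that
    stays inside the field of view at the cuboid face nearest the
    camera; the deeper faces of the cuboid then fit automatically."""
    half_x = math.radians(fov_x_deg) / 2.0
    half_y = math.radians(fov_y_deg) / 2.0
    width = 2.0 * near_distance * math.tan(half_x)
    height = 2.0 * near_distance * math.tan(half_y)
    return width, height

# Example: 60 x 45 degree FOV, topmost surface of the region 500 mm away.
print(max_cuboid_cross_section(60.0, 45.0, 500.0))
```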
Note that the calibration region 418 does not necessarily have to be set as a rectangular parallelepiped; depending on the shape of the effective field of view of the 3D camera 10, a shape whose cross section becomes smaller toward the 3D camera 10 (for example, a quadrangular pyramid with its apex cut off) may be set. However, assuming real applications, setting a rectangular parallelepiped is preferable from the viewpoint of measurement stability and the like.
The camera/robot calibration described later is executed in accordance with the calibration region 418 set by the above operations. The calibration region 418 is set arbitrarily according to the target application. For example, it may be set according to the shape of the target workpieces. Specifically, in an application such as pick-and-place, in which placed workpieces are gripped and transported to a designated position, the calibration region 418 may be set based on, for example, the shape of the container holding the workpieces.
As described above, in the present embodiment, the user sets the calibration region 418 (that is, the region in which measurement of three-dimensional coordinates is required) on the user interface screen 400. Typically, the user adjusts each setting bar displayed in the user interface screen 400 by mouse operation or the like.
The image measurement device 100 and/or the control device 200 then determine, based on the set calibration region 418, relative movement positions for the reference plate 20 (FIG. 2) that cover the calibration region 418 without excess or deficiency. Thereafter, the user places the reference plate 20 at the center of the field-of-view range 416 of the 3D camera 10 and instructs execution of the camera/robot calibration. A command is thereby given to the robot 2, and the camera/robot calibration is executed.
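How the movement positions are spread over the calibration region 418 "without excess or deficiency" is not spelled out here; one plausible scheme is a uniform grid inside the cuboid, as in the following sketch (the function name and the 3 x 3 x 3 density are assumptions for illustration).

```python
import numpy as np

def placement_grid(x_range, y_range, z_range, n=3):
    """Uniform grid of reference-plate placement positions inside a
    cuboid calibration region given as (min, max) ranges per axis."""
    xs = np.linspace(*x_range, n)
    ys = np.linspace(*y_range, n)
    zs = np.linspace(*z_range, n)
    return [(x, y, z) for z in zs for y in ys for x in xs]

# Example: a 3 x 3 x 3 grid (27 poses) in a 400 x 300 x 200 mm region.
poses = placement_grid((-200.0, 200.0), (-150.0, 150.0), (300.0, 500.0))
print(len(poses), poses[0], poses[-1])
```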
(d3: Overview of camera/robot calibration)
Here, an outline of the camera/robot calibration will be described. The camera/robot calibration typically includes processing for calculating the coefficients (parameters) of matrices for mutually converting positions and postures in the camera coordinate system and positions and postures in the robot reference coordinate system.
Here, the handpiece 6 is attached to the tip of the robot 2, and a coordinate system that defines the position of the handpiece 6 (hereinafter also referred to as the "robot tip coordinate system") is introduced. Furthermore, a coordinate system that defines the positions of the markers 22 on the reference plate 20 mounted on the handpiece 6 (hereinafter also referred to as the "marker coordinate system") is introduced.
Let A be the matrix representing the relationship between the camera coordinate system of the 3D camera 10 and the marker coordinate system. This matrix A can be estimated by recognizing the markers 22 with the 3D camera 10. Let B be the matrix representing the relationship between the robot tip coordinate system and the robot reference coordinate system. This matrix B corresponds to the command from the control device 200. Then, a matrix X and a matrix Z satisfying the following relationship are estimated.
  AX = ZB
 Using the estimated matrices X and Z, a calculation formula is obtained that converts a position and orientation in the camera coordinate system into a position and orientation in the robot reference coordinate system.
 That is, in camera-robot calibration, the value measured by the image measurement device 100 and the command from the control device 200 at each position are known, and the matrices X and Z are estimated based on the set of these values.
 Estimating these matrices requires a data set of three-dimensional coordinates (for example, 10 to 20 points) referenced to the camera coordinate system and to the robot reference coordinate system, respectively. Therefore, the reference plate 20, on whose surface one or more markers 22 are drawn, is attached to the tip of the robot 2, and the following operations a) and b) are repeated 10 to 20 times.
 a) Place the tip of the robot 2 at a predetermined position such that the marker 22 is within the field of view of the 3D camera 10, and acquire the position and orientation of the marker 22 in the robot reference coordinate system at that time (the position and orientation of the tip of the robot 2).
 b) In the state of a), capture an image with the 3D camera 10 to acquire the position and orientation of the marker 22 in the camera coordinate system.
 Using the set of positions and orientations acquired by repeating operations a) and b) (10 to 20 data sets), the matrices described above are estimated. Using the estimated matrices, the position and orientation of an arbitrary subject in the robot reference coordinate system can be calculated based on the positional relationship between the robot 2 and the 3D camera 10 obtained by camera-robot calibration.
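 For reference, the estimation step above can be prototyped with off-the-shelf tooling. The following is a minimal sketch, assuming OpenCV 4.5 or later (which provides cv2.calibrateRobotWorldHandEye, a solver for the AX = ZB formulation) and that the 10 to 20 pose pairs have already been collected into a list named pose_pairs; the variable names and frame conventions are illustrative assumptions, not taken from the patent.
```python
# Minimal AX = ZB sketch, assuming OpenCV >= 4.5. pose_pairs is assumed to
# hold one (R_A, t_A, R_B, t_B) tuple per reference-plate placement, where
# (R_A, t_A) is the marker pose measured by the 3D camera (matrix A) and
# (R_B, t_B) is the robot tip pose from the controller command (matrix B).
import cv2
import numpy as np

R_A = [p[0] for p in pose_pairs]
t_A = [p[1] for p in pose_pairs]
R_B = [p[2] for p in pose_pairs]
t_B = [p[3] for p in pose_pairs]

# Solve A X = Z B for the two unknown rigid transforms X and Z.
R_X, t_X, R_Z, t_Z = cv2.calibrateRobotWorldHandEye(R_A, t_A, R_B, t_B)

def to_homogeneous(R, t):
    """Pack a 3x3 rotation and a 3-vector translation into a 4x4 transform."""
    T = np.eye(4)
    T[:3, :3] = np.asarray(R)
    T[:3, 3] = np.asarray(t).ravel()
    return T

X = to_homogeneous(R_X, t_X)
Z = to_homogeneous(R_Z, t_Z)
# X and Z together give the conversion between camera-frame and
# robot-reference-frame positions and orientations described in the text.
```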
 <E. Camera-Robot Calibration (Automatic)>
 Next, a processing example in which camera-robot calibration is executed automatically in the robot control system 1 according to the present embodiment will be described.
 FIG. 9 is a sequence diagram showing an example of an automatic processing procedure of camera-robot calibration in the robot control system 1 according to the present embodiment. As shown in FIG. 9, the automatic camera-robot calibration processing is executed mainly by the image measurement device 100 and the control device 200.
 First, when execution of camera-robot calibration is instructed, the image measurement device 100 displays on the display unit 110 the user interface screen 400 for accepting the setting of the calibration area (sequence SQ100). The user operates the input unit 108, such as a mouse, to set a calibration area on the user interface screen 400 (sequence SQ102). The image measurement device 100 accepts the calibration area set by the user in sequence SQ102 (sequence SQ104).
 Subsequently, the image measurement device 100 determines, based on the image captured by the 3D camera 10, whether the reference plate 20 is present within the field of view of the 3D camera 10 (sequence SQ106).
 If the reference plate 20 is not present within the field of view of the 3D camera 10 (NO in sequence SQ106), the image measurement device 100 notifies the user to operate the robot 2 with a teaching pendant or the like so as to place the reference plate 20 within the field of view of the 3D camera 10 (sequence SQ108). The processing of sequence SQ106 is then repeated.
 If the reference plate 20 is present within the field of view of the 3D camera 10 (YES in sequence SQ106), the image measurement device 100, upon a user operation (sequence SQ110), transmits a camera-robot calibration start command to the control device 200 together with the information of the calibration area accepted in sequence SQ104 (sequence SQ112). The calibration area information includes the positions in the camera coordinate system of the vertices of the calibration area (eight positions in the case of a rectangular parallelepiped).
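 The eight-vertex payload can be pictured as follows: a box centered on the camera optical axis, described by X/Y half-widths and a Z range along the axis. This is a hypothetical helper for illustration only; the parameter names are not identifiers from the patent.
```python
# Hypothetical helper: enumerate the 8 camera-frame corner points of a box
# calibration area centered on the optical axis (X/Y half-widths, Z range).
from itertools import product

def calibration_area_vertices(x_half, y_half, z_min, z_max):
    return [(sx * x_half, sy * y_half, z)
            for sx, sy, z in product((-1.0, 1.0), (-1.0, 1.0), (z_min, z_max))]

vertices = calibration_area_vertices(100.0, 80.0, 300.0, 600.0)  # e.g. in mm
assert len(vertices) == 8
```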
 The control device 200 first acquires the positions in the robot reference coordinate system corresponding to the calibration area defined in the camera coordinate system (sequences SQ200 to SQ206).
 Specifically, the control device 200 selects one of the positions in the camera coordinate system indicating the vertices of the calibration area acquired from the image measurement device 100 (sequence SQ200). The image measurement device 100 acquires the position in the camera coordinate system of the marker 22 on the reference plate 20, measured from the image captured by the 3D camera 10, and provides it to the control device 200 (sequence SQ120). Based on the difference between the position selected in sequence SQ200 and the position of the marker 22 in the camera coordinate system, the control device 200 calculates a command (a position in the robot reference coordinate system) for placing the marker 22 on the reference plate 20 at the selected position (sequence SQ202), and gives the calculated command to the robot control device 300 (sequence SQ204).
 The processing of sequences SQ120, SQ202, and SQ204 (the processing marked *1 in FIG. 9) is repeated until the position selected in sequence SQ200 and the position of the marker 22 on the reference plate 20 in the camera coordinate system substantially match. When they substantially match, the control device 200 stores the command at that time, that is, the position in the robot reference coordinate system, as the position corresponding to that vertex of the calibration area (sequence SQ206).
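 The *1 loop is essentially a measure-compare-move feedback cycle. The sketch below assumes that a small camera-frame error can be mapped to a robot-frame correction using a rough rotation estimate R_cam2base (for example, from a prior coarse calibration); measure_marker, move_robot, the gain, and the tolerance are illustrative stand-ins rather than interfaces defined in the patent.
```python
import numpy as np

def drive_marker_to(target_cam, measure_marker, move_robot,
                    R_cam2base, tol=0.5, gain=0.8, max_iter=50):
    """Repeat measure -> compare -> move until the marker reaches target_cam."""
    for _ in range(max_iter):
        current = np.asarray(measure_marker())      # marker position, camera frame
        err_cam = np.asarray(target_cam) - current
        if np.linalg.norm(err_cam) < tol:           # "substantially match"
            return current                          # record the command at this point
        move_robot(gain * (R_cam2base @ err_cam))   # relative robot-frame step
    raise RuntimeError("marker did not converge to the selected vertex")
```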
 The processing of sequences SQ200, SQ120, SQ202, SQ204, and SQ206 (the processing marked *2 in FIG. 9) is repeated for the number of vertices of the calibration area acquired from the image measurement device 100.
 Through the above processing, the set of positions in the robot reference coordinate system corresponding to the vertices of the calibration area (eight positions in the case of a rectangular parallelepiped) is acquired. If necessary, the orientations in the robot reference coordinate system corresponding to the vertices of the calibration area are also acquired.
 Subsequently, the control device 200 determines, based on the set of positions in the robot reference coordinate system acquired in sequence SQ206, the set of positions at which the reference plate 20 should be placed (sequence SQ208). In sequence SQ208, a plurality of positions in the robot reference coordinate system are determined. In addition to the set of placement positions, the control device 200 may also determine a trajectory or the like along which the reference plate 20 is to be moved.
 The set of positions at which the reference plate 20 should be placed may be determined, for example, by dividing the span between the vertices of the calibration area evenly into a predetermined number. Alternatively, instead of an even division, more positions may be set in regions farther from the 3D camera 10.
 As described above, 10 to 20 data sets are required for camera-robot calibration, so it is preferable to determine 10 to 20 positions in sequence SQ208 as well.
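 One way to realize such a placement set is sketched below: an even grid over the box-shaped area, with an optional bias that samples more levels far from the camera. The subdivision counts and the bias rule are illustrative assumptions; the patent leaves the subdivision scheme open.
```python
import numpy as np

def placement_positions(x_half, y_half, z_min, z_max,
                        nx=3, ny=3, nz=2, bias_far=False):
    """Generate candidate reference-plate positions inside the box area."""
    xs = np.linspace(-x_half, x_half, nx)
    ys = np.linspace(-y_half, y_half, ny)
    if bias_far:
        # sqrt spacing packs the levels toward z_max, i.e. far from the camera
        zs = z_min + (z_max - z_min) * np.sqrt(np.linspace(0.0, 1.0, nz + 1)[1:])
    else:
        zs = np.linspace(z_min, z_max, nz)
    return [(x, y, z) for z in zs for y in ys for x in xs]

pts = placement_positions(100.0, 80.0, 300.0, 600.0)
print(len(pts))  # 3 x 3 x 2 = 18 positions, within the preferred 10-20 range
```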
 Then, the control device 200 selects one of the plurality of placement positions for the reference plate 20 determined in sequence SQ208 (sequence SQ210), and gives the robot control device 300 a command instructing placement of the reference plate 20 at the selected position (a position in the robot reference coordinate system) (sequence SQ212). The processing of sequences SQ210 and SQ212 executed by the control device 200 corresponds to a placement control function that sequentially gives commands to the robot 2 to sequentially place the reference plate 20 (reference object) within the calibration area.
 When the robot 2 places the reference plate 20 at the designated position, the image measurement device 100 acquires the position in the camera coordinate system of the marker 22 on the reference plate 20, measured from the image captured by the 3D camera 10, and provides it to the control device 200 (sequence SQ122).
 The control device 200 stores, in association with each other, the position (and orientation) in the robot reference coordinate system indicating the current placement of the reference plate 20 and the position (and orientation) in the camera coordinate system measured from the image captured by the 3D camera 10 (sequence SQ214).
 The processing of sequences SQ210, SQ122, SQ212, and SQ214 (the processing marked *3 in FIG. 9) is repeated for the number of placement positions for the reference plate 20 determined in sequence SQ208.
 Through the above processing, the data sets indicating the correspondence between positions in the camera coordinate system and positions in the robot reference coordinate system, required for camera-robot calibration, are acquired. Finally, the control device 200 executes predetermined arithmetic processing based on the acquired data sets to estimate the matrices defining the correspondence between the camera coordinate system and the robot reference coordinate system (sequence SQ216). The estimated matrices are stored as the result of camera-robot calibration. That is, the control device 200 calculates the calibration parameters (the correspondence) based on the sets of the three-dimensional coordinates of the reference plate 20 and the position and orientation of the action part of the robot 2, acquired as the reference plate 20 (reference object) is sequentially placed within the calibration area.
 The execution of camera-robot calibration is thus completed.
 FIG. 10 is a schematic diagram showing an example of a user interface screen provided in the automatic camera-robot calibration processing in the robot control system 1 according to the present embodiment. The user interface screen 420 shown in FIG. 10 simultaneously displays the three-dimensional measurement result for the reference plate 20 obtained by imaging with the 3D camera 10 and a two-dimensional image of the reference plate 20 obtained by imaging with the 3D camera 10. Furthermore, the user interface screen 420 displays the calibration area preset by the user and the height of the reference plate 20.
 Specifically, on the user interface screen 420, an effective area frame 422 indicating the preset calibration area is displayed over the image captured by the 3D camera 10. The user interface screen 420 includes a height display bar 424, which displays the measurable range along the Z axis. It is assumed that the distance from the 3D camera 10 to the base of the robot 2 has been input as an initial setting. An indicator 426 indicating the position of the reference plate 20 in the height direction, as measured by the image measurement device 100, is displayed in association with the height display bar 424.
 As shown in FIGS. 10(A) and 10(B), the size of the cross section of the calibration area differs according to the height of the reference plate 20, so on the user interface screen 420 the size of the effective area frame 422 also changes according to the height of the reference plate 20.
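 The change in frame size follows directly from perspective projection: under a simple pinhole model, a fixed physical cross-section of width W at distance z from the camera projects to roughly f * W / z pixels. The focal length value in the sketch below is an illustrative assumption.
```python
def projected_width_px(width_mm, z_mm, f_px=1200.0):
    """Pinhole-model on-screen width of the effective area frame (pixels)."""
    return f_px * width_mm / z_mm

print(projected_width_px(200.0, 300.0))  # plate near the camera: larger frame
print(projected_width_px(200.0, 600.0))  # plate farther away: smaller frame
```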
 Referring to the user interface screen 420, if the reference plate 20 is not within the set calibration area, the user operates a teaching pendant or the like to adjust the robot so that the reference plate 20 is placed within the calibration area.
 When the user completes the initial setting with reference to the user interface screen 420 shown in FIG. 10, the range in which the robot 2 should move is automatically determined according to the set calibration area, as described above. At this time, the two-dimensional image captured by the 3D camera 10 and the three-dimensional measurement result are used as feedback information to sequentially determine movement commands for the robot 2.
 The movement step amount is then determined by dividing evenly between the upper and lower limits of the determined movement range of the robot 2. By acquiring two-dimensional images and performing three-dimensional measurement while moving the reference plate 20 by the determined step amount, the correspondence between positions in the camera coordinate system and positions and orientations in the robot reference coordinate system is acquired. Finally, based on the correlation between the acquired positions in the camera coordinate system and the positions and orientations in the robot reference coordinate system, camera-robot calibration is executed to calculate the necessary parameters.
 <F. Camera-Robot Calibration (Manual)>
 Next, a processing example in which camera-robot calibration is executed manually in the robot control system 1 according to the present embodiment will be described.
 FIG. 11 is a sequence diagram showing an example of a manual processing procedure of camera-robot calibration in the robot control system 1 according to the present embodiment. As shown in FIG. 11, the manual camera-robot calibration processing is executed mainly by the image measurement device 100 and the control device 200.
 First, when execution of camera-robot calibration is instructed, the image measurement device 100 displays on the display unit 110 the user interface screen 400 for accepting the setting of the calibration area (sequence SQ150). The user operates the input unit 108, such as a mouse, to set a calibration area on the user interface screen 400 (sequence SQ152). The image measurement device 100 accepts the calibration area set by the user in sequence SQ152 (sequence SQ154).
 Subsequently, the image measurement device 100 determines, based on the set calibration area, the set of positions at which the reference plate 20 should be placed (sequence SQ156). That is, as part of its calibration execution function, the image measurement device 100 has a position determination function that determines a plurality of positions at which the reference plate 20 (reference object) is to be placed in the calibration area. In sequence SQ156, a plurality of positions in the camera coordinate system are determined. In addition to the set of placement positions, the image measurement device 100 may also determine a trajectory or the like along which the reference plate 20 is to be moved.
 The set of positions at which the reference plate 20 should be placed may be determined, for example, by dividing the span between the vertices of the calibration area evenly into a predetermined number. Alternatively, instead of an even division, more positions may be set in regions farther from the 3D camera 10.
 As described above, 10 to 20 data sets are required for camera-robot calibration, so it is preferable to determine 10 to 20 positions in sequence SQ156 as well.
 Subsequently, the image measurement device 100 selects one of the plurality of placement positions for the reference plate 20 determined in sequence SQ156 as the target position (sequence SQ158). The image measurement device 100 then acquires the current position in the camera coordinate system of the marker 22 on the reference plate 20 based on the image captured by the 3D camera 10 (sequence SQ160), and determines, based on the difference between the selected target position and the acquired current position, whether the reference plate 20 is placed at the target position (sequence SQ162).
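 The SQ162 decision reduces to a tolerance test on the camera-frame difference. A minimal sketch, assuming a Euclidean threshold (the tolerance value is an assumption, not a figure from the patent):
```python
import numpy as np

def at_target(current_cam, target_cam, tol_mm=1.0):
    """True when the marker is 'placed at the target position' within tol_mm."""
    return np.linalg.norm(np.asarray(target_cam) - np.asarray(current_cam)) < tol_mm
```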
 If the reference plate 20 is not placed at the target position (NO in sequence SQ162), the image measurement device 100 notifies the user of information for placing the reference plate 20 at the target position (sequence SQ164). Referring to this notification, the user operates a teaching pendant or the like to adjust the position and orientation of the robot 2 so that the reference plate 20 is placed at the target position. The processing from sequence SQ160 onward is then repeated.
 If, on the other hand, the reference plate 20 is placed at the target position (YES in sequence SQ162), the image measurement device 100 notifies the user that the reference plate 20 is placed at the target position (sequence SQ166), and transmits to the control device 200 the position (and orientation) in the camera coordinate system of the marker 22 on the reference plate 20 (sequence SQ168).
 The control device 200 then acquires from the robot control device 300 the position (and orientation) in the robot reference coordinate system indicating the current placement of the reference plate 20 (sequence SQ250), and stores it in association with the position (and orientation) in the camera coordinate system received from the image measurement device 100 (sequence SQ252).
 The processing of sequences SQ158 to SQ168, SQ250, and SQ252 (the processing marked *4 in FIG. 11) is repeated for the number of placement positions for the reference plate 20 determined in sequence SQ156.
 Through the above processing, the data sets indicating the correspondence between positions in the camera coordinate system and positions in the robot reference coordinate system, required for camera-robot calibration, are acquired. Finally, the control device 200 executes predetermined arithmetic processing based on the acquired data sets to estimate the matrices defining the correspondence between the camera coordinate system and the robot reference coordinate system (sequence SQ254). The estimated matrices are stored as the result of camera-robot calibration. That is, the control device 200 calculates the calibration parameters (the correspondence) based on the sets of the three-dimensional coordinates of the reference plate 20 and the position and orientation of the action part of the robot 2, acquired as the reference plate 20 (reference object) is sequentially placed at the determined positions.
 The execution of camera-robot calibration is thus completed.
 FIG. 12 is a schematic diagram showing an example of a user interface screen provided in the manual camera-robot calibration processing in the robot control system 1 according to the present embodiment. The user interface screen 430 shown in FIG. 12 simultaneously displays the three-dimensional measurement result for the reference plate 20 obtained by imaging with the 3D camera 10 and a two-dimensional image of the reference plate 20 obtained by imaging with the 3D camera 10. Furthermore, the user interface screen 430 displays the calibration area preset by the user and the height of the reference plate 20.
 Specifically, on the user interface screen 430, an effective area frame 422 indicating the preset calibration area is displayed over the image captured by the 3D camera 10. In association with the effective area frame 422, an indicator 432 indicating the two-dimensional position of the selected target position is displayed.
 Since the size of the cross section of the calibration area differs according to the height of the reference plate 20, the size of the effective area frame 422 on the user interface screen 430 also changes according to the height of the reference plate 20.
 The user interface screen 430 includes a height display bar 424, which displays the measurable range along the Z axis. It is assumed that the distance from the 3D camera 10 to the base of the robot 2 has been input as an initial setting. An indicator 426 indicating the position of the reference plate 20 in the height direction, as measured by the image measurement device 100, and an indicator 428 indicating the height of the selected target position are displayed in association with the height display bar 424.
 As shown in FIG. 12, the image measurement device 100 provides a position display function that indicates the relationship between one of the determined placement positions for the reference plate 20 and the current position of the reference plate 20 (reference object).
 By referring to the user interface screen 430, the user can place the reference plate 20 at the appropriate position. That is, the user operates the robot 2 so that the placement height of the reference plate 20 matches the predesignated height and so that the indicator 432 indicating the two-dimensional target position coincides with the center of the reference plate 20. Even while the robot 2 is being operated, the acquisition of two-dimensional images and the three-dimensional measurement by the 3D camera 10 are executed repeatedly.
 When the reference plate 20 is placed at the selected target position, the position in the camera coordinate system and the position and orientation in the robot reference coordinate system at that time are stored in association with each other. This series of operations and processing is repeated the number of times required to execute camera-robot calibration. Finally, based on the correlation between the acquired positions in the camera coordinate system and the positions and orientations in the robot reference coordinate system, camera-robot calibration is executed to calculate the necessary parameters.
 If the user operates the robot incorrectly and the reference plate 20 ends up outside the calibration area, some notification may be given to the user. Such notification may take the form of a change of display color, display of a cautionary message, generation of an alarm sound, or the like. The user interface screen 430 shown in FIG. 12(B) shows an example in which a warning message 434 is displayed. By issuing such a warning, camera-robot calibration can be executed efficiently.
 Alternatively, when the position of the reference plate 20 deviates greatly from the appropriate region due to an erroneous user operation, the robot 2 may be forcibly stopped, either by a command given from the control device 200 to the robot control device 300 or by using a protection function or the like implemented in the robot control device 300.
 <G. Modifications>
 The following modifications can be made to the embodiment described above.
 (g1: Implementation forms)
 In the above description, an implementation in which the image measurement device 100 and the control device 200 cooperate to realize the camera-robot calibration according to the present embodiment has been mainly described, but the implementation is not limited to this.
 For example, the functions provided by the image measurement device 100 and the control device 200 may be interchanged, or the image measurement device 100 and the control device 200 may be configured as an integrated device. Furthermore, they may be configured as an integrated device that also includes the robot control device 300.
 That is, any implementation may be adopted as long as it can provide the processing and functions described above.
 (g2: Automatic and manual processing)
 For convenience of explanation, the robot control system 1 capable of executing both the automatic and the manual camera-robot calibration processing has been illustrated, but it is not always necessary to implement both. Only one of the two may be implemented according to the required specifications, the application, and so on.
 (g3: Positions and path of the reference plate 20)
 The positions at which the reference plate 20 is placed in order to acquire the data sets necessary for executing camera-robot calibration can be set arbitrarily. The path along which the reference plate 20 is sequentially placed at the plurality of set positions can also be set arbitrarily. However, to improve work efficiency, the shortest path may be set for this placement sequence according to a predetermined optimization algorithm.
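 As one concrete instance of the "predetermined optimization algorithm" mentioned above, a greedy nearest-neighbor ordering of the placement positions is sketched below; this particular heuristic is an illustrative assumption, not a method prescribed by the patent.
```python
import numpy as np

def nearest_neighbor_path(points, start=0):
    """Visit each placement position, always moving to the closest unvisited one."""
    pts = np.asarray(points, dtype=float)
    order = [start]
    remaining = set(range(len(pts))) - {start}
    while remaining:
        last = pts[order[-1]]
        nxt = min(remaining, key=lambda i: np.linalg.norm(pts[i] - last))
        order.append(nxt)
        remaining.remove(nxt)
    return order  # index order in which to place the reference plate
```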
 <H. Addendum>
 The embodiment described above includes the following technical ideas.
[Configuration 1]
 A robot control system (1) comprising:
 an imaging unit (10) arranged so as to include an action part (8) of a robot (2) within its field of view;
 a measurement unit (52; 100) that measures, based on an image captured by the imaging unit, the three-dimensional coordinates of an arbitrary object present within the field of view of the imaging unit;
 a command generation unit (54; 200) that generates a command for positioning the action part of the robot according to a pre-calculated correspondence between the measured three-dimensional coordinates and the position and orientation of the action part of the robot;
 a calibration execution unit (58; 100; 200) that executes calibration for calculating the correspondence; and
 a setting reception unit (60; 400) that receives, for the calibration, the setting of a calibration area (418), which is a region in which a reference object (20) associated with the action part of the robot is to be placed.
[Configuration 2]
 The robot control system according to Configuration 1, wherein the setting reception unit indicates the range of the set calibration area with the imaging unit as a reference.
[Configuration 3]
 The robot control system according to Configuration 2, wherein the setting reception unit also displays the visual field range (416) of the imaging unit.
[Configuration 4]
 The robot control system according to any one of Configurations 1 to 3, wherein the calibration area is set as a rectangular parallelepiped referenced to the optical axis (AX) of the imaging unit.
[Configuration 5]
 The robot control system according to Configuration 4, wherein the size of the cross section of the rectangular parallelepiped set as the calibration area is determined according to the field of view of the imaging unit and the distance from the imaging unit to an end face of the calibration area.
[Configuration 6]
 The robot control system according to Configuration 4 or 5, wherein the setting reception unit receives, as the setting of the calibration area, a range setting along the optical axis (AX) of the imaging unit.
[Configuration 7]
 The robot control system according to any one of Configurations 1 to 6, wherein the calibration execution unit includes:
 a placement control unit (200; SQ210, SQ212) that sequentially gives commands to the robot to sequentially place the reference object within the calibration area; and
 a calculation unit (200; SQ216) that calculates the correspondence based on sets of the three-dimensional coordinates of the reference object and the position and orientation of the action part of the robot, acquired when the reference object is sequentially placed within the calibration area.
[Configuration 8]
 The robot control system according to any one of Configurations 1 to 6, wherein the calibration execution unit includes:
 a position determination unit (100; SQ156) that determines a plurality of positions at which the reference object is to be placed in the calibration area;
 a position display unit (100; SQ164) that indicates the relationship between one of the determined positions and the current position of the reference object; and
 a calculation unit (200; SQ254) that calculates the correspondence based on sets of the three-dimensional coordinates of the reference object and the position and orientation of the action part of the robot, acquired when the reference object is sequentially placed at the determined positions.
 <I. Advantages>
 When executing camera-robot calibration between the camera coordinate system associated with images captured by a 3D camera and the robot reference coordinate system for controlling the robot, as in the robot control system according to the present embodiment, the reference object must be placed at predetermined positions by the robot, and specifying the position and orientation of the reference object is not easy.
 That is, the reference object must be kept within the measurement field of view and must not be brought into contact with the floor surface, and an inexperienced user may find it difficult to operate the robot while satisfying such requirements.
 Furthermore, when camera calibration is executed based on two-dimensional images captured by the 3D camera, regions on the far side of the 3D camera where three-dimensional measurement is impossible may also be included in the target, so camera calibration ends up being executed over wasted regions. In addition, the size of the effective field of view differs between the near side and the far side of the 3D camera, so when positioning the reference object along the optical axis direction, it is difficult to automatically set the range over which the robot should operate.
 To address such problems, in the robot control system according to the present embodiment, the region in which camera-robot calibration should be executed (the calibration area) can be set arbitrarily with the 3D camera as a reference. At this time, the effective field of view of the 3D camera can be displayed together with the setting, so an appropriate calibration area can be set. Camera-robot calibration can then be executed automatically or manually for the set calibration area.
 Moreover, in the present embodiment, by setting the cross section of the calibration area to substantially the same size at every position along the optical axis direction, the positions at which the reference object is placed can be determined automatically, and setting appropriate to the application is facilitated.
 The embodiment disclosed herein should be considered in all respects as illustrative and not restrictive. The scope of the present invention is indicated by the claims rather than by the above description, and is intended to include all modifications within the meaning and scope equivalent to the claims.
 DESCRIPTION OF REFERENCE NUMERALS: 1 robot control system; 2 robot; 4 arm; 6 handpiece; 8 action part; 10 3D camera; 11 processing unit; 12 light projection unit; 13, 50 imaging unit; 14, 110 display unit; 15 storage unit; 16, 114, 208, 308 communication interface (I/F) unit; 20 reference plate; 22 marker; 52 measurement unit; 54 command generation unit; 56, 2060 calibration parameters; 58 calibration execution unit; 60 setting reception unit; 100 image measurement device; 102, 202, 302 processor; 104, 204, 304 main memory; 106, 206, 306 storage; 108 input unit; 112 optical drive; 113 recording medium; 116, 230, 312 processor bus; 200 control device; 210, 212 field network controller; 214 USB controller; 216 memory card interface; 218 memory card; 220 local bus controller; 222 local bus; 300 robot control device; 310 drive controller; 320 servo driver; 330 motor; 400, 420, 430 user interface screen; 401, 402 cross-section setting object; 403 effective field-of-view display; 410 measurement bottom-surface setting bar; 411 measurement top-surface setting bar; 412, 414 axial width setting bar; 416 visual field range; 418 calibration area; 422 effective area frame; 424 height display bar; 426, 428, 432 indicator; 434 warning message; 1060 OS; 1062 three-dimensional measurement program; 1064 model data; 1066 setting reception program; 2062 command generation program; 2064 calibration execution program; AX optical axis.

Claims (8)

  1.  A robot control system comprising:
     an imaging unit arranged so as to include an action part of a robot within its field of view;
     a measurement unit that measures, based on an image captured by the imaging unit, the three-dimensional coordinates of an arbitrary object present within the field of view of the imaging unit;
     a command generation unit that generates a command for positioning the action part of the robot according to a pre-calculated correspondence between the measured three-dimensional coordinates and the position and orientation of the action part of the robot;
     a calibration execution unit that executes calibration for calculating the correspondence; and
     a setting reception unit that receives, for the calibration, the setting of a calibration area, which is a region in which a reference object associated with the action part of the robot is to be placed.
  2.  The robot control system according to claim 1, wherein the setting reception unit indicates the range of the set calibration area with the imaging unit as a reference.
  3.  The robot control system according to claim 2, wherein the setting reception unit also displays the visual field range of the imaging unit.
  4.  The robot control system according to any one of claims 1 to 3, wherein the calibration area is set as a rectangular parallelepiped referenced to the optical axis of the imaging unit.
  5.  The robot control system according to claim 4, wherein the size of the cross section of the rectangular parallelepiped set as the calibration area is determined according to the field of view of the imaging unit and the distance from the imaging unit to an end face of the calibration area.
  6.  The robot control system according to claim 4 or 5, wherein the setting reception unit receives, as the setting of the calibration area, a range setting along the optical axis of the imaging unit.
  7.  The robot control system according to any one of claims 1 to 6, wherein the calibration execution unit includes:
     a placement control unit that sequentially gives commands to the robot to sequentially place the reference object within the calibration area; and
     a calculation unit that calculates the correspondence based on sets of the three-dimensional coordinates of the reference object and the position and orientation of the action part of the robot, acquired when the reference object is sequentially placed within the calibration area.
  8.  The robot control system according to any one of claims 1 to 6, wherein the calibration execution unit includes:
     a position determination unit that determines a plurality of positions at which the reference object is to be placed in the calibration area;
     a position display unit that indicates the relationship between one of the determined positions and the current position of the reference object; and
     a calculation unit that calculates the correspondence based on sets of the three-dimensional coordinates of the reference object and the position and orientation of the action part of the robot, acquired when the reference object is sequentially placed at the determined positions.
PCT/JP2019/020623 2018-06-15 2019-05-24 Robot control system WO2019239848A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018114907A JP7070127B2 (en) 2018-06-15 2018-06-15 Robot control system
JP2018-114907 2018-06-15

Publications (1)

Publication Number Publication Date
WO2019239848A1 (en) 2019-12-19

Family

ID=68842581

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/020623 WO2019239848A1 (en) 2018-06-15 2019-05-24 Robot control system

Country Status (2)

Country Link
JP (1) JP7070127B2 (en)
WO (1) WO2019239848A1 (en)


Families Citing this family (3)

Publication number Priority date Publication date Assignee Title
JP7468288B2 (en) * 2020-10-16 2024-04-16 オムロン株式会社 CALIBRATION DEVICE AND METHOD FOR AUTOMATIC CALIBRATION SETTING
CN112525074B (en) * 2020-11-24 2022-04-12 杭州素问九州医疗科技有限公司 Calibration method, calibration system, robot, computer device and navigation system
JP7437343B2 (en) * 2021-03-29 2024-02-22 株式会社日立製作所 Calibration device for robot control


Patent Citations (7)

Publication number Priority date Publication date Assignee Title
JPH0435885A (en) * 1990-05-30 1992-02-06 Fanuc Ltd Calibration method for visual sensor
JP2010188439A (en) * 2009-02-16 2010-09-02 Mitsubishi Electric Corp Method and apparatus for calculating parameter
US20150142171A1 (en) * 2011-08-11 2015-05-21 Siemens Healthcare Diagnostics Inc. Methods and apparatus to calibrate an orientation between a robot gripper and a camera
JP2015182144A (en) * 2014-03-20 2015-10-22 キヤノン株式会社 Robot system and calibration method of robot system
WO2018043524A1 (en) * 2016-09-02 2018-03-08 倉敷紡績株式会社 Robot system, robot system control device, and robot system control method
US20180178388A1 (en) * 2016-12-22 2018-06-28 Seiko Epson Corporation Control apparatus, robot and robot system
JP2018111166A (en) * 2017-01-12 2018-07-19 ファナック株式会社 Calibration device of visual sensor, method and program

Cited By (3)

Publication number Priority date Publication date Assignee Title
US11009889B2 (en) * 2016-10-14 2021-05-18 Ping An Technology (Shenzhen) Co., Ltd. Guide robot and method of calibrating moving region thereof, and computer readable storage medium
CN114894116A (en) * 2022-04-08 2022-08-12 苏州瀚华智造智能技术有限公司 Measurement data fusion method and non-contact measurement equipment
CN114894116B (en) * 2022-04-08 2024-02-23 苏州瀚华智造智能技术有限公司 Measurement data fusion method and non-contact measurement equipment

Also Published As

Publication number Publication date
JP7070127B2 (en) 2022-05-18
JP2019217571A (en) 2019-12-26

Similar Documents

Publication Publication Date Title
WO2019239848A1 (en) Robot control system
JP6723738B2 (en) Information processing apparatus, information processing method, and program
JP5949242B2 (en) Robot system, robot, robot control apparatus, robot control method, and robot control program
JP5549749B1 (en) Robot teaching system, robot teaching program generation method and teaching tool
JP4737668B2 (en) 3D measurement method and 3D measurement system
US7236854B2 (en) Method and a system for programming an industrial robot
JP2009012106A (en) Remote operation supporting device and program
JP2014128845A (en) Robot system display device
JP5113666B2 (en) Robot teaching system and display method of robot operation simulation result
JP2018008347A (en) Robot system and operation region display method
JP2018202514A (en) Robot system representing information for learning of robot
JP2018001393A (en) Robot device, robot control method, program and recording medium
JP2016159406A (en) Robot control device, robot control method and robot system
Ponomareva et al. Grasplook: a vr-based telemanipulation system with r-cnn-driven augmentation of virtual environment
US11951637B2 (en) Calibration apparatus and calibration method for coordinate system of robotic arm
US20220168902A1 (en) Method And Control Arrangement For Determining A Relation Between A Robot Coordinate System And A Movable Apparatus Coordinate System
US11926064B2 (en) Remote control manipulator system and remote control assistance system
JP7249221B2 (en) SENSOR POSITION AND POSTURE CALIBRATION DEVICE AND SENSOR POSITION AND POSTURE CALIBRATION METHOD
CN110900606B (en) Hand-eye linkage system based on small mechanical arm and control method thereof
JP2007066045A (en) Simulation device
KR20220110546A (en) Methods and systems for programming robots
JP2007101229A (en) Method and system for 3-dimensional measurement and control method and device of manipulator
WO2022181500A1 (en) Simulation device using three-dimensional position information obtained from output from vision sensor
WO2022249295A1 (en) Robot simulation device
JP2019067285A (en) Information processing device, information processing method, and information processing program

Legal Events

121: Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 19820260; Country of ref document: EP; Kind code of ref document: A1)
NENP: Non-entry into the national phase (Ref country code: DE)
122: Ep: pct application non-entry in european phase (Ref document number: 19820260; Country of ref document: EP; Kind code of ref document: A1)