WO2020022041A1 - Control system, control method, and program - Google Patents

Control system, control method, and program

Info

Publication number
WO2020022041A1
WO2020022041A1 (PCT/JP2019/026959)
Authority
WO
WIPO (PCT)
Prior art keywords
state
robot
control
frame
target frame
Application number
PCT/JP2019/026959
Other languages
French (fr)
Japanese (ja)
Inventor
KATO Yutaka
Original Assignee
OMRON Corporation
Application filed by OMRON Corporation
Publication of WO2020022041A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00: Controls for manipulators
    • B25J13/08: Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • H: ELECTRICITY
    • H01: ELECTRIC ELEMENTS
    • H01R: ELECTRICALLY-CONDUCTIVE CONNECTIONS; STRUCTURAL ASSOCIATIONS OF A PLURALITY OF MUTUALLY-INSULATED ELECTRICAL CONNECTING ELEMENTS; COUPLING DEVICES; CURRENT COLLECTORS
    • H01R43/00: Apparatus or processes specially adapted for manufacturing, assembling, maintaining, or repairing of line connectors or current collectors or for joining electric conductors
    • H01R43/26: Apparatus or processes specially adapted for manufacturing, assembling, maintaining, or repairing of line connectors or current collectors or for joining electric conductors for engaging or disengaging the two parts of a coupling device

Definitions

  • the present technology relates to a control system, a control method, and a program for controlling a robot.
  • the technique described in Non-Patent Document 1 is based on the premise that the relative position between the terminal supported by the robot hand and the camera is constant. Therefore, if the relative positional relationship between the terminal supported by the robot hand and the camera deviates, the correspondence between the state of the terminal and the state of the connector may be lost, and the terminal may not be inserted into the connector.
  • the present invention has been made in view of the above problems, and an object of the present invention is to provide a control system, a control method, and a program in which the states of a plurality of objects can be changed in a coordinated manner.
  • the control system includes first to Nth robots, an imaging device for imaging the first to Nth objects, and a control device for controlling the first to Nth robots. N is an integer of 2 or more.
  • the i-th robot changes the state of the i-th object.
  • i is an integer of 1 to N-1.
  • the Nth robot changes the state of one of the Nth object and the imaging device.
  • the other of the N-th object and the imaging device is installed at a fixed position.
  • the control device acquires change information for each of the first to Nth objects.
  • the change information corresponding to the j-th object (j is an integer of 1 to N) indicates the relationship between the control amount of the j-th robot and the change amount of the state of the j-th object on the image of the imaging device.
  • the control device is configured to perform a first process of acquiring a real image captured by the imaging device, a second process of selecting a target frame from a reference moving image indicating a sample of the first to Nth objects, and a third process of controlling each of the first to Nth robots based on the real image and the target frame.
  • in the third process, the control device calculates the control amount of the j-th robot for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, based on the change information corresponding to the j-th object, and controls the j-th robot according to the calculated control amount.
  • the state of the target on the real image can be changed to the state of the target on the target frame based on the real image, the target frame, and the change information.
  • the states of the first to Nth objects change in a coordinated manner according to the reference moving image.
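As a minimal illustration of this per-object control law (a sketch under assumptions, not the patent's implementation), the Python snippet below treats the change information for one object as a matrix whose columns are the per-unit image displacements, and inverts it in the least-squares sense. The names `control_step` and `change_matrix` are hypothetical.

```python
import numpy as np

def control_step(state_on_image, state_on_target_frame, change_matrix):
    """One control update for the j-th robot.

    state_on_image / state_on_target_frame: (2,) image coordinates of the
    object's state on the real image and on the target frame.
    change_matrix: (2, D) matrix; column d is the image-space displacement
    produced by one unit of control along the robot's d-th degree of
    freedom (the "change information" of the patent, in matrix form).
    Returns a (D,) control amount that moves the object toward the target.
    """
    deviation = state_on_target_frame - state_on_image
    # Least-squares inversion of the linear image-space model.
    control, *_ = np.linalg.lstsq(change_matrix, deviation, rcond=None)
    return control
```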
  • the control device may update the target frame when the deviation between the state of at least one of the first to Nth objects on the real image and the state of that object on the target frame becomes less than a threshold. According to this disclosure, the state of each object can be reliably changed according to the state on the target frame.
  • the control device selects a first target frame and a second target frame from the plurality of frames.
  • the control device controls the first robot and the second robot such that a first time, at which the deviation between the state of the first object on the real image and the state of the first object on the first target frame becomes less than a first threshold, and a second time, at which the deviation between the state of the second object on the real image and the state of the second object on the second target frame becomes less than a second threshold, satisfy a specified condition.
  • according to this disclosure, the state of the first object and the state of the second object can be changed such that the time difference between the time when the first object reaches the state of the first target frame and the time when the second object reaches the state of the second target frame becomes a desired time.
  • the imaging apparatus images the (N + 1) th object together with the first to Nth objects.
  • the reference moving image includes the (N + 1) th object.
  • the control device selects a second target frame from the plurality of frames.
  • the control device controls the first robot such that a first time, at which the deviation between the state of the (N+1)th object on the real image and the state of the (N+1)th object on the first target frame becomes less than a first threshold, and a second time, at which the deviation between the state of the first object on the real image and the state of the first object on the second target frame becomes less than a second threshold, satisfy a specified condition.
  • according to this disclosure, the state of the first object can be changed such that the time difference between the time when the (N+1)th object reaches the state of the first target frame and the time when the first object reaches the state of the second target frame becomes a desired time.
  • the control device may determine that an abnormality has occurred in the control system if, during the period from when the target frame is selected until a predetermined time has elapsed, the deviation between the state of at least one of the first to Nth objects on the real image and the state of that object on the target frame does not become smaller than the threshold. According to this disclosure, countermeasures against an abnormality of the control system can be started quickly.
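The timeout test described above can be sketched as follows; this is an illustrative helper with hypothetical names (`deviation_fn`, `timeout_s`), not code from the patent.

```python
import time

def watch_for_abnormality(deviation_fn, threshold, timeout_s):
    """Return normally once the deviation falls below `threshold`; raise
    if that does not happen within `timeout_s` seconds of the target
    frame being selected.
    """
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        if deviation_fn() < threshold:
            return
        time.sleep(0.01)  # poll roughly at the imaging cycle, for example
    raise RuntimeError("abnormality: deviation did not converge in time")
```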
  • the Nth robot changes the state of the imaging device.
  • in one aspect, the control device controls only the Nth robot in the third process.
  • in another aspect, each of the first to Nth robots is controlled in the third process. In addition to the control of the Nth robot, the control of the first to (N-1)th robots allows the states of the first to (N-1)th objects and of the Nth object to be changed in conjunction with the reference moving image.
  • the control device repeatedly executes a series of processes including the first to third processes, and starts the first process of the next series while the third process is still being performed.
  • according to this disclosure, the target robot is continuously controlled according to the latest real image without stopping its operation. As a result, the state of the object can be changed quickly.
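One way to realize this pipelining (an assumption on our part; the patent does not specify an implementation) is to acquire images on a background thread so that the control loop always reads the latest real image without blocking. The acquisition call `camera.capture()` is hypothetical.

```python
import threading

class LatestImage:
    """Holds the most recent camera image so the control loop never waits
    on acquisition (a sketch of the pipelining described above)."""

    def __init__(self, camera):
        self._camera = camera
        self._lock = threading.Lock()
        self._image = None
        threading.Thread(target=self._acquire, daemon=True).start()

    def _acquire(self):
        while True:
            image = self._camera.capture()  # blocking acquisition
            with self._lock:
                self._image = image

    def get(self):
        """Return the newest image acquired so far (None until the first)."""
        with self._lock:
            return self._image
```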
  • the control device selects, as target frames, the frames of the reference moving image included in a prediction horizon period.
  • the control device calculates the control amount of the j-th robot during a control horizon period so as to minimize the deviation between the state of the j-th object on the frames of the reference moving image included in the prediction horizon period and the state of the j-th object on the images of the imaging device during the prediction horizon period. According to this disclosure, the change in the state of the object can be made closer to the reference moving image.
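A compact sketch of this model-predictive variant, assuming the same linear image-space model as above (the state advances by `change_matrix @ u` per step); the stacked least-squares solver and all names are illustrative, not prescribed by the patent.

```python
import numpy as np

def mpc_control(horizon_targets, current_state, change_matrix, n_steps):
    """Choose per-step control amounts over the control horizon so that
    the predicted image-space states track the prediction-horizon frames
    of the reference moving image.

    Linear model assumed: state[t+1] = state[t] + change_matrix @ u[t],
    hence state[t] = state0 + sum_{s<=t} change_matrix @ u[s].
    """
    dim_u = change_matrix.shape[1]
    a_blocks, b_blocks = [], []
    for t, target in enumerate(horizon_targets[:n_steps]):
        # Row t of the stacked system: cumulative effect of u[0..t].
        row = [change_matrix if s <= t else np.zeros_like(change_matrix)
               for s in range(n_steps)]
        a_blocks.append(np.hstack(row))
        b_blocks.append(target - current_state)
    A = np.vstack(a_blocks)
    b = np.concatenate(b_blocks)
    u_all, *_ = np.linalg.lstsq(A, b, rcond=None)
    return u_all.reshape(n_steps, dim_u)  # apply u_all[0], then re-plan
```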
  • the state indicates at least one of the position, posture, shape, and size of the object.
  • the position, posture, shape, and size of the first object can be respectively changed to the position, posture, shape, and size of the first object on the reference moving image.
  • the control method controls the first to Nth robots using the imaging device for imaging the first to Nth objects.
  • N is an integer of 2 or more.
  • the i-th robot changes the state of the i-th object.
  • i is an integer of 1 to N-1.
  • the Nth robot changes the state of one of the Nth object and the imaging device.
  • the other of the N-th object and the imaging device is installed at a fixed position.
  • the control method includes a first step of acquiring change information for each of the first to Nth objects.
  • the change information corresponding to the j-th object indicates a relationship between the control amount of the j-th robot and the change amount of the state of the j-th object on the image of the imaging device.
  • j is an integer of 1 to N.
  • the control method includes a second step of acquiring a real image captured by the imaging device, a third step of selecting a target frame from a reference moving image indicating samples of the first to Nth objects, and a fourth step of controlling each of the first to Nth robots based on the real image and the target frame.
  • the fourth step includes calculating the control amount of the j-th robot for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, based on the change information corresponding to the j-th object, and controlling the j-th robot in accordance with the calculated control amount.
  • a program is a program for causing a computer to execute the above control method.
  • according to this control method and this program, the states of a plurality of objects likewise change in a coordinated manner.
  • FIG. 1 is a schematic diagram illustrating an outline of a control system according to a first embodiment.
  • FIG. 2 is a diagram illustrating an example of a real image captured by an imaging device and a first reference moving image.
  • FIG. 3 is a diagram illustrating an example of a real image captured by an imaging device and a second reference moving image.
  • FIG. 4 is a schematic diagram illustrating a hardware configuration of a control device included in the control system according to the first embodiment.
  • FIG. 5 is a block diagram illustrating a functional configuration of the control device according to the first embodiment.
  • FIG. 6 is a diagram illustrating an example of a method for creating a template.
  • FIG. 7 is a block diagram illustrating a functional configuration of a first control unit and a second control unit according to the first embodiment.
  • FIG. 8 is a diagram illustrating a method of generating a first change information set in a first control unit.
  • FIG. 9 is a diagram illustrating a method of calculating a control amount by a calculation unit of a first control unit.
  • FIG. 10 is a flowchart illustrating an example of the flow of a change information generation process performed by a change information generation unit.
  • FIG. 11 is a flowchart showing the flow of processing of a subroutine of step S2 shown in FIG. 10.
  • FIG. 12 is a flowchart illustrating an example of the flow of a process of controlling the target robot to change the state of the target object along the reference moving image in the first embodiment.
  • FIG. 13 is a flowchart showing the flow of processing of a subroutine of step S44 shown in FIG. 12.
  • FIG. 14 is a diagram illustrating a relationship between a closest frame and a target frame.
  • FIG. 15 is a flowchart showing the flow of a subroutine of step S46 shown in FIG. 12.
  • FIG. 16 is a diagram showing another example of a reference moving image used as a sample of the connection of a male connector and a female connector.
  • FIG. 17 is a flowchart illustrating an example of the flow of a target frame selection process in Modification 1 of Embodiment 1.
  • FIG. 18 is a flowchart showing an example of the flow of an abnormality determination process.
  • FIG. 19 is a schematic diagram illustrating an object of a control system according to a second modification of the first embodiment.
  • FIG. 20 is a schematic diagram illustrating an outline of a control system according to a second embodiment.
  • FIG. 21 is a block diagram illustrating a functional configuration of a control device according to the second embodiment.
  • FIG. 22 is a diagram showing an example of a screen for designating a required passage frame and a related frame.
  • FIG. 23 is a flowchart illustrating an example of the flow of a target frame selection process in Embodiment 2.
  • FIG. 24 is a schematic diagram illustrating an object of a control system according to a first modification of the second embodiment.
  • FIG. 25 is a diagram illustrating an example of a reference moving image according to the first modification of the second embodiment.
  • FIG. 26 is a diagram showing an example of an arrangement of four imaging devices.
  • FIG. 27 is a schematic diagram illustrating an object of a control system according to a second modification of the second embodiment.
  • FIG. 28 is a schematic diagram illustrating an outline of a control system according to a third embodiment.
  • FIG. 29 is a block diagram illustrating a functional configuration of a control device according to Embodiment 3.
  • FIG. 30 is a flowchart illustrating a processing flow of a first control unit and a second control unit according to the third embodiment.
  • FIG. 31 is a schematic diagram illustrating an outline of a part of a control system according to a modification of the third embodiment.
  • FIG. 1 is a schematic diagram illustrating an outline of the control system according to the first embodiment.
  • the control system 1 connects the male connector 2a and the female connector 2b by inserting the male connector 2a into the female connector 2b in, for example, an industrial product production line.
  • control system 1 includes imaging devices 21 and 22, robots 30a and 30b, robot controllers 40a and 40b, and a control device 50.
  • the imaging devices 21 and 22 capture an image of a subject present in the field of view and generate image data (hereinafter, simply referred to as “image”).
  • the imaging devices 21 and 22 are set at fixed positions different from those of the robots 30a and 30b.
  • the imaging devices 21 and 22 are installed at different places, and image the male connector 2a and the female connector 2b as subjects from different directions.
  • the imaging devices 21 and 22 perform imaging according to a predetermined imaging cycle, and output a real image obtained by the imaging to the control device 50.
  • the robot 30a is a mechanism for changing the state (here, position and posture) of the male connector 2a, and is, for example, a vertical articulated robot.
  • the robot 30a has a hand 31a at its tip for supporting (holding) the male connector 2a, and changes the position and posture of the hand 31a with six degrees of freedom.
  • the robot 30a changes the position and the posture of the male connector 2a held by the hand 31a with six degrees of freedom.
  • the six degrees of freedom include translational degrees of freedom in the X, Y, and Z directions, and rotational degrees of freedom in the pitch, yaw, and roll directions.
  • the number of degrees of freedom of the hand 31a is not limited to six, and may be three to five or seven or more.
  • the robot 30a has a plurality of servo motors, and the position and posture of the male connector 2a are changed by driving the servo motors.
  • An encoder is provided for each of the plurality of servomotors, and the position of the servomotor is measured.
  • the robot 30b is a mechanism for changing the state (here, position and posture) of the female connector 2b, and is, for example, an XYθ stage.
  • the robot 30b has a stage 31b for supporting (mounting) the female connector 2b, and changes the position and posture of the stage 31b with three degrees of freedom. That is, the robot 30b changes the position and posture of the female connector 2b mounted on the stage 31b with three degrees of freedom.
  • the three degrees of freedom include translational degrees of freedom in the X direction and the Y direction and a rotational degree of freedom in the rotational direction (θ direction) about an axis orthogonal to the XY plane.
  • the number of degrees of freedom of the robot 30b is not limited to three, and may be four or more.
  • the robot 30b has a plurality of servomotors, and the position and posture of the female connector 2b are changed by driving the servomotors.
  • An encoder is provided for each of the plurality of servomotors, and the position of the servomotor is measured.
  • the robot controller 40a controls the operation of the robot 30a according to the control command received from the control device 50.
  • the robot controller 40a receives from the control device 50 control commands for translational degrees of freedom in the X, Y, and Z directions and rotational degrees of freedom in the pitch, yaw, and roll directions. These X direction, Y direction, Z direction, pitch direction, yaw direction, and roll direction are indicated by the coordinate system of the robot 30a.
  • the robot controller 40a performs feedback control on the robot 30a so that the translation amounts of the hand 31a in the X, Y, and Z directions approach the control commands for the translational degrees of freedom in the X, Y, and Z directions, respectively.
  • similarly, the robot controller 40a performs feedback control on the robot 30a so that the rotational movement amounts of the hand 31a in the pitch, yaw, and roll directions approach the control commands for the rotational degrees of freedom in the pitch, yaw, and roll directions, respectively.
  • the robot controller 40b controls the operation of the robot 30b according to the control command received from the control device 50.
  • the robot controller 40b receives from the control device 50 control commands for the degrees of freedom of translation and rotation in the X and Y directions. These X direction, Y direction, and rotation direction are indicated by the coordinate system of the robot 30b.
  • the robot controller 40b performs feedback control on the robot 30b such that the translation amounts of the stage 31b in the X and Y directions approach control commands for the degrees of freedom of translation in the X and Y directions, respectively.
  • the robot controller 40b performs feedback control on the robot 30b such that the rotational movement amount of the stage 31b approaches the control command of the rotational degree of freedom.
  • the control device 50 controls the robots 30a and 30b via the robot controllers 40a and 40b, respectively.
  • the control device 50 stores a first reference moving image and a second reference moving image, each of which represents a sample of the male connector 2a and the female connector 2b.
  • the first reference moving image is a moving image when viewed from the position of the imaging device 21.
  • the second reference moving image is a moving image when viewed from the position of the imaging device 22.
  • Each of the first reference moving image and the second reference moving image includes a plurality of frames (hereinafter, referred to as M (M is an integer of 2 or more)) arranged in time series.
  • the k-th frame (k is an integer from 1 to M) of the first reference moving image and the k-th frame of the second reference moving image are images obtained by simultaneously viewing the male connector 2a and the female connector 2b in a certain state from different directions.
  • the control device 50 acquires change information indicating the relationship between the control amount of the robot 30a and the change amount of the state of the male connector 2a on the real images of the imaging devices 21 and 22. Further, the control device 50 acquires change information indicating the relationship between the control amount of the robot 30b and the change amount of the state of the female connector 2b on the real images of the imaging devices 21 and 22.
  • the controller 50 performs the following first to third processing.
  • the control device 50 repeatedly executes a series of processes including the first to third processes.
  • the first process is a process of acquiring actual images captured by the imaging devices 21 and 22.
  • the second process is a process of selecting a target frame from each of the first reference moving image and the second reference moving image.
  • when the k-th frame of the first reference moving image is selected as the target frame, the control device 50 selects the k-th frame of the second reference moving image as the target frame.
  • the third process is a process for controlling each of the robots 30a and 30b based on the actual image and the target frame.
  • the control device 50 calculates the control amount of the robot 30a for bringing the state of the male connector 2a on the real image closer to the state of the male connector 2a on the target frame, based on the change information corresponding to the robot 30a, and controls the robot 30a according to the calculated control amount.
  • the control device 50 generates a control command indicating a control amount of the robot 30a, and outputs the generated control command to the robot controller 40a.
  • similarly, the control device 50 calculates the control amount of the robot 30b for bringing the state of the female connector 2b on the real image closer to the state of the female connector 2b on the target frame, based on the change information corresponding to the robot 30b, and controls the robot 30b according to the calculated control amount.
  • the control device 50 generates a control command indicating a control amount of the robot 30b, and outputs the generated control command to the robot controller 40b.
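Putting the three processes together for this two-robot example, a schematic cycle might look like the following. Every collaborator here (`cameras`, `detect`, `control_step`, the robot objects) is a hypothetical stand-in for the components just described.

```python
def run_control_cycle(cameras, reference_videos, robots, detect, control_step):
    """One series of the first-to-third processing for the two-robot
    connector example. `detect` returns an object's state on a set of
    images; `control_step` is the least-squares update sketched earlier.
    """
    # First process: acquire real images from both imaging devices.
    real_images = [camera.capture() for camera in cameras]
    # Second process: select the k-th frame of each reference moving image.
    target_frames = [video.current_target_frame() for video in reference_videos]
    # Third process: one control update per robot, i.e. per object.
    for robot in robots:
        current = detect(real_images, robot.object_name)
        goal = detect(target_frames, robot.object_name)
        robot.send(control_step(current, goal, robot.change_matrix))
```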
  • FIG. 2 is a diagram illustrating an example of an actual image captured by the imaging device 21 and a first reference moving image.
  • FIG. 3 is a diagram illustrating an example of a real image captured by the imaging device 22 and a second reference moving image.
  • FIG. 2 shows real images 90a to 93a captured by the imaging device 21 and frames 70a to 73a of the first reference moving image.
  • FIG. 3 shows real images 90b to 93b imaged by the imaging device 22, and frames 70b to 73b of the second reference moving image.
  • the real images 90a and 90b are images captured at the same time.
  • the real images 91a and 91b are images captured at the same time after the real images 90a and 90b.
  • the real images 92a and 92b are images captured at the same time after the real images 91a and 91b.
  • the real images 93a and 93b are images captured at the same time after the real images 92a and 92b.
  • Each of the frames 70a and 70b is the first frame in the corresponding reference moving image.
  • Each of the frames 71a and 71b is the s-th (s is an integer of 2 or more) frame in the corresponding reference moving image.
  • Each of the frames 72a and 72b is a t-th (t is an integer greater than s) frame in the corresponding reference moving image.
  • Each of the frames 73a and 73b is a u-th (u is an integer greater than t) frame in the corresponding reference moving image.
  • in the first reference moving image and the second reference moving image, the male connector 2a held by the hand 31a moves downward from above the female connector 2b and is connected to the female connector 2b.
  • the control device 50 acquires real images 90a and 90b including the female connector 2b placed on the stage 31b and the male connector 2a held by the hand 31a from the imaging devices 21 and 22, respectively.
  • the control device 50 selects, from the first reference moving image and the second reference moving image, the frames 71a and 71b when the female connector 2b has moved to a desired position and posture, respectively, as target frames.
  • based on the change information corresponding to the female connector 2b, the control device 50 calculates the control amount of the stage 31b for bringing the state of the female connector 2b on the real images 90a and 90b closer to the state of the female connector 2b on the frames 71a and 71b. Then, the control device 50 outputs a control command indicating the calculated control amount to the robot controller 40b. The robot controller 40b controls the robot 30b according to the control command. As a result, as shown in the real images 91a and 91b, the position and posture of the female connector 2b change to the desired position and posture (the position and posture indicated by the frames 71a and 71b).
  • similarly, the control device 50 calculates the control amount of the hand 31a for bringing the state of the male connector 2a on the real images 90a and 90b closer to the state of the male connector 2a on the frames 71a and 71b. Then, the control device 50 outputs a control command indicating the calculated control amount to the robot controller 40a.
  • the robot controller 40a controls the robot 30a according to a control command. Thereby, as shown in the actual images 91a and 91b, the state of the male connector 2a changes to the position and posture above the female connector 2b (the positions and postures indicated by the frames 71a and 71b).
  • control device 50 selects the frames 72a and 72b when the male connector 2a has moved to a position immediately above the female connector 2b as target frames.
  • based on the change information corresponding to the male connector 2a, the control device 50 calculates the control amount of the hand 31a for bringing the state of the male connector 2a on the real images 91a and 91b closer to the state of the male connector 2a on the frames 72a and 72b. Then, the control device 50 outputs a control command indicating the calculated control amount to the robot controller 40a.
  • the robot controller 40a controls the robot 30a according to a control command. Thereby, as shown in the real images 92a and 92b, the position and posture of the male connector 2a change to the position and posture directly above the female connector 2b (the position and posture indicated by the frames 72a and 72b).
  • control device 50 selects the frames 73a and 73b when the connection between the male connector 2a and the female connector 2b is completed as target frames.
  • based on the change information corresponding to the male connector 2a, the control device 50 calculates the control amount of the hand 31a for bringing the state of the male connector 2a on the real images 92a and 92b closer to the state of the male connector 2a on the frames 73a and 73b. Then, the control device 50 outputs a control command indicating the calculated control amount to the robot controller 40a. The robot controller 40a controls the robot 30a according to the control command. Thereby, as shown in the real images 93a and 93b, the male connector 2a moves to the position and posture at which the connection to the female connector 2b is completed (the position and posture indicated by the frames 73a and 73b).
  • control device 50 can change the male connector 2a and the female connector 2b from the state on the actual image to the state on the target frame.
  • the states of the male connector 2a and the female connector 2b change in a coordinated manner according to the first reference moving image and the second reference moving image.
  • the control device 50 can control the robots 30a and 30b without using calibration data for associating the coordinate systems of the imaging devices 21 and 22 with the robots 30a and 30b. Therefore, the operator does not need to perform calibration in advance. Further, the operator does not need to design in advance an operation program of the robot 30a for changing the state of the male connector 2a to a desired state or an operation program of the robot 30b for changing the state of the female connector 2b to a desired state. Therefore, the labor required for changing the states of the male connector 2a and the female connector 2b to desired states using the robots 30a and 30b can be reduced.
  • as a comparative method, the positions and postures of the male connector 2a and the female connector 2b in the real space could be specified from the images captured by the imaging devices 21 and 22, and the robots 30a and 30b could be controlled based on those positions and postures. In such a method, however, the accuracy of the calibration data decreases as the robots 30a and 30b age, and the male connector 2a and the female connector 2b may no longer be connected well.
  • there are also cases where the male connector 2a and the female connector 2b cannot be connected properly due to a positional shift or individual differences between them. Even in such cases, by using this application example, the male connector 2a and the female connector 2b can be connected in accordance with the reference moving image.
  • FIG. 4 is a schematic diagram illustrating a hardware configuration of a control device included in the control system according to the first embodiment.
  • the control device 50 has a structure according to a computer architecture, and realizes the various functions described below by the processor executing a program installed in advance.
  • the control device 50 includes a processor 510 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 512, a display controller 514, a system controller 516, an I/O (Input/Output) controller 518, a hard disk 520, a camera interface 522, an input interface 524, a robot controller interface 526, a communication interface 528, and a memory card interface 530. These units are connected to each other so as to enable data communication, with the system controller 516 at the center.
  • the processor 510 exchanges programs (codes) and the like with the system controller 516 and executes them in a predetermined order, thereby realizing the intended arithmetic processing.
  • the system controller 516 is connected to the processor 510, the RAM 512, the display controller 514, and the I/O controller 518 via buses, exchanges data with each unit, and governs the processing of the entire control device 50.
  • the RAM 512 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and stores programs read from the hard disk 520, images (image data) acquired by the imaging devices 21 and 22, processing results for the images, work data, and the like.
  • the display controller 514 is connected to the display unit 532, and outputs a signal for displaying various information to the display unit 532 according to an internal command from the system controller 516.
  • the I / O controller 518 controls data exchange between a recording medium connected to the control device 50 and an external device. More specifically, the I / O controller 518 is connected to the hard disk 520, the camera interface 522, the input interface 524, the robot controller interface 526, the communication interface 528, and the memory card interface 530.
  • the hard disk 520 is typically a nonvolatile magnetic storage device, and stores various information in addition to the control program 550 executed by the processor 510.
  • the control program 550 installed on the hard disk 520 is distributed while being stored in a memory card 536 or the like.
  • instead of the hard disk 520, a semiconductor storage device such as a flash memory or an optical storage device such as a DVD-RAM (Digital Versatile Disk Random Access Memory) may be employed.
  • the camera interface 522 corresponds to an input unit that receives image data from the imaging devices 21 and 22, and mediates data transmission between the processor 510 and the imaging devices 21 and 22.
  • the camera interface 522 includes image buffers 522a and 522b for temporarily storing image data from the imaging devices 21 and 22, respectively.
  • although a single image buffer shared by a plurality of imaging devices may be provided, it is preferable to arrange a plurality of image buffers independently, each associated with one imaging device, in order to speed up processing.
  • the input interface 524 mediates data transmission between the processor 510 and an input device 534 such as a keyboard, a mouse, a touch panel, and a dedicated console.
  • the robot controller interface 526 mediates data transmission between the processor 510 and the robot controllers 40a and 40b.
  • the communication interface 528 mediates data transmission between the processor 510 and another personal computer or server (not shown).
  • the communication interface 528 typically uses Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
  • the memory card interface 530 mediates data transmission between the processor 510 and the memory card 536 as a recording medium.
  • the memory card 536 is distributed storing the control program 550 to be executed by the control device 50, and the memory card interface 530 reads the control program 550 from the memory card 536.
  • the memory card 536 is a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, or an optical recording medium such as a CD-ROM (Compact Disk Read Only Memory). Alternatively, a program downloaded from a distribution server or the like may be installed in the control device 50 via the communication interface 528.
  • in the control device 50, an OS for providing basic functions of the computer may be installed in addition to the application for providing the functions according to the present embodiment.
  • in that case, the control program according to the present embodiment may execute processing by calling necessary modules, among the program modules provided as part of the OS, in a predetermined order and/or at predetermined timings.
  • the control program according to the present embodiment may be provided by being incorporated into a part of another program. In that case as well, the control program itself does not include the modules of the other program with which it is combined, and the processing is executed in cooperation with that other program. That is, the control program according to the present embodiment may take the form of being incorporated in such another program.
  • part or all of the functions provided by executing the control program may be implemented as a dedicated hardware circuit.
  • FIG. 5 is a block diagram illustrating a functional configuration of the control device according to the first embodiment.
  • the control device 50 includes a reference moving image storage unit 51, a teaching range selection unit 52, an image processing unit 53, a target frame selection unit 54, a first control unit 55a, and a second control unit 55b.
  • the reference moving image storage unit 51 includes the hard disk 520 and the RAM 512 shown in FIG.
  • the teaching range selection unit 52 and the image processing unit 53 are realized by the processor 510 shown in FIG.
  • the reference moving image storage unit 51 stores a first reference moving image and a second reference moving image.
  • the first reference moving image and the second reference moving image show how the male connector 2a and the female connector 2b are moved and connected to each other by manually operating the robots 30a and 30b.
  • the first reference moving image and the second reference moving image may show a state in which the male connector 2a and the female connector 2b are moved by an operator's hand and connected to each other.
  • the difference in the working distance (WD) of the imaging device 21 between when the male connector 2a is closest to the imaging device 21 and when it is farthest from the imaging device 21 is sufficiently small compared to the working distance itself.
  • similarly, the difference in the working distance of the imaging device 22 between when the male connector 2a is closest to the imaging device 22 and when it is farthest from the imaging device 22 is sufficiently small compared to the working distance.
  • further, the change in the posture of the male connector 2a during movement is very small. Therefore, the shape and size of the male connector 2a hardly change in the first reference moving image and the second reference moving image.
  • likewise, the difference in the working distance of the imaging device 21 between when the female connector 2b is closest to the imaging device 21 and when it is farthest from the imaging device 21 is sufficiently small compared to the working distance, and the same holds for the imaging device 22. Further, the change in the posture of the female connector 2b during movement is very small. Therefore, the shape and size of the female connector 2b hardly change in the first reference moving image and the second reference moving image.
  • the teaching range selection unit 52 selects, for each object (here, the male connector 2a and the female connector 2b), a teaching range serving as a sample of the object from the first reference moving image and the second reference moving image.
  • the teaching range selection unit 52 displays a screen prompting the user to select a teaching range on the display unit 532.
  • the operator checks each frame of the first reference moving image and the second reference moving image, and operates the input device 534 to designate the first frame and the last frame of the series of frames in which the object performs the desired operation.
  • the teaching range selection unit 52 selects the range from the designated first frame to the designated last frame as the teaching range.
  • in the first reference moving image, the teaching range selection unit 52 selects, as the teaching range of the female connector 2b, the range from the first frame 70a to a frame after the s-th frame (the frame at which a part of the female connector 2b starts to be cut off).
  • the teaching range selection unit 52 selects a range from the first frame 70a (a frame where the entire male connector 2a starts to appear) to the u-th frame 73a as the teaching range of the male connector 2a.
  • similarly, in the second reference moving image, the teaching range selection unit 52 selects, as the teaching range of the female connector 2b, the range from the first frame 70b to a frame after the s-th frame (the frame at which a part of the female connector 2b starts to be cut off).
  • the teaching range selection unit 52 selects the range from the first frame 70b (the frame where the entire male connector 2a starts to appear) to the u-th frame 73b as the teaching range of the male connector 2a.
  • the image processing unit 53 performs image processing on the target image, and detects an object from the target image using template matching.
  • template matching is a process in which a template, which is data representing the image features of the target object, is prepared in advance, and the degree of matching of image features between the target image and the template is evaluated to detect the position, posture, shape, and size of the target object on the image.
  • the target images on which the image processing unit 53 performs the image processing are the frames of the first reference moving image, the frames of the second reference moving image, and the real images captured by the imaging devices 21 and 22.
  • the image processing unit 53 creates a template for each object (the male connector 2a and the female connector 2b) as advance preparation.
  • FIG. 6 is a diagram showing an example of a method for creating a template.
  • FIG. 6A shows a frame selected from the first reference moving image.
  • FIG. 6B shows a frame selected from the second reference moving image.
  • the image processing unit 53 causes the display unit 532 (see FIG. 4) to display a frame selected by the operator from each of the first reference moving image and the second reference moving image. The operator may visually select a frame in which the entire object (the male connector 2a or the female connector 2b) is shown.
  • the image processing unit 53 accepts the designation of the area of the target object on the frame displayed on the display unit 532. For example, the operator operates the input device 534 (see FIG. 4) to input a line 3a surrounding the male connector 2a and a line 3b surrounding the female connector 2b.
  • the image processing unit 53 specifies a region surrounded by the line 3a as an image region of the male connector 2a, and specifies a region surrounded by the line 3b as an image region of the female connector 2b.
  • the image processing unit 53 extracts a plurality of feature points of the male connector 2a and their feature amounts from the image region surrounded by the line 3a in each of the frames selected from the first reference moving image and the second reference moving image.
  • the image processing unit 53 creates the coordinates and the feature amount of each of the plurality of feature points on the image as a template of the male connector 2a.
  • similarly, the image processing unit 53 extracts a plurality of feature points of the female connector 2b and their feature amounts from the image region surrounded by the line 3b.
  • the image processing unit 53 creates the coordinates and the feature amount of each of the plurality of feature points on the image as a template of the female connector 2b.
  • a feature point is a point characterized by a corner or an outline included in an image, and is, for example, an edge point.
  • the feature quantity is, for example, luminance, luminance gradient direction, quantization gradient direction, HoG (Histogram of Oriented Gradients), HAAR-like, SIFT (Scale-Invariant Feature Transform), and the like.
  • the luminance gradient direction represents, as a continuous value, the direction (angle) of the luminance gradient in a local region centered on a feature point, whereas the quantization gradient direction represents the direction of the luminance gradient in a local region centered on a feature point as a discrete value (for example, eight directions held as 1-byte information of 0 to 7).
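The patent lists several usable descriptors (luminance gradient, quantization gradient direction, HoG, HAAR-like, SIFT). As one concrete stand-in, the sketch below uses OpenCV's ORB keypoints to build a template from the operator-designated region and to recover matched feature-point coordinates; it is an assumption-laden illustration, not the patent's prescribed algorithm.

```python
import cv2
import numpy as np

orb = cv2.ORB_create()

def make_template(frame, mask):
    """Extract feature points and descriptors of the object region
    (the region enclosed by the operator's line, given here as an
    8-bit single-channel mask)."""
    keypoints, descriptors = orb.detectAndCompute(frame, mask)
    coords = np.float32([kp.pt for kp in keypoints])
    return coords, descriptors

def detect_object(image, template):
    """Match the template against a real image or a reference frame and
    return corresponding feature-point coordinates in template order and
    image order."""
    tmpl_coords, tmpl_desc = template
    keypoints, desc = orb.detectAndCompute(image, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(tmpl_desc, desc)
    tmpl_pts = np.float32([tmpl_coords[m.queryIdx] for m in matches])
    img_pts = np.float32([keypoints[m.trainIdx].pt for m in matches])
    return tmpl_pts, img_pts
```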
  • the image processing unit 53 extracts a plurality of feature points and their feature amounts from a frame of the first reference moving image or from a real image captured by the imaging device 21, and detects the target object in the image by comparing the extracted feature points and feature amounts with the template of the target object created from the frame of the first reference moving image.
  • similarly, the image processing unit 53 extracts a plurality of feature points and their feature amounts from a frame of the second reference moving image or from a real image captured by the imaging device 22, and detects the target object in the image by comparing them with the template of the target object created from the frame of the second reference moving image.
  • the image processing unit 53 outputs, for each target object (male connector 2a and female connector 2b), the coordinates on the image of each feature point of the target object extracted from the target image.
  • the target frame selection unit 54 selects a target frame from each of the first reference moving image and the second reference moving image. When the k-th frame of the first reference moving image is selected as the target frame, the target frame selection unit 54 also selects the k-th frame of the second reference moving image as the target frame. A specific example of the target frame selection method will be described later.
  • the first control unit 55a controls the robot 30a via the robot controller 40a and changes the state of the male connector 2a.
  • the second controller 55b controls the robot 30b via the robot controller 40b to change the state of the female connector 2b.
  • FIG. 7 is a block diagram showing a functional configuration of the first control unit and the second control unit according to the first embodiment.
  • each of the first control unit 55a and the second control unit 55b includes a change information generation unit 56, a change information storage unit 57, a calculation unit 58, a command unit 59, and an end determination unit 60.
  • the change information storage unit 57 includes the hard disk 520 and the RAM 512 shown in FIG.
  • the change information generation unit 56, the calculation unit 58, the command unit 59, and the end determination unit 60 are realized by the processor 510 illustrated in FIG.
  • the change information generation unit 56 generates, for each of the plurality of degrees of freedom, first change information indicating the relationship between the control amount of the target robot and the change amount of the state of the target object on the real image captured by the imaging device 21.
  • the change information generating unit 56 stores a first change information set 571 including a plurality of first change information generated for a plurality of degrees of freedom in the change information storage unit 57.
  • similarly, the change information generation unit 56 generates, for each of the plurality of degrees of freedom, second change information indicating the relationship between the control amount of the target robot and the change amount of the state of the target object on the real image captured by the imaging device 22.
  • the change information generation unit 56 stores a second change information set 572 including a plurality of pieces of second change information generated for a plurality of degrees of freedom in the change information storage unit 57.
  • the target object is the male connector 2a in the first control unit 55a and the female connector 2b in the second control unit 55b.
  • the target robot is the robot 30a in the first control unit 55a, and the robot 30b in the second control unit 55b.
  • the plurality of degrees of freedom are six degrees of freedom in the first control unit 55a and three degrees of freedom in the second control unit 55b.
  • the first change information and the second change information indicate the amount of change in the state of the target on the image when the target robot is controlled by the unit control amount.
  • in other words, the first change information and the second change information each indicate a mapping that converts the object on the image before the target robot is controlled by the unit control amount into the object on the image after the target robot is controlled by the unit control amount.
  • the change information generating unit 56 generates the first change information set 571 for each frame in the teaching range of the first reference moving image. Further, the change information generating unit 56 generates a second change information set 572 for each frame in the teaching range of the second reference moving image.
  • the process of generating and storing the first change information set 571 and the second change information set 572 by the change information generating unit 56 is executed as preparation.
  • a method of generating the first change information set 571 in the first control unit 55a will be described with reference to FIG. 8. Note that the method of generating the second change information set 572 in the first control unit 55a and the methods of generating the first change information set 571 and the second change information set 572 in the second control unit 55b are similar, and therefore descriptions thereof are omitted.
  • FIG. 8 is a diagram illustrating a method of generating the first change information set in the first control unit.
  • FIG. 8A shows the k-th frame 84 of the first reference moving image.
  • the state (here, position and orientation) of the male connector 2a corresponding to the k-th frame 84 in the real space is set as a reference state.
  • FIG. 8B illustrates an image 94a captured by the imaging device 21 after the male connector 2a is translated from the reference state by the unit control amount along the translational degree of freedom in the Y direction.
  • FIG. 8C illustrates an image 94b captured by the imaging device 21 after the male connector 2a is translated from the reference state by the unit control amount along the translational degree of freedom in the X direction.
  • FIG. 8D shows an image 94c captured by the imaging device 21 after the male connector 2a is translated from the reference state by the unit control amount along the translational degree of freedom in the Z direction.
  • FIG. 8E shows an image 94d captured by the imaging device 21 after the male connector 2a is rotated from the reference state by the unit control amount about the rotational degree of freedom in the pitch direction.
  • FIG. 8F illustrates an image 94e captured by the imaging device 21 after the male connector 2a is rotated from the reference state by the unit control amount about the rotational degree of freedom in the yaw direction.
  • FIG. 8G shows an image 94f captured by the imaging device 21 after the male connector 2a is rotated from the reference state by the unit control amount about the rotational degree of freedom in the roll direction.
  • the change information generating unit 56 acquires, from the image processing unit 53, the coordinates on the image of each feature point of the male connector 2a extracted from each of the frame 84 and the images 94a to 94f.
  • the change information generation unit 56 generates, as the first change information corresponding to the translational degree of freedom in the Y direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' of the male connector 2a extracted from the frame 84 into the coordinates of the feature points 4a to 4g extracted from the image 94a.
  • similarly, the change information generation unit 56 generates, as the first change information corresponding to the translational degree of freedom in the X direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94b.
  • the change information generation unit 56 generates, as the first change information corresponding to the translational degree of freedom in the Z direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94c.
  • the change information generation unit 56 generates, as the first change information corresponding to the rotational degree of freedom in the pitch direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94d.
  • the change information generation unit 56 generates, as the first change information corresponding to the rotational degree of freedom in the yaw direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94e.
  • the change information generation unit 56 generates, as the first change information corresponding to the rotational degree of freedom in the roll direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94f. In this way, the change information generation unit 56 generates the first change information set 571 corresponding to the k-th frame 84 of the first reference moving image.
  • the change information generation unit 56 generates the first change information set 571 corresponding to the remaining frames in the teaching range of the first reference moving image by the same method.
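The generation step can be sketched as follows, assuming a robot interface that can apply one unit control amount per degree of freedom and a `detect` helper that returns the object's feature points in a fixed (template) order; all names are hypothetical stand-ins.

```python
import numpy as np

def generate_change_info(robot, camera, detect, template, n_dof):
    """Build one change information set for the current reference state:
    column d is the mean image-space displacement of the object's feature
    points after the robot is moved by a unit control amount along its
    d-th degree of freedom."""
    base_pts = detect(camera.capture(), template)   # (n, 2), template order
    columns = []
    for d in range(n_dof):
        robot.move_unit(d)                          # unit control on DOF d
        moved_pts = detect(camera.capture(), template)
        columns.append((moved_pts - base_pts).mean(axis=0))
        robot.move_unit(d, reverse=True)            # back to reference state
    return np.stack(columns, axis=1)                # shape (2, n_dof)
```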
  • the calculation unit 58 calculates the control amount of each of the plurality of degrees of freedom for bringing the state of the target object on the real images captured by the imaging devices 21 and 22 closer to the state of the target object on the target frames of the first reference moving image and the second reference moving image, respectively.
  • specifically, the calculation unit 58 acquires the first change information set 571 and the second change information set 572 corresponding to the target frame from the change information storage unit 57, and calculates the control amount based on the acquired first change information set 571 and second change information set 572.
  • as described above, the first change information and the second change information each indicate a mapping that converts the object on the image before the control by the unit control amount into the object on the image after the control.
  • the calculation unit 58 acquires from the image processing unit 53 the coordinates on the image of the feature points of the object extracted from the real image and the coordinates on the image of the feature points of the object extracted from the target frame.
  • the calculating unit 58 calculates a control amount of each of a plurality of degrees of freedom for mapping the target on the real image to the target on the target frame based on the first change information and the second change information.
  • FIG. 9 is a diagram illustrating a method of calculating the control amount by the calculation unit of the first control unit.
  • the calculation unit 58 obtains, from the image processing unit 53, the coordinates on the image of the feature points 4a' to 4g' of the male connector 2a extracted from the target frame of the first reference moving image. Further, the calculation unit 58 acquires, from the image processing unit 53, the coordinates on the image of the feature points 4a to 4g of the male connector 2a extracted from the real image captured by the imaging device 21.
  • the number of feature points is not limited to seven.
  • the calculation unit 58 calculates the difference vectors 61a to 61g of each feature point.
  • the difference vectors 61a to 61g are vectors having the feature points 4a to 4g as starting points and the feature points 4a 'to 4g' as end points, respectively.
  • the calculation unit 58 calculates the average x component Δx1 and y component Δy1 of the difference vectors 61a to 61g.
  • the x component and the y component are indicated by the coordinate system of the image.
• similarly, the calculation unit 58 calculates the average x component Δx2 and y component Δy2 of the difference vectors between the feature points extracted from the real image obtained by the imaging of the imaging device 22 and the feature points extracted from the target frame of the second reference moving image.
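• as a minimal illustration of the averaging described above (not the patent's implementation; the array names and shapes are assumptions), the difference vectors and their average components could be computed as follows:

```python
import numpy as np

def average_difference(real_pts: np.ndarray, target_pts: np.ndarray):
    """Average difference vector between corresponding feature points.

    real_pts and target_pts are assumed (N, 2) arrays of image coordinates:
    row i of real_pts is a feature point (e.g., 4a to 4g) extracted from the
    real image, and row i of target_pts is the corresponding feature point
    (4a' to 4g') extracted from the target frame.
    """
    diff = target_pts - real_pts   # difference vectors (61a to 61g)
    dx, dy = diff.mean(axis=0)     # average x and y components (e.g., Δx1, Δy1)
    return dx, dy, diff
```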
• the calculation unit 58 calculates the control amounts of the three translational degrees of freedom such that the average of the difference vectors of the plurality of feature points is eliminated. Specifically, the calculating unit 58 calculates the control amount of each translational degree of freedom of the hand 31a in the X, Y, and Z directions using Δx1, Δy1, Δx2, and Δy2, the first change information set 571, and the second change information set 572.
• when the hand 31a translates in any of the degrees of freedom of translation in the X, Y, and Z directions, the male connector 2a translates in a certain direction on the images obtained by the imaging devices 21 and 22. Therefore, the first change information corresponding to a degree of freedom of translation in the first change information set 571 indicates a mapping that converts an arbitrary point on the image into a point translated in a certain direction. Similarly, the second change information corresponding to a degree of freedom of translation in the second change information set 572 indicates a mapping that converts an arbitrary point on the image into a point translated in a certain direction.
• for example, the first change information corresponding to the translational degree of freedom in the X direction of the first change information set 571 corresponding to the target frame indicates a mapping that converts an arbitrary point (x, y) on the image into a point (x + dX1_1, y + dY1_1).
  • the first change information corresponding to the degree of freedom of translation in the Y direction indicates a mapping that converts an arbitrary point (x, y) on the image into a point (x + dX1_2, y + dY1_2).
  • the first change information corresponding to the degree of freedom of translation in the Z direction indicates a mapping that converts an arbitrary point (x, y) on the image into a point (x + dX1_3, y + dY1_3).
• the second change information corresponding to the translational degree of freedom in the X direction of the second change information set 572 corresponding to the target frame indicates a mapping that converts an arbitrary point (x, y) on the image into a point (x + dX2_1, y + dY2_1).
  • the second change information corresponding to the degree of freedom of translation in the Y direction indicates a mapping that converts an arbitrary point (x, y) on the image into a point (x + dX2_2, y + dY2_2).
  • the second change information corresponding to the translation degree of freedom in the Z direction indicates a mapping that converts an arbitrary point (x, y) on the image into a point (x + dX2_3, y + dY2_3).
  • the calculation unit 58 calculates coefficients a1, a2, and a3 by solving the following four linear equations (1) to (4).
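• equations (1) to (4) themselves are not reproduced in this text. A plausible reconstruction from the mappings defined above, assuming each average difference vector is expressed as a linear combination of the per-degree-of-freedom unit displacements, is:

```latex
\begin{align}
\Delta x_1 &= a_1\,dX1\_1 + a_2\,dX1\_2 + a_3\,dX1\_3 \tag{1}\\
\Delta y_1 &= a_1\,dY1\_1 + a_2\,dY1\_2 + a_3\,dY1\_3 \tag{2}\\
\Delta x_2 &= a_1\,dX2\_1 + a_2\,dX2\_2 + a_3\,dX2\_3 \tag{3}\\
\Delta y_2 &= a_1\,dY2\_1 + a_2\,dY2\_2 + a_3\,dY2\_3 \tag{4}
\end{align}
```

• since this reading gives four equations in three unknowns, an exact solution need not exist; a least-squares solution of the overdetermined system would be the natural choice.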
• the first change information and the second change information indicate the amount of change in the state of the male connector 2a on the image when the robot 30a is controlled by the unit control amount. Therefore, the calculation unit 58 sets the control amount of the translational degree of freedom in the X direction to a1 times the unit control amount, the control amount of the translational degree of freedom in the Y direction to a2 times the unit control amount, and the control amount of the translational degree of freedom in the Z direction to a3 times the unit control amount.
  • the control amounts of these translation degrees of freedom are control amounts for bringing the state of the male connector 2a on the real image closer to the state of the male connector 2a on the target frame by the average of the difference vectors of the plurality of feature points.
  • the calculation unit 58 calculates the control amounts of the three rotational degrees of freedom.
• the calculation unit 58 subtracts the above average x component (Δx1 or Δx2) and y component (Δy1 or Δy2) from the difference vector of each feature point.
  • the calculating unit 58 calculates the control amounts of the three rotational degrees of freedom in which the residual of the difference vector of each feature point is closest to zero.
  • the calculation unit 58 starts a search algorithm with a solution in which the control amounts of the rotational degrees of freedom in the pitch direction, the yaw direction, and the roll direction are 0 as the current solution.
  • the calculation unit 58 simulates a change in the residual of the difference vector of each feature point when the robot 30a is controlled according to each of a plurality of solutions near the current solution.
  • the calculation unit 58 replaces the neighboring solution with the current solution when there is a neighboring solution in which the residual of the difference vector of each feature point is closer to 0 than the current solution based on the simulation result.
  • the calculation unit 58 searches for a solution in which the residual of the difference vector becomes an extreme value by repeating this process.
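• a minimal sketch of this neighborhood search (all names are illustrative; residual(sol) is assumed to simulate the remaining feature-point error for a candidate triple of pitch, yaw, and roll control amounts):

```python
def search_rotation(residual, step=1.0, max_iter=100):
    """Hill-climbing search over (pitch, yaw, roll) control amounts.

    Starts from the zero solution, evaluates neighboring solutions one step
    away along each axis, moves to a neighbor whose simulated residual is
    smaller, and stops at a local extremum.
    """
    current = (0.0, 0.0, 0.0)  # initial solution: all rotational amounts are 0
    for _ in range(max_iter):
        neighbors = [
            tuple(c + d * step if i == axis else c
                  for i, c in enumerate(current))
            for axis in range(3) for d in (-1.0, 1.0)
        ]
        best = min(neighbors, key=residual)
        if residual(best) < residual(current):
            current = best   # replace the current solution with the neighbor
        else:
            return current   # residual is at an extremum
    return current
```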
  • the command unit 59 generates a control command for moving the target robot by the control amount calculated by the calculation unit 58, and outputs the generated control command to the target robot controller.
  • the target robot controller is the robot controller 40a in the first controller 55a, and the robot controller 40b in the second controller 55b.
• End determination unit: the end determination unit 60 calculates a deviation between the state of the target object on the real image and the state of the target object on the last frame of the teaching range, and determines that the control of the target robot is to be ended when the calculated deviation is less than a predetermined threshold value. When determining that the control of the target robot is to be ended, the end determination unit 60 outputs an end notification.
  • the deviation is, for example, an average of distances between corresponding feature points of the target object extracted from the real image and the final frame.
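• under the same (N, 2) coordinate-array assumption as above, this deviation could be computed as:

```python
import numpy as np

def deviation(real_pts: np.ndarray, frame_pts: np.ndarray) -> float:
    """Average Euclidean distance between corresponding feature points."""
    return float(np.linalg.norm(real_pts - frame_pts, axis=1).mean())
```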
  • the threshold value is set according to the accuracy required for the state of the object.
  • the threshold is the threshold Tha in the first controller 55a, and the threshold Thb in the second controller 55b.
  • the threshold value Tha and the threshold value Thb may be the same or different.
  • FIG. 10 is a flowchart illustrating an example of a flow of a change information generation process performed by the change information generation unit.
  • FIG. 10 shows a flow of processing of the change information generation unit 56 of the first control unit 55a.
  • the change information generation unit 56 of the second control unit 55b may generate the change information according to the same method as in FIG.
  • the process of generating change information is performed as advance preparation.
  • step S1 the control device 50 causes the robot 30a to perform a fixed operation of holding the male connector 2a conveyed to a predetermined position by the hand 31a and moving the male connector 2a from above to below. Thereby, the male connector 2a is moved within the field of view of the imaging devices 21 and 22.
  • step S2 the change information generation unit 56 controls the robot 30a such that the same images as the first frames of the teaching ranges of the first reference moving image and the second reference moving image are captured by the imaging devices 21 and 22, respectively.
  • the subroutine of step S2 will be described later.
  • step S3 the change information generating unit 56 sets k to the frame number of the first frame of the teaching range.
  • step S4 the change information generator 56 selects one of the six degrees of freedom.
  • step S5 the change information generation unit 56 generates a control command for moving the hand 31a by the unit control amount in the positive direction of the selected degree of freedom, and outputs the control command to the robot controller 40a.
  • step S6 the change information generator 56 acquires the latest actual image from the imaging devices 21 and 22 after the hand 31a has moved by the unit control amount.
  • step S7 the change information generating unit 56 obtains, from the image processing unit 53, the coordinates of the characteristic points of the male connector 2a extracted from the real image obtained in step S6.
• step S8 the change information generating unit 56 acquires the coordinates of the feature points of the male connector 2a extracted from the k-th frame of the first reference moving image and the second reference moving image from the image processing unit 53.
• step S9 the change information generating unit 56 generates the first change information and the second change information corresponding to the k-th frame and the degree of freedom selected in step S4, based on the coordinates obtained in step S7 and the coordinates obtained in step S8. That is, the change information generating unit 56 generates, as the first change information, information indicating a mapping that converts the coordinates of the feature points extracted from the real image of the imaging device 21 into the coordinates of the feature points extracted from the k-th frame of the first reference moving image. Further, the change information generating unit 56 generates, as the second change information, information indicating a mapping that converts the coordinates of the feature points extracted from the real image of the imaging device 22 into the coordinates of the feature points extracted from the k-th frame of the second reference moving image.
  • step S10 the change information generating unit 56 generates a control command for returning the hand 31a to the original state (the state before the latest step S5), and outputs the control command to the robot controller 40a.
  • the male connector 2a returns to the state before step S5.
  • step S11 the change information generation unit 56 determines whether there is any unselected degree of freedom. If there is an unselected degree of freedom (YES in step S11), the process of generating change information returns to step S4. As a result, steps S4 to S10 are repeated, and the first change information and the second change information are generated for each of the six degrees of freedom.
  • step S12 the change information generation unit 56 determines whether k is the frame number of the last frame in the teaching range.
• step S13 the change information generating unit 56 controls the robot 30a such that the same images as the (k + 1)-th frames of the first reference moving image and the second reference moving image are captured by the imaging devices 21 and 22, respectively.
• specifically, the change information generation unit 56 calculates, by the same method as the processing of the calculation unit 58 described above, a control amount of each of the six degrees of freedom for bringing the state of the male connector 2a on the k-th frame closer to the state of the male connector 2a on the (k + 1)-th frame. That is, for each of the plurality of feature points extracted from the k-th frame, the change information generation unit 56 obtains the difference vector having the feature point as a starting point and the corresponding feature point extracted from the (k + 1)-th frame as an end point.
  • the change information generation unit 56 calculates a control amount having six degrees of freedom based on the difference vector for each feature point and the first change information and the second change information corresponding to the k-th frame.
  • the change information generation unit 56 generates a control command indicating the calculated control amount, and outputs the control command to the robot controller 40a.
  • step S14 the change information generation unit 56 adds 1 to k. After step S14, the process returns to step S4. As a result, also for the (k + 1) th frame, the first change information and the second change information for each of the six degrees of freedom are generated.
  • step S12 If k is the frame number of the last frame (YES in step S12), the first change information set 571 and the second change information set 572 have been generated for all frames, and the change information generation processing ends.
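• gathering steps S3 to S14 into one sketch (the callables passed in are assumptions standing in for the robot, imaging, and image-processing interfaces, not the patent's API):

```python
def generate_change_information(num_frames, dofs, move_by_unit, capture_points,
                                frame_points, fit_mapping, goto_next_frame):
    """Sketch of steps S3-S14 of FIG. 10: for each frame of the teaching range,
    perturb each degree of freedom by the unit control amount, record the
    mapping between real-image and frame feature points, and undo the move.

    move_by_unit(dof, sign): drive the hand by one unit control amount.
    capture_points(): feature-point coordinates from the latest real images.
    frame_points(k): feature-point coordinates from the k-th reference frame.
    fit_mapping(src, dst): fit a mapping from src coordinates to dst coordinates.
    goto_next_frame(k): control the robot until frame k's image is reproduced.
    """
    change_info = {}
    for k in range(num_frames):
        for dof in dofs:                                          # steps S4-S11
            move_by_unit(dof, +1)                                 # step S5
            real_pts = capture_points()                           # steps S6-S7
            change_info[(k, dof)] = fit_mapping(real_pts,
                                                frame_points(k))  # steps S8-S9
            move_by_unit(dof, -1)                                 # step S10: undo
        if k < num_frames - 1:                                    # step S12
            goto_next_frame(k + 1)                                # steps S13-S14
    return change_info
```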
  • FIG. 11 is a flowchart showing the flow of the processing of the subroutine of step S2 shown in FIG.
  • step S21 the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the characteristic points of the male connector 2a extracted from the first frame of the teaching range of the first reference moving image and the second reference moving image.
  • step S22 the change information generation unit 56 acquires the latest real image from the imaging devices 21 and 22.
  • step S23 the change information generating unit 56 acquires from the image processing unit 53 the coordinates of the characteristic points of the male connector 2a extracted from the actual image acquired in step S22.
  • step S24 the change information generation unit 56 determines whether the deviation between the state of the male connector 2a on the real image acquired in step S22 and the state of the male connector 2a on the first frame is less than the threshold Tha.
  • the deviation is, for example, an average of distances between corresponding feature points of the male connector 2a extracted from the real image and the first frame.
• if the deviation is less than the threshold Tha (YES in step S24), the change information generation unit 56 determines that the same images as the first frames are captured by the imaging devices 21 and 22, respectively, and ends the process. Otherwise, the process proceeds to step S25.
  • step S25 the change information generating unit 56 selects one of the six degrees of freedom.
  • step S26 the change information generator 56 selects one of the positive direction and the negative direction as the control direction.
  • step S27 the change information generating unit 56 generates a control command for moving the hand 31a by the unit control amount in the selected control direction with the selected degree of freedom, and outputs the control command to the robot controller 40a.
  • step S28 the change information generation unit 56 acquires the actual image from the imaging devices 21 and 22 after the hand 31a moves by the unit control amount.
  • step S29 the change information generating unit 56 obtains, from the image processing unit 53, the coordinates of the characteristic points of the male connector 2a extracted from the actual image obtained in step S28.
  • step S30 the change information generator 56 calculates a deviation between the state of the male connector 2a on the real image acquired in step S28 and the state of the male connector 2a on the first frame.
  • step S31 the change information generation unit 56 generates a control command for returning the hand 31a to the original state (the state before the latest step S27), and outputs the control command to the robot controller 40a. Thereby, the male connector 2a returns to the state before step S27.
  • step S32 the change information generator 56 determines whether there is an unselected control direction. If there is an unselected control direction (YES in step S32), the process returns to step S26.
  • step S33 the change information generation unit 56 determines whether there is an unselected degree of freedom. If there is an unselected degree of freedom (YES in step S33), the process returns to step S25.
• step S34 the change information generating unit 56 controls the robot 30a via the robot controller 40a so as to move the hand 31a by the unit control amount in the degree of freedom and control direction corresponding to the minimum deviation. After step S34, the process returns to step S22.
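• the subroutine of FIG. 11 is essentially a greedy descent over the twelve candidate unit moves (six degrees of freedom times two directions); a minimal sketch under the same assumed callables:

```python
def align_to_first_frame(dofs, move_by_unit, current_deviation, threshold):
    """Sketch of steps S21-S34 of FIG. 11: probe each degree of freedom in both
    directions, undo each probe, then commit the single unit move that gave the
    smallest deviation from the first frame; repeat until below the threshold.

    current_deviation() is assumed to capture the latest real images and return
    the deviation from the first frame of the teaching range.
    """
    while current_deviation() >= threshold:        # step S24
        best = None
        for dof in dofs:                           # steps S25, S33
            for direction in (+1, -1):             # steps S26, S32
                move_by_unit(dof, direction)       # step S27
                dev = current_deviation()          # steps S28-S30
                move_by_unit(dof, -direction)      # step S31: undo the probe
                if best is None or dev < best[0]:
                    best = (dev, dof, direction)
        move_by_unit(best[1], best[2])             # step S34: commit best move
```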
  • FIG. 12 is a flowchart illustrating an example of a process flow of controlling the target robot such that the state of the target object changes along the first reference moving image and the second reference moving image.
  • step S41 the control device 50 determines whether or not the end notification has been output from the end determination units 60 of all the control units (the first control unit 55a and the second control unit 55b). When the end notification has been output from the end determination units 60 of all the control units (YES in step S41), the process ends.
  • step S42 the control device 50 acquires the actual images captured by the imaging devices 21 and 22. Step S42 is performed for each imaging cycle.
  • step S43 the image processing unit 53 detects all objects (male connector 2a and female connector 2b) from the real image and the target frame by template matching, and extracts the coordinates of the feature points of each object.
  • step S44 the target frame selection unit 54 selects a target frame from the first reference moving image and the second reference moving image.
• step S45 the target frame selection unit 54 specifies the target object whose teaching range includes the target frame, and outputs a control instruction to the control unit corresponding to the specified target object (at least one of the first control unit 55a and the second control unit 55b).
  • step S46 the control unit (at least one of the first control unit 55a and the second control unit 55b) that has received the control instruction controls the target robot.
• after step S46, the process returns to step S41. If NO in step S41, the series of processes in steps S42 to S46 is repeated for each imaging cycle. Further, at this time, step S42 of acquiring the next real image may be started while the target robot is being controlled in step S46. Thus, the target robot is continuously controlled according to the latest real image without stopping the operation of the target robot. As a result, the state of the object can be changed quickly.
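• the overlap between steps S42 and S46 amounts to a pipelined loop; the following simplified sketch (thread usage and function names are assumptions) illustrates acquiring the next image while the previous control command is still being executed:

```python
import threading

def control_loop(all_done, acquire_images, process_and_select, control_robots):
    """Sketch of FIG. 12: steps S42-S46 repeated per imaging cycle, with the
    next image acquisition (S42) started while the robots are still moving
    under the previous control command (S46)."""
    worker = None
    while not all_done():                      # step S41
        images = acquire_images()              # step S42 (every imaging cycle)
        target = process_and_select(images)    # steps S43-S45
        if worker is not None:
            worker.join()                      # wait for the previous S46
        worker = threading.Thread(target=control_robots, args=(images, target))
        worker.start()                         # step S46 runs concurrently
    if worker is not None:
        worker.join()
```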
  • FIG. 13 is a flowchart showing the flow of the processing of the subroutine of step S44 shown in FIG.
  • FIG. 14 is a diagram illustrating the relationship between the closest frame and the target frame.
  • step S51 the target frame selection unit 54 acquires from the image processing unit 53 the coordinates of the feature points of all the objects extracted from each frame of the first reference moving image and the second reference moving image.
  • step S52 the target frame selection unit 54 obtains, from the image processing unit 53, the coordinates of the feature points of all the objects extracted from the real images captured by the imaging devices 21 and 22.
  • step S53 the target frame selection unit 54 determines whether the first target frame selection has been completed.
• step S54 the target frame selecting unit 54 determines whether or not the previous target frame is the last frame of the teaching range corresponding to any one of the objects.
• step S55 the target frame selecting unit 54 calculates a deviation between the state on the real image and the state on the final frame for the target corresponding to the teaching range to which the final frame belongs, and determines whether or not the deviation is less than the threshold.
  • the deviation is, for example, an average of distances between corresponding feature points of the object in the real image and the final frame.
  • step S56 the target frame selecting unit 54 selects the same frame as the previous target frame as the target frame. After step S56, the process ends.
  • step S55 If the deviation is less than the threshold (YES in step S55), it is determined that the state of the target object has reached the state on the last frame, and the process proceeds to step S57. If the first target frame selection has not been completed (NO in step S53), and if the previous target frame is not the last frame (NO in step S54), the process proceeds to step S57.
  • step S57 the target frame selection unit 54 calculates the deviation between the state of all objects on the real image and the state of all objects on each frame, and specifies the frame with the minimum deviation as the closest frame.
  • the deviation is, for example, an average of distances between corresponding feature points of the object in the real image and each frame.
• for example, the target frame selection unit 54 calculates a first deviation between the states of all the objects on the real image captured by the imaging device 21 and the states of all the objects on the k-th frame of the first reference moving image. Further, the target frame selection unit 54 calculates a second deviation between the states of all the objects on the real image captured by the imaging device 22 and the states of all the objects on the k-th frame of the second reference moving image. The target frame selection unit 54 calculates the average of the first deviation and the second deviation as the deviation corresponding to the k-th frame.
  • step S58 the target frame selection unit 54 determines whether there is a last frame in any of the teaching ranges up to a frame that is a predetermined number later than the closest frame.
• step S59 the target frame selecting unit 54 selects the final frame as the target frame. If there are a plurality of final frames within the predetermined number of frames after the closest frame, the target frame selecting unit 54 selects the final frame having the smallest frame number.
  • step S60 the target frame selecting unit 54 selects a frame that is a predetermined number later than the closest frame as the target frame. After step S60, the target frame selection process ends.
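• omitting steps S51 to S56, the closest-frame logic of steps S57 to S60 could be sketched as follows (the deviation function and frame bookkeeping are illustrative assumptions):

```python
def select_target_frame(num_frames, deviation_to_frame, final_frames, lookahead):
    """Sketch of steps S57-S60 of FIG. 13: find the frame whose object states
    are closest to the current real image, then aim a fixed number of frames
    ahead, stopping at any final frame encountered within that window.

    deviation_to_frame(k): deviation between the real image and frame k.
    final_frames: set of frame numbers that end some teaching range.
    """
    closest = min(range(num_frames), key=deviation_to_frame)        # step S57
    ahead = min(closest + lookahead, num_frames - 1)
    finals = [k for k in range(closest, ahead + 1) if k in final_frames]
    if finals:                                                      # step S58
        return min(finals)                                          # step S59
    return ahead                                                    # step S60
```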
  • FIG. 15 is a flowchart showing the flow of the processing of the subroutine of step S46 shown in FIG.
  • step S61 the end determination unit 60 determines whether the target frame is the last frame in the teaching range.
• step S62 the end determination unit 60 determines whether or not the deviation between the state of the target on the real image and the state of the target on the final frame is less than the threshold.
  • step S63 the termination determination unit 60 outputs a termination notification. After step S63, the process ends.
  • step S61 If the target frame is not the last frame (NO in step S61) and if the deviation is equal to or larger than the threshold (NO in step S62), the process proceeds to step S64.
• step S64 the calculation unit 58 calculates the control amount of each of a plurality of degrees of freedom for bringing the state of the target on the real image closer to the state of the target on the target frame, based on the first change information set 571 and the second change information set 572 corresponding to the target frame.
  • step S65 the command unit 59 generates a control command indicating the calculated control amount, and outputs the control command to the target robot controller. After step S65, the process ends.
  • the first control unit 55a and the second control unit 55b each control the target robot according to the flow shown in FIG.
• thereby, the robots 30a and 30b are controlled such that the states of the male connector 2a and the female connector 2b on the real image approach the states of the male connector 2a and the female connector 2b on the target frame, respectively.
  • the states of the male connector 2a and the female connector 2b change in a coordinated manner according to the target frame.
  • the control system 1 includes the robots 30a and 30b, the imaging devices 21 and 22 for imaging the male connector 2a and the female connector 2b, and the control device 50 for controlling the robots 30a and 30b.
  • the robots 30a and 30b change the states of the male connector 2a and the female connector 2b, respectively.
• the imaging devices 21 and 22 are installed at fixed positions different from the robots 30a and 30b, and image the male connector 2a and the female connector 2b supported by the robots 30a and 30b, respectively.
  • the control device 50 stores a first reference moving image and a second reference moving image showing samples of the male connector 2a and the female connector 2b.
  • the first reference moving image and the second reference moving image include a plurality of frames arranged in chronological order.
  • the control device 50 acquires change information for each of the male connector 2a and the female connector 2b.
  • the change information corresponding to the male connector 2a indicates the relationship between the control amount of the robot 30a and the change amount of the state of the male connector 2a on the images of the imaging devices 21 and 22.
  • the change information corresponding to the female connector 2b indicates the relationship between the control amount of the robot 30b and the change amount of the state of the female connector 2b on the images of the imaging devices 21 and 22.
• the control device 50 performs a first process of acquiring the real images captured by the imaging devices 21 and 22, a second process of selecting a target frame from the plurality of frames, and a third process of controlling each of the robots 30a and 30b based on the real images and the target frame.
• the control device 50 calculates a control amount of the robot 30a for bringing the state of the male connector 2a on the real image closer to the state of the male connector 2a on the target frame based on the change information corresponding to the male connector 2a, and controls the robot 30a according to the calculated control amount.
• similarly, the control device 50 calculates a control amount of the robot 30b for bringing the state of the female connector 2b on the real image closer to the state of the female connector 2b on the target frame based on the change information corresponding to the female connector 2b, and controls the robot 30b according to the calculated control amount.
• thereby, the states of the male connector 2a and the female connector 2b on the real image can each be brought closer to the states of the male connector 2a and the female connector 2b on the target frame. That is, the states of the male connector 2a and the female connector 2b change in conjunction with each other according to the target frame.
  • the control device 50 repeatedly executes a series of processes including the first to third processes, and starts the first process of the next series of processes while performing the third process.
  • the robots 30a and 30b are continuously controlled according to the latest actual image without stopping the operations of the robots 30a and 30b.
  • the states of the male connector 2a and the female connector 2b can be changed quickly.
  • FIG. 16 is a diagram showing another example of the reference moving image that is a sample of the connection between the male connector and the female connector.
• FIG. 16 shows a state in which the male connector 2a moves toward the female connector 2b along a direction parallel to the substrate 5 and is connected to the female connector 2b.
  • the male connector 2a moves along the L-shaped path A.
  • the frame 74 shows the male connector 2a located above the substrate 5.
  • the frame 75 shows the male connector 2a reaching the upper surface of the substrate 5.
  • the frame 76 shows the male connector 2a connected to the female connector 2b.
• if the male connector 2a goes directly to the state shown in the frame 76 without passing through the state shown in the frame 75, the male connector 2a approaches the female connector 2b from an inclined direction. Therefore, there is a possibility that the pins of the male connector 2a cannot be inserted into the insertion holes of the female connector 2b.
  • the target frame selection unit 54 of the present modification receives the specification of a frame to be passed (hereinafter, referred to as a “pass-necessary frame”), and always selects the specified pass-necessary frame as a target frame.
• when receiving a pass-necessary frame, the target frame selection unit 54 also receives designation of the object that must pass through the state indicated by the pass-necessary frame.
• for example, the target frame selection unit 54 displays, on the display unit 532 (see FIG. 4), a screen prompting the worker to specify a pass-necessary frame.
• the worker confirms each frame of the first reference moving image and the second reference moving image, and operates the input device 534 (see FIG. 4) to specify the pass-necessary frame and the object that must pass through the state indicated by the pass-necessary frame (the target object corresponding to the pass-necessary frame).
  • FIG. 17 is a flowchart illustrating an example of the flow of a target frame selection process according to the first modification of the first embodiment.
  • the flowchart shown in FIG. 17 is different from the flowchart shown in FIG. 13 only in that steps S74, S78, and S79 are included instead of steps S54, S58, and S59. Therefore, only steps S74, S78, and S79 will be described.
• step S74 the target frame selection unit 54 determines whether or not the previous target frame is the last frame or a pass-necessary frame.
• step S78 the target frame selecting unit 54 determines whether or not there is at least one of a last frame and a pass-necessary frame among the frames up to a predetermined number later than the closest frame.
• step S79 the target frame selection unit 54 selects the frame that is at least one of a last frame and a pass-necessary frame as the target frame. Note that if there are a plurality of such frames within the predetermined number of frames after the closest frame, the target frame selecting unit 54 selects the frame having the smallest frame number among the plurality of frames.
• in this way, the pass-necessary frame is always selected as the target frame. Further, when the pass-necessary frame is selected as the target frame, the target frame is updated only after the deviation between the state of the target on the real image and the state of the target on the pass-necessary frame becomes less than the threshold. This makes it possible to reliably change the state of the target object corresponding to the pass-necessary frame to the state on the pass-necessary frame.
• the target frame selection unit 54 may determine that the control system 1 is abnormal when the state of the target object corresponding to the pass-necessary frame does not match the state on the pass-necessary frame even after a specified time has elapsed since the pass-necessary frame was selected as the target frame.
  • FIG. 18 is a flowchart showing an example of the flow of the abnormality determination process.
• the target frame selecting unit 54 resets a timer when selecting a pass-necessary frame as the target frame.
• step S82 the target frame selection unit 54 determines, for the target corresponding to the pass-necessary frame, whether the deviation between the state on the real images captured by the imaging devices 21 and 22 and the state on the pass-necessary frame is less than the threshold. If the deviation is less than the threshold (YES in step S82), the abnormality determination processing ends.
  • step S83 the target frame selecting unit 54 determines whether the timer value exceeds a specified time.
  • step S83 If the timer value does not exceed the specified time (NO in step S83), the abnormality determination process returns to step S82.
  • step S84 the target frame selection unit 54 determines that some abnormality has occurred in the control system 1. Thereby, the countermeasure against the abnormality of the control system 1 can be started quickly. After step S84, the abnormality determination processing ends.
• at this time, the control device 50 may display a notification of the occurrence of the abnormality on the display unit 532, or may stop the control of the robots 30a and 30b.
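• the timer logic of FIG. 18 could be sketched as a simple watchdog (a sketch, assuming a polling callable; not the patent's implementation):

```python
import time

def abnormality_watchdog(deviation_below_threshold, timeout_s: float) -> bool:
    """Sketch of steps S82-S84 of FIG. 18: after a pass-necessary frame is
    selected as the target frame, wait until the target object reaches the
    frame's state; report an abnormality if the specified time elapses first.

    deviation_below_threshold() is assumed to check the latest real images.
    """
    start = time.monotonic()                        # timer reset on selection
    while not deviation_below_threshold():          # step S82
        if time.monotonic() - start > timeout_s:    # step S83
            return True                             # abnormality (step S84)
        time.sleep(0.01)                            # poll at a short interval
    return False                                    # state reached in time
```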
  • control system 1 connects (assembles) the male connector 2a and the female connector 2b.
• however, the control system 1 may assemble two other objects.
  • FIG. 19 is a schematic diagram illustrating an object of the control system according to the second modification of the first embodiment.
  • the control system 1 according to the second modification causes the upper case 6a and the lower case 6b to engage with each other in a production line of an industrial product or the like.
  • the upper case 6a and the lower case 6b have a substantially rectangular box shape in plan view.
  • the upper case 6a is arranged so as to open downward.
  • the lower case 6b is arranged so as to open upward.
• two engaging claws 7a and 7b are formed at the upper end of the lower case 6b on the near side in the drawing.
  • Two engaging claws are also formed at the upper end of the lower case 6b on the far side in the drawing.
  • the upper case 6a moves downward from above the lower case 6b and engages with four engagement claws of the lower case 6b.
  • the upper case 6a is gripped by the hand 31a (see FIG. 1) of the robot 30a.
  • the lower case 6b is mounted on a stage 31b (see FIG. 1) of the robot 30b.
  • the robot 30a further includes control rods 32a to 32d (see FIG. 19) for changing the state (the shape here) of the upper case 6a with two degrees of freedom.
  • the robot 30a can apply a force corresponding to the control amount to the control rods 32a and 32b facing each other in a direction to approach each other.
  • the robot 30a can apply a force according to the control amount to the control rods 32c and 32d facing each other in a direction to approach each other.
• the control rods 32a and 32b are in contact with two opposite side walls of the upper case 6a, and the control rods 32c and 32d are in contact with the other two opposite side walls of the upper case 6a. Therefore, when a force that brings the control rods closer to each other is applied, the upper case 6a is deformed. The amount of deformation differs depending on the control amount.
  • the imaging devices 21 and 22 are arranged at positions where the engagement claws 7a and 7b on the lower side of the lower case 6b can be imaged. However, the imaging devices 21 and 22 cannot image the two engagement claws on the back side of the lower case 6b in the drawing. Therefore, the control system 1 according to the second modification of the first embodiment further includes imaging devices 23 and 24. The imaging devices 23 and 24 are arranged at positions where two engagement claws on the back side of the lower case 6b in the drawing can be imaged.
  • the control device 50 of the second modification stores four reference moving images respectively corresponding to the imaging devices 21 to 24.
  • the change information generating unit 56 may generate four change information sets respectively corresponding to the four reference moving images.
  • the robot 30a changes the position and orientation of the upper case 6a with six degrees of freedom, and changes the shape of the upper case 6a with two degrees of freedom. Therefore, the change information generation unit 56 generates the first change information and the second change information corresponding to each of the eight degrees of freedom for the upper case 6a.
• as the reference moving images, moving images showing how the upper case 6a is engaged with the lower case 6b after the shape of the upper case 6a is deformed so as to easily engage with the four engagement claws of the lower case 6b are prepared in advance.
• the calculation unit 58 calculates the control amounts for the control rods 32a and 32b and the control amounts for the control rods 32c and 32d for bringing the shape of the upper case 6a on the real image closer to the shape of the upper case 6a on the target frame.
  • the upper case 6a can be deformed in the same manner as the reference moving image.
  • the upper case 6a can be easily engaged with the lower case 6b.
• the calculation unit 58 may calculate, by another method, the control amount of each of the plurality of degrees of freedom for bringing the state of the target on the real images captured by the imaging devices 21 and 22 closer to the state of the target on the target frame.
• the change information generation unit 56 calculates the control amount of each of the plurality of degrees of freedom for bringing the state of the object on the k-th frame closer to the state of the object on the (k + 1)-th frame. The calculating unit 58 may store the control amount calculated for each pair of two consecutive frames as an inter-frame control amount, and may use the inter-frame control amounts to calculate the control amount of each of the plurality of degrees of freedom for bringing the state of the target on the real image closer to the state of the target on the target frame.
• specifically, the calculation unit 58 calculates, based on the first change information and the second change information corresponding to the closest frame, a control amount β of each of the plurality of degrees of freedom for bringing the state of the target on the real image closer to the state of the target on the closest frame. Further, the calculation unit 58 calculates the sum α of the inter-frame control amounts from the closest frame to the target frame. The calculating unit 58 may calculate the sum of the control amount β and the sum α as the control amount for bringing the state of the target on the real image closer to the state of the target on the target frame.
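• in symbols (reconstructed here because the original symbols are garbled in this text), with c the closest frame, t the target frame, β the control amount toward frame c, and u_k the stored inter-frame control amount from frame k to frame k + 1, the combined control amount would be:

```latex
u \;=\; \beta + \alpha, \qquad \alpha \;=\; \sum_{k=c}^{t-1} u_k
```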
• the calculation unit 58 may calculate the control amounts of the plurality of degrees of freedom by performing known model predictive control (Adachi, "Basics of Model Predictive Control", Journal of the Robotics Society of Japan, July 2014, Vol. 32, No. 6, p. 9-12) (Non-Patent Document 2).
• in this case, the target frame selection unit 54 selects a plurality of frames included in the prediction horizon period within the teaching range as target frames.
• the calculation unit 58 calculates the control amounts during the control horizon period so as to minimize the deviation between the state of the object on the target frames and the state of the object on the images captured by the imaging devices 21 and 22 during the prediction horizon period.
  • the image processing unit 53 may detect the target from the target image using the 3D-CAD data of the target.
• the reference moving image (first reference moving image, second reference moving image) may be created by CG (Computer Graphics).
• in the above description, the change information generation unit 56 generates, as the change information, information indicating the mapping that converts the coordinates of the feature points of the target object extracted from the target frame into the coordinates of the corresponding feature points extracted from the captured image. However, the change information generating unit 56 may generate, as the change information, information indicating the mapping that converts the coordinates of the feature points of the target object extracted from the captured image into the coordinates of the corresponding feature points extracted from the target frame.
  • the control device 50 may display the first reference moving image and the second reference moving image on the display unit 532, receive a moving image editing instruction from an operator, and edit the moving image. For example, the worker may delete unnecessary frames from the first reference moving image and the second reference moving image.
  • the robot 30a may change the state (here, the size) of an object that expands and contracts, such as a balloon, for example.
• in this case, the control device 50 acquires change information indicating a change in the size of the target object when the robot 30a is controlled by the unit control amount. Then, the control device 50 controls the robot 30a based on the change information such that the size of the target on the real image approaches the size of the target on the target frame.
  • the male connector 2a and the female connector 2b are connected by moving the male connector 2a toward the female connector 2b.
  • the male connector 2a may be placed on the stage 31b, and the female connector 2b may be gripped by the hand 31a.
  • the male connector 2a and the female connector 2b are connected by moving the female connector 2b toward the male connector 2a.
  • the reference moving image storage unit 51 of the control device 50 stores the reference moving image.
  • an external device of the control device 50 may store the reference moving image.
  • the reference moving image storage unit 51 may store, instead of or in addition to the reference moving image, the coordinates and characteristic amounts of the feature points of each object extracted from each frame of the reference moving image. Thus, the processing of the image processing unit 53 for each frame of the reference moving image can be omitted.
  • FIG. 20 is a schematic diagram illustrating an outline of a control system according to the second embodiment.
  • the control system 1A solders the electric wires 8e to the pads 8f of the substrate 9 using the soldering iron 8c and the solder feeder 8d in a production line of an industrial product or the like.
• the control system 1A is different from the control system 1 shown in FIG. 1 in that it includes robots 30c to 30f, robot controllers 40c to 40f, and a control device 50A instead of the robots 30a and 30b, the robot controllers 40a and 40b, and the control device 50.
  • the imaging devices 21 and 22 image the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f as objects from different directions.
  • the robot 30c is a mechanism for changing the state (the position and the posture here) of the soldering iron 8c, and is, for example, a vertical articulated robot.
  • the robot 30c has a hand 31c at its tip for holding the soldering iron 8c, and changes the position and posture of the hand 31c with six degrees of freedom.
  • the robot 30d is a mechanism for changing the state (here, position and posture) of the solder feeder 8d, and is, for example, a vertical articulated robot.
  • the robot 30d has a hand 31d holding the solder feeder 8d at the tip, and changes the position and posture of the hand 31d with six degrees of freedom.
  • the robot 30e is a mechanism for changing the state (here, position and posture) of the electric wire 8e, and is, for example, a vertical articulated robot.
  • the robot 30e has a hand 31e holding the electric wire 8e at the tip, and changes the position and the posture of the hand 31e with six degrees of freedom.
• the robot 30f is a mechanism for changing the state (in this case, position and orientation) of the pad 8f on the substrate 9, and is, for example, an XYθ stage.
  • the robot 30f has a stage 31f on which the substrate 9 is placed, and changes the position and posture of the stage 31f with three degrees of freedom.
  • the robots 30c to 30e have the same configuration as the robot 30a shown in FIG.
  • the robot 30f has the same configuration as the robot 30b shown in FIG. 1 except that the target placed on the stage 31f is different.
  • the robot controllers 40c to 40e control the operations of the robots 30c to 30e, respectively, according to the control commands received from the control device 50A.
  • the robot controllers 40c to 40e have the same configuration as the robot controller 40a shown in FIG.
  • the robot controller 40f controls the operation of the robot 30f according to the control command received from the control device 50A.
  • the robot controller 40f has the same configuration as the robot controller 40b shown in FIG.
  • the control device 50A controls the robots 30c to 30f via the robot controllers 40c to 40f, respectively.
  • the control device 50A stores a first reference moving image and a second reference moving image showing samples of the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f.
  • the first reference moving image is a moving image when viewed from the position of the imaging device 21.
  • the second reference moving image is a moving image when viewed from the position of the imaging device 22.
  • FIG. 21 is a diagram illustrating an example of a reference moving image (first reference moving image or second reference moving image) according to the second embodiment.
  • FIG. 21 shows frames 77 to 83 of the reference moving image.
  • the frame 77 is a frame when the state of the pad 8f has reached a desired state.
  • the frame 78 is a frame when the electric wire 8e reaches the pad 8f.
  • the frame 79 is a frame when the tip of the soldering iron 8c contacts the pad 8f.
  • the frame 80 is a frame when the tip of the solder feeder 8d contacts the tip of the soldering iron 8c.
  • the frame 81 is a frame when the solder supplied from the solder feeder 8d and melted (hereinafter referred to as “molten solder 8g”) has a desired size.
  • the frame 82 is a frame when the solder feeder 8d is moved away from the pad 8f.
  • the frame 83 is a frame when the soldering iron 8c is moved away from the pad 8f.
• the control device 50A acquires change information indicating a change in the state of the soldering iron 8c on the images obtained from the imaging devices 21 and 22 when the robot 30c is controlled by the unit control amount for each of the six degrees of freedom.
  • the control device 50A acquires change information indicating a change in the state of the solder feeder 8d on the images obtained from the imaging devices 21 and 22 when the robot 30d is controlled by the unit control amount for each of the six degrees of freedom.
  • the control device 50A acquires change information indicating a change in the state of the electric wire 8e on the images obtained from the imaging devices 21 and 22 when the robot 30e is controlled by the unit control amount for each of the six degrees of freedom.
  • the control device 50A acquires change information indicating a change in the state of the pad 8f on the images obtained from the imaging devices 21 and 22 when the robot 30f is controlled by the unit control amount for each of the three degrees of freedom.
• as in the first embodiment, the control device 50A controls the target robot via the target robot controller such that the state of the target on the real images captured by the imaging devices 21 and 22 approaches the state of the target on the target frames of the first reference moving image and the second reference moving image.
  • the objects are the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f.
  • the target robot controllers are the robot controllers 40c to 40f.
  • the target robots are the robots 30c to 30f.
• the control device 50A calculates a first deviation between the state of the first object on the real image and the state of the first object on the first target frame, and a second deviation between the state of the second object on the real image and the state of the second object on the second target frame.
  • the control device 50A controls the robots 30c to 30f such that the time when the first deviation is less than the first threshold value and the time when the second deviation is less than the second threshold value satisfy prescribed conditions. This makes it possible to control the time when the states of the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f reach the target frame.
  • control device 50A has a hardware configuration as shown in FIG. 4, as in the first embodiment. Therefore, detailed description of the hardware configuration of the control device 50A is omitted.
  • FIG. 22 is a block diagram showing a functional configuration of the control device according to the second embodiment.
• compared with the control device 50 shown in FIG. 5, the control device 50A differs in that it includes a target frame selection unit 54A instead of the target frame selection unit 54, and includes a first control unit 55c, a second control unit 55d, a third control unit 55e, and a fourth control unit 55f instead of the first control unit 55a and the second control unit 55b.
  • the reference moving image storage unit 51 stores a first reference moving image and a second reference moving image showing samples of four objects (the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f).
  • the teaching range selection unit 52 selects a teaching range for each of the four objects (the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f).
  • the image processing unit 53 detects four objects (the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f) from the target image.
  • the image processing unit 53 also detects the molten solder 8g (see FIG. 21) on the pad 8f from the target image as a target.
  • the image processing unit 53 creates a template of the molten solder 8g in advance, and extracts a feature point and a feature amount of the molten solder 8g from the target image.
• the state (here, the size) of the molten solder 8g changes according to the amount of the molten solder. Therefore, it is preferable that the image processing unit 53 performs color extraction and labeling on the target image, and extracts SIFT features, which are not easily affected by enlargement/reduction, as feature amounts. Thereby, the image processing unit 53 can easily detect the molten solder 8g from the target image.
• the target frame selection unit 54A selects a target frame from the first reference moving image and the second reference moving image, similarly to the target frame selection unit 54 of the first embodiment. In addition, similarly to the first modification of the first embodiment, the target frame selection unit 54A accepts designation of a pass-necessary frame and designation of the object that must pass through the state indicated by the pass-necessary frame.
• furthermore, the target frame selecting unit 54A can receive designation of a time difference between two consecutive pass-necessary frames.
• hereinafter, the first of two consecutive pass-necessary frames is referred to as a "first related frame", and the subsequent frame is referred to as a "second related frame".
  • the target frame selection unit 54A receives designation of a first related frame, a second related frame, and a time difference.
• the target frame selecting unit 54A selects the first related frame as the target frame, and then selects the second related frame as the next target frame. That is, when the deviation between the state on the real image and the state on the first related frame of the object corresponding to the first related frame is less than the threshold, the target frame selecting unit 54A selects the second related frame as the target frame.
  • FIG. 23 is a diagram showing an example of a screen for designating a frame that must pass, a related frame, and a time difference.
• the control device 50A displays a screen as shown in FIG. 23 on the display unit 532, and accepts designation of a pass-necessary frame, the object corresponding to the pass-necessary frame, a first related frame, a second related frame, and a time difference.
• the worker operates the input device 534 to specify the pass-necessary frame, the object corresponding to the pass-necessary frame, the first related frame, the second related frame, and the time difference.
• the teaching range selection unit 52 selects the frames 83 and 82 as the last frames of the teaching ranges corresponding to the pad 8f and the electric wire 8e, respectively.
• for example, the target frame selection unit 54A receives the frames 79 and 83 as the pass-necessary frames corresponding to the target object "soldering iron 8c".
• the target frame selecting unit 54A receives the frames 80 and 82 as the pass-necessary frames corresponding to the target object "solder feeder 8d".
• the target frame selection unit 54A receives the frame 81 as the pass-necessary frame corresponding to the target object "molten solder 8g".
• further, the target frame selection unit 54A receives an instruction to set the frames 79 and 80, which are two consecutive pass-necessary frames, as a first related frame and a second related frame, respectively, and to set the time difference to "3 seconds". The target frame selecting unit 54A also receives an instruction to set the frames 81 and 82, which are two consecutive pass-necessary frames, as a first related frame and a second related frame, respectively, and to set the time difference to "0.5 seconds".
  • the first control unit 55c controls the robot 30c via the robot controller 40c to change the state of the soldering iron 8c.
  • the second controller 55d controls the robot 30d via the robot controller 40d to change the state of the solder feeder 8d.
  • the third control unit 55e controls the robot 30e via the robot controller 40e to change the state of the electric wire 8e.
  • the fourth control unit 55f controls the robot 30f via the robot controller 40f to change the state of the pad 8f.
• each of the first control unit 55c, the second control unit 55d, the third control unit 55e, and the fourth control unit 55f includes a change information generation unit 56, a change information storage unit 57, a calculation unit 58, a command unit 59, and an end determination unit 60 (see FIG. 7).
  • the target object, the target robot, and the target robot controller in each unit are different from those in the first embodiment.
  • the object is the soldering iron 8c in the first control unit 55c, the solder feeder 8d in the second control unit 55d, the electric wire 8e in the third control unit 55e, and the pad 8f in the fourth control unit 55f.
  • the target robot is the robot 30c in the first control unit 55c, the robot 30d in the second control unit 55d, the robot 30e in the third control unit 55e, and the robot 30f in the fourth control unit 55f.
• the target robot controller is the robot controller 40c in the first controller 55c, the robot controller 40d in the second controller 55d, the robot controller 40e in the third controller 55e, and the robot controller 40f in the fourth controller 55f.
  • the calculating unit 58 has the following functions in addition to the functions of the first embodiment. That is, when the target frame selecting unit 54 receives designation of the first related frame, the second related frame, and the time difference, the calculating unit 58 adjusts the control amount so as to satisfy the time difference.
  • the calculating unit 58 adjusts the control amount as follows.
• the calculation unit 58 calculates, as the scheduled arrival time, the time obtained by adding the specified time difference to the time at which the deviation between the state on the real image and the state on the first related frame became less than the threshold for the object corresponding to the first related frame.
  • the calculation unit 58 calculates a control amount of each degree of freedom for bringing the state of the object corresponding to the second related frame on the real image closer to the state of the object on the second related frame.
• the calculation unit 58 adjusts the control amount by multiplying the calculated control amount by the imaging cycle / (scheduled arrival time − current time).
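• written out (reconstructed from the description above; T denotes the imaging cycle and t_now the current time), the adjustment scales the control amount so that the remaining motion is spread over the time left until the scheduled arrival:

```latex
u_{\mathrm{adjusted}} \;=\; u_{\mathrm{calculated}} \times \frac{T}{t_{\mathrm{arrival}} - t_{\mathrm{now}}}
```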
  • control device 50A controls the target robot so as to change the state of the target along the reference moving image according to the flowcharts shown in FIGS.
  • target frame selection unit 54A performs a target frame selection process according to the flowchart shown in FIG.
  • FIG. 24 is a flowchart showing an example of the flow of a target frame selection process in the second embodiment.
  • the flowchart shown in FIG. 24 differs from the flowchart shown in FIG. 17 only in that steps S81 and S82 are included. Therefore, steps S81 and S82 will be described below.
  • step S55 If YES in step S55, the process proceeds to step S81.
• step S81 the target frame selection unit 54A determines whether or not the target frame is a first related frame indicated by the specified condition.
  • step S82 the target frame selecting unit 54A selects the second related frame corresponding to the first related frame as the target frame. After step S82, the target frame selection process ends.
  • step S81 If the target frame is the first related frame (YES in step S81), the process of selecting a target frame moves to step S57.
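The branch structure just described can be summarized as follows; this is an illustrative Python reconstruction of FIG. 24 (the function name, the integer frame indices, and the treatment of step S57 as "advance in the normal way" are assumptions):

```python
def select_target_frame(target_frame, reached_target_state,
                        first_related_frames, second_related_frame_of):
    """One pass of the target frame selection of FIG. 24 (illustrative).

    reached_target_state: result of step S55 (deviation below threshold?).
    first_related_frames: frame indices designated as first related frames.
    second_related_frame_of: mapping from each first related frame to its
        corresponding second related frame.
    """
    if not reached_target_state:                      # NO in S55: keep target
        return target_frame
    if target_frame in first_related_frames:          # YES in S81
        return second_related_frame_of[target_frame]  # S82: jump
    return target_frame + 1                           # S57: normal update
```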
  • the target robots are controlled so that the time at which the deviation between the state of the first object on the real image and the state of the first object on the first target frame becomes less than a threshold and the time at which the deviation between the state of the second object on the real image and the state of the second object on the second target frame becomes less than a threshold satisfy the prescribed condition.
  • specifically, the target robots are controlled as follows.
  • let t1 be the time at which the deviation between the state of the soldering iron 8c on the real image and the state of the soldering iron 8c on the frame 79 becomes less than the threshold Thc, and let t2 be the time at which the deviation between the state of the solder feeder 8d on the real image and the state of the solder feeder 8d on the frame 80 becomes less than the threshold Thd.
  • the control device 50A controls the robot 30d so as to satisfy the specified condition that (t2 - t1) is 3 seconds.
  • thereby, the solder feeder 8d is brought into contact with the soldering iron 8c after the pad 8f has been sufficiently heated.
  • similarly, let t3 be the time at which the deviation between the state of the molten solder 8g on the real image and the state of the molten solder 8g on the frame 81 becomes less than the threshold Thg, and let t4 be the time at which the deviation between the state of the solder feeder 8d on the real image and the state of the solder feeder 8d on the frame 82 becomes less than the threshold Thd.
  • the control device 50A controls the robot 30d so as to satisfy the specified condition that (t4 - t3) is 0.5 seconds.
  • the state of the molten solder 8g indicates the size of the molten solder 8g.
  • the target frame is updated to the frame 82 after the deviation between the state of the molten solder 8g on the actual image and the state of the molten solder 8g on the frame 81 becomes less than the threshold Thg. Then, after the molten solder 8g reaches a desired size, the solder feeder 8d can be immediately separated from the soldering iron 8c to maintain the size of the molten solder 8g.
  • thereafter, the target robots are controlled as follows.
  • the frame 83 is selected as the target frame, so that the soldering iron 8c is moved away from the pad 8f only after the solder feeder 8d has been sufficiently separated from the soldering iron 8c. Thereby, unintended contact between the soldering iron 8c and the solder feeder 8d can be avoided.
  • control system 1A solders the electric wire 8e to the pad 8f using the soldering iron 8c and the solder feeder 8d.
  • however, the objects are not limited to this example; the control system 1A may assemble another set of four objects.
  • FIG. 25 is a schematic diagram showing an object of the control system according to the first modification of the second embodiment.
  • the control system 1A according to the first modification joins the cylindrical members 10e and 10f with screws 10c and 10d in an industrial product production line or the like.
  • Screw holes 11a and 11b are formed in the cylindrical member 10e, and screw holes 12a and 12b are formed in the cylindrical member 10f. With the screw hole 11a and the screw hole 12a overlapping and the screw hole 11b and the screw hole 12b overlapping, the screw 10c is inserted into the screw holes 11a and 12a, and the screw 10d is inserted into the screw holes 11b and 12b.
  • the screw 10c is gripped by the hand 31c (see FIG. 20) of the robot 30c.
  • the screw 10d is gripped by the hand 31d of the robot 30d.
  • the cylindrical member 10e is gripped by the hand 31e of the robot 30e.
  • the cylindrical member 10f is mounted on a stage 31f of the robot 30f.
  • the imaging devices 21 and 22 are arranged at positions where the screw holes 11a and 12a and the screw 10c can be imaged. However, the imaging devices 21 and 22 cannot image the screw holes 11b and 12b and the screw 10d. Therefore, the control system 1A according to the first modification further includes imaging devices 23 and 24. The imaging devices 23 and 24 are arranged at positions where the screw holes 11b and 12b and the screw 10d can be imaged.
  • the control device 50A stores four reference moving images corresponding to the imaging devices 21 to 24, respectively.
  • FIG. 26 is a diagram illustrating an example of the reference moving image according to the first modification of the second embodiment.
  • FIG. 26 shows frames 84 to 86 of the reference moving image.
  • the frame 84 is a frame when the cylindrical member 10f reaches a desired position and posture.
  • the frame 85 is a frame when the screw hole 11a of the cylindrical member 10e overlaps the screw hole 12a of the cylindrical member 10f.
  • the frame 86 is a frame immediately before the screw 10c is inserted into the screw holes 11a and 12a.
  • the change information generation unit 56 of the first modification may generate four change information sets respectively corresponding to the four reference moving images. Then, the calculation unit 58 may calculate the control amount of each degree of freedom of the target robot based on the four change information sets.
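One plausible way for the calculation unit 58 to use the four change information sets together is to stack them into a single least-squares problem, as in the sketch below (an illustrative assumption; the patent does not prescribe this formulation):

```python
import numpy as np

def control_from_change_info(change_infos, image_deviations):
    """Estimate one robot's control amount from several change information sets.

    change_infos: list of matrices, one per imaging device; each maps the
        robot's control degrees of freedom to a state change on that image.
    image_deviations: list of vectors, one per imaging device; the desired
        state change (target frame state minus real image state).
    """
    A = np.vstack(change_infos)                # stack all imaging devices
    b = np.concatenate(image_deviations)
    u, *_ = np.linalg.lstsq(A, b, rcond=None)  # minimize ||A u - b||
    return u

# Example: two imaging devices observing a 3-DOF robot via 2-D image states.
J1 = np.array([[1.0, 0.0, 0.2], [0.0, 1.0, 0.1]])
J2 = np.array([[0.8, 0.1, 0.0], [0.1, 0.9, 0.3]])
print(control_from_change_info([J1, J2], [np.array([2.0, -1.0]),
                                          np.array([1.5, -0.5])]))
```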
  • each of the cylindrical members 10e and 10f may be further provided with two screw holes. In this case, the cylindrical members 10e and 10f are joined by four screws.
  • FIG. 27 is a diagram illustrating an example of the arrangement of four imaging devices.
  • the imaging device 21 is arranged at a position where two screws 10c and 10g can be imaged. That is, the screws 10c and 10g exist in the visual field range 21a of the imaging device 21.
  • the imaging device 22 is arranged at a position where the two screws 10c and 10h can be imaged. That is, the screws 10c and 10h exist in the visual field range 22a of the imaging device 22.
  • the imaging device 23 is arranged at a position where two screws 10d and 10g can be imaged. That is, the screws 10d and 10g exist in the visual field range 23a of the imaging device 23.
  • the imaging device 24 is arranged at a position where the two screws 10d and 10h can be imaged. That is, the screws 10d and 10h exist in the visual field range 24a of the imaging device 24.
  • FIG. 28 is a schematic diagram illustrating an object of the control system according to the second modification of the second embodiment.
  • the control system 1A according to the second modification uses a welding torch 13c and a welding rod 13d to weld two cylindrical members 13e and 13f to each other in an industrial product production line or the like.
  • the welding torch 13c is gripped by the hand 31c (see FIG. 20) of the robot 30c.
  • the welding rod 13d is gripped by the hand 31d of the robot 30d.
  • the cylindrical member 13e is gripped by the hand 31e of the robot 30e.
  • the cylindrical member 13f is mounted on a stage 31f of the robot 30f.
  • the target frame selection unit 54A may receive a designation of "no time difference" for a first related frame and a second related frame. In this case, the frames between the first related frame and the second related frame are not selected as target frames and are skipped. When the "no time difference" designation is received, the calculation unit 58 does not adjust the control amount.
  • the target frame selection unit 54A may receive designation of a plurality of continuous frames in the reference moving image as an operation group. For example, when the same operation is repeated a plurality of times, the operator designates the frame group corresponding to the operation as an operation group and also specifies the number of repetitions of the operation group.
  • the target frame selection unit 54A may then repeatedly select the frames included in the operation group as target frames, for the designated number of repetitions, as in the sketch below.
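A minimal sketch of expanding an operation group into a repeated sequence of target frame indices (the flat-list representation and the names are assumptions for illustration):

```python
def expand_operation_group(num_frames, group_start, group_end, repetitions):
    """Return the frame indices of the reference moving image with the
    operation group [group_start, group_end] repeated `repetitions` times."""
    before = list(range(group_start))
    group = list(range(group_start, group_end + 1))
    after = list(range(group_end + 1, num_frames))
    return before + group * repetitions + after

# A 10-frame reference video whose frames 3 to 5 are repeated 3 times.
print(expand_operation_group(10, 3, 5, 3))
# [0, 1, 2, 3, 4, 5, 3, 4, 5, 3, 4, 5, 6, 7, 8, 9]
```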
  • FIG. 29 is a schematic diagram illustrating an outline of a control system according to the third embodiment.
  • in the third embodiment, unlike the first and second embodiments, one of the plurality of objects is installed at a fixed position, and the states of the imaging devices 21 and 22 are changed by a robot.
  • the control system 1B sequentially processes the processing target portions 15, 16 of the large member 14j using the processing tools 14h, 14i in a production line of an industrial product or the like.
  • the large member 14j is, for example, a housing of a large device, an automobile body, or the like.
  • the processing tools 14h and 14i are, for example, a drill, an electric file, or the like.
  • the control system 1B differs from the control system 1 shown in FIG. 1 in that robots 30h, 30i, and 30j, robot controllers 40h, 40i, and 40j, and a control device 50B are provided instead of the robots 30a and 30b, the robot controllers 40a and 40b, and the control device 50.
  • the robot 30h is a mechanism for changing the state (here, position and posture) of the processing tool 14h, and is, for example, a vertical articulated robot.
  • the robot 30h has a hand 31h that supports the processing tool 14h at the tip, and changes the position and posture of the hand 31h with a plurality of degrees of freedom. Further, the robot 30h includes a pedestal 33h movable along the rail 34 in the direction of the arrow AR.
  • the robot 30i is a mechanism for changing the state (here, position and orientation) of the processing tool 14i, and is, for example, a vertical articulated robot.
  • the robot 30i has a hand 31i at its tip for supporting the processing tool 14i, and changes the position and posture of the hand 31i with a plurality of degrees of freedom. Further, the robot 30i includes a pedestal 33i movable along the rail 34 in the direction of the arrow AR.
  • the robot 30j is a mechanism for changing the states (here, positions and postures) of the imaging devices 21 and 22, and is, for example, a vertical articulated robot.
  • the robot 30j has a hand 31j that supports the imaging devices 21 and 22 at its tip, and changes the position and posture of the hand 31j with a plurality of degrees of freedom. Further, the robot 30j includes a pedestal 33j movable along the rail 34 in the direction of the arrow AR.
  • the pedestals 33h, 33i, and 33j move along the common rail 34.
  • alternatively, a rail may be provided for each of the pedestals 33h, 33i, and 33j, and each pedestal may move along its corresponding rail.
  • the robot controllers 40h, 40i, and 40j perform operation control of the robots 30h, 30i, and 30j, respectively, according to control commands received from the control device 50B.
  • the robot controllers 40h, 40i, and 40j change the states of the hands 31h, 31i, and 31j, respectively, and move the pedestals 33h, 33i, and 33j, respectively, according to a control command from the control device 50B.
  • the control device 50B has a hardware configuration as shown in FIG. 4, as in the first embodiment. Therefore, a detailed description of the hardware configuration of the control device 50B will be omitted.
  • FIG. 30 is a block diagram showing a functional configuration of the control device according to the third embodiment.
  • the control device 50B differs from the control device 50 shown in FIG. 5 in that a first control unit 55h, a second control unit 55i, and a third control unit 55j are provided instead of the first control unit 55a and the second control unit 55b.
  • the reference moving image storage unit 51 stores a first reference moving image and a second reference moving image showing samples of the processing tools 14h and 14i and the large member 14j.
  • the first reference moving image and the second reference moving image indicate, for example, the following first to third scenes in order.
  • the first scene is a scene in which the processing tools 14h and 14i process the processing target portion 15 in a state where the processing target portion 15 of the large member 14j is at a fixed position on the image.
  • the second scene is a scene in which the large member 14j moves on the image, and the processing target portion 16 of the large member 14j moves to a fixed position on the image.
  • the third scene is a scene in which the processing tools 14h and 14i process the processing target portion 16 in a state where the processing target portion 16 of the large member 14j is at a fixed position on the image.
  • the teaching range selection unit 52 selects a teaching range for each of the processing tool 14h, the processing tool 14i, and the large member 14j.
  • the image processing unit 53 detects the processing tool 14h, the processing tool 14i, and the large member 14j from the target image. Since the large member 14j is large, the field of view of each of the imaging devices 21 and 22 includes only a part of the large member 14j. Therefore, the image processing unit 53 detects a pattern formed on the surface of the large member 14j.
  • the first control unit 55h controls the robot 30h via the robot controller 40h to change the state of the processing tool 14h.
  • the second control unit 55i controls the robot 30i via the robot controller 40i to change the state of the processing tool 14i.
  • the third control unit 55j controls the robot 30j via the robot controller 40j to change the states of the imaging devices 21 and 22.
  • Each of the first control unit 55h, the second control unit 55i, and the third control unit 55j includes a change information generation unit 56, a change information storage unit 57, a calculation unit 58, a command unit 59, and an end determination unit 60, as in the first embodiment (see FIG. 7).
  • the target object is the processing tool 14h in the first control unit 55h, the processing tool 14i in the second control unit 55i, and the large member 14j in the third control unit 55j.
  • the target robot is the robot 30h in the first control unit 55h, the robot 30i in the second control unit 55i, and the robot 30j in the third control unit 55j.
  • the target robot controller is the robot controller 40h in the first controller 55h, the robot controller 40i in the second controller 55i, and the robot controller 40j in the third controller 55j.
  • the change information generation unit 56 of each of the first control unit 55h and the second control unit 55i generates first change information indicating the relationship between the unit control amount of the target robot and the change amount of the state of the target object on the real image captured by the imaging device 21.
  • when the robot 30j is not operating, the change information generation unit 56 of each of the first control unit 55h and the second control unit 55i also generates second change information indicating the relationship between the unit control amount of the target robot and the change amount of the state of the target object on the real image captured by the imaging device 22.
  • the change information generation unit 56 of the first control unit 55h generates the first change information and the second change information with the imaging devices 21 and 22 in the state where the processing target portion 15 of the large member 14j is at a fixed position on the image.
  • the change information generation unit 56 of the second control unit 55i generates the first change information and the second change information with the imaging devices 21 and 22 in the state where the processing target portion 16 of the large member 14j is at a fixed position on the image.
  • the state of the large member 14j itself is not changed by the robot 30j. However, since the states of the imaging devices 21 and 22 are changed by the robot 30j, the state of the large member 14j on the real images of the imaging devices 21 and 22 changes. Therefore, the change information generation unit 56 of the third control unit 55j generates first change information indicating the relationship between the unit control amount of the robot 30j and the change amount of the state of the large member 14j on the real image captured by the imaging device 21, and second change information indicating the relationship between the unit control amount of the robot 30j and the change amount of the state of the large member 14j on the real image captured by the imaging device 22.
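The change information generation described above can be pictured as a small calibration routine: command a unit control amount on one degree of freedom at a time and record how the detected state on each image changes. The sketch below is an illustrative assumption (`apply_unit_control` and `measure_state_on_image` stand in for the robot command path and the image processing unit):

```python
import numpy as np

def generate_change_info(apply_unit_control, measure_state_on_image, num_dof):
    """Build a change information matrix for one robot and one imaging device.

    apply_unit_control(k): commands a unit control amount on DOF k and waits
        until the motion has settled.
    measure_state_on_image(): returns the object's state on the image
        (e.g., detected feature-point coordinates) as a 1-D array.
    """
    columns = []
    for k in range(num_dof):
        before = measure_state_on_image()
        apply_unit_control(k)              # move only DOF k by a unit amount
        after = measure_state_on_image()
        columns.append(after - before)     # image-state change per unit control
    return np.stack(columns, axis=1)       # shape: (state_dim, num_dof)
```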
  • the processing contents of the calculation unit 58, the command unit 59, and the end determination unit 60 in the third control unit 55j are the same as those of the first control unit 55a and the second control unit 55b in the first embodiment.
  • the calculation unit 58 in each of the first control unit 55h and the second control unit 55i calculates the control amount of the target robot for bringing the state of the target object on the real image closer to the state of the target object on the target frame only when the following start condition is satisfied. Start condition: the deviation between the state of the large member 14j on the real image and the state of the large member 14j on the target frame is less than the threshold Thj.
  • the processing contents of the command unit 59 and the termination determination unit 60 in the first control unit 55h and the second control unit 55i are the same as those of the first control unit 55a and the second control unit 55b in the first embodiment.
  • the control device 50B controls the target robot such that the state of the target object changes in accordance with the first reference moving image and the second reference moving image according to the flowchart in FIG. 12 as in the first embodiment. .
  • the third control unit 55j performs the processing of the subroutine of step S46 shown in FIG. 12 according to the flowchart shown in FIG. 15, as in the first embodiment.
  • first control unit 55h and second control unit 55i perform the processing of the subroutine of step S46 shown in FIG. 12 according to the flowchart shown in FIG. 31.
  • FIG. 31 is a flowchart showing the flow of processing of the first control unit and the second control unit of the third embodiment. As shown in FIG. 31, the processing flow of the first control unit 55h and the second control unit 55i according to the third embodiment is different from the flowchart shown in FIG. 15 in that step S90 is provided. Therefore, only step S90 will be described.
  • in step S90, it is determined whether or not the deviation between the state of the large member 14j on the real image and the state of the large member 14j on the target frame is less than the threshold Thj. If the deviation is equal to or larger than the threshold Thj (NO in step S90), the process ends. If the deviation is less than the threshold Thj (YES in step S90), steps S61 to S65 are performed (see the sketch below).
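Expressed in code, the gate of step S90 might look like this (a sketch under assumed names; `compute_control_amount` corresponds to steps S61 to S65):

```python
import numpy as np

def gated_control_step(state_14j_real, state_14j_target, threshold_thj,
                       compute_control_amount, send_command):
    """Step S90 gate for the first and second control units (illustrative)."""
    deviation = np.linalg.norm(state_14j_real - state_14j_target)
    if deviation >= threshold_thj:
        # NO in S90: the imaging devices 21 and 22 are still moving,
        # so the processing tools' robots are left untouched.
        return
    # YES in S90: proceed with steps S61 to S65.
    send_command(compute_control_amount())
```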
  • when the deviation between the state of the large member 14j on the real image and the state of the large member 14j on the target frame is equal to or larger than the threshold Thj, the control device 50B therefore controls only the robot 30j.
  • when the deviation is less than the threshold Thj, the control device 50B controls each of the robots 30h to 30j.
  • the robots 30h to 30j are controlled as follows.
  • the robots 30h and 30i remain stopped until the imaging devices 21 and 22 have moved so that the processing target portion 15 of the large member 14j is at a fixed position on the image.
  • after that, the first control unit 55h and the second control unit 55i control the target robots so that the state of each target object on the real image approaches the state of that object on the target frame.
  • the processing tools 14h and 14i process the processing target portion 15.
  • the third control unit 55j controls the robot 30j such that the state of the large member 14j on the actual image approaches the state of the large member 14j on the target frame.
  • since the state of the large member 14j on the target frames is constant in this scene, the robot 30j hardly operates, and the states of the imaging devices 21 and 22 are substantially constant.
  • when the target frame is selected from the second scene, the robot 30j is controlled so that the state of the large member 14j on the real image changes, and the imaging devices 21 and 22 move.
  • the robots 30h and 30i are in a stopped state until the imaging devices 21 and 22 move so that the processing target portion 16 of the large member 14j is at a fixed position on the image.
  • after that, the first control unit 55h and the second control unit 55i control the target robots so that the state of each target object on the real image approaches the state of that object on the target frame.
  • the processing tools 14h and 14i process the processing target portion 16.
  • the third control unit 55j controls the robot 30j such that the state of the large member 14j on the actual image approaches the state of the large member 14j on the target frame.
  • since the state of the large member 14j on the target frames is constant in this scene, the robot 30j hardly operates, and the states of the imaging devices 21 and 22 are substantially constant.
  • FIG. 32 is a schematic diagram showing an outline of a part of a control system according to a modification of the third embodiment.
  • the robots 30h and 30i may be installed on a pedestal 33j included in the robot 30j. In this case, the robots 30h and 30i move integrally with the robot 30j.
  • the first to third embodiments and the modifications described above include the following disclosure.
  • (Configuration 1) A control system (1, 1A, 1B) including: first to Nth robots (30a to 30f, 30h to 30j); an imaging device (21 to 24) for imaging first to Nth objects (2a, 2b, 6a, 6b, 8c to 8f, 10c to 10f, 13c to 13f); and a control device (50, 50A, 50B) for controlling the first to Nth robots, N being an integer of 2 or more. The i-th robot (30a to 30f, 30h to 30j) changes the state of the i-th object, i being an integer of 1 to N-1. The Nth robot (30a to 30f, 30h to 30j) changes the state of one of the Nth object and the imaging device, and the other of the Nth object and the imaging device (21 to 24) is installed at a fixed position. The control device (50, 50A, 50B) acquires change information for each of the first to Nth objects, the change information corresponding to the j-th object indicating the relationship between the control amount of the j-th robot and the change amount of the state of the j-th object on an image of the imaging device, j being an integer of 1 to N. The control device performs: a first process of acquiring a real image captured by the imaging device (21 to 24); a second process of selecting a target frame from a reference moving image indicating a sample of the first to Nth objects; and a third process of controlling each of the first to Nth robots based on the real image and the target frame. In the third process, the control device (50, 50A, 50B) calculates, based on the change information corresponding to the j-th object, a control amount of the j-th robot (30a to 30f, 30h to 30j) for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controls the j-th robot according to the calculated control amount.
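Read as pseudocode, the third process of Configuration 1 amounts to one control cycle per robot; the sketch below is an illustrative rendering in Python (the least-squares inversion of the change information is one plausible choice, not the patent's stated method):

```python
import numpy as np

def third_process(real_states, target_states, change_infos, send_command):
    """One cycle of the third process for robots j = 0..N-1 (illustrative).

    real_states[j] / target_states[j]: the j-th object's state on the real
        image and on the target frame.
    change_infos[j]: change information matrix of the j-th robot.
    """
    for j, (real, target, J) in enumerate(zip(real_states, target_states,
                                              change_infos)):
        deviation = target - real
        # Control amount that best reproduces the required state change.
        u, *_ = np.linalg.lstsq(J, deviation, rcond=None)
        send_command(j, u)
```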
  • (Configuration 2) The control system (1, 1A, 1B) according to Configuration 1, wherein the control device (50, 50A, 50B) updates the target frame after the deviation between the state of at least one of the first to Nth objects on the real image and the state of the at least one object on the target frame becomes less than a threshold.
  • (Configuration 3) The control system (1, 1A, 1B) according to Configuration 1, wherein the control device (50, 50A, 50B) selects a first target frame from the reference moving image and then selects a second target frame from the plurality of frames, and controls the first robot and the second robot such that a first time, at which the deviation between the state of the first object on the real image and the state of the first object on the first target frame becomes less than a first threshold, and a second time, at which the deviation between the state of the second object on the real image and the state of the second object on the second target frame becomes less than a second threshold, satisfy a prescribed condition.
  • the imaging devices (21 to 24) image the (N + 1) th object (8g) together with the first to Nth objects,
  • the reference moving image includes the (N + 1) th object,
  • the control device (50A) sets the target frame after the deviation between the state of the (N + 1) th object on the real image and the state of the (N + 1) th object on the target frame becomes less than a threshold value.
  • the control system (1A) according to Configuration 1, which is updated.
  • the imaging devices (21 to 24) image the (N + 1) th object (8g) together with the first to Nth objects,
  • the reference moving image includes the (N + 1) th object,
  • the control device (50A) selects a first target frame from the reference moving image, and then selects a second target frame from the plurality of frames,
  • the control device (50A) includes a first time when a deviation between the state of the (N + 1) th object on the real image and the state of the (N + 1) th object on the first target frame is less than a first threshold. And a second time at which the deviation between the state of the first object on the real image and the state of the first object on the second target frame is less than a second threshold satisfies a prescribed condition.
  • the control system (1A) according to Configuration 1, which controls the first robot (30d) as described above.
  • (Configuration 6) The control system (1, 1A) according to Configuration 1, wherein the control device (50, 50A) determines that an abnormality has occurred in the control system when the deviation between the state of at least one of the first to Nth objects on the real image and the state of the at least one object on the target frame does not become less than a threshold during the period from when the target frame is selected until a specified time has elapsed.
  • (Configuration 7) The control system (1B) according to Configuration 1, wherein the Nth robot (30j) changes the states of the imaging devices (21, 22), and the control device (50B) controls only the Nth robot in the third process when the deviation between the state of the Nth object on the real image and the state of the Nth object on the target frame exceeds a threshold, and controls each of the first to Nth robots (30h to 30j) in the third process when the deviation is less than the threshold.
  • (Configuration 8) The control system (1, 1A, 1B) according to Configuration 1, wherein the control device (50, 50A, 50B) repeatedly executes a series of processes including the first to third processes, and starts the first process of the next series of processes while performing the third process.
  • (Configuration 9) The control system (1, 1A, 1B) according to Configuration 1, wherein, in the second process, the control device (50, 50A, 50B) selects, as the target frame, a frame included in a prediction horizon period of the reference moving image, and, in the third process, the control device (50, 50A) calculates the control amount of the j-th robot during a control horizon period so as to minimize the deviation between the state of the j-th object on the frames of the reference moving image included in the prediction horizon period and the state of the j-th object on the images of the imaging device during the prediction horizon period.
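Configuration 9 is, in effect, a model predictive control formulation. The following sketch minimizes the summed squared deviation between the reference frames in the prediction horizon and the predicted image states over the control amounts of the control horizon; the linear image-change model built from the change information, the solver, and all names are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def mpc_control_amounts(x0, ref_states, J, control_horizon):
    """Control amounts over a control horizon (illustrative MPC sketch).

    x0: current state of the object on the image (1-D array).
    ref_states: reference-frame states over the prediction horizon.
    J: change information matrix (image-state change per unit control).
    """
    dim_u = J.shape[1]

    def cost(u_flat):
        u = u_flat.reshape(control_horizon, dim_u)
        x, total = x0.astype(float), 0.0
        for t, ref in enumerate(ref_states):
            if t < control_horizon:
                x = x + J @ u[t]         # linear prediction via change info
            total += float(np.sum((ref - x) ** 2))
        return total

    result = minimize(cost, np.zeros(control_horizon * dim_u))
    return result.x.reshape(control_horizon, dim_u)
```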
  • (Configuration 10) A control method for controlling first to Nth robots (30a to 30f, 30h to 30j) using an imaging device (21 to 24) for imaging first to Nth objects (2a, 2b, 6a, 6b, 8c to 8f, 10c to 10f, 13c to 13f), N being an integer of 2 or more, wherein the i-th robot (30a to 30f, 30h to 30j) changes the state of the i-th object, i being an integer of 1 to N-1, the Nth robot (30a to 30f, 30h to 30j) changes the state of one of the Nth object and the imaging device (21 to 24), and the other of the Nth object and the imaging device (21 to 24) is installed at a fixed position. The control method includes a first step of acquiring change information for each of the first to Nth objects, the change information corresponding to the j-th object indicating the relationship between the control amount of the j-th robot and the change amount of the state of the j-th object on an image of the imaging device, j being an integer of 1 to N. The control method further includes: a second step of acquiring a real image captured by the imaging device (21 to 24); a third step of selecting a target frame from a reference moving image indicating a sample of the first to Nth objects; and a fourth step of controlling each of the first to Nth robots based on the real image and the target frame, the fourth step including calculating, based on the change information corresponding to the j-th object, a control amount of the j-th robot for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controlling the j-th robot according to the calculated control amount.
  • (Configuration 11) A program for causing a computer to execute the control method according to Configuration 10.
  • 1, 1A, 1B control system, 2a male connector, 2b female connector, 3 wire, 4a, 4g feature point, 5, 9 board, 6a upper case, 6b lower case, 7a, 7b engaging claw, 8c soldering iron, 8d solder feeder, 8e electric wire, 8f pad, 8g molten solder, 10c, 10d, 10g, 10h screw, 10e, 10f, 13e, 13f cylindrical member, 11a, 11b, 12a, 12b screw hole, 13c welding torch, 13d welding rod, 14h, 14i processing tool, 14j large member, 15, 16 processing target portion, 21-24 imaging device, 21a-24a field of view, 30a-30f, 30h-30j robot, 31a, 31c-31e, 31h-31j hand, 31b, 31f stage, 32a to 32d control rod, 33h, 33i, 33j pedestal, 34 rail, 40a to 40f, 40h to 40j robot controller, 50, 50A, 50B control device, 51 reference moving image storage unit, 52 teaching range selection unit, 53 image processing unit, 54, 54A target frame selection unit, 55a-55j control unit, 56 change information generation unit, 57 change information storage unit, 58 calculation unit, 59 command unit, 60 end determination unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Manufacturing Of Electrical Connectors (AREA)

Abstract

A control device performs: a first processing that obtains an actual image captured by an imaging device; a second processing that selects a target frame from a reference video; and a third processing that controls each of 1st to Nth robots on the basis of the actual image and the target frame. In the third processing, the control device calculates, on the basis of change information corresponding to a jth target object, a control amount for the jth robot for bringing the state of the jth target object in the actual image towards the state of the jth target object in the target frame, and controls the jth robot in accordance with the calculated control amount. As a result, a control system can be achieved that is capable of changing the states of a plurality of target objects in a coordinated manner.

Description

Control system, control method, and program

The present technology relates to a control system, a control method, and a program for controlling a robot.

"Atsushi Murata and three others, 'Development of a High-Precision Positioning Method Using Visual Servoing', SEI Technical Review, July 2009, No. 175, pp. 98-102" (Non-Patent Document 1) discloses a technique for guiding a robot arm to a target position using visual servoing. In the system disclosed in Non-Patent Document 1, a camera attached to the robot arm captures an image of a connector, and a terminal supported by the robot hand is inserted into the connector based on the camera image.

The technique described in Non-Patent Document 1 is based on the premise that the relative position between the terminal supported by the robot hand and the camera is constant. Therefore, if the relative positional relationship between the terminal supported by the robot hand and the camera deviates, the coordination between the state of the terminal and the state of the connector is lost, and the terminal may not be inserted into the connector.

The present invention has been made in view of the above problem, and an object thereof is to provide a control system, a control method, and a program in which the states of a plurality of objects can be changed in a coordinated manner.
According to an example of the present disclosure, a control system includes first to Nth robots, an imaging device for imaging first to Nth objects, and a control device for controlling the first to Nth robots. N is an integer of 2 or more. The i-th robot changes the state of the i-th object, where i is an integer of 1 to N-1. The Nth robot changes the state of one of the Nth object and the imaging device. The other of the Nth object and the imaging device is installed at a fixed position. The control device acquires change information for each of the first to Nth objects. The change information corresponding to the j-th object indicates the relationship between the control amount of the j-th robot and the change amount of the state of the j-th object on an image of the imaging device, where j is an integer of 1 to N. The control device performs a first process of acquiring a real image captured by the imaging device, a second process of selecting a target frame from a reference moving image indicating a sample of the first to Nth objects, and a third process of controlling each of the first to Nth robots based on the real image and the target frame. In the third process, the control device calculates, based on the change information corresponding to the j-th object, a control amount of the j-th robot for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controls the j-th robot according to the calculated control amount.

According to this disclosure, the state of each object on the real image can be changed to the state of the object on the target frame based on the real image, the target frame, and the change information. As a result, the states of the first to Nth objects change in a coordinated manner according to the reference moving image.

In the above disclosure, the control device updates the target frame after the deviation between the state of at least one of the first to Nth objects on the real image and the state of the at least one object on the target frame becomes less than a threshold. According to this disclosure, the state of the object can be reliably changed to the state on the target frame.

In the above disclosure, the control device selects a first target frame from the reference moving image and then selects a second target frame from the plurality of frames. The control device controls the first robot and the second robot such that a first time, at which the deviation between the state of the first object on the real image and the state of the first object on the first target frame becomes less than a first threshold, and a second time, at which the deviation between the state of the second object on the real image and the state of the second object on the second target frame becomes less than a second threshold, satisfy a prescribed condition.

According to this disclosure, the states of the first object and the second object can be changed such that the time difference between the time when the first object reaches the state of the first target frame and the time when the second object reaches the state of the second target frame becomes a desired time.

In the above disclosure, the imaging device images an (N+1)th object together with the first to Nth objects. The reference moving image includes the (N+1)th object. The control device selects a first target frame from the reference moving image and then selects a second target frame from the plurality of frames. The control device controls the first robot such that a first time, at which the deviation between the state of the (N+1)th object on the real image and the state of the (N+1)th object on the first target frame becomes less than a first threshold, and a second time, at which the deviation between the state of the first object on the real image and the state of the first object on the second target frame becomes less than a second threshold, satisfy a prescribed condition.

According to this disclosure, the state of the first object can be changed such that the time difference between the time when the (N+1)th object reaches the state of the first target frame and the time when the first object reaches the state of the second target frame becomes a desired time.
In the above disclosure, the control device determines that an abnormality has occurred in the control system when the deviation between the state of at least one of the first to Nth objects on the real image and the state of the at least one object on the target frame does not become less than a threshold during the period from when the target frame is selected until a specified time has elapsed. According to this disclosure, countermeasures against an abnormality of the control system can be started quickly.

In the above disclosure, the Nth robot changes the state of the imaging device. When the deviation between the state of the Nth object on the real image and the state of the Nth object on the target frame exceeds a threshold, the control device controls only the Nth robot in the third process; when the deviation is less than the threshold, the control device controls each of the first to Nth robots in the third process. According to this disclosure, after the state of the imaging device has been stabilized by the control of the Nth robot, the states of the first to (N-1)th objects and the Nth object can be changed in a coordinated manner according to the reference moving image by the control of the first to (N-1)th robots.

In the above disclosure, the control device repeatedly executes a series of processes including the first to third processes, and starts the first process of the next series of processes while performing the third process. According to this disclosure, the target robot continues to be controlled according to the latest real image without stopping its operation. As a result, the state of the object can be changed quickly.

In the above disclosure, in the second process, the control device selects, as the target frame, a frame included in a prediction horizon period of the reference moving image. In the third process, the control device calculates the control amount of the j-th robot during a control horizon period so as to minimize the deviation between the state of the j-th object on the frames of the reference moving image included in the prediction horizon period and the state of the j-th object on the images of the imaging device during the prediction horizon period. According to this disclosure, the change in the state of the object can be made closer to the reference moving image.

In the above disclosure, the state indicates at least one of the position, posture, shape, and size of an object. According to this disclosure, the position, posture, shape, and size of the first object can be changed to the position, posture, shape, and size of the first object on the reference moving image, respectively.

According to an example of the present disclosure, a control method controls first to Nth robots using an imaging device for imaging first to Nth objects. N is an integer of 2 or more. The i-th robot changes the state of the i-th object, where i is an integer of 1 to N-1. The Nth robot changes the state of one of the Nth object and the imaging device. The other of the Nth object and the imaging device is installed at a fixed position. The control method includes a first step of acquiring change information for each of the first to Nth objects. The change information corresponding to the j-th object indicates the relationship between the control amount of the j-th robot and the change amount of the state of the j-th object on an image of the imaging device, where j is an integer of 1 to N. The control method further includes a second step of acquiring a real image captured by the imaging device, a third step of selecting a target frame from a reference moving image indicating a sample of the first to Nth objects, and a fourth step of controlling each of the first to Nth robots based on the real image and the target frame. The fourth step includes calculating, based on the change information corresponding to the j-th object, a control amount of the j-th robot for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controlling the j-th robot according to the calculated control amount.

According to an example of the present disclosure, a program causes a computer to execute the above control method. According to these disclosures as well, the states of a plurality of objects change in a coordinated manner.

According to the present invention, the states of a plurality of objects change in a coordinated manner.
FIG. 1 is a schematic diagram illustrating an outline of the control system according to the first embodiment.
FIG. 2 is a diagram illustrating an example of a real image captured by the imaging device 21 and the first reference moving image.
FIG. 3 is a diagram illustrating an example of a real image captured by the imaging device 22 and the second reference moving image.
FIG. 4 is a schematic diagram illustrating a hardware configuration of the control device included in the control system according to the first embodiment.
FIG. 5 is a block diagram illustrating a functional configuration of the control device according to the first embodiment.
FIG. 6 is a diagram illustrating an example of a method for creating a template.
FIG. 7 is a block diagram illustrating functional configurations of the first control unit and the second control unit in the first embodiment.
FIG. 8 is a diagram illustrating a method of generating a first change information set in the first control unit.
FIG. 9 is a diagram illustrating a method of calculating a control amount by the calculation unit of the first control unit.
FIG. 10 is a flowchart illustrating an example of the flow of change information generation processing by the change information generation unit.
FIG. 11 is a flowchart illustrating the flow of the subroutine of step S2 shown in FIG. 10.
FIG. 12 is a flowchart illustrating an example of the flow of processing for controlling the target robots to change the states of the objects along the reference moving image in the first embodiment.
FIG. 13 is a flowchart illustrating the flow of the subroutine of step S44 shown in FIG. 12.
FIG. 14 is a diagram illustrating the relationship between the closest frame and the target frame.
FIG. 15 is a flowchart illustrating the flow of the subroutine of step S46 shown in FIG. 12.
FIG. 16 is a diagram illustrating another example of a reference moving image serving as a sample of the connection between the male connector and the female connector.
FIG. 17 is a flowchart illustrating an example of the flow of target frame selection processing in Modification 1 of the first embodiment.
FIG. 18 is a flowchart illustrating an example of the flow of abnormality determination processing.
FIG. 19 is a schematic diagram illustrating objects of the control system according to Modification 2 of the first embodiment.
FIG. 20 is a schematic diagram illustrating an outline of the control system according to the second embodiment.
FIG. 21 is a diagram illustrating an example of the reference moving image in the second embodiment.
FIG. 22 is a block diagram illustrating a functional configuration of the control device according to the second embodiment.
FIG. 23 is a diagram illustrating an example of a screen for designating required-passage frames and related frames.
FIG. 24 is a flowchart illustrating an example of the flow of target frame selection processing in the second embodiment.
FIG. 25 is a schematic diagram illustrating objects of the control system according to Modification 1 of the second embodiment.
FIG. 26 is a diagram illustrating an example of the reference moving image in Modification 1 of the second embodiment.
FIG. 27 is a diagram illustrating an example of the arrangement of four imaging devices.
FIG. 28 is a schematic diagram illustrating objects of the control system according to Modification 2 of the second embodiment.
FIG. 29 is a schematic diagram illustrating an outline of the control system according to the third embodiment.
FIG. 30 is a block diagram illustrating a functional configuration of the control device according to the third embodiment.
FIG. 31 is a flowchart illustrating the flow of processing of the first control unit and the second control unit of the third embodiment.
FIG. 32 is a schematic diagram illustrating an outline of a part of the control system according to a modification of the third embodiment.
Hereinafter, embodiments of the present invention will be described with reference to the drawings. In the following description, the same parts and components are denoted by the same reference numerals. Their names and functions are also the same. Therefore, detailed description thereof will not be repeated.
≪Embodiment 1≫
<A1. Application Example>
First, an example of a scene to which the present invention is applied will be described with reference to FIG. 1. FIG. 1 is a schematic diagram illustrating an outline of the control system according to the first embodiment.
The control system 1 according to the present embodiment connects a male connector 2a and a female connector 2b by inserting the male connector 2a into the female connector 2b in, for example, an industrial product production line.

As shown in FIG. 1, the control system 1 includes imaging devices 21 and 22, robots 30a and 30b, robot controllers 40a and 40b, and a control device 50.

The imaging devices 21 and 22 capture an image of a subject present in the imaging field of view and generate image data (hereinafter simply referred to as an "image"). The imaging devices 21 and 22 are installed at fixed positions distinct from the robots 30a and 30b. The imaging devices 21 and 22 are installed at different places and image the male connector 2a and the female connector 2b as subjects from different directions. The imaging devices 21 and 22 capture images according to a predetermined imaging cycle and output the real images obtained by the imaging to the control device 50.

The robot 30a is a mechanism for changing the state (here, position and posture) of the male connector 2a, and is, for example, a vertical articulated robot. The robot 30a has a hand 31a at its tip for supporting (here, gripping) the male connector 2a, and changes the position and posture of the hand 31a with six degrees of freedom. In other words, the robot 30a changes the position and posture of the male connector 2a gripped by the hand 31a with six degrees of freedom. The six degrees of freedom include translational degrees of freedom in the X, Y, and Z directions and rotational degrees of freedom in the pitch, yaw, and roll directions. However, the number of degrees of freedom of the hand 31a is not limited to six, and may be three to five, or seven or more.

The robot 30a has a plurality of servo motors, and changes the position and posture of the male connector 2a by driving the servo motors. An encoder is provided for each of the servo motors, and the position of each servo motor is measured.

The robot 30b is a mechanism for changing the state (here, position and posture) of the female connector 2b, and is, for example, an XYθ stage. The robot 30b has a stage 31b for supporting (here, carrying) the female connector 2b, and changes the position and posture of the stage 31b with three degrees of freedom. That is, the robot 30b changes the position and posture of the female connector 2b placed on the stage 31b with three degrees of freedom. The three degrees of freedom include translational degrees of freedom in the X and Y directions and a rotational degree of freedom in the rotational direction (θ direction) about an axis orthogonal to the XY plane. However, the number of degrees of freedom of the robot 30b is not limited to three, and may be four or more.

The robot 30b has a plurality of servo motors, and changes the position and posture of the female connector 2b by driving the servo motors. An encoder is provided for each of the servo motors, and the position of each servo motor is measured.
 ロボットコントローラ40aは、制御装置50から受けた制御指令に従って、ロボット30aの動作制御を行なう。ロボットコントローラ40aは、X方向、Y方向およびZ方向の並進自由度ならびにピッチ方向、ヨー方向およびロール方向の回転自由度の各々の制御指令を制御装置50から受ける。これらX方向、Y方向、Z方向、ピッチ方向、ヨー方向およびロール方向は、ロボット30aの座標系で示される。ロボットコントローラ40aは、ハンド31aのX方向、Y方向およびZ方向の並進移動量がX方向、Y方向およびZ方向の並進自由度の制御指令にそれぞれ近づくように、ロボット30aに対してフィードバック制御を行なう。ロボットコントローラ40aは、ハンド31aのピッチ方向、ヨー方向およびロール方向の回転移動量がピッチ方向、ヨー方向およびロール方向の回転自由度の制御指令にそれぞれ近づくように、ロボット30aに対してフィードバック制御を行なう。 The robot controller 40a controls the operation of the robot 30a according to the control command received from the control device 50. The robot controller 40a receives from the control device 50 control commands for translational degrees of freedom in the X, Y, and Z directions and rotational degrees of freedom in the pitch, yaw, and roll directions. These X direction, Y direction, Z direction, pitch direction, yaw direction, and roll direction are indicated by the coordinate system of the robot 30a. The robot controller 40a performs feedback control on the robot 30a so that the translation amounts of the hand 31a in the X, Y, and Z directions approach control commands for the degrees of freedom of translation in the X, Y, and Z directions, respectively. Do. The robot controller 40a performs feedback control on the robot 30a so that the rotational movement amounts of the hand 31a in the pitch direction, the yaw direction, and the roll direction approach the control commands for the rotational degrees of freedom in the pitch, yaw, and roll directions, respectively. Do.
 ロボットコントローラ40bは、制御装置50から受けた制御指令に従って、ロボット30bの動作制御を行なう。ロボットコントローラ40bは、X方向およびY方向の並進自由度ならびに回転自由度の各々の制御指令を制御装置50から受ける。これらX方向、Y方向および回転方向は、ロボット30bの座標系で示される。ロボットコントローラ40bは、ステージ31bのX方向およびY方向の並進移動量がX方向およびY方向の並進自由度の制御指令にそれぞれ近づくように、ロボット30bに対してフィードバック制御を行なう。ロボットコントローラ40bは、ステージ31bの回転移動量が回転自由度の制御指令に近づくように、ロボット30bに対してフィードバック制御を行なう。 The robot controller 40b controls the operation of the robot 30b according to the control command received from the control device 50. The robot controller 40b receives from the control device 50 control commands for the degrees of freedom of translation and rotation in the X and Y directions. These X direction, Y direction, and rotation direction are indicated by the coordinate system of the robot 30b. The robot controller 40b performs feedback control on the robot 30b such that the translation amounts of the stage 31b in the X and Y directions approach control commands for the degrees of freedom of translation in the X and Y directions, respectively. The robot controller 40b performs feedback control on the robot 30b such that the rotational movement amount of the stage 31b approaches the control command of the rotational degree of freedom.
 制御装置50は、ロボットコントローラ40a,40bを介して、ロボット30a,30bをそれぞれ制御する。 The control device 50 controls the robots 30a and 30b via the robot controllers 40a and 40b, respectively.
 制御装置50は、オスコネクタ2aおよびメスコネクタ2bの見本を示す、第1基準動画および第2基準動画を記憶している。第1基準動画は、撮像装置21の位置から見たときの動画である。第2基準動画は、撮像装置22の位置から見たときの動画である。 The control device 50 stores a first reference moving image and a second reference moving image, each of which represents a sample of the male connector 2a and the female connector 2b. The first reference moving image is a moving image when viewed from the position of the imaging device 21. The second reference moving image is a moving image when viewed from the position of the imaging device 22.
Each of the first reference moving image and the second reference moving image includes a plurality of frames (hereinafter, M frames, where M is an integer of 2 or more) arranged in time series. The k-th frame of the first reference moving image and the k-th frame of the second reference moving image (k is an integer from 1 to M) are images of the male connector 2a and the female connector 2b in a certain state, viewed simultaneously from different directions.
The control device 50 acquires change information indicating the relationship between the control amount of the robot 30a and the amount of change in the state of the male connector 2a on the real images of the imaging devices 21 and 22. The control device 50 further acquires change information indicating the relationship between the control amount of the robot 30b and the amount of change in the state of the female connector 2b on the real images of the imaging devices 21 and 22.
The control device 50 performs the first to third processes described below, and repeatedly executes a series of processes including the first to third processes. The first process is a process of acquiring the real images captured by the imaging devices 21 and 22.
The second process is a process of selecting a target frame from each of the first reference moving image and the second reference moving image. When the control device 50 selects the k-th frame of the first reference moving image as the target frame, it also selects the k-th frame of the second reference moving image as the target frame.
The third process is a process of controlling each of the robots 30a and 30b based on the real images and the target frames. Based on the change information corresponding to the robot 30a, the control device 50 calculates a control amount of the robot 30a for bringing the state of the male connector 2a on the real images closer to the state of the male connector 2a on the target frames, and controls the robot 30a according to the calculated control amount. The control device 50 generates a control command indicating the control amount of the robot 30a and outputs the generated control command to the robot controller 40a. Further, based on the change information corresponding to the robot 30b, the control device 50 calculates a control amount of the robot 30b for bringing the state of the female connector 2b on the real images closer to the state of the female connector 2b on the target frames, and controls the robot 30b according to the calculated control amount. The control device 50 generates a control command indicating the control amount of the robot 30b and outputs the generated control command to the robot controller 40b.
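For illustration only, the following Python sketch outlines this repeated series of first to third processes; every helper function, name, and value below is a hypothetical stand-in introduced for explanation, not part of the disclosed implementation of the control device 50.

def acquire_real_images():
    # First process: acquire the real images captured by the imaging devices 21 and 22 (stub).
    return {"device21": None, "device22": None}

def select_target_frames(k):
    # Second process: the k-th frame is selected from both reference moving images.
    return {"video1": k, "video2": k}

def compute_control_amounts(robot, images, targets):
    # Third process: use the change information of this robot to bring the object's state
    # on the real images closer to its state on the target frames (stub control amounts).
    return [0.0] * (6 if robot == "30a" else 3)

def one_control_cycle(k):
    images = acquire_real_images()        # first process
    targets = select_target_frames(k)     # second process
    for robot in ("30a", "30b"):          # third process, performed for each robot
        amounts = compute_control_amounts(robot, images, targets)
        print(f"robot {robot}: control command {amounts}")

one_control_cycle(k=1)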
FIG. 2 is a diagram showing an example of real images captured by the imaging device 21 and the first reference moving image. FIG. 3 is a diagram showing an example of real images captured by the imaging device 22 and the second reference moving image. FIG. 2 shows real images 90a to 93a captured by the imaging device 21 and frames 70a to 73a of the first reference moving image. FIG. 3 shows real images 90b to 93b captured by the imaging device 22 and frames 70b to 73b of the second reference moving image.
The real images 90a and 90b are images captured at the same time. The real images 91a and 91b are images captured at the same time, after the real images 90a and 90b. The real images 92a and 92b are images captured at the same time, after the real images 91a and 91b. The real images 93a and 93b are images captured at the same time, after the real images 92a and 92b.
Each of the frames 70a and 70b is the first frame of the corresponding reference moving image. Each of the frames 71a and 71b is the s-th frame (s is an integer of 2 or more) of the corresponding reference moving image. Each of the frames 72a and 72b is the t-th frame (t is an integer greater than s) of the corresponding reference moving image. Each of the frames 73a and 73b is the u-th frame (u is an integer greater than t) of the corresponding reference moving image.
The first reference moving image and the second reference moving image show how, after the female connector 2b placed on the stage 31b moves to a desired state, the male connector 2a held by the hand 31a moves downward from above the female connector 2b and connects with the female connector 2b.
The control device 50 acquires, from the imaging devices 21 and 22, the real images 90a and 90b, respectively, each including the female connector 2b placed on the stage 31b and the male connector 2a held by the hand 31a.
The control device 50 selects, as target frames from the first reference moving image and the second reference moving image, the frames 71a and 71b, respectively, at which the female connector 2b has moved to the desired position and posture.
Based on the change information corresponding to the female connector 2b, the control device 50 calculates a control amount of the stage 31b for bringing the state of the female connector 2b on the real images 90a and 90b closer to the state of the female connector 2b on the frames 71a and 71b, respectively. The control device 50 then outputs a control command indicating the calculated control amount to the robot controller 40b. The robot controller 40b controls the robot 30b according to the control command. As a result, as shown in the real images 91a and 91b, the position and posture of the female connector 2b change to the desired position and posture (the position and posture shown in the frames 71a and 71b).
Further, in parallel with the control of the robot 30b, the control device 50 calculates a control amount of the hand 31a for bringing the state of the male connector 2a on the real images 90a and 90b closer to the state of the male connector 2a on the frames 71a and 71b. The control device 50 then outputs a control command indicating the calculated control amount to the robot controller 40a. The robot controller 40a controls the robot 30a according to the control command. As a result, as shown in the real images 91a and 91b, the state of the male connector 2a changes to the position and posture above the female connector 2b (the position and posture shown in the frames 71a and 71b).
Next, the control device 50 selects, as target frames, the frames 72a and 72b, at which the male connector 2a has moved to the position immediately above the female connector 2b.
Based on the change information corresponding to the male connector 2a, the control device 50 calculates a control amount of the hand 31a for bringing the state of the male connector 2a on the real images 91a and 91b closer to the state of the male connector 2a on the frames 72a and 72b, respectively. The control device 50 then outputs a control command indicating the calculated control amount to the robot controller 40a. The robot controller 40a controls the robot 30a according to the control command. As a result, as shown in the real images 92a and 92b, the position and posture of the male connector 2a change to the position and posture immediately above the female connector 2b (the position and posture shown in the frames 72a and 72b).
Next, the control device 50 selects, as target frames, the frames 73a and 73b, at which the connection between the male connector 2a and the female connector 2b is completed.
Based on the change information corresponding to the male connector 2a, the control device 50 calculates a control amount of the hand 31a for bringing the state of the male connector 2a on the real images 92a and 92b closer to the state of the male connector 2a on the frames 73a and 73b. The control device 50 then outputs a control command indicating the calculated control amount to the robot controller 40a. The robot controller 40a controls the robot 30a according to the control command. As a result, as shown in the real images 93a and 93b, the male connector 2a moves to the position and posture at which its connection to the female connector 2b is completed (the position and posture shown in the frames 73a and 73b).
In this way, the control device 50 can change the male connector 2a and the female connector 2b from their states on the real images to their states on the target frames. As a result, the states of the male connector 2a and the female connector 2b change in a coordinated manner according to the first reference moving image and the second reference moving image.
The control device 50 can control the robots 30a and 30b without using calibration data that associates the coordinate systems of the imaging devices 21 and 22 with those of the robots 30a and 30b. Therefore, the operator does not need to perform calibration in advance. Furthermore, the operator does not need to design in advance an operation program of the robot 30a for changing the state of the male connector 2a to a desired state or an operation program of the robot 30b for changing the state of the female connector 2b to a desired state. Therefore, the labor required for changing the states of the male connector 2a and the female connector 2b to the desired states by the robots 30a and 30b can be reduced.
When calibration data is available, a method is conceivable in which the positions and postures of the male connector 2a and the female connector 2b in real space are identified from the images captured by the imaging devices 21 and 22, and the robots 30a and 30b are controlled based on the identified positions and postures. However, the accuracy of the calibration data degrades as the robots 30a and 30b age, so the male connector 2a and the female connector 2b may fail to connect properly. Furthermore, the male connector 2a and the female connector 2b may fail to connect properly due to positional deviations or individual differences of the connectors. Even in such cases, by using this application example, the male connector 2a and the female connector 2b can be connected as shown in the reference moving images.
<B1. Specific example>
Next, a specific example of the control system according to the first embodiment will be described.
<B1-1. Hardware configuration of control device>
FIG. 4 is a schematic diagram showing a hardware configuration of the control device included in the control system according to the first embodiment. As shown in FIG. 4, the control device 50 has a structure conforming to a general computer architecture, and a processor executes programs installed in advance to realize the various kinds of processing described later.
More specifically, the control device 50 includes a processor 510 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 512, a display controller 514, a system controller 516, an I/O (Input/Output) controller 518, a hard disk 520, a camera interface 522, an input interface 524, a robot controller interface 526, a communication interface 528, and a memory card interface 530. These units are connected to one another, with the system controller 516 at the center, so as to be capable of data communication.
The processor 510 exchanges programs (code) and the like with the system controller 516 and executes them in a predetermined order, thereby realizing the intended arithmetic processing.
The system controller 516 is connected to the processor 510, the RAM 512, the display controller 514, and the I/O controller 518 via respective buses, exchanges data with each of these units, and governs the processing of the control device 50 as a whole.
The RAM 512 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds programs read from the hard disk 520, images (image data) acquired by the imaging devices 21 and 22, processing results for the images, work data, and the like.
The display controller 514 is connected to the display unit 532, and outputs signals for displaying various kinds of information to the display unit 532 in accordance with internal commands from the system controller 516.
The I/O controller 518 controls data exchange with recording media and external devices connected to the control device 50. More specifically, the I/O controller 518 is connected to the hard disk 520, the camera interface 522, the input interface 524, the robot controller interface 526, the communication interface 528, and the memory card interface 530.
The hard disk 520 is typically a nonvolatile magnetic storage device, and stores various kinds of information in addition to the control program 550 executed by the processor 510. The control program 550 to be installed on the hard disk 520 is distributed while being stored in a memory card 536 or the like. Instead of the hard disk 520, a semiconductor storage device such as a flash memory or an optical storage device such as a DVD-RAM (Digital Versatile Disk Random Access Memory) may be employed.
The camera interface 522 corresponds to an input unit that receives image data from the imaging devices 21 and 22, and mediates data transmission between the processor 510 and the imaging devices 21 and 22. The camera interface 522 includes image buffers 522a and 522b for temporarily storing the image data from the imaging devices 21 and 22, respectively. A single image buffer shared among the imaging devices may be provided; however, to speed up processing, it is preferable to arrange a plurality of buffers independently, each associated with one of the imaging devices.
The input interface 524 mediates data transmission between the processor 510 and an input device 534 such as a keyboard, a mouse, a touch panel, or a dedicated console.
The robot controller interface 526 mediates data transmission between the processor 510 and the robot controllers 40a and 40b.
The communication interface 528 mediates data transmission between the processor 510 and other personal computers, server devices, and the like (not shown). The communication interface 528 typically conforms to Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
The memory card interface 530 mediates data transmission between the processor 510 and the memory card 536, which is a recording medium. The memory card 536 is distributed while storing the control program 550 and the like to be executed by the control device 50, and the memory card interface 530 reads the control program 550 from the memory card 536. The memory card 536 is a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, an optical recording medium such as a CD-ROM (Compact Disk-Read Only Memory), or the like. Alternatively, a program downloaded from a distribution server or the like may be installed in the control device 50 via the communication interface 528.
When a computer having a structure conforming to the general-purpose computer architecture described above is used, an OS (Operating System) for providing the basic functions of the computer may be installed in addition to the application for providing the functions according to the present embodiment. In this case, the control program according to the present embodiment may execute processing by calling necessary modules, among the program modules provided as part of the OS, in a predetermined order and/or at predetermined timing.
Furthermore, the control program according to the present embodiment may be provided incorporated in part of another program. In that case as well, the program itself does not include the modules contained in the other program with which it is combined as described above, and the processing is executed in cooperation with that other program. That is, the control program according to the present embodiment may take a form incorporated in such another program.
Alternatively, some or all of the functions provided by executing the control program may be implemented as dedicated hardware circuits.
<B1-2. Functional configuration of control device>
FIG. 5 is a block diagram showing a functional configuration of the control device according to the first embodiment. As shown in FIG. 5, the control device 50 includes a reference moving image storage unit 51, a teaching range selection unit 52, an image processing unit 53, a target frame selection unit 54, a first control unit 55a, and a second control unit 55b. The reference moving image storage unit 51 is constituted by the hard disk 520 and the RAM 512 shown in FIG. 4. The teaching range selection unit 52 and the image processing unit 53 are realized by the processor 510 shown in FIG. 4 executing the control program 550.
<B1-2-1. Reference moving image storage unit>
The reference moving image storage unit 51 stores the first reference moving image and the second reference moving image. The first reference moving image and the second reference moving image show how the male connector 2a and the female connector 2b are moved and connected to each other by manually operating the robots 30a and 30b. Alternatively, the first reference moving image and the second reference moving image may show the male connector 2a and the female connector 2b being moved and connected to each other by the operator's own hands.
Note that the difference in the working distance (WD) of the imaging device 21 between when the male connector 2a is at the position closest to the imaging device 21 and when it is at the position farthest from the imaging device 21 is sufficiently small compared with that working distance. Similarly, the difference in the working distance of the imaging device 22 between when the male connector 2a is at the position closest to the imaging device 22 and when it is at the position farthest from the imaging device 22 is sufficiently small compared with that working distance. Furthermore, the change in the posture of the male connector 2a during movement is very small. Therefore, the shape and size of the male connector 2a hardly change in the first reference moving image and the second reference moving image.
Similarly, the difference in the working distance of the imaging device 21 between when the female connector 2b is at the position closest to the imaging device 21 and when it is at the position farthest from the imaging device 21 is sufficiently small compared with that working distance, and the same holds for the working distance of the imaging device 22. Furthermore, the change in the posture of the female connector 2b during movement is very small. Therefore, the shape and size of the female connector 2b hardly change in the first reference moving image and the second reference moving image.
<B1-2-2. Teaching range selection unit>
The teaching range selection unit 52 selects, for each object (here, the male connector 2a and the female connector 2b), a teaching range of the first reference moving image and the second reference moving image that serves as the model for that object.
The teaching range selection unit 52 displays, on the display unit 532, a screen prompting the operator to designate a teaching range. The operator checks each frame of the first reference moving image and the second reference moving image and operates the input device 534 to designate the first frame and the last frame of a series of frames in which the object performs the desired operation. The teaching range selection unit 52 selects the range from the designated first frame to the designated last frame as the teaching range.
For example, for the first reference moving image shown in FIG. 2, the teaching range selection unit 52 selects, as the teaching range of the female connector 2b, the range from the first frame 70a to a frame after the s-th frame (the frame at which part of the female connector 2b starts to be cut off from view). The teaching range selection unit 52 selects, as the teaching range of the male connector 2a, the range from the first frame 70a (the frame at which the whole of the male connector 2a starts to appear) to the u-th frame 73a.
Similarly, for the second reference moving image shown in FIG. 3, the teaching range selection unit 52 selects, as the teaching range of the female connector 2b, the range from the first frame 70b to a frame after the s-th frame (the frame at which part of the female connector 2b starts to be cut off from view). The teaching range selection unit 52 selects, as the teaching range of the male connector 2a, the range from the first frame 70b (the frame at which the whole of the male connector 2a starts to appear) to the u-th frame 73b.
<B1-2-3. Image processing unit>
The image processing unit 53 performs image processing on a target image and detects an object in the target image using template matching. The basic processing of template matching is to prepare in advance a template, which is data representing the image features of an object, and to detect the position, posture, shape, and size of the object in the target image by evaluating the degree of matching of the image features between the target image and the template.
The target images on which the image processing unit 53 performs image processing are the frames of the first reference moving image, the frames of the second reference moving image, and the real images captured by the imaging devices 21 and 22.
As advance preparation, the image processing unit 53 creates a template for each object (the male connector 2a and the female connector 2b).
FIG. 6 is a diagram showing an example of a template creation method. FIG. 6(a) shows a frame selected from the first reference moving image, and FIG. 6(b) shows a frame selected from the second reference moving image. As shown in FIG. 6, the image processing unit 53 causes the display unit 532 (see FIG. 4) to display a frame selected by the operator from each of the first reference moving image and the second reference moving image. The operator may visually select a frame in which the whole of the object (the male connector 2a or the female connector 2b) is shown.
The image processing unit 53 accepts designation of the region of the object on the frame displayed on the display unit 532. For example, the operator operates the input device 534 (see FIG. 4) to input a line 3a surrounding the male connector 2a and a line 3b surrounding the female connector 2b. The image processing unit 53 identifies the region surrounded by the line 3a as the image region of the male connector 2a, and identifies the region surrounded by the line 3b as the image region of the female connector 2b.
For each of the frames selected from the first reference moving image and the second reference moving image, the image processing unit 53 extracts, from the image region surrounded by the line 3a, a plurality of feature points of the male connector 2a and their feature amounts. The image processing unit 53 creates, as the template of the male connector 2a, the coordinates on the image of each of the feature points together with its feature amount. Similarly, for each of these frames, the image processing unit 53 extracts, from the image region surrounded by the line 3b, a plurality of feature points of the female connector 2b and their feature amounts, and creates, as the template of the female connector 2b, the coordinates on the image of each of the feature points together with its feature amount.
A feature point is a point characterized by a corner, a contour, or the like contained in an image, and is, for example, an edge point. Feature amounts include, for example, luminance, luminance gradient direction, quantized gradient direction, HoG (Histogram of Oriented Gradients), HAAR-like features, and SIFT (Scale-Invariant Feature Transform). The luminance gradient direction represents, as a continuous value, the direction (angle) of the luminance gradient in a local region centered on the feature point, whereas the quantized gradient direction represents the direction of the luminance gradient in that local region as a discrete value (for example, eight directions held as one byte of information, 0 to 7).
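As a concrete illustration of such a template, the following Python sketch extracts edge points inside a designated region and records, for each point, its image coordinates and a quantized gradient direction (eight discrete values, 0 to 7); the gradient computation, the threshold, and the synthetic image are assumptions introduced here for explanation, not the embodiment's exact feature extractor.

import numpy as np

def make_template(gray, region_mask, edge_threshold=50.0):
    # Edge points: pixels with a strong luminance gradient inside the designated region.
    gy, gx = np.gradient(gray.astype(np.float64))
    magnitude = np.hypot(gx, gy)
    ys, xs = np.nonzero((magnitude > edge_threshold) & region_mask)
    # Continuous gradient direction, then quantized into 8 directions (0 to 7).
    angle = np.arctan2(gy[ys, xs], gx[ys, xs])
    quantized = np.round(angle / (np.pi / 4)).astype(int) % 8
    # Template: image coordinates of each feature point together with its feature amount.
    return list(zip(xs.tolist(), ys.tolist(), quantized.tolist()))

# Usage with a synthetic image: a bright square whose edges become the feature points.
image = np.zeros((64, 64))
image[20:40, 20:40] = 255.0
mask = np.zeros((64, 64), dtype=bool)
mask[10:50, 10:50] = True   # region designated by the operator (e.g., inside the line 3a)
template = make_template(image, mask)
print(len(template), "feature points, e.g.", template[:3])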
The image processing unit 53 extracts a plurality of feature points and their feature amounts from a frame of the first reference moving image or a real image captured by the imaging device 21. The image processing unit 53 detects the object in the image by matching the extracted feature points and feature amounts against the template of the object created from the frame of the first reference moving image.
The image processing unit 53 also extracts a plurality of feature points and their feature amounts from a frame of the second reference moving image or an image captured by the imaging device 22. The image processing unit 53 detects the object in the image by matching the extracted feature points and feature amounts against the template of the object created from the frame of the second reference moving image.
For each object (the male connector 2a and the female connector 2b), the image processing unit 53 outputs the coordinates, on the image, of each feature point of the object extracted from the target image.
<B1-2-4. Target frame selection unit>
The target frame selection unit 54 selects a target frame from each of the first reference moving image and the second reference moving image. When the target frame selection unit 54 selects the k-th frame of the first reference moving image as the target frame, it selects the k-th frame of the second reference moving image as the target frame. A specific example of the target frame selection method will be described later.
<B1-2-5. First control unit and second control unit>
The first control unit 55a controls the robot 30a via the robot controller 40a to change the state of the male connector 2a.
The second control unit 55b controls the robot 30b via the robot controller 40b to change the state of the female connector 2b.
FIG. 7 is a block diagram showing a functional configuration of the first control unit and the second control unit according to the first embodiment. As shown in FIG. 7, each of the first control unit 55a and the second control unit 55b includes a change information generation unit 56, a change information storage unit 57, a calculation unit 58, a command unit 59, and an end determination unit 60. The change information storage unit 57 is constituted by the hard disk 520 and the RAM 512 shown in FIG. 4. The change information generation unit 56, the calculation unit 58, the command unit 59, and the end determination unit 60 are realized by the processor 510 shown in FIG. 4 executing the control program 550.
<B1-2-6. Change information generation unit>
For each of a plurality of degrees of freedom, the change information generation unit 56 generates first change information indicating the relationship between the control amount of the target robot and the amount of change in the state of the target object on the real image captured by the imaging device 21. The change information generation unit 56 stores, in the change information storage unit 57, a first change information set 571 composed of the pieces of first change information generated for the respective degrees of freedom.
Furthermore, for each of the plurality of degrees of freedom, the change information generation unit 56 generates second change information indicating the relationship between the control amount of the target robot and the amount of change in the state of the target object on the real image captured by the imaging device 22. The change information generation unit 56 stores, in the change information storage unit 57, a second change information set 572 composed of the pieces of second change information generated for the respective degrees of freedom.
The target object is the male connector 2a for the first control unit 55a and the female connector 2b for the second control unit 55b. The target robot is the robot 30a for the first control unit 55a and the robot 30b for the second control unit 55b. The plurality of degrees of freedom number six for the first control unit 55a and three for the second control unit 55b.
In the present embodiment, the first change information and the second change information indicate the amount of change in the state of the target object on the image when the target robot is controlled by a unit control amount. Specifically, the first change information and the second change information indicate a mapping that converts the target object on the image before the target robot is controlled by the unit control amount into the target object on the image after the target robot is controlled by the unit control amount.
The amount of change in the state of the target object on the image when the target robot is controlled by the unit control amount depends on the state of the target object in real space. Therefore, the change information generation unit 56 generates a first change information set 571 for each frame in the teaching range of the first reference moving image, and generates a second change information set 572 for each frame in the teaching range of the second reference moving image.
The generation and storage of the first change information sets 571 and the second change information sets 572 by the change information generation unit 56 are executed as advance preparation.
A method of generating the first change information sets 571 in the first control unit 55a will be described with reference to FIG. 8. The generation of the second change information sets 572 in the first control unit 55a and the generation of the first change information sets 571 and the second change information sets 572 in the second control unit 55b are performed by similar methods, and descriptions thereof are therefore omitted.
FIG. 8 is a diagram explaining the method of generating the first change information sets in the first control unit. FIG. 8(a) shows the k-th frame 84 of the first reference moving image. The state (here, position and posture) of the male connector 2a in real space corresponding to the k-th frame 84 is taken as the reference state. FIG. 8(b) shows an image 94a captured by the imaging device 21 after the male connector 2a is translated from the reference state by the unit control amount along the translational degree of freedom in the Y direction. FIG. 8(c) shows an image 94b captured after a unit translation in the X direction, and FIG. 8(d) shows an image 94c captured after a unit translation in the Z direction. FIG. 8(e) shows an image 94d captured by the imaging device 21 after the male connector 2a is rotated from the reference state by the unit control amount about the rotational degree of freedom in the pitch direction. FIG. 8(f) shows an image 94e captured after a unit rotation in the yaw direction, and FIG. 8(g) shows an image 94f captured after a unit rotation in the roll direction.
The change information generation unit 56 acquires, from the image processing unit 53, the coordinates on the image of each feature point of the male connector 2a extracted from each of the frame 84 and the images 94a to 94f.
The change information generation unit 56 generates, as the first change information corresponding to the translational degree of freedom in the Y direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' of the male connector 2a extracted from the frame 84 into the coordinates of the feature points 4a to 4g extracted from the image 94a.
Similarly, the change information generation unit 56 generates, as the first change information corresponding to the translational degree of freedom in the X direction, information indicating a mapping that converts the coordinates of the feature points 4a' to 4g' into the coordinates of the feature points 4a to 4g extracted from the image 94b. It likewise generates, as the first change information corresponding to the translational degree of freedom in the Z direction, a mapping onto the coordinates of the feature points 4a to 4g extracted from the image 94c; as the first change information corresponding to the rotational degree of freedom in the pitch direction, a mapping onto the coordinates extracted from the image 94d; as the first change information corresponding to the rotational degree of freedom in the yaw direction, a mapping onto the coordinates extracted from the image 94e; and as the first change information corresponding to the rotational degree of freedom in the roll direction, a mapping onto the coordinates extracted from the image 94f. In this way, the change information generation unit 56 generates the first change information set 571 corresponding to the k-th frame 84 of the first reference moving image.
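As an illustration of how such a mapping might be represented, the following sketch fits a two-dimensional affine map, in the least-squares sense, from the pre-movement feature points to the post-movement feature points; the affine model and the coordinate values are assumptions introduced here, since the embodiment only requires some description of the point correspondence.

import numpy as np

def fit_affine(src, dst):
    # Least-squares 2D affine map (A, t) with dst ≈ src @ A.T + t,
    # fitted from corresponding feature point coordinates.
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    X = np.hstack([src, np.ones((len(src), 1))])   # rows [x, y, 1]
    params, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return params[:2].T, params[2]                 # linear part A, translation t

# Feature points 4a'-4g' of the frame 84 and 4a-4g of the image 94a (placeholder values).
before = [(10, 12), (30, 12), (30, 40), (10, 40), (20, 26), (15, 30), (25, 18)]
after = [(x + 0.5, y + 6.0) for x, y in before]    # pure translation on the image
A, t = fit_affine(before, after)
print("linear part:\n", A.round(3), "\ntranslation:", t.round(3))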
The change information generation unit 56 generates, by the same method, the first change information sets 571 corresponding to the remaining frames in the teaching range of the first reference moving image.
<B1-2-7. Calculation unit>
The calculation unit 58 calculates the control amount of each of the plurality of degrees of freedom for bringing the state of the target object on the real images captured by the imaging devices 21 and 22 closer to the state of the target object on the target frames of the first reference moving image and the second reference moving image, respectively.
The calculation unit 58 acquires, from the change information storage unit 57, the first change information set 571 and the second change information set 572 corresponding to the target frames, and calculates the control amounts based on the acquired first change information set 571 and second change information set 572. As described above, the first change information and the second change information indicate a mapping that converts the target object on the image before the target robot is controlled by the unit control amount into the target object on the image after the target robot is controlled by the unit control amount. The calculation unit 58 therefore acquires, from the image processing unit 53, the coordinates on the image of the feature points of the target object extracted from the real images and the coordinates on the image of the feature points of the target object extracted from the target frames. Based on the first change information and the second change information, the calculation unit 58 calculates the control amount of each of the plurality of degrees of freedom for mapping the target object on the real images onto the target object on the target frames.
A method of calculating the control amounts by the calculation unit 58 of the first control unit 55a will be described with reference to FIG. 9. The method of calculating the control amounts by the calculation unit 58 of the second control unit 55b is similar, and a description thereof is therefore omitted.
FIG. 9 is a diagram explaining the method of calculating the control amounts by the calculation unit of the first control unit. The calculation unit 58 acquires, from the image processing unit 53, the coordinates on the image of the feature points 4a' to 4g' of the male connector 2a extracted from the target frame of the first reference moving image. Furthermore, the calculation unit 58 acquires, from the image processing unit 53, the coordinates on the image of the feature points 4a to 4g of the male connector 2a extracted from the real image captured by the imaging device 21. Note that the number of feature points is not limited to seven.
The calculation unit 58 calculates difference vectors 61a to 61g for the respective feature points. The difference vectors 61a to 61g are vectors starting at the feature points 4a to 4g and ending at the feature points 4a' to 4g', respectively. The calculation unit 58 calculates the average x component Δx1 and the average y component Δy1 of the difference vectors 61a to 61g. The x and y components are expressed in the coordinate system of the image.
By a similar method, the calculation unit 58 calculates the average x component Δx2 and the average y component Δy2 of the difference vectors between the feature points extracted from the real image captured by the imaging device 22 and the feature points extracted from the target frame of the second reference moving image.
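A small sketch of this step, assuming the feature point coordinates have already been obtained from the image processing unit 53 (the coordinate values are placeholders):

import numpy as np

# Feature points 4a-4g on the real image and 4a'-4g' on the target frame (placeholders).
real_pts = np.array([[12.0, 40.0], [30.0, 41.0], [31.0, 60.0],
                     [11.0, 61.0], [21.0, 50.0], [16.0, 55.0], [26.0, 45.0]])
target_pts = real_pts + np.array([3.0, -2.0])   # pretend target-frame positions

# Difference vectors 61a-61g: start at 4a-4g, end at 4a'-4g'.
diff = target_pts - real_pts
dx1, dy1 = diff.mean(axis=0)        # average x and y components (Δx1, Δy1)
print(f"Δx1 = {dx1}, Δy1 = {dy1}")  # Δx2, Δy2 follow in the same way for the imaging device 22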
First, the calculation unit 58 calculates the control amounts of the three translational degrees of freedom so as to cancel the average of the difference vectors of the feature points. Specifically, using Δx1, Δy1, Δx2, and Δy2 together with the first change information set 571 and the second change information set 572, the calculation unit 58 calculates the control amount of each of the translational degrees of freedom of the hand 31a in the X, Y, and Z directions.
When the hand 31a translates along any of the translational degrees of freedom in the X, Y, and Z directions, the male connector 2a translates in a fixed direction on the images captured by the imaging devices 21 and 22. Therefore, the first change information corresponding to each translational degree of freedom in the first change information set 571 indicates a mapping that converts an arbitrary point on the image into a point translated in a fixed direction. Similarly, the second change information corresponding to each translational degree of freedom in the second change information set 572 indicates a mapping that converts an arbitrary point on the image into a point translated in a fixed direction.
Here, assume that, in the first change information set 571 corresponding to the target frame, the first change information corresponding to the translational degree of freedom in the X direction indicates a mapping that converts an arbitrary point (x, y) on the image into the point (x + dX1_1, y + dY1_1), that the first change information corresponding to the translational degree of freedom in the Y direction indicates a mapping that converts an arbitrary point (x, y) into the point (x + dX1_2, y + dY1_2), and that the first change information corresponding to the translational degree of freedom in the Z direction indicates a mapping that converts an arbitrary point (x, y) into the point (x + dX1_3, y + dY1_3).
Furthermore, assume that, in the second change information set 572 corresponding to the target frame, the second change information corresponding to the translational degrees of freedom in the X, Y, and Z directions indicates mappings that convert an arbitrary point (x, y) on the image into the points (x + dX2_1, y + dY2_1), (x + dX2_2, y + dY2_2), and (x + dX2_3, y + dY2_3), respectively.
In this case, the calculation unit 58 calculates the coefficients a1, a2, and a3 by solving the following four linear equations (1) to (4):
a1 × dX1_1 + a2 × dX1_2 + a3 × dX1_3 = Δx1 … (1)
a1 × dY1_1 + a2 × dY1_2 + a3 × dY1_3 = Δy1 … (2)
a1 × dX2_1 + a2 × dX2_2 + a3 × dX2_3 = Δx2 … (3)
a1 × dY2_1 + a2 × dY2_2 + a3 × dY2_3 = Δy2 … (4)
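Since equations (1) to (4) are four linear equations in the three unknowns a1 to a3, one natural way to solve them is in the least-squares sense, as in the following sketch; the per-unit displacement values are placeholders, and the use of a least-squares solver is an assumption introduced here for illustration, not a statement of the embodiment's exact solver.

import numpy as np

# Per-unit-control displacements on each camera image for the X, Y, Z translational
# degrees of freedom (placeholder values standing in for the change information sets).
D = np.array([
    [1.2, 0.1, 0.0],   # dX1_1, dX1_2, dX1_3  -> equation (1)
    [0.0, 1.1, 0.2],   # dY1_1, dY1_2, dY1_3  -> equation (2)
    [0.9, 0.0, 0.8],   # dX2_1, dX2_2, dX2_3  -> equation (3)
    [0.1, 1.0, 0.7],   # dY2_1, dY2_2, dY2_3  -> equation (4)
])
rhs = np.array([3.0, -2.0, 2.5, -1.0])   # Δx1, Δy1, Δx2, Δy2

# Four equations, three unknowns: solve for a1, a2, a3 in the least-squares sense.
(a1, a2, a3), *_ = np.linalg.lstsq(D, rhs, rcond=None)
print(f"a1={a1:.3f}, a2={a2:.3f}, a3={a3:.3f}")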
The first change information and the second change information indicate the amount of change in the state of the male connector 2a on the image when the robot 30a is controlled by the unit control amount. The calculation unit 58 therefore sets the control amount of the translational degree of freedom in the X direction to a1 times the unit control amount, the control amount of the translational degree of freedom in the Y direction to a2 times the unit control amount, and the control amount of the translational degree of freedom in the Z direction to a3 times the unit control amount. These translational control amounts bring the state of the male connector 2a on the real images closer to the state of the male connector 2a on the target frames by the average of the difference vectors of the feature points.
Next, the calculation unit 58 calculates the control amount of each of the three rotational degrees of freedom. The calculation unit 58 subtracts the average x component (Δx1 or Δx2) and the average y component (Δy1 or Δy2) described above from the difference vector of each feature point. Using a search algorithm such as hill climbing, the calculation unit 58 calculates the control amounts of the three rotational degrees of freedom that bring the residuals of the difference vectors of the feature points closest to zero.
 具体的には、算出部58は、ピッチ方向、ヨー方向およびロール方向の回転自由度の制御量が0である解を現在の解として探索アルゴリズムを開始する。算出部58は、現在の解の近傍の複数の解の各々に従ってロボット30aを制御したときの、各特徴点の差分ベクトルの残差の変化をシミュレーションする。算出部58は、シミュレーション結果から、現在の解よりも各特徴点の差分ベクトルの残差が0に近い近傍解が存在する場合に、当該近傍解を現在の解に置き換える。算出部58は、この処理を繰り返すことにより、差分ベクトルの残差が極値となる解を探索する。 Specifically, the calculation unit 58 starts a search algorithm with a solution in which the control amounts of the rotational degrees of freedom in the pitch direction, the yaw direction, and the roll direction are 0 as the current solution. The calculation unit 58 simulates a change in the residual of the difference vector of each feature point when the robot 30a is controlled according to each of a plurality of solutions near the current solution. The calculation unit 58 replaces the neighboring solution with the current solution when there is a neighboring solution in which the residual of the difference vector of each feature point is closer to 0 than the current solution based on the simulation result. The calculation unit 58 searches for a solution in which the residual of the difference vector becomes an extreme value by repeating this process.
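 The hill-climbing search described above might look like the following sketch. Here residual_of is a hypothetical stand-in for the simulation that predicts the summed residual of the feature-point difference vectors for a candidate (pitch, yaw, roll) control; it and the step size are assumptions for illustration.

```python
def hill_climb_rotation(residual_of, step=1.0, max_iters=100):
    """Start from zero rotational control as the current solution,
    repeatedly move to the best neighbouring solution, and stop when no
    neighbour reduces the residual (an extremum has been reached)."""
    current = (0.0, 0.0, 0.0)          # pitch, yaw, roll control amounts
    best = residual_of(current)
    for _ in range(max_iters):
        # Neighbours: perturb each rotational DOF by +/- one step.
        neighbours = [
            tuple(c + d * step if i == axis else c
                  for i, c in enumerate(current))
            for axis in range(3) for d in (-1.0, 1.0)
        ]
        improved = False
        for cand in neighbours:
            r = residual_of(cand)
            if r < best:               # neighbour closer to zero residual
                current, best, improved = cand, r, True
        if not improved:
            break                      # extremum of the residual reached
    return current
```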
 <B1-2-8. Command unit>
 The command unit 59 generates a control command for moving the target robot by the control amount calculated by the calculation unit 58, and outputs the generated control command to the target robot controller. The target robot controller is the robot controller 40a for the first control unit 55a and the robot controller 40b for the second control unit 55b.
 <B1-2-9. End determination unit>
 The end determination unit 60 calculates the deviation between the state of the target object on the real image and its state on the final frame of the teaching range, and determines that control of the target robot is to end when the calculated deviation is less than a predetermined threshold. Upon determining that control of the target robot is to end, the end determination unit 60 outputs an end notification.
 The deviation is, for example, the average of the distances between corresponding feature points of the target object extracted from the real image and from the final frame.
 The threshold is set according to the accuracy required of the state of the target object. The threshold is the threshold Tha for the first control unit 55a and the threshold Thb for the second control unit 55b. The thresholds Tha and Thb may be the same or different.
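 For reference, the deviation and the end test described above reduce to a few lines; the function names are illustrative assumptions, and the feature points are assumed to be given as matched (N, 2) coordinate arrays.

```python
import numpy as np

def deviation(real_pts, frame_pts):
    """Mean Euclidean distance between corresponding feature points of the
    object in the real image and in the final frame."""
    diff = np.asarray(real_pts, dtype=float) - np.asarray(frame_pts, dtype=float)
    return float(np.linalg.norm(diff, axis=1).mean())

def control_finished(real_pts, frame_pts, threshold):
    """End determination: True when the deviation falls below the
    threshold (Tha for the first control unit, Thb for the second)."""
    return deviation(real_pts, frame_pts) < threshold
```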
 <C1. Operation example>
 <C1-1. Change information generation>
 The flow of the change information generation process performed by the change information generation unit 56 will be described with reference to FIG. 10. FIG. 10 is a flowchart showing an example of the flow of the change information generation process performed by the change information generation unit. FIG. 10 shows the processing flow of the change information generation unit 56 of the first control unit 55a; the change information generation unit 56 of the second control unit 55b may generate change information by the same method. The change information generation process is performed as advance preparation.
 First, in step S1, the control device 50 causes the robot 30a to perform a fixed operation of gripping, with the hand 31a, the male connector 2a that has been conveyed to a predetermined position and moving the male connector 2a downward from above. The male connector 2a is thereby moved into the fields of view of the imaging devices 21 and 22.
 In step S2, the change information generation unit 56 controls the robot 30a such that the imaging devices 21 and 22 capture the same images as the first frames of the teaching ranges of the first reference moving image and the second reference moving image, respectively. The subroutine of step S2 will be described later.
 In step S3, the change information generation unit 56 sets k to the frame number of the first frame of the teaching range. In step S4, the change information generation unit 56 selects one of the six degrees of freedom.
 In step S5, the change information generation unit 56 generates a control command for moving the hand 31a by a unit control amount in the positive direction of the selected degree of freedom, and outputs it to the robot controller 40a.
 In step S6, after the hand 31a has moved by the unit control amount, the change information generation unit 56 acquires the latest real images from the imaging devices 21 and 22.
 In step S7, the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the feature points of the male connector 2a extracted from the real images acquired in step S6.
 In step S8, the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the feature points of the male connector 2a extracted from the k-th frames of the first reference moving image and the second reference moving image.
 In step S9, the change information generation unit 56 generates, based on the coordinates acquired in step S8 and the coordinates acquired in step S7, the first change information and the second change information corresponding to the k-th frame and the degree of freedom selected in step S4. That is, the change information generation unit 56 generates first change information indicating a mapping that converts the coordinates of the feature points extracted from the real image of the imaging device 21 into the coordinates of the feature points extracted from the k-th frame of the first reference moving image. Further, the change information generation unit 56 generates second change information indicating a mapping that converts the coordinates of the feature points extracted from the real image of the imaging device 22 into the coordinates of the feature points extracted from the k-th frame of the second reference moving image.
 In step S10, the change information generation unit 56 generates a control command for returning the hand 31a to its previous state (the state before the most recent step S5), and outputs it to the robot controller 40a. The male connector 2a thereby returns to its state before step S5.
 In step S11, the change information generation unit 56 determines whether any degree of freedom remains unselected. If so (YES in step S11), the change information generation process returns to step S4. As a result, steps S4 to S10 are repeated, and the first change information and the second change information are generated for each of the six degrees of freedom.
 If no degree of freedom remains unselected (NO in step S11), the change information generation unit 56 determines in step S12 whether k is the frame number of the final frame of the teaching range.
 If k is not the frame number of the final frame (NO in step S12), in step S13 the change information generation unit 56 controls the robot 30a such that the imaging devices 21 and 22 capture the same images as the (k+1)-th frames of the first reference moving image and the second reference moving image, respectively.
 Specifically, the change information generation unit 56 calculates, by the same method as the processing of the calculation unit 58 described above, the control amounts of the six degrees of freedom for bringing the state of the male connector 2a on the k-th frame closer to its state on the (k+1)-th frame. That is, for each of the plurality of feature points extracted from the k-th frame, the change information generation unit 56 obtains a difference vector whose starting point is that feature point and whose end point is the corresponding feature point extracted from the (k+1)-th frame. The change information generation unit 56 calculates the control amounts of the six degrees of freedom based on the difference vectors of the feature points and on the first change information and second change information corresponding to the k-th frame. The change information generation unit 56 generates a control command indicating the calculated control amounts and outputs it to the robot controller 40a.
 In step S14, the change information generation unit 56 increments k by 1. After step S14, the process returns to step S4. As a result, the first change information and the second change information for each of the six degrees of freedom are also generated for the (k+1)-th frame.
 If k is the frame number of the final frame (YES in step S12), the first change information set 571 and the second change information set 572 have been generated for all frames, and the change information generation process ends.
 FIG. 11 is a flowchart showing the processing flow of the subroutine of step S2 shown in FIG. 10.
 First, in step S21, the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the feature points of the male connector 2a extracted from the first frames of the teaching ranges of the first reference moving image and the second reference moving image.
 In step S22, the change information generation unit 56 acquires the latest real images from the imaging devices 21 and 22.
 In step S23, the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the feature points of the male connector 2a extracted from the real images acquired in step S22.
 In step S24, the change information generation unit 56 determines whether the deviation between the state of the male connector 2a on the real images acquired in step S22 and its state on the first frame is less than the threshold Tha. The deviation is, for example, the average of the distances between corresponding feature points of the male connector 2a extracted from the real image and from the first frame.
 If the deviation is less than the threshold Tha (YES in step S24), the change information generation unit 56 determines that the imaging devices 21 and 22 capture the same images as the first frames, and ends the process.
 If the deviation is greater than or equal to the threshold Tha (NO in step S24), in step S25 the change information generation unit 56 selects one of the six degrees of freedom. In step S26, the change information generation unit 56 selects either the positive direction or the negative direction as the control direction.
 In step S27, the change information generation unit 56 generates a control command for moving the hand 31a by a unit control amount in the selected control direction of the selected degree of freedom, and outputs it to the robot controller 40a.
 In step S28, after the hand 31a has moved by the unit control amount, the change information generation unit 56 acquires real images from the imaging devices 21 and 22.
 In step S29, the change information generation unit 56 acquires from the image processing unit 53 the coordinates of the feature points of the male connector 2a extracted from the real images acquired in step S28.
 In step S30, the change information generation unit 56 calculates the deviation between the state of the male connector 2a on the real images acquired in step S28 and its state on the first frame.
 In step S31, the change information generation unit 56 generates a control command for returning the hand 31a to its previous state (the state before the most recent step S27), and outputs it to the robot controller 40a. The male connector 2a thereby returns to its state before step S27.
 In step S32, the change information generation unit 56 determines whether any control direction remains unselected. If so (YES in step S32), the process returns to step S26.
 If no control direction remains unselected (NO in step S32), in step S33 the change information generation unit 56 determines whether any degree of freedom remains unselected. If so (YES in step S33), the process returns to step S25.
 If no degree of freedom remains unselected (NO in step S33), in step S34 the change information generation unit 56 controls the robot 30a via the robot controller 40a so as to move the hand 31a by the unit control amount in the degree of freedom and control direction corresponding to the minimum deviation, as in the sketch below. After step S34, the process returns to step S22.
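 The subroutine of FIG. 11 can be summarized as the following sketch. move, undo, and measure_deviation are hypothetical stand-ins for the robot command interface and for the deviation computed in steps S28 to S30; they are assumptions, not part of the embodiment.

```python
def align_to_first_frame(move, undo, measure_deviation, threshold, unit=1.0):
    """Probe each of the six DOFs in both directions by a unit control
    amount, undo each trial move, then commit the single move that gave
    the smallest deviation from the first frame (steps S21-S34)."""
    while True:
        if measure_deviation() < threshold:        # step S24
            return                                 # first frame reproduced
        trials = {}
        for dof in range(6):                       # step S25
            for direction in (+1, -1):             # step S26
                move(dof, direction * unit)        # step S27
                trials[(dof, direction)] = measure_deviation()  # S28-S30
                undo()                             # step S31: revert move
        best_dof, best_dir = min(trials, key=trials.get)
        move(best_dof, best_dir * unit)            # step S34: commit best
```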
 <C1-2. Robot control>
 The flow of the process of controlling the target robots so that the states of the target objects change in accordance with the first reference moving image and the second reference moving image will be described with reference to FIG. 12. FIG. 12 is a flowchart showing an example of this process.
 First, in step S41, the control device 50 determines whether end notifications have been output from the end determination units 60 of all the control units (the first control unit 55a and the second control unit 55b). If end notifications have been output from the end determination units 60 of all the control units (YES in step S41), the process ends.
 If end notifications have not been output from the end determination units 60 of all the control units (NO in step S41), in step S42 the control device 50 acquires the real images captured by the imaging devices 21 and 22. Step S42 is performed every imaging cycle.
 In step S43, the image processing unit 53 detects all the target objects (the male connector 2a and the female connector 2b) in the real images and the target frames by template matching, and extracts the coordinates of the feature points of each target object.
 In step S44, the target frame selection unit 54 selects a target frame from the first reference moving image and the second reference moving image.
 In step S45, the target frame selection unit 54 identifies the target objects whose teaching ranges include the target frame, and outputs a control instruction to the control units (at least one of the first control unit 55a and the second control unit 55b) that control the identified target objects.
 In step S46, the control units (at least one of the first control unit 55a and the second control unit 55b) that received the control instruction control the target robots. After step S46, the process returns to step S41. While the determination in step S41 is NO, the series of steps S42 to S46 is repeated every imaging cycle. In that case, step S42 for acquiring the next real images may be started while the target robots are being controlled in step S46, as in the sketch below. The target robots thereby continue to be controlled according to the latest real images without stopping their motion, so the states of the target objects can be changed quickly.
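 The overlap between steps S42 and S46 might be realized as below; grab_images, process, select_target_frame, control_robots, and all_finished are hypothetical stand-ins for the processing described above, and the threading approach is one possible design, not the embodiment's prescribed one.

```python
import threading

def control_loop(grab_images, process, select_target_frame,
                 control_robots, all_finished):
    """Sketch of FIG. 12: while step S46 is still moving the robots, the
    next image acquisition (step S42) already runs on another thread, so
    control always uses the latest real images without idling the robots."""
    images = grab_images()                           # initial step S42
    while not all_finished():                        # step S41
        features = process(images)                   # step S43
        target = select_target_frame(features)       # steps S44-S45
        result = {}
        grabber = threading.Thread(
            target=lambda: result.update(images=grab_images()))
        grabber.start()                              # next step S42, overlapped
        control_robots(features, target)             # step S46 (blocking)
        grabber.join()
        images = result["images"]
```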
 <C1-3. Target frame selection>
 The flow of the target frame selection process performed by the target frame selection unit 54 will be described with reference to FIGS. 13 and 14. FIG. 13 is a flowchart showing the processing flow of the subroutine of step S44 shown in FIG. 12. FIG. 14 is a diagram showing the relationship between the closest frame and the target frame.
 First, in step S51, the target frame selection unit 54 acquires from the image processing unit 53 the coordinates of the feature points of all the target objects extracted from each frame of the first reference moving image and the second reference moving image.
 In step S52, the target frame selection unit 54 acquires from the image processing unit 53 the coordinates of the feature points of all the target objects extracted from the real images captured by the imaging devices 21 and 22.
 In step S53, the target frame selection unit 54 determines whether the first selection of a target frame has been completed.
 If the first selection of a target frame has been completed (YES in step S53), in step S54 the target frame selection unit 54 determines whether the previous target frame is the final frame of the teaching range corresponding to any of the target objects.
 If the previous target frame is a final frame (YES in step S54), the process moves to step S55. In step S55, for the target object corresponding to the teaching range to which that final frame belongs, the target frame selection unit 54 calculates the deviation between its state on the real image and its state on the final frame, and determines whether the deviation is less than the threshold. The deviation is, for example, the average of the distances between corresponding feature points of the target object in the real image and in the final frame.
 If the deviation is greater than or equal to the threshold (NO in step S55), in step S56 the target frame selection unit 54 selects the same frame as the previous target frame as the target frame. After step S56, the process ends.
 If the deviation is less than the threshold (YES in step S55), it is determined that the state of the target object has reached the state on the final frame, and the process moves to step S57. The process also moves to step S57 if the first selection of a target frame has not been completed (NO in step S53) or if the previous target frame is not a final frame (NO in step S54).
 In step S57, the target frame selection unit 54 calculates the deviation between the states of all the target objects on the real images and their states on each frame, and identifies the frame with the minimum deviation as the closest frame. The deviation is, for example, the average of the distances between corresponding feature points of the target objects in the real images and in each frame.
 Specifically, the target frame selection unit 54 calculates a first deviation between the states of all the target objects on the real image captured by the imaging device 21 and their states on the k-th frame of the first reference moving image. Further, the target frame selection unit 54 calculates a second deviation between the states of all the target objects on the real image captured by the imaging device 22 and their states on the k-th frame of the second reference moving image. The target frame selection unit 54 calculates the average of the first deviation and the second deviation as the deviation corresponding to the k-th frame.
 In step S58, the target frame selection unit 54 determines whether the final frame of any teaching range lies between the closest frame and the frame a predetermined number of frames after it.
 If there is such a final frame (YES in step S58), in step S59 the target frame selection unit 54 selects that final frame as the target frame. If there are multiple final frames between the closest frame and the frame a predetermined number of frames after it, the target frame selection unit 54 selects the final frame with the smallest frame number.
 If there is no final frame (NO in step S58), in step S60 the target frame selection unit 54 selects the frame a predetermined number of frames after the closest frame as the target frame. After step S60, the target frame selection process ends. Steps S57 to S60 are sketched below.
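 The following sketch condenses steps S57 to S60; the argument layout (per-camera feature-point arrays and a set of final-frame numbers) and the clamping at the last frame are assumptions for illustration.

```python
import numpy as np

def select_target_frame(real1, real2, frames1, frames2,
                        final_frames, lookahead):
    """real1/real2: stacked feature points of all objects in the real
    images of imaging devices 21/22; frames1/frames2: the same per frame
    of the first/second reference moving image."""
    def dev(a, b):
        d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
        return float(np.linalg.norm(d, axis=1).mean())
    # Step S57: the closest frame minimises the mean of the two deviations.
    devs = [0.5 * (dev(real1, f1) + dev(real2, f2))
            for f1, f2 in zip(frames1, frames2)]
    nearest = int(np.argmin(devs))
    last = len(devs) - 1
    window = range(nearest, min(nearest + lookahead, last) + 1)
    # Steps S58-S59: prefer the earliest final frame in the window.
    finals = [k for k in window if k in final_frames]
    if finals:
        return min(finals)
    # Step S60: otherwise jump a predetermined number of frames ahead.
    return min(nearest + lookahead, last)
```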
 <C1-4. Processing of the control units>
 The processing of the subroutine of step S46 in FIG. 12 will be described with reference to FIG. 15. FIG. 15 is a flowchart showing the processing flow of the subroutine of step S46 shown in FIG. 12.
 First, in step S61, the end determination unit 60 determines whether the target frame is the final frame of the teaching range.
 If the target frame is the final frame (YES in step S61), in step S62 the end determination unit 60 determines whether the deviation between the state of the target object on the real image and its state on the final frame is less than the threshold.
 If the deviation is less than the threshold (YES in step S62), in step S63 the end determination unit 60 outputs an end notification. After step S63, the process ends.
 If the target frame is not the final frame (NO in step S61), or if the deviation is greater than or equal to the threshold (NO in step S62), the process moves to step S64.
 In step S64, the calculation unit 58 calculates, based on the first change information set 571 and the second change information set 572 corresponding to the target frame, the control amount for each of the plurality of degrees of freedom for bringing the state of the target object on the real image closer to its state on the target frame.
 In step S65, the command unit 59 generates a control command indicating the calculated control amounts and outputs it to the target robot controller. After step S65, the process ends.
 The first control unit 55a and the second control unit 55b each control their target robot according to the flow shown in FIG. 15. When the target frame is included in the teaching ranges of the male connector 2a and the female connector 2b, the robots 30a and 30b are controlled such that the states of the male connector 2a and the female connector 2b on the real images approach their respective states on the target frame. The states of the male connector 2a and the female connector 2b thereby change in a coordinated manner in accordance with the target frame.
 <C1-5. Operation and effects>
 As described above, the control system 1 includes the robots 30a and 30b, the imaging devices 21 and 22 for imaging the male connector 2a and the female connector 2b, and the control device 50 for controlling the robots 30a and 30b. The robots 30a and 30b change the states of the male connector 2a and the female connector 2b, respectively. The imaging devices 21 and 22 are installed at fixed positions separate from the robots 30a and 30b, and image the male connector 2a and the female connector 2b supported by the robots 30a and 30b, respectively.
 The control device 50 stores a first reference moving image and a second reference moving image showing samples of the male connector 2a and the female connector 2b. The first reference moving image and the second reference moving image each include a plurality of frames arranged in time series.
 The control device 50 acquires change information for each of the male connector 2a and the female connector 2b. The change information corresponding to the male connector 2a indicates the relationship between the control amount of the robot 30a and the amount of change in the state of the male connector 2a on the images of the imaging devices 21 and 22. The change information corresponding to the female connector 2b indicates the relationship between the control amount of the robot 30b and the amount of change in the state of the female connector 2b on the images of the imaging devices 21 and 22.
 The control device 50 performs a first process of acquiring the real images captured by the imaging devices 21 and 22, a second process of selecting a target frame from the plurality of frames, and a third process of controlling each of the robots 30a and 30b based on the real images and the target frame. Based on the change information corresponding to the male connector 2a, the control device 50 calculates the control amount of the robot 30a for bringing the state of the male connector 2a on the real image closer to its state on the target frame, and controls the robot 30a according to the calculated control amount. Based on the change information corresponding to the female connector 2b, the control device 50 calculates the control amount of the robot 30b for bringing the state of the female connector 2b on the real image closer to its state on the target frame, and controls the robot 30b according to the calculated control amount.
 The states of the male connector 2a and the female connector 2b on the real images can thereby be changed to their respective states on the target frame. That is, the states of the male connector 2a and the female connector 2b change in a coordinated manner in accordance with the target frame.
 The control device 50 repeatedly executes the series of processes including the first to third processes, and starts the first process of the next series while the third process is being performed. The robots 30a and 30b thereby continue to be controlled according to the latest real images without stopping their operation. As a result, the states of the male connector 2a and the female connector 2b can be changed quickly.
 <D1. Modifications>
 <D1-1. Modification 1>
 The target frame selection process performed by the target frame selection unit is not limited to the method shown in FIG. 13.
 FIG. 16 is a diagram showing another example of a reference moving image serving as a sample of the connection between the male connector and the female connector. The reference moving image of the example shown in FIG. 16 shows the male connector 2a moving from above the substrate 5 toward the substrate 5 and then moving toward the female connector 2b along a direction parallel to the substrate 5 to connect to the female connector 2b. The male connector 2a thus moves along the L-shaped path A. The frame 74 shows the male connector 2a positioned above the substrate 5. The frame 75 shows the male connector 2a having reached the upper surface of the substrate 5. The frame 76 shows the male connector 2a connected to the female connector 2b.
 If the frame 76 is selected as the target frame while the male connector 2a is in the state shown in the frame 74, the male connector 2a heads toward the state shown in the frame 76 without passing through the state shown in the frame 75. In this case, the male connector 2a approaches the female connector 2b from an inclined direction, so the pins of the male connector 2a may not be insertable into the insertion holes of the female connector 2b. To insert the pins of the male connector 2a into the insertion holes of the female connector 2b, the male connector 2a preferably passes through the state shown in the frame 75.
 Therefore, the target frame selection unit 54 of this modification accepts the designation of frames that must be passed through (hereinafter, "mandatory pass frames"), and always selects a designated mandatory pass frame as a target frame. When the target frame selection unit 54 accepts a mandatory pass frame, it also accepts the designation of the target object that must pass through the state shown in that mandatory pass frame.
 The target frame selection unit 54 displays, on the display unit 532 (see FIG. 4), a screen prompting the designation of mandatory pass frames. The operator checks each frame of the first reference moving image and the second reference moving image, and operates the input device 534 (see FIG. 4) to designate mandatory pass frames and the target objects that must pass through the states shown in those frames (the target objects corresponding to the mandatory pass frames).
 FIG. 17 is a flowchart showing an example of the flow of the target frame selection process in Modification 1 of Embodiment 1. The flowchart shown in FIG. 17 differs from the flowchart shown in FIG. 13 only in that it includes steps S74, S78, and S79 instead of steps S54, S58, and S59, respectively. Therefore, only steps S74, S78, and S79 will be described.
 In step S74, the target frame selection unit 54 determines whether the previous target frame is a final frame or a mandatory pass frame.
 In step S78, the target frame selection unit 54 determines whether at least one final frame or mandatory pass frame lies between the closest frame and the frame a predetermined number of frames after it.
 If there is such a frame (YES in step S78), in step S79 the target frame selection unit 54 selects it as the target frame. If there are multiple frames that are final frames or mandatory pass frames between the closest frame and the frame a predetermined number of frames after it, the target frame selection unit 54 selects the frame with the smallest frame number, as in the sketch below.
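 A sketch of how steps S78 and S79 extend the window test follows; mandatory pass frames are simply treated like final frames when scanning the lookahead window, and the names are illustrative assumptions.

```python
def pick_in_window(nearest, lookahead, final_frames, must_pass_frames,
                   num_frames):
    """Steps S78-S79: within the lookahead window after the closest frame,
    the final frame or mandatory pass frame with the smallest frame number
    becomes the target frame; otherwise fall back to the fixed jump."""
    stops = set(final_frames) | set(must_pass_frames)
    last = num_frames - 1
    window = range(nearest, min(nearest + lookahead, last) + 1)
    candidates = [k for k in window if k in stops]
    if candidates:
        return min(candidates)       # step S79: earliest stop frame
    return min(nearest + lookahead, last)
```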
 According to Modification 1, if the state of the target object on the real image is earlier than its state on a mandatory pass frame, the mandatory pass frame is always selected as the target frame. Furthermore, once a mandatory pass frame has been selected as the target frame, the target frame is not updated until the deviation between the state of the target object on the real image and its state on the mandatory pass frame falls below the threshold. This makes it possible to reliably pass the state of the target object corresponding to a mandatory pass frame through the state shown on that frame.
 Furthermore, the target frame selection unit 54 may determine that an anomaly has occurred in the control system when the target object corresponding to a mandatory pass frame does not match the state on that frame even after a specified time has elapsed since the mandatory pass frame was selected as the target frame.
 FIG. 18 is a flowchart showing an example of the flow of the anomaly determination process. First, in step S81, the target frame selection unit 54 resets a timer when it selects a mandatory pass frame as the target frame.
 In step S82, the target frame selection unit 54 determines, for the target object corresponding to the mandatory pass frame, whether the deviation between its state on the real images captured by the imaging devices 21 and 22 and its state on the mandatory pass frame is less than the threshold. If the deviation is less than the threshold (YES in step S82), the anomaly determination process ends.
 If the deviation is greater than or equal to the threshold (NO in step S82), in step S83 the target frame selection unit 54 determines whether the timer value exceeds the specified time.
 If the timer value does not exceed the specified time (NO in step S83), the anomaly determination process returns to step S82.
 If the timer value exceeds the specified time (YES in step S83), in step S84 the target frame selection unit 54 determines that some anomaly has occurred in the control system 1. This allows countermeasures against the anomaly in the control system 1 to be started quickly. After step S84, the anomaly determination process ends.
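 The anomaly determination of FIG. 18 reduces to a timer loop; measure_deviation is again a hypothetical stand-in for the step-S82 check, and the polling interval is an assumption.

```python
import time

def watch_mandatory_pass_frame(measure_deviation, threshold, timeout_s):
    """Steps S81-S84: after a mandatory pass frame becomes the target
    frame, the deviation must fall below the threshold within timeout_s
    seconds; otherwise an anomaly is reported (returns False)."""
    start = time.monotonic()                      # step S81: reset timer
    while measure_deviation() >= threshold:       # step S82
        if time.monotonic() - start > timeout_s:  # step S83
            return False                          # step S84: anomaly
        time.sleep(0.01)                          # polling interval
    return True                                   # state reached in time
```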
 When the target frame selection unit 54 determines that an anomaly has occurred, the control device 50 may notify the display unit 532 of the occurrence of the anomaly, or may stop the control of the robots 30a and 30b.
 <D1-2. Modification 2>
 In the above description, the control system 1 connects (assembles) the male connector 2a and the female connector 2b. However, the control system 1 may assemble two other objects together.
 FIG. 19 is a schematic diagram showing the target objects of the control system according to Modification 2 of Embodiment 1. The control system 1 according to Modification 2 engages an upper case 6a with a lower case 6b in, for example, a production line for industrial products. The upper case 6a and the lower case 6b are box-shaped and substantially rectangular in plan view. The upper case 6a is arranged to open downward, and the lower case 6b is arranged to open upward.
 Two engagement claws 7a and 7b are formed at the upper end of the lower case 6b on the front side in the drawing. Two more engagement claws (not shown) are formed at the upper end of the lower case 6b on the rear side in the drawing. The upper case 6a moves downward from above the lower case 6b and engages with the four engagement claws of the lower case 6b.
 The upper case 6a is gripped by the hand 31a (see FIG. 1) of the robot 30a. The lower case 6b is placed on the stage 31b (see FIG. 1) of the robot 30b.
 The robot 30a further includes control rods 32a to 32d (see FIG. 19) for changing the state (here, the shape) of the upper case 6a with two degrees of freedom. The robot 30a can apply a force corresponding to a control amount to the opposing control rods 32a and 32b in the direction that brings them closer to each other. Similarly, the robot 30a can apply a force corresponding to a control amount to the opposing control rods 32c and 32d in the direction that brings them closer to each other.
 The control rods 32a and 32b contact two opposing side walls of the upper case 6a. Therefore, when a force bringing the control rods 32a and 32b closer to each other is applied, the upper case 6a deforms. The amount of deformation depends on the control amount.
 The control rods 32c and 32d contact the other two opposing side walls of the upper case 6a. Therefore, when a force bringing the control rods 32c and 32d closer to each other is applied, the upper case 6a deforms. The amount of deformation depends on the control amount.
 The imaging devices 21 and 22 are arranged at positions from which they can image the engagement claws 7a and 7b on the front side of the lower case 6b in the drawing. However, the imaging devices 21 and 22 cannot image the two engagement claws on the rear side of the lower case 6b in the drawing. The control system 1 according to Modification 2 of Embodiment 1 therefore further includes imaging devices 23 and 24, which are arranged at positions from which they can image the two engagement claws on the rear side of the lower case 6b in the drawing.
 The control device 50 of Modification 2 stores four reference moving images corresponding to the imaging devices 21 to 24, respectively.
 The change information generation unit 56 may generate four change information sets corresponding to the four reference moving images, respectively. The robot 30a changes the position and posture of the upper case 6a with six degrees of freedom, and changes the shape of the upper case 6a with two degrees of freedom. Therefore, the change information generation unit 56 generates, for the upper case 6a, the first change information and the second change information corresponding to each of the eight degrees of freedom.
 As the reference moving images, moving images are prepared in advance that show the shape of the upper case 6a being deformed so that it can easily engage with the four engagement claws of the lower case 6b, and the upper case 6a then being engaged with the lower case 6b. The calculation unit 58 thereby calculates the control amounts for the control rods 32a and 32b and for the control rods 32c and 32d for bringing the shape of the upper case 6a on the real images closer to its shape on the target frame. The upper case 6a can thereby be deformed in the same manner as in the reference moving images. As a result, the upper case 6a can easily be engaged with the lower case 6b.
 <D1-3. Modification 3>
 The calculation unit 58 may calculate, by another method, the control amount for each of the plurality of degrees of freedom for bringing the states of the target objects on the real images captured by the imaging devices 21 and 22 closer to their states on the target frame.
 As shown in step S13 of FIG. 10, the change information generation unit 56 calculates the control amounts of the plurality of degrees of freedom for bringing the state of the target object on the k-th frame closer to its state on the (k+1)-th frame. The calculation unit 58 may store the control amounts calculated for each pair of consecutive frames as inter-frame control amounts, and use them to calculate the control amount for each of the plurality of degrees of freedom for bringing the state of the target object on the real image closer to its state on the target frame.
 Specifically, the calculation unit 58 calculates, based on the first change information and second change information corresponding to the closest frame, a control amount α for each of the plurality of degrees of freedom for bringing the state of the target object on the real image closer to its state on the closest frame. Further, the calculation unit 58 calculates the sum β of the inter-frame control amounts from the closest frame to the target frame. The calculation unit 58 may then calculate the sum of the control amount α and the sum β as the control amount for bringing the state of the target object on the real image closer to its state on the target frame.
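 A sketch of Modification 3, under the assumption that the inter-frame control amounts are stored as 6-element vectors indexed by frame number:

```python
import numpy as np

def control_amount_via_interframe(alpha, interframe, nearest, target):
    """alpha: 6-DOF control amount bringing the real image to the closest
    frame; interframe[k]: stored control amount from frame k to k+1.
    Returns alpha + beta, where beta sums the inter-frame amounts from the
    closest frame up to (but excluding) the target frame."""
    total = np.asarray(alpha, dtype=float).copy()
    for k in range(nearest, target):
        total += np.asarray(interframe[k], dtype=float)
    return total
```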
 <D1-4. Modification 4>
 The calculation unit 58 may calculate the control amounts of the plurality of degrees of freedom by performing known model predictive control (see Adachi, "Basics of Model Predictive Control", Journal of the Robotics Society of Japan, July 2014, Vol. 32, No. 6, pp. 9-12 (Non-Patent Document 2)).
 Specifically, the target frame selection unit 54 selects, as target frames, a plurality of frames of the teaching range included in the prediction horizon. The calculation unit 58 calculates the control amounts over the control horizon so as to minimize the deviation between the states of the target objects on the target frames and their states on the images captured by the imaging devices 21 and 22 during the prediction horizon.
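 One concrete (assumed) reading of this modification uses the linear image-displacement model implied by the change information and an off-the-shelf optimizer; the matrix J, the quadratic cost, and the use of scipy are illustrative choices, not the patent's prescription.

```python
import numpy as np
from scipy.optimize import minimize

def mpc_control(J, targets, x0, control_horizon, n_dof=6):
    """J: image-space displacement per unit control (stacked feature points
    x DOFs), built from the change information.  targets: stacked target
    feature points for each frame of the prediction horizon.  Returns the
    control sequence over the control horizon that minimises the predicted
    deviation from the target frames."""
    x0 = np.asarray(x0, dtype=float)

    def cost(u_flat):
        u = u_flat.reshape(control_horizon, n_dof)
        x, total = x0, 0.0
        for t, tgt in enumerate(targets):
            if t < control_horizon:
                x = x + J @ u[t]                 # linear prediction model
            total += np.sum((x - np.asarray(tgt, dtype=float)) ** 2)
        return total

    res = minimize(cost, np.zeros(control_horizon * n_dof), method="BFGS")
    return res.x.reshape(control_horizon, n_dof)
```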
 <D1-5. Other modifications>
 The image processing unit 53 may detect the target objects in the target images using 3D CAD data of the target objects.
 The reference moving images (the first reference moving image and the second reference moving image) may be created by CG (computer graphics).
 In the above description, the change information generation unit 56 generates, as the change information, information indicating a mapping that converts the coordinates of the feature points of the target object extracted from the target frame into the coordinates of the corresponding feature points extracted from the captured image. However, the change information generation unit 56 may instead generate, as the change information, information indicating a mapping that converts the coordinates of the feature points of the target object extracted from the captured image into the coordinates of the corresponding feature points extracted from the target frame.
 The control device 50 may display the first reference moving image and the second reference moving image on the display unit 532, accept moving-image editing instructions from the operator, and edit the moving images. For example, the operator may delete unnecessary frames from the first reference moving image and the second reference moving image.
 The robot 30a may change the state (here, the size) of an object that expands and contracts, such as a balloon. In this case, the control device 50 acquires change information indicating the change in the size of the object when the robot 30a is controlled by a unit control amount. The control device then controls the robot 30a based on the change information such that the size of the object on the real image approaches its size on the target frame.
 In the above description, the male connector 2a and the female connector 2b are connected by moving the male connector 2a toward the female connector 2b. However, the male connector 2a may be placed on the stage 31b and the female connector 2b may be gripped by the hand 31a. In this case, the male connector 2a and the female connector 2b are connected by moving the female connector 2b toward the male connector 2a.
 In the above description, the reference moving image storage unit 51 of the control device 50 stores the reference moving images. However, a device external to the control device 50 may store the reference moving images.
 The reference moving image storage unit 51 may store, instead of or in addition to the reference moving images, the coordinates and feature amounts of the feature points of each target object extracted from each frame of the reference moving images. This makes it possible to omit the processing of the image processing unit 53 for each frame of the reference moving images.
 ≪Embodiment 2≫
 <A2. Application example>
 An example of a scene to which the present disclosure is applied will be described with reference to FIG. 20. FIG. 20 is a schematic diagram showing an outline of the control system according to Embodiment 2.
 The control system 1A according to Embodiment 2 solders an electric wire 8e to a pad 8f of a substrate 9 using a soldering iron 8c and a solder feeder 8d in, for example, a production line for industrial products.
 Compared with the control system 1 shown in FIG. 1, the control system 1A differs in that it includes robots 30c to 30f, robot controllers 40c to 40f, and a control device 50A instead of the robots 30a and 30b, the robot controllers 40a and 40b, and the control device 50.
 The imaging devices 21 and 22 image the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f as subjects from mutually different directions.
 The robot 30c is a mechanism for changing the state (here, the position and posture) of the soldering iron 8c, and is, for example, a vertical articulated robot. The robot 30c has, at its tip, a hand 31c that grips the soldering iron 8c, and changes the position and posture of the hand 31c with six degrees of freedom.
 The robot 30d is a mechanism for changing the state (here, the position and posture) of the solder feeder 8d, and is, for example, a vertical articulated robot. The robot 30d has, at its tip, a hand 31d that grips the solder feeder 8d, and changes the position and posture of the hand 31d with six degrees of freedom.
 The robot 30e is a mechanism for changing the state (here, the position and posture) of the electric wire 8e, and is, for example, a vertical articulated robot. The robot 30e has, at its tip, a hand 31e that grips the electric wire 8e, and changes the position and posture of the hand 31e with six degrees of freedom.
 The robot 30f is a mechanism for changing the state (here, the position and posture) of the pad 8f of the substrate 9, and is, for example, an XYθ stage. The robot 30f has a stage 31f on which the substrate 9 is placed, and changes the position and posture of the stage 31f with three degrees of freedom.
 The robots 30c to 30e differ only in the object they grip, and otherwise have the same configuration as the robot 30a shown in FIG. 1. The robot 30f differs only in the object placed on the stage 31f, and otherwise has the same configuration as the robot 30b shown in FIG. 1.
 The robot controllers 40c to 40e control the operations of the robots 30c to 30e, respectively, in accordance with control commands received from the control device 50A. The robot controllers 40c to 40e have the same configuration as the robot controller 40a shown in FIG. 1.
 The robot controller 40f controls the operation of the robot 30f in accordance with control commands received from the control device 50A. The robot controller 40f has the same configuration as the robot controller 40b shown in FIG. 1.
 The control device 50A controls the robots 30c to 30f via the robot controllers 40c to 40f, respectively.
 The control device 50A stores a first reference moving image and a second reference moving image showing samples of the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f. The first reference moving image is a moving image as seen from the position of the imaging device 21. The second reference moving image is a moving image as seen from the position of the imaging device 22.
 FIG. 21 is a diagram showing an example of the reference moving image (the first reference moving image or the second reference moving image) in Embodiment 2. FIG. 21 shows frames 77 to 83 of the reference moving image.
 Frame 77 is the frame at which the state of the pad 8f reaches the desired state. Frame 78 is the frame at which the electric wire 8e arrives above the pad 8f. Frame 79 is the frame at which the tip of the soldering iron 8c contacts the pad 8f. Frame 80 is the frame at which the tip of the solder feeder 8d contacts the tip of the soldering iron 8c. Frame 81 is the frame at which the solder supplied from the solder feeder 8d and melted (hereinafter referred to as the "molten solder 8g") reaches the desired size. Frame 82 is the frame at which the solder feeder 8d is moved away from the pad 8f. Frame 83 is the frame at which the soldering iron 8c is moved away from the pad 8f. Through this series of operations, the electric wire 8e is soldered to the pad 8f.
 For each of the six degrees of freedom, the control device 50A acquires change information indicating the change in the state of the soldering iron 8c on the images obtained from the imaging devices 21 and 22 when the robot 30c is controlled by a unit control amount.
 Likewise, for each of the six degrees of freedom, the control device 50A acquires change information indicating the change in the state of the solder feeder 8d on the images obtained from the imaging devices 21 and 22 when the robot 30d is controlled by a unit control amount.
 For each of the six degrees of freedom, the control device 50A acquires change information indicating the change in the state of the electric wire 8e on the images obtained from the imaging devices 21 and 22 when the robot 30e is controlled by a unit control amount.
 For each of the three degrees of freedom, the control device 50A acquires change information indicating the change in the state of the pad 8f on the images obtained from the imaging devices 21 and 22 when the robot 30f is controlled by a unit control amount.
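 The acquisition of change information can be pictured as measuring a finite-difference image Jacobian, one column per degree of freedom. The following is a minimal sketch of that procedure, assuming hypothetical `robot.move_axis`, `camera.capture`, and `detect_state` helpers that are not part of this disclosure:

```python
import numpy as np

def collect_change_info(robot, camera, detect_state, unit=1.0, dofs=6):
    """Measure, for each degree of freedom, the image-space state change
    produced by a unit control amount (a finite-difference image Jacobian)."""
    columns = []
    for axis in range(dofs):
        before = np.asarray(detect_state(camera.capture()))  # e.g. feature position and angle
        robot.move_axis(axis, +unit)        # apply a unit control amount on one axis
        after = np.asarray(detect_state(camera.capture()))
        robot.move_axis(axis, -unit)        # undo the motion before probing the next axis
        columns.append(after - before)
    return np.stack(columns, axis=1)        # shape: (state_dim, dofs)
```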
 As in Embodiment 1, the control device 50A controls each target robot via the corresponding target robot controller so that the state of each object on the real images captured by the imaging devices 21 and 22 approaches the state of that object on the target frames of the first reference moving image and the second reference moving image. The objects are the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f. The target robot controllers are the robot controllers 40c to 40f. The target robots are the robots 30c to 30f. As a result, the states of the four objects, the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f, change in a coordinated manner in accordance with the first reference moving image and the second reference moving image.
 Furthermore, the control device 50A calculates a first deviation between the state of a first object on the real image and the state of the first object on a first target frame, and a second deviation between the state of a second object on the real image and the state of the second object on a second target frame. The control device 50A controls the robots 30c to 30f such that the time at which the first deviation falls below a first threshold and the time at which the second deviation falls below a second threshold satisfy a prescribed condition. This makes it possible to control the times at which the states of the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f reach their target frames.
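 Taken together, the per-object control cycle amounts to a visual-servoing loop: detect the object's state on the real image, compare it with its state on the target frame, and convert the remaining deviation into a control amount via the change information. A minimal sketch follows, assuming image-space states as vectors and using a pseudo-inverse of the change information; this disclosure does not prescribe this exact computation:

```python
import numpy as np

def control_step(real_state, target_state, change_info, threshold):
    """One control cycle for one object: returns (per-DOF control amounts, reached)."""
    error = np.asarray(target_state, dtype=float) - np.asarray(real_state, dtype=float)
    if np.linalg.norm(error) < threshold:
        return np.zeros(change_info.shape[1]), True   # deviation below threshold
    # Map the image-space deviation to robot control amounts via the
    # pseudo-inverse of the change information (shape: state_dim x dofs).
    return np.linalg.pinv(change_info) @ error, False
```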
 <B2. Specific example>
 <B2-1. Configuration of the control device>
 As in Embodiment 1, the control device 50A has the hardware configuration shown in FIG. 4. A detailed description of the hardware configuration of the control device 50A is therefore omitted.
 FIG. 22 is a block diagram showing the functional configuration of the control device according to Embodiment 2. As shown in FIG. 22, the control device 50A differs from the control device 50 shown in FIG. 5 in that it includes a target frame selection unit 54A in place of the target frame selection unit 54, and includes a first control unit 55c, a second control unit 55d, a third control unit 55e, and a fourth control unit 55f in place of the first control unit 55a and the second control unit 55b.
 Note that the reference moving image storage unit 51 stores the first reference moving image and the second reference moving image showing samples of the four objects (the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f).
 The teaching range selection unit 52 selects a teaching range for each of the four objects (the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f).
 The image processing unit 53 detects the four objects (the soldering iron 8c, the solder feeder 8d, the electric wire 8e, and the pad 8f) from the target image.
 Furthermore, the image processing unit 53 also detects the molten solder 8g (see FIG. 21) on the pad 8f from the target image as an object. The image processing unit 53 creates a template of the molten solder 8g in advance and extracts feature points and feature amounts of the molten solder 8g from the target image. Note that the state (here, the size) of the molten solder 8g changes according to the amount of melted solder. For this reason, it is preferable that the image processing unit 53 performs color extraction and labeling on the target image and extracts SIFT features, which are robust to scaling, as feature amounts. This makes it easier for the image processing unit 53 to detect the molten solder 8g in the target image.
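 As one concrete illustration, the color extraction, labeling, and SIFT extraction described above could look as follows in OpenCV; the HSV range and the choice of the largest blob are illustrative assumptions, not values from this disclosure:

```python
import cv2
import numpy as np

def detect_molten_solder(bgr_image):
    """Color extraction + labeling, then SIFT features restricted to the blob."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    # Color extraction: keep pixels resembling molten solder (assumed range).
    mask = cv2.inRange(hsv, (0, 0, 180), (180, 60, 255))
    # Labeling: take the largest connected component as the solder region.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n < 2:
        return None, None                   # no candidate region found
    blob = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    blob_mask = np.uint8(labels == blob) * 255
    # SIFT features are comparatively robust to the blob's changing scale.
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    return cv2.SIFT_create().detectAndCompute(gray, blob_mask)
```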
 <B2-2. Target frame selection unit>
 Like the target frame selection unit 54 of Embodiment 1, the target frame selection unit 54A selects target frames from the first reference moving image and the second reference moving image. However, as in Modification 1 of Embodiment 1, the target frame selection unit 54A accepts the designation of mandatory pass frames and of the object that must pass through the state indicated by each mandatory pass frame (the object corresponding to the mandatory pass frame).
 Furthermore, when designations of a plurality of mandatory pass frames are accepted, the target frame selection unit 54A can accept a time difference between two consecutive mandatory pass frames. Here, the earlier of two consecutive mandatory pass frames is referred to as the "first related frame" and the later one as the "second related frame". The target frame selection unit 54A accepts the designation of a first related frame, a second related frame, and a time difference.
 When a first related frame, a second related frame, and a time difference have been accepted, the target frame selection unit 54A selects the first related frame as the target frame and then selects the second related frame as the next target frame. That is, when the deviation between the state of the object corresponding to the first related frame on the real image and its state on the first related frame falls below the threshold, the target frame selection unit 54A selects the second related frame as the target frame.
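 A minimal sketch of this selection rule follows, assuming a simple Euclidean deviation between image-space states (the actual deviation in this disclosure is computed from feature points) and a mapping from first to second related frames:

```python
import numpy as np

def deviation(real_state, frame_state):
    # Illustrative deviation measure: Euclidean distance between image-space states.
    return float(np.linalg.norm(np.asarray(real_state) - np.asarray(frame_state)))

def next_target_frame(current, real_state, frame_state, related_pairs, threshold):
    """related_pairs maps each first related frame index to its second related frame."""
    if deviation(real_state, frame_state) >= threshold:
        return current                    # keep servoing toward the current target frame
    if current in related_pairs:
        return related_pairs[current]     # e.g. {79: 80, 81: 82} in the FIG. 23 example
    return current + 1                    # otherwise advance frame by frame
```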
 FIG. 23 is a diagram showing an example of a screen for designating mandatory pass frames, related frames, and time differences. The control device 50A displays a screen such as that shown in FIG. 23 on the display unit 532 and accepts the designation of mandatory pass frames, the objects corresponding to the mandatory pass frames, first related frames, second related frames, and time differences. The operator operates the input device 534 to designate the mandatory pass frames, the objects corresponding to them, the first related frames, the second related frames, and the time differences.
 In the example shown in FIG. 23, the teaching range selection unit 52 selects frames 83 and 82 as the last frames of the teaching ranges corresponding to the pad 8f and the electric wire 8e, respectively.
 The target frame selection unit 54A accepts frames 79 and 83 as the mandatory pass frames corresponding to the object "soldering iron 8c". It accepts frames 80 and 82 as the mandatory pass frames corresponding to the object "solder feeder 8d", and frame 81 as the mandatory pass frame corresponding to the object "molten solder 8g".
 The target frame selection unit 54A accepts an instruction designating frames 79 and 80, two consecutive mandatory pass frames, as a first related frame and a second related frame with a time difference of "3 seconds". It further accepts an instruction designating frames 81 and 82, also two consecutive mandatory pass frames, as a first related frame and a second related frame with a time difference of "0.5 seconds".
 <B2-3. Control units>
 The first control unit 55c controls the robot 30c via the robot controller 40c to change the state of the soldering iron 8c.
 The second control unit 55d controls the robot 30d via the robot controller 40d to change the state of the solder feeder 8d.
 The third control unit 55e controls the robot 30e via the robot controller 40e to change the state of the electric wire 8e.
 The fourth control unit 55f controls the robot 30f via the robot controller 40f to change the state of the pad 8f.
 As in Embodiment 1, each of the first control unit 55c, the second control unit 55d, the third control unit 55e, and the fourth control unit 55f includes a change information generation unit 56, a change information storage unit 57, a calculation unit 58, a command unit 59, and an end determination unit 60 (see FIG. 7).
 However, the object, target robot, and target robot controller handled by each unit differ from those in Embodiment 1. The object is the soldering iron 8c for the first control unit 55c, the solder feeder 8d for the second control unit 55d, the electric wire 8e for the third control unit 55e, and the pad 8f for the fourth control unit 55f. The target robot is the robot 30c for the first control unit 55c, the robot 30d for the second control unit 55d, the robot 30e for the third control unit 55e, and the robot 30f for the fourth control unit 55f. The target robot controller is the robot controller 40c for the first control unit 55c, the robot controller 40d for the second control unit 55d, the robot controller 40e for the third control unit 55e, and the robot controller 40f for the fourth control unit 55f.
 Furthermore, in Embodiment 2 the calculation unit 58 has the following function in addition to the functions of Embodiment 1. Specifically, when the target frame selection unit 54A has accepted the designation of a first related frame, a second related frame, and a time difference, the calculation unit 58 adjusts the control amount so as to satisfy the time difference.
 For example, as shown in FIG. 23, when the target frame selection unit 54A has accepted the designation of the first related frame "frame 79", the second related frame "frame 80", and the time difference "3 seconds", the calculation unit 58 adjusts the control amount as follows. For the object corresponding to the first related frame, the calculation unit 58 computes the scheduled arrival time as the time at which the deviation between the state on the real image and the state on the first related frame fell below the threshold, plus the designated time difference. The calculation unit 58 calculates, for each degree of freedom, the control amount for bringing the state of the object corresponding to the second related frame on the real image closer to its state on the second related frame. The calculation unit 58 then adjusts the control amount by multiplying the calculated control amount by imaging period / (scheduled arrival time - current time).
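 A minimal sketch of this adjustment follows, assuming times in seconds and a raw control amount that would close the whole remaining deviation in one step:

```python
def adjust_control(raw_control, imaging_period, scheduled_arrival, now):
    """Spread the raw control amount over the cycles remaining until arrival."""
    remaining = scheduled_arrival - now
    if remaining <= imaging_period:
        return raw_control                # no time left to spread: apply as-is
    return raw_control * (imaging_period / remaining)

# Example: with a 0.1 s imaging period and 3 s remaining, each cycle applies
# about 1/30 of the correction, so the state arrives roughly on schedule.
```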
 <C2. Operation example>
 As in Embodiment 1, the control device 50A according to Embodiment 2 controls the target robots so as to change the states of the objects along the reference moving images, following the flowcharts shown in FIGS. 12 and 15. However, the target frame selection unit 54A according to Embodiment 2 performs the target frame selection process according to the flowchart shown in FIG. 24.
 FIG. 24 is a flowchart showing an example of the flow of the target frame selection process in Embodiment 2. The flowchart shown in FIG. 24 differs from the flowchart shown in FIG. 17 only in that it includes steps S81 and S82. Therefore, only steps S81 and S82 are described below.
 If YES in step S55, the process proceeds to step S81. In step S81, the target frame selection unit 54A determines whether the target frame is a first related frame indicated by a prescribed condition.
 If the target frame is a first related frame (YES in step S81), then in step S82 the target frame selection unit 54A selects the second related frame corresponding to that first related frame as the target frame. After step S82, the target frame selection process ends.
 If the target frame is not a first related frame (NO in step S81), the target frame selection process proceeds to step S57.
 Through the above processing, the time at which the deviation between the state of the first object on the real image and its state on the first target frame falls below the threshold, and the time at which the deviation between the state of the second object on the real image and its state on the second target frame falls below the threshold, satisfy the prescribed condition.
 For example, when the first related frame "frame 79", the second related frame "frame 80", and the time difference "3 seconds" are designated as in the example shown in FIG. 23, the target robots are controlled as follows.
 Let t1 be the time at which the deviation between the state of the soldering iron 8c on the real image and its state on frame 79 falls below the threshold Tha, and let t2 be the time at which the deviation between the state of the solder feeder 8d on the real image and its state on frame 80 falls below the threshold Thd. The control device 50A then controls the robot 30d so as to satisfy the prescribed condition that (t2 - t1) equals 3 seconds. As a result, the solder feeder 8d is brought into contact with the soldering iron 8c only after the pad 8f has been sufficiently heated.
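 A minimal sketch of monitoring this condition follows; the object names, frame numbers, and tolerance are illustrative assumptions:

```python
crossing_times = {}

def record_crossing(obj, frame_id, dev, threshold, now):
    """Remember the first time an object's deviation from a frame falls below threshold."""
    if dev < threshold and (obj, frame_id) not in crossing_times:
        crossing_times[(obj, frame_id)] = now

def condition_met(required_diff=3.0, tol=0.1):
    t1 = crossing_times.get(("soldering_iron_8c", 79))
    t2 = crossing_times.get(("solder_feeder_8d", 80))
    return t1 is not None and t2 is not None and abs((t2 - t1) - required_diff) <= tol
```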
 Furthermore, when the first related frame "frame 81", the second related frame "frame 82", and the time difference "0.5 seconds" are designated as in the example shown in FIG. 23, the target robots are controlled as follows.
 Let t3 be the time at which the deviation between the state of the molten solder 8g on the real image and its state on frame 81 falls below the threshold Thg, and let t4 be the time at which the deviation between the state of the solder feeder 8d on the real image and its state on frame 82 falls below the threshold Thd. The control device 50A controls the robot 30d so as to satisfy the prescribed condition that (t4 - t3) equals 0.5 seconds. Here, the state of the molten solder 8g refers to its size. The target frame is thus updated to frame 82 only after the deviation between the state of the molten solder 8g on the real image and its state on frame 81 has fallen below the threshold Thg. Once the molten solder 8g reaches the desired size, the solder feeder 8d can therefore be moved away from the soldering iron 8c immediately, maintaining the size of the molten solder 8g.
 Also, when frame 83, which follows frame 82, is designated as a mandatory pass frame as in the example shown in FIG. 23, the target robots are controlled as follows.
 Frame 83 is selected as the target frame only after the deviation between the state of the solder feeder 8d on the real image and its state on frame 82 has fallen below the threshold Thd. Therefore, the soldering iron 8c is moved away from the pad 8f only after the solder feeder 8d has moved sufficiently far from the soldering iron 8c. This avoids unintended contact between the soldering iron 8c and the solder feeder 8d.
 <D2-1. Modification 1>
 In the above description, the control system 1A solders the electric wire 8e to the pad 8f using the soldering iron 8c and the solder feeder 8d. However, the control system 1A may instead assemble a different set of four objects.
 FIG. 25 is a schematic diagram showing the objects of the control system according to Modification 1 of Embodiment 2. The control system 1A according to Modification 1 joins cylindrical members 10e and 10f with screws 10c and 10d, for example on a production line for industrial products.
 Screw holes 11a and 11b are formed in the cylindrical member 10e, and screw holes 12a and 12b are formed in the cylindrical member 10f. With the screw hole 11a overlapping the screw hole 12a and the screw hole 11b overlapping the screw hole 12b, the screw 10c is inserted into the screw holes 11a and 12a, and the screw 10d is inserted into the screw holes 11b and 12b.
 The screw 10c is gripped by the hand 31c of the robot 30c (see FIG. 20). The screw 10d is gripped by the hand 31d of the robot 30d. The cylindrical member 10e is gripped by the hand 31e of the robot 30e. The cylindrical member 10f is placed on the stage 31f of the robot 30f.
 The imaging devices 21 and 22 are arranged at positions from which the screw holes 11a and 12a and the screw 10c can be imaged. However, the imaging devices 21 and 22 cannot image the screw holes 11b and 12b and the screw 10d. Therefore, the control system 1A according to Modification 1 further includes imaging devices 23 and 24, which are arranged at positions from which the screw holes 11b and 12b and the screw 10d can be imaged.
 The control device 50A of Modification 1 stores four reference moving images corresponding to the imaging devices 21 to 24, respectively.
 FIG. 26 is a diagram showing an example of the reference moving image in Modification 1 of Embodiment 2. FIG. 26 shows frames 84 to 86 of the reference moving image. Frame 84 is the frame at which the cylindrical member 10f reaches the desired position and posture. Frame 85 is the frame at which the screw hole 11a of the cylindrical member 10e overlaps the screw hole 12a of the cylindrical member 10f. Frame 86 is the frame immediately before the screw 10c is inserted into the screw holes 11a and 12a.
 The change information generation unit 56 of Modification 1 may generate four change information sets corresponding to the four reference moving images, respectively. The calculation unit 58 may then calculate the control amount for each degree of freedom of the target robot based on the four change information sets.
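 One plausible way to use the four change information sets together is sketched below, under the assumption that each set is a matrix mapping robot degrees of freedom to image-space changes for one camera and that the sets are combined by least squares; this disclosure does not fix the exact computation:

```python
import numpy as np

def control_from_views(change_sets, deviations):
    """change_sets: one (state_dim_k, dofs) matrix per camera;
    deviations: one (state_dim_k,) target-minus-current vector per camera."""
    J = np.vstack(change_sets)                   # stack all camera views
    e = np.concatenate(deviations)
    control, *_ = np.linalg.lstsq(J, e, rcond=None)
    return control                               # per-DOF control amounts
```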
 Note that two more screw holes may be provided in each of the cylindrical members 10e and 10f. In that case, the cylindrical members 10e and 10f are joined by four screws.
 FIG. 27 is a diagram showing an arrangement example of four imaging devices. In the example shown in FIG. 27, the imaging device 21 is arranged at a position from which the two screws 10c and 10g can be imaged; that is, the screws 10c and 10g lie within the visual field range 21a of the imaging device 21. The imaging device 22 is arranged at a position from which the two screws 10c and 10h can be imaged; that is, the screws 10c and 10h lie within the visual field range 22a of the imaging device 22. The imaging device 23 is arranged at a position from which the two screws 10d and 10g can be imaged; that is, the screws 10d and 10g lie within the visual field range 23a of the imaging device 23. The imaging device 24 is arranged at a position from which the two screws 10d and 10h can be imaged; that is, the screws 10d and 10h lie within the visual field range 24a of the imaging device 24.
 <D2-2. Modification 2>
 FIG. 28 is a schematic diagram showing the objects of the control system according to Modification 2 of Embodiment 2. The control system 1A according to Modification 2 welds two cylindrical members 13e and 13f to each other using a welding torch 13c and a welding rod 13d, for example on a production line for industrial products.
 The welding torch 13c is gripped by the hand 31c of the robot 30c (see FIG. 20). The welding rod 13d is gripped by the hand 31d of the robot 30d. The cylindrical member 13e is gripped by the hand 31e of the robot 30e. The cylindrical member 13f is placed on the stage 31f of the robot 30f.
 <D2-3. Modification 3>
 The target frame selection unit 54A may accept a time difference of "not designated" for a first related frame and a second related frame. In this case, the frames between the first related frame and the second related frame are not selected as target frames and are skipped. When the time difference "not designated" is accepted, the calculation unit 58 does not adjust the control amount.
 <D2-4. Modification 4>
 The target frame selection unit 54A may accept the designation of a group of consecutive frames of the reference moving image as an operation group. For example, when the same operation is to be repeated multiple times, the operator designates the frame group corresponding to that operation as an operation group. The operator also designates the number of repetitions of the operation group.
 The target frame selection unit 54A then designates the frames included in the operation group as target frames repeatedly, as many times as the designated number of repetitions.
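 A minimal sketch of expanding such an operation group into the resulting target-frame sequence:

```python
def expand_frames(frames, group_start, group_end, repetitions):
    """Repeat frames[group_start : group_end + 1] the designated number of times."""
    group = frames[group_start:group_end + 1]
    return frames[:group_start] + group * repetitions + frames[group_end + 1:]

# Example: expand_frames([77, 78, 79, 80], 1, 2, 3)
# -> [77, 78, 79, 78, 79, 78, 79, 80]
```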
 <D2-5. Other Modifications>
 <D1-5. Other Modifications> described in Embodiment 1 may also be applied to Embodiment 2.
 ≪Embodiment 3≫
 An example of a scene to which the present disclosure is applied will be described with reference to FIG. 29. FIG. 29 is a schematic diagram showing an outline of a control system according to Embodiment 3. In Embodiment 3, unlike Embodiments 1 and 2, one of the plurality of objects is installed at a fixed position, and the states of the imaging devices 21 and 22 are changed by a robot.
 The control system 1B according to Embodiment 3 sequentially processes processing target portions 15 and 16 of a large member 14j using processing tools 14h and 14i, for example on a production line for industrial products. The large member 14j is, for example, the housing of a large apparatus or an automobile body. The processing tools 14h and 14i are, for example, a drill or an electric file.
 The control system 1B differs from the control system 1 shown in FIG. 1 in that it includes robots 30h, 30i, and 30j, robot controllers 40h, 40i, and 40j, and a control device 50B in place of the robots 30a and 30b, the robot controllers 40a and 40b, and the control device 50.
 The robot 30h is a mechanism for changing the state (here, the position and posture) of the processing tool 14h, and is, for example, a vertical articulated robot. The robot 30h has, at its tip, a hand 31h that supports the processing tool 14h, and changes the position and posture of the hand 31h with a plurality of degrees of freedom. The robot 30h further includes a pedestal 33h that is movable along a rail 34 in the direction of arrow AR.
 The robot 30i is a mechanism for changing the state (here, the position and posture) of the processing tool 14i, and is, for example, a vertical articulated robot. The robot 30i has, at its tip, a hand 31i that supports the processing tool 14i, and changes the position and posture of the hand 31i with a plurality of degrees of freedom. The robot 30i further includes a pedestal 33i that is movable along the rail 34 in the direction of arrow AR.
 The robot 30j is a mechanism for changing the states (here, the positions and postures) of the imaging devices 21 and 22, and is, for example, a vertical articulated robot. The robot 30j has, at its tip, a hand 31j that supports the imaging devices 21 and 22, and changes the position and posture of the hand 31j with a plurality of degrees of freedom. The robot 30j further includes a pedestal 33j that is movable along the rail 34 in the direction of arrow AR.
 In FIG. 29, the pedestals 33h, 33i, and 33j move along the common rail 34. However, a separate rail may be provided for each of the pedestals 33h, 33i, and 33j, with each pedestal moving along its corresponding rail.
 The robot controllers 40h, 40i, and 40j control the operations of the robots 30h, 30i, and 30j, respectively, in accordance with control commands received from the control device 50B. In accordance with the control commands from the control device 50B, the robot controllers 40h, 40i, and 40j change the states of the hands 31h, 31i, and 31j, respectively, and move the pedestals 33h, 33i, and 33j, respectively.
 As in Embodiment 1, the control device 50B has the hardware configuration shown in FIG. 4. A detailed description of the hardware configuration of the control device 50B is therefore omitted.
 FIG. 30 is a block diagram showing the functional configuration of the control device according to Embodiment 3. As shown in FIG. 30, the control device 50B differs from the control device 50 shown in FIG. 5 in that it includes a first control unit 55h, a second control unit 55i, and a third control unit 55j in place of the first control unit 55a and the second control unit 55b.
 Note that the reference moving image storage unit 51 stores a first reference moving image and a second reference moving image showing samples of the processing tools 14h and 14i and the large member 14j.
 The first reference moving image and the second reference moving image show, for example, the following first to third scenes in order. The first scene is a scene in which the processing tools 14h and 14i process the processing target portion 15 while the processing target portion 15 of the large member 14j is at a fixed position on the image. The second scene is a scene in which the large member 14j moves on the image and the processing target portion 16 of the large member 14j moves to a fixed position on the image. The third scene is a scene in which the processing tools 14h and 14i process the processing target portion 16 while the processing target portion 16 of the large member 14j is at a fixed position on the image.
 The teaching range selection unit 52 selects a teaching range for each of the processing tool 14h, the processing tool 14i, and the large member 14j.
 The image processing unit 53 detects the processing tool 14h, the processing tool 14i, and the large member 14j from the target image. Since the large member 14j is large, only part of it falls within the fields of view of the imaging devices 21 and 22. The image processing unit 53 therefore detects a pattern formed on the surface of the large member 14j.
 The first control unit 55h controls the robot 30h via the robot controller 40h to change the state of the processing tool 14h.
 The second control unit 55i controls the robot 30i via the robot controller 40i to change the state of the processing tool 14i.
 The third control unit 55j controls the robot 30j via the robot controller 40j to change the states of the imaging devices 21 and 22.
 As in Embodiment 1, each of the first control unit 55h, the second control unit 55i, and the third control unit 55j includes a change information generation unit 56, a change information storage unit 57, a calculation unit 58, a command unit 59, and an end determination unit 60 (see FIG. 7).
 However, the object, target robot, and target robot controller handled by each unit differ from those in Embodiment 1. The object is the processing tool 14h for the first control unit 55h, the processing tool 14i for the second control unit 55i, and the large member 14j for the third control unit 55j. The target robot is the robot 30h for the first control unit 55h, the robot 30i for the second control unit 55i, and the robot 30j for the third control unit 55j. The target robot controller is the robot controller 40h for the first control unit 55h, the robot controller 40i for the second control unit 55i, and the robot controller 40j for the third control unit 55j.
 With the robot 30j stationary, the change information generation units 56 of the first control unit 55h and the second control unit 55i generate first change information indicating the relationship between a unit control amount of the target robot and the change in the state of the object on the real image captured by the imaging device 21. Likewise, with the robot 30j stationary, they generate second change information indicating the relationship between a unit control amount of the target robot and the change in the state of the object on the real image captured by the imaging device 22.
 Specifically, the change information generation unit 56 of the first control unit 55h generates the first change information and the second change information with the imaging devices 21 and 22 in the state in which the processing target portion 15 of the large member 14j is at its fixed position on the image. The change information generation unit 56 of the second control unit 55i generates the first change information and the second change information with the imaging devices 21 and 22 in the state in which the processing target portion 16 of the large member 14j is at its fixed position on the image.
 The state of the large member 14j itself is not changed by the robot 30j. However, because the robot 30j changes the states of the imaging devices 21 and 22, the state of the large member 14j on the real images of the imaging devices 21 and 22 does change. The change information generation unit 56 of the third control unit 55j therefore generates first change information indicating the relationship between a unit control amount of the robot 30j and the change in the state of the large member 14j on the real image captured by the imaging device 21, and second change information indicating the same relationship for the real image captured by the imaging device 22.
 The processing of the calculation unit 58, the command unit 59, and the end determination unit 60 in the third control unit 55j is the same as in the first control unit 55a and the second control unit 55b of Embodiment 1.
 On the other hand, the calculation units 58 of the first control unit 55h and the second control unit 55i calculate the control amount of the target robot for bringing the state of the object on the real image closer to its state on the target frame only when the following start condition is satisfied.
 Start condition: the deviation between the state of the large member 14j on the real image and the state of the large member 14j on the target frame is less than the threshold Thj.
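 A minimal sketch of this gating follows, with `solve_control` standing in for the conversion from an image-space deviation to per-degree-of-freedom control amounts (an illustrative assumption):

```python
import numpy as np

def tool_control_amount(tool_state, tool_target, member_deviation, thj, solve_control):
    """Return control amounts for a tool robot (30h or 30i), or None to keep it idle."""
    if member_deviation >= thj:
        return None                       # start condition not met: robot stays stopped
    error = np.asarray(tool_target, dtype=float) - np.asarray(tool_state, dtype=float)
    return solve_control(error)           # e.g. via the change information, as above
```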
 The processing of the command unit 59 and the end determination unit 60 in the first control unit 55h and the second control unit 55i is the same as in the first control unit 55a and the second control unit 55b of Embodiment 1.
 As in Embodiment 1, the control device 50B according to Embodiment 3 controls the target robots so that the states of the objects change in accordance with the first reference moving image and the second reference moving image, following the flowchart in FIG. 12.
 Furthermore, in Embodiment 3 the third control unit 55j performs the processing of the subroutine of step S46 shown in FIG. 12 according to the flowchart shown in FIG. 15, as in Embodiment 1. The first control unit 55h and the second control unit 55i, however, perform the processing of the subroutine of step S46 shown in FIG. 12 according to the flowchart shown in FIG. 31.
 FIG. 31 is a flowchart showing the flow of processing of the first control unit and the second control unit of Embodiment 3. As shown in FIG. 31, the processing flow of the first control unit 55h and the second control unit 55i of Embodiment 3 differs from the flowchart shown in FIG. 15 in that it includes step S90. Therefore, only step S90 is described below.
 In step S90, it is determined whether the deviation between the state of the large member 14j on the real image and the state of the large member 14j on the target frame is less than the threshold Thj. If the deviation is equal to or greater than the threshold Thj (NO in step S90), the processing ends. If the deviation is less than the threshold Thj (YES in step S90), steps S61 to S65 are performed.
 In this way, when the deviation between the state of the large member 14j on the real image and its state on the target frame exceeds the threshold Thj, the control device 50B controls only the robot 30j. When the deviation is less than the threshold Thj, the control device 50B controls each of the robots 30h to 30j.
 For example, when the first reference moving image and the second reference moving image show the first to third scenes described above in order, the robots 30h to 30j are controlled as follows. While a target frame is selected from the first scene, the robots 30h and 30i remain stopped until the imaging devices 21 and 22 have moved so that the processing target portion 15 of the large member 14j is at its fixed position on the image. Once the processing target portion 15 of the large member 14j is at its fixed position on the image, the first control unit 55h and the second control unit 55i control their target robots so that the states of the objects on the real images approach their states on the target frame. As a result, the processing tools 14h and 14i process the processing target portion 15. During this time as well, the third control unit 55j controls the robot 30j so that the state of the large member 14j on the real images approaches its state on the target frame. In the first scene, however, the state of the large member 14j is constant, so the robot 30j hardly operates and the states of the imaging devices 21 and 22 remain substantially constant.
 While a target frame is selected from the second scene described above, the robot 30j is controlled so that the state of the large member 14j on the real images changes, and the imaging devices 21 and 22 move. The robots 30h and 30i remain stopped until the imaging devices 21 and 22 have moved so that the processing target portion 16 of the large member 14j is at its fixed position on the image.
 When the processing target portion 16 of the large member 14j reaches its fixed position on the image and a target frame is selected from the third scene described above, the first control unit 55h and the second control unit 55i control their target robots so that the states of the objects on the real images approach their states on the target frame. As a result, the processing tools 14h and 14i process the processing target portion 16. During this time as well, the third control unit 55j controls the robot 30j so that the state of the large member 14j on the real images approaches its state on the target frame. In the third scene, however, the state of the large member 14j is constant, so the robot 30j hardly operates and the states of the imaging devices 21 and 22 remain substantially constant.
 FIG. 32 is a schematic diagram showing an outline of part of a control system according to a modification of Embodiment 3. As shown in FIG. 32, the robots 30h and 30i may be installed on a pedestal 33j included in the robot 30j. In this case, the robots 30h and 30i move together with the robot 30j.
 <D1-5. Other Modifications> described in Embodiment 1 may also be applied to Embodiment 3.
 ≪Note≫
 As described above, Embodiments 1 to 3 and the modifications include the following disclosure.
 (Configuration 1)
 A control system (1, 1A, 1B) comprising:
 first to N-th robots (30a to 30f, 30h to 30j);
 an imaging device (21 to 24) for imaging first to N-th objects (2a, 2b, 6a, 6b, 8c to 8f, 10c to 10f, 13c to 13f, 14h to 14j); and
 a control device (50, 50A, 50B) for controlling the first to N-th robots (30a to 30f, 30h to 30j), wherein
 N is an integer of 2 or more,
 an i-th robot (30a to 30f, 30h to 30j) changes the state of an i-th object,
 i is an integer from 1 to N-1,
 the N-th robot (30a to 30f, 30h to 30j) changes the state of one of the N-th object and the imaging device,
 the other of the N-th object and the imaging device (21 to 24) is installed at a fixed position,
 the control device (50, 50A, 50B) acquires change information for each of the first to N-th objects,
 the change information corresponding to a j-th object indicates a relationship between a control amount of a j-th robot (30a to 30j) and an amount of change in the state of the j-th object on an image of the imaging device,
 j is an integer from 1 to N,
 the control device (50, 50A, 50B) performs:
 a first process of acquiring a real image captured by the imaging device (21 to 24);
 a second process of selecting a target frame from a reference moving image showing samples of the first to N-th objects; and
 a third process of controlling each of the first to N-th robots based on the real image and the target frame, and
 in the third process, the control device (50, 50A, 50B) calculates, based on the change information corresponding to the j-th object, a control amount of the j-th robot (30a to 30f, 30h to 30j) for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controls the j-th robot (30a to 30f, 30h to 30j) in accordance with the calculated control amount.
 (Configuration 2)
 The control system (1, 1A, 1B) according to Configuration 1, wherein the control device (50, 50A, 50B) updates the target frame after the deviation between the state of at least one object among the first to N-th objects on the real image and the state of the at least one object on the target frame has fallen below a threshold.
(Configuration 3)
The control system (1, 1A, 1B) according to Configuration 1, wherein the control device (50, 50A, 50B) selects a first target frame from the reference moving image and then selects a second target frame from the plurality of frames of the reference moving image, and
the control device (50, 50A, 50B) controls the first robot (30a to 30f, 30h to 30j) and the second robot (30a to 30f, 30h to 30j) such that a first time, at which a deviation between the state of the first object on the real image and the state of the first object on the first target frame becomes less than a first threshold, and a second time, at which a deviation between the state of the second object on the real image and the state of the second object on the second target frame becomes less than a second threshold, satisfy a prescribed condition.
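One way to check such a prescribed condition, sketched under the assumption that the condition is a bound on the difference between the two convergence times (the bound `max_skew` and all other names are illustrative):

```python
import time
import numpy as np

def update_convergence_times(t_converged, states_real, states_target, thresholds):
    """Record the first time each object's deviation from its own target
    frame drops below its threshold; t_converged maps robot index -> time."""
    for k, (real, target) in enumerate(zip(states_real, states_target)):
        if k not in t_converged and np.linalg.norm(real - target) < thresholds[k]:
            t_converged[k] = time.monotonic()
    return t_converged

def prescribed_condition_met(t_converged, max_skew=0.2):
    """An assumed prescribed condition: the convergence times differ by at
    most max_skew seconds."""
    if len(t_converged) < 2:
        return False
    return max(t_converged.values()) - min(t_converged.values()) <= max_skew
```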
(Configuration 4)
The control system (1A) according to Configuration 1, wherein the imaging device (21 to 24) images an (N+1)-th object (8g) together with the first to N-th objects,
the reference moving image includes the (N+1)-th object, and
the control device (50A) updates the target frame after a deviation between the state of the (N+1)-th object on the real image and the state of the (N+1)-th object on the target frame becomes less than a threshold.
(Configuration 5)
The control system (1A) according to Configuration 1, wherein the imaging device (21 to 24) images an (N+1)-th object (8g) together with the first to N-th objects,
the reference moving image includes the (N+1)-th object,
the control device (50A) selects a first target frame from the reference moving image and then selects a second target frame from the plurality of frames of the reference moving image, and
the control device (50A) controls the first robot (30d) such that a first time, at which a deviation between the state of the (N+1)-th object on the real image and the state of the (N+1)-th object on the first target frame becomes less than a first threshold, and a second time, at which a deviation between the state of the first object on the real image and the state of the first object on the second target frame becomes less than a second threshold, satisfy a prescribed condition.
(Configuration 6)
The control system (1, 1A) according to Configuration 1, wherein the control device (50, 50A) determines that an abnormality has occurred in the control system when a deviation between the state of at least one of the first to N-th objects on the real image and the state of the at least one object on the target frame does not become less than a threshold within a specified time after the target frame is selected.
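A sketch of this timeout-based anomaly check, assuming `t_frame_selected` is recorded when the target frame is chosen and `timeout_s` stands in for the specified time (both illustrative):

```python
import time

def check_for_anomaly(t_frame_selected, any_object_converged, timeout_s=5.0):
    """Flag an abnormality if no tracked object has converged to the current
    target frame within the specified time of its selection."""
    if not any_object_converged and time.monotonic() - t_frame_selected > timeout_s:
        raise RuntimeError("control system abnormality: target frame not reached in time")
```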
(Configuration 7)
The control system (1B) according to Configuration 1, wherein the N-th robot (30j) changes the state of the imaging device (21, 22), and
the control device (50B), in the third process, controls only the N-th robot (30j) when a deviation between the state of the N-th object (14j) on the real image and the state of the N-th object on the target frame exceeds a threshold, and controls each of the first to N-th robots (30h to 30j) when the deviation is less than the threshold.
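The branching in this configuration can be sketched as follows, assuming the camera-side deviation has already been measured (function and variable names are illustrative):

```python
def select_robots_to_control(dev_camera_side, threshold, camera_robot, object_robots):
    """While the camera-side object is far from its target state, drive only
    the camera-carrying N-th robot; once it is within the threshold, drive
    every robot in the third process."""
    if dev_camera_side > threshold:
        return [camera_robot]
    return object_robots + [camera_robot]
```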
(Configuration 8)
The control system (1, 1A, 1B) according to Configuration 1, wherein the control device (50, 50A, 50B) repeatedly executes a series of processes including the first to third processes, and starts the first process of the next series of processes while performing the third process.
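A minimal sketch of this pipelining, assuming the three processes are available as callables (`capture`, `select_target_frame`, and `control` are illustrative stand-ins):

```python
from concurrent.futures import ThreadPoolExecutor

def run_pipelined(capture, select_target_frame, control, n_cycles):
    """Overlap the first process (image capture) of the next cycle with the
    third process (robot control) of the current cycle."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        pending = pool.submit(capture)             # first process, cycle 0
        for _ in range(n_cycles):
            image = pending.result()               # wait for the capture
            target = select_target_frame(image)    # second process
            pending = pool.submit(capture)         # start the next capture early
            control(image, target)                 # third process runs meanwhile
```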
(Configuration 9)
The control system (1, 1A, 1B) according to Configuration 1, wherein, in the second process, the control device (50, 50A, 50B) selects, as the target frame, a frame of the reference moving image included in a prediction horizon period, and
in the third process, the control device (50, 50A) calculates a control amount of the j-th robot for a control horizon period so as to minimize a deviation between the state of the j-th object on the frames of the reference moving image included in the prediction horizon period and the state of the j-th object on the images of the imaging device during the prediction horizon period.
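A model-predictive sketch of this configuration, assuming the on-image state evolves linearly through the change information (s_{k+1} = s_k + change_info @ u_k) and using a generic optimizer; the solver choice and horizon handling are assumptions, not the disclosed method:

```python
import numpy as np
from scipy.optimize import minimize

def mpc_control(change_info, s0, target_states, n_pred, n_ctrl):
    """Choose control amounts over the control horizon (n_ctrl steps) that
    minimize the predicted deviation from the target-frame states over the
    prediction horizon (n_pred steps), then apply only the first one."""
    dim_u = change_info.shape[1]

    def cost(u_flat):
        u = u_flat.reshape(n_ctrl, dim_u)
        s, total = s0.astype(float), 0.0
        for k in range(n_pred):
            u_k = u[min(k, n_ctrl - 1)]        # hold the last input beyond n_ctrl
            s = s + change_info @ u_k          # predicted on-image state
            total += np.sum((s - target_states[k]) ** 2)
        return total

    res = minimize(cost, np.zeros(n_ctrl * dim_u), method="L-BFGS-B")
    return res.x.reshape(n_ctrl, dim_u)[0]
```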
(Configuration 10)
The control system (1, 1A, 1B) according to any one of Configurations 1 to 8, wherein the state indicates at least one of a position, a posture, a shape, and a size of an object.
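One possible encoding of such a state as a small record; the fields and units are illustrative, and a shape descriptor (e.g. a contour) could be added alongside them:

```python
from dataclasses import dataclass

@dataclass
class ObjectState:
    """An object's state as observed on the image."""
    x: float      # position on the image (pixels)
    y: float
    angle: float  # posture: rotation on the image plane (degrees)
    scale: float  # apparent size, which varies with object size and distance
```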
(Configuration 11)
A control method for controlling first to N-th robots (30a to 30f, 30h to 30j) using an imaging device (21 to 24) for imaging first to N-th objects (2a, 2b, 6a, 6b, 8c to 8f, 10c to 10f, 13c to 13f, 14h to 14j), wherein
N is an integer of 2 or more,
the i-th robot (30a to 30f, 30h to 30j) changes the state of the i-th object,
i is an integer from 1 to N-1,
the N-th robot (30a to 30f, 30h to 30j) changes the state of one of the N-th object and the imaging device (21 to 24),
the other of the N-th object and the imaging device (21 to 24) is installed at a fixed position,
the control method comprises a first step of acquiring change information for each of the first to N-th objects,
the change information corresponding to the j-th object indicates a relationship between a control amount of the j-th robot and an amount of change in the state of the j-th object on an image of the imaging device,
j is an integer from 1 to N,
the control method further comprises:
a second step of acquiring a real image captured by the imaging device (21 to 24);
a third step of selecting a target frame from a reference moving image showing a sample of the first to N-th objects; and
a fourth step of controlling each of the first to N-th robots based on the real image and the target frame, and
the fourth step includes calculating, based on the change information corresponding to the j-th object, a control amount of the j-th robot (30a to 30f, 30h to 30j) for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controlling the j-th robot (30a to 30f, 30h to 30j) according to the calculated control amount.
(Configuration 12)
A program (550) for causing a computer to execute a control method for controlling first to N-th robots (30a to 30f, 30h to 30j) using an imaging device (21 to 24) for imaging first to N-th objects (2a, 2b, 6a, 6b, 8c to 8f, 10c to 10f, 13c to 13f), wherein
N is an integer of 2 or more,
the i-th robot (30a to 30f, 30h to 30j) changes the state of the i-th object,
i is an integer from 1 to N-1,
the N-th robot (30a to 30f, 30h to 30j) changes the state of one of the N-th object and the imaging device (21 to 24),
the other of the N-th object and the imaging device (21 to 24) is installed at a fixed position,
the control method comprises a first step of acquiring change information for each of the first to N-th objects,
the change information corresponding to the j-th object indicates a relationship between a control amount of the j-th robot and an amount of change in the state of the j-th object on an image of the imaging device,
j is an integer from 1 to N,
the control method further comprises:
a second step of acquiring a real image captured by the imaging device (21 to 24);
a third step of selecting a target frame from a reference moving image showing a sample of the first to N-th objects; and
a fourth step of controlling each of the first to N-th robots based on the real image and the target frame, and
the fourth step includes calculating, based on the change information corresponding to the j-th object, a control amount of the j-th robot (30a to 30f, 30h to 30j) for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controlling the j-th robot (30a to 30f, 30h to 30j) according to the calculated control amount.
Although embodiments of the present invention have been described, the embodiments disclosed herein are to be considered in all respects as illustrative and not restrictive. The scope of the present invention is defined by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.
1, 1A, 1B control system, 2a male connector, 2b female connector, 3 wire, 4a, 4g feature point, 5, 9 board, 6a upper case, 6b lower case, 7a, 7b engaging claw, 8c soldering iron, 8d solder feeder, 8e electric wire, 8f pad, 8g molten solder, 10c, 10d, 10g, 10h screw, 10e, 10f, 13e, 13f cylindrical member, 11a, 11b, 12a, 12b screw hole, 13c welding torch, 13d welding rod, 14h, 14i processing tool, 14j large member, 15, 16 processing target portion, 21 to 24 imaging device, 21a to 24a field-of-view range, 30a to 30f, 30h to 30j robot, 31a, 31c to 31e, 31h to 31j hand, 31b, 31f stage, 32a to 32d control rod, 33h, 33i, 33j pedestal, 34 rail, 40a to 40f, 40h to 40j robot controller, 50, 50A, 50B control device, 51 reference moving image storage unit, 52 teaching range selection unit, 53 image processing unit, 54, 54A target frame selection unit, 55a, 55c, 55h first control unit, 55b, 55d, 55i second control unit, 55e, 55j third control unit, 55f fourth control unit, 56 change information generation unit, 57 change information storage unit, 58 calculation unit, 59 command unit, 60 end determination unit, 510 processor, 512 RAM, 514 display controller, 516 system controller, 518 I/O controller, 520 hard disk, 522 camera interface, 522a, 522b image buffer, 524 input interface, 526 robot controller interface, 528 communication interface, 530 memory card interface, 532 display unit, 534 input device, 536 memory card, 550 control program, 571 first change information set, 572 second change information set.

Claims (12)

1.  A control system comprising:
    first to N-th robots;
    an imaging device for imaging first to N-th objects; and
    a control device for controlling the first to N-th robots, wherein
    N is an integer of 2 or more,
    the i-th robot changes the state of the i-th object,
    i is an integer from 1 to N-1,
    the N-th robot changes the state of one of the N-th object and the imaging device,
    the other of the N-th object and the imaging device is installed at a fixed position,
    the control device acquires change information for each of the first to N-th objects,
    the change information corresponding to the j-th object indicates a relationship between a control amount of the j-th robot and an amount of change in the state of the j-th object on an image of the imaging device,
    j is an integer from 1 to N,
    the control device performs:
    a first process of acquiring a real image captured by the imaging device;
    a second process of selecting a target frame from a reference moving image showing a sample of the first to N-th objects; and
    a third process of controlling each of the first to N-th robots based on the real image and the target frame, and
    in the third process, the control device calculates, based on the change information corresponding to the j-th object, a control amount of the j-th robot for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controls the j-th robot according to the calculated control amount.
2.  The control system according to claim 1, wherein the control device updates the target frame after a deviation between the state of at least one of the first to N-th objects on the real image and the state of the at least one object on the target frame becomes less than a threshold.
3.  The control system according to claim 1, wherein the control device selects a first target frame from the reference moving image and then selects a second target frame, and
    the control device controls the first robot and the second robot such that a first time, at which a deviation between the state of the first object on the real image and the state of the first object on the first target frame becomes less than a first threshold, and a second time, at which a deviation between the state of the second object on the real image and the state of the second object on the second target frame becomes less than a second threshold, satisfy a prescribed condition.
4.  The control system according to claim 1, wherein the imaging device images an (N+1)-th object together with the first to N-th objects,
    the reference moving image includes the (N+1)-th object, and
    the control device updates the target frame after a deviation between the state of the (N+1)-th object on the real image and the state of the (N+1)-th object on the target frame becomes less than a threshold.
5.  The control system according to claim 1, wherein the imaging device images an (N+1)-th object together with the first to N-th objects,
    the reference moving image includes the (N+1)-th object,
    the control device selects a first target frame from the reference moving image and then selects a second target frame, and
    the control device controls the first robot such that a first time, at which a deviation between the state of the (N+1)-th object on the real image and the state of the (N+1)-th object on the first target frame becomes less than a first threshold, and a second time, at which a deviation between the state of the first object on the real image and the state of the first object on the second target frame becomes less than a second threshold, satisfy a prescribed condition.
6.  The control system according to claim 1, wherein the control device determines that an abnormality has occurred in the control system when a deviation between the state of at least one of the first to N-th objects on the real image and the state of the at least one object on the target frame does not become less than a threshold within a specified time after the target frame is selected.
7.  The control system according to claim 1, wherein the N-th robot changes the state of the imaging device, and
    the control device, in the third process, controls only the N-th robot when a deviation between the state of the N-th object on the real image and the state of the N-th object on the target frame exceeds a threshold, and controls each of the first to N-th robots when the deviation is less than the threshold.
8.  The control system according to claim 1, wherein the control device repeatedly executes a series of processes including the first to third processes, and starts the first process of the next series of processes while performing the third process.
9.  The control system according to claim 1, wherein, in the second process, the control device selects, as the target frame, a frame of the reference moving image included in a prediction horizon period, and
    in the third process, the control device calculates a control amount of the j-th robot for a control horizon period so as to minimize a deviation between the state of the j-th object on the frames of the reference moving image included in the prediction horizon period and the state of the j-th object on the images of the imaging device during the prediction horizon period.
10.  The control system according to any one of claims 1 to 9, wherein the state indicates at least one of a position, a posture, a shape, and a size of an object.
11.  A control method for controlling first to N-th robots using an imaging device for imaging first to N-th objects, wherein
    N is an integer of 2 or more,
    the i-th robot changes the state of the i-th object,
    i is an integer from 1 to N-1,
    the N-th robot changes the state of one of the N-th object and the imaging device,
    the other of the N-th object and the imaging device is installed at a fixed position,
    the control method comprises a first step of acquiring change information for each of the first to N-th objects,
    the change information corresponding to the j-th object indicates a relationship between a control amount of the j-th robot and an amount of change in the state of the j-th object on an image of the imaging device,
    j is an integer from 1 to N,
    the control method further comprises:
    a second step of acquiring a real image captured by the imaging device;
    a third step of selecting a target frame from a reference moving image showing a sample of the first to N-th objects; and
    a fourth step of controlling each of the first to N-th robots based on the real image and the target frame, and
    the fourth step includes calculating, based on the change information corresponding to the j-th object, a control amount of the j-th robot for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controlling the j-th robot according to the calculated control amount.
12.  A program for causing a computer to execute a control method for controlling first to N-th robots using an imaging device for imaging first to N-th objects, wherein
    N is an integer of 2 or more,
    the i-th robot changes the state of the i-th object,
    i is an integer from 1 to N-1,
    the N-th robot changes the state of one of the N-th object and the imaging device,
    the other of the N-th object and the imaging device is installed at a fixed position,
    the control method comprises a first step of acquiring change information for each of the first to N-th objects,
    the change information corresponding to the j-th object indicates a relationship between a control amount of the j-th robot and an amount of change in the state of the j-th object on an image of the imaging device,
    j is an integer from 1 to N,
    the control method further comprises:
    a second step of acquiring a real image captured by the imaging device;
    a third step of selecting a target frame from a reference moving image showing a sample of the first to N-th objects; and
    a fourth step of controlling each of the first to N-th robots based on the real image and the target frame, and
    the fourth step includes calculating, based on the change information corresponding to the j-th object, a control amount of the j-th robot for bringing the state of the j-th object on the real image closer to the state of the j-th object on the target frame, and controlling the j-th robot according to the calculated control amount.
PCT/JP2019/026959 2018-07-23 2019-07-08 Control system, control method, and program WO2020022041A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-137704 2018-07-23
JP2018137704A JP7115096B2 (en) 2018-07-23 2018-07-23 Control system, control method and program

Publications (1)

Publication Number Publication Date
WO2020022041A1 true WO2020022041A1 (en) 2020-01-30

Family

ID=69180459

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/026959 WO2020022041A1 (en) 2018-07-23 2019-07-08 Control system, control method, and program

Country Status (2)

Country Link
JP (1) JP7115096B2 (en)
WO (1) WO2020022041A1 (en)


Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6763493B1 (en) * 2019-07-19 2020-09-30 日本ゼオン株式会社 Acrylic rubber sheet with excellent storage stability
US11548150B2 (en) 2020-05-29 2023-01-10 Mitsubishi Electric Research Laboratories, Inc. Apparatus and method for planning contact-interaction trajectories
CN112894830A (en) * 2021-03-05 2021-06-04 西安热工研究院有限公司 Intelligent wiring system and wiring method for robot to machine room jumper


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6211905A (en) * 1985-07-10 1987-01-20 Hitachi Ltd Copying controller for robot route
JPH04129951A (en) * 1989-09-04 1992-04-30 Ricoh Co Ltd Automatic document feeder
JPH04167002A (en) * 1990-10-30 1992-06-15 Kiyouhou Seisakusho:Kk Nc control method for robot
JP2002224977A (en) * 2001-01-30 2002-08-13 Nec Corp Controller for robot, control method and robot
JP2005085111A (en) * 2003-09-10 2005-03-31 Fumio Miyazaki Method and device for controlling mechanical system
JP2013146844A (en) * 2012-01-23 2013-08-01 Seiko Epson Corp Device, method, and program for generating teaching image, and device, method, and program for controlling robot
JP2015150636A (en) * 2014-02-13 2015-08-24 ファナック株式会社 Robot system utilizing visual feedback
JP2015223649A (en) * 2014-05-27 2015-12-14 株式会社安川電機 Gear incorporation system and gear incorporation method
JP2017004036A (en) * 2015-06-04 2017-01-05 株式会社寺岡精工 Commodity sales processor
WO2017018113A1 (en) * 2015-07-29 2017-02-02 株式会社オートネットワーク技術研究所 Object handling simulation device, object handling simulation system, method for simulating object handling, manufacturing method for object, and object handling simulation program
JP2018015856A (en) * 2016-07-29 2018-02-01 セイコーエプソン株式会社 Robot, robot control device, and robot system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
2004, Retrieved from the Internet <URL:http://ishikawa-net.ac.jp/lab/E/y_kawai/www/paper/2004/EEhokuriku04.pdf> [retrieved on 20190731] *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3974119A1 (en) * 2020-09-23 2022-03-30 Liebherr-Verzahntechnik GmbH Device for automatically making a plug connection
CN114421258A (en) * 2022-01-26 2022-04-29 中国铁建电气化局集团有限公司 Automatic welding method for signal transmission line
CN114421258B (en) * 2022-01-26 2024-01-12 中国铁建电气化局集团有限公司 Automatic welding method for signal transmission line
WO2023209974A1 (en) * 2022-04-28 2023-11-02 株式会社ニコン Control device, control system, robot system, control method, and computer program
CN115134529A (en) * 2022-06-29 2022-09-30 广联达科技股份有限公司 Method and device for displaying project model in multiple views and readable storage medium

Also Published As

Publication number Publication date
JP7115096B2 (en) 2022-08-09
JP2020015101A (en) 2020-01-30

Similar Documents

Publication Publication Date Title
WO2020022041A1 (en) Control system, control method, and program
JP5165160B2 (en) Robot arm control device and control method, robot, robot arm control program, and integrated electronic circuit
JP3004279B2 (en) Image processing system for optical seam tracker
CN112109075A (en) Control system and control method
JP2013043271A (en) Information processing device, method for controlling the same, and program
CN111565895A (en) Robot system and robot control method
CN113664835B (en) Automatic hand-eye calibration method and system for robot
US20220080581A1 (en) Dual arm robot teaching from dual hand human demonstration
CN109814434B (en) Calibration method and device of control program
US20230219223A1 (en) Programming device
EP4005745A1 (en) Autonomous robot tooling system, control system, control method, and storage medium
JP2015074061A (en) Robot control device, robot system, robot, robot control method and robot control program
CN110815208B (en) Control system, analysis device, and control method
WO2020022040A1 (en) Control system, control method, and program
US20230150142A1 (en) Device and method for training a machine learning model for generating descriptor images for images of objects
CN114670189B (en) Storage medium, and method and system for generating control program of robot
TWI807990B (en) Robot teaching system
CN108711174B (en) Approximate parallel vision positioning system for mechanical arm
CN116749233A (en) Mechanical arm grabbing system and method based on visual servoing
JP5343609B2 (en) Motion trajectory generation device, motion trajectory generation method, robot control device, and robot control method
Kalitsios et al. Vision-enhanced system for human-robot disassembly factory cells: introducing a new screw dataset
JP7112528B2 (en) Work coordinate creation device
JP2020113568A (en) Component mounting device
US20230294291A1 (en) Efficient and robust line matching approach
JP7177239B1 (en) Marker detection device and robot teaching system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19840079

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19840079

Country of ref document: EP

Kind code of ref document: A1