WO2019244638A1 - Positioning system, monitor device, monitor method, and program - Google Patents

Positioning system, monitor device, monitor method, and program

Info

Publication number
WO2019244638A1
WO2019244638A1 (PCT/JP2019/022372)
Authority
WO
WIPO (PCT)
Prior art keywords
trajectory
positioning system
determination unit
target
specified
Prior art date
Application number
PCT/JP2019/022372
Other languages
French (fr)
Japanese (ja)
Inventor
功征 川又
Original Assignee
OMRON Corporation (オムロン株式会社)
Priority date
Filing date
Publication date
Application filed by OMRON Corporation (オムロン株式会社)
Publication of WO2019244638A1 publication Critical patent/WO2019244638A1/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B23 MACHINE TOOLS; METAL-WORKING NOT OTHERWISE PROVIDED FOR
    • B23Q DETAILS, COMPONENTS, OR ACCESSORIES FOR MACHINE TOOLS, e.g. ARRANGEMENTS FOR COPYING OR CONTROLLING; MACHINE TOOLS IN GENERAL CHARACTERISED BY THE CONSTRUCTION OF PARTICULAR DETAILS OR COMPONENTS; COMBINATIONS OR ASSOCIATIONS OF METAL-WORKING MACHINES, NOT DIRECTED TO A PARTICULAR RESULT
    • B23Q17/00 Arrangements for observing, indicating or measuring on machine tools
    • B23Q17/24 Arrangements for observing, indicating or measuring on machine tools using optics or electromagnetic waves
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03F PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
    • G03F9/00 Registration or positioning of originals, masks, frames, photographic sheets or textured or patterned surfaces, e.g. automatically
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/18 Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00 Control of position or direction
    • G05D3/12 Control of position or direction using feedback

Definitions

  • the present technology relates to a positioning system using a visual sensor, a monitoring device, a monitoring method, and a program.
  • Patent Document 1 JP-A-2003-50106 discloses a technique for determining a calibration parameter by performing the following steps (a) to (e).
  • step (a) the table or the imaging unit is moved, and the positional relationship between the table and the imaging unit is calculated.
  • step (b) the table or the imaging unit is moved, and the positional relationship between the table and a predetermined portion of the table is calculated.
  • step (c) the table or the imaging unit is moved, and a correction amount is obtained to correct the positional relationship between the table and a predetermined portion of the table.
  • step (d) the table or the imaging unit is moved, and the positional relationship between the table and the imaging unit is calculated again.
  • steps (c) and (d) are repeated until the correction amount obtained in step (c) becomes equal to or less than a predetermined value.
  • Calibration is performed at the start-up stage of the device.
  • if the state of the moving mechanism changes after calibration, the coordinate system of the visual sensor can no longer be accurately converted into the coordinate system of the moving mechanism.
  • in that case, it is necessary to perform maintenance such as re-calibration, replacement of parts, or adjustment of the machine.
  • conventionally, whether or not maintenance is required has been determined based on criteria that depend on the individual operator. That is, the operator has empirically determined the necessity of maintenance from trends such as an increase in the product defect rate or an increase in the production takt time. As a result, an abnormality in the positioning system may be overlooked.
  • the present invention has been made in view of the above problems, and an object thereof is to provide a positioning system capable of accurately determining the presence or absence of an abnormality, as well as a monitoring device used in the positioning system, and a monitoring method and a program for the positioning system.
  • the moving mechanism moves the object.
  • the visual sensor captures an image of an object, and executes a position specifying process for specifying the position of the object based on the captured image in each imaging cycle.
  • the movement control unit controls the movement mechanism so that the position specified by the visual sensor approaches the target position.
  • the trajectory determination unit determines an ideal trajectory of the target object from the position of the target object specified by the initial position specifying process to the target position.
  • the determining unit determines whether there is an abnormality in the positioning system based on a comparison result between the actual trajectory of the position of the target object and the ideal trajectory specified by the position specifying process after the first position specifying process.
  • according to this disclosure, the determination unit can accurately determine, without depending on personal judgment, the presence or absence of an abnormality that causes a deviation of a calibration parameter, based on the comparison result between the actual trajectory and the ideal trajectory.
  • the visual sensor performs an initial position specifying process when the object is stopped, and performs a subsequent position specifying process when the object is moving. According to this disclosure, the positioning process can be speeded up.
  • the determination unit determines that an abnormality has occurred and notifies a warning when the feature amount indicating the degree of deviation between the actual trajectory and the ideal trajectory exceeds the first threshold. According to the present disclosure, an operator can perform maintenance for removing a cause of a deviation of a calibration parameter at an appropriate timing.
  • the determination unit stops the moving mechanism when the feature amount exceeds the second threshold.
  • the second threshold is larger than the first threshold.
  • the feature amount is a deviation between the position of the object specified by the subsequent position specification processing and the ideal trajectory.
  • the actual trajectory and the ideal trajectory are indicated by information that associates the elapsed time from the start of the control of the moving mechanism by the movement control unit with the position of the object.
  • the feature amount may be the maximum value or the integral value of the deviation between the position on the actual trajectory and the position on the ideal trajectory at the same elapsed time, as sketched below.
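  • As a minimal illustration (not part of the patent text), the following Python sketch computes such a feature amount from two trajectories sampled at the same elapsed times; the function name and the list-of-tuples trajectory format are assumptions made for this example.

```python
# Minimal sketch (assumed data format): each trajectory is a list of
# (elapsed_time, x, y) samples taken at the same elapsed times.
from math import hypot

def deviation_feature(actual, ideal, mode="max"):
    """Return the maximum or time-integrated deviation between two trajectories."""
    deviations = []
    for (t_a, xa, ya), (t_i, xi, yi) in zip(actual, ideal):
        assert abs(t_a - t_i) < 1e-9      # samples must share the elapsed time
        deviations.append(hypot(xa - xi, ya - yi))
    if mode == "max":
        return max(deviations)
    # integral approximated by the rectangle rule over the sampling period
    dt = actual[1][0] - actual[0][0] if len(actual) > 1 else 0.0
    return sum(deviations) * dt
```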
  • the trajectory determination unit determines an ideal trajectory by performing a simulation using the simulation model of the movement mechanism and the movement control unit and the position of the target object specified by the initial position specification processing. According to this disclosure, an ideal trajectory can be accurately determined by using a simulation model.
  • the trajectory determination unit further includes a learning device that has been trained by machine learning, using learning data, to output information indicating the trajectory of the target object from an input position of the target object to the target position.
  • the learning data is data indicating the positions of the target object specified by the visual sensor in positioning processes executed while the positioning system is normal.
  • the trajectory determination unit determines the trajectory indicated by the information output from the learning device as an ideal trajectory by inputting the position of the target object specified by the initial position specification processing to the learning device. According to this disclosure, the trajectory determination unit can determine an ideal trajectory that matches the actual situation.
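  • The patent text does not specify a particular learning algorithm. Purely as an illustrative sketch, such a learning device could be approximated by a nearest-neighbour lookup over trajectories logged during normal operation; the class and method names below are hypothetical and not taken from the patent.

```python
# Illustrative sketch only: a trivial "learning device" that returns the logged
# normal-operation trajectory whose initial position is closest to the input.
from math import hypot

class TrajectoryLookup:
    def __init__(self):
        self._samples = []                       # list of (initial_xy, trajectory)

    def fit(self, initial_positions, trajectories):
        """Store trajectories recorded while the positioning system was normal."""
        self._samples = list(zip(initial_positions, trajectories))

    def predict(self, initial_xy):
        """Return the stored trajectory whose start is nearest to initial_xy."""
        x, y = initial_xy
        _, best = min(self._samples,
                      key=lambda s: hypot(s[0][0] - x, s[0][1] - y))
        return best
```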
  • the determination unit outputs a screen showing the actual trajectory and the ideal trajectory to the display device. According to this disclosure, the operator can grasp the degree of deviation of the calibration parameter by checking the screen.
  • the actual trajectory and the ideal trajectory are indicated by information that associates the elapsed time from the start of the control of the moving mechanism by the movement control unit with the position of the object.
  • the movement mechanism includes a plurality of translation mechanisms that translate in different movement directions.
  • the determination unit decomposes the deviation between the position on the actual trajectory and the position on the ideal trajectory at the same elapsed time into components in the movement directions corresponding to the plurality of translation mechanisms, and outputs to the display device a screen showing the temporal change of each component (see the sketch below).
  • the operator can infer the abnormal position of the moving mechanism by checking the screen of the display device.
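  • As a minimal sketch (assumed function name and data format), the per-axis decomposition described above can be computed as follows for a stage whose translation mechanisms move along X and Y.

```python
# Sketch: decompose the deviation between actual and ideal positions at the
# same elapsed time into per-axis components for display over time.
def per_axis_deviation(actual, ideal):
    """actual, ideal: lists of (elapsed_time, x, y) sampled at the same times."""
    times, dx, dy = [], [], []
    for (t, xa, ya), (_, xi, yi) in zip(actual, ideal):
        times.append(t)
        dx.append(xa - xi)   # component along the X translation mechanism
        dy.append(ya - yi)   # component along the Y translation mechanism
    return times, dx, dy
```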
  • the determination unit outputs a screen showing the transition of the feature amount to the display device. According to this disclosure, an operator can grasp a situation such as aging of the moving mechanism.
  • a monitoring device used for the positioning system includes the trajectory determination unit and the determination unit.
  • according to this disclosure as well, the determination unit can accurately determine the presence or absence of an abnormality that causes a deviation of the calibration parameter based on the comparison result between the actual trajectory and the ideal trajectory, without depending on personal judgment.
  • a method for monitoring a positioning system that performs a positioning process on an object includes first and second steps.
  • the positioning system includes a moving mechanism for moving the object, a visual sensor for imaging the object and executing, for each imaging cycle, a position specifying process for specifying the position of the object based on the captured image, and a movement control unit for controlling the moving mechanism so that the position specified by the visual sensor approaches the target position.
  • the first step is a step of determining an ideal trajectory of the object from the position of the object specified by the initial position specifying process to the target position.
  • the second step is a step of determining the presence or absence of an abnormality in the positioning system based on a comparison result between the ideal trajectory and the actual trajectory of the position of the object specified by the position specifying processes after the first position specifying process. According to this disclosure as well, it is possible to accurately determine the presence or absence of an abnormality that causes a deviation of a calibration parameter.
  • a program for causing a computer to execute the above monitoring method causes the computer to execute the first and second steps. According to this disclosure as well, it is possible to accurately determine the presence or absence of an abnormality that causes a deviation of a calibration parameter.
  • the presence or absence of an abnormality in the positioning system can be accurately determined.
  • FIG. 1 is a schematic diagram illustrating an outline of the positioning system according to the present embodiment.
  • FIG. 2 is a diagram showing an example of an ideal trajectory and an actual trajectory.
  • FIG. 3 is a schematic diagram illustrating a hardware configuration of the image processing device included in the positioning system according to the present embodiment.
  • FIG. 4 is a schematic diagram illustrating a hardware configuration of the controller included in the positioning system according to the embodiment.
  • FIG. 5 is a block diagram illustrating an example of a functional configuration of the positioning system shown in FIG. 1.
  • FIG. 6 is a flowchart illustrating an example of the flow of the positioning process in the controller.
  • FIG. 7 is a flowchart illustrating an example of the flow of the movement control process performed by the movement control unit.
  • FIG. 8 is a diagram illustrating a method of calculating the encoder value EnsX at the imaging time.
  • FIG. 9 is a diagram illustrating an example of changes in the position of a mark in the positioning process of a comparative example and the positioning process according to the embodiment.
  • FIG. 10 is a diagram illustrating an example of changes in the moving speed of the stage device in the positioning process of the comparative example and the positioning process according to the embodiment.
  • FIG. 11 is a flowchart illustrating an example of the flow of the monitoring process in the controller.
  • FIG. 12 is a flowchart showing the contents of the subroutine of the ideal trajectory determination process (step S21) shown in FIG. 11.
  • FIG. 13 is a flowchart showing the contents of the subroutine of the process for determining the presence or absence of an abnormality (step S22) shown in FIG. 11.
  • A further flowchart shows the contents of the subroutine of the determination process according to Modification 2.
  • A further flowchart illustrates the flow of the monitoring process according to Modification 3.
  • A further flowchart shows the contents of the subroutine of the process (step S121) for determining the presence or absence of an abnormality and the level of the abnormality.
  • A further diagram illustrates a functional configuration of the controller according to Modification 5.
  • A further diagram illustrates a configuration of the positioning system according to Modification 6.
  • A further diagram illustrates an example of a screen showing the temporal change of a feature amount (the integral value of the deviation) indicating the degree of deviation between the ideal trajectory and the actual trajectory.
  • FIG. 1 is a schematic diagram showing an outline of a positioning system according to the present embodiment.
  • the positioning system 1 shown in FIG. 1 positions an object (hereinafter, referred to as “target work W”) using image processing.
  • the positioning typically means a process of placing the target work W at its intended position on a production line in a manufacturing process of an industrial product or the like.
  • for example, on a liquid crystal panel production line, the positioning system 1 positions a glass substrate with respect to an exposure mask before the circuit pattern is printed on the glass substrate (exposure process).
  • the positioning system 1 includes a moving mechanism 10, a driver unit 20, a visual sensor 30, a controller 40, and a display device 50.
  • the moving mechanism 10 moves the target workpiece W to be placed.
  • the moving mechanism 10 may have any degree of freedom as long as the mechanism can arrange the target work W at the target position.
  • the moving mechanism 10 is, for example, an XY ⁇ stage that can apply horizontal translation and rotation to the target workpiece W.
  • the moving mechanism 10 in the example shown in FIG. 1 is an XY ⁇ stage, and includes an X stage 11, a Y stage 13, a ⁇ stage 15, and servomotors 12, 14, and 16.
  • the X stage 11 is a translation mechanism that translates along the X direction by driving the servo motor 12.
  • the Y stage 13 is a translation mechanism that translates along the Y direction by driving a servo motor 14.
  • the θ stage 15 is a rotation mechanism that rotates in the θ direction by driving the servo motor 16.
  • the target work W is placed on the ⁇ stage 15.
  • the servo motors 12, 14, 16 are provided with encoders 12E, 14E, 16E, respectively.
  • Each of the encoders 12E, 14E, and 16E generates a pulse signal corresponding to the drive amount (rotation amount) of the corresponding servo motor.
  • the encoder 12E measures the amount of movement of the X stage 11 from the initial position as an encoder value EnX by counting the number of pulses included in the pulse signal. The count value of the number of pulses and the movement amount are related by a predetermined coefficient. Therefore, the encoder 12E can measure the movement amount by multiplying the count value of the pulse number by the coefficient.
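  • For illustration only (the coefficient value and names below are assumptions), converting a pulse count into a movement amount is a single multiplication.

```python
# Sketch: convert an encoder pulse count into a movement amount.
MM_PER_PULSE = 0.001   # assumed coefficient relating the pulse count to travel [mm]

def encoder_value_from_pulses(pulse_count: int) -> float:
    """Movement amount of the stage from its initial position, in millimetres."""
    return pulse_count * MM_PER_PULSE
```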
  • the encoder 14E measures the translation amount of the Y stage 13 in the Y direction from the initial position as an encoder value EnY.
  • the encoder 16E measures the rotational movement amount of the ⁇ stage 15 from the initial position as an encoder value En ⁇ .
  • the driver unit 20 controls the operation of the moving mechanism 10 in accordance with the movement command received from the controller 40 for each control cycle Ts.
  • the driver unit 20 includes servo drivers 22, 24, and 26.
  • the servo drivers 22, 24, 26 acquire the encoder values EnX, EnY, En ⁇ from the servo motors 12, 14, 16 for each control cycle Ts.
  • the control cycle Ts is fixed, for example, 1 ms.
  • the servo driver 22 performs feedback control on the servomotor 12 so that the movement amount of the X stage 11 approaches the movement command.
  • the servo driver 24 performs feedback control on the servo motor 14 so that the movement amount of the Y stage 13 approaches the movement command.
  • the servo driver 26 performs feedback control on the servo motor 16 so that the movement amount of the ⁇ stage 15 approaches the movement command.
  • the driver unit 20 outputs the encoder values EnX, EnY, En ⁇ acquired from the moving mechanism 10 to the controller 40.
  • the visual sensor 30 captures an image of the target work W and executes a position specifying process for specifying the position of the target work W based on the captured image for each imaging cycle Tb.
  • the imaging cycle Tb varies according to the imaging situation and the like, and is, for example, about 60 ms.
  • the visual sensor 30 specifies, as the position of the target work W, the positions of the marks 5a and 5b, which are characteristic portions provided on the target work W.
  • the visual sensor 30 includes an imaging unit 31 installed at a fixed position and an image processing device 32.
  • the imaging unit 31 performs an imaging process of imaging a subject existing in an imaging field of view and generating image data, and captures an image of the target work W.
  • the imaging unit 31 is, for example, a camera.
  • the image processing device 32 performs image analysis on the image data generated by the imaging unit 31 and calculates the coordinates of the positions of the marks 5a and 5b.
  • the coordinates are indicated by a local coordinate system (hereinafter, referred to as a “camera coordinate system”) corresponding to the visual sensor 30.
  • the controller 40 is, for example, a PLC (programmable logic controller) and performs various kinds of factory automation (FA) control.
  • based on the position of the target work W specified by the visual sensor 30, the controller 40 updates the movement command every control cycle Ts and outputs it to the driver unit 20 so that the position of the target work W approaches the target position.
  • specifically, the controller 40 updates the movement command every control cycle Ts so that the positions of the marks 5a and 5b specified by the visual sensor 30 approach the corresponding target positions, and outputs the updated command to the driver unit 20.
  • the coordinates of the positions of the marks 5a and 5b specified by the visual sensor 30 are indicated in the camera coordinate system.
  • the amount of movement of the moving mechanism 10 to be controlled by the controller 40 is indicated in a coordinate system of the moving mechanism 10 (hereinafter referred to as the "machine coordinate system"). Therefore, the controller 40 converts the coordinates of the positions of the marks 5a and 5b from the camera coordinate system into the machine coordinate system using the calibration parameters that associate the camera coordinate system with the machine coordinate system, and generates the movement command MV using the converted coordinates (see the sketch below).
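  • As a minimal sketch of this kind of conversion (the 2x2 matrix A and the offset b stand in for the calibration parameters; their names and values are assumptions), an affine camera-to-machine conversion can be written as follows.

```python
# Sketch: convert a camera-coordinate position into machine coordinates using
# affine calibration parameters (A: 2x2 matrix, b: offset vector).
A = [[1.002, 0.004],
     [-0.003, 0.998]]        # assumed calibration matrix
b = [12.5, -3.2]             # assumed calibration offset [mm]

def camera_to_machine(p_cam):
    """p_cam: (x, y) in the camera coordinate system."""
    x, y = p_cam
    return (A[0][0] * x + A[0][1] * y + b[0],
            A[1][0] * x + A[1][1] * y + b[1])
```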
  • the controller 40 monitors the presence or absence of an abnormality that causes a deviation of the calibration parameter.
  • the “calibration parameter deviation” means a deviation between a currently set calibration parameter and an ideal calibration parameter corresponding to the current state of the moving mechanism 10.
  • the abnormalities that cause the deviation of the calibration parameters include, for example, aging of the moving mechanism 10 and external factors.
  • the controller 40 determines an ideal trajectory on the image of the target work W from the position (initial position) of the target work W specified by the initial position specifying process of the visual sensor 30 to the target position.
  • the ideal trajectory indicates a trajectory on the image of the target work W when it is assumed that the deviation of the calibration parameter is zero.
  • the controller 40 determines an ideal trajectory on the image of the mark 5a from the initial position of the mark 5a specified by the visual sensor 30 to the target position of the mark 5a. Further, the controller 40 determines an ideal trajectory on the image of the mark 5b from the initial position of the mark 5b specified by the visual sensor 30 to the target position of the mark 5b.
  • the controller 40 determines the presence or absence of an abnormality that causes a deviation of the calibration parameter based on a comparison result between the actual trajectory and the ideal trajectory of the position of the target work W specified by the second and subsequent position specifying processes.
  • the determination unit 49 outputs the determination result to the display device 50.
  • the controller 40 controls the moving mechanism 10 so that the positions of the marks 5a and 5b specified by the visual sensor 30 for each imaging cycle Tb approach the corresponding target positions. Therefore, even if the calibration parameter is shifted, each of the marks 5a and 5b of the target work W finally reaches the corresponding target position. However, the actual trajectory deviates from the ideal trajectory because of the deviation of the calibration parameters.
  • FIG. 2 is a diagram showing an example of an ideal trajectory and an actual trajectory.
  • FIG. 2 shows an ideal trajectory and an actual trajectory for one of the marks 5a and 5b. The same applies to the other mark.
  • the position P (0) indicates the initial position of the target mark on the image specified by the first position specifying process.
  • a solid line 70 indicates an ideal trajectory from the position P (0) to the target position SP of the target mark.
  • the solid line 72a shows an example of the actual trajectory of the target mark.
  • a solid line 72b shows another example of the actual trajectory of the target mark.
  • the controller 40 sets, for example, an allowable area 71 including an ideal locus indicated by a solid line 70.
  • the allowable area 71 is determined experimentally or theoretically.
  • when the actual trajectory stays within the allowable area 71, the controller 40 determines that an abnormality that causes a deviation of the calibration parameter has not occurred.
  • for example, the controller 40 determines that no such abnormality has occurred for the actual trajectory indicated by the solid line 72a.
  • when the actual trajectory departs from the allowable area 71, the controller 40 determines that an abnormality that causes a deviation of the calibration parameter has occurred.
  • for example, the determination unit 49 determines that such an abnormality has occurred for the actual trajectory indicated by the solid line 72b.
  • the controller 40 can accurately determine the presence or absence of an abnormality that causes a deviation of the calibration parameter without depending on the personal judgment.
  • FIG. 3 is a schematic diagram illustrating a hardware configuration of the image processing device 32 included in the positioning system 1 according to the present embodiment.
  • the image processing device 32 typically has a structure according to a general-purpose computer architecture, and realizes various types of image processing as described later by executing a program installed in advance by a processor.
  • the image processing device 32 includes a processor 310 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 312, a display controller 314, a system controller 316, an input/output (I/O) controller 318, a hard disk 320, a camera interface 322, an input interface 324, a controller interface 326, a communication interface 328, and a memory card interface 330. These units are connected to one another so as to enable data communication, with the system controller 316 at the center.
  • the processor 310 exchanges programs (codes) and the like with the system controller 316 and executes them in a predetermined order, thereby realizing the intended arithmetic processing.
  • the system controller 316 is connected to the processor 310, the RAM 312, the display controller 314, and the I/O controller 318 via buses, exchanges data with each unit, and governs the processing of the entire image processing device 32.
  • the RAM 312 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds programs read from the hard disk 320, images (image data) captured by the imaging unit 31, processing results for the images, and work data.
  • the processing result for the image includes the position coordinates of the marks 5a and 5b included in the image.
  • the display controller 314 is connected to the display unit 60, and outputs a signal for displaying various information to the display unit 60 according to an internal command from the system controller 316.
  • the I / O controller 318 controls data exchange between a recording medium connected to the image processing apparatus 32 and an external device. More specifically, I / O controller 318 is connected to hard disk 320, camera interface 322, input interface 324, controller interface 326, communication interface 328, and memory card interface 330.
  • the hard disk 320 is typically a non-volatile magnetic storage device, and stores various setting values in addition to a program executed by the processor 310.
  • the camera interface 322 corresponds to an input unit that receives image data generated by photographing the target work W, and mediates data transmission between the processor 310 and the imaging unit 31.
  • the camera interface 322 includes an image buffer for temporarily storing image data from the imaging unit 31.
  • the input interface 324 mediates data transmission between the processor 310 and input devices such as a keyboard 334, a mouse, a touch panel, and a dedicated console.
  • the controller interface 326 mediates data transmission between the processor 310 and the controller 40.
  • the communication interface 328 mediates data transmission between the processor 310 and another personal computer or server device (not shown).
  • the communication interface 328 typically conforms to Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
  • the memory card interface 330 mediates data transmission between the processor 310 and the recording medium 61.
  • FIG. 4 is a schematic diagram illustrating a hardware configuration of the controller 40 configuring the positioning system according to the embodiment.
  • the controller 40 includes a chipset 412, a processor 414, a nonvolatile memory 416, a main memory 418, a system clock 420, a memory card interface 422, a display controller 426, a communication interface 428, an internal bus controller 430, and a fieldbus controller 438.
  • the chipset 412 and other components are respectively connected via various buses.
  • the processor 414 and the chipset 412 typically have a configuration according to a general-purpose computer architecture. That is, the processor 414 interprets and executes the instruction codes sequentially supplied from the chipset 412 according to the internal clock.
  • the chipset 412 exchanges internal data with various connected components and generates an instruction code necessary for the processor 414.
  • the system clock 420 generates a system clock having a predetermined cycle and provides the system clock to the processor 414.
  • the chip set 412 has a function of caching data and the like obtained as a result of execution of arithmetic processing by the processor 414.
  • the controller 40 has a nonvolatile memory 416 and a main memory 418 as storage means.
  • the non-volatile memory 416 non-volatilely stores data definition information, log information, and the like, in addition to the control program 440 executed by the processor 414.
  • the control program 440 is distributed while being stored in the recording medium 424 or the like.
  • the main memory 418 is a volatile storage area that holds various programs to be executed by the processor 414 and is also used as a working memory when executing various programs.
  • the controller 40 has a communication interface 428 and an internal bus controller 430 as communication means. These communication circuits transmit and receive data.
  • the communication interface 428 exchanges data with the visual sensor 30.
  • the internal bus controller 430 controls data exchange. More specifically, the internal bus controller 430 includes a buffer memory 436, a DMA (Direct Memory Access) control circuit 432, and an internal bus control circuit 434.
  • the memory card interface 422 connects the recording medium 424 detachable to the controller 40 and the processor 414.
  • the recording medium 424 is a medium that stores information, such as a program, by an electrical, magnetic, optical, mechanical, or chemical action so that a computer or other device or machine can read the recorded information.
  • the recording medium 424 is distributed in a state where the control program 440 executed by the controller 40 and the like are stored, and the memory card interface 422 reads the control program from the recording medium 424.
  • the recording medium 424 may be a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, or an optical recording medium such as a CD-ROM (Compact Disk Read Only Memory). Alternatively, a program downloaded from a distribution server or the like may be installed in the controller 40 via the communication interface 417.
  • the field bus controller 438 is a communication interface for connecting to a field network.
  • as the field network, for example, EtherCAT (registered trademark), EtherNet/IP (registered trademark), CompoNet (registered trademark), or the like is employed.
  • the controller 40 is connected to the servo drivers 22, 24, 26 via a field bus controller 438.
  • the display controller 426 is connected to the display device 50, and outputs a signal for displaying various information to the display device 50 according to a command from the chipset 412.
  • FIG. 5 is a block diagram showing an example of a functional configuration of the positioning system shown in FIG.
  • the image processing device 32 searches for the marks 5a and 5b (see FIG. 1) in the image captured by the imaging unit 31, and specifies the position of the mark 5a (hereinafter referred to as the "measurement position PMa") and the position of the mark 5b (hereinafter referred to as the "measurement position PMb").
  • the image processing device 32 may recognize the marks 5a and 5b from the image using a known pattern recognition technology.
  • the image processing device 32 measures the coordinates of the measurement positions PMa and PMb. The coordinates are indicated in a camera coordinate system.
  • the controller 40 includes a movement control unit 41, a model storage unit 46, and a monitoring unit 47.
  • the movement control unit 41 and the monitoring unit 47 are realized by the processor 414 illustrated in FIG.
  • the model storage unit 46 is realized by the nonvolatile memory 416 and the main memory 418 shown in FIG.
  • the movement control unit 41 generates a movement command such that the measured position PMa of the mark 5a approaches the target position SPa and the measured position PMb of the mark 5b approaches the target position SPb.
  • Target positions SPa and SPb are predetermined.
  • the movement control unit 41 includes a coordinate conversion unit 42, a position determination unit 43, a subtraction unit 44, and a calculation unit 45.
  • the measurement positions PMa and PMb output from the visual sensor 30 are indicated by the camera coordinate system.
  • the movement amount of the movement mechanism 10 to be controlled by the movement control unit 41 is shown in a machine coordinate system.
  • the coordinate conversion unit 42 performs coordinate conversion of the measurement positions PMa and PMb specified by the visual sensor 30 using the calibration parameters for converting the camera coordinate system into the machine coordinate system, and thereby calculates the coordinates of the measurement positions PMa and PMb in the machine coordinate system.
  • the coordinate conversion unit 42 may perform the coordinate conversion using, for example, affine transformation.
  • the coordinate conversion unit 42 outputs the coordinates of the measurement positions PMa and PMb in the machine coordinate system to the position determination unit 43.
  • the measurement positions PMa and PMb are specified for each imaging cycle Tb.
  • the movement command is generated for each control cycle Ts.
  • the imaging cycle Tb is longer than the control cycle Ts. Therefore, if the moving mechanism 10 is controlled using only the measurement positions PMa and PMb specified by the visual sensor 30, overshoot and vibration are likely to occur.
  • the position determination unit 43 determines the estimated positions PVa and PVb of the two marks 5a and 5b of the target work W for each control cycle Ts, using the measurement positions PMa and PMb specified by the visual sensor 30 for each imaging cycle Tb and the encoder values EnX, EnY, and Enθ output for each control cycle Ts. The method for determining the estimated positions PVa and PVb will be described later.
  • the subtraction unit 44 outputs a deviation (distance) between the estimated position PVa and the target position SPa, and a deviation (distance) between the estimated position PVb and the target position SPb.
  • the calculation unit 45 performs a calculation (P calculation or PID calculation) so that the deviation between the estimated position PVa and the target position SPa and the deviation between the estimated position PVb and the target position SPb converge to 0, and calculates the movement commands MVX, MVY, and MVθ for each control cycle Ts.
  • the movement command MVX is a movement command for the X stage 11 (see FIG. 1).
  • the movement command MVY is a movement command for the Y stage 13 (see FIG. 1).
  • the movement command MV ⁇ is a movement command for the ⁇ stage 15 (see FIG. 1).
  • the calculation unit 45 outputs the calculated movement commands MVX, MVY, MV ⁇ to the driver unit 20.
  • the movement commands MVX, MVY, MV ⁇ are, for example, position commands or speed commands.
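  • As a hedged sketch of the P calculation mentioned above (the gains are assumptions, and it is assumed that per-axis deviations have already been derived from the mark deviations; the patent also allows a PID calculation), a proportional update could look as follows.

```python
# Sketch: generate per-axis movement commands with a proportional (P) calculation
# on the deviations of the estimated positions from the target positions.
KP_X, KP_Y, KP_THETA = 0.5, 0.5, 0.3   # assumed proportional gains

def p_control(dev_x, dev_y, dev_theta):
    """Return movement commands MVX, MVY, MVtheta for one control cycle Ts."""
    return KP_X * dev_x, KP_Y * dev_y, KP_THETA * dev_theta
```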
  • the model storage unit 46 stores a simulation model simulating the moving mechanism 10, the driver unit 20, and the movement control unit 41.
  • the simulation model is composed of design data of the moving mechanism 10, a program that defines processing of the moving control unit 41, and the like.
  • the monitoring unit 47 monitors the presence or absence of an abnormality that causes a deviation of a calibration parameter for converting a camera coordinate system into a machine coordinate system. As shown in FIG. 5, the monitoring unit 47 includes a trajectory determination unit 48 and a determination unit 49.
  • the trajectory determination unit 48 determines, by performing a simulation using the simulation model stored in the model storage unit 46, the ideal trajectory on the image of the target work W from the initial position specified by the first position specification process to the target position.
  • specifically, the trajectory determination unit 48 performs a simulation using the simulation model, and thereby determines the ideal trajectory on the image of the mark 5a from the measurement position PMa of the mark 5a specified by the initial position specification process of the visual sensor 30 to the target position SPa.
  • similarly, the trajectory determination unit 48 performs a simulation using the simulation model, and thereby determines the ideal trajectory on the image of the mark 5b from the measurement position PMb of the mark 5b specified by the initial position specification process to the target position SPb. A specific processing method of the trajectory determination unit 48 will be described later.
  • the determination unit 49 determines the presence or absence of an abnormality based on a comparison result between the actual trajectory and the ideal trajectory of the position of the target work W specified in the second and subsequent position specifying processes.
  • the determination unit 49 outputs the determination result to the display device 50.
  • the determination unit 49 evaluates the degree of deviation between the ideal trajectory of the mark 5a determined by the trajectory determination unit 48 and the measurement position PMa of the mark 5a specified by the second and subsequent position specifying processes. Similarly, the determination unit 49 evaluates the degree of deviation between the ideal trajectory of the mark 5b determined by the trajectory determination unit 48 and the measurement position PMb of the mark 5b specified by the second and subsequent position specification processing. The determining unit 49 determines whether there is an abnormality based on the evaluation result. A specific processing method of the determination unit 49 will be described later.
  • FIG. 6 is a flowchart illustrating an example of the flow of the positioning process in the controller.
  • in step S1, the controller 40 initializes the estimated positions PVa and PVb and the encoder values EnX, EnY, and Enθ.
  • when the target work W is placed on the moving mechanism 10, the controller 40 outputs an imaging trigger to the visual sensor 30 in accordance with an instruction from a higher-level control device. In step S2, the visual sensor 30 captures an image of the target work W on the stopped moving mechanism 10. In step S3, the visual sensor 30 specifies the measurement positions PMa and PMb of the marks 5a and 5b included in the captured image. Steps S2 and S3 correspond to the initial position specification process by the visual sensor 30.
  • in step S4, the movement control unit 41 starts the movement control of the moving mechanism 10. The details of the movement control by the movement control unit 41 will be described later.
  • in step S5, the movement control unit 41 determines whether each deviation of the estimated positions PVa and PVb from the target positions SPa and SPb is less than a threshold Th0.
  • the threshold value Th0 is predetermined in accordance with the required positioning accuracy. If the respective deviations of the estimated positions PVa, PVb from the target positions SPa, SPb are smaller than the threshold Th0 (YES in step S5), the movement control unit 41 ends the movement control of the movement mechanism 10. Thus, the positioning process ends.
  • if at least one of the deviations of the estimated positions PVa and PVb from the target positions SPa and SPb is not less than the threshold Th0 (NO in step S5), the processing of steps S6 to S8 is repeated for each imaging cycle Tb. Note that the movement control by the movement control unit 41 is also performed in parallel during steps S6 to S8.
  • in step S6, the controller 40 determines whether or not the current time is the imaging trigger output time.
  • the imaging trigger output time is determined in advance according to the imaging cycle Tb. That is, a time at which the imaging cycle Tb has elapsed from the previous imaging trigger output time is determined as a new imaging trigger output time. If the current time is not the imaging trigger output time (NO in step S6), the positioning process returns to step S5.
  • if the current time is the imaging trigger output time (YES in step S6), the controller 40 outputs an imaging trigger to the visual sensor 30.
  • in step S7, the visual sensor 30 that has received the imaging trigger captures an image of the target work W on the moving mechanism 10 while the mechanism is moving.
  • in step S8, the visual sensor 30 specifies the measurement positions PMa and PMb of the marks 5a and 5b included in the captured image. Steps S7 and S8 correspond to the position specifying processes after the first position specifying process (the second and subsequent position specifying processes). After step S8, the positioning process returns to step S5.
  • FIG. 7 is a flowchart illustrating an example of the flow of the movement control process performed by the movement control unit.
  • the movement control unit 41 acquires the coordinates (camera coordinate system) of the latest measurement positions PMa and PMb specified by the visual sensor 30.
  • the coordinate conversion unit 42 converts the coordinates of the measurement positions PMa and PMb into a machine coordinate system.
  • the position determination unit 43 acquires the imaging time ti.
  • the position determining unit 43 acquires a time when a fixed transmission delay time Tsd has elapsed from the time at which the imaging trigger was output, as the imaging time ti.
  • the transmission delay time Tsd is a delay time from when the visual sensor 30 receives an imaging trigger to when imaging starts.
  • in step S14, the position determination unit 43 acquires the encoder values EnX, EnY, and Enθ output from the encoders 12E, 14E, and 16E at a plurality of times near the imaging time ti (hereinafter referred to as "output times").
  • the encoder values detected in the past are stored in the storage unit of the controller 40 (for example, the nonvolatile memory 416 or the main memory 418 (see FIG. 4)).
  • in step S15, the position determination unit 43 calculates an interpolation value of the encoder value EnX at the plurality of output times, and sets the interpolation value as the encoder value EnsX at the imaging time ti. Similarly, the position determination unit 43 calculates an interpolation value of the encoder value EnY at the plurality of times and sets it as the encoder value EnsY at the imaging time, and calculates an interpolation value of the encoder value Enθ at the plurality of times and sets it as the encoder value Ensθ at the imaging time.
  • FIG. 8 is a diagram illustrating a method of calculating the encoder value EnsX at the imaging time.
  • position determination section 43 calculates an interpolation value as follows.
  • the encoder value EnX output at the output time t (j) is defined as an encoder value EnX (j).
  • the transmission of the encoder value is delayed by a fixed transmission delay time Ted from when the encoder 12E detects the pulse signal to when the encoder value EnX is output. Therefore, the encoder value EnX(j) output at the output time t(j) indicates the drive amount of the servo motor 12 from the initial position at the time preceding the output time t(j) by the transmission delay time Ted.
  • the position determination unit 43 specifies two output times close to the imaging time ti. For example, when the control cycle Ts, the transmission delay time Ted of the encoder value, and the transmission delay time Tsd of the imaging trigger satisfy Ts − Ted < Tsd < 2Ts − Ted, the position determination unit 43 specifies the output time t(n) immediately after the imaging time ti and the following output time t(n+1).
  • the position determining unit 43 acquires the encoder value EnX (n) output at the output time t (n) and the encoder value EnX (n + 1) output at the output time t (n + 1).
  • the position determination unit 43 calculates the encoder value EnsX (i) at the imaging time ti by using an interpolation value between the encoder value EnX (n) and the encoder value EnX (n + 1). Specifically, the position determination unit 43 calculates the encoder value EnsX (i) at the imaging time ti using the following equation (1).
  • EnsX(i) = EnX(n) + Kk × (EnX(n+1) − EnX(n)) … (1)
  • Kk is an interpolation coefficient.
  • the encoder value EnsX (i) at the imaging time ti can be calculated with high accuracy.
  • similarly, the encoder values EnsY(i) and Ensθ(i) at the imaging time ti are calculated. If the imaging time ti coincides with the time that precedes an output time of the encoder value by the transmission delay time Ted, that encoder value may be used as it is (see the sketch below).
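  • A minimal sketch of the interpolation of equation (1); how the interpolation coefficient Kk is derived from the timing relationship is an assumption made for this illustration.

```python
# Sketch: interpolate the encoder value at the imaging time ti from the two
# encoder values output at the adjacent output times t_n and t_n1 (equation (1)).
def encoder_value_at_imaging_time(en_n, en_n1, t_n, t_n1, ti, ted):
    """en_n, en_n1: encoder values output at t_n and t_n1; each represents the
    drive amount at a time earlier by the transmission delay ted."""
    kk = (ti - (t_n - ted)) / (t_n1 - t_n)   # interpolation coefficient Kk (assumed form)
    return en_n + kk * (en_n1 - en_n)
```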
  • in step S16, the position determination unit 43 calculates the estimated positions PVa and PVb using the measurement positions PMa and PMb, the encoder values EnX, EnY, and Enθ after the imaging time, and the encoder values EnsX, EnsY, and Ensθ at the imaging time.
  • specifically, the position determination unit 43 constructs an affine transformation equation corresponding to a translation of (EnX − EnsX) in the X direction, a translation of (EnY − EnsY) in the Y direction, and a rotation of (Enθ − Ensθ).
  • the position determination unit 43 calculates the coordinates of the estimated position PVa of the mark 5a by inputting the measurement position PMa into the equation.
  • similarly, the position determination unit 43 calculates the coordinates of the estimated position PVb of the mark 5b by inputting the measurement position PMb into the affine transformation equation (see the sketch below).
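  • A minimal sketch of this estimated-position calculation; the order of rotation and translation and the rotation center are assumptions made for this illustration.

```python
# Sketch: estimate the current mark position from the measurement position PM
# and the stage displacement since the imaging time (EnX-EnsX, EnY-EnsY, Enth-Ensth).
from math import cos, sin

def estimated_position(pm, d_x, d_y, d_theta, center=(0.0, 0.0)):
    """pm: (x, y) measurement position in machine coordinates at the imaging time."""
    x, y = pm
    cx, cy = center                      # assumed rotation center of the theta stage
    xr = cx + (x - cx) * cos(d_theta) - (y - cy) * sin(d_theta)
    yr = cy + (x - cx) * sin(d_theta) + (y - cy) * cos(d_theta)
    return xr + d_x, yr + d_y            # translations applied after the rotation
```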
  • in step S17, the calculation unit 45 generates the movement commands MVX, MVY, and MVθ by, for example, a P calculation based on the respective deviations of the estimated positions PVa and PVb from the target positions SPa and SPb. Then, the calculation unit 45 outputs the movement commands MVX, MVY, and MVθ to the driver unit 20.
  • the controller 40 uses the high-precision measurement positions PMa and PMb obtained by the image processing at the times when they are input.
  • the time interval at which the measurement positions PMa and PMb are input is the imaging cycle Tb, which is longer than the control cycle Ts at which the encoder values EnX, EnY, and Enθ are input.
  • therefore, the position determination unit 43 determines the estimated positions PVa and PVb at every input of the encoder values EnX, EnY, and Enθ, whose input cycle is short, and the movement of the moving mechanism 10 is controlled accordingly. This enables high-accuracy, short-cycle positioning control. Furthermore, the position determination unit 43 performs processing using only the four basic arithmetic operations described above. Therefore, high-speed and high-precision positioning can be realized with a simple configuration and simple processing.
  • FIG. 9 is a diagram illustrating an example of a change in the position of a mark in the positioning process of the comparative example and the positioning process according to the embodiment.
  • FIG. 10 is a diagram illustrating an example of a change in the moving speed of the stage device between the positioning process of the comparative example and the positioning process according to the embodiment.
  • in the positioning process of the comparative example, the required movement amount of the moving mechanism 10 is determined from an image obtained by imaging the target work W, and a unit process of moving the moving mechanism 10 by the determined movement amount is performed. After the movement of the moving mechanism 10 by the unit process is completed, the target work W is imaged again and the unit process is repeated. By repeating the unit process a plurality of times in this manner, the target work W is positioned at the target position.
  • reference numerals 81, 82, 83, and 84 denote the trajectories on the image of the target mark (the mark 5a or the mark 5b) on the target work W in the first, second, third, and fourth unit processes, respectively.
  • reference numerals 91, 92, 93, and 94 indicate changes in the moving speed of the moving mechanism 10 in the first, second, third, and fourth unit processes, respectively.
  • reference numeral 80 denotes a locus on the image of the target mark when the movement of the moving mechanism 10 is controlled by the movement control unit 41 according to the embodiment.
  • reference numeral 90 indicates a change in the moving speed of the moving mechanism 10 that is controlled by the movement control unit 41 according to the embodiment.
  • in the comparative example, the target work W is imaged only after the moving mechanism 10 has stopped. Therefore, the time required for the positioning process is long.
  • in contrast, in the embodiment, the target work W is imaged while it is moving, and the movement command is generated based on the captured image. Therefore, the time required for the positioning process can be shortened.
  • FIG. 11 is a flowchart illustrating an example of the flow of a monitoring process in the controller. The monitoring process shown in FIG. 11 is performed for each positioning process of the target workpiece W. The monitoring process shown in FIG. 11 is executed in parallel with the positioning process shown in FIG.
  • in step S21, the trajectory determination unit 48 determines the ideal trajectories of the marks 5a and 5b on the image based on the measurement positions PMa and PMb specified by the initial position specifying process of the visual sensor 30 (steps S2 and S3 in FIG. 6).
  • the measurement positions PMa and PMb specified by the first position specification processing are referred to as initial positions Pa (0) and Pb (0), respectively.
  • in step S22, the determination unit 49 compares, for each of the marks 5a and 5b, the actual trajectory of the measurement positions specified by the second and subsequent position specification processes with the ideal trajectory determined in step S21. Then, the determination unit 49 determines whether there is an abnormality in the positioning system 1 based on the comparison result.
  • if the determination result indicates that there is an abnormality (YES in step S23), in step S24 the determination unit 49 issues a warning indicating that an abnormality causing a deviation of the calibration parameter has occurred. Specifically, the determination unit 49 causes the display device 50 to display a warning screen. After step S24, the process proceeds to step S25. If the determination result indicates that there is no abnormality (NO in step S23), the process also proceeds to step S25.
  • in step S25, the monitoring unit 47 determines whether the positioning process has been completed. That is, if YES is determined in step S5 shown in FIG. 6, the monitoring unit 47 determines that the positioning process has been completed. If the positioning process has not been completed (NO in step S25), the monitoring process returns to step S22. If the positioning process has been completed (YES in step S25), the monitoring process ends.
  • the trajectory determination unit 48 determines an ideal trajectory by performing processing such as that shown in the flowchart of FIG.
  • FIG. 12 is a flowchart showing the contents of a subroutine of the ideal trajectory determination process (step S21) shown in FIG.
  • in step S31, the trajectory determination unit 48 sets the elapsed time t from the start of the control of the moving mechanism 10 by the movement control unit 41 to 0.
  • in step S32, the trajectory determination unit 48 acquires from the visual sensor 30 the coordinates (XY coordinates) of the initial positions Pa(0) and Pb(0) specified by the initial position specification process.
  • the coordinates of the initial positions Pa (0) and Pb (0) acquired from the visual sensor 30 are indicated by a camera coordinate system.
  • the trajectory determination unit 48 saves the coordinates (camera coordinate system) of the initial positions Pa(0) and Pb(0) in a storage unit of the controller 40 (for example, the nonvolatile memory 416 or the main memory 418 (see FIG. 4)).
  • in step S33, the trajectory determination unit 48 converts the coordinates of the initial positions Pa(0) and Pb(0) from the camera coordinate system into the machine coordinate system using the calibration parameters.
  • in step S34, the trajectory determination unit 48 calculates the required movement amount of the moving mechanism 10 using the simulation model stored in the model storage unit 46 and the positions Pa(t) and Pb(t).
  • when t = 0, the positions Pa(t) and Pb(t) are the initial positions Pa(0) and Pb(0), respectively.
  • the trajectory determination unit 48 calculates the deviation between the position Pa (t) and the target position SPa and the deviation between the position Pb (t) and the target position SPb as the required movement amount of the moving mechanism 10.
  • the calculation process is the same as the process of the subtraction unit 44 of the movement control unit 41, and is performed using the XY coordinates of the machine coordinate system.
  • in step S35, the trajectory determination unit 48 calculates the movement commands MVX, MVY, and MVθ according to the required movement amount.
  • the calculation process is the same as the process of the calculation unit 45 of the movement control unit 41.
  • in step S36, the trajectory determination unit 48 calculates the translation amount ΔX of the X stage 11, the translation amount ΔY of the Y stage 13, and the rotation amount Δθ of the θ stage 15 in one control cycle Ts.
  • the trajectory determination unit 48 calculates the translation amount ΔX assuming that the X stage 11 has moved ideally according to the movement command MVX calculated in step S35.
  • similarly, the trajectory determination unit 48 calculates the translation amount ΔY and the rotation amount Δθ assuming that the Y stage 13 and the θ stage 15 have moved ideally according to the movement commands MVY and MVθ calculated in step S35.
  • in step S37, the trajectory determination unit 48 calculates the coordinates (machine coordinate system) of the position Pa(t+Ts) of the mark 5a after one control cycle Ts according to equation (3) below. Similarly, the trajectory determination unit 48 calculates the coordinates (machine coordinate system) of the position Pb(t+Ts) of the mark 5b after one control cycle Ts according to equation (3).
  • x and y indicate the XY coordinates (machine coordinate system) of the position Pa (t) or the position Pb (t).
  • x′ and y′ indicate the XY coordinates (machine coordinate system) of the position Pa (t + Ts) or the position Pb (t + Ts).
  • Xo and Yo indicate the X coordinate and the Y coordinate of the rotation center of the θ stage 15, respectively.
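The text of equation (3) is not reproduced above; the following is a reconstruction assumed from the symbol definitions (rotation by Δθ about the rotation center (Xo, Yo), followed by the translations ΔX and ΔY), and should be read as a sketch rather than a quotation of the original equation:

```latex
\begin{aligned}
x' &= X_o + (x - X_o)\cos\Delta\theta - (y - Y_o)\sin\Delta\theta + \Delta X \\
y' &= Y_o + (x - X_o)\sin\Delta\theta + (y - Y_o)\cos\Delta\theta + \Delta Y
\end{aligned}
```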
  • In step S38, the trajectory determination unit 48 converts the XY coordinates (machine coordinate system) of the positions Pa (t + Ts) and Pb (t + Ts) into the camera coordinate system. Then, the trajectory determination unit 48 stores the XY coordinates (camera coordinate system) of the positions Pa (t + Ts) and Pb (t + Ts) in the storage unit of the controller 40.
  • In step S39, the trajectory determination unit 48 determines whether the positions Pa (t + Ts) and Pb (t + Ts) coincide with the target positions SPa and SPb, respectively. Specifically, when the deviations between the positions Pa (t + Ts) and Pb (t + Ts) and the target positions SPa and SPb are both smaller than the threshold Th0, the trajectory determination unit 48 determines that the positions Pa (t + Ts) and Pb (t + Ts) coincide with the target positions SPa and SPb, respectively.
  • In step S40, the trajectory determination unit 48 updates the elapsed time t from the start of the movement control of the moving mechanism 10 by adding one control cycle Ts. After step S40, steps S34 to S39 are repeated until YES is determined in step S39.
  • The coordinates (camera coordinate system) of the positions Pa (0), Pa (Ts), Pa (2Ts), ..., Pa (nTs) stored in the storage unit of the controller 40 are information that associates the elapsed time t from the start of control by the movement control unit 41 with the ideal position of the mark 5a on the image.
  • the coordinates (camera coordinate system) of the positions Pa (0), Pa (Ts), Pa (2Ts),..., Pa (nTs) indicate an ideal trajectory of the mark 5a on the image.
  • Similarly, the coordinates (camera coordinate system) of the positions Pb (0), Pb (Ts), Pb (2Ts), ..., Pb (nTs) stored in the storage unit of the controller 40 are information that associates the elapsed time t from the start of control by the movement control unit 41 with the ideal position of the mark 5b on the image.
  • the coordinates (camera coordinate system) of the positions Pb (0), Pb (Ts), Pb (2Ts),..., Pb (nTs) indicate an ideal trajectory of the mark 5b on the image.
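As a concrete illustration of the subroutine of FIG. 12 (steps S31 to S40), the following Python sketch runs the simulation loop for a single mark and returns the ideal trajectory as a list of (elapsed time, camera-coordinate position) pairs. The callables cam_to_machine and machine_to_cam stand in for the calibration-parameter conversions, and the proportional gain stands in for the command calculation of the calculation unit 45; these names and the simplified stage response are assumptions for illustration, not part of the original disclosure.

```python
import math

def determine_ideal_trajectory(p0_cam, sp_machine, cam_to_machine, machine_to_cam,
                               rotation_center=(0.0, 0.0), Ts=0.001, Th0=1e-3,
                               gain=0.2, max_steps=10000):
    """Sketch of steps S31-S40 for a single mark.

    p0_cam     : initial position (camera coordinates) from the first position
                 specifying process (step S32).
    sp_machine : target position SP (machine coordinates).
    cam_to_machine, machine_to_cam : stand-ins for the coordinate conversions
                 of steps S33 and S38.
    gain       : stand-in for the command calculation of step S35.
    """
    xo, yo = rotation_center
    x, y = cam_to_machine(p0_cam)                       # step S33
    trajectory = [(0.0, machine_to_cam((x, y)))]        # ideal position at t = 0
    t = 0.0
    for _ in range(max_steps):
        ex, ey = sp_machine[0] - x, sp_machine[1] - y   # step S34: deviation
        dx, dy, dtheta = gain * ex, gain * ey, 0.0      # steps S35-S36 (ideal response)
        # step S37: rigid-body update (rotation about (xo, yo), then translation)
        x, y = (xo + (x - xo) * math.cos(dtheta) - (y - yo) * math.sin(dtheta) + dx,
                yo + (x - xo) * math.sin(dtheta) + (y - yo) * math.cos(dtheta) + dy)
        t += Ts                                         # step S40: advance elapsed time
        trajectory.append((t, machine_to_cam((x, y))))  # step S38: store camera coords
        if math.hypot(sp_machine[0] - x, sp_machine[1] - y) < Th0:  # step S39
            break
    return trajectory
```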
  • The determination unit 49 determines whether or not the positioning system 1 is abnormal, for example, by performing the processing shown in the flowchart of FIG. 13.
  • FIG. 13 is a flowchart showing the contents of the subroutine of the process for determining the presence or absence of an abnormality (step S22) shown in FIG. 11.
  • In step S41, the determination unit 49 determines a first allowable area on the image through which the mark 5a should pass, based on the ideal trajectory of the mark 5a. Specifically, the determination unit 49 determines, as the first allowable area (see the allowable area 71 in FIG. 2), the area whose distance from the curve connecting the positions Pa (0), Pa (Ts), Pa (2Ts), ..., Pa (nTs) (see the solid line 70 in FIG. 2) is equal to or smaller than the threshold Th1.
  • Similarly, the determination unit 49 determines a second allowable area on the image through which the mark 5b should pass, based on the ideal trajectory of the mark 5b. Specifically, the determination unit 49 determines, as the second allowable area (see the allowable area 71 in FIG. 2), the area whose deviation (distance) from the curve connecting the positions Pb (0), Pb (Ts), Pb (2Ts), ..., Pb (nTs) (see the solid line 70 in FIG. 2) is equal to or smaller than the threshold Th1.
  • In step S42, the determination unit 49 acquires the latest coordinates (camera coordinate system) of the measurement positions PMa and PMb from the visual sensor 30.
  • The coordinates of the measurement positions PMa and PMb are specified by the second and subsequent position specifying processes (steps S7 and S8 in FIG. 6).
  • In step S43, the determination unit 49 determines whether the latest measurement position PMa is located in the first allowable area and the latest measurement position PMb is located in the second allowable area.
  • If the latest measurement position PMa is located within the first allowable area, this indicates that the deviation (distance) between the measurement position PMa and the ideal trajectory of the mark 5a is equal to or smaller than the threshold Th1.
  • The deviation (distance) between the measurement position PMa and the ideal trajectory of the mark 5a is one of the feature amounts indicating the degree of deviation between the actual trajectory and the ideal trajectory.
  • Likewise, if the latest measurement position PMb is located within the second allowable area, this indicates that the deviation (distance) between the measurement position PMb and the ideal trajectory of the mark 5b is equal to or smaller than the threshold Th1.
  • In other words, the deviation (distance) between the measurement positions PMa and PMb and the ideal trajectories of the marks 5a and 5b is a feature amount indicating the degree of deviation between the actual trajectory and the ideal trajectory.
  • If NO is determined in step S43, the determination unit 49 determines in step S44 that there is an abnormality. On the other hand, if YES is determined in step S43, the determination unit 49 determines in step S45 that there is no abnormality. After step S44 or step S45, the process of determining the presence or absence of an abnormality ends.
  • Since the result of step S41 is the same every time, step S41 is omitted in the second and subsequent executions of the abnormality determination process.
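A minimal Python sketch of the allowable-area check of steps S41 to S45 is given below, assuming the ideal trajectory of each mark is available as the list of camera-coordinate points stored by the subroutine of FIG. 12. The function names are illustrative placeholders, not identifiers from the original disclosure.

```python
import math

def min_distance_to_trajectory(point, ideal_points):
    """Smallest distance from a measured point to the stored ideal positions.

    For simplicity the distance is taken to the stored points themselves; a
    finer check could measure the distance to each connecting segment of the
    curve (solid line 70)."""
    px, py = point
    return min(math.hypot(px - ix, py - iy) for ix, iy in ideal_points)

def is_abnormal(pm_a, pm_b, ideal_a, ideal_b, th1):
    """Steps S42-S45: an abnormality is reported when either mark leaves its
    allowable area (deviation from the ideal trajectory larger than Th1)."""
    inside_first = min_distance_to_trajectory(pm_a, ideal_a) <= th1
    inside_second = min_distance_to_trajectory(pm_b, ideal_b) <= th1
    return not (inside_first and inside_second)
```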
  • the positioning system 1 that performs the positioning processing of the target work W includes the moving mechanism 10, the visual sensor 30, the movement control unit 41, the trajectory determination unit 48, and the determination unit 49.
  • the moving mechanism 10 moves the target work W.
  • the visual sensor 30 captures the target work W and performs a position specifying process for specifying the position of the target work W (specifically, the measurement positions PMa and PMb of the marks 5a and 5b) based on the captured image for each imaging cycle Tb.
  • the movement control unit 41 controls the movement mechanism 10 so that the measurement positions PMa and PMb specified by the visual sensor 30 approach the target positions SPa and SPb, respectively.
  • The trajectory determination unit 48 determines an ideal trajectory of the mark 5a from the initial position Pa (0) of the mark 5a specified by the first position specifying process to the target position SPa. Further, the trajectory determination unit 48 determines an ideal trajectory of the mark 5b from the initial position Pb (0) of the mark 5b specified by the first position specifying process to the target position SPb. The determination unit 49 compares the actual trajectory of the measurement position PMa of the mark 5a specified by the position specifying processes after the first position specifying process with the ideal trajectory of the mark 5a. Further, the determination unit 49 compares the actual trajectory of the measurement position PMb of the mark 5b specified by the position specifying processes after the first position specifying process with the ideal trajectory of the mark 5b. The determination unit 49 determines whether there is an abnormality in the positioning system 1 based on the comparison results.
  • Accordingly, the determination unit 49 can accurately determine, without depending on personal judgment, the presence or absence of an abnormality that causes a deviation of the calibration parameters, based on the comparison results between the actual trajectories of the marks 5a and 5b and their ideal trajectories.
  • the visual sensor 30 executes the first position specifying process while the target work W is stopped, and executes the second and subsequent position specifying processes while the target work W is moving. This makes it possible to speed up the positioning process.
  • The determination unit 49 determines that an abnormality has occurred and issues a warning when a feature amount indicating the degree of deviation between the actual trajectory and the ideal trajectory (the deviation (distance) between the measurement positions PMa and PMb and the ideal trajectories of the marks 5a and 5b) exceeds the threshold Th1. Thereby, the operator can perform maintenance for removing the cause of the deviation of the calibration parameters at an appropriate timing.
  • The trajectory determination unit 48 determines the ideal trajectories of the marks 5a and 5b by performing a simulation using the simulation models of the moving mechanism 10 and the movement control unit 41 and the initial positions Pa (0) and Pb (0) specified by the initial position specifying process. By using a simulation model, an ideal trajectory can be determined with high accuracy.
  • In the above description, the determination unit 49 determines whether or not there is an abnormality according to whether or not the latest measurement positions PMa and PMb specified by the visual sensor 30 are located in the first allowable area and the second allowable area, respectively. Alternatively, the determination unit 49 may determine whether the estimated positions PVa and PVb output from the position determination unit 43 for each control cycle Ts are located in the first allowable area and the second allowable area, respectively.
  • In this case, the determination unit 49 may convert the coordinates (machine coordinate system) of the estimated positions PVa and PVb into the camera coordinate system and determine whether the estimated positions PVa and PVb are located in the first allowable area and the second allowable area, respectively.
  • The determination unit 49 may execute the abnormality determination process illustrated in FIG. 14 instead of the abnormality determination process illustrated in FIG. 13.
  • FIG. 14 is a flowchart illustrating the processing content of a subroutine of the determination processing according to the second modification.
  • In the second modification, the presence or absence of an abnormality is determined using the estimated positions PVa and PVb.
  • In step S141, the determination unit 49 acquires the estimated positions PVa and PVb from the position determination unit 43.
  • In step S142, the determination unit 49 converts the coordinates (machine coordinate system) of the estimated positions PVa and PVb into the camera coordinate system.
  • In step S143, the determination unit 49 specifies the elapsed time t from the time when the movement control unit 41 started control to the time when the estimated positions PVa and PVb were output.
  • The estimated positions PVa and PVb are output for each control cycle Ts. Therefore, the elapsed time t is the time kTs obtained by multiplying the control cycle Ts by the number k of times the estimated positions PVa and PVb have been output since the start of control.
  • In step S144, the determination unit 49 reads out the coordinates of the positions Pa (kTs) and Pb (kTs) on the ideal trajectories from the information stored in the storage unit in the trajectory determination process shown in FIG. 12.
  • In step S145, the determination unit 49 calculates the deviations (distances) between the estimated positions PVa and PVb and the positions Pa (kTs) and Pb (kTs) on the ideal trajectories, respectively.
  • Each deviation between the estimated positions PVa and PVb and the positions Pa (kTs) and Pb (kTs) on the ideal trajectories is one of the feature amounts indicating the degree of divergence between the actual trajectory and the ideal trajectory.
  • In step S146, the determination unit 49 determines whether each deviation is equal to or less than the threshold Th1.
  • If NO is determined in step S146, the determination unit 49 determines in step S147 that there is an abnormality.
  • On the other hand, if YES is determined in step S146, the determination unit 49 determines in step S148 that there is no abnormality.
  • In this way, the presence or absence of an abnormality is determined based on the divergence between the actual trajectory and the ideal trajectory in consideration of the elapsed time t from the start of control by the movement control unit 41.
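The determination process of the second modification can be sketched as follows in Python; it assumes the ideal trajectories are stored as one camera-coordinate point per control cycle, as produced by the subroutine of FIG. 12, and the names are illustrative only.

```python
import math

def is_abnormal_time_indexed(pv_a, pv_b, k, ideal_a, ideal_b, th1):
    """Sketch of steps S141-S148 for the estimated positions PVa and PVb.

    pv_a, pv_b : estimated positions already converted to camera coordinates
                 (steps S141-S142).
    k          : number of control cycles since the start of control, so the
                 elapsed time is t = k * Ts (step S143).
    ideal_a/b  : ideal camera-coordinate points, one per control cycle; the
                 index is clamped once the ideal trajectory has reached the
                 target (step S144).
    """
    ia = ideal_a[min(k, len(ideal_a) - 1)]   # Pa(kTs)
    ib = ideal_b[min(k, len(ideal_b) - 1)]   # Pb(kTs)
    dev_a = math.hypot(pv_a[0] - ia[0], pv_a[1] - ia[1])   # step S145
    dev_b = math.hypot(pv_b[0] - ib[0], pv_b[1] - ib[1])
    return dev_a > th1 or dev_b > th1        # step S146: exceeded -> abnormality
```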
  • The monitoring unit 47 may execute the monitoring process shown in FIG. 15 instead of the monitoring process shown in FIG. 11.
  • FIG. 15 is a flowchart illustrating the flow of the monitoring process according to the third modification.
  • the monitoring process according to the third modification is different from the monitoring process shown in FIG. 11 in that step S121 is executed instead of step S22, and steps S122 and S123 are executed after step S24.
  • Hereinafter, steps S121, S122, and S123 will be described.
  • In step S121, the determination unit 49 calculates, for each of the marks 5a and 5b, a feature amount indicating the degree of divergence between the actual trajectory and the ideal trajectory, and determines, based on the feature amount, whether there is an abnormality and whether the degree of abnormality is high or low. Details of step S121 will be described later.
  • In step S122, the level of the degree of abnormality determined in step S121 is checked. If the degree of abnormality is high (YES in step S122), the determination unit 49 outputs an operation stop instruction to the movement control unit 41 in step S123. Thereby, the movement control unit 41 stops the movement control of the moving mechanism 10.
  • On the other hand, if the degree of abnormality is low (NO in step S122), the monitoring process proceeds to step S25.
  • FIG. 16 is a flowchart showing the contents of the subroutine of the process (step S121) for determining the presence or absence of an abnormality and the degree of abnormality shown in FIG. 15.
  • The process for determining the presence or absence of an abnormality and the degree of abnormality shown in FIG. 16 differs from the abnormality determination process shown in FIG. 14 in that steps S149 to S151 are further executed.
  • Hereinafter, steps S149 to S151 will be described.
  • In step S149, the determination unit 49 determines whether the deviations between the measurement positions PMa and PMb and the positions Pa (kTs) and Pb (kTs) on the ideal trajectories are equal to or smaller than the threshold Th2.
  • the threshold Th2 is larger than the threshold Th1.
  • If NO is determined in step S149, the determination unit 49 determines in step S150 that the degree of abnormality is high. On the other hand, if YES is determined in step S149, the determination unit 49 determines in step S151 that the degree of abnormality is low. After step S150 or step S151, the process of determining the presence or absence of an abnormality and the degree of abnormality ends.
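The combination of the two thresholds used in FIG. 16 can be summarized by the following sketch; the string return values are illustrative, not part of the original disclosure.

```python
def classify_abnormality(deviation, th1, th2):
    """Classify one deviation value per steps S146 and S149-S151.

    th2 is assumed to be larger than th1: exceeding th1 signals an abnormality,
    and exceeding th2 additionally marks the degree of abnormality as high
    (which triggers the operation stop instruction of step S123)."""
    assert th2 > th1
    if deviation <= th1:
        return "normal"    # step S148: no abnormality
    if deviation <= th2:
        return "low"       # step S151: abnormality with a low degree
    return "high"          # step S150: abnormality with a high degree
```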
  • In the above description, the monitoring process is executed in parallel with the positioning process. However, the monitoring process may be executed after the positioning process ends. In that case, since the positioning process has already been completed, the entire actual trajectory can be specified for each of the marks 5a and 5b.
  • the actual trajectories of the marks 5a and 5b are specified by sequentially connecting the estimated positions PVa and PVb output from the position determination unit 43 during the positioning process.
  • the actual trajectories of the marks 5a and 5b may be specified by sequentially connecting the measurement positions PMa and PMb output from the visual sensor 30 during the positioning process.
  • the determination unit 49 calculates, for each of the marks 5a and 5b, a feature amount indicating the degree of divergence between the actual trajectory and the ideal trajectory, and determines the presence or absence of an abnormality and the level of the abnormality based on the calculated feature amount.
  • As the feature amount, for example, the area of the region surrounded by the actual trajectory and the ideal trajectory (the integrated value of the deviation between the actual trajectory and the ideal trajectory) can be used.
  • Alternatively, the maximum value of the deviation (distance) between a point on the actual trajectory and the point on the ideal trajectory having the same elapsed time from the start of control by the movement control unit 41 may be used as the feature amount.
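The two feature amounts mentioned above can be computed, for example, as follows; the sketch assumes that the actual and ideal trajectories are sampled at the same elapsed times and stored as lists of (t, (x, y)) pairs.

```python
import math

def deviation_series(actual, ideal):
    """Deviation between points of the two trajectories at the same elapsed time.

    actual, ideal: lists of (t, (x, y)) pairs sharing the same time stamps."""
    return [math.hypot(ax - ix, ay - iy)
            for (_, (ax, ay)), (_, (ix, iy)) in zip(actual, ideal)]

def max_deviation(actual, ideal):
    """Maximum deviation between the actual and ideal trajectories."""
    return max(deviation_series(actual, ideal))

def integrated_deviation(actual, ideal, Ts):
    """Integrated deviation, a proxy for the area of the region enclosed by
    the actual and ideal trajectories."""
    return sum(deviation_series(actual, ideal)) * Ts
```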
  • FIG. 17 is a diagram illustrating a functional configuration of a controller according to the fifth modification.
  • the positioning system 1A according to the fifth modification is different from the positioning system 1 shown in FIG. 5 in that a controller 40A is provided instead of the controller 40.
  • the controller 40A is different from the controller 40 shown in FIG. 5 in that a monitoring unit 47A is provided instead of the monitoring unit 47 and the model storage unit 46 is not provided.
  • the monitoring unit 47A is different from the monitoring unit 47 in that a trajectory determination unit 48A is provided instead of the trajectory determination unit 48.
  • The trajectory determination unit 48A includes a learning device 148 trained by machine learning so as to output a function indicating an ideal trajectory in response to an input initial position, and determines the ideal trajectories of the marks 5a and 5b using the learning device 148.
  • The learning device 148 is constructed by performing machine learning using, as learning data, the coordinates of a plurality of measurement positions specified in each of a plurality of positioning processes executed when the positioning system was normal.
  • The time when the positioning system is normal is, for example, the time immediately after the positioning system is started up, when the deviation of the calibration parameters is small.
  • Specifically, the learning device 148 learns the correspondence between the measurement position PMa specified by the first position specifying process of each positioning process and a function Pa (t), which is information indicating the trajectory obtained by sequentially connecting the plurality of measurement positions PMa specified for each imaging cycle Tb in that positioning process.
  • The function Pa (t) is a function that associates the elapsed time t from the start of control by the movement control unit 41 with the coordinates (camera coordinate system) of the position of the mark 5a on the image at the time when the elapsed time t has elapsed from the start of control.
  • Similarly, the learning device 148 learns the correspondence between the measurement position PMb specified by the first position specifying process of each positioning process and a function Pb (t), which is information indicating the trajectory obtained by sequentially connecting the plurality of measurement positions PMb specified for each imaging cycle Tb in that positioning process.
  • The function Pb (t) is a function that associates the elapsed time t from the start of control by the movement control unit 41 with the XY coordinates (camera coordinate system) of the position of the mark 5b on the image at the time when the elapsed time t has elapsed from the start of control.
  • The trajectory determination unit 48A inputs the initial position Pa (0) to the learning device 148 and determines the trajectory represented by the function output from the learning device 148 as the ideal trajectory of the mark 5a on the image. Similarly, by inputting the initial position Pb (0) to the learning device 148, the trajectory determination unit 48A determines the trajectory represented by the function output from the learning device 148 as the ideal trajectory of the mark 5b on the image.
  • In this way, the trajectory determination unit 48A can determine an ideal trajectory that matches the actual behavior of the system.
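The patent text does not specify the learning algorithm used by the learning device 148. The following Python sketch only illustrates the input/output contract described above (an initial position in, a time-indexed ideal trajectory out) by returning the recorded normal-time trajectory whose initial position is closest to the query; the class and method names are hypothetical.

```python
import math

class TrajectoryLearner:
    """Simplified stand-in for the learning device 148."""

    def __init__(self):
        self._runs = []   # list of (initial_position, [(t, (x, y)), ...]) pairs

    def fit(self, normal_runs):
        """normal_runs: trajectories recorded while the positioning system was normal."""
        self._runs = list(normal_runs)

    def predict(self, initial_position):
        """Return the time-indexed trajectory associated with the recorded
        initial position closest to the queried one."""
        x0, y0 = initial_position
        _, trajectory = min(
            self._runs,
            key=lambda run: math.hypot(run[0][0] - x0, run[0][1] - y0))
        return trajectory
```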
  • FIG. 18 is a diagram illustrating a configuration of a positioning system according to the sixth modification.
  • The visual sensor 30 of the positioning system according to the sixth modification captures the target work W and a reference work W0 fixed at a predetermined position.
  • The reference work W0 is made of a light-transmitting material. Therefore, the visual sensor 30 can simultaneously capture the marks 5a and 5b, which are the characteristic portions of the target work W, and the marks 6a and 6b, which are the characteristic portions of the reference work W0.
  • the visual sensor 30 also specifies the positions of the marks 6a and 6b of the reference work W0.
  • the controller 40 sets the position of the mark 6a specified by the visual sensor 30 as the target position SPa, and sets the position of the mark 6b specified by the visual sensor 30 as the target position SPb.
  • the determination unit 49 may cause the display device 50 to display a screen showing the ideal trajectory and the actual trajectory for each positioning process.
  • FIG. 19 is an example of a screen showing an ideal trajectory and an actual trajectory.
  • In FIG. 19, a solid line 70 indicates the ideal trajectory of the target mark (the mark 5a or the mark 5b), and a solid line 72 indicates the actual trajectory of the target mark. The operator can grasp the degree of deviation of the calibration parameters by checking a screen such as that shown in FIG. 19.
  • The determination unit 49 may decompose the deviation between a position on the actual trajectory and the position on the ideal trajectory having the same elapsed time from the start of control by the movement control unit 41 into components in the X direction and the Y direction, and cause the display device 50 to display a screen showing the change of each component over time.
  • The X direction is the moving direction of the X stage 11, and the Y direction is the moving direction of the Y stage 13.
  • FIG. 20 is a diagram showing an example of a screen showing a temporal change of each component in the X direction and the Y direction of the deviation between the position on the actual trajectory and the position on the ideal trajectory.
  • the determination unit 49 may cause the display device 50 to display a screen that indicates a transition of a feature amount that is calculated for each positioning process and that indicates the degree of deviation between the ideal trajectory and the actual trajectory. Thereby, the worker can grasp the situation such as the deterioration of the moving mechanism 10 over time.
  • For example, the determination unit 49 calculates, as a feature amount indicating the degree of deviation between the ideal trajectory and the actual trajectory, the area of the region surrounded by the ideal trajectory and the actual trajectory (that is, the integrated value of the deviation between them; the area of the hatched portion in FIG. 19). The determination unit 49 calculates this feature amount for each positioning process and causes the display device 50 to display a screen showing the transition of the feature amount.
  • FIG. 21 is a diagram showing an example of a screen showing a temporal change of a feature amount (integral value of deviation) indicating a degree of deviation between an ideal trajectory and an actual trajectory.
  • the feature amount indicating the degree of deviation between the ideal trajectory and the actual trajectory gradually increases as the moving mechanism 10 deteriorates over time.
  • the operator can grasp the difference between the value of the feature amount at the current time and the threshold Th1. As a result, the operator can predict when maintenance such as calibration is required.
  • the screen example shown in FIG. 21 includes a graph image showing the transition of the feature amount.
  • Alternatively, the determination unit 49 may cause the display device 50 to display a screen in which the numerical values of the feature amounts are arranged in chronological order.
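As a rough illustration of the maintenance-timing prediction described above, the following sketch fits a straight line to the recorded feature amounts and estimates after how many more positioning processes the threshold Th1 would be reached; the linear trend is an assumption for illustration only.

```python
def predict_threshold_crossing(feature_history, th1):
    """Estimate the remaining number of positioning processes until the
    feature amount reaches th1, assuming a linear trend.

    feature_history: one feature amount per positioning process, oldest first.
    Returns None when no upward trend can be estimated."""
    n = len(feature_history)
    if n < 2:
        return None
    mean_x = (n - 1) / 2
    mean_y = sum(feature_history) / n
    sxx = sum((i - mean_x) ** 2 for i in range(n))
    sxy = sum((i - mean_x) * (y - mean_y) for i, y in enumerate(feature_history))
    slope = sxy / sxx
    if slope <= 0:
        return None
    latest = feature_history[-1]
    if latest >= th1:
        return 0
    return (th1 - latest) / slope
```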
  • In the above description, the trajectory determination units 48 and 48A determine the ideal trajectories of the marks 5a and 5b. However, the trajectory determination units 48 and 48A may determine the ideal trajectory of the midpoint between the marks 5a and 5b. In this case, the determination unit 49 may determine whether an abnormality has occurred in the positioning system 1 based on the comparison result between the actual trajectory of the midpoint of the measurement positions PMa and PMb specified by the visual sensor 30 and the ideal trajectory.
  • In the above description, the target work W is positioned using the marks 5a and 5b provided on the target work W as characteristic portions of the target work W.
  • the target work W may be positioned using another part of the target work W as a characteristic portion.
  • a screw or a screw hole provided in the target work W may be used as the characteristic portion.
  • a corner of the target work W may be used as a characteristic portion.
  • the controller 40 includes the movement control unit 41 and the monitoring unit 47.
  • the monitoring unit 47 may be provided in an information processing device different from the controller 40.
  • (Configuration 1) A positioning system (1, 1A) that performs a positioning process of an object (W), the positioning system comprising: a moving mechanism (10) for moving the object (W); a visual sensor (30) for imaging the object (W) and executing, for each imaging cycle, a position specifying process for specifying the position of the object (W) based on the captured image; a movement control unit (41) for controlling the moving mechanism (10) such that the position specified by the visual sensor (30) approaches a target position; a trajectory determination unit (48, 48A) for determining an ideal trajectory of the object (W) from the position of the object (W) specified by the first position specifying process to the target position; and a determination unit (49) for determining the presence or absence of an abnormality in the positioning system based on a comparison result between the ideal trajectory and the actual trajectory of the position of the object (W) specified by the position specifying processes after the first position specifying process.
  • (Configuration 2) The positioning system (1, 1A) according to Configuration 1, wherein the visual sensor (30) executes the first position specifying process while the object (W) is stopped and executes the subsequent position specifying processes while the object (W) is moving.
  • (Configuration 3) The positioning system (1, 1A) according to Configuration 1 or 2, wherein the determination unit determines that an abnormality has occurred and issues a warning when a feature amount indicating the degree of deviation between the actual trajectory and the ideal trajectory exceeds a first threshold.
  • the actual trajectory and the ideal trajectory are indicated by information that associates the elapsed time from the start of the control of the movement mechanism (10) by the movement control unit (41) with the position of the object (W).
  • The positioning system (1) according to any one of Configurations 1 to 6, wherein the trajectory determination unit (48) determines the ideal trajectory by performing a simulation using a simulation model of the moving mechanism (10) and the movement control unit (41) and the position of the object (W) specified by the first position specifying process.
  • The positioning system (1A) according to any one of Configurations 1 to 6, wherein the trajectory determination unit (48A) further comprises a trained learner (148) constructed by machine learning using learning data so as to output, in response to an input position of the object (W), information indicating a trajectory of the object (W) from that position to the target position; the learning data is data indicating positions of the object (W) specified by the visual sensor (30) in positioning processes executed when the positioning system was normal; and the trajectory determination unit (48A) determines, as the ideal trajectory, the trajectory indicated by the information output from the learner (148) when the position of the object (W) specified by the first position specifying process is input to the learner (148).
  • The positioning system according to Configuration 1 or 2, wherein the actual trajectory and the ideal trajectory are indicated by information that associates the elapsed time from the start of control of the moving mechanism by the movement control unit with the position of the object; the moving mechanism (10) includes a plurality of translation mechanisms (11, 13) that translate in mutually different movement directions; and the determination unit (49) decomposes the deviation between a position on the actual trajectory and the position on the ideal trajectory having the same elapsed time into components in the movement direction of each of the plurality of translation mechanisms (11, 13), and outputs a screen showing the change of each component over time to the display device (50).
  • (Configuration 12) A monitoring device (40, 40A) used in the positioning system (1, 1A) according to any one of Configurations 1 to 11, the monitoring device comprising the trajectory determination unit and the determination unit (49).
  • A monitoring method used in a positioning system (1, 1A) that performs a positioning process of an object (W), wherein the positioning system (1, 1A) comprises: a moving mechanism (10) for moving the object (W); a visual sensor (30) for imaging the object (W) and executing, for each imaging cycle, a position specifying process for specifying the position of the object (W) based on the captured image; and a movement control unit (41) for controlling the moving mechanism (10) such that the position specified by the visual sensor (30) approaches a target position; the monitoring method comprising: determining an ideal trajectory of the object (W) from the position of the object (W) specified by the first position specifying process to the target position; and determining the presence or absence of an abnormality in the positioning system based on a comparison result between the ideal trajectory and the actual trajectory of the position of the object (W) specified by the position specifying processes after the first position specifying process.
  • A program for monitoring a positioning system (1, 1A) that performs a positioning process of an object (W), wherein the positioning system (1, 1A) comprises: a moving mechanism (10) for moving the object (W); a visual sensor (30) for imaging the object (W) and executing, for each imaging cycle, a position specifying process for specifying the position of the object (W) based on the captured image; and a movement control unit (41) for controlling the moving mechanism (10) such that the position specified by the visual sensor (30) approaches a target position; the program causing a computer to execute: a step of determining an ideal trajectory of the object (W) from the position of the object (W) specified by the first position specifying process to the target position; and a step of determining the presence or absence of an abnormality in the positioning system (1, 1A) based on a comparison result between the ideal trajectory and the actual trajectory of the position of the object (W) specified by the position specifying processes after the first position specifying process.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Mechanical Engineering (AREA)
  • Optics & Photonics (AREA)
  • Manufacturing & Machinery (AREA)
  • Exposure And Positioning Against Photoresist Photosensitive Materials (AREA)
  • Machine Tool Sensing Apparatuses (AREA)
  • Numerical Control (AREA)
  • Control Of Position Or Direction (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Provided is a positioning system comprising: a movement mechanism for moving an object; a visual sensor for executing in each image capture period a position identification process for identifying the position of the object on the basis of a captured image of the object; a movement control unit for controlling the movement mechanism such that the identified position approaches a target position; a path determination unit; and a determination unit. The path determination unit determines an ideal path of the object from the position identified by the initial position identification process to the target position. On the basis of the result of a comparison between the ideal path and the actual path of the position identified by the position identification processes after the initial position identification process, the determination unit determines whether there is an abnormality in the positioning system. As a result, it is possible to precisely determine if there are abnormalities.

Description

Positioning system, monitoring device, monitoring method, and program
The present technology relates to a positioning system using a visual sensor, a monitoring device, a monitoring method, and a program.
In FA (factory automation), various positioning systems for aligning the position of an object with a target position have been put to practical use. As a method of measuring the deviation (distance) between the position of the object and the target position, there is a method that uses an image captured by a visual sensor. In this method, calibration is performed, which is a process of associating the coordinate system of the visual sensor with the coordinate system of the moving mechanism that moves the object.
JP-A-2003-50106 (Patent Document 1) discloses a technique for determining calibration parameters by performing the following steps (a) to (e). In step (a), the table or the imaging unit is moved, and the positional relationship between the table and the imaging unit is calculated. In step (b), the table or the imaging unit is moved, and the positional relationship between the table and a predetermined portion of the table is calculated. In step (c), the table or the imaging unit is moved, and a correction amount is obtained to correct the positional relationship between the table and the predetermined portion of the table. In step (d), the table or the imaging unit is moved, and the positional relationship between the table and the imaging unit is calculated again. In step (e), steps (c) and (d) are repeated until the correction amount obtained in step (c) becomes equal to or less than a predetermined value.
JP-A-2003-50106
Calibration is performed at the start-up stage of the apparatus. However, as the machines constituting the moving mechanism age, the coordinate system of the visual sensor can no longer be accurately converted into the coordinate system of the moving mechanism. In such a case, it is necessary to perform maintenance such as recalibration, replacement of parts, and adjustment of the machine. Conventionally, whether or not maintenance is required has been determined based on criteria that depend on the operator. That is, the operator has empirically judged the necessity of maintenance from trends such as an increase in the defect rate of products and an increase in the production takt time. Therefore, an abnormality in the positioning system could be overlooked.
The present invention has been made in view of the above problem, and an object thereof is to provide a positioning system capable of accurately determining the presence or absence of an abnormality, a monitoring device used in the positioning system, and a monitoring method and a program for the positioning system.
According to an example of the present disclosure, a positioning system that performs a positioning process of an object includes a moving mechanism, a visual sensor, a movement control unit, a trajectory determination unit, and a determination unit. The moving mechanism moves the object. The visual sensor images the object and executes, for each imaging cycle, a position specifying process for specifying the position of the object based on the captured image. The movement control unit controls the moving mechanism so that the position specified by the visual sensor approaches a target position. The trajectory determination unit determines an ideal trajectory of the object from the position of the object specified by the first position specifying process to the target position. The determination unit determines the presence or absence of an abnormality in the positioning system based on a comparison result between the ideal trajectory and the actual trajectory of the position of the object specified by the position specifying processes after the first position specifying process.
If there is a deviation in the calibration parameters indicating the correspondence between the coordinate system of the visual sensor and the machine coordinate system of the moving mechanism, the actual trajectory deviates from the ideal trajectory. Therefore, according to the above disclosure, the determination unit can accurately determine, without depending on personal judgment, the presence or absence of an abnormality that causes a deviation of the calibration parameters, based on the comparison result between the actual trajectory and the ideal trajectory.
In the above disclosure, the visual sensor executes the first position specifying process while the object is stopped, and executes the subsequent position specifying processes while the object is moving. According to this disclosure, the positioning process can be speeded up.
In the above disclosure, the determination unit determines that an abnormality has occurred and issues a warning when a feature amount indicating the degree of deviation between the actual trajectory and the ideal trajectory exceeds a first threshold. According to this disclosure, the operator can perform maintenance for removing the cause of the deviation of the calibration parameters at an appropriate timing.
In the above disclosure, the determination unit stops the moving mechanism when the feature amount exceeds a second threshold. The second threshold is larger than the first threshold.
If the feature amount exceeds the second threshold, which is larger than the first threshold, there is a high possibility that a sudden abnormality such as a catch in the moving mechanism has occurred. In such a case, continuing to drive the moving mechanism may cause a failure of the moving mechanism. According to this disclosure, however, the moving mechanism is stopped when the feature amount exceeds the second threshold. As a result, a failure of the moving mechanism can be avoided.
In the above disclosure, the feature amount is the deviation between the position of the object specified by the subsequent position specifying processes and the ideal trajectory.
Alternatively, the actual trajectory and the ideal trajectory are indicated by information that associates the elapsed time from the start of control of the moving mechanism by the movement control unit with the position of the object. The feature amount may be the maximum value or the integrated value of the deviation between a position on the actual trajectory and the position on the ideal trajectory having the same elapsed time.
In the above disclosure, the trajectory determination unit determines the ideal trajectory by performing a simulation using a simulation model of the moving mechanism and the movement control unit and the position of the object specified by the first position specifying process. According to this disclosure, the ideal trajectory can be determined with high accuracy by using the simulation model.
In the above disclosure, the trajectory determination unit further includes a trained learner constructed by machine learning using learning data so as to output, in response to an input position of the object, information indicating a trajectory of the object from that position to the target position. The learning data is data indicating positions of the object specified by the visual sensor in positioning processes executed when the positioning system was normal. The trajectory determination unit determines, as the ideal trajectory, the trajectory indicated by the information output from the learner when the position of the object specified by the first position specifying process is input to the learner. According to this disclosure, the trajectory determination unit can determine an ideal trajectory that matches the actual situation.
In the above disclosure, the determination unit outputs a screen showing the actual trajectory and the ideal trajectory to a display device. According to this disclosure, the operator can grasp the degree of deviation of the calibration parameters by checking the screen.
In the above disclosure, the actual trajectory and the ideal trajectory are indicated by information that associates the elapsed time from the start of control of the moving mechanism by the movement control unit with the position of the object. The moving mechanism includes a plurality of translation mechanisms that translate in mutually different movement directions. The determination unit decomposes the deviation between a position on the actual trajectory and the position on the ideal trajectory having the same elapsed time into components in the movement direction corresponding to each of the plurality of translation mechanisms, and outputs a screen showing the change of each component over time to the display device.
According to this disclosure, the operator can infer which part of the moving mechanism is abnormal by checking the screen of the display device.
In the above disclosure, the determination unit outputs a screen showing the transition of the feature amount to the display device. According to this disclosure, the operator can grasp conditions such as the aging deterioration of the moving mechanism.
According to an example of the present disclosure, a monitoring device used in the above positioning system includes the above trajectory determination unit and the above determination unit. According to this disclosure as well, the determination unit can accurately determine, without depending on personal judgment, the presence or absence of an abnormality that causes a deviation of the calibration parameters, based on the comparison result between the actual trajectory and the ideal trajectory.
According to an example of the present disclosure, a method of monitoring a positioning system that performs a positioning process of an object includes first and second steps. The positioning system includes a moving mechanism for moving the object, a visual sensor for imaging the object and executing, for each imaging cycle, a position specifying process for specifying the position of the object based on the captured image, and a movement control unit for controlling the moving mechanism so that the position specified by the visual sensor approaches a target position. The first step is a step of determining an ideal trajectory of the object from the position of the object specified by the first position specifying process to the target position. The second step is a step of determining the presence or absence of an abnormality in the positioning system based on a comparison result between the ideal trajectory and the actual trajectory of the position of the object specified by the position specifying processes after the first position specifying process. According to this disclosure as well, the presence or absence of an abnormality that causes a deviation of the calibration parameters can be accurately determined.
According to an example of the present disclosure, a program for causing a computer to execute the above monitoring method causes the computer to execute the first and second steps. According to this disclosure as well, the presence or absence of an abnormality that causes a deviation of the calibration parameters can be accurately determined.
According to the present invention, the presence or absence of an abnormality in the positioning system can be accurately determined.
FIG. 1 is a schematic diagram showing an outline of the positioning system according to the present embodiment.
FIG. 2 is a diagram showing an example of an ideal trajectory and an actual trajectory.
FIG. 3 is a schematic diagram showing the hardware configuration of the image processing device constituting the positioning system 1 according to the present embodiment.
FIG. 4 is a schematic diagram showing the hardware configuration of the controller constituting the positioning system according to the embodiment.
FIG. 5 is a block diagram showing an example of the functional configuration of the positioning system shown in FIG. 1.
FIG. 6 is a flowchart showing an example of the flow of the positioning process in the controller.
FIG. 7 is a flowchart showing an example of the flow of the movement control process by the movement control unit.
FIG. 8 is a diagram illustrating a method of calculating the encoder value EnsX at the imaging time.
FIG. 9 is a diagram showing an example of the change in the position of a mark in the positioning process of a comparative example and the positioning process according to the embodiment.
FIG. 10 is a diagram showing an example of the change in the moving speed of the stage device in the positioning process of the comparative example and the positioning process according to the embodiment.
FIG. 11 is a flowchart showing an example of the flow of the monitoring process in the controller.
FIG. 12 is a flowchart showing the contents of the subroutine of the ideal trajectory determination process (step S21) shown in FIG. 11.
FIG. 13 is a flowchart showing the contents of the subroutine of the process for determining the presence or absence of an abnormality (step S22) shown in FIG. 11.
FIG. 14 is a flowchart showing the contents of the subroutine of the determination process according to the second modification.
FIG. 15 is a flowchart showing the flow of the monitoring process according to the third modification.
FIG. 16 is a flowchart showing the contents of the subroutine of the process (step S121) for determining the presence or absence of an abnormality and the degree of abnormality shown in FIG. 15.
FIG. 17 is a diagram showing the functional configuration of the controller according to the fifth modification.
FIG. 18 is a diagram showing the configuration of the positioning system according to the sixth modification.
FIG. 19 is an example of a screen showing an ideal trajectory and an actual trajectory.
FIG. 20 is a diagram showing an example of a screen showing the change over time of the X-direction and Y-direction components of the deviation between a position on the actual trajectory and the position on the ideal trajectory.
FIG. 21 is a diagram showing an example of a screen showing the change over time of a feature amount (integrated value of the deviation) indicating the degree of deviation between the ideal trajectory and the actual trajectory.
Embodiments of the present invention will be described in detail with reference to the drawings. The same or corresponding parts in the drawings are denoted by the same reference characters, and description thereof will not be repeated.
§1 Application example
An example of a scene to which the present invention is applied will be described with reference to FIG. 1. FIG. 1 is a schematic diagram showing an outline of the positioning system according to the present embodiment. The positioning system 1 shown in FIG. 1 positions an object (hereinafter referred to as a "target work W") using image processing. Positioning typically means a process of placing the target work W at its intended position on a production line in the manufacturing process of an industrial product or the like. For example, on a liquid crystal panel production line, the positioning system 1 positions a glass substrate with respect to an exposure mask before a circuit pattern is printed (exposed) onto the glass substrate.
As shown in FIG. 1, the positioning system 1 includes a moving mechanism 10, a driver unit 20, a visual sensor 30, a controller 40, and a display device 50.
The moving mechanism 10 moves the target work W placed on it. The moving mechanism 10 may have any degree of freedom as long as it can place the target work W at the target position. The moving mechanism 10 is, for example, an XYθ stage that can apply horizontal translation and rotation to the target work W.
The moving mechanism 10 in the example shown in FIG. 1 is an XYθ stage and includes an X stage 11, a Y stage 13, a θ stage 15, and servomotors 12, 14, and 16. The X stage 11 is a translation mechanism that translates along the X direction when driven by the servomotor 12. The Y stage 13 is a translation mechanism that translates along the Y direction when driven by the servomotor 14. The θ stage 15 is a rotation mechanism that rotates in the θ direction when driven by the servomotor 16. The target work W is placed on the θ stage 15.
The servomotors 12, 14, and 16 are provided with encoders 12E, 14E, and 16E, respectively. Each of the encoders 12E, 14E, and 16E generates a pulse signal corresponding to the drive amount (rotation amount) of the corresponding servomotor. The encoder 12E measures the movement amount of the X stage 11 from its initial position as an encoder value EnX by counting the number of pulses included in the pulse signal. The pulse count and the movement amount are related by a predetermined coefficient, so the encoder 12E can measure the movement amount by multiplying the pulse count by the coefficient. Similarly, the encoder 14E measures the translation amount of the Y stage 13 in the Y direction from its initial position as an encoder value EnY, and the encoder 16E measures the rotation amount of the θ stage 15 from its initial position as an encoder value Enθ.
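The pulse-count-to-movement conversion described above can be written as a one-line helper; the coefficient value below is only an illustrative placeholder, since the actual coefficient depends on the stage and motor.

```python
def encoder_to_movement(pulse_count, amount_per_pulse=0.001):
    """Convert an encoder pulse count into a movement amount (e.g. mm or rad).

    amount_per_pulse is the predetermined coefficient relating pulses to
    movement; the default value here is an illustrative placeholder."""
    return pulse_count * amount_per_pulse
```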
The driver unit 20 controls the operation of the moving mechanism 10 in accordance with the movement commands received from the controller 40 in each control cycle Ts. As shown in FIG. 1, the driver unit 20 includes servo drivers 22, 24, and 26. The servo drivers 22, 24, and 26 acquire the encoder values EnX, EnY, and Enθ from the servomotors 12, 14, and 16 in each control cycle Ts. The control cycle Ts is fixed and is, for example, 1 ms.
The servo driver 22 performs feedback control on the servomotor 12 so that the movement amount of the X stage 11 approaches the movement command. The servo driver 24 performs feedback control on the servomotor 14 so that the movement amount of the Y stage 13 approaches the movement command. The servo driver 26 performs feedback control on the servomotor 16 so that the movement amount of the θ stage 15 approaches the movement command.
The driver unit 20 outputs the encoder values EnX, EnY, and Enθ acquired from the moving mechanism 10 to the controller 40.
 The visual sensor 30 captures an image of the target workpiece W and, every imaging cycle Tb, executes a position specifying process that specifies the position of the target workpiece W based on the captured image. The imaging cycle Tb varies depending on the imaging conditions and is, for example, about 60 ms. In the example shown in FIG. 1, the visual sensor 30 specifies, as the position of the target workpiece W, the positions of the marks 5a and 5b, which are characteristic portions provided on the target workpiece W.
 The visual sensor 30 includes an imaging unit 31 installed at a fixed position and an image processing device 32. The imaging unit 31 performs an imaging process of capturing a subject present in its imaging field of view and generating image data, and thereby captures images of the target workpiece W. The imaging unit 31 is, for example, a camera.
 The image processing device 32 performs image analysis on the image data generated by the imaging unit 31 and calculates the coordinates of the positions of the marks 5a and 5b. These coordinates are expressed in a local coordinate system corresponding to the visual sensor 30 (hereinafter referred to as the "camera coordinate system").
 The controller 40 is, for example, a PLC (programmable logic controller) and performs various kinds of FA control.
 Based on the position of the target workpiece W specified by the visual sensor 30, the controller 40 updates the movement command every control cycle Ts and outputs it to the driver unit 20 so that the position of the target workpiece W approaches the target position. In the example shown in FIG. 1, the controller 40 updates the movement command every control cycle Ts and outputs it to the driver unit 20 so that the positions of the marks 5a and 5b specified by the visual sensor 30 approach their corresponding target positions.
 As described above, the coordinates of the positions of the marks 5a and 5b specified by the visual sensor 30 are expressed in the camera coordinate system, whereas the movement amount of the moving mechanism 10, which is the controlled object of the controller 40, is expressed in the coordinate system of the moving mechanism 10 (hereinafter referred to as the "machine coordinate system"). Therefore, the controller 40 converts the coordinates of the positions of the marks 5a and 5b from the camera coordinate system to the machine coordinate system using calibration parameters that associate the two coordinate systems, and generates the movement command MV using the converted coordinates.
 The controller 40 also monitors the presence or absence of an abnormality that causes the calibration parameters to deviate. A "deviation of the calibration parameters" means the difference between the currently set calibration parameters and the ideal calibration parameters corresponding to the current state of the moving mechanism 10. Abnormalities that cause such a deviation include, for example, aging degradation of the moving mechanism 10 and external factors.
 The controller 40 determines an ideal trajectory of the target workpiece W on the image from the position of the target workpiece W specified by the first position specifying process of the visual sensor 30 (the initial position) to the target position. The ideal trajectory is the trajectory that the target workpiece W would follow on the image if the deviation of the calibration parameters were zero.
 For example, the controller 40 determines the ideal trajectory of the mark 5a on the image from the initial position of the mark 5a specified by the visual sensor 30 to the target position of the mark 5a. The controller 40 likewise determines the ideal trajectory of the mark 5b on the image from the initial position of the mark 5b specified by the visual sensor 30 to the target position of the mark 5b.
 The controller 40 determines the presence or absence of an abnormality that causes a deviation of the calibration parameters, based on a comparison between the ideal trajectory and the actual trajectory of the position of the target workpiece W specified by the second and subsequent position specifying processes. The determination unit 49 outputs the determination result to the display device 50.
 As described above, the controller 40 controls the moving mechanism 10 so that the positions of the marks 5a and 5b specified by the visual sensor 30 every imaging cycle Tb approach their corresponding target positions. Therefore, even if the calibration parameters have deviated, each of the marks 5a and 5b of the target workpiece W eventually reaches its corresponding target position. Because of the deviation of the calibration parameters, however, the actual trajectory departs from the ideal trajectory.
 FIG. 2 is a diagram showing an example of an ideal trajectory and actual trajectories. FIG. 2 shows the ideal trajectory and actual trajectories of a target mark, which is one of the marks 5a and 5b; the same applies to the other mark.
 In FIG. 2, the position P(0) indicates the initial position of the target mark on the image, as specified by the first position specifying process. The solid line 70 indicates the ideal trajectory from the position P(0) to the target position SP of the target mark. The solid line 72a shows one example of an actual trajectory of the target mark, and the solid line 72b shows another example.
 The controller 40 sets, for example, an allowable region 71 that contains the ideal trajectory indicated by the solid line 70. The allowable region 71 is determined experimentally or theoretically. When the actual trajectory stays within the allowable region 71, the controller 40 determines that no abnormality causing a deviation of the calibration parameters has occurred. In the example shown in FIG. 2, for the actual trajectory indicated by the solid line 72a, the controller 40 determines that no such abnormality has occurred.
 On the other hand, when the actual trajectory departs from the allowable region 71, the controller 40 determines that an abnormality causing a deviation of the calibration parameters has occurred. In the example shown in FIG. 2, for the actual trajectory indicated by the solid line 72b, the determination unit 49 determines that such an abnormality has occurred.
 This allows the controller 40 to accurately determine the presence or absence of an abnormality causing a deviation of the calibration parameters, without relying on individual operators' judgment.
 §2 Specific Example
 Next, an example of the positioning system according to the present embodiment will be described.
 <2-1. Hardware Configuration of the Image Processing Device>
 FIG. 3 is a schematic diagram showing the hardware configuration of the image processing device 32 included in the positioning system 1 according to the present embodiment. The image processing device 32 typically has a structure based on a general-purpose computer architecture, and implements the various kinds of image processing described later by having its processor execute programs installed in advance.
 More specifically, the image processing device 32 includes a processor 310 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 312, a display controller 314, a system controller 316, an I/O (Input/Output) controller 318, a hard disk 320, a camera interface 322, an input interface 324, a controller interface 326, a communication interface 328, and a memory card interface 330. These units are connected to one another so as to be able to exchange data, with the system controller 316 at the center.
 The processor 310 exchanges programs (code) and the like with the system controller 316 and executes them in a predetermined order, thereby realizing the intended arithmetic processing.
 The system controller 316 is connected to the processor 310, the RAM 312, the display controller 314, and the I/O controller 318 via respective buses; it exchanges data with each of these units and supervises the processing of the image processing device 32 as a whole.
 The RAM 312 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds the programs read from the hard disk 320, the images (image data) captured by the imaging unit 31, the processing results for those images, work data, and the like. The processing results for an image include the position coordinates of the marks 5a and 5b contained in that image.
 The display controller 314 is connected to the display unit 60 and outputs signals for displaying various kinds of information to the display unit 60 in accordance with internal commands from the system controller 316.
 The I/O controller 318 controls the exchange of data with recording media and external devices connected to the image processing device 32. More specifically, the I/O controller 318 is connected to the hard disk 320, the camera interface 322, the input interface 324, the controller interface 326, the communication interface 328, and the memory card interface 330.
 The hard disk 320 is typically a nonvolatile magnetic storage device and stores various setting values in addition to the programs executed by the processor 310.
 The camera interface 322 corresponds to an input unit that receives the image data generated by photographing the target workpiece W, and mediates data transmission between the processor 310 and the imaging unit 31. The camera interface 322 includes an image buffer for temporarily storing the image data from the imaging unit 31.
 The input interface 324 mediates data transmission between the processor 310 and input devices such as a keyboard 334, a mouse, a touch panel, and a dedicated console.
 The controller interface 326 mediates data transmission between the processor 310 and the controller 40.
 The communication interface 328 mediates data transmission between the processor 310 and other personal computers, server devices, and the like (not shown). The communication interface 328 typically uses Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
 The memory card interface 330 mediates data transmission between the processor 310 and the recording medium 61.
 <2-2. Hardware Configuration of the Controller>
 FIG. 4 is a schematic diagram showing the hardware configuration of the controller 40 included in the positioning system according to the embodiment. The controller 40 includes a chipset 412, a processor 414, a nonvolatile memory 416, a main memory 418, a system clock 420, a memory card interface 422, a display controller 426, a communication interface 428, an internal bus controller 430, and a fieldbus controller 438. The chipset 412 is coupled to the other components via various buses.
 The processor 414 and the chipset 412 typically have configurations based on a general-purpose computer architecture. That is, the processor 414 interprets and executes instruction code supplied sequentially from the chipset 412 in accordance with an internal clock. The chipset 412 exchanges internal data with the various connected components and generates the instruction code required by the processor 414. The system clock 420 generates a system clock of a predetermined cycle and provides it to the processor 414. The chipset 412 also has a function of caching data obtained as a result of the arithmetic processing executed by the processor 414.
 The controller 40 has a nonvolatile memory 416 and a main memory 418 as storage means. The nonvolatile memory 416 holds, in a nonvolatile manner, data definition information, log information, and the like, in addition to the control program 440 executed by the processor 414. The control program 440 is distributed stored on the recording medium 424 or the like. The main memory 418 is a volatile storage area that holds the various programs to be executed by the processor 414 and is also used as working memory when those programs are executed.
 The controller 40 has a communication interface 428 and an internal bus controller 430 as communication means. These communication circuits transmit and receive data.
 The communication interface 428 exchanges data with the visual sensor 30. The internal bus controller 430 controls the exchange of data over the internal bus. More specifically, the internal bus controller 430 includes a buffer memory 436, a DMA (Direct Memory Access) control circuit 432, and an internal bus control circuit 434.
 The memory card interface 422 connects the processor 414 to the recording medium 424, which is detachable from the controller 40. The recording medium 424 is a medium that accumulates information such as programs by electrical, magnetic, optical, mechanical, or chemical action so that computers, other devices, machines, and the like can read the recorded information. The recording medium 424 is distributed with the control program 440 executed by the controller 40 and other data stored on it, and the memory card interface 422 reads the control program from the recording medium 424. The recording medium 424 is, for example, a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, or an optical recording medium such as a CD-ROM (Compact Disk Read Only Memory). Alternatively, a program downloaded from a distribution server or the like may be installed in the controller 40 via the communication interface 428.
 The fieldbus controller 438 is a communication interface for connecting to a field network. For the field network, for example, EtherCAT (registered trademark), EtherNet/IP (registered trademark), CompoNet (registered trademark), or the like is employed. The controller 40 is connected to the servo drivers 22, 24, and 26 via the fieldbus controller 438.
 The display controller 426 is connected to the display device 50 and outputs signals for displaying various kinds of information to the display device 50 in accordance with commands from the chipset 412.
 <2-3. Processing Example of the Image Processing Device>
 FIG. 5 is a block diagram showing an example of the functional configuration of the positioning system shown in FIG. 1. The image processing device 32 searches the image captured by the imaging unit 31 for the marks 5a and 5b (see FIG. 1) and specifies the position of the mark 5a (hereinafter referred to as the "measured position PMa") and the position of the mark 5b (hereinafter referred to as the "measured position PMb"). The image processing device 32 may recognize the marks 5a and 5b in the image using a known pattern recognition technique. The image processing device 32 measures the coordinates of the measured positions PMa and PMb, which are expressed in the camera coordinate system.
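 The document only states that a known pattern recognition technique may be used to find the marks; as one hedged sketch (OpenCV template matching is an assumption here, not the method prescribed by the document), the measured positions could be obtained as follows:

    # Hypothetical sketch: locating one mark in the captured image by template matching.
    # OpenCV is assumed only as an example of a "known pattern recognition technique".
    import cv2
    import numpy as np

    def find_mark(image: np.ndarray, template: np.ndarray) -> tuple:
        """Return the (x, y) camera-coordinate position of the best template match."""
        result = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(result)
        h, w = template.shape[:2]
        # Take the center of the matched region as the measured mark position.
        return (max_loc[0] + w / 2.0, max_loc[1] + h / 2.0)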
 <2-4. Functional Configuration of the Controller>
 As shown in FIG. 5, the controller 40 includes a movement control unit 41, a model storage unit 46, and a monitoring unit 47. The movement control unit 41 and the monitoring unit 47 are realized by the processor 414 shown in FIG. 4 executing the control program 440. The model storage unit 46 is realized by the nonvolatile memory 416 and the main memory 418 shown in FIG. 4.
 <2-4-1. Movement Control Unit>
 The movement control unit 41 generates movement commands so that the measured position PMa of the mark 5a approaches the target position SPa and the measured position PMb of the mark 5b approaches the target position SPb. The target positions SPa and SPb are predetermined. As shown in FIG. 5, the movement control unit 41 includes a coordinate conversion unit 42, a position determination unit 43, a subtraction unit 44, and a calculation unit 45.
 As described above, the measured positions PMa and PMb output from the visual sensor 30 are expressed in the camera coordinate system, whereas the movement amount of the moving mechanism 10, which is the controlled object of the movement control unit 41, is expressed in the machine coordinate system.
 Therefore, the coordinate conversion unit 42 performs coordinate conversion of the measured positions PMa and PMb specified by the visual sensor 30, using the calibration parameters for converting the camera coordinate system into the machine coordinate system, and calculates the coordinates of the measured positions PMa and PMb in the machine coordinate system. The coordinate conversion unit 42 may perform this conversion using, for example, an affine transformation. The coordinate conversion unit 42 outputs the coordinates of the measured positions PMa and PMb in the machine coordinate system to the position determination unit 43.
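 A minimal sketch of this coordinate conversion, assuming the calibration parameters take the common form of a 2x2 matrix A plus an offset vector b (the document does not fix the exact parameterization, and the numerical values below are placeholders):

    import numpy as np

    # Assumed calibration parameters: machine = A @ camera + b (placeholder values).
    A = np.array([[0.02, 0.00],
                  [0.00, 0.02]])   # scale/rotation/shear part of the affine map
    b = np.array([100.0, 50.0])    # translation part of the affine map

    def camera_to_machine(pm_camera: np.ndarray) -> np.ndarray:
        """Convert a measured position (PMa or PMb) from camera to machine coordinates."""
        return A @ pm_camera + b

    # Example: a mark measured at pixel (412.0, 233.5) in the camera coordinate system.
    pm_machine = camera_to_machine(np.array([412.0, 233.5]))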
 The measured positions PMa and PMb are specified every imaging cycle Tb, whereas the movement command is generated every control cycle Ts. As described above, the imaging cycle Tb is longer than the control cycle Ts. If the moving mechanism 10 were controlled using only the measured positions PMa and PMb specified by the visual sensor 30, overshoot and vibration would therefore be likely to occur. To avoid such overshoot and vibration, the position determination unit 43 determines estimated positions PVa and PVb of the two marks 5a and 5b of the target workpiece W every control cycle Ts, based on the measured positions PMa and PMb specified by the visual sensor 30 every imaging cycle Tb and on the encoder values EnX, EnY, and Enθ output every control cycle Ts. The method of determining the estimated positions PVa and PVb will be described later.
 The subtraction unit 44 outputs the deviation (distance) between the estimated position PVa and the target position SPa and the deviation (distance) between the estimated position PVb and the target position SPb.
 The calculation unit 45 performs a calculation (P calculation or PID calculation) so that the deviation between the estimated position PVa and the target position SPa and the deviation between the estimated position PVb and the target position SPb converge to zero, and calculates the movement commands MVX, MVY, and MVθ every control cycle Ts. The movement command MVX is a movement command for the X stage 11 (see FIG. 1), the movement command MVY is a movement command for the Y stage 13 (see FIG. 1), and the movement command MVθ is a movement command for the θ stage 15 (see FIG. 1). The calculation unit 45 outputs the calculated movement commands MVX, MVY, and MVθ to the driver unit 20. The movement commands MVX, MVY, and MVθ are, for example, position commands or speed commands.
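 As a hedged sketch of the P calculation (the gain value and the way the two mark deviations are combined into translation and rotation commands are assumptions, since the document does not spell them out here):

    import numpy as np

    KP = 0.5  # hypothetical proportional gain

    def p_commands(pva, pvb, spa, spb):
        """Derive MVX, MVY from the mean positional deviation of the two marks and
        MVtheta from the orientation error of the line joining them (an assumed
        combination used only for illustration)."""
        pva, pvb, spa, spb = map(np.asarray, (pva, pvb, spa, spb))
        mean_dev = ((spa - pva) + (spb - pvb)) / 2.0
        mvx, mvy = KP * mean_dev
        ang_now = np.arctan2(pvb[1] - pva[1], pvb[0] - pva[0])
        ang_goal = np.arctan2(spb[1] - spa[1], spb[0] - spa[0])
        mvtheta = KP * (ang_goal - ang_now)
        return float(mvx), float(mvy), float(mvtheta)

    # Example: commands for one control cycle Ts.
    print(p_commands((0.0, 0.0), (10.0, 0.0), (1.0, 2.0), (11.0, 2.5)))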
 <2-4-2. Model Storage Unit>
 The model storage unit 46 stores a simulation model that simulates the moving mechanism 10, the driver unit 20, and the movement control unit 41. The simulation model is composed of design data of the moving mechanism 10, programs that define the processing of the movement control unit 41, and the like.
 <2-4-3. Monitoring Unit>
 The monitoring unit 47 monitors the presence or absence of an abnormality that causes a deviation of the calibration parameters for converting the camera coordinate system into the machine coordinate system. As shown in FIG. 5, the monitoring unit 47 includes a trajectory determination unit 48 and a determination unit 49.
 The trajectory determination unit 48 performs a simulation using the simulation model stored in the model storage unit 46, and thereby determines the ideal trajectory of the target workpiece W on the image from the initial position of the target workpiece W specified by the first position specifying process to the target position.
 Specifically, the trajectory determination unit 48 performs a simulation using the simulation model to determine the ideal trajectory of the mark 5a on the image from the measured position PMa of the mark 5a specified by the first position specifying process of the visual sensor 30 to the target position SPa. Similarly, the trajectory determination unit 48 performs a simulation using the simulation model to determine the ideal trajectory of the mark 5b on the image from the measured position PMb of the mark 5b specified by the first position specifying process of the visual sensor 30 to the target position SPb. A specific processing method of the trajectory determination unit 48 will be described later.
 The determination unit 49 determines the presence or absence of an abnormality based on a comparison between the ideal trajectory and the actual trajectory of the position of the target workpiece W specified by the second and subsequent position specifying processes. The determination unit 49 outputs the determination result to the display device 50.
 The determination unit 49 evaluates the degree of divergence between the ideal trajectory of the mark 5a determined by the trajectory determination unit 48 and the measured positions PMa of the mark 5a specified by the second and subsequent position specifying processes. Similarly, the determination unit 49 evaluates the degree of divergence between the ideal trajectory of the mark 5b determined by the trajectory determination unit 48 and the measured positions PMb of the mark 5b specified by the second and subsequent position specifying processes. The determination unit 49 determines the presence or absence of an abnormality based on these evaluation results. A specific processing method of the determination unit 49 will be described later.
 §3 Operation Example
 <3-1. Flow of the Positioning Process in the Controller>
 An example of the flow of the positioning process performed by the controller 40 will be described with reference to FIG. 6. FIG. 6 is a flowchart showing an example of the flow of the positioning process in the controller.
 First, in step S1, the controller 40 initializes the estimated positions PVa and PVb and the encoder values EnX, EnY, and Enθ.
 Next, when the target workpiece W is placed on the moving mechanism 10, the controller 40 outputs an imaging trigger to the visual sensor 30 in accordance with an instruction from a higher-level control device. Then, in step S2, the visual sensor 30 captures an image of the target workpiece W on the stationary moving mechanism 10. In step S3, the visual sensor 30 specifies the measured positions PMa and PMb of the marks 5a and 5b included in the captured image. Steps S2 and S3 correspond to the first position specifying process performed by the visual sensor 30.
 Next, in step S4, the movement control unit 41 starts the movement control of the moving mechanism 10. The details of the movement control performed by the movement control unit 41 will be described later.
 Next, in step S5, the movement control unit 41 determines whether the deviations of the estimated positions PVa and PVb from the target positions SPa and SPb are both less than a threshold Th0. The threshold Th0 is predetermined according to the required positioning accuracy. When both deviations are less than the threshold Th0 (YES in step S5), the movement control unit 41 ends the movement control of the moving mechanism 10, and the positioning process thereby ends.
 When at least one of the deviations of the estimated positions PVa and PVb from the target positions SPa and SPb is not less than the threshold Th0 (NO in step S5), the processing of steps S6 to S8 is repeated every imaging cycle Tb. Note that the movement control by the movement control unit 41 continues to be executed in parallel during steps S6 to S8.
 In step S6, the controller 40 determines whether the current time is an imaging trigger output time. The imaging trigger output times are predetermined according to the imaging cycle Tb; that is, the time at which the imaging cycle Tb has elapsed since the previous imaging trigger output time is set as the next imaging trigger output time. If the current time is not an imaging trigger output time (NO in step S6), the positioning process returns to step S5.
 If the current time is an imaging trigger output time (YES in step S6), the controller 40 outputs an imaging trigger to the visual sensor 30. In step S7, the visual sensor 30 that has received the imaging trigger captures an image of the target workpiece W on the moving mechanism 10 while the mechanism is moving. In step S8, the visual sensor 30 specifies the measured positions PMa and PMb of the marks 5a and 5b included in the captured image. Steps S7 and S8 correspond to the position specifying processes performed after the first one (the second and subsequent position specifying processes). After step S8, the positioning process returns to step S5.
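 A small sketch of the trigger-time check in step S6 (the cycle value and helper name are assumptions used only for illustration):

    # Hedged sketch of the imaging-trigger scheduling in step S6.
    TB = 0.060  # imaging cycle Tb [s]; the text gives "about 60 ms"

    def is_trigger_time(now: float, last_trigger: float) -> bool:
        """True once the imaging cycle Tb has elapsed since the previous trigger."""
        return now - last_trigger >= TB

    # Example usage inside the control loop (times in seconds):
    t_last = 0.0
    for t in (0.001, 0.030, 0.061):
        if is_trigger_time(t, t_last):
            t_last = t  # a real implementation would output the imaging trigger here
    print(t_last)  # prints 0.061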
 <3-2. Processing of the Movement Control Unit>
 FIG. 7 is a flowchart showing an example of the flow of the movement control processing performed by the movement control unit. First, in step S11, the movement control unit 41 acquires the coordinates (in the camera coordinate system) of the latest measured positions PMa and PMb specified by the visual sensor 30. Next, in step S12, the coordinate conversion unit 42 converts the coordinates of the measured positions PMa and PMb into the machine coordinate system.
 In step S13, the position determination unit 43 acquires the imaging time ti. The position determination unit 43 takes, as the imaging time ti, the time at which a fixed transmission delay time Tsd has elapsed since the imaging trigger was output. The transmission delay time Tsd is the delay from when the visual sensor 30 receives the imaging trigger to when it starts imaging.
 Next, in step S14, the position determination unit 43 acquires the encoder values EnX, EnY, and Enθ output from the encoders 12E, 14E, and 16E at a plurality of times close to the imaging time ti (hereinafter referred to as "output times"). Encoder values detected in the past are stored in the storage unit of the controller 40 (for example, the nonvolatile memory 416 or the main memory 418 (see FIG. 4)).
 Next, in step S15, the position determination unit 43 calculates an interpolated value of the encoder values EnX at the plurality of output times and takes that interpolated value as the encoder value EnsX at the imaging time ti. Similarly, the position determination unit 43 calculates an interpolated value of the encoder values EnY at the plurality of times and takes it as the encoder value EnsY at the imaging time, and calculates an interpolated value of the encoder values Enθ at the plurality of times and takes it as the encoder value Ensθ at the imaging time.
 FIG. 8 is a diagram illustrating the method of calculating the encoder value EnsX at the imaging time. Referring to FIG. 8, the position determination unit 43 calculates the interpolated value as follows. Let EnX(j) denote the encoder value EnX output at the output time t(j). Note that there is a fixed transmission delay time Ted from when the encoder 12E detects the pulse signal to when the encoder value EnX is output. Therefore, the encoder value EnX(j) output at the output time t(j) indicates the drive amount of the servomotor 12 from the initial position at the time that precedes the output time t(j) by the transmission delay time Ted.
 The position determination unit 43 specifies two output times close to the imaging time ti. For example, when the control cycle Ts, the encoder-value transmission delay time Ted, and the imaging-trigger transmission delay time Tsd satisfy Ts - Ted ≤ Tsd < 2Ts - Ted, the position determination unit 43 specifies the output time t(n) immediately after the imaging time ti and the output time t(n+1).
 The position determination unit 43 acquires the encoder value EnX(n) output at the output time t(n) and the encoder value EnX(n+1) output at the output time t(n+1).
 The position determination unit 43 calculates the encoder value EnsX(i) at the imaging time ti by interpolating between the encoder value EnX(n) and the encoder value EnX(n+1). Specifically, the position determination unit 43 calculates the encoder value EnsX(i) at the imaging time ti using the following equation (1).
 EnsX(i) = EnX(n) + Kk * (EnX(n+1) - EnX(n)) ... (1)
 Here, Kk is an interpolation coefficient. When Ts - Ted ≤ Tsd < 2Ts - Ted, the interpolation coefficient Kk is calculated using the following equation (2).
 Kk = {Tsd - (Ts - Ted)} / Ts ... (2)
 By using this interpolation method, the encoder value EnsX(i) at the imaging time ti can be calculated with high accuracy. The encoder values EnsY(i) and Ensθ(i) at the imaging time ti are calculated in the same way. Note that if the imaging time ti coincides with the time that precedes an encoder-value output time by the transmission delay time Ted, that encoder value may be used as it is.
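 Equations (1) and (2) translate directly into code; the concrete timing values below are assumptions chosen only to satisfy Ts - Ted ≤ Tsd < 2Ts - Ted:

    # Direct transcription of equations (1) and (2).
    TS = 0.001    # control cycle Ts [s]
    TED = 0.0002  # encoder transmission delay Ted [s] (assumed value)
    TSD = 0.0010  # imaging-trigger transmission delay Tsd [s] (assumed value)

    def interpolation_coefficient() -> float:
        """Equation (2): Kk = {Tsd - (Ts - Ted)} / Ts."""
        return (TSD - (TS - TED)) / TS

    def encoder_at_imaging_time(enx_n: float, enx_n1: float) -> float:
        """Equation (1): interpolate EnsX(i) between EnX(n) and EnX(n+1)."""
        kk = interpolation_coefficient()
        return enx_n + kk * (enx_n1 - enx_n)

    # Example: encoder values 120.0 and 121.0 at t(n) and t(n+1) give EnsX(i) of about 120.2.
    print(encoder_at_imaging_time(120.0, 121.0))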
 Next, in step S16 (see FIG. 7), the position determination unit 43 calculates the estimated positions PVa and PVb using the measured positions PMa and PMb, the encoder values EnX, EnY, and Enθ after the imaging time, and the encoder values EnsX, EnsY, and Ensθ at the imaging time.
 Specifically, the position determination unit 43 inputs the measured position PMa into the affine transformation equation for a translation of (EnX - EnsX) in the X direction, a translation of (EnY - EnsY) in the Y direction, and a rotation of (Enθ - Ensθ). The position determination unit 43 thereby calculates the coordinates of the estimated position PVa of the mark 5a. Similarly, the position determination unit 43 calculates the coordinates of the estimated position PVb of the mark 5b by inputting the measured position PMb into the same affine transformation equation.
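 A hedged sketch of this step, assuming the rotation component (Enθ - Ensθ) is applied about the rotation center of the θ stage (the document gives that center explicitly only in equation (3) below, so treating it as the center here is an assumption):

    import numpy as np

    def estimate_position(pm, en, ens, center):
        """Apply the stage motion that occurred after the imaging time to a measured
        mark position PM (machine coordinates) to obtain the estimated position PV.
        en  = (EnX, EnY, EnTheta) at the current control cycle,
        ens = (EnsX, EnsY, EnsTheta) interpolated at the imaging time."""
        pm, en, ens, center = map(np.asarray, (pm, en, ens, center))
        dx, dy, dtheta = en - ens
        c, s = np.cos(dtheta), np.sin(dtheta)
        rotated = center + np.array([[c, -s], [s, c]]) @ (pm - center)
        return rotated + np.array([dx, dy])

    # Example: the stage translated by (0.5, -0.2) and rotated by 0.01 rad since imaging.
    pv_a = estimate_position((10.0, 5.0), (100.5, 49.8, 0.01), (100.0, 50.0, 0.0), (0.0, 0.0))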
 Next, in step S17, the calculation unit 45 generates the movement commands MVX, MVY, and MVθ based on the deviations of the estimated positions PVa and PVb from the target positions SPa and SPb, for example by a P calculation. The calculation unit 45 then outputs the movement commands MVX, MVY, and MVθ to the driver unit 20.
 By executing such processing, the controller 40 calculates the estimated positions PVa and PVb from the highly accurate measured positions PMa and PMb obtained by image processing at the times when those measured positions are input, and thereby realizes highly accurate positioning control. The time interval at which the measured positions PMa and PMb are input is the imaging cycle Tb, which is longer than the control cycle Ts at which the encoder values EnX, EnY, and Enθ are input. However, between the input times of temporally adjacent measured positions PMa and PMb, the position determination unit 43 determines the estimated positions PVa and PVb at every input time of the encoder values EnX, EnY, and Enθ, whose input cycle is shorter, and controls the movement of the moving mechanism 10 accordingly. This enables positioning control that is both highly accurate and short in cycle. Furthermore, the position determination unit 43 performs the processing described above using only simple arithmetic operations, so fast and highly accurate positioning can be realized with a simple configuration and simple processing.
 <3-3. Changes in Mark Position>
 FIG. 9 is a diagram showing an example of the change in the position of a mark in the positioning process of a comparative example and in the positioning process according to the embodiment. FIG. 10 is a diagram showing an example of the change in the moving speed of the stage device in the positioning process of the comparative example and in the positioning process according to the embodiment.
 In the positioning process of the comparative example, the required movement amount of the moving mechanism 10 is determined from an image obtained by imaging the target workpiece W, and a unit process of moving the moving mechanism 10 by the determined required movement amount is performed. After the movement of the moving mechanism 10 by the unit process is completed, the target workpiece W is imaged again and the unit process is repeated. By repeating the unit process a plurality of times in this way, the target workpiece W is positioned at the target position.
 In FIG. 9, reference numerals 81, 82, 83, and 84 denote the trajectories on the image of the target mark (mark 5a or mark 5b) on the target workpiece W in the first, second, third, and fourth unit processes, respectively. In FIG. 10, reference numerals 91, 92, 93, and 94 denote the changes in the moving speed of the moving mechanism 10 in the first, second, third, and fourth unit processes, respectively.
 On the other hand, in FIG. 9, reference numeral 80 denotes the trajectory of the target mark on the image when the movement of the moving mechanism 10 is controlled by the movement control unit 41 according to the embodiment. In FIG. 10, reference numeral 90 denotes the change in the moving speed of the moving mechanism 10 whose movement is controlled by the movement control unit 41 according to the embodiment.
 As shown in FIGS. 9 and 10, in the positioning process of the comparative example, the target workpiece W is imaged only after the moving mechanism 10 has been stopped, so the positioning process takes a long time. In contrast, in the positioning system 1 according to the embodiment, the target workpiece W is imaged while it is moving, and the movement commands are generated based on the captured images. The time required for the positioning process can therefore be shortened.
 <3-4. Monitoring Process>
 An example of the flow of the monitoring process for the positioning system 1 performed by the controller 40 will be described with reference to FIG. 11. FIG. 11 is a flowchart showing an example of the flow of the monitoring process in the controller. The monitoring process shown in FIG. 11 is performed for each positioning process of the target workpiece W and is executed in parallel with the positioning process shown in FIG. 6.
 In step S21, the trajectory determination unit 48 determines the ideal trajectories of the marks 5a and 5b on the image based on the measured positions PMa and PMb specified by the first position specifying process of the visual sensor 30 (steps S2 and S3 in FIG. 6). Hereinafter, the measured positions PMa and PMb specified by the first position specifying process are referred to as the initial positions Pa(0) and Pb(0), respectively.
 Next, in step S22, the determination unit 49 compares, for each of the marks 5a and 5b, the actual trajectory of the measured positions specified by the second and subsequent position specifying processes with the ideal trajectory determined in step S21. The determination unit 49 then determines the presence or absence of an abnormality in the positioning system 1 based on the comparison result.
 When the determination result indicates that an abnormality is present (YES in step S23), the determination unit 49 issues, in step S24, a warning indicating that an abnormality causing a deviation of the calibration parameters has occurred. Specifically, the determination unit 49 causes the display device 50 to display a warning screen. After step S24, the process proceeds to step S25. When the determination result indicates that no abnormality is present (NO in step S23), the process also proceeds to step S25.
 In step S25, the monitoring unit 47 determines whether the positioning process has ended. That is, when the result of step S5 in FIG. 6 is YES, the monitoring unit 47 determines that the positioning process has ended. When the positioning process has not ended (NO in step S25), the monitoring process returns to step S22. When the positioning process has ended (YES in step S25), the monitoring process ends.
 <3-5. Ideal Trajectory Determination Process>
 The trajectory determination unit 48 determines the ideal trajectories by performing, for example, the processing shown in the flowchart of FIG. 12. FIG. 12 is a flowchart showing the contents of the subroutine of the ideal trajectory determination process (step S21) shown in FIG. 11.
 In step S31, the trajectory determination unit 48 sets to 0 the elapsed time t from the start of the control of the moving mechanism 10 by the movement control unit 41.
 Next, in step S32, the trajectory determination unit 48 acquires from the visual sensor 30 the coordinates (XY coordinates) of the initial positions Pa(0) and Pb(0) specified by the first position specifying process. The coordinates of the initial positions Pa(0) and Pb(0) acquired from the visual sensor 30 are expressed in the camera coordinate system. The trajectory determination unit 48 also saves the coordinates (camera coordinate system) of the initial positions Pa(0) and Pb(0) in the storage unit of the controller 40 (for example, the nonvolatile memory 416 or the main memory 418 (see FIG. 4)).
 In step S33, the trajectory determination unit 48 converts the coordinates of the initial positions Pa(0) and Pb(0) from the camera coordinate system to the machine coordinate system using the calibration parameters.
 In step S34, the trajectory determination unit 48 calculates the required movement amount of the moving mechanism 10 using the simulation model stored in the model storage unit 46 and the positions Pa(t) and Pb(t). In step S34 executed immediately after step S33, the positions Pa(t) and Pb(t) are the initial positions Pa(0) and Pb(0), respectively.
 The trajectory determination unit 48 calculates the deviation between the position Pa(t) and the target position SPa and the deviation between the position Pb(t) and the target position SPb as the required movement amounts of the moving mechanism 10. This calculation is the same as the processing of the subtraction unit 44 of the movement control unit 41 and is performed using the XY coordinates of the machine coordinate system.
 Next, in step S35, the trajectory determination unit 48 calculates the movement commands MVX, MVY, and MVθ corresponding to the required movement amounts. This calculation is the same as the processing of the calculation unit 45 of the movement control unit 41.
 Next, in step S36, the trajectory determination unit 48 calculates the translational movement amount ΔX of the X stage 11, the translational movement amount ΔY of the Y stage 13, and the rotational movement amount Δθ of the θ stage 15 in one control cycle Ts. The trajectory determination unit 48 calculates the translational movement amount ΔX on the assumption that the X stage 11 has moved ideally in accordance with the movement command MVX calculated in step S35. Similarly, the trajectory determination unit 48 calculates the translational movement amount ΔY and the rotational movement amount Δθ on the assumption that the Y stage 13 and the θ stage 15 have moved ideally in accordance with the movement commands MVY and MVθ calculated in step S35.
 Next, in step S37, the trajectory determination unit 48 calculates the coordinates (machine coordinate system) of the position Pa(t+Ts) of the mark 5a after one control cycle Ts according to the following equation (3). Similarly, the trajectory determination unit 48 calculates the coordinates (machine coordinate system) of the position Pb(t+Ts) of the mark 5b after one control cycle Ts according to equation (3). In equation (3), x and y denote the XY coordinates (machine coordinate system) of the position Pa(t) or the position Pb(t), x' and y' denote the XY coordinates (machine coordinate system) of the position Pa(t+Ts) or the position Pb(t+Ts), and Xo and Yo denote the X coordinate and the Y coordinate of the rotation center of the θ stage 15, respectively.
 x' = (x - Xo) * cosΔθ - (y - Yo) * sinΔθ + Xo + ΔX
 y' = (x - Xo) * sinΔθ + (y - Yo) * cosΔθ + Yo + ΔY ... (3)
 Next, in step S38, the trajectory determination unit 48 converts the XY coordinates (machine coordinate system) of the positions Pa(t+Ts) and Pb(t+Ts) into the camera coordinate system. The trajectory determination unit 48 then saves the XY coordinates (camera coordinate system) of the positions Pa(t+Ts) and Pb(t+Ts) in the storage unit of the controller 40.
 Next, in step S39, the trajectory determination unit 48 determines whether the positions Pa(t+Ts) and Pb(t+Ts) coincide with the target positions SPa and SPb, respectively. Specifically, when the deviations between the positions Pa(t+Ts), Pb(t+Ts) and the target positions SPa, SPb are both less than the threshold Th0, the trajectory determination unit 48 determines that the positions Pa(t+Ts) and Pb(t+Ts) coincide with the target positions SPa and SPb.
 When the position Pa(t+Ts) does not coincide with the target position SPa or the position Pb(t+Ts) does not coincide with the target position SPb (NO in step S39), the trajectory determination process proceeds to step S40. In step S40, the trajectory determination unit 48 increments the elapsed time t from the start of the movement control of the moving mechanism 10 by one control cycle Ts. After step S40, steps S34 to S39 are repeated.
 When the positions Pa(t+Ts) and Pb(t+Ts) coincide with the target positions SPa and SPb, respectively (YES in step S39), the trajectory determination process ends.
 The coordinates (camera coordinate system) of the positions Pa(0), Pa(Ts), Pa(2Ts), ..., Pa(nTs) saved in the storage unit of the controller 40 are information that associates the elapsed time t from the start of control by the movement control unit 41 with the ideal position of the mark 5a on the image; they represent the ideal trajectory of the mark 5a on the image. Similarly, the coordinates (camera coordinate system) of the positions Pb(0), Pb(Ts), Pb(2Ts), ..., Pb(nTs) saved in the storage unit of the controller 40 are information that associates the elapsed time t from the start of control by the movement control unit 41 with the ideal position of the mark 5b on the image; they represent the ideal trajectory of the mark 5b on the image.
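 A hedged sketch tying steps S34 to S40 together for a single mark. The per-cycle command law of the real simulation model is not reproduced here; a simple proportional step toward the target stands in for it, and the gain, threshold, and rotation center are assumed values:

    import numpy as np

    TH0 = 0.01  # convergence threshold Th0 (assumed value)
    KP = 0.1    # hypothetical per-cycle proportional factor replacing the simulation model

    def step_position(p, dx, dy, dtheta, center):
        """Equation (3): rotate p about the theta-stage center (Xo, Yo) by dtheta,
        then translate by (dx, dy)."""
        c, s = np.cos(dtheta), np.sin(dtheta)
        x, y = np.asarray(p) - np.asarray(center)
        return np.asarray(center) + np.array([c * x - s * y, s * x + c * y]) + np.array([dx, dy])

    def ideal_trajectory(p0, sp, center, max_cycles=10000):
        """Iterate the per-control-cycle update until the position reaches the target
        within Th0, recording every intermediate position (the ideal trajectory)."""
        p = np.asarray(p0, dtype=float)
        sp = np.asarray(sp, dtype=float)
        trajectory = [p.copy()]
        for _ in range(max_cycles):
            if np.linalg.norm(sp - p) < TH0:
                break
            dx, dy = KP * (sp - p)           # assumed ideal translation for one cycle Ts
            p = step_position(p, dx, dy, 0.0, center)
            trajectory.append(p.copy())
        return trajectory

    # Example: trajectory of one mark from (0, 0) toward the target (10, 5).
    points = ideal_trajectory((0.0, 0.0), (10.0, 5.0), center=(0.0, 0.0))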
<3-6. Process for determining the presence or absence of an abnormality>
The determination unit 49 determines whether the positioning system 1 is abnormal by performing, for example, the process shown in the flowchart of FIG. 13. FIG. 13 is a flowchart showing the contents of the subroutine of the abnormality determination process (step S22) shown in FIG. 11.
In step S41, the determination unit 49 determines, based on the ideal trajectory of the mark 5a, a first allowable region on the image through which the mark 5a should pass. Specifically, the determination unit 49 determines, as the first allowable region (see the allowable region 71 in FIG. 2), the region whose distance from the curve connecting the positions Pa(0), Pa(Ts), Pa(2Ts), ..., Pa(nTs) in order (see the solid line 70 in FIG. 2) is equal to or smaller than a threshold Th1.
Similarly, in step S41, the determination unit 49 determines, based on the ideal trajectory of the mark 5b, a second allowable region on the image through which the mark 5b should pass. Specifically, the determination unit 49 determines, as the second allowable region (see the allowable region 71 in FIG. 2), the region whose deviation (distance) from the curve connecting the positions Pb(0), Pb(Ts), Pb(2Ts), ..., Pb(nTs) in order (see the solid line 70 in FIG. 2) is equal to or smaller than the threshold Th1.
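A minimal sketch of one way to implement the allowable-region test of step S41: the ideal trajectory is treated as a polyline through Pa(0), Pa(Ts), ..., Pa(nTs), and a point lies in the region when its distance to the nearest polyline segment is at most Th1. The helper names are hypothetical.

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Euclidean distance from point p to the segment a-b (2-D points)."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    denom = float(ab @ ab)
    t = 0.0 if denom == 0.0 else float(np.clip((p - a) @ ab / denom, 0.0, 1.0))
    return float(np.linalg.norm(p - (a + t * ab)))

def in_allowable_region(p, ideal_points, th1):
    """True if p lies within distance th1 of the polyline through ideal_points."""
    d = min(point_to_segment_distance(p, ideal_points[i], ideal_points[i + 1])
            for i in range(len(ideal_points) - 1))
    return d <= th1

# Example with a hypothetical ideal trajectory and threshold.
ideal = [(0.0, 0.0), (5.0, 2.0), (10.0, 3.0)]
print(in_allowable_region((5.1, 2.4), ideal, th1=0.5))   # True
print(in_allowable_region((5.0, 4.0), ideal, th1=0.5))   # False
```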
In step S42, the determination unit 49 acquires from the visual sensor 30 the coordinates (camera coordinate system) of the latest measured positions PMa and PMb. The coordinates of the measured positions PMa and PMb are specified by the second and subsequent position specification processes (steps S7 and S8 in FIG. 6).
Next, in step S43, the determination unit 49 determines whether the latest measured position PMa is located within the first allowable region and the latest measured position PMb is located within the second allowable region. When the latest measured position PMa is located within the first allowable region, the deviation (distance) between the measured position PMa and the ideal trajectory of the mark 5a is equal to or smaller than the threshold Th1. The deviation (distance) between the measured position PMa and the ideal trajectory of the mark 5a is one of the feature quantities indicating the degree of divergence between the actual trajectory and the ideal trajectory. Similarly, when the latest measured position PMb is located within the second allowable region, the deviation (distance) between the measured position PMb and the ideal trajectory of the mark 5b is equal to or smaller than the threshold Th1. The deviations (distances) between the measured positions PMa, PMb and the ideal trajectories of the marks 5a, 5b are feature quantities indicating the degree of divergence between the actual trajectories and the ideal trajectories.
If NO in step S43, the determination unit 49 determines in step S44 that an abnormality is present. If YES in step S43, the determination unit 49 determines in step S45 that no abnormality is present. After step S44 or step S45, the abnormality determination process ends.
The abnormality determination process shown in FIG. 13 is executed repeatedly until the positioning process ends. Since step S41 yields the same result every time, it is omitted in the second and subsequent executions of the abnormality determination process.
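Putting steps S42 to S45 together, the per-cycle check could be sketched as below. Here the deviation is approximated by the distance to the nearest sampled point of the ideal trajectory, and the sensor interface is only indicated in comments; all names are illustrative assumptions rather than the actual implementation.

```python
import numpy as np

def deviation_from_ideal(pm, ideal_points):
    """Smallest distance from a measured position to the sampled ideal
    trajectory points P(0), P(Ts), ..., P(nTs) -- one feature quantity for the
    divergence between the actual and ideal trajectories."""
    ideal = np.asarray(ideal_points, dtype=float)
    return float(np.min(np.linalg.norm(ideal - np.asarray(pm, dtype=float), axis=1)))

def check_abnormality(pm_a, pm_b, ideal_a, ideal_b, th1):
    """Steps S43-S45 of FIG. 13: abnormal unless both marks stay within
    distance Th1 of their respective ideal trajectories."""
    return not (deviation_from_ideal(pm_a, ideal_a) <= th1
                and deviation_from_ideal(pm_b, ideal_b) <= th1)

ideal_a = [(0, 0), (5, 2), (10, 3)]
ideal_b = [(0, 10), (5, 12), (10, 13)]
# In the monitoring loop, pm_a and pm_b would come from the visual sensor (step S42).
print(check_abnormality((5.1, 2.2), (5.0, 12.1), ideal_a, ideal_b, th1=0.5))  # False: no abnormality
```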
<3-7. Functions and effects>
As described above, the positioning system 1 that performs the positioning process on the target workpiece W includes the moving mechanism 10, the visual sensor 30, the movement control unit 41, the trajectory determination unit 48, and the determination unit 49. The moving mechanism 10 moves the target workpiece W. The visual sensor 30 images the target workpiece W and executes, at every imaging period Tb, a position specification process that specifies the position of the target workpiece W (specifically, the measured positions PMa and PMb of the marks 5a and 5b) based on the captured image. The movement control unit 41 controls the moving mechanism 10 so that the measured positions PMa and PMb specified by the visual sensor 30 approach the target positions SPa and SPb, respectively. The trajectory determination unit 48 determines the ideal trajectory of the mark 5a from the initial position Pa(0) of the mark 5a, specified by the first position specification process, to the target position SPa. The trajectory determination unit 48 further determines the ideal trajectory of the mark 5b from the initial position Pb(0) of the mark 5b, specified by the first position specification process, to the target position SPb. The determination unit 49 compares the actual trajectory of the measured position PMa of the mark 5a, specified by the position specification processes after the first one, with the ideal trajectory of the mark 5a, and likewise compares the actual trajectory of the measured position PMb of the mark 5b with the ideal trajectory of the mark 5b. Based on the comparison results, the determination unit 49 determines whether the positioning system 1 is abnormal.
When there is no deviation in the calibration parameters, which represent the correspondence between the camera coordinate system of the visual sensor 30 and the machine coordinate system of the moving mechanism 10, the actual trajectories of the marks 5a and 5b coincide with the ideal trajectories. When the calibration parameters have deviated, the actual trajectories of the marks 5a and 5b depart from the ideal trajectories. Therefore, without relying on individual judgment, the determination unit 49 can accurately determine, based on the result of comparing the actual trajectories of the marks 5a and 5b with the ideal trajectories, whether an abnormality that would cause the calibration parameters to deviate is present.
The visual sensor 30 executes the first position specification process while the target workpiece W is stopped and executes the second and subsequent position specification processes while the target workpiece W is moving. This speeds up the positioning process.
When the feature quantity indicating the degree of divergence between the actual trajectory and the ideal trajectory (the deviation (distance) between the measured positions PMa, PMb and the ideal trajectories of the marks 5a, 5b) exceeds the threshold Th1, the determination unit 49 determines that an abnormality has occurred and issues a warning. This allows the operator to carry out, at an appropriate time, maintenance to remove the cause of the deviation of the calibration parameters.
The trajectory determination unit 48 determines the ideal trajectories of the marks 5a and 5b by performing a simulation using a simulation model of the moving mechanism 10 and the movement control unit 41 together with the initial positions Pa(0) and Pb(0) specified by the first position specification process. Using a simulation model allows the ideal trajectories to be determined with high accuracy.
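To make the simulation idea concrete, the sketch below steps an assumed discrete-time model of the stage and its controller every control period Ts from the measured initial position until the target is reached, and records the simulated positions as the ideal trajectory. The pure proportional-control model is only an illustrative stand-in for the actual simulation model.

```python
import numpy as np

def simulate_ideal_trajectory(p0, target, ts=0.001, gain=20.0, th0=1e-3, max_steps=10_000):
    """Step a simple proportional-control stage model every control period Ts
    until the simulated position reaches the target (deviation < Th0).
    Returns a list of (elapsed_time, position) pairs -- the ideal trajectory."""
    p = np.asarray(p0, dtype=float)
    target = np.asarray(target, dtype=float)
    trajectory = [(0.0, p.copy())]
    for k in range(1, max_steps + 1):
        p = p + gain * (target - p) * ts            # one control period of the assumed model
        trajectory.append((k * ts, p.copy()))
        if np.linalg.norm(target - p) < th0:        # termination analogous to step S39
            break
    return trajectory

# Example: ideal trajectory of mark 5a from a hypothetical initial position Pa(0).
traj_a = simulate_ideal_trajectory(p0=(12.5, -3.2), target=(0.0, 0.0))
print(len(traj_a), traj_a[-1])
```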
§4 Modifications
<4-1. Modification 1>
In the above description, the determination unit 49 determines the presence or absence of an abnormality depending on whether the latest measured positions PMa and PMb specified by the visual sensor 30 are located within the first allowable region and the second allowable region, respectively. Alternatively, the determination unit 49 may determine the presence or absence of an abnormality depending on whether the estimated positions PVa and PVb, output from the position determination unit 43 every control period Ts, are located within the first allowable region and the second allowable region, respectively.
The estimated positions PVa and PVb are expressed in the machine coordinate system. The determination unit 49 therefore converts the coordinates (machine coordinate system) of the estimated positions PVa and PVb into the camera coordinate system before determining whether the estimated positions PVa and PVb are located within the first allowable region and the second allowable region, respectively.
<4-2. Modification 2>
The determination unit 49 may execute the abnormality determination process shown in FIG. 14 instead of the abnormality determination process shown in FIG. 13. FIG. 14 is a flowchart showing the contents of the subroutine of the determination process according to Modification 2. In Modification 2, as in Modification 1, the presence or absence of an abnormality is determined using the estimated positions PVa and PVb.
First, in step S141, the determination unit 49 acquires the estimated positions PVa and PVb from the position determination unit 43.
In step S142, the determination unit 49 converts the coordinates (machine coordinate system) of the estimated positions PVa and PVb into the camera coordinate system.
In step S143, the determination unit 49 specifies the elapsed time t from the start of control by the movement control unit 41 to the time when the estimated positions PVa and PVb were output. Since the estimated positions PVa and PVb are output every control period Ts, this time t equals kTs, where k is the number of times the estimated positions PVa and PVb have been output since the start of control.
In step S144, the determination unit 49 reads the coordinates of the positions Pa(kTs) and Pb(kTs) on the ideal trajectories from the information stored in the storage unit in the trajectory determination process shown in FIG. 12.
In step S145, the determination unit 49 calculates the deviations (distances) between the estimated positions PVa, PVb and the positions Pa(kTs), Pb(kTs) on the ideal trajectories. Each deviation between the estimated positions PVa, PVb and the positions Pa(kTs), Pb(kTs) on the ideal trajectories is one of the feature quantities indicating the degree of divergence between the actual trajectory and the ideal trajectory.
In step S146, the determination unit 49 determines whether the deviations are equal to or smaller than the threshold Th1.
If NO in step S146, the determination unit 49 determines in step S147 that an abnormality is present. If YES in step S146, the determination unit 49 determines in step S148 that no abnormality is present. After step S147 or step S148, the abnormality determination process ends.
According to Modification 2, the presence or absence of an abnormality is determined based on the degree of divergence between the actual trajectory and the ideal trajectory, taking into account the elapsed time t from the start of control by the movement control unit 41.
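The flow of FIG. 14 can be sketched as follows: the estimated position output at the k-th control period is converted to camera coordinates and compared with the ideal-trajectory entry stored for the same elapsed time kTs. The affine calibration transform and the trajectory layout follow the earlier sketches and are assumptions about the actual implementation.

```python
import numpy as np

def check_abnormality_time_indexed(pv_machine, k, ideal_trajectory, A, b, th1):
    """Modification 2 (FIG. 14): compare the k-th estimated position with the
    ideal position stored for elapsed time k*Ts.
    pv_machine       -- estimated position PV in machine coordinates
    ideal_trajectory -- list of ideal positions indexed by control period (camera coords)
    A, b             -- assumed affine calibration parameters (machine -> camera)
    Returns True when the deviation exceeds Th1 (abnormality present)."""
    pv_camera = A @ np.asarray(pv_machine, dtype=float) + b   # step S142
    ideal_k = np.asarray(ideal_trajectory[k], dtype=float)    # step S144: P(k*Ts)
    deviation = np.linalg.norm(pv_camera - ideal_k)           # step S145
    return bool(deviation > th1)                              # steps S146-S148

# Hypothetical usage for mark 5a at the third control period (identity calibration).
A = np.eye(2)
b = np.zeros(2)
ideal_a = [(10.0, 4.0), (7.0, 3.0), (5.0, 2.0), (3.0, 1.2)]
print(check_abnormality_time_indexed((3.2, 1.3), k=3, ideal_trajectory=ideal_a, A=A, b=b, th1=0.5))
```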
<4-3. Modification 3>
The monitoring unit 47 may execute the monitoring process shown in FIG. 15 instead of the monitoring process shown in FIG. 11. FIG. 15 is a flowchart showing the flow of the monitoring process according to Modification 3. The monitoring process according to Modification 3 differs from the monitoring process shown in FIG. 11 in that step S121 is executed instead of step S22 and that steps S122 and S123 are executed after step S24. Steps S121, S122, and S123 are described below.
In step S121, the determination unit 49 calculates, for each of the marks 5a and 5b, a feature quantity indicating the degree of divergence between the actual trajectory and the ideal trajectory and, based on the feature quantity, determines the presence or absence of an abnormality and whether the degree of abnormality is high or low. The details of step S121 are described later.
In step S122, the degree of abnormality determined in step S121 is checked. If the degree of abnormality is high (YES in step S122), the determination unit 49 outputs an operation stop instruction to the movement control unit 41 in step S123. The movement control unit 41 thereby stops the movement control of the moving mechanism 10.
If the degree of abnormality is low (NO in step S122), the monitoring process proceeds to step S25.
FIG. 16 is a flowchart showing the contents of the subroutine of the process (step S121) for determining the presence or absence of an abnormality and the degree of abnormality shown in FIG. 15. The process shown in FIG. 16 differs from the abnormality determination process shown in FIG. 14 in that steps S149 to S151 are further executed. Steps S149 to S151 are described below.
After step S147, the determination unit 49 determines in step S149 whether the deviations between the measured positions PMa, PMb and the positions Pa(kTs), Pb(kTs) on the ideal trajectories are equal to or smaller than a threshold Th2. The threshold Th2 is larger than the threshold Th1.
If NO in step S149, the determination unit 49 determines in step S150 that the degree of abnormality is high. If YES in step S149, the determination unit 49 determines in step S151 that the degree of abnormality is low. After step S150 or step S151, the process for determining the presence or absence of an abnormality and the degree of abnormality ends.
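A compact sketch of the grading in FIG. 16: a first threshold Th1 separates normal from abnormal, and a second, larger threshold Th2 separates a low from a high degree of abnormality, the latter leading to an operation stop instruction. Names and return values are illustrative.

```python
def grade_abnormality(deviation, th1, th2):
    """Steps S146-S151 of FIG. 16 (with Th2 > Th1).
    Returns 'normal', 'abnormal-low', or 'abnormal-high'."""
    if deviation <= th1:
        return "normal"            # step S148
    if deviation <= th2:
        return "abnormal-low"      # steps S147 and S151
    return "abnormal-high"         # steps S147 and S150

# In Modification 3, a high degree of abnormality triggers a stop of the
# moving mechanism (step S123); the print below only illustrates the branching.
for dev in (0.3, 0.8, 2.5):
    grade = grade_abnormality(dev, th1=0.5, th2=2.0)
    action = "stop movement" if grade == "abnormal-high" else "continue"
    print(dev, grade, action)
```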
<4-4. Modification 4>
In the above description, the monitoring process is executed in parallel with the positioning process. However, the monitoring process may instead be executed after the positioning process has ended. Because the positioning process has ended, the actual trajectory has been specified for each of the marks 5a and 5b. The actual trajectories of the marks 5a and 5b are specified by connecting, in order, the estimated positions PVa and PVb output from the position determination unit 43 during the positioning process. Alternatively, the actual trajectories of the marks 5a and 5b may be specified by connecting, in order, the measured positions PMa and PMb output from the visual sensor 30 during the positioning process.
The determination unit 49 then calculates, for each of the marks 5a and 5b, a feature quantity indicating the degree of divergence between the actual trajectory and the ideal trajectory and determines, based on the calculated feature quantity, the presence or absence of an abnormality and the degree of abnormality. As the feature quantity, for example, the area of the region enclosed by the actual trajectory and the ideal trajectory (the integral of the deviation between the actual trajectory and the ideal trajectory) can be used. Alternatively, the maximum value of the deviation (distance) between a point on the actual trajectory and a point on the ideal trajectory at the same elapsed time from the start of control by the movement control unit 41 may be used as the feature quantity.
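Assuming the actual and ideal trajectories are available as position sequences sampled at the same control periods, the two feature quantities mentioned above can be computed as in the following sketch: the integral of the deviation over time (approximating the enclosed area) and the maximum deviation at equal elapsed times.

```python
import numpy as np

def divergence_features(actual, ideal, ts):
    """Post-positioning feature quantities (Modification 4).
    actual, ideal -- equal-length sequences of 2-D positions sampled every Ts.
    Returns (integral of the deviation over time, maximum deviation)."""
    actual = np.asarray(actual, dtype=float)
    ideal = np.asarray(ideal, dtype=float)
    deviations = np.linalg.norm(actual - ideal, axis=1)   # deviation at each control period
    integral = float(np.sum(deviations) * ts)             # approximates the enclosed area
    maximum = float(np.max(deviations))
    return integral, maximum

# Example with short hypothetical trajectories.
ideal = [(10, 4), (7, 3), (5, 2), (3, 1)]
actual = [(10, 4), (7.4, 3.1), (5.6, 2.3), (3.2, 1.1)]
print(divergence_features(actual, ideal, ts=0.001))
```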
<4-5. Modification 5>
FIG. 17 is a diagram showing the functional configuration of the controller according to Modification 5. As shown in FIG. 17, a positioning system 1A according to Modification 5 differs from the positioning system 1 shown in FIG. 5 in that it includes a controller 40A instead of the controller 40. The controller 40A differs from the controller 40 shown in FIG. 5 in that it includes a monitoring unit 47A instead of the monitoring unit 47 and does not include the model storage unit 46. The monitoring unit 47A differs from the monitoring unit 47 in that it includes a trajectory determination unit 48A instead of the trajectory determination unit 48.
The trajectory determination unit 48A includes a learner 148 on which machine learning has been performed so as to output a function representing an ideal trajectory in response to an input initial position, and determines the ideal trajectories of the marks 5a and 5b using the learner 148.
The learner 148 is constructed by performing machine learning using, as learning data, the coordinates of the measured positions specified in each of a plurality of positioning processes executed while the positioning system was normal. A period in which the positioning system is normal is, for example, the start-up period of the positioning system, when the deviation of the calibration parameters is small.
Specifically, the learner 148 learns the correspondence between the measured position PMa specified in the first position specification process of each positioning process and a function Pa(t), which is information representing the trajectory obtained by connecting, in order, the measured positions PMa specified every imaging period Tb in that positioning process. The function Pa(t) associates the elapsed time t from the start of control by the movement control unit 41 with the coordinates (camera coordinate system) of the position of the mark 5a on the image at elapsed time t.
Similarly, the learner 148 learns the correspondence between the measured position PMb specified in the first position specification process of each positioning process and a function Pb(t), which is information representing the trajectory obtained by connecting, in order, the measured positions PMb specified every imaging period Tb in that positioning process. The function Pb(t) associates the elapsed time t from the start of control by the movement control unit 41 with the XY coordinates (camera coordinate system) of the position of the mark 5b on the image at elapsed time t.
The trajectory determination unit 48A inputs the initial position Pa(0) to the learner 148 and determines the trajectory represented by the function output from the learner 148 as the ideal trajectory of the mark 5a on the image. By inputting the initial position Pb(0) to the learner 148, the trajectory determination unit 48A determines the trajectory represented by the function output from the learner 148 as the ideal trajectory of the mark 5b on the image.
According to Modification 5, the trajectory determination unit 48A can determine an ideal trajectory that reflects the actual behavior of the system.
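The patent does not specify the learning algorithm used in the learner 148. As a rough, assumed stand-in, the sketch below "learns" by storing trajectories recorded while the system was normal, keyed by their initial positions, and answers a query by returning the stored trajectory whose initial position is closest to the queried one; a real implementation could equally use a regression model.

```python
import numpy as np

class TrajectoryLearner:
    """Illustrative stand-in for the learner 148: nearest-initial-position
    lookup over trajectories recorded during normal operation."""

    def __init__(self):
        self._initials = []      # initial positions P(0) from normal runs
        self._trajectories = []  # corresponding position sequences (camera coords)

    def fit(self, normal_runs):
        """normal_runs: iterable of position sequences, one per positioning
        process executed while the system was normal (the learning data)."""
        for run in normal_runs:
            run = [np.asarray(p, dtype=float) for p in run]
            self._initials.append(run[0])
            self._trajectories.append(run)

    def predict(self, initial_position):
        """Return the stored trajectory whose initial position is closest to
        the queried initial position -- used as the ideal trajectory."""
        q = np.asarray(initial_position, dtype=float)
        distances = [np.linalg.norm(q - p0) for p0 in self._initials]
        return self._trajectories[int(np.argmin(distances))]

# Hypothetical usage: two recorded normal runs, then a query for a new initial position.
learner = TrajectoryLearner()
learner.fit([[(10, 4), (6, 2.5), (3, 1), (0, 0)],
             [(12, 5), (7, 3), (3.5, 1.2), (0, 0)]])
print(learner.predict((11.5, 4.8))[0])   # closest recorded run starts at (12, 5)
```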
<4-6. Modification 6>
FIG. 18 is a diagram showing the configuration of a positioning system according to Modification 6. As shown in FIG. 18, the visual sensor 30 of the positioning system according to Modification 6 images the target workpiece W together with a reference workpiece W0 fixed at a fixed position. The reference workpiece W0 is made of a translucent material. The visual sensor 30 can therefore simultaneously image the marks 5a and 5b, which are characteristic portions of the target workpiece W, and the marks 6a and 6b, which are characteristic portions of the reference workpiece W0.
The visual sensor 30 also specifies the positions of the marks 6a and 6b of the reference workpiece W0. The controller 40 sets the position of the mark 6a specified by the visual sensor 30 as the target position SPa and sets the position of the mark 6b specified by the visual sensor 30 as the target position SPb.
<4-7. Modification 7>
The determination unit 49 may cause the display device 50 to display, for each positioning process, a screen showing the ideal trajectory and the actual trajectory.
FIG. 19 is an example of a screen showing the ideal trajectory and the actual trajectory. In FIG. 19, the solid line 70 indicates the ideal trajectory of the target mark (the mark 5a or the mark 5b), and the solid line 72 indicates the actual trajectory of the target mark. By checking a screen such as that shown in FIG. 19, the operator can grasp the degree of deviation of the calibration parameters.
Furthermore, the determination unit 49 may decompose the deviation between a position on the actual trajectory and a position on the ideal trajectory at the same elapsed time from the start of control by the movement control unit 41 into X-direction and Y-direction components and cause the display device 50 to display a screen showing the temporal change of each component. The X direction is the movement direction of the X stage 11, and the Y direction is the movement direction of the Y stage 13.
FIG. 20 is a diagram showing an example of a screen showing the temporal change of the X-direction and Y-direction components of the deviation between the position on the actual trajectory and the position on the ideal trajectory. By checking a screen such as that shown in FIG. 20, the operator can infer which part of the moving mechanism 10 is abnormal. That is, if the Y-direction deviation between the ideal trajectory and the actual trajectory is larger than the X-direction deviation, the operator can infer that an abnormality has occurred in the Y stage 13. As a result, the time required for maintenance can be shortened.
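Assuming, for this sketch only, that the X and Y stage axes coincide with the image axes, the per-axis components of the deviation described here can be computed as follows, together with a simple indication of which axis dominates.

```python
import numpy as np

def axis_deviation_components(actual, ideal):
    """Deviation between positions at the same elapsed time, split into
    X-direction and Y-direction components (Modification 7)."""
    diff = np.asarray(actual, dtype=float) - np.asarray(ideal, dtype=float)
    return np.abs(diff[:, 0]), np.abs(diff[:, 1])

ideal = [(10, 4), (7, 3), (5, 2), (3, 1)]
actual = [(10, 4), (7.1, 3.6), (5.1, 2.9), (3.1, 1.8)]
dx, dy = axis_deviation_components(actual, ideal)
# A consistently larger Y component points at the Y stage (13) as the suspect.
print("suspect axis:", "Y stage" if dy.mean() > dx.mean() else "X stage")
```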
Furthermore, the determination unit 49 may cause the display device 50 to display a screen showing the transition of the feature quantity, calculated for each positioning process, that indicates the degree of divergence between the ideal trajectory and the actual trajectory. This allows the operator to grasp conditions such as the aging of the moving mechanism 10.
The determination unit 49 calculates, for example, the area of the region enclosed by the ideal trajectory and the actual trajectory (that is, the integral of the deviation between the ideal trajectory and the actual trajectory; the hatched area in FIG. 19) as the feature quantity indicating the degree of divergence between the ideal trajectory and the actual trajectory. The determination unit 49 calculates this feature quantity for each positioning process and causes the display device 50 to display a screen showing the transition of the feature quantity.
FIG. 21 is a diagram showing an example of a screen showing the temporal change of the feature quantity (the integral of the deviation) indicating the degree of divergence between the ideal trajectory and the actual trajectory. As shown in FIG. 21, the feature quantity indicating the degree of divergence between the ideal trajectory and the actual trajectory gradually increases as the moving mechanism 10 deteriorates over time. By checking the screen shown in FIG. 21, the operator can grasp the difference between the current value of the feature quantity and the threshold Th1. As a result, the operator can predict when maintenance such as calibration will be required. The screen example shown in FIG. 21 includes a graph image showing the transition of the feature quantity; however, the determination unit 49 may instead cause the display device 50 to display a screen in which the values of the feature quantity are arranged in time series.
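A small sketch of tracking this feature quantity across positioning processes: each run appends its value, the margin to the threshold Th1 is reported, and a naive linear extrapolation (an illustrative addition not described in the source) estimates how many further runs remain before the threshold would be crossed.

```python
import numpy as np

def runs_until_threshold(feature_history, th1):
    """Fit a line to the per-run feature values and estimate how many further
    runs remain before the trend crosses Th1. The linear extrapolation is an
    illustrative assumption, not part of the described system."""
    y = np.asarray(feature_history, dtype=float)
    x = np.arange(len(y), dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    if slope <= 0:
        return None                      # no upward trend: no estimate
    crossing = (th1 - intercept) / slope
    return max(0.0, crossing - (len(y) - 1))

history = [0.10, 0.12, 0.15, 0.17, 0.21]   # hypothetical per-run integral values
print("margin to Th1:", 0.5 - history[-1])
print("estimated runs until Th1:", runs_until_threshold(history, th1=0.5))
```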
<4-8. Other modifications>
In the above description, the trajectory determination units 48 and 48A determine the ideal trajectories of the marks 5a and 5b individually. However, the trajectory determination units 48 and 48A may instead determine the ideal trajectory of the midpoint between the marks 5a and 5b. In this case, the determination unit 49 may determine whether an abnormality has occurred in the positioning system 1 based on the result of comparing the actual trajectory of the midpoint between the measured positions PMa and PMb specified by the visual sensor 30 with that ideal trajectory.
In the above description, the target workpiece W is positioned using the marks 5a and 5b provided on the target workpiece W as its characteristic portions. However, the target workpiece W may be positioned using other portions of the target workpiece W as characteristic portions. For example, a screw or a screw hole provided in the target workpiece W may be used as a characteristic portion, or a corner of the target workpiece W may be used as a characteristic portion.
In the above description, the controller 40 includes the movement control unit 41 and the monitoring unit 47. However, the monitoring unit 47 may be provided in an information processing device separate from the controller 40.
§5 Appendix
As described above, the present embodiment and the modifications include the following disclosures.
(Configuration 1)
A positioning system (1, 1A) that performs a positioning process on an object (W), the positioning system including:
a moving mechanism (10) for moving the object (W);
a visual sensor (30) for imaging the object (W) and executing, at every imaging period, a position specification process that specifies a position of the object (W) based on the captured image;
a movement control unit (41) for controlling the moving mechanism (10) so that the position specified by the visual sensor (30) approaches a target position;
a trajectory determination unit (48, 48A) for determining an ideal trajectory of the object (W) from the position of the object (W) specified by the first position specification process to the target position; and
a determination unit (49) for determining the presence or absence of an abnormality in the positioning system (1, 1A) based on a result of comparing an actual trajectory of the position of the object (W), specified by the position specification processes after the first one, with the ideal trajectory.
(Configuration 2)
The positioning system (1, 1A) according to Configuration 1, wherein the visual sensor (30) executes the first position specification process while the object (W) is stopped and executes the subsequent position specification processes while the object (W) is moving.
(Configuration 3)
The positioning system (1, 1A) according to Configuration 1 or 2, wherein the determination unit (49) determines that the abnormality has occurred and issues a warning when a feature quantity indicating a degree of divergence between the actual trajectory and the ideal trajectory exceeds a first threshold.
(Configuration 4)
The positioning system (1, 1A) according to Configuration 3, wherein the determination unit (49) stops the moving mechanism (10) when the feature quantity exceeds a second threshold, and the second threshold is larger than the first threshold.
(Configuration 5)
The positioning system (1, 1A) according to Configuration 3 or 4, wherein the feature quantity is a deviation between the position of the object (W) specified by the subsequent position specification process and the ideal trajectory.
(Configuration 6)
The positioning system (1, 1A) according to Configuration 3 or 4, wherein the actual trajectory and the ideal trajectory are represented by information that associates an elapsed time from the start of control of the moving mechanism (10) by the movement control unit (41) with the position of the object (W), and the feature quantity is a maximum value or an integral value of the deviation between a position on the actual trajectory and a position on the ideal trajectory at the same elapsed time.
(Configuration 7)
The positioning system (1) according to any one of Configurations 1 to 6, wherein the trajectory determination unit (48) determines the ideal trajectory by performing a simulation using a simulation model of the moving mechanism (10) and the movement control unit (41) together with the position of the object (W) specified by the first position specification process.
(Configuration 8)
The positioning system (1A) according to any one of Configurations 1 to 6, wherein the trajectory determination unit (48A) further includes a trained learner (148) constructed, by machine learning using learning data, to output information indicating a trajectory of the object (W) from an input position of the object (W) to the target position, the learning data is data indicating positions of the object (W) specified by the visual sensor (30) in the positioning processes executed while the positioning system was normal, and the trajectory determination unit (48A) determines, as the ideal trajectory, the trajectory indicated by the information output from the learner (148) when the position of the object (W) specified by the first position specification process is input to the learner (148).
(Configuration 9)
The positioning system (1, 1A) according to any one of Configurations 1 to 8, wherein the determination unit (49) outputs a screen showing the actual trajectory and the ideal trajectory to a display device (50).
(Configuration 10)
The positioning system according to Configuration 1 or 2, wherein the actual trajectory and the ideal trajectory are represented by information that associates an elapsed time from the start of control of the moving mechanism by the movement control unit with the position of the object, the moving mechanism (10) includes a plurality of translation mechanisms (11, 13) that translate in mutually different movement directions, and the determination unit (49) decomposes the deviation between a position on the actual trajectory and a position on the ideal trajectory at the same elapsed time into components in the movement directions corresponding to the respective translation mechanisms (11, 13) and outputs a screen showing the temporal change of each component to a display device (50).
(Configuration 11)
The positioning system (1, 1A) according to any one of Configurations 3 to 6, wherein the determination unit (49) outputs a screen showing the transition of the feature quantity to a display device (50).
(Configuration 12)
A monitoring device (40, 40A) used in the positioning system (1, 1A) according to any one of Configurations 1 to 11, the monitoring device including the trajectory determination unit and the determination unit (49).
(Configuration 13)
A method of monitoring a positioning system (1, 1A) that performs a positioning process on an object (W), the positioning system (1, 1A) including: a moving mechanism (10) for moving the object (W); a visual sensor (30) for imaging the object (W) and executing, at every imaging period, a position specification process that specifies a position of the object (W) based on the captured image; and a movement control unit (41) for controlling the moving mechanism (10) so that the position specified by the visual sensor (30) approaches a target position, the method including: determining an ideal trajectory of the object (W) from the position of the object (W) specified by the first position specification process to the target position; and determining the presence or absence of an abnormality in the positioning system based on a result of comparing an actual trajectory of the position of the object (W), specified by the position specification processes after the first one, with the ideal trajectory.
(Configuration 14)
A program for causing a computer to execute a method of monitoring a positioning system (1, 1A) that performs a positioning process on an object (W), the positioning system (1, 1A) including: a moving mechanism (10) for moving the object (W); a visual sensor (30) for imaging the object (W) and executing, at every imaging period, a position specification process that specifies a position of the object (W) based on the captured image; and a movement control unit (41) for controlling the moving mechanism (10) so that the position specified by the visual sensor (30) approaches a target position, the monitoring method including: determining an ideal trajectory of the object (W) from the position of the object (W) specified by the first position specification process to the target position; and determining the presence or absence of an abnormality in the positioning system (1, 1A) based on a result of comparing an actual trajectory of the position of the object (W), specified by the position specification processes after the first one, with the ideal trajectory.
The embodiments disclosed herein are to be considered in all respects as illustrative and not restrictive. The scope of the present invention is defined by the claims rather than by the above description and is intended to include all modifications within the meaning and scope equivalent to the claims. The inventions described in the embodiment and the modifications are intended to be practiced either alone or in combination wherever possible.
1, 1A positioning system, 5a, 5b, 6a, 6b mark, 10 moving mechanism, 11 X stage, 12, 14, 16 servo motor, 12E, 14E, 16E encoder, 13 Y stage, 15 θ stage, 20 driver unit, 22, 24, 26 servo driver, 30 visual sensor, 31 imaging unit, 32 image processing device, 40, 40A controller, 41 movement control unit, 42 coordinate conversion unit, 43 position determination unit, 44 subtraction unit, 45 operation unit, 46 model storage unit, 47, 47A monitoring unit, 48, 48A trajectory determination unit, 49 determination unit, 50 display device, 51, 61, 424 recording medium, 60 display unit, 71 allowable region, 148 learner, 310, 414 processor, 312 RAM, 314, 426 display controller, 316 system controller, 318 I/O controller, 320 hard disk, 322 camera interface, 324 input interface, 326 controller interface, 328, 417, 428 communication interface, 330, 422 memory card interface, 334 keyboard, 412 chipset, 416 nonvolatile memory, 418 main memory, 420 system clock, 430 internal bus controller, 432 DMA control circuit, 434 internal bus control circuit, 436 buffer memory, 438 fieldbus controller, 440 control program, W target workpiece, W0 reference workpiece.

Claims (14)

1. A positioning system that performs a positioning process on an object, the positioning system comprising:
a moving mechanism for moving the object;
a visual sensor for imaging the object and executing, at every imaging period, a position specification process that specifies a position of the object based on the captured image;
a movement control unit for controlling the moving mechanism so that the position specified by the visual sensor approaches a target position;
a trajectory determination unit for determining an ideal trajectory of the object from the position of the object specified by the first position specification process to the target position; and
a determination unit for determining the presence or absence of an abnormality in the positioning system based on a result of comparing an actual trajectory of the position of the object, specified by the position specification processes after the first one, with the ideal trajectory.
2. The positioning system according to claim 1, wherein the visual sensor executes the first position specification process while the object is stopped and executes the subsequent position specification processes while the object is moving.
3. The positioning system according to claim 1 or 2, wherein the determination unit determines that the abnormality has occurred and issues a warning when a feature quantity indicating a degree of divergence between the actual trajectory and the ideal trajectory exceeds a first threshold.
4. The positioning system according to claim 3, wherein the determination unit stops the moving mechanism when the feature quantity exceeds a second threshold, and the second threshold is larger than the first threshold.
5. The positioning system according to claim 3 or 4, wherein the feature quantity is a deviation between the position of the object specified by the subsequent position specification process and the ideal trajectory.
6. The positioning system according to claim 3 or 4, wherein the actual trajectory and the ideal trajectory are represented by information that associates an elapsed time from the start of control of the moving mechanism by the movement control unit with the position of the object, and the feature quantity is a maximum value or an integral value of the deviation between a position on the actual trajectory and a position on the ideal trajectory at the same elapsed time.
7. The positioning system according to any one of claims 1 to 6, wherein the trajectory determination unit determines the ideal trajectory by performing a simulation using a simulation model of the moving mechanism and the movement control unit together with the position of the object specified by the first position specification process.
8. The positioning system according to any one of claims 1 to 6, wherein the trajectory determination unit further comprises a trained learner constructed, by machine learning using learning data, to output information indicating a trajectory of the object from an input position of the object to the target position, the learning data is data indicating positions of the object specified by the visual sensor in positioning processes executed while the positioning system was normal, and the trajectory determination unit determines, as the ideal trajectory, the trajectory indicated by the information output from the learner when the position of the object specified by the first position specification process is input to the learner.
9. The positioning system according to any one of claims 1 to 8, wherein the determination unit outputs a screen showing the actual trajectory and the ideal trajectory to a display device.
10. The positioning system according to claim 1 or 2, wherein the actual trajectory and the ideal trajectory are represented by information that associates an elapsed time from the start of control of the moving mechanism by the movement control unit with the position of the object, the moving mechanism includes a plurality of translation mechanisms that translate in mutually different movement directions, and the determination unit decomposes the deviation between a position on the actual trajectory and a position on the ideal trajectory at the same elapsed time into components in the movement directions corresponding to the respective translation mechanisms and outputs a screen showing the temporal change of each component to a display device.
11. The positioning system according to any one of claims 3 to 6, wherein the determination unit outputs a screen showing the transition of the feature quantity to a display device.
12. A monitoring device used in the positioning system according to any one of claims 1 to 11, the monitoring device comprising the trajectory determination unit and the determination unit.
13. A method of monitoring a positioning system that performs a positioning process on an object, the positioning system comprising: a moving mechanism for moving the object; a visual sensor for imaging the object and executing, at every imaging period, a position specification process that specifies a position of the object based on the captured image; and a movement control unit for controlling the moving mechanism so that the position specified by the visual sensor approaches a target position, the method comprising: determining an ideal trajectory of the object from the position of the object specified by the first position specification process to the target position; and determining the presence or absence of an abnormality in the positioning system based on a result of comparing an actual trajectory of the position of the object, specified by the position specification processes after the first one, with the ideal trajectory.
14. A program for causing a computer to execute a method of monitoring a positioning system that performs a positioning process on an object, the positioning system comprising: a moving mechanism for moving the object; a visual sensor for imaging the object and executing, at every imaging period, a position specification process that specifies a position of the object based on the captured image; and a movement control unit for controlling the moving mechanism so that the position specified by the visual sensor approaches a target position, the monitoring method comprising: determining an ideal trajectory of the object from the position of the object specified by the first position specification process to the target position; and determining the presence or absence of an abnormality in the positioning system based on a result of comparing an actual trajectory of the position of the object, specified by the position specification processes after the first one, with the ideal trajectory.
PCT/JP2019/022372 2018-06-21 2019-06-05 Positioning system, monitor device, monitor method, and program WO2019244638A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-117879 2018-06-21
JP2018117879A JP7143643B2 (en) 2018-06-21 2018-06-21 Positioning system, monitoring device, monitoring method and program

Publications (1)

Publication Number Publication Date
WO2019244638A1 true WO2019244638A1 (en) 2019-12-26

Family

ID=68983648

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/022372 WO2019244638A1 (en) 2018-06-21 2019-06-05 Positioning system, monitor device, monitor method, and program

Country Status (2)

Country Link
JP (1) JP7143643B2 (en)
WO (1) WO2019244638A1 (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7131578B2 (en) * 2020-03-13 2022-09-06 株式会社リコー Information processing device, information processing method and program
JP7040567B2 (en) * 2020-08-18 2022-03-23 オムロン株式会社 Control device, control method of control device, information processing program, and recording medium
US20240081034A1 (en) * 2021-01-28 2024-03-07 Panasonic Intellectual Property Management Co., Ltd. Motor control system, motor control device, motor control method, and program
WO2024080048A1 (en) * 2022-10-12 2024-04-18 パナソニックIpマネジメント株式会社 Position adjusting device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002066875A (en) * 2000-08-25 2002-03-05 Canon Inc Machining device and machining method
JP2002287827A (en) * 2001-03-28 2002-10-04 Fujitsu Ltd Moving body position control device, program for adjusting band blocking filter for moving body position control device, and program for adjusting loop gain for moving body position control device
JP2003050106A (en) * 2001-08-07 2003-02-21 Fast:Kk Calibration method, positioning method, positioning device, calibration program and positioning program


Also Published As

Publication number Publication date
JP7143643B2 (en) 2022-09-29
JP2019219998A (en) 2019-12-26

Similar Documents

Publication Publication Date Title
WO2019244638A1 (en) Positioning system, monitor device, monitor method, and program
CN108021099B (en) Machine learning device and machining time prediction device
JP7078894B2 (en) Control systems, controls, image processing devices and programs
WO2020031790A1 (en) Control system and control device
CN110581945B (en) Control system, control device, image processing device, and storage medium
CN112748699A (en) Simulation device, numerical controller, and simulation method
US11992953B2 (en) Abnormality determination device and abnormality determination method
JP6950631B2 (en) Positioning system, control methods and programs
CN111886556B (en) Control system, control method, and computer-readable storage medium
JP7110843B2 (en) Abnormality determination device and abnormality determination method
CN111338294A (en) Numerical controller, numerical control machine system, machining simulation device, and machining simulation method
CN112534236A (en) Abnormality diagnosis device and abnormality diagnosis method
JP7172151B2 (en) Control systems, controllers and programs
KR101421672B1 (en) robot vision monitoring system using trigger signal, and method thereof
JP7020262B2 (en) Control systems, control methods and programs
JP2019188550A (en) Control system, control method, and control program
CN112912803A (en) Numerical control device, learning device, and learning method
JP6922829B2 (en) Control systems, control methods, and control programs
JP2005292898A (en) Absolute positioning device by servo motor
US11820007B2 (en) Abnormality detection device and abnormality detection method
TW202301049A (en) Prediction system, information processing device, and information processing program
CN117739884A (en) Test system, and method and device for testing distance accuracy of industrial robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19822398

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19822398

Country of ref document: EP

Kind code of ref document: A1