WO2019208108A1 - Control system, control method, and control program - Google Patents

Control system, control method, and control program

Info

Publication number
WO2019208108A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving mechanism
moving
target
visual sensor
target position
Prior art date
Application number
PCT/JP2019/014127
Other languages
English (en)
Japanese (ja)
Inventor
正樹 浪江
功征 川又
Original Assignee
オムロン株式会社
Priority date
Filing date
Publication date
Application filed by オムロン株式会社 filed Critical オムロン株式会社
Priority to KR1020207026088A priority Critical patent/KR20210004958A/ko
Priority to CN201980018616.2A priority patent/CN111886556B/zh
Publication of WO2019208108A1 publication Critical patent/WO2019208108A1/fr

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00Control of position or direction
    • G05D3/12Control of position or direction using feedback

Definitions

  • This disclosure relates to a technique for positioning a workpiece based on the position of the workpiece measured by a visual sensor.
  • Patent Document 1 discloses a workpiece positioning device including a movable table, a moving mechanism for moving the movable table, and a visual sensor that repeatedly images a workpiece placed on the movable table and repeatedly detects the position of the workpiece. Each time the visual sensor detects the position, the workpiece positioning device calculates the difference between the detected position and the target position, and stops the movable table when it determines that the difference is within the allowable range.
  • After the movable table stops moving, the workpiece positioning device calculates the difference between the position detected by the visual sensor and the target position, and determines whether the calculated difference is within the allowable range. If the difference is determined to be outside the allowable range, the device determines the moving direction of the movable table that reduces the difference, and controls the moving mechanism to move the movable table in that direction.
  • In such a device, the interval at which the command value is output to the moving mechanism is shorter than the interval at which the visual sensor measures the actual position of the workpiece. Therefore, in order to drive the moving mechanism more smoothly, the command value output to the moving mechanism needs to be interpolated by some means between the time when the visual sensor measures the actual position of the workpiece and the time of the next measurement.
  • an object in one aspect is to more smoothly drive a moving mechanism that is driven based on a measurement position of a visual sensor.
  • An object in another aspect is to provide a control method that can more smoothly drive a moving mechanism that is driven based on a measurement position of a visual sensor.
  • An object in another aspect is to provide a control program that can more smoothly drive a moving mechanism that is driven based on a measurement position of a visual sensor.
  • According to an example of this disclosure, a control system includes: a moving mechanism for moving an object; a visual sensor that images the object in response to an imaging instruction and measures the actual position of the object from the image obtained by the imaging; a calculation unit for calculating a required moving distance of the moving mechanism for moving the object from the actual position to a predetermined target position; and a position determination unit that, for each predetermined control cycle shorter than the interval at which the imaging instruction is output to the visual sensor, determines a target position corresponding to the current time based on a target trajectory represented by a multi-order function having at least the required moving distance and time as explanatory variables and the target position of the moving mechanism as an objective variable.
  • According to this disclosure, the control system can interpolate the target position of the moving mechanism between the time when the visual sensor measures the actual position of the object and the time of the next measurement, so the moving mechanism can be driven more smoothly.
  • the multi-order function is a function of fifth order or higher. According to this disclosure, the target position of the moving mechanism becomes smoother because the multi-order function is defined by a function of fifth order or higher.
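As a concrete illustration, a rest-to-rest fifth-order (quintic) polynomial is one common way to realize such a trajectory. The Python sketch below is hypothetical, not taken from the patent text: it expresses the remaining distance SP(t), which starts at the required moving distance L and ends at zero, with zero velocity and acceleration at both endpoints.

```python
def target_position(L, T, t):
    """Remaining distance SP(t) along a rest-to-rest quintic trajectory.

    L: required moving distance (initial value of SP(t))
    T: total trajectory time
    t: current time

    The quintic 10*tau**3 - 15*tau**4 + 6*tau**5 rises from 0 to 1 with
    zero velocity and acceleration at both ends, so SP(t) falls smoothly
    from L to 0.
    """
    tau = min(max(t / T, 0.0), 1.0)  # normalized time, clamped to [0, 1]
    progress = 10 * tau**3 - 15 * tau**4 + 6 * tau**5
    return L * (1.0 - progress)
```

At t = 0 this returns L, at t = T it returns 0, and at the midpoint exactly half the distance remains, reflecting the symmetry of the quintic.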
  • the position determination unit generates the target trajectory so that the acceleration of the moving mechanism does not exceed a predetermined maximum acceleration.
  • control system can suppress a sudden change in the speed of the moving mechanism.
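One way to honor such a maximum-acceleration constraint is to choose the trajectory time from the closed-form peak acceleration of a rest-to-rest quintic over distance L, which is 10·L/(√3·T²). This derivation is an illustrative assumption, not a construction stated in the patent:

```python
import math

def min_trajectory_time(L, a_max):
    """Shortest trajectory time T such that the peak acceleration of a
    rest-to-rest quintic over distance L, namely 10*L/(sqrt(3)*T**2),
    does not exceed the predetermined maximum acceleration a_max."""
    return math.sqrt(10.0 * abs(L) / (math.sqrt(3.0) * a_max))
```

Generating the target trajectory with a time no shorter than this value keeps the commanded acceleration within the limit for the whole move.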
  • According to another example of this disclosure, the position determination unit generates the target trajectory each time the visual sensor measures the actual position of the object, and updates the previously generated target trajectory with the newly generated target trajectory.
  • the error of the target trajectory is corrected for each imaging cycle of the visual sensor.
  • the position determination unit generates a new target trajectory so that the speed of the moving mechanism does not change before and after the update of the target trajectory.
  • According to another example of this disclosure, the control system includes a detection unit for detecting the actual position of the moving mechanism for each control cycle, and a correction unit for correcting the required moving distance by the position deviation between the actual position detected by the detection unit at the timing of updating the target trajectory and the target position of the moving mechanism at that timing.
  • the position error of the moving mechanism is absorbed, and the speed of the moving mechanism is prevented from changing suddenly.
  • sliding of the object and residual vibration after positioning of the moving mechanism are suppressed, and as a result, the alignment time of the object is shortened.
  • According to another example of this disclosure, a method for controlling a moving mechanism for moving an object includes at least: a step of outputting an imaging instruction to a visual sensor and causing the visual sensor to measure the actual position of the object from an image obtained by imaging the object; a step of calculating a required moving distance of the moving mechanism for moving the object from the actual position to a predetermined target position; a step of determining, for each predetermined control cycle shorter than the interval at which the imaging instruction is output to the visual sensor, a target position corresponding to the current time based on a target trajectory represented by a multi-order function having at least the required moving distance and time as explanatory variables and the target position of the moving mechanism as an objective variable; and a step of moving the moving mechanism to the target position determined in the determining step.
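The steps above can be sketched as one imaging cycle of a control loop. This is hypothetical Python with placeholder sensor and servo interfaces; names such as `measure` and `move_to` are assumptions for illustration, not APIs from the patent:

```python
def run_one_imaging_cycle(visual_sensor, servo, reach_target, Tb, Ts, T):
    """Measure once, plan a quintic trajectory for the required moving
    distance, then command a target position every control cycle Ts,
    where Ts is shorter than the imaging cycle Tb."""
    PVv = visual_sensor.measure()          # actual position from the image
    L = reach_target - PVv                 # required moving distance
    n = round(Tb / Ts)                     # control cycles per imaging cycle
    for k in range(1, n + 1):
        t = k * Ts
        tau = min(t / T, 1.0)              # normalized trajectory time
        remaining = L * (1 - (10*tau**3 - 15*tau**4 + 6*tau**5))
        servo.move_to(reach_target - remaining)   # target position SP(t)
```

With Tb = 60 ms and Ts = 1 ms, sixty interpolated target positions are issued between two measurements, which is the smoothing effect the method aims at.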
  • According to this disclosure, the target position of the moving mechanism can be interpolated between the time when the visual sensor measures the actual position of the object and the time of the next measurement, so the moving mechanism can be driven more smoothly.
  • According to another example of this disclosure, a control program for a moving mechanism for moving an object causes a controller that controls the moving mechanism to execute at least: a step of outputting an imaging instruction to a visual sensor and causing the visual sensor to measure the actual position of the object from an image obtained by imaging the object; a step of calculating a required moving distance of the moving mechanism for moving the object from the actual position to a predetermined target position; a step of determining, for each predetermined control cycle shorter than the interval at which the imaging instruction is output to the visual sensor, a target position corresponding to the current time based on a target trajectory represented by a multi-order function having at least the required moving distance and time as explanatory variables and the target position of the moving mechanism as an objective variable; and a step of moving the moving mechanism to the target position determined in the determining step.
  • According to this disclosure, the target position of the moving mechanism can be interpolated between the time when the visual sensor measures the actual position of the object and the time of the next measurement, so the moving mechanism can be driven more smoothly.
  • the moving mechanism driven based on the measurement position of the visual sensor can be driven more smoothly.
  • FIG. 1 is a schematic diagram showing an outline of a control system 1 according to the present embodiment.
  • the control system 1 performs alignment using image processing.
  • Here, alignment typically means a process of placing an object (hereinafter also referred to as a “workpiece W”) at its intended position on a production line in the manufacturing process of an industrial product.
  • the control system 1 positions the glass substrate with respect to the exposure mask before the circuit pattern printing process (exposure process) on the glass substrate in the production line of the liquid crystal panel.
  • the control system 1 includes, for example, a visual sensor 50, a controller 200, a servo driver 300, and a moving mechanism 400.
  • the moving mechanism 400 includes, for example, a servo motor 410 and a stage 420.
  • the visual sensor 50 performs an imaging process of capturing an image of a subject existing in the imaging field and generating image data, and images the workpiece W placed on the stage 420.
  • the visual sensor 50 performs imaging according to the imaging trigger TR from the controller 200.
  • the visual sensor 50 images the workpiece W based on the reception of the imaging trigger TR, and measures the actual position PVv of the workpiece W by performing image analysis on the image data obtained by the imaging.
  • the actual position PVv is output to the controller 200 every time it is measured.
  • the controller 200 is a PLC (programmable logic controller), for example, and performs various FA controls.
  • the controller 200 includes a calculation unit 250, a position determination unit 252, and a movement control unit 254 as an example of a functional configuration.
  • The calculation unit 250 calculates the required moving distance L of the moving mechanism 400 for moving the workpiece W from the actual position PVv to the reaching target position SP, based on the actual position PVv of the workpiece W detected by the visual sensor 50 and the predetermined reaching target position SP. The calculated required moving distance L is output to the position determination unit 252.
  • the reaching target position SP is detected by the visual sensor 50 performing predetermined image processing.
  • the visual sensor 50 detects a predetermined mark from the image and recognizes the mark as the reaching target position SP.
  • the reaching target position SP is determined in advance for each production process.
  • The position determination unit 252 determines the target position SP(t) at the current time t based on the target trajectory TG represented by a multi-order function having at least the required moving distance L and the time t as explanatory variables and the target position SP(t) of the moving mechanism 400 as an objective variable.
  • FIG. 2 is a diagram illustrating an example of the target trajectory TG.
  • the target trajectory TG defines a target position SP (t) of the moving mechanism 400 for each control cycle Ts.
  • the initial value of the target position SP (t) is the required moving distance L, and the final value of the target position SP (t) is zero.
  • the target position SP (t) is output to the movement control unit 254 every control cycle Ts shorter than the imaging cycle Tb.
  • the imaging cycle Tb varies depending on the imaging situation or the like, and is about 60 ms, for example.
  • the control cycle Ts is fixed, for example 1 ms.
  • the movement control unit 254 generates a movement command MV for moving the movement mechanism 400 to the target position SP (t) corresponding to the current time t for each control cycle Ts, and outputs the movement command MV to the servo driver 300.
  • the movement command MV is, for example, any one of a command position, a command speed, and a command torque for the servo driver 300.
  • The servo driver 300 drives the moving mechanism 400 in accordance with the movement command MV received every control cycle Ts. More specifically, the servo driver 300 acquires the encoder value PVm detected by an encoder 412 (see FIG. 6) described later, and feedback-controls the servo motor 410 so that the speed/position of the stage 420 specified from the encoder value PVm approaches the target indicated by the movement command MV.
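The feedback behavior of the servo driver can be illustrated with a single proportional position loop. This is a deliberate simplification: a real servo driver cascades position, velocity, and current loops, and the gain below is an arbitrary illustrative value, not a parameter from the patent.

```python
def servo_velocity_command(cmd_pos, enc_pos, kp=20.0):
    """Proportional position loop: the velocity command grows with the
    deviation between the commanded position and the encoder-measured
    position, driving the stage toward the command."""
    return kp * (cmd_pos - enc_pos)

# Tiny simulation: the stage position converges toward a fixed command
# when the loop runs once per 1 ms control cycle.
pos = 0.0
for _ in range(200):
    vel = servo_velocity_command(1.0, pos)
    pos += vel * 0.001   # integrate velocity over the control cycle Ts
```

After 200 cycles the simulated position has closed most of the gap to the command, showing the negative-feedback behavior the text describes.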
  • the encoder value PVm detected by the encoder is input to the controller 200 at the same cycle as the control cycle Ts.
  • the position determination unit 252 is a multi-order function having the necessary moving distance L and time t as at least explanatory variables and the target position SP (t) of the moving mechanism 400 as an objective variable. Based on the expressed target trajectory TG, the target position SP (t) corresponding to the current time t is determined. The target position SP (t) is output to the movement control unit 254 every control cycle Ts shorter than the imaging cycle Tb.
  • As described above, the movement command output to the moving mechanism 400 can be interpolated between the time when the visual sensor 50 measures the actual position PVv of the workpiece W and the time of the next measurement, which makes it possible to drive the moving mechanism 400 more smoothly.
  • FIG. 3 is a diagram illustrating an example of a device configuration of the control system 1.
  • the control system 1 includes a visual sensor 50, a controller 200, one or more servo drivers 300 (servo drivers 300X and 300Y in the example of FIG. 3), and a moving mechanism 400.
  • the visual sensor 50 includes the image processing apparatus 100 and one or more cameras (cameras 102 and 104 in the example of FIG. 3).
  • the image processing apparatus 100 detects a feature portion 12 (for example, a screw hole) of the workpiece W based on image data obtained by the cameras 102 and 104 photographing the workpiece W.
  • the image processing apparatus 100 detects the detected position of the feature portion 12 as the actual position PVv of the workpiece W.
  • the controller 200 is connected to one or more servo drivers 300 (servo drivers 300X and 300Y in the example of FIG. 3).
  • the servo driver 300X drives the servo motor 410X to be controlled in accordance with the movement command in the X direction received from the controller 200.
  • the servo driver 300Y drives the servo motor 410Y to be controlled in accordance with the movement command in the Y direction received from the controller 200.
  • the controller 200 gives a target position in the X direction as a command value to the servo driver 300X according to the target trajectory TGx generated in the X direction. Further, the controller 200 gives a target position in the Y direction as a command value to the servo driver 300Y according to the target trajectory TGy generated in the Y direction. By sequentially updating the respective target positions in the X and Y directions, the workpiece W is moved to the final target position SP.
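The per-axis decomposition that feeds the trajectories TGx and TGy can be sketched simply (hypothetical Python; positions are treated here as (x, y) tuples for illustration):

```python
def decompose_required_distance(PVv, SP):
    """Split the planar move from the measured position PVv to the
    reaching target position SP into per-axis required moving distances
    Lx and Ly; each axis then follows its own target trajectory
    (TGx for X, TGy for Y)."""
    Lx = SP[0] - PVv[0]
    Ly = SP[1] - PVv[1]
    return Lx, Ly
```

Each axis is then commanded independently, and the workpiece reaches the final target position SP as both per-axis target positions are sequentially updated.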
  • Controller 200 and servo driver 300 are connected in a daisy chain via a field network.
  • As the field network, for example, EtherCAT (registered trademark) is adopted.
  • the field network is not limited to EtherCAT, and any communication means can be adopted.
  • the controller 200 and the servo driver 300 may be directly connected by a signal line. Further, the controller 200 and the servo driver 300 may be integrally configured.
  • the moving mechanism 400 includes base plates 4 and 7, ball screws 6 and 9, a stage 420, and one or more servo motors 410 (servo motors 410X and 410Y in the example of FIG. 3).
  • the base plate 4 is provided with a ball screw 6 that moves the stage 420 along the X direction.
  • the ball screw 6 is engaged with a nut included in the stage 420.
  • When the servo motor 410X connected to one end of the ball screw 6 is rotationally driven, the nut included in the stage 420 and the ball screw 6 rotate relative to each other, and as a result the stage 420 moves along the X direction.
  • the base plate 7 is provided with a ball screw 9 for moving the stage 420 and the base plate 4 along the Y direction.
  • the ball screw 9 is engaged with a nut included in the base plate 4.
  • When the servo motor 410Y connected to one end of the ball screw 9 is rotationally driven, the nut included in the base plate 4 and the ball screw 9 rotate relative to each other. As a result, the stage 420 and the base plate 4 move along the Y direction.
  • Although FIG. 3 shows the moving mechanism 400 driven biaxially by the servo motors 410X and 410Y, the moving mechanism 400 may further incorporate a servo motor that drives the stage 420 in the rotational direction (θ direction) on the XY plane.
  • FIG. 4 is a schematic diagram illustrating an example of a hardware configuration of the image processing apparatus 100 configuring the visual sensor 50.
  • The image processing apparatus 100 typically has a structure according to a general-purpose computer architecture, and realizes the various kinds of image processing described later by having its processor execute preinstalled programs.
  • The image processing apparatus 100 includes a processor 110 such as a CPU (Central Processing Unit) or an MPU (Micro-Processing Unit), a RAM (Random Access Memory) 112, a display controller 114, a system controller 116, an I/O (Input/Output) controller 118, a hard disk 120, a camera interface 122, an input interface 124, a controller interface 126, a communication interface 128, and a memory card interface 130. These units are connected to each other so that data communication is possible, with the system controller 116 at the center.
  • the processor 110 exchanges programs (codes) and the like with the system controller 116 and executes them in a predetermined order, thereby realizing the target arithmetic processing.
  • The system controller 116 is connected to the processor 110, the RAM 112, the display controller 114, and the I/O controller 118 via buses, exchanges data with each unit, and manages the processing of the entire image processing apparatus 100.
  • The RAM 112 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and stores programs read from the hard disk 120, camera images (image data) acquired by the cameras 102 and 104, processing results for the camera images, and work data.
  • the display controller 114 is connected to the display unit 132, and outputs signals for displaying various types of information to the display unit 132 in accordance with internal commands from the system controller 116.
  • the I / O controller 118 controls data exchange with a recording medium or an external device connected to the image processing apparatus 100. More specifically, the I / O controller 118 is connected to the hard disk 120, the camera interface 122, the input interface 124, the controller interface 126, the communication interface 128, and the memory card interface 130.
  • the hard disk 120 is typically a nonvolatile magnetic storage device, and stores various setting values in addition to the control program 150 executed by the processor 110.
  • the control program 150 installed in the hard disk 120 is distributed while being stored in the memory card 136 or the like.
  • In place of the hard disk 120, a semiconductor storage device such as a flash memory, or an optical storage device such as a DVD-RAM (Digital Versatile Disk Random Access Memory), may be employed.
  • the camera interface 122 corresponds to an input unit that receives image data generated by photographing a workpiece, and mediates data transmission between the processor 110 and the cameras 102 and 104.
  • The camera interface 122 includes image buffers 122a and 122b for temporarily storing image data from the cameras 102 and 104, respectively. Although a single image buffer shared among the cameras may be provided, it is preferable to arrange a plurality of buffers independently, each associated with one camera.
  • the input interface 124 mediates data transmission between the processor 110 and input devices such as a keyboard 134, a mouse, a touch panel, and a dedicated console.
  • the controller interface 126 mediates data transmission between the processor 110 and the controller 200.
  • the communication interface 128 mediates data transmission between the processor 110 and other personal computers or server devices (not shown).
  • the communication interface 128 typically includes Ethernet (registered trademark), USB (Universal Serial Bus), or the like.
  • the memory card interface 130 mediates data transmission between the processor 110 and the memory card 136 as a recording medium.
  • The memory card 136 is distributed in a state where the control program 150 to be executed by the image processing apparatus 100 is stored, and the memory card interface 130 reads the control program from the memory card 136. The memory card 136 is a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic recording medium such as a flexible disk, or an optical recording medium such as a CD-ROM (Compact Disk Read Only Memory).
  • a program downloaded from a distribution server or the like may be installed in the image processing apparatus 100 via the communication interface 128.
  • In addition to the control program 150, an OS (Operating System) for providing the basic functions of the computer may be installed in the hard disk 120.
  • In this case, the control program according to the present embodiment may execute processing by calling necessary modules, out of the program modules provided as part of the OS, in a predetermined order and/or at a predetermined timing.
  • Furthermore, the control program according to the present embodiment may be provided incorporated as a part of another program. In that case as well, the control program itself does not include the modules of the other program with which it is combined, and the processing is executed in cooperation with that other program. That is, the control program according to the present embodiment may take a form incorporated in such another program.
  • control program may be implemented as a dedicated hardware circuit.
  • FIG. 5 is a schematic diagram illustrating a hardware configuration of the controller 200.
  • controller 200 includes a main control unit 210.
  • FIG. 5 shows servo motors 410X, 410Y, and 410θ for three axes, and servo drivers 300X, 300Y, and 300θ are provided in a number corresponding to the number of axes.
  • The main control unit 210 includes a chip set 212, a processor 214, a nonvolatile memory 216, a main memory 218, a system clock 220, a memory card interface 222, a communication interface 228, an internal bus controller 230, and a fieldbus controller 238.
  • the chip set 212 and other components are coupled via various buses.
  • the processor 214 and the chipset 212 typically have a configuration according to a general-purpose computer architecture. That is, the processor 214 interprets and executes the instruction codes sequentially supplied from the chip set 212 according to the internal clock.
  • the chip set 212 exchanges internal data with various connected components and generates instruction codes necessary for the processor 214.
  • the system clock 220 generates a system clock having a predetermined period and provides it to the processor 214.
  • the chip set 212 has a function of caching data obtained as a result of execution of arithmetic processing by the processor 214.
  • the main control unit 210 has a nonvolatile memory 216 and a main memory 218 as storage means.
  • the nonvolatile memory 216 holds the OS, system program, user program, data definition information, log information, and the like in a nonvolatile manner.
  • the main memory 218 is a volatile storage area, holds various programs to be executed by the processor 214, and is also used as a working memory when executing the various programs.
  • the main control unit 210 includes a communication interface 228, an internal bus controller 230, and a field bus controller 238 as communication means. These communication circuits transmit and receive data.
  • the communication interface 228 exchanges data with the image processing apparatus 100.
  • The internal bus controller 230 controls the exchange of data via the internal bus 226. More specifically, the internal bus controller 230 includes a buffer memory 236, a DMA (Direct Memory Access) control circuit 232, and an internal bus control circuit 234.
  • the memory card interface 222 connects the memory card 224 detachable to the main control unit 210 and the processor 214.
  • the fieldbus controller 238 is a communication interface for connecting to a field network.
  • The controller 200 is connected to the servo drivers 300 (for example, the servo drivers 300X, 300Y, and 300θ) via the fieldbus controller 238. As the field network connected to the fieldbus controller 238, for example, EtherCAT (registered trademark), EtherNet/IP (registered trademark), or CompoNet (registered trademark) is adopted.
  • the position determining unit 252 (see FIG. 1) generates a target trajectory TG for each imaging cycle Tb of the visual sensor 50. At this time, the position determining unit 252 updates the previously generated target trajectory TG with the newly generated target trajectory TG. That is, the target trajectory TG is updated every time the actual position of the workpiece W is measured by the visual sensor 50. Thereby, the error of the target trajectory TG is corrected for each imaging cycle Tb of the visual sensor 50.
  • the position determination unit 252 generates a new target trajectory TG so that the speed of the moving mechanism 400 does not change before and after the update of the target trajectory TG.
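One way to regenerate the target trajectory without a speed discontinuity is to solve for quintic coefficients whose initial position, velocity, and acceleration match the state at the update instant. The NumPy sketch below is a hypothetical construction under that assumption; the patent does not specify this particular formulation:

```python
import numpy as np

def replan_trajectory(Lm, v0, a0, T):
    """Quintic remaining-distance trajectory s(t) = c0 + c1*t + ... + c5*t**5
    with s(0) = Lm, s'(0) = v0, s''(0) = a0 (the state at the moment the
    target trajectory is updated) and s(T) = s'(T) = s''(T) = 0 (arrive at
    rest). Returns the coefficients c0..c5."""
    A = np.array([
        [1.0, 0.0, 0.0,  0.0,      0.0,       0.0],        # s(0)   = Lm
        [0.0, 1.0, 0.0,  0.0,      0.0,       0.0],        # s'(0)  = v0
        [0.0, 0.0, 2.0,  0.0,      0.0,       0.0],        # s''(0) = a0
        [1.0, T,   T**2, T**3,     T**4,      T**5],       # s(T)   = 0
        [0.0, 1.0, 2*T,  3*T**2,   4*T**3,    5*T**4],     # s'(T)  = 0
        [0.0, 0.0, 2.0,  6*T,      12*T**2,   20*T**3],    # s''(T) = 0
    ])
    return np.linalg.solve(A, np.array([Lm, v0, a0, 0.0, 0.0, 0.0]))

def eval_quintic(c, t):
    """Evaluate the quintic and its first derivative (velocity) at time t."""
    s = sum(ci * t**i for i, ci in enumerate(c))
    v = sum(i * ci * t**(i - 1) for i, ci in enumerate(c) if i > 0)
    return s, v
```

Because the new trajectory inherits the current velocity as its initial boundary condition, the commanded speed is continuous across the update, which is the property the text requires.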
  • the update process of the target trajectory TG will be described with reference to FIGS.
  • FIG. 6 is a diagram that further embodies the functional configuration of the control system 1 shown in FIG.
  • the controller 200 includes a calculation unit 250, correction units 251X and 251Y, position determination units 252X and 252Y, and movement control units 254X and 254Y.
  • the correction unit 251X, the position determination unit 252X, and the movement control unit 254X are functional configurations for the servo driver 300X that performs drive control in the X-axis direction.
  • the correction unit 251Y, the position determination unit 252Y, and the movement control unit 254Y are functional configurations for the servo driver 300Y that performs drive control in the Y-axis direction.
  • the functions of the correction units 251X and 251Y are the same, the functions of the position determination units 252X and 252Y are the same, and the functions of the movement control units 254X and 254Y are the same.
  • The calculation unit 250 calculates the required moving distance L of the moving mechanism 400 for moving the workpiece W from the actual position PVv to the reaching target position SP, based on the actual position PVv of the workpiece W detected by the visual sensor 50 and the predetermined reaching target position SP. Thereafter, the calculation unit 250 decomposes the required moving distance L into the required moving distance Lx in the X-axis direction and the required moving distance Ly in the Y-axis direction, outputs the required moving distance Lx to the correction unit 251X, and outputs the required moving distance Ly to the correction unit 251Y.
  • The correction unit 251X specifies the actual position of the moving mechanism 400 based on the encoder value PVm from the encoder 412X (detection unit), which detects the actual position of the moving mechanism 400. More specifically, the encoder 412X generates a pulse signal according to the movement amount of the servo motor 410X, and a counter receives the pulse signal and counts the number of pulses contained in it to measure the movement amount of the moving mechanism 400 as the encoder value PVm. The encoder value PVm is input to the correction unit 251X of the controller 200 every control cycle Ts, and the correction unit 251X specifies the actual position of the moving mechanism 400 in the X direction based on the encoder value PVm.
  • the correction unit 251X calculates the position deviation En (t) between the actual position of the moving mechanism 400 and the target position SP (t) as an error. Then, the correction unit 251X corrects the necessary movement distance Lx with the position deviation En (t), and outputs the corrected necessary movement distance Lm to the position determination unit 252X. Similarly to the correction unit 251X, the correction unit 251Y outputs the necessary movement distance Lm in the Y direction to the position determination unit 252Y based on the encoder value PVm from the encoder 412Y.
  • the position determination unit 252X generates the target trajectory TG from the necessary moving distance Lm based on the arrival of the imaging cycle Tb of the visual sensor 50.
  • FIG. 7 is a diagram showing the target trajectory TG1 before update and the target trajectory TG2 after update.
  • the correcting unit 251X corrects the necessary moving distance L by the position deviation En (t) between the actual position of the moving mechanism 400 detected at the update timing of the target trajectory and the target position of the moving mechanism 400 at the timing.
  • the required moving distance L is corrected to the required moving distance Lm by adding the position deviation En (t5) to the required moving distance L.
  • the position determination unit 252X generates a new target trajectory TG2 based on the necessary travel distance Lm after correction.
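The correction itself is simple arithmetic. In the sketch below (hypothetical Python; the sign convention for En, target minus measured, is an assumption consistent with the description above), the position deviation at the update instant is added to the required moving distance:

```python
def corrected_required_distance(L, target_pos_at_update, actual_pos_at_update):
    """Correct the required moving distance L by the position deviation
    En(t) between the target position SP(t) of the moving mechanism at the
    trajectory-update instant and the encoder-measured actual position PVm,
    yielding the corrected distance Lm = L + En(t)."""
    En = target_pos_at_update - actual_pos_at_update
    return L + En
```

If the stage lags its target by 0.5 units at the update, the new trajectory is planned over a distance 0.5 longer, so the lag is absorbed rather than appearing as a sudden speed change.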
  • the position error of the moving mechanism 400 is absorbed, and the speed of the moving mechanism 400 is prevented from changing suddenly.
  • sliding of the workpiece W on the moving mechanism 400 and residual vibration after positioning of the moving mechanism 400 are suppressed, and as a result, the alignment time of the workpiece W is shortened.
  • The position determination unit 252X determines the target position SP(t) corresponding to the current time t based on the updated target trajectory TG2, and outputs the target position SP(t) to the movement control unit 254X every control cycle Ts. Since the function of the movement control unit 254X is the same as that of the movement control unit 254 described with reference to FIG. 1, the description thereof will not be repeated.
  • FIG. 8 is a flowchart showing a part of the control process executed by the controller 200.
  • The processing shown in FIG. 8 is realized by the processor 214 of the controller 200 executing a program. In another aspect, some or all of the processing may be performed by circuit elements or other hardware.
  • The processing shown in FIG. 8 represents the control flow for one axial direction. In practice, the steps other than steps S130 and S150 shown in FIG. 8 are executed in parallel for each axial direction.
  • In step S110, the processor 214 initializes the measurement time t (current time) to zero.
  • In step S130, the processor 214 determines whether information indicating that the position measurement of the workpiece W has been completed has been received from the visual sensor 50.
  • When the processor 214 determines that such information has been received from the visual sensor 50 (YES in step S130), it advances control to step S131. Otherwise (NO in step S130), the processor 214 advances control to step S138.
  • In step S131, the processor 214, acting as the calculation unit 250 (see FIG. 1), calculates, based on the actual position PVv of the workpiece W detected by the visual sensor 50 and the predetermined arrival target position SP, the required moving distance L of the moving mechanism 400 for moving the workpiece W from the position PVv to the arrival target position SP.
  • In step S132, the processor 214, acting as the correction unit 251 (see FIG. 6), adds the position deviation En(t) at the measurement time t to the required moving distance L, thereby correcting it to the required moving distance Lm. Since the correction method for the required moving distance L has been described with reference to FIG. 7, the description is not repeated.
  • In step S134, the processor 214 initializes the measurement time t to zero.
  • In step S136, the processor 214 calculates the trajectory time T.
  • The trajectory time T represents the time required to move the moving mechanism 400 from the start point to the end point of the target trajectory TG.
  • The trajectory time T is calculated based on the following formula (1).
  • T = Max{f(A_max), T_min} … (1)
  • A_max in the above formula (1) represents the maximum acceleration.
  • f() is a function that obtains the trajectory time T required for the moving mechanism 400 to travel the required moving distance L at the maximum acceleration A_max.
  • T_min is a predetermined minimum trajectory time.
  • Max(α, β) is a function that returns the larger of the numerical values α and β.
  • The trajectory time T is thus determined so as not to fall below the minimum trajectory time T_min. If no minimum trajectory time T_min were provided, the moving mechanism 400 would reach the target position immediately when the required moving distance L is very short, and the time until the next imaging timing would be wasted. By providing the minimum trajectory time T_min, the moving mechanism 400 moves at an acceleration lower than the maximum acceleration when the required moving distance L is very short, so the moving mechanism 400 can move smoothly. As an example, the minimum trajectory time T_min is calculated by multiplying the average imaging interval by a certain ratio (for example, 50%).
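Formula (1) can be sketched as follows, assuming a triangular velocity profile for f() (accelerate for half the distance, decelerate for the rest); this profile is an assumption, since the publication does not define f() concretely:

```python
import math

def trajectory_time(L, a_max, t_min):
    """Formula (1): T = Max{f(A_max), T_min}.  f() here assumes a
    triangular velocity profile, so the time to cover distance L at
    maximum acceleration a_max is 2 * sqrt(L / a_max)."""
    t_profile = 2.0 * math.sqrt(abs(L) / a_max)  # f(A_max) for distance L
    return max(t_profile, t_min)                 # never below T_min
```

For a very short required distance, the second argument of max() dominates, which is exactly the smoothing effect the minimum trajectory time is introduced for.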
  • In step S138, the processor 214, acting as the position determination unit 252 (see FIG. 1), calculates a target position SP(t) corresponding to the current time t based on the corrected required moving distance Lm obtained in step S132 and the trajectory time T calculated in step S136.
  • The target position SP(t) is calculated based on the following formula (2).
  • The target trajectory TG is represented by a quintic function of time t.
  • Alternatively, the target trajectory TG may be represented by a sixth-order or higher-order function.
  • The target trajectory TG may also be represented by a spline interpolation function.
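A quintic target trajectory of the kind referred to can be sketched as follows; the specific minimum-jerk coefficients (10, −15, 6) are a common choice assumed here, since formula (2) itself is not reproduced in this text:

```python
def target_position(t, T, Lm, start=0.0):
    """Quintic target trajectory SP(t): position starts at `start`,
    reaches start + Lm at t = T, with zero velocity and zero
    acceleration at both ends (a minimum-jerk profile)."""
    s = min(max(t / T, 0.0), 1.0)         # normalized time in [0, 1]
    blend = 10*s**3 - 15*s**4 + 6*s**5    # rises smoothly from 0 to 1
    return start + Lm * blend
```

Because velocity and acceleration are zero at both endpoints, stitching a new trajectory onto the old one at an update timing does not introduce a speed discontinuity.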
  • In step S140, the processor 214, acting as the movement control unit 254 (see FIG. 1), generates a movement command MV for moving the moving mechanism 400 to the target position SP(t) obtained in step S138.
  • The movement command MV is output to the servo driver 300.
  • In step S142, the processor 214 updates the measurement time t by adding the control period Ts to it.
  • In step S150, the processor 214 determines whether to end the update process of the target trajectory TG. As an example, the processor 214 ends the process shown in FIG. 8 upon receiving a stop command for updating the target trajectory TG. When the processor 214 determines to end the update process of the target trajectory TG (YES in step S150), it ends the process shown in FIG. 8. Otherwise (NO in step S150), the processor 214 returns control to step S130.
  • The update process does not stop until the moving mechanism 400 reaches the final arrival target position SP.
  • Alternatively, the target positions SP(t) for the respective times may be calculated collectively.
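Putting the steps of FIG. 8 together, the repeated measure-correct-regenerate loop might look like the following sketch; the imaging and control cycle values, the triangular-profile trajectory time, and the quintic profile are illustrative assumptions only:

```python
import math

def quintic(t, T, Lm):
    """Quintic blend from 0 to Lm over trajectory time T (assumed profile)."""
    s = min(max(t / T, 0.0), 1.0)
    return Lm * (10*s**3 - 15*s**4 + 6*s**5)

def run_update_loop(required_distances, tb=0.030, ts=0.005,
                    t_min=0.015, a_max=4.0):
    """Each entry of required_distances plays the role of a corrected
    distance Lm reported at one imaging cycle Tb; between measurements,
    SP(t) is emitted every control cycle Ts from a freshly generated
    trajectory, as in steps S130-S142 of FIG. 8."""
    base = 0.0           # position where the current trajectory started
    outputs = []
    for Lm in required_distances:
        # trajectory time per formula (1), triangular-profile assumption
        T = max(2.0 * math.sqrt(abs(Lm) / a_max), t_min)
        n_steps = int(round(tb / ts))    # control cycles per imaging cycle
        for k in range(n_steps):         # steps S138-S142
            outputs.append(base + quintic(k * ts, T, Lm))
        base = outputs[-1]               # next trajectory starts from here
    return outputs
```

Each new measurement restarts the trajectory from the last commanded position, which is how the update scheme avoids sudden speed changes between imaging cycles.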
  • the present embodiment includes the following disclosure.
  • A control system comprising: a moving mechanism (400) for moving an object; a visual sensor (50) for imaging the object upon receiving an imaging instruction and measuring an actual position of the object from an image obtained by the imaging; a calculation unit (250) for calculating a required moving distance of the moving mechanism (400) for moving the object from the actual position to a predetermined arrival target position; a position determination unit (252) for determining, for each predetermined control cycle shorter than the interval at which the imaging instruction is output to the visual sensor, a target position corresponding to the current time based on a target trajectory represented by a multi-order function having at least the required moving distance and time as explanatory variables and the target position of the moving mechanism (400) as an objective variable; and a movement control unit for moving the moving mechanism (400) to the target position determined by the position determination unit (252).
  • The position determination unit (252) generates the target trajectory each time the visual sensor (50) measures the actual position of the object, and updates the previously generated target trajectory with the newly generated target trajectory. The control system according to any one of Configurations 1 to 3.
  • The control system further includes: a detection unit (412) for detecting an actual position of the moving mechanism (400) for each control period; and a correction unit for correcting the required moving distance with a position deviation between the actual position detected by the detection unit at the timing of updating the target trajectory and the target position of the moving mechanism at that timing. The control system according to Configuration 5.
  • A control method for a moving mechanism (400) for moving an object, comprising: outputting an imaging instruction to a visual sensor and causing the visual sensor to measure the actual position of the object from an image obtained by imaging the object; calculating a required moving distance of the moving mechanism (400) for moving the object from the actual position to a predetermined arrival target position; determining, for each predetermined control cycle shorter than the interval at which the imaging instruction is output to the visual sensor, a target position corresponding to the current time based on a target trajectory represented by a multi-order function having at least the required moving distance and time as explanatory variables and the target position of the moving mechanism (400) as an objective variable; and moving the moving mechanism (400) to the target position determined in the determining step.
  • A control program for a moving mechanism (400) for moving an object, the control program causing a controller (200) for controlling the moving mechanism (400) to execute: outputting an imaging instruction to a visual sensor and causing the visual sensor to measure the actual position of the object from an image obtained by imaging the object; calculating a required moving distance of the moving mechanism (400) for moving the object from the actual position to a predetermined arrival target position (S131); determining, for each predetermined control cycle shorter than the interval at which the imaging instruction is output to the visual sensor, a target position corresponding to the current time based on a target trajectory represented by a multi-order function having at least the required moving distance and time as explanatory variables and the target position of the moving mechanism (400) as an objective variable; and moving the moving mechanism (400) to the target position determined in the determining step (S140).
  • control system 4, 7 base plate, 6, 9 ball screw, 12 features, 50 visual sensor, 100 image processing device, 102, 104 camera, 110, 214 processor, 112 RAM, 114 display controller, 116 system controller, 118 I / O controller, 120 hard disk, 122 camera interface, 122a image buffer, 124 input interface, 126 motion controller interface, 128,228 communication interface, 130,222 memory card interface, 132 display unit, 134 keyboard, 136,224 memory card, 150 control program, 200 controller, 210 main control unit, 212 chipset 216, non-volatile memory, 218 main memory, 220 system clock, 230 internal bus controller, 232 control circuit, 234 internal bus control circuit, 236 buffer memory, 238 field bus controller, 250 calculation unit, 251, 251X, 251Y correction unit , 252, 252 X, 252 Y position determining unit, 254, 254 X, 254 Y movement control unit, 300, 300 X, 300 Y servo driver, 400 moving mechanism, 410, 410,

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position Or Direction (AREA)
  • Manipulator (AREA)

Abstract

The invention provides a technique for smoothly driving a moving mechanism (400) that is driven according to a measurement position from a visual sensor (50). The control system (10) according to the invention comprises: the moving mechanism (400), which moves a target object; a visual sensor (50), which sequentially measures an actual position of the target object from images obtained by imaging the target object in each period of a first time interval; a calculation unit (250), which calculates the moving distance required to move the target object from the actual position to an arrival target position; a position determination unit (252), which determines a target position corresponding to the current time in each period of a second time interval, shorter than the first, based on a target trajectory represented by a multi-order function having at least the required moving distance and time as explanatory variables and the target position of the moving mechanism (400) as an objective variable; and a movement control unit (254), which moves the moving mechanism (400) to the target position.
PCT/JP2019/014127 2018-04-26 2019-03-29 Système, procédé et programme de commande WO2019208108A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020207026088A KR20210004958A (ko) 2018-04-26 2019-03-29 제어 시스템, 제어 방법 및 제어 프로그램
CN201980018616.2A CN111886556B (zh) 2018-04-26 2019-03-29 控制系统、控制方法以及计算机可读存储介质

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-085121 2018-04-26
JP2018085121A JP6919622B2 (ja) 2018-04-26 2018-04-26 制御システム、制御方法、および制御プログラム

Publications (1)

Publication Number Publication Date
WO2019208108A1 true WO2019208108A1 (fr) 2019-10-31

Family

ID=68295242

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/014127 WO2019208108A1 (fr) 2018-04-26 2019-03-29 Système, procédé et programme de commande

Country Status (3)

Country Link
JP (1) JP6919622B2 (fr)
KR (1) KR20210004958A (fr)
WO (1) WO2019208108A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020179507A1 (fr) * 2019-03-01 2020-09-10 オムロン株式会社 Dispositif de commande et dispositif d'alignement
US11999068B2 (en) 2019-03-01 2024-06-04 Omron Corporation Control device and alignment device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09244725A (ja) * 1996-03-05 1997-09-19 Sony Corp 軌道補間装置及びその方法並びに制御装置
JPH1124718A (ja) * 1997-07-07 1999-01-29 Toshiba Corp ロボットの制御装置及び制御方法
JP2007257276A (ja) * 2006-03-23 2007-10-04 Toyota Motor Corp 移動経路作成方法、自律移動体および自律移動体制御システム
JP2015213139A (ja) * 2014-05-07 2015-11-26 国立大学法人 東京大学 位置決め装置
JP2017024134A (ja) * 2015-07-24 2017-02-02 ファナック株式会社 ワークを位置決めするためのワーク位置決め装置



Also Published As

Publication number Publication date
CN111886556A (zh) 2020-11-03
JP2019188549A (ja) 2019-10-31
JP6919622B2 (ja) 2021-08-18
KR20210004958A (ko) 2021-01-13

Similar Documents

Publication Publication Date Title
JP6167622B2 (ja) 制御システムおよび制御方法
CN110581946B (zh) 控制系统、控制装置、图像处理装置以及存储介质
CN110581945B (zh) 控制系统、控制装置、图像处理装置以及存储介质
WO2019208108A1 (fr) Système, procédé et programme de commande
WO2020003945A1 (fr) Système de détermination de position, procédé de commande et programme
KR102613860B1 (ko) 제어 시스템, 제어 방법 및 컴퓨터 판독 가능한 기억 매체
CN111886556B (zh) 控制系统、控制方法以及计算机可读存储介质
CN110581944B (zh) 控制系统、控制装置以及存储介质
JP7020262B2 (ja) 制御システム、制御方法およびプログラム
JP6922829B2 (ja) 制御システム、制御方法、および制御プログラム
WO2024023975A1 (fr) Système d'alignement, procédé d'alignement et programme
JP7258259B1 (ja) アライメントシステム、アライメント方法及びプログラム
JP7374353B1 (ja) アライメントシステム、アライメント方法及びプログラム
US11999068B2 (en) Control device and alignment device
US20220134570A1 (en) Control device and alignment device
JPH1034571A (ja) ロボットの位置ずれ修正装置
JPH11177296A (ja) 電子部品の実装設備のための認識補正装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19792706

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19792706

Country of ref document: EP

Kind code of ref document: A1