WO2020110712A1 - Control system, control method, and program (Système de contrôle, procédé de contrôle, et programme)

Info

Publication number
WO2020110712A1
Authority
WO
WIPO (PCT)
Application number
PCT/JP2019/044393
Other languages
English (en), Japanese (ja)
Inventor
信隆 今西
加藤 豊
Original Assignee
オムロン株式会社
Application filed by オムロン株式会社
Publication of WO2020110712A1


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 - Systems specially adapted for particular applications
    • G01N 21/88 - Investigating the presence of flaws or contamination
    • G02 - OPTICS
    • G02B - OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 - Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 - Systems for automatic generation of focusing signals
    • G02B 7/36 - Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G03 - PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B - APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 - Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/32 - Means for focusing
    • G03B 13/34 - Power focusing
    • G03B 13/36 - Autofocus systems

Definitions

  • This technology relates to inspection systems, inspection methods, and programs.
  • Patent Document 1 discloses an image processing apparatus that controls the operation of a focus adjustment mechanism in which focus position data is switched according to the type of an inspection target and the focus position of the imaging unit is set to a position corresponding to the focus position data.
  • JP 2013-108875 A; International Publication No. 2017/056557; JP 2010-78681 A; JP 10-170817 A
  • the image processing device described in Patent Document 1 can easily adjust the focus position of the imaging unit according to the type of the inspection object. However, the focus position of the imaging unit is fixed at a position corresponding to the type of inspection object. Therefore, when the distance between the inspection target and the imaging unit changes due to individual differences among inspection targets, an image focused on the inspection target cannot be obtained.
  • the present invention has been made in view of the above problem, and an object thereof is to provide an inspection system, an inspection method, and a program capable of easily adjusting the focus position according to at least one of the type of the object and the inspection target location, and of obtaining an image focused on the object.
  • an inspection system includes an optical system having a variable focus position, an image sensor that generates a captured image by receiving light from an object via the optical system, an autofocus processing unit, an inspection unit, and a setting unit.
  • the autofocus processing unit executes, based on the captured image, autofocus processing relating to a search for the in-focus position, that is, the focus position at which the object is in focus.
  • the inspection unit inspects the object based on the inspection image generated when the focus position is adjusted to the in-focus position.
  • the setting unit sets the condition data for the autofocus process according to at least one of the product type and the inspection target location.
  • the autofocus processing unit executes autofocus processing according to the condition data.
  • the autofocus processing is executed according to the condition data according to at least one of the type of the target object and the inspection target location. Therefore, the focus position can be easily adjusted according to at least one of the type of the target object and the inspection target location. Furthermore, since the focus position at which the object is focused is automatically searched for by the autofocus process, an image focused on the object can be obtained.
  • the autofocus processing unit searches for the in-focus position based on the in-focus degree in the partial area of the captured image.
  • the condition data includes data that specifies the size and position/orientation of the partial area.
  • the focusing degree is calculated from the partial area suitable for at least one of the type of the target object and the inspection target location. This makes it easier to obtain an image focused on the object.
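As an illustration of how condition data could drive the partial-area calculation described above, the following sketch computes a focus degree only inside a configurable region. The function name `roi_focus_degree`, the `(x, y, width, height)` tuple layout, and the gradient-energy measure are assumptions for illustration, not the patent's actual implementation.

```python
import numpy as np

def roi_focus_degree(image, roi):
    """Focus degree within a partial area (ROI) of a grayscale image.

    roi is an assumed (x, y, width, height) tuple taken from condition
    data; squared brightness differences stand in for the high-pass
    measure described in the text.
    """
    x, y, w, h = roi
    region = image[y:y + h, x:x + w].astype(float)
    # Horizontal and vertical brightness differences (high-frequency content)
    dx = np.diff(region, axis=1)
    dy = np.diff(region, axis=0)
    return float((dx ** 2).sum() + (dy ** 2).sum())

# A sharp-edged pattern scores higher than a featureless one.
sharp = np.zeros((32, 32)); sharp[:, 16:] = 255.0
flat = np.full((32, 32), 128.0)
roi = (8, 8, 16, 16)
assert roi_focus_degree(sharp, roi) > roi_focus_degree(flat, roi)
```

Restricting the calculation to the ROI is what lets the same autofocus machinery serve different work types: only the condition data changes.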
  • condition data includes data that specifies at least one of the focus position search range and the focus position when starting the focus position search.
  • the focus position can be searched from the search range suitable for at least one of the type of the target object and the inspection target location.
  • the search can be started from a focus position suitable for at least one of the type of the target object and the inspection target location.
  • the autofocus process includes a process of determining the quality of the in-focus position by comparing the evaluation value indicating the reliability of the in-focus position with a threshold value.
  • the condition data includes data specifying at least one of an evaluation function for calculating an evaluation value and a threshold value.
  • At least one of the evaluation function and the threshold value suitable for at least one of the type of the target object and the inspection target location is set. Thereby, the reliability of the focus position can be appropriately evaluated according to at least one of the type of the object and the inspection target location.
  • the autofocus process includes a process of determining the quality of the in-focus position by comparing the evaluation value indicating the reliability of the in-focus position with a threshold value.
  • the evaluation value is calculated based on the degree of focus in the partial area of the inspection image.
  • the condition data includes data that specifies the size and position/orientation of the partial area.
  • the evaluation value is calculated from the focus degree in the partial area suitable for at least one of the type of the target object and the inspection target location. Therefore, the reliability of the focus position can be appropriately evaluated according to at least one of the type of the object and the inspection target location.
  • an inspection system includes an optical system having a variable focus position, an image sensor that generates a captured image by receiving light from an object via the optical system, and an autofocus processing unit that executes, based on the captured image, autofocus processing that searches for the in-focus position, that is, the focus position at which the object is in focus.
  • the inspection method in the inspection system includes first to third steps.
  • the first step is a step of setting condition data for autofocus processing in accordance with the type of the target object or the inspection target location of the target object.
  • the second step is a step of causing the autofocus processing unit to execute the autofocus processing according to the condition data.
  • the third step is a step of inspecting the object based on the inspection image generated when the focus position is adjusted to the in-focus position.
  • a program causes a computer to execute the above inspection method.
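The three steps of the inspection method above can be sketched as follows. All names (`set_conditions`, `run_autofocus`, `inspect`) and the condition data are placeholders standing in for the real system, not an actual API.

```python
# Hypothetical condition data keyed by (product type, inspection location).
CONDITION_TABLE = {
    ("type_A", "top_face"): {"search_range": (0, 100), "threshold": 50.0},
    ("type_B", "side_face"): {"search_range": (20, 80), "threshold": 30.0},
}

def set_conditions(product_type, target_location):
    # First step: choose condition data by product type / inspection location.
    return CONDITION_TABLE[(product_type, target_location)]

def run_autofocus(conditions):
    # Second step: search the in-focus position within the configured range
    # (a fixed stand-in value here, not a real search).
    lo, hi = conditions["search_range"]
    return (lo + hi) // 2

def inspect(focus_position, conditions):
    # Third step: inspect the image captured at the in-focus position.
    return {"focus_position": focus_position, "result": "no scratch"}

conds = set_conditions("type_A", "top_face")
focus = run_autofocus(conds)
report = inspect(focus, conds)
assert report["focus_position"] == 50
```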
  • according to the present invention, it is possible to easily adjust the focus position according to at least one of the type of the target object and the inspection target location, and to obtain an image focused on the target object.
  • FIG. 3 is a block diagram showing an example of a hardware configuration of an image processing apparatus according to an embodiment.
  • FIG. schematically shows imaging of the work W by an imaging device. FIG. shows an image containing an image of a relatively large type of work.
  • FIG. 1 is a schematic diagram showing one application example of the inspection system according to the embodiment.
  • FIG. 2 is a diagram illustrating an example of an internal configuration of an image pickup apparatus included in the inspection system.
  • the inspection system 1 is realized as, for example, an appearance inspection system.
  • the inspection system 1 images an inspection target portion on the work W placed on the stage 90 in, for example, a production line of an industrial product, and performs an appearance inspection of the work W using the obtained image.
  • the work W is inspected for scratches, dirt, presence of foreign matter, dimensions, and the like.
  • when the inspection of the work W is completed, the next work (not shown) is transported onto the stage 90.
  • the work W may stand still at a predetermined position on the stage 90 in a predetermined posture.
  • the work W may be imaged while the work W moves on the stage 90.
  • the inspection system 1 includes an imaging device 10 and an image processing device 20 as basic components.
  • the inspection system 1 further includes a PLC (Programmable Logic Controller) 30, an input device 40, and a display device 50.
  • the imaging device 10 is connected to the image processing device 20.
  • the imaging device 10 images a subject (workpiece W) existing in the imaging field of view according to a command from the image processing device 20, and generates image data including an image of the workpiece W.
  • the imaging device 10 and the image processing device 20 may be integrated.
  • the imaging device 10 includes an illumination unit 11, a lens module 12, an image sensor 13, an image sensor control unit 14, a lens control unit 16, registers 15 and 17, and a communication interface (I/F) unit 18.
  • the illumination unit 11 irradiates the work W with light.
  • the light emitted from the illumination unit 11 is reflected on the surface of the work W and enters the lens module 12.
  • the illumination unit 11 may be omitted.
  • the lens module 12 is an optical system for forming an image of the light from the work W on the image pickup surface 13a of the image pickup device 13.
  • the focus position of the lens module 12 is variable within a predetermined movable range.
  • the focal position is the position of the point at which an incident light ray parallel to the optical axis intersects the optical axis after passing through the lens module 12.
  • the lens module 12 includes a lens 12a, a lens group 12b, a lens 12c, a movable portion 12d, and a focus adjusting portion 12e.
  • the lens 12a is a lens for changing the focal position of the lens module 12.
  • the focus adjustment unit 12e controls the lens 12a to change the focal position of the lens module 12.
  • the lens group 12b is a lens group for changing the focal length.
  • the zoom magnification is controlled by changing the focal length.
  • the lens group 12b is installed in the movable portion 12d and is movable along the optical axis direction.
  • the lens 12c is a lens fixed at a predetermined position in the image pickup apparatus 10.
  • the image sensor 13 is a photoelectric conversion element such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and generates an image signal by receiving light from the work W via the lens module 12.
  • the image sensor control unit 14 generates captured image data based on the image signal from the image sensor 13. At this time, the image sensor control unit 14 opens and closes the shutter so as to achieve a preset shutter speed (exposure time), and generates captured image data with a preset resolution. Information indicating the shutter speed and the resolution is stored in the register 15 in advance.
  • the lens control unit 16 adjusts the focus of the imaging device 10 according to the instruction stored in the register 17. Specifically, the lens control unit 16 controls the focus adjustment unit 12e so that the focus position changes in accordance with the imaged area of the work W. The focus adjustment unit 12e adjusts the focus position of the lens module 12 under the control of the lens control unit 16.
  • the lens control unit 16 may adjust the position of the lens group 12b by controlling the movable unit 12d so that the size of the region included in the imaging field of view of the work W is substantially constant. In other words, the lens control unit 16 can control the movable unit 12d so that the size of the region of the work W included in the imaging visual field falls within a predetermined range.
  • the lens control unit 16 may adjust the position of the lens group 12b according to the distance between the imaging position and the work W. In this embodiment, zoom adjustment is not essential.
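The size-keeping zoom adjustment could be sketched as a simple multiplicative correction; the function name, the tolerance, and the correction rule are hypothetical, since the text does not specify how the movable unit 12d is actually driven.

```python
def zoom_correction(reference_size, observed_size, tolerance=0.05):
    """Return a multiplicative zoom factor that brings the observed size of
    the work's region in the imaging field of view back toward the
    reference size, or 1.0 if the size is already within tolerance.

    Illustrative only: a stand-in for controlling the position of the
    lens group 12b via the movable unit 12d.
    """
    ratio = observed_size / reference_size
    if abs(ratio - 1.0) <= tolerance:
        return 1.0   # already within the predetermined range
    return 1.0 / ratio

assert zoom_correction(100.0, 100.0) == 1.0   # in range, no change
assert zoom_correction(100.0, 125.0) == 0.8   # work appears too large: zoom out
```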
  • the communication I/F unit 18 sends and receives data to and from the image processing device 20.
  • the communication I/F unit 18 receives an imaging instruction from the image processing device 20.
  • the communication I/F unit 18 transmits the image data generated by the image sensor control unit 14 to the image processing device 20.
  • the PLC 30 is connected to the image processing device 20 and controls the image processing device 20.
  • the PLC 30 controls the timing for the image processing apparatus 20 to output an image capturing command (image capturing trigger) to the image capturing apparatus 10.
  • the input device 40 and the display device 50 are connected to the image processing device 20.
  • the input device 40 receives user's inputs regarding various settings of the inspection system 1.
  • the display device 50 displays information regarding the setting of the inspection system 1, the result of the image processing of the work W by the image processing device 20, and the like.
  • the image processing device 20 acquires captured image data from the imaging device 10 and performs image processing on the acquired captured image data.
  • the image processing apparatus 20 includes a command generation unit 21, a calculation unit 22, an autofocus control unit (hereinafter, "AF control unit") 23, an inspection unit 24, an autofocus evaluation unit (hereinafter, "AF evaluation unit") 25, a determination unit 26, an output unit 27, a storage unit 230, a condition creation unit 28, and a setting unit 29.
  • the command generation unit 21 receives a control command from the PLC 30 and outputs an imaging command (imaging trigger) to the imaging device 10. Further, the command generation unit 21 specifies the processing conditions of the lens control unit 16 of the image pickup apparatus 10 to the image pickup apparatus 10.
  • the calculation unit 22 calculates the focus degree from the captured image data.
  • the focus degree is a degree indicating how much the object is in focus, and is calculated using various known methods.
  • the calculation unit 22 extracts a high frequency component by applying a high pass filter to the captured image data, and calculates the integrated value of the extracted high frequency components as the focus degree.
  • Such a focus degree indicates a value that depends on the difference in brightness of the image.
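A minimal sketch of this focus-degree calculation, assuming a 3x3 Laplacian as the high-pass filter (the text does not specify the filter kernel): the filter response is integrated in magnitude, so images with more brightness differences score higher.

```python
import numpy as np

# Assumed high-pass kernel: a 3x3 Laplacian. Any high-pass filter would do.
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def high_pass(image):
    """Naive 'valid' convolution of the image with the Laplacian kernel."""
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = (LAPLACIAN * image[i:i + 3, j:j + 3]).sum()
    return out

def focus_degree(image):
    """Integrated magnitude of the extracted high-frequency components."""
    return float(np.abs(high_pass(image.astype(float))).sum())

# An image with a sharp edge has more high-frequency content than a flat one.
edge = np.zeros((16, 16)); edge[:, 8:] = 255.0
assert focus_degree(edge) > focus_degree(np.full((16, 16), 128.0))
```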
  • the AF control unit 23 searches for the in-focus position, that is, the focus position at which the work W is in focus. Specifically, the AF control unit 23 acquires, from the calculation unit 22, the focus degree of each of a plurality of captured image data generated while changing the focal position of the lens module 12. The AF control unit 23 determines the focus position at which the acquired focus degree peaks as the in-focus position. "In focus" means that an image of the work W is formed on the image pickup surface 13a (see FIG. 2) of the image pickup device 13. The AF control unit 23 specifies the captured image data obtained when the focus position of the lens module 12 is at the in-focus position as the inspection image data.
  • the inspection unit 24 inspects the work W based on the inspection image indicated by the inspection image data and outputs the inspection result. Specifically, the inspection unit 24 inspects the work W by performing pre-registered image processing on the inspection image. The inspection unit 24 may perform the inspection using a known technique. When the inspection item is the presence/absence of scratches, the inspection result indicates “with scratches” or “without scratches”. When the inspection item is a dimension, the inspection result indicates whether or not the measured value of the dimension is within a predetermined range.
  • the AF evaluation unit 25 evaluates the reliability of the in-focus position based on the inspection image and outputs the evaluation result. Specifically, the AF evaluation unit 25 calculates an evaluation value indicating the reliability of the in-focus position by performing image processing registered in advance on the inspection image, and compares the calculated evaluation value with a threshold value. By doing so, the reliability of the in-focus position is evaluated. The AF evaluation unit 25 calculates, for example, an evaluation value that increases as the reliability increases, and outputs an evaluation result that the in-focus position is correct when the evaluation value is equal to or greater than the threshold value, and the evaluation value is less than the threshold value. If it is, the evaluation result that the focus position may be incorrect is output.
  • the determination unit 26 makes a comprehensive determination of the work W based on the inspection result output from the inspection unit 24 and the evaluation result output from the AF evaluation unit 25. For example, the determination unit 26 determines that the work W is non-defective when receiving the inspection result indicating that there is no scratch and the evaluation result indicating that the focus position is correct. The determination unit 26 determines that the work W is a defective product when receiving the inspection result indicating that there is a scratch and the evaluation result indicating that the focus position is correct. Further, when the determination unit 26 receives the evaluation result indicating that the focus position may be incorrect, the determiner 26 determines that the inspection may not be performed accurately due to the error of the focus position.
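The determination rules above can be condensed into a short sketch; the string labels are placeholders for whatever result encoding the real determination unit 26 uses.

```python
def overall_determination(inspection_result, af_evaluation):
    """Comprehensive determination combining the inspection result with the
    AF evaluation result, following the three rules described in the text.
    """
    # If the focus position may be wrong, the inspection itself is suspect.
    if af_evaluation == "focus may be incorrect":
        return "inspection may be inaccurate"
    # Focus position is correct: the inspection result decides the verdict.
    if inspection_result == "no scratch":
        return "non-defective"
    return "defective"

assert overall_determination("no scratch", "focus correct") == "non-defective"
assert overall_determination("scratch", "focus correct") == "defective"
assert overall_determination("no scratch", "focus may be incorrect") == "inspection may be inaccurate"
```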
  • the output unit 27 outputs the determination result of the determination unit 26.
  • the output unit 27 causes the display device 50 to display the determination result.
  • the output unit 27 may also display the inspection result and the evaluation result on the display device 50.
  • the storage unit 230 stores various data, programs and the like.
  • the storage unit 230 stores the inspection image data specified by the AF control unit 23 and the inspection image data that has been subjected to predetermined processing.
  • the storage unit 230 may store the inspection result by the inspection unit 24, the evaluation result by the AF evaluation unit 25, and the determination result by the determination unit 26.
  • the inspection system 1 includes the lens control unit 16, the calculation unit 22, the AF control unit 23, and the AF evaluation unit 25 as an autofocus processing unit that executes autofocus processing relating to the search for the in-focus position.
  • the conditions of the autofocus process are switched according to at least one of the type of work W and the inspection target location.
  • the storage unit 230 stores a condition table 232 in which the identification information for identifying the product type of the work W and the inspection target portion is associated with the condition data indicating the condition of the autofocus process.
  • the condition creating unit 28 creates the condition table 232 stored in the storage unit 230.
  • the condition creating unit 28 creates condition data for at least one of the type of the work W and the inspection target location, associates the created condition data with identification information identifying the type of the work W and the inspection target location, and stores the resulting condition table 232 in the storage unit 230.
  • the setting unit 29 reads the condition data corresponding to the type of the work W and the inspection target location from the condition table 232, and sets the condition indicated by the read condition data as the execution condition of the autofocus process. At least one of the lens control unit 16, the calculation unit 22, the AF control unit 23, and the AF evaluation unit 25, which operates as the autofocus processing unit, executes processing according to the conditions set by the setting unit 29.
  • the autofocus process is executed according to the condition data according to the type of work W and the inspection target location. Therefore, the focus position can be easily adjusted according to the type of the work W and the inspection target location. Furthermore, since the focus position at which the work W is focused is automatically searched for by the autofocus processing, an image focused on the work W can be obtained.
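A hedged sketch of the condition table 232 and the setting unit 29: identification information (work type, inspection location) keys the condition data that parameterizes the autofocus process. Every field name and value here is an illustrative assumption, not the patent's data format.

```python
# Assumed shape of the condition table 232: identification information maps
# to condition data for the autofocus process.
condition_table = {
    ("work_type_1", "connector"): {
        "roi": (100, 80, 64, 64),     # partial-area position and size
        "search_range": (0.0, 10.0),  # focus-position search range
        "start_position": 5.0,        # focus position at search start
        "threshold": 40.0,            # reliability threshold for evaluation
    },
}

def set_autofocus_conditions(work_type, location):
    """Setting unit sketch: read the condition data corresponding to the
    identification information and return it as the execution condition."""
    return condition_table[(work_type, location)]

conds = set_autofocus_conditions("work_type_1", "connector")
assert conds["start_position"] == 5.0
```

Switching work types then only requires adding a row to the table; the autofocus machinery itself is unchanged.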
  • FIG. 3 is a schematic diagram for explaining a method for searching a focus position. To simplify the description, FIG. 3 shows only one lens of the lens module 12.
  • let a be the distance from the principal point O of the lens module 12 to the target surface, b the distance from the principal point O to the imaging surface 13a, and f the focal length, that is, the distance from the principal point O to the focal position (rear focal position) F of the lens module 12. The work W is in focus when these distances satisfy the thin-lens relation 1/a + 1/b = 1/f … (1).
  • the distance between the imaging surface 13a and the inspection target location may change depending on the individual difference in height of the inspection target location of the workpiece W.
  • the focus position F of the lens module 12 is adjusted in order to obtain an image focused on the inspection target portion even when the distance between the imaging surface 13a and the inspection target portion changes.
  • the method of adjusting the focal position F of the lens module 12 includes the following method (A) and method (B).
  • the method (A) is a method in which at least one lens (for example, the lens 12a) forming the lens module 12 is translated in the optical axis direction.
  • the focal point F changes while the principal point O of the lens module 12 moves in the optical axis direction.
  • the distance b changes.
  • the focus position F corresponding to the distance b that satisfies the expression (1) is searched for as the focus position.
  • the method (B) is a method of changing the refraction direction of at least one lens (for example, the lens 12a) forming the lens module 12.
  • the focal position F changes as the focal length f of the lens module 12 changes.
  • the focus position F corresponding to the focal length f that satisfies the expression (1) is searched for as the focus position.
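Both method (A) and method (B) amount to satisfying the relation referred to as expression (1). Assuming the thin-lens form 1/a + 1/b = 1/f, the following sketch solves for either unknown; a real lens module is a compound system, so this is only a model.

```python
def image_distance(a, f):
    """Solve 1/a + 1/b = 1/f for b, the principal-point-to-imaging-surface
    distance that method (A) adjusts by translating a lens."""
    return 1.0 / (1.0 / f - 1.0 / a)

def focal_length(a, b):
    """Solve the same relation for f, the focal length that method (B)
    adjusts by changing the refraction of a lens."""
    return 1.0 / (1.0 / a + 1.0 / b)

# With the target surface 200 mm away and a 50 mm focal length, the image
# forms 200/3 mm (about 66.7 mm) behind the principal point.
b = image_distance(200.0, 50.0)
assert abs(b - 200.0 / 3.0) < 1e-9
assert abs(focal_length(200.0, b) - 50.0) < 1e-9
```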
  • the configuration of the lens 12a for changing the focal position F of the lens module 12 is not particularly limited. Examples of the configuration of the lens 12a are described below.
  • FIG. 4 is a diagram showing an example of the configuration of the lens module 12 whose focal position is variable.
  • the lens 12a forming the lens module 12 is moved in parallel.
  • at least one of the lens 12a, the lens group 12b, and the lens 12c constituting the lens module 12 may be translated.
  • the focal position F of the lens module 12 changes according to the above method (A). That is, in the configuration shown in FIG. 4, the focus adjustment unit 12e moves the lens 12a along the optical axis direction. By moving the position of the lens 12a, the focus position F of the lens module 12 changes.
  • the movable range Ra that the focus position F can take corresponds to the movable range Rb of the lens 12a.
  • the lens control unit 16 changes the focal position F of the lens module 12 by controlling the movement amount of the lens 12a.
  • the calculation unit 22 calculates the degree of focus from the captured image data at each focus position F.
  • the AF control unit 23 determines the focus position F corresponding to the movement amount of the lens 12a at which the focus degree reaches a peak as the focus position.
  • the focus adjusting lens is often composed of a plurality of lens groups.
  • the focus position F of the lens module 12 can be changed by controlling the movement amount of at least one lens forming the combined lens.
  • FIG. 5 is a diagram showing another example of the configuration of the lens module 12 whose focal position is variable.
  • the focal position F of the lens module 12 changes according to the above method (B).
  • the lens 12a shown in FIG. 5 is a liquid lens.
  • the lens 12a includes a translucent container 70 filled with a conductive liquid 71 and an insulating liquid 72, electrodes 73a, 73b, 74a, and 74b, insulators 75a and 75b, and insulating layers 76a and 76b. The conductive liquid 71 and the insulating liquid 72 do not mix and have different refractive indexes.
  • the electrodes 73a and 73b are fixed between the insulators 75a and 75b and the translucent container 70, respectively, and are located in the conductive liquid 71.
  • the electrodes 74a and 74b are arranged near the ends of the interface between the conductive liquid 71 and the insulating liquid 72.
  • An insulating layer 76a is interposed between the electrode 74a and the conductive liquid 71 and the insulating liquid 72.
  • An insulating layer 76b is interposed between the electrode 74b and the conductive liquid 71 and the insulating liquid 72.
  • the electrodes 74a and 74b are arranged at positions symmetrical with respect to the optical axis of the lens 12a.
  • the focus adjustment unit 12e includes a voltage source 12e1 and a voltage source 12e2.
  • the voltage source 12e1 applies the voltage Va between the electrode 74a and the electrode 73a.
  • the voltage source 12e2 applies the voltage Vb between the electrode 74b and the electrode 73b.
  • when the voltage Va is applied, the conductive liquid 71 is drawn toward the electrode 74a; when the voltage Vb is applied, it is drawn toward the electrode 74b. As a result, the curvature of the interface between the conductive liquid 71 and the insulating liquid 72 changes. Since the conductive liquid 71 and the insulating liquid 72 have different refractive indexes, the focus position F of the lens module 12 changes as the curvature of the interface changes.
  • the curvature of the interface between the conductive liquid 71 and the insulating liquid 72 depends on the magnitude of the voltages Va and Vb. Therefore, the lens control unit 16 changes the focus position F of the lens module 12 by controlling the magnitudes of the voltages Va and Vb.
  • the movable range Ra that the focus position F can take is determined by the voltage range that the voltages Va and Vb can take.
  • the calculation unit 22 calculates the degree of focus from the captured image data at each focus position F.
  • the AF control unit 23 determines the focus position F corresponding to the magnitudes of the voltages Va and Vb at which the focus degree reaches a peak as the focus position.
  • when the voltage Va and the voltage Vb are controlled to the same value, the interface between the conductive liquid 71 and the insulating liquid 72 changes symmetrically with respect to the optical axis.
  • the voltage Va and the voltage Vb may also be controlled to different values. In that case, the interface between the conductive liquid 71 and the insulating liquid 72 becomes asymmetric with respect to the optical axis, and the orientation of the imaging field of view of the imaging device 10 can be changed.
  • a liquid lens and a solid lens may be combined.
  • the focus position F of the lens module 12 is changed by using both the method (A) and the method (B), and the focus position F when the expression (1) is satisfied is determined as the focus position.
  • <Focus position search method> As methods by which the AF control unit 23 searches for the in-focus position, there are a hill-climbing method and a full-scan method; either method may be used.
  • the hill-climbing method changes the focus position of the lens module 12 within the set search range, ends the search when a focus position at which the focus degree peaks is found, and determines that focus position as the in-focus position.
  • in the hill-climbing method, the search direction is determined from the magnitude relationship between the focus degree at the focus position at the start of the search and the focus degree at an adjacent focus position: the direction in which the focus degree increases is taken as the search direction.
  • while changing the focus position in the search direction, the hill-climbing method sequentially calculates the difference between the focus degree at the previous focus position and the focus degree at the next focus position, and determines the focus position at which this difference turns negative as the in-focus position.
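A minimal sketch of the hill-climbing search as described above, with a callable standing in for capturing an image at each focus position and measuring its focus degree:

```python
def hill_climb(focus_degree, positions, start_index):
    """Hill-climbing search sketch: choose the direction of increasing focus
    degree from the start position, advance while the focus degree keeps
    rising, and return the position just before it drops.

    `focus_degree` is a stand-in for imaging at a focus position and
    computing the focus degree of the captured image.
    """
    i = start_index
    # Decide the search direction from the neighbour with the larger focus degree.
    if i + 1 < len(positions) and focus_degree(positions[i + 1]) >= focus_degree(positions[i]):
        step = 1
    else:
        step = -1
    # Advance while the difference to the next position stays positive.
    while 0 <= i + step < len(positions) and focus_degree(positions[i + step]) > focus_degree(positions[i]):
        i += step
    return positions[i]

# A single-peak focus curve: the search stops at the peak, position 3.
curve = {0: 1.0, 1: 3.0, 2: 6.0, 3: 9.0, 4: 5.0}
assert hill_climb(lambda p: curve[p], sorted(curve), 0) == 3
```

The early stop is what makes the method fast, and also why it can settle on a local peak rather than the global maximum of the range.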
  • the full-scan method changes the focal position of the lens module 12 over the entire set search range, obtains the focus degree at each focus position, and determines the focus position with the maximum focus degree as the in-focus position.
  • the full-scan method also includes a variant that performs a coarse first search process followed by a fine second search process.
  • the first search process changes the focus position at coarse pitch intervals over the entire search range and searches for the focus position with the maximum focus degree.
  • the second search process changes the focus position at fine pitch intervals within a local range including the focus position found in the first search process, and searches for the focus position with the maximum focus degree as the in-focus position.
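The coarse-then-fine full scan can be sketched as follows; the step sizes and the width of the local range (one coarse step on either side of the first-stage result) are illustrative choices, not values from the text.

```python
def full_scan(focus_degree, lo, hi, coarse_step, fine_step):
    """Two-stage full-scan sketch: a coarse first search over the whole
    range, then a fine second search in a local range around the coarse
    result. `focus_degree` stands in for imaging and measuring at each
    focus position."""
    def scan(start, stop, step):
        positions = []
        p = start
        while p <= stop:
            positions.append(p)
            p += step
        return max(positions, key=focus_degree)

    coarse = scan(lo, hi, coarse_step)            # first search process
    fine_lo = max(lo, coarse - coarse_step)       # local range around result
    fine_hi = min(hi, coarse + coarse_step)
    return scan(fine_lo, fine_hi, fine_step)      # second search process

# A focus curve peaking at 6.3: the coarse pass lands near the peak and the
# fine pass refines within one coarse step.
peak = 6.3
fd = lambda p: -(p - peak) ** 2
best = full_scan(fd, 0.0, 10.0, 1.0, 0.1)
assert abs(best - peak) < 0.1
```

The coarse pass keeps the number of captured images small while the fine pass preserves the full-scan guarantee inside the local range.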
  • the hill climbing method has the advantage that the search time is shorter than the full scan method.
  • however, since the in-focus position found by the hill-climbing method is a position at which the focus degree reaches a local peak, it is not always the position at which the focus degree is maximum within the search range.
  • the all-scan method can reliably search for the in-focus position where the in-focus degree is maximum within the search range, but the search time becomes long.
  • The AF control unit 23 may specify, as the inspection image data, the captured image data with the maximum focus degree.
  • For example, the AF control unit 23 may store the captured image data of each focus position and specify, from among the stored data, the captured image data of the focus position with the maximum focus degree as the inspection image data.
  • Alternatively, the AF control unit 23 may instruct the command generation unit 21 to output a command for adjusting the focus position to the in-focus position and outputting an image, and specify the captured image data received from the imaging device 10 in response to that command as the inspection image data.
  • FIG. 6 is a block diagram showing an example of the hardware configuration of the image processing apparatus according to the embodiment.
  • The image processing apparatus 20 in the example illustrated in FIG. 6 includes a CPU (Central Processing Unit) 210 as an arithmetic processing unit, a main memory 234 and a hard disk 236 as storage units, a camera interface 216, an input interface 218, a display controller 220, a PLC interface 222, a communication interface 224, and a data reader/writer 226. These units are connected via a bus 228 so that they can communicate with each other.
  • The CPU 210 performs various calculations by expanding programs (code) stored in the hard disk 236 into the main memory 234 and executing them in a predetermined order.
  • The functions of the image processing apparatus 20 are realized by the CPU 210 executing the control program 238.
  • The main memory 234 is typically a volatile storage device such as a DRAM (Dynamic Random Access Memory), and holds, in addition to programs read from the hard disk 236, image data acquired by the imaging device 10, work data, and the like. The hard disk 236 may also store various setting values and the like.
  • the storage unit 230 shown in FIG. 1 includes a main memory 234 and a hard disk 236. In addition to the hard disk 236 or in place of the hard disk 236, a semiconductor storage device such as a flash memory may be adopted.
  • the camera interface 216 mediates data transmission between the CPU 210 and the imaging device 10. That is, the camera interface 216 is connected to the imaging device 10 for imaging the work W and generating image data. More specifically, the camera interface 216 includes an image buffer 216a for temporarily storing image data from the image pickup apparatus 10. Then, when the image data of a predetermined number of frames is stored in the image buffer 216a, the camera interface 216 transfers the stored data to the main memory 234. The camera interface 216 also sends an image pickup command to the image pickup apparatus 10 in accordance with an internal command generated by the CPU 210.
  • the input interface 218 mediates data transmission between the CPU 210 and the input device 40. That is, the input interface 218 receives an operation command given by the operator operating the input device 40.
  • the display controller 220 is connected to the display device 50 and notifies the user of the result of processing in the CPU 210. That is, the display controller 220 controls the screen of the display device 50.
  • the output unit 27 shown in FIG. 1 is configured by the display controller 220.
  • the PLC interface 222 mediates data transmission between the CPU 210 and the PLC 30. More specifically, the PLC interface 222 transmits the control command from the PLC 30 to the CPU 210.
  • the communication interface 224 mediates data transmission between the CPU 210 and the console (or personal computer or server device).
  • the communication interface 224 is typically composed of Ethernet (registered trademark) or USB (Universal Serial Bus).
  • The data reader/writer 226 mediates data transmission between the CPU 210 and the memory card 206, which is a recording medium. That is, the memory card 206 is distributed with a program to be executed by the image processing apparatus 20 stored on it, and the data reader/writer 226 reads that program from the memory card 206. The data reader/writer 226 also writes image data acquired by the imaging device 10 and/or processing results of the image processing device 20 to the memory card 206 in response to internal commands from the CPU 210.
  • The memory card 206 may be a general-purpose semiconductor storage device such as an SD (Secure Digital) card, a magnetic storage medium such as a flexible disk, an optical storage medium such as a CD-ROM (Compact Disc Read Only Memory), or the like.
  • FIG. 7 is a diagram schematically showing the image pickup of the work W by the image pickup apparatus.
  • The work W in the example shown in FIG. 7 is a transparent body (such as glass). Because the work W is transparent, the focus can be placed on either its front surface or its back surface.
  • the front surface of the work W can be inspected by acquiring the inspection image data focused on the front surface of the work W.
  • the back surface of the work W can be inspected by acquiring the inspection image data focused on the back surface of the work W.
  • The imaging device 10 preferably does not search for the in-focus position over the entire movable range Ra of the focal position of the lens module 12 (see FIGS. 4 and 5), but instead searches within a search range that is a part of the movable range Ra.
  • For example, when inspecting the front surface of the work W, the in-focus position is searched within the search range obtained by excluding from the movable range Ra the range of focal positions that focus on the back surface of the work W.
  • FIG. 8 is a diagram showing an image including an image of a work W1 of a relatively large type.
  • FIG. 9 is a diagram showing an image including an image of the work W2 of a relatively small type.
  • The size of the work in the image 65 differs for each product type.
  • The AF control unit 23 therefore preferably searches for the in-focus position based on the focus degree in a partial area of the image 65 that includes the image of the work W2 (hereinafter referred to as the "focus degree calculation area A1"). This reduces the possibility of acquiring an image focused on the background.
  • The AF evaluation unit 25 calculates the focus degree from the inspection image represented by the inspection image data, and calculates an evaluation value indicating the reliability of the in-focus position based on the calculated focus degree.
  • the focus degree is, for example, the integrated value of the high frequency components extracted from the image.
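One common way to realize such a focus degree (a sketch of the general idea, not the patent's exact measure) is to integrate the squared response of a high-pass filter, optionally restricted to a focus degree calculation area A1. Here is a minimal NumPy version using a 4-neighbor Laplacian kernel; the function name and ROI convention are assumptions:

```python
import numpy as np

def focus_degree(image, roi=None):
    """Integrate high-frequency energy: sum of squared 4-neighbor
    Laplacian responses, optionally limited to an ROI (y0, y1, x0, x1)."""
    img = image.astype(float)
    if roi is not None:
        y0, y1, x0, x1 = roi
        img = img[y0:y1, x0:x1]
    lap = (img[1:-1, :-2] + img[1:-1, 2:] +
           img[:-2, 1:-1] + img[2:, 1:-1] - 4 * img[1:-1, 1:-1])
    return float((lap ** 2).sum())

# A sharp edge yields a higher focus degree than a flat patch.
sharp = np.zeros((8, 8)); sharp[:, 4:] = 255
flat = np.full((8, 8), 128)
assert focus_degree(sharp) > focus_degree(flat)
```

Restricting the computation to an ROI is what allows the focus degree to be taken only from the focus degree calculation area A1 described above.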
  • the reference focus degree d is a focus degree calculated from an image focused on the inspection target portion of the reference work, and is calculated in advance by an experiment.
  • The AF evaluation unit 25 calculates the evaluation value based on the focus degree of the first peak and the focus degree of the second peak of the focus degree waveform obtained when searching for the in-focus position.
  • the focus degree waveform is a waveform showing a change in the focus degree with respect to the focus position when the focus position of the lens module 12 is changed.
  • the first peak is the peak with the highest degree of focus.
  • the second peak is the peak having the second highest focus degree.
  • FIG. 10 is a diagram showing an example of a focus degree waveform.
  • the focus degree waveform of the example shown in FIG. 10 includes two peaks at the focus positions F1 and F2.
  • The focus position F1 is the focus position when the inspection target portion of the work W is in focus. When the autofocus process operates normally, the focus degree waveform therefore contains only one peak, at focus position F1. For some reason, however, a peak may also occur at a focus position different from F1; for example, when a sheet bearing a high-contrast pattern appears in the image, a peak appears at a focus position different from F1.
  • In such a case, the focus position F2, which differs from the focus position F1, may be erroneously determined to be the in-focus position, and the image data captured with the focus adjusted to position F2 may be output to the image processing device 20 as the inspection image data.
  • the AF evaluation unit 25 may calculate an evaluation value that decreases as the reliability of the in-focus position increases.
  • the AF evaluation unit 25 may calculate the evaluation value using a known technique.
  • For example, the AF evaluation unit 25 may calculate the evaluation value using the techniques described in International Publication No. 2017/056557 (Patent Document 2), JP 2010-78681 A (Patent Document 3), or JP 10-170817 A (Patent Document 4).
  • The AF evaluation unit 25 may calculate the evaluation value based on the focus degree calculated from the entire area of the inspection image, or based on the focus degree calculated from the focus degree calculation area A1 of the inspection image (see FIGS. 8 and 9).
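For instance, a reliability score might compare the highest and second-highest peaks of the focus degree waveform: a single dominant peak suggests a trustworthy in-focus position, while two comparable peaks (as in FIG. 10) do not. A sketch under that assumption (the scoring rule is illustrative, not the patent's formula):

```python
def peak_reliability(waveform):
    """Return the first-peak / second-peak ratio over the local maxima
    of a focus degree waveform (list of focus degrees per focus position).
    A large ratio means one dominant peak, i.e. higher reliability."""
    peaks = [waveform[i] for i in range(1, len(waveform) - 1)
             if waveform[i - 1] < waveform[i] >= waveform[i + 1]]
    if len(peaks) < 2:
        return float('inf')      # a single peak: maximally reliable
    peaks.sort(reverse=True)
    return peaks[0] / peaks[1]

one_peak = [0, 2, 5, 9, 5, 2, 0]
two_peaks = [0, 8, 1, 9, 1, 0, 0]
assert peak_reliability(one_peak) > peak_reliability(two_peaks)
```

A score like this would then be compared against a threshold to decide whether the searched in-focus position can be trusted.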
  • the condition creation unit 28 displays a setting screen for supporting the setting of the search range on the display device 50, and sets the search range for each inspection target portion of the work W according to the input to the input device 40.
  • FIG. 11 is a diagram showing an example of a setting screen for supporting the setting of the search range of the in-focus position.
  • the setting screen 51 in the example shown in FIG. 11 includes areas 52a and 52b, knobs 55 and 57, an OK button 60, and a cancel button 61.
  • the setting screen 51 of the example shown in FIG. 11 is displayed in the inspection system 1 in which the lens module 12 includes the lens 12a of the example shown in FIG.
  • the condition creation unit 28 causes the command generation unit 21 to output a scan command for changing the focus position in the entire movable range Ra to the imaging device 10 in a state where the reference work is placed on the stage 90.
  • In response, the lens control unit 16 of the imaging device 10 moves the lens 12a from one end of the movable range Rb to the other at a predetermined interval, thereby changing the focal position F of the lens module 12 over the entire movable range Ra.
  • the calculation unit 22 calculates the degree of focus for the imaged image data of each focus position F received from the image pickup apparatus 10.
  • the condition creating unit 28 displays a line graph 53, which is a graphic showing the relationship between the focus position of the lens module 12 and the focus degree, in the area 52a.
  • the line graph 53 shows the relationship between the focus position and the focus degree in the entire movable range Ra.
  • the horizontal axis represents the movement amount of the lens 12a
  • the vertical axis represents the focus degree.
  • The movement amount of the lens 12a is 0 when the focal position F of the lens module 12 is at one end of the movable range Ra, and 100 when the focal position F is at the other end of the movable range Ra.
  • a point 56a corresponding to the center of the search range of the in-focus position is displayed. Further, in the area 52b, a vertical line 56b which is drawn from the point 56a to the horizontal axis is displayed.
  • the default position of the point 56a is preset.
  • the default position of the point 56a is, for example, a position where the movement amount of the lens 12a is 0.
  • A dotted line 58 indicating the movement amount of the lens 12a corresponding to the lower limit of the search range and a dotted line 59 indicating the movement amount corresponding to the upper limit of the search range are displayed in an overlapping manner.
  • the condition creating unit 28 displays the captured image 54 represented by the captured image data corresponding to the movement amount of the point 56a in the area 52b.
  • the condition creating unit 28 switches the captured image displayed in the area 52b every time the position of the point 56a is changed.
  • the knob 55 indicates the current position of the point 56a.
  • the setting unit 29 updates the positions of the point 56a, the perpendicular line 56b, and the dotted lines 58 and 59 according to the operation on the knob 55.
  • the operator can change the point 56a corresponding to the center of the search range to an arbitrary position on the line graph 53 by operating the knob 55 using the input device 40.
  • the knob 57 is for adjusting the width of the search range of the in-focus position.
  • the width of the search range is the difference between the amount of movement of the lens 12a corresponding to the lower limit of the search range and the amount of movement of the lens 12a corresponding to the upper limit of the search range.
  • The value "d" in the value "±d" (d ranges from 0 to 100) indicated by the knob 57 represents both the difference between the movement amount corresponding to the point 56a and the movement amount corresponding to the lower limit of the search range, and the difference between the movement amount corresponding to the upper limit of the search range and the movement amount corresponding to the point 56a.
  • The width of the search range is therefore twice the "d" of the value "±d" indicated by the knob 57.
  • the condition creating unit 28 updates the positions of the dotted lines 58 and 59 according to the operation of the knob 57.
  • the operator can change the width of the search range centered on the point 56a by operating the knob 57 using the input device 40.
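The relationship between the two knobs and the resulting search range can be sketched as follows (units are the 0-100 lens movement amount; the clamping to the movable range is an assumption, since the original does not say what happens when the range would exceed the ends):

```python
def search_range(center, d):
    """Search range of width 2*d centered on the point 56a position,
    clamped to the 0-100 movement-amount scale of the lens 12a."""
    lower = max(0, center - d)    # dotted line 58
    upper = min(100, center + d)  # dotted line 59
    return lower, upper

assert search_range(50, 20) == (30, 70)
assert search_range(10, 20) == (0, 30)   # clamped at the low end
```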
  • the OK button 60 is a button for registering the currently set search range.
  • the cancel button 61 is a button for discarding the currently set search range.
  • When the OK button 60 is operated, the condition creation unit 28 prompts the user to input identification information identifying the product type of the work W and the inspection target portion, and acquires the identification information from the input device 40.
  • the condition creation unit 28 stores in the storage unit 230 a condition table 232 that associates the acquired identification information with the condition data that includes the data that specifies the currently set search range.
  • The setting unit 29 receives input of identification information identifying the product type to be inspected and the inspection target location, reads the condition data corresponding to the received identification information, and sets the conditions indicated by the read condition data as the execution conditions of the autofocus process.
  • the command generation unit 21 outputs to the imaging device 10 a command to change the focus position F within the search range included in the execution conditions set by the setting unit 29.
  • the lens control unit 16 changes the focus position F within the set search range. Then, the AF control unit 23 searches for the in-focus position from the instructed search range.
  • methods for searching the in-focus position by the AF control unit 23 include the hill climbing method and the all-scan method.
  • the condition creation unit 28 may set a focus position search method for each type of work W and each inspection target location.
  • The condition creating unit 28 displays a screen prompting the user to input identification information identifying the product type of the work W and the inspection target portion, together with a focus position search method (either the hill-climbing method or the all-scan method), and sets the search method according to the input to the input device 40.
  • In the hill-climbing method, it is preferable to start the search from the focus position at the center of the specified range.
  • In the all-scan method, it is preferable to start the search from the focus position at one end of the specified range. In this way, the in-focus position can be searched for efficiently.
  • When the hill-climbing method is set, the condition creating unit 28 determines the focus position at the center of the search range as the focus position at which the search for the in-focus position starts (hereinafter referred to as the "initial position").
  • When the all-scan method is set, the condition creating unit 28 determines the focus position at one end of the search range as the initial position. The condition creating unit 28 then creates condition data including data designating the determined initial position together with the search method.
  • the condition creation unit 28 stores a condition table 232 in the storage unit 230 in which the identification information for identifying the product type of the work W and the inspection target location is associated with the created condition data.
  • The setting unit 29 receives input of identification information identifying the product type to be inspected and the inspection target location, reads the condition data corresponding to the received identification information, and sets the conditions indicated by the read condition data as the execution conditions of the autofocus process.
  • The command generation unit 21 outputs to the imaging device 10 a command to set the focal position of the lens module 12 to the initial position included in the execution conditions while no in-focus position search is in progress.
  • The lens control unit 16 of the imaging device 10 moves the focal position of the lens module 12 to the set initial position and stands by for the imaging command for the next work to be inspected.
  • the lens control unit 16 can immediately change the focal position of the lens module 12 within the search range.
  • The condition creation unit 28 displays on the display device 50 a setting screen for supporting the setting of the focus degree calculation area A1, and sets the focus degree calculation area for each work type according to the input to the input device 40.
  • the worker puts the reference work for each product type in a predetermined posture at a predetermined position on the stage 90 (see FIG. 1).
  • the image processing device 20 outputs an imaging command to the imaging device 10 and acquires image data from the imaging device 10.
  • The condition creating unit 28 causes the display device 50 to display the image indicated by the image data acquired from the imaging device 10 (for example, the image shown in FIG. 8 or 9) and prompts the user to specify the focus degree calculation area A1.
  • the condition creating unit 28 sets the focus degree calculation area A1 according to the input to the input device 40. For example, the operator inputs the four vertices of the focus degree calculation area A1 which is a rectangle.
  • The worker sets, as the focus degree calculation area A1, a region that is at the same height as the inspection target portion of the work and includes a high-contrast portion.
  • the condition creating unit 28 creates condition data including data designating the size and position/orientation of the focus degree calculation area A1.
  • the high-contrast portion includes, in addition to the edge portion, a character printed on the surface, a pattern formed on the surface, a portion to which parts such as screws are attached, and the like.
  • a rectangular focus degree calculation area A1 is set, but the shape of each area is not limited to this.
  • the shape of the focus degree calculation area A1 may be a circular shape, a frame shape, or any free shape capable of forming an area.
  • the focus degree calculation area A1 does not need to be limited to a single area.
  • the focus degree calculation area A1 may be a plurality of areas that exist in a dispersed manner.
  • the condition creation unit 28 prompts the user to input identification information for identifying the type of work W and the inspection target portion, and acquires the identification information from the input device 40.
  • the condition creation unit 28 stores the acquired identification information and the condition data in the storage unit 230 in association with each other.
  • The setting unit 29 receives input of identification information identifying the product type to be inspected and the inspection target location, reads the condition data corresponding to the received identification information, and sets the conditions indicated by the read condition data as the execution conditions of the autofocus process.
  • the calculation unit 22 calculates the focus degree in the focus degree calculation area A1 of the captured image obtained by changing the focus position. Thereby, the focus degree can be calculated from the focus degree calculation area A1 corresponding to the product type and the inspection target portion, and an image focused on the inspection target portion of the work can be easily obtained.
  • the AF evaluation unit 25 calculates an evaluation value based on the focus degree calculated from the focus degree calculation area A1 in the inspection image represented by the inspection image data. Thereby, the reliability of the in-focus position can be appropriately evaluated according to the product type and the inspection target portion.
  • the worker sequentially puts a plurality of reference works for each product type at predetermined positions on the stage 90 (see FIG. 1) in a predetermined posture.
  • the image processing device 20 outputs an imaging command to the imaging device 10, and acquires image data focused on the inspection target portion of the reference work from the imaging device 10.
  • the image processing device 20 acquires a plurality of image data respectively corresponding to a plurality of reference works.
  • the condition creating unit 28 calculates an evaluation value for each of the plurality of image data acquired from the imaging device 10 using the same method as the AF evaluation unit 25.
  • the condition creating unit 28 determines the threshold value based on the calculated statistical data of the evaluation value.
  • the condition creating unit 28 creates condition data including data designating a threshold value.
  • The condition creating unit 28 may determine the minimum of the calculated evaluation values as the threshold, or may calculate the mean and standard deviation σ of the calculated evaluation values and determine a value obtained from them (for example, mean − 3σ) as the threshold.
  • condition creation unit 28 may display the statistical data of the calculated evaluation value on the display device 50 and determine the value input to the input device 40 as the threshold value.
  • the operator may input a threshold value for each type of work W and each inspection target location.
  • the condition creation unit 28 prompts the user to input identification information for identifying the type of work W and the inspection target portion, and acquires the identification information from the input device 40.
  • the condition creation unit 28 stores the acquired identification information and the condition data in the storage unit 230 in association with each other.
  • The setting unit 29 receives input of identification information identifying the product type to be inspected and the inspection target location, reads the condition data corresponding to the received identification information, and sets the conditions indicated by the read condition data as the execution conditions of the autofocus process.
  • the setting unit 29 sets the threshold included in the execution condition in the AF evaluation unit 25.
  • the AF evaluation section 25 evaluates the reliability of the in-focus position by comparing the evaluation value calculated from the inspection image with the set threshold value. Thereby, the reliability of the in-focus position can be appropriately evaluated according to the product type and the inspection target portion.
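The threshold determination from reference-work evaluation values can be sketched like this (pure Python; the `mean - 3*sigma` rule is the example given above, the `use_min` switch and the assumption that a higher evaluation value means higher reliability are illustrative):

```python
import statistics

def determine_threshold(eval_values, use_min=False):
    """Derive the pass/fail threshold from evaluation values measured
    on several focused reference works of one product type."""
    if use_min:
        return min(eval_values)
    mean = statistics.mean(eval_values)
    sigma = statistics.pstdev(eval_values)
    return mean - 3 * sigma

def focus_is_reliable(evaluation, threshold):
    # Higher evaluation value = higher reliability in this sketch.
    return evaluation >= threshold

vals = [9.8, 10.1, 10.0, 9.9, 10.2]
t = determine_threshold(vals)
assert focus_is_reliable(10.0, t)
assert not focus_is_reliable(5.0, t)
```

Because the statistics come from reference works of a specific product type, a separate threshold naturally emerges per type and inspection target portion, which is what the condition table stores.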
  • The condition creating unit 28 may, instead of or in addition to the threshold, set an evaluation function for calculating the evaluation value from the focus degree according to the product type and the inspection target location. In that case, the condition creating unit 28 creates condition data including data designating the evaluation function.
  • FIG. 12 is a diagram showing an example of a table in which product types and condition data are associated with each other.
  • FIG. 13 is a diagram showing an example of a table in which inspection target locations are associated with condition data.
  • FIG. 14 is a diagram showing an example of a table in which product types, inspection target locations, and condition data are associated with each other.
  • The condition creation unit 28 creates, for example, a condition table 232 as shown in FIG. 12.
  • The condition creation unit 28 may instead create a condition table 232 as shown in FIG. 13.
  • The condition creation unit 28 may also create a condition table 232 as shown in FIG. 14.
  • the condition table 232 includes condition data of different items depending on at least one of the product type and the inspection target part.
  • the number of items included in the condition table 232 may be one or more.
  • the condition table 232 illustrated in FIG. 14 includes condition data that specifies the search range and the focus degree calculation area.
  • Condition data for items that do not differ by product type or inspection target location is stored in the storage unit 230 as common data.
  • The setting unit 29 may then set as the execution conditions the conditions indicated by both the condition data corresponding to the product type and inspection target location and the common data.
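The lookup performed by the setting unit 29 — condition data keyed by product type and/or inspection target location, merged with common data — might be sketched as follows (keys and field names are hypothetical, in the style of the FIG. 14 table):

```python
# Hypothetical condition table keyed by (product type, inspection location).
condition_table = {
    ("type-A", "surface"): {"search_range": (20, 60), "focus_area": (0, 100, 0, 200)},
    ("type-A", "edge"):    {"search_range": (40, 80), "focus_area": (50, 150, 0, 80)},
}
# Items that do not differ per type/location are held as common data.
common_data = {"search_method": "hill_climb", "threshold": 9.5}

def execution_conditions(product_type, location):
    """Merge the common data with the condition data for this key;
    per-key entries override common ones if both define an item."""
    conditions = dict(common_data)
    conditions.update(condition_table[(product_type, location)])
    return conditions

cond = execution_conditions("type-A", "edge")
assert cond["search_range"] == (40, 80)
assert cond["threshold"] == 9.5
```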
  • FIG. 15 is a flowchart showing an example of the flow of the inspection process of the inspection system according to the embodiment.
  • The condition creating unit 28 creates in advance condition data corresponding to at least one of the product type and the inspection target portion, and stores in the storage unit 230 a condition table 232 that associates the condition data with identification information identifying at least one of the product type and the inspection target portion.
  • the image processing device 20 determines whether or not an instruction to switch the product type and the inspection target portion has been input (step S1).
  • the operator inputs the switching instruction and the identification information for identifying the type and the inspection target part after the switching to the input device 40.
  • When a switching instruction has been input, the setting unit 29 of the image processing apparatus 20 reads the condition data corresponding to the input identification information from the condition table 232 in step S2, and sets the conditions indicated by the read condition data as the execution conditions of the autofocus processing unit (lens control unit 16, calculation unit 22, AF control unit 23, and AF evaluation unit 25).
  • When no switching instruction has been input, the setting unit 29 sets the same conditions as the previous time as the execution conditions of the autofocus processing unit (lens control unit 16, calculation unit 22, AF control unit 23, and AF evaluation unit 25) (step S3).
  • In step S4, the imaging device 10 and the image processing device 20 execute the in-focus position search process.
  • In step S4, the lens control unit 16, the calculation unit 22, and the AF control unit 23 execute the search for the in-focus position according to the condition data set in step S2 or step S3.
  • the AF control unit 23 specifies the inspection image data when the focus position of the lens module 12 is adjusted to the in-focus position (step S5).
  • the inspection unit 24 of the image processing apparatus 20 inspects the work W based on the inspection image indicated by the inspection image data, and outputs the inspection result (step S6).
  • the AF evaluation unit 25 of the image processing device 20 evaluates the reliability of the in-focus position based on the inspection image indicated by the inspection image data, and outputs the evaluation result (step S7).
  • the AF evaluation unit 25 evaluates the reliability of the in-focus position according to the condition data set in step S2 or step S3.
  • The order of steps S6 and S7 is not limited to this: step S6 may be executed after step S7, or steps S6 and S7 may be executed in parallel.
  • the determination unit 26 of the image processing device 20 makes a comprehensive determination based on the inspection result and the evaluation result (step S8). After that, the output unit 27 displays the determination result on the display device 50 (step S9). After step S9, the inspection process ends.
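Steps S1-S9 above can be summarized in a short Python sketch (every function and field here is a stand-in for the corresponding unit, not an API from the original; the display step S9 is folded into the returned verdict):

```python
def inspection_cycle(state, switch_instruction, condition_table, units):
    """One pass of the FIG. 15 flow; `units` bundles stand-ins for the
    autofocus, inspection, and AF-evaluation units."""
    if switch_instruction is not None:                             # S1: switch?
        state["conditions"] = condition_table[switch_instruction]  # S2: read
    cond = state["conditions"]                                     # S3: else reuse
    image = units["autofocus"](cond)                               # S4-S5: search, capture
    ok_inspect = units["inspect"](image)                           # S6: inspect work
    ok_focus = units["evaluate"](image, cond)                      # S7: focus reliability
    return ok_inspect and ok_focus                                 # S8: comprehensive verdict

# Toy stand-ins: an "image" is reduced to its focus degree.
units = {
    "autofocus": lambda cond: cond["best_focus_degree"],
    "inspect": lambda img: img > 0,
    "evaluate": lambda img, cond: img >= cond["threshold"],
}
table = {"type-A": {"best_focus_degree": 12.0, "threshold": 9.5}}
state = {"conditions": None}
assert inspection_cycle(state, "type-A", table, units) is True
assert inspection_cycle(state, None, table, units) is True  # S3: conditions reused
```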
  • the embodiment includes the following disclosures.
  • (Structure 1) An inspection system (1) comprising: an optical system (12) whose focal position is variable; an image sensor (13) that generates a captured image by receiving light from an object (W) via the optical system (12); an autofocus processing unit (16, 22, 23, 25) that executes autofocus processing for searching for an in-focus position, which is the focal position that focuses on the object (W), based on the captured image; an inspection unit (22) that inspects the object based on an inspection image generated when the focal position is adjusted to the in-focus position; and a setting unit (28) that sets condition data of the autofocus processing according to at least one of the type of the object and the inspection target location, wherein the autofocus processing unit (16, 22, 23, 25) executes the autofocus processing according to the condition data.
  • the autofocus processing unit (16, 22, 23, 25) searches for the in-focus position based on the in-focus degree in a partial area of the captured image,
  • The inspection system (1) according to Structure 1, wherein the condition data includes data designating a size and a position/orientation of the partial area.
  • the autofocus process includes a process of determining the quality of the focus position by comparing an evaluation value indicating the reliability of the focus position with a threshold value,
  • The inspection system (1) according to Structure 1, wherein the condition data includes data designating at least one of an evaluation function for calculating the evaluation value and the threshold value.
  • the autofocus process includes a process of determining the quality of the focus position by comparing an evaluation value indicating the reliability of the focus position with a threshold value, The evaluation value is calculated based on the degree of focus in a partial area of the inspection image,
  • The inspection system (1) according to Structure 1, wherein the condition data includes data designating a size and a position/orientation of the partial area.
  • (Structure 6) An inspection method in an inspection system (1) comprising: an optical system (12) whose focal position is variable; an image sensor (13) that generates a captured image by receiving light from an object (W) via the optical system (12); and an autofocus processing unit (16, 22, 23, 25) that executes autofocus processing for searching for an in-focus position, which is the focal position that focuses on the object (W), based on the captured image, the inspection method comprising: a step of setting condition data of the autofocus processing according to at least one of a type of the object (W) and an inspection target location; a step of causing the autofocus processing unit (16, 22, 23, 25) to execute the autofocus processing according to the condition data; and a step of inspecting the object (W) based on an inspection image generated when the focal position is adjusted to the in-focus position.
  • (Structure 7) A program (215) for causing a computer to execute an inspection method in an inspection system (1) comprising: an optical system (12) whose focal position is variable; an image sensor (13) that generates a captured image by receiving light from an object (W) via the optical system (12); and an autofocus processing unit (16a, 16b, 23) that executes autofocus processing for searching, based on the captured image, for an in-focus position, i.e., the focal position at which the object (W) is in focus,
  • wherein the inspection method includes a step of setting condition data for the autofocus processing according to at least one of the type of the object and the inspection target location.
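The three claimed steps (set condition data per workpiece type and target location, run autofocus according to the conditions, inspect the image at the in-focus position) can be sketched as follows. The condition table, field names, and return values are all illustrative assumptions, not details from the publication.

```python
from dataclasses import dataclass

@dataclass
class AFCondition:
    """Autofocus condition data (field names are illustrative)."""
    roi: tuple        # size and position of the partial area
    threshold: float  # reliability threshold

# Hypothetical table keyed by (workpiece type, inspection target location).
CONDITIONS = {
    ("board_A", "connector"): AFCondition(roi=(0, 0, 64, 64), threshold=2.0),
    ("board_A", "solder"): AFCondition(roi=(64, 64, 32, 32), threshold=3.0),
}

def inspect(workpiece, location, capture, autofocus, check):
    """The three claimed steps, as plain control flow."""
    cond = CONDITIONS[(workpiece, location)]   # step 1: set condition data
    z, ok = autofocus(capture, cond)           # step 2: autofocus per conditions
    if not ok:
        return "focus_failed"
    return check(capture(z))                   # step 3: inspect the in-focus image
```

Separating `CONDITIONS` from the control flow mirrors the point of the claim: switching workpiece type or target location changes only data, not the autofocus or inspection code.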

Abstract

This inspection system comprises an autofocus processing unit; an inspection unit that inspects an object on the basis of an inspection image generated when the focal position is set to an in-focus position; and a setting unit that sets condition data for the autofocus processing according to at least one of the type of the object and the corresponding location to be inspected. The autofocus processing unit performs the autofocus processing according to the condition data. The focal position can thus be easily adjusted according to the type of the object and the location to be inspected, and a focused image of the object can be obtained.
PCT/JP2019/044393 2018-11-27 2019-11-12 Système de contrôle, procédé de contrôle, et programme WO2020110712A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018221071A JP2020086152A (ja) 2018-11-27 2018-11-27 検査システム、検査方法およびプログラム
JP2018-221071 2018-11-27

Publications (1)

Publication Number Publication Date
WO2020110712A1 true WO2020110712A1 (fr) 2020-06-04

Family

ID=70854275

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/044393 WO2020110712A1 (fr) 2018-11-27 2019-11-12 Système de contrôle, procédé de contrôle, et programme

Country Status (2)

Country Link
JP (1) JP2020086152A (fr)
WO (1) WO2020110712A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023103426A1 (fr) * 2022-08-04 2023-06-15 中电科机器人有限公司 Procédé et dispositif de mise au point automatique pour inspection visuelle de partie

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2023120845A (ja) * 2022-02-18 2023-08-30 あっと株式会社 毛細血管撮像システム、毛細血管撮像システム用サーバ装置、及び毛細血管撮像プログラム
CN114979491B (zh) * 2022-05-31 2023-09-19 广东利元亨智能装备股份有限公司 一种图像获取方法与装置
JP7415216B1 (ja) 2023-09-11 2024-01-17 ダイトロン株式会社 外観検査装置

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010134915A (ja) * 2008-11-04 2010-06-17 Omron Corp 画像処理装置
JP2011034360A (ja) * 2009-07-31 2011-02-17 Optoelectronics Co Ltd 光学的情報読取装置及び光学的情報読取方法
JP2012003197A (ja) * 2010-06-21 2012-01-05 Olympus Corp 顕微鏡装置および画像取得方法
JP2014130221A (ja) * 2012-12-28 2014-07-10 Canon Inc 画像処理装置、その制御方法、画像処理システム、及びプログラム
JP2014215582A (ja) * 2013-04-30 2014-11-17 オリンパス株式会社 共焦点顕微鏡装置
JP2017116459A (ja) * 2015-12-25 2017-06-29 大塚電子株式会社 光学特性測定装置および光学系

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7030351B2 (en) * 2003-11-24 2006-04-18 Mitutoyo Corporation Systems and methods for rapidly automatically focusing a machine vision inspection system

Also Published As

Publication number Publication date
JP2020086152A (ja) 2020-06-04

Similar Documents

Publication Publication Date Title
WO2020110712A1 (fr) Système de contrôle, procédé de contrôle, et programme
US10698308B2 (en) Ranging method, automatic focusing method and device
JP5895270B2 (ja) 撮像装置
US10120163B2 (en) Auto-focus method for a coordinate-measuring apparatus
US9667853B2 (en) Image-capturing apparatus
JP3996617B2 (ja) 画像歪み補正機能を備えたプロジェクタ装置
JP2012149928A (ja) Afレンズユニットの特性検査装置およびその特性検査方法、制御プログラム、可読記憶媒体
US10827114B2 (en) Imaging system and setting device
US9979858B2 (en) Image processing apparatus, image processing method and program
US10317665B2 (en) Method for correcting illumination-dependent aberrations in a modular digital microscope, digital microscope and data-processing program
JP7287533B2 (ja) 検査システム、検査方法およびプログラム
JP2000028336A (ja) 形状測定装置及び形状測定方法
WO2020110711A1 (fr) Système d'inspection, procédé d'inspection et programme
JP2008281887A (ja) 合焦検出装置、合焦検出方法および合焦検出プログラム
JP3382346B2 (ja) 撮像装置
JP7087984B2 (ja) 撮像システムおよび設定装置
JP6312410B2 (ja) アライメント装置、顕微鏡システム、アライメント方法、及びアライメントプログラム
JP7135586B2 (ja) 画像処理システム、画像処理方法およびプログラム
JP6089232B2 (ja) 撮像装置
JPH109819A (ja) 距離計測装置
JP2014029429A5 (ja) 画像処理装置、撮像装置、制御方法、及びプログラム
JPS599613A (ja) 自動焦点調節方法
JP3226678B2 (ja) 微小寸法測定装置
JP2015210396A (ja) アライメント装置、顕微鏡システム、アライメント方法、及びアライメントプログラム
JP6615438B2 (ja) モデル登録方法及びモデル登録装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19889144

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19889144

Country of ref document: EP

Kind code of ref document: A1