WO2020039922A1 - Image processing system, image processing method and program - Google Patents

Image processing system, image processing method and program

Info

Publication number
WO2020039922A1
WO2020039922A1 (PCT/JP2019/031042)
Authority
WO
WIPO (PCT)
Prior art keywords
image
image processing
focus position
search
orientation
Prior art date
Application number
PCT/JP2019/031042
Other languages
French (fr)
Japanese (ja)
Inventor
加藤 豊
Original Assignee
オムロン株式会社
Priority date
Filing date
Publication date
Application filed by オムロン株式会社
Publication of WO2020039922A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28 Systems for automatic generation of focusing signals
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules

Definitions

  • the present invention relates to an image processing system, an image processing method, and a program.
  • Patent Document 1 discloses a focus adjustment device that determines focus within a limited range using a contrast detection method or a phase difference detection method.
  • One method is to search for an image pattern from a plurality of images having different focus positions.
  • the search for the image pattern is executed by, for example, an image processing device.
  • In this method, an image pattern must be searched for in each of the plurality of images. Therefore, there is a problem that the processing time becomes long. For example, in a visual inspection of products flowing on a manufacturing line, processing in a short time is required. A long processing time causes a reduction in inspection throughput.
  • Another method is to search for an image after determining the focus position.
  • the imaging device images the target object while changing the focus position.
  • the image processing device evaluates the degree of focusing by image processing and determines an optimal focus position.
  • the imaging device searches for an image pattern using an image obtained by imaging the target at the optimal focus position.
  • the image processing apparatus needs to know in advance where the image of the target object is located in the image. However, since the position of the target object in the image cannot be determined in advance, it is not easy to find the optimum focus position.
  • An object of the present invention is to enable a focus position and a position and orientation in an image to be specified in a short time.
  • According to an example of the present disclosure, an image processing system includes an imaging unit that captures an image of an object and generates an image including the object, and an image processing unit that acquires the image from the imaging unit and executes image pattern matching on the image.
  • the image pattern is a pattern serving as a reference for the position and orientation in the image of the target object.
  • the imaging unit can change the focus position.
  • the image processing unit searches for the focus position and the position and orientation by executing image pattern matching while narrowing the range of the focus position, and specifies the position and orientation and the focus position.
  • the focus position and the position and orientation in the image can be specified in a short time.
  • the image processing system searches for a focus position and a position and orientation in an image from a search space having two degrees of freedom, a focus position and a position and orientation in an image.
  • By executing the image pattern matching while narrowing the range of the focus position, the degree of coincidence in the matching process is improved. By obtaining the focus position in a state where the degree of coincidence is high, the accuracy of the focus position is further improved.
  • the focus position and the position and orientation in the image can be specified in a short time. Further, the focus position and the position and orientation in the image can be specified more accurately.
  • the image processing unit shifts the search for the focus position and the position and orientation from the coarse search to the fine search.
  • the focus position and the position and orientation can be specified in a short time.
  • The transition from the coarse search to the fine search is characterized in that the amount of information is gradually increased and the search step width is gradually made finer.
  • the image processing unit sets a search range for matching in the image, and resets the search range based on the matching degree obtained from the matching result.
  • The focus position and the position and orientation in the image can be specified in a short time by resetting the search range based on the degree of coincidence.
  • the image processing unit obtains at least one candidate point from the degree of coincidence, and resets the search range so that at least one candidate point is included in the search range and the search range is narrowed.
  • The focus position and the position and orientation in the image can be specified in a short time by gradually narrowing the search range for the matching.
  • In correcting the focus position, the image processing unit performs the matching on a plurality of images having different focus positions and, based on the result of the matching, selects the image having the highest degree of focus from the plurality of images. The image processing unit reduces the change in the focus position between the plurality of images as the number of corrections of the focus position increases.
  • The focus position and the position and orientation in the image can be specified in a short time by gradually narrowing the search range of the focus position.
  • the image processing unit selects an image used for correcting the focus position by thinning out a plurality of images captured by the imaging unit while changing the focus position.
  • an image processing method is an image processing method by an image processing system including an imaging unit that captures an object to generate an image including the object, and an image processing unit.
  • the imaging unit can change the focus position when capturing the target object.
  • the image processing method performs matching of an image pattern serving as a reference of the position and orientation in the image of the target object with respect to the image including the target object generated by the imaging unit while narrowing the range of the focus position,
  • the method includes a step of searching for the focus position and the position and orientation, and a step of specifying the position and orientation and the focus position based on the search result of the searching step.
  • the focus position and the position and orientation in the image can be specified in a short time.
  • a program is a program for processing an image obtained by capturing an image of an object by an imaging device.
  • the imaging device can change the focus position when capturing an object.
  • the program causes the computer to execute matching of an image pattern serving as a reference of the position and orientation in the image of the target object with respect to the image including the target object generated by the imaging device while narrowing the range of the focus position.
  • the focus position and the position and orientation in the image can be specified in a short time.
  • FIG. 1 is a schematic diagram illustrating an outline of a visual inspection system that is one application example of an image processing system according to the present embodiment.
  • FIG. 2 is a diagram illustrating an example of the internal configuration of the imaging device.
  • FIG. 3 is a schematic diagram for explaining adjustment of the focus position.
  • FIG. 4 is a diagram illustrating an example of the configuration of a lens.
  • FIG. 5 is a diagram illustrating another example of the focus position adjusting lens.
  • FIG. 6 is a diagram schematically illustrating an image obtained by imaging by the imaging device.
  • FIG. 7 is a schematic diagram illustrating a first method for specifying the focus position and the position and orientation of a work.
  • FIG. 8 is a schematic diagram illustrating a second method for specifying the focus position and the position and orientation of a work.
  • FIG. 9 is a schematic diagram showing an area in an image designated to execute the search method according to the present embodiment.
  • FIG. 10 is a schematic diagram for explaining the search method according to the present embodiment.
  • FIG. 11 is a diagram showing a search route by the search method according to the present embodiment.
  • FIG. 12 is a flowchart showing the flow of the pre-setting for the search method according to the present embodiment.
  • FIG. 13 is a flowchart illustrating the search flow.
  • FIG. 14 is a diagram showing a first applied example of the search according to the present embodiment.
  • FIG. 15 is a schematic diagram illustrating an inspection area on a work surface.
  • FIG. 16 is a diagram illustrating an example of setting an inspection area and a reference area.
  • FIG. 17 is a diagram illustrating a second applied example of the search according to the present embodiment.
  • FIG. 18 is a diagram illustrating a third applied example of the search according to the present embodiment.
  • FIG. 1 is a schematic diagram showing one application example of the image processing system according to the present embodiment.
  • the image processing system 1 is realized, for example, as a visual inspection system.
  • The appearance inspection system images a plurality of inspection target positions on the work W placed on the stage 90 in, for example, a production line of industrial products, and performs an appearance inspection of the work W using the obtained images.
  • the work W is inspected for scratches, dirt, presence or absence of foreign matter, dimensions, and the like.
  • When the appearance inspection of the work W placed on the stage 90 is completed, the next work (not shown) is transported onto the stage 90.
  • the work W may be stationary at a predetermined position on the stage 90 in a predetermined posture.
  • the work W may be imaged while the work W moves on the stage 90.
  • the image processing system 1 includes an imaging device 10 and an image processing device 20 as basic components.
  • the image processing system 1 further includes a PLC (Programmable Logic Controller) 50 and an input / display device 60.
  • the imaging device 10 is connected to the image processing device 20.
  • the imaging device 10 captures an image of a subject (work W) present in an imaging field of view in accordance with a command from the image processing device 20, and generates image data including an image of the work W.
  • the imaging device 10 is an imaging system capable of changing a focus position.
  • the imaging device 10 and the image processing device 20 may be integrated.
  • the imaging device 10 includes a lens module with a variable focus position.
  • the focus position means a position where an image of the work W is formed.
  • the focus position of the lens module is changed according to the distance between the imaging device 10 and the work W. Accordingly, an image in which the work W is focused can be captured.
  • the imaging device 10 has an autofocus function, and can automatically focus on the work W.
  • the image processing device 20 acquires an image of the work W from the imaging device 10.
  • the image processing device 20 performs a predetermined process on the image.
  • the image processing device 20 includes a determination unit 21, an output unit 22, a storage unit 23, and a command generation unit 24.
  • the determination unit 21 performs a predetermined process on the image data generated by the imaging device 10 to determine whether the appearance of the work W is good.
  • the output unit 22 outputs the result of the determination by the determination unit 21. For example, the output unit 22 causes the input / display device 60 to display the determination result.
  • the storage unit 23 stores various data, programs, and the like. For example, the storage unit 23 stores the image data acquired from the imaging device 10 and the image data subjected to a predetermined process. The storage unit 23 may store the determination result by the determination unit 21. Further, the storage unit 23 stores a program for causing the image processing device 20 to execute various processes.
  • the command generation unit 24 receives a control command from the PLC 50 and outputs an imaging command (imaging trigger) to the imaging device 10.
  • the image processing device 20 is connected to the PLC 50.
  • the PLC 50 controls the image processing device 20.
  • the PLC 50 controls the timing at which the image processing device 20 outputs an imaging command (imaging trigger) to the imaging device 10.
  • the input / display device 60 is connected to the image processing device 20.
  • the input / display device 60 receives a user input regarding various settings of the image processing system 1. Further, the input / display device 60 displays information on the settings of the image processing system 1 and a result of the image processing of the work W by the image processing device 20 (for example, a result of determining whether the appearance of the product is good or bad).
  • the image processing device 20 performs image pattern matching on an image acquired from the imaging device 10.
  • This image pattern is a pattern serving as a reference for the position and orientation of the work W in the image.
  • the image processing device 20 searches the focus position and the position and orientation by executing image pattern matching while narrowing the range of the focus position, and specifies the position and orientation and the focus position.
  • the image processing system searches for a focus position and a position and orientation in an image from a search space having two degrees of freedom, a focus position and a position and orientation in an image.
  • the degree of matching in the matching process is improved.
  • the focus position and the position and orientation in the image can be specified in a short time. Further, the focus position and the position and orientation in the image can be specified more accurately.
  • FIG. 2 is a diagram illustrating an example of the internal configuration of the imaging device 10.
  • The imaging device 10 includes an illumination unit 11, a lens module 12, an imaging element 13, an imaging element control unit 14, a lens control unit 16, registers 15 and 17, and a communication I/F unit 18.
  • The illumination unit 11 irradiates the work W with light. Light emitted from the illumination unit 11 is reflected on the surface of the work W and enters the lens module 12. The illumination unit 11 may be omitted.
  • The lens module 12 forms an image of the reflected light from the work W on the imaging surface 13a of the imaging element 13.
  • the lens module 12 has a lens 12a, a lens group 12b, a lens 12c, a movable unit 12d, and a focus adjustment unit 12e.
  • the lens 12a is a lens for mainly changing a focus position.
  • the focus adjustment unit 12e controls the lens 12a to change the focus position.
  • the lens group 12b is a lens group for changing the focal length.
  • the zoom magnification is controlled by changing the focal length.
  • the lens group 12b is provided on the movable part 12d and is movable along the optical axis direction.
  • the lens 12c is a lens fixed at a predetermined position in the imaging device 10.
  • The imaging element 13 is a photoelectric conversion element such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor, for example, and converts light from the imaging field of view into an image signal.
  • Upon receiving an imaging command from the image processing device 20 via the communication I/F unit 18, the imaging element control unit 14 opens a shutter (not shown) to perform exposure, and generates image data based on the image signal from the imaging element 13. At this time, the imaging element control unit 14 opens and closes the shutter at a shutter speed (exposure time) corresponding to the imaging position, and generates image data of a preset resolution. Information indicating the shutter speed and the resolution corresponding to the imaging position is stored in the register 15 in advance. The imaging element control unit 14 outputs the generated image data to the image processing device 20 via the communication I/F unit 18.
  • the lens controller 16 adjusts the focus of the imaging device 10 according to the command stored in the register 17. Specifically, the lens control unit 16 controls the focus adjustment unit 12e so that the focus position changes according to the region of the work W where the work W is imaged.
  • The focus adjustment unit 12e adjusts the position of the lens 12a under the control of the lens control unit 16. That is, the lens control unit 16 controls the lens 12a so that the imaging target area of the work W is in focus. "Focused" means that an image of the imaging target area of the work W is formed on the imaging surface 13a of the imaging element 13. The lens 12a will be described later in detail.
  • the lens control unit 16 may control the movable unit 12d to adjust the position of the lens group 12b so that the size of the region included in the imaging field of view of the work W is substantially constant. In other words, the lens control unit 16 can control the movable unit 12d such that the size of a region included in the imaging field of view of the workpiece W is within a predetermined range.
  • the lens controller 16 may adjust the position of the lens group 12b according to the distance between the imaging position and the work W. Note that in this embodiment, zoom adjustment is not essential.
  • FIG. 3 is a schematic diagram for explaining the adjustment of the focus position. For simplicity, FIG. 3 shows only one lens (lens 12a).
  • Let a be the distance from the principal point O of the lens 12a to the target surface, b be the distance from the principal point O to the imaging surface 13a, and f be the distance (focal length) from the principal point O to the focal point F of the lens 12a. These distances satisfy the thin-lens relation 1/a + 1/b = 1/f ... (1).
  • the working distance (WD) can change according to the height of the surface of the work W.
  • the working distance is defined as a distance from the surface of the lens 12a on the work W side to the work W.
  • When Expression (1) is satisfied, it is possible to capture an image in a state where the surface of the work W is in focus. For example, the focus can be adjusted by changing the distance b.
  • the change amount of the distance b can be represented as an offset from a reference distance.
  • this offset is referred to as “lens extension amount”.
  • From Expression (1), the lens extension amount for obtaining an image focused on the surface of the work W can be determined.
  • the reference value of the distance b can be arbitrarily determined.
  • the reference value of the distance b may be set as the value of the focal length f.
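To make the relationship concrete, here is a minimal sketch of Expression (1), taking f as the reference value of b as described above. The numeric example and function names are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch of Expression (1): 1/a + 1/b = 1/f.
# Given the object distance a and the focal length f, compute the image
# distance b and the lens extension amount (offset of b from the reference
# value, here taken as f). All values share the same unit (e.g. mm).

def image_distance(a: float, f: float) -> float:
    """Image distance b satisfying 1/a + 1/b = 1/f."""
    assert a > f, "the object must lie beyond the focal length"
    return 1.0 / (1.0 / f - 1.0 / a)

def lens_extension(a: float, f: float) -> float:
    """Offset of b from the reference distance f."""
    return image_distance(a, f) - f

# Example: a 50 mm lens with the work surface at a = 300 mm
# gives b = 60 mm, i.e. an extension amount of 10 mm.
print(lens_extension(300.0, 50.0))  # -> 10.0
```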
  • As described above, the imaging device 10 has an autofocus function. That is, the imaging device 10 determines the degree of focusing from the image of the work W and adjusts the focus position.
  • the configuration of the lens 12a for adjusting the focus position is not particularly limited. Hereinafter, an example of the configuration of the lens 12a will be described.
  • FIG. 4 is a diagram showing an example of the configuration of the lens 12a.
  • the focus adjustment unit 12e moves the lens 12a along the optical axis direction.
  • As the lens 12a moves, the extension amount of the lens 12a changes. The lens 12a is moved so that an image of the work W is formed on the imaging surface 13a in accordance with a change in the working distance WD.
  • FIGS. 3 and 4 show an example of one lens.
  • an FA lens is often composed of a plurality of grouped lenses.
  • Even in such a case, the combined focal length f and the position of the combined principal point can be obtained using the focal length of each lens and the positional relationship between the lenses. WD can then be calculated using these characteristic values.
  • FIG. 4 shows an example in which the focus position is adjusted by a mechanical method.
  • the method of adjusting the focus position is not limited to a mechanical method.
  • FIG. 5 is a diagram illustrating another example of the focus position adjusting lens.
  • the lens 12a includes a translucent container 70, electrodes 73a, 73b, 74a, 74b, insulators 75a, 75b, and insulating layers 76a, 76b.
  • the sealed space in the translucent container 70 is filled with a conductive liquid 71 such as water and an insulating liquid 72 such as oil.
  • the conductive liquid 71 and the insulating liquid 72 are not mixed and have different refractive indexes.
  • the electrodes 73a and 73b are fixed between the insulators 75a and 75b and the translucent container 70, respectively, and are located in the conductive liquid 71.
  • the electrodes 74a and 74b are arranged near the end of the interface between the conductive liquid 71 and the insulating liquid 72.
  • An insulating layer 76a is interposed between the electrode 74a and the conductive liquid 71 and the insulating liquid 72.
  • An insulating layer 76b is interposed between the electrode 74b and the conductive liquid 71 and the insulating liquid 72.
  • the electrode 74a and the electrode 74b are arranged at positions symmetric with respect to the optical axis of the lens 12a.
  • the focus adjustment unit 12e includes a voltage source 12e1 and a voltage source 12e2.
  • the voltage source 12e1 applies a voltage Va between the electrode 74a and the electrode 73a.
  • Voltage source 12e2 applies voltage Vb between electrode 74b and electrode 73b.
  • the curvature of the interface depends on the magnitude of the voltages Va and Vb. Therefore, by changing the magnitudes of the voltages Va and Vb, the focus position of the lens 12a can be adjusted so that an image is formed on the imaging surface 13a even when the working distance WD changes.
  • voltage Va and voltage Vb are controlled to the same value. Thereby, the interface between the conductive liquid 71 and the insulating liquid 72 changes symmetrically with respect to the optical axis. However, the voltage Va and the voltage Vb may be controlled to different values. Thereby, the interface between the conductive liquid 71 and the insulating liquid 72 becomes asymmetric with respect to the optical axis, and the direction of the imaging visual field of the imaging device 10 can be changed.
  • A liquid lens and a solid lens may be combined. In that case, the position of the principal point of the lens changes in addition to the focal length f. Since the distance b changes accordingly, the focus adjustment may be performed in consideration of the change.
  • The image processing device 20 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), an auxiliary storage device, and a communication I/F, and performs information processing.
  • the auxiliary storage device includes, for example, a hard disk drive, a solid state drive, and the like, and stores a program executed by the CPU.
  • The determination unit 21 of the image processing device 20 may binarize a difference image between the image of the work under inspection and an image of a non-defective work stored in the storage unit 23 in advance, and determine the quality of the inspection target position by comparing the number of pixels exceeding a threshold value with a reference value.
  • the output unit 22 of the image processing device 20 may display the determination result on the input / display device 60.
  • FIG. 6 is a diagram schematically illustrating an image obtained by imaging by the imaging device.
  • The image 100 includes an image of the work W. Since the distance (working distance) from the lens of the imaging device 10 to the work may differ for each work, it is necessary to specify the optimum focus position for each work W in order to obtain an image in which the work W is in focus.
  • the work W is originally placed at a predetermined position on the stage 90 in a predetermined posture.
  • The work W moves below the imaging device 10 as the stage 90 moves. Due to this movement, the position of the work W may deviate from the original imaging position, or the work W may rotate on the stage 90.
  • the degree of displacement of the position and orientation may also differ for each work.
  • FIG. 7 is a schematic diagram illustrating a first method for specifying the focus position and the position and orientation of the work W.
  • the imaging device 10 captures an image of the work W while changing the focus position, and acquires a plurality (M) of images.
  • The image processing apparatus 20 searches for the image pattern of the work W in each of the M images. Then, the image processing device 20 specifies the one image in which the work W is most in focus from the M images.
  • FIG. 8 is a schematic diagram illustrating a second method for specifying the focus position and the position and orientation of the work W.
  • an image is searched after a focus position is determined.
  • the imaging device 10 images the work W while changing the focus position, and acquires M images.
  • the image processing device 20 evaluates the degree of focusing of the M images by image processing, and determines an optimal focus position.
  • the imaging device 10 selects an image obtained by imaging the work W at the optimum focus position from among the M images, and searches for an image pattern in the image.
  • the image processing device 20 needs to know the position of the image of the work W in the image and the posture of the work W in advance.
  • In these methods, the search route becomes long. In other words, it takes time to specify the optimum focus position and the position and orientation of the work. Therefore, in the present embodiment, a search method that shortens the search route is used.
  • FIG. 9 is a schematic diagram showing an area in an image designated to execute the search method according to the present embodiment.
  • a predetermined area in the image 100 is set as a reference area B1.
  • the image pattern of the reference area B1 is registered in the image processing device 20 as a model.
  • a certain range in the image 100 including the reference area B1 is set as a search range A1.
  • the search range A1 and the search range of the focus position are registered in the image processing device 20.
  • FIG. 10 is a schematic diagram for explaining the search method according to the present embodiment.
  • a group of images having different focus positions is collectively photographed within the focus position search range.
  • N images are obtained.
  • the search range of the focus position is, for example, a range in which the focus of the imaging device 10 can be controlled.
  • the focus position search range may be narrower than the focus controllable range.
  • The image processing device 20 extracts M images by thinning out the group of N images.
  • the M images are a so-called subset of a group of N images.
  • Pattern matching is performed on the extracted M images using the registered image model. As a result, one image having the highest focusing degree and the highest degree of pattern matching is selected from the M images.
  • the search area is limited based on the result of the pattern matching, and a plurality of images having different focus positions are extracted.
  • the plurality of images selected at this time include one image selected last time.
  • the degree of change of the focus position among the plurality of images is smaller than the degree of change of the focus position between the plurality of images selected in the previous search. That is, the focus position changes finely among a plurality of images.
  • the image processing device 20 determines the degree of focus of a plurality of images and selects the image with the highest degree of focus. In the image, the image processing device 20 executes pattern matching by narrowing the search range. As described above, the pattern matching is repeatedly performed while the search range is narrowed while the degree of change in the focus position is reduced.
  • FIG. 11 is a diagram showing a search route by the search method according to the present embodiment.
  • the pattern matching is repeatedly performed while limiting the search range while reducing the degree of change in the focus position.
  • the search for the focus position and the position and orientation is shifted from the coarse search to the fine search.
  • the search route can be shortened for each search. Therefore, according to the present embodiment, it is possible to specify the focus position and the position and orientation within the image in a shorter time.
  • FIG. 12 is a flowchart showing a flow of the pre-setting for the search method according to the present embodiment.
  • First, a sample of the work to be inspected is prepared. The sample is placed at a specified position (in front of the lens of the imaging device 10). Then, the lens of the imaging device 10 is controlled to focus on the object (sample).
  • In step S14, the user specifies a reference area in the image. For example, the sample image 100 is displayed on the display of the input/display device 60.
  • the designation of a reference region in the image is accepted, and the region is set as the reference region B1.
  • The image processing device 20 stores the image pattern of the reference area B1 in the storage unit 23 (step S15). As a result, the image pattern is registered in the image processing device 20 as a model.
  • In step S16, the user specifies a search range in the image.
  • the sample image 100 is displayed on the display of the input / display device 60, for example.
  • the designation of a search range in the image is accepted, and the area is set as the search range A1.
  • the image processing device 20 stores the information of the search range A1 in the storage unit 23. As a result, the specified search range is registered in the image processing apparatus 20.
  • In step S17, the user specifies a search range of the focus position.
  • the input / display device 60 receives designation of a search range of a focus position.
  • the image processing device 20 stores information on the search range of the focus position in the storage unit 23. Thereby, the search range of the designated focus position is registered in the image processing device 20.
  • FIG. 13 is a flowchart showing a search flow.
  • In step S21, the imaging device 10 collectively captures a group of images having different focus positions within the focus position search range. As a result, N (N is an integer of 2 or more) images are obtained. It is desirable that the focus positions of the images be arranged so that there are no gaps between their depths of field.
  • In step S22, the image processing apparatus 20 extracts M images (a subset) by thinning out the group of N images. M is an integer satisfying 1 ≤ M < N.
  • the extraction method is not particularly limited. For example, images may be extracted at equal intervals (for each predetermined number of images).
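As one way to read this step, the following sketch extracts a subset at equal intervals; the interval choice is an assumption, since the patent leaves the extraction method open.

```python
def thin_images(images: list, m: int) -> list:
    """Extract about m images at equal intervals from the captured group."""
    n = len(images)
    step = max(1, n // m)          # keep every step-th image
    return images[::step][:m]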
  • In step S23, the image processing device 20 performs pattern matching on the M images using the model registered in step S15 (see FIG. 12).
  • The degrees of freedom of the pattern matching are the degrees of freedom regarding X, Y, and θ. X and Y represent two orthogonal axes in the image, and θ represents the rotation angle in the image.
  • In step S24, the image processing device 20 obtains a plurality of candidate points that give a high correlation value (degree of coincidence). The attributes of each candidate point are the correlation value, X, Y, and θ.
  • For calculating the degree of coincidence, a normalized cross-correlation method can be used.
  • The normalized cross-correlation method is widely used as a method of searching for an image pattern, and is a method of statistically processing the grayscale information of an image showing an object (see, for example, "Normalized Cross-Correlation Method", Image Analysis Handbook, edited by Mikio Takagi and Hirohisa Shimoda, University of Tokyo Press).
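For reference, a minimal sketch of normalized cross-correlation between a model patch and an equally sized image window; this is the textbook formulation, not necessarily the exact variant used in the patent.

```python
import numpy as np

def ncc(window: np.ndarray, template: np.ndarray) -> float:
    """Normalized cross-correlation in [-1, 1]; 1 means a perfect match."""
    i = window.astype(np.float64) - window.mean()
    t = template.astype(np.float64) - template.mean()
    denom = np.sqrt((i * i).sum() * (t * t).sum())
    return float((i * t).sum() / denom) if denom > 0.0 else 0.0
```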
  • a correlation method as disclosed in JP-A-7-78257 may be used. In this method, a grayscale image is captured by a camera, and an evaluation value at each pixel position of the grayscale image is calculated. Then, the maximum evaluation value in the point sequence in which the evaluation value continuously exceeds the threshold value in the scanning line direction is detected.
  • an edge-based correlation method may be used.
  • the edge-based correlation method is a method of calculating a degree of coincidence using characteristics of an edge (boundary) portion of an image showing an object.
  • the degree of coincidence reflects edge strength or edge gradient.
  • Alternatively, a method as disclosed in JP-A-10-171989 may be used. In this method, the density gradient direction at each pixel of the input image is calculated, and a value for evaluating the difference between the density gradient direction of the input image and the density gradient direction of each pixel of a predetermined pattern is obtained.
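One possible reading of such a gradient-direction comparison is sketched below; the cosine-based score is an assumption chosen so that the value is 1 when the directions agree everywhere, and is not the cited publication's exact formula.

```python
import numpy as np

def gradient_direction(gray: np.ndarray) -> np.ndarray:
    """Density gradient direction (radians) at each pixel."""
    gy, gx = np.gradient(gray.astype(np.float64))
    return np.arctan2(gy, gx)

def direction_match_score(window: np.ndarray, template: np.ndarray) -> float:
    """Mean cosine of the direction difference; 1.0 when directions agree."""
    diff = gradient_direction(window) - gradient_direction(template)
    return float(np.cos(diff).mean())
```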
  • In step S25, the image processing device 20 resets the search range.
  • the image processing apparatus 20 determines the degree of focus of a plurality of images having different focus positions by limiting the search range to an area near the candidate point position. That is, the search range is narrowed to a range including the candidate points.
  • the image processing device 20 selects an image with the highest focus degree from among the plurality of images. For example, an edge is extracted from a region near a candidate point position of each image. The degree of focus is determined based on the edge.
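The patent only states that the focus degree is determined from edges near the candidate point; a common realization is an edge-sharpness measure such as the variance of the Laplacian, sketched below as an assumption.

```python
import numpy as np

def focus_degree(gray: np.ndarray) -> float:
    """Variance of a 3x3 Laplacian; sharper edges give a larger value."""
    g = gray.astype(np.float64)
    lap = (-4.0 * g[1:-1, 1:-1]
           + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(lap.var())

def select_best_focus(images, roi):
    """Pick the image whose region near the candidate point is sharpest.
    roi = (y0, y1, x0, x1) around the candidate point position."""
    y0, y1, x0, x1 = roi
    return max(images, key=lambda im: focus_degree(im[y0:y1, x0:x1]))
```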
  • In step S26, the image processing device 20 performs pattern matching on the selected image.
  • At this time, the search range may be limited by the resetting of the search range in step S25.
  • Representative parameters of pattern matching include the "search increment" and "model thinning". The "search increment" means the search step (resolution), and "model thinning" means approximately expressing the image pattern serving as the model with thinned-out pixels rather than with all of its pixels. In the coarse search, which is the initial stage of the search, the values of these parameters are large, and they are gradually reduced as the search proceeds to the detailed (dense) search.
  • Such a method is conventional in pattern matching. Conventionally, however, it has been used only within the search of a single image. In the present embodiment, the method is applied across images rather than being closed within one image. This can be expected to improve the search efficiency (speed up).
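One way to picture such a coarse-to-fine parameter schedule is sketched below; the concrete values are illustrative assumptions, not taken from the patent.

```python
# Search increment (in pixels) and model thinning factor per iteration:
# both shrink as the search moves from coarse to dense.
SCHEDULE = [
    {"search_increment_px": 8, "model_thinning": 4},  # coarse search
    {"search_increment_px": 4, "model_thinning": 2},
    {"search_increment_px": 2, "model_thinning": 1},
    {"search_increment_px": 1, "model_thinning": 1},  # dense (fine) search
]

def thinned_model(model, k: int):
    """Approximate the model pattern with every k-th pixel in each direction."""
    return model[::k, ::k]
```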
  • In step S27, the image processing device 20 obtains a candidate point that gives a high correlation value. The attributes of the candidate point are the correlation value, X, Y, and θ.
  • In step S28, the image processing device 20 determines whether a termination condition is satisfied.
  • the termination condition is not particularly limited.
  • the resolution of the final search may be defined in advance, and the entire processing may be terminated when the resolution is reached.
  • For example, the final resolution of X and Y may be one pixel, the final resolution of θ may be 1 deg, and the final resolution of the lens extension amount may be 1 mm.
  • estimation with a precision smaller than the step size may be performed by various methods (for example, parabola fitting or the like) using correlation values.
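A sketch of the three-point parabola fit mentioned here: given correlation values at three equally spaced positions, the peak can be estimated at sub-step precision with the standard vertex formula (the patent names the method but gives no formula).

```python
def parabola_peak(x0: float, h: float, c_prev: float, c0: float, c_next: float) -> float:
    """Peak position of a parabola through (x0-h, c_prev), (x0, c0), (x0+h, c_next)."""
    denom = c_prev - 2.0 * c0 + c_next
    if denom == 0.0:            # degenerate: the three points are collinear
        return x0
    return x0 + 0.5 * h * (c_prev - c_next) / denom
```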
  • If the termination condition is not satisfied (NO in S28), the process returns to step S25, and the processes of steps S25 to S28 are repeated. Thereby, the accuracy of the focus position and the position and orientation in the image is improved.
  • When the termination condition is satisfied (YES in S28), the search processing ends. At this point, the focus position and the position and orientation of the work in the image have been optimized.
  • a coarse search is performed at the beginning of the iteration, and a finer search is performed as the number of iterations increases. This makes it possible to speed up the search process without lowering the accuracy.
  • When there are a plurality of candidate points, only one point may be finally left using the correlation values, or a plurality of points may be output.
  • For example, when 20 images are captured while changing the focus position, the search according to the present embodiment is faster than a full search over all of the images.
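Putting steps S21 to S28 together, the following self-contained sketch shows one way the loop could look for translation only (rotation θ and the focus-degree selection are omitted for brevity). The helper match_at_step, the step schedule, and the narrowing margins are assumptions, not the patent's implementation.

```python
import numpy as np

def match_at_step(image, model, region, step):
    """Brute-force NCC match of `model` over `region` = (y0, y1, x0, x1),
    evaluated every `step` pixels. Returns (score, y, x)."""
    mh, mw = model.shape
    y0, y1, x0, x1 = region
    t = model.astype(np.float64) - model.mean()
    tn = np.sqrt((t * t).sum())
    best = (-2.0, y0, x0)
    for y in range(y0, min(y1, image.shape[0] - mh) + 1, step):
        for x in range(x0, min(x1, image.shape[1] - mw) + 1, step):
            w = image[y:y + mh, x:x + mw].astype(np.float64)
            w = w - w.mean()
            d = np.sqrt((w * w).sum()) * tn
            s = float((w * t).sum() / d) if d > 0.0 else 0.0
            if s > best[0]:
                best = (s, y, x)
    return best

def coarse_to_fine_search(images, model, region, final_step=1):
    """S22-S28: alternate matching and narrowing of both search spaces."""
    idxs = list(range(0, len(images), max(1, len(images) // 5)))  # S22: thinning
    step = 8                                                      # coarse increment
    while True:
        scored = [(match_at_step(images[i], model, region, step), i) for i in idxs]
        (score, y, x), best_i = max(scored)                       # S24/S27: best point
        margin = 4 * step                                         # S25: reset ranges
        region = (max(0, y - margin), y + margin, max(0, x - margin), x + margin)
        idxs = list(range(max(0, best_i - 2), min(len(images), best_i + 3)))
        if step == final_step:                                    # S28: termination
            return best_i, (y, x), score
        step = max(final_step, step // 2)                         # finer increment
```

In this sketch the image index best_i stands in for the focus position; in the actual system the lens extension amount would be refined in the same alternating fashion.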
  • FIG. 14 is a diagram showing a first applied example of the search according to the present embodiment.
  • the search according to the present embodiment can be used for the appearance inspection of the work W.
  • the work W has an area W1 and an area W2.
  • the region W1 is, for example, the surface of a transparent body (such as glass).
  • the region W2 is a region surrounding the region W1, for example, a surface of a housing of the electronic device.
  • Examples of such a work W include an electronic device having a display (for example, a smartphone or a tablet). That is, the area W1 can be a display screen.
  • FIG. 15 is a schematic diagram showing an inspection area on the work surface.
  • inspection area A2 is set in area W1 of work W.
  • an area including the inspection area A2 is imaged by the imaging device 10, and the inspection area is inspected for the presence or absence of a flaw, dirt, or foreign matter.
  • an area (reference area B1) serving as a focus reference may be set separately from the inspection area.
  • FIG. 16 is a diagram showing a setting example of the inspection area A2 and the reference area B1.
  • reference region B1 is a region including a clear pattern.
  • the reference area B1 includes a part of the area W2.
  • the reference area is an area including an end of the area W2 and includes an outline of the work W. Therefore, the reference area B1 includes a boundary between the work W and the periphery of the work W. Thereby, the reference area B1 can include a clear pattern.
  • the image of the reference area B1 has such a contrast that the degree of focus can be determined. Therefore, the focus position and the position and orientation of the work W in the image can be obtained from the reference area B1.
  • the sample work W is imaged by the imaging device 10 to obtain an image of the work W.
  • the reference region B1 and the inspection region A2 are respectively specified from the image of the work W, and the images of the reference region B1 and the inspection region A2 are registered in the image processing device 20. Further, a focus position when the reference area B1 is in focus and a focus position when the inspection area A2 is in focus are registered in the image processing apparatus 20, respectively.
  • the search according to the present embodiment can be used for searching for a focus position. Further, the search according to the present embodiment may be used to correct the deviation of the XY position of the workpiece W in the image.
  • FIG. 17 is a diagram showing a second applied example of the search according to the present embodiment.
  • a second application is picking of a workpiece.
  • A work W (for example, a part) carried on the belt conveyor 91 is gripped by the robot 30 and transferred.
  • the robot is, for example, a parallel link robot, but is not limited to this.
  • FIG. 18 is a diagram showing a third applied example of the search according to the present embodiment.
  • a third application is reading of bar codes or characters on the surface of an article (not limited to a work).
  • imaging apparatus 10 captures an image of barcode 5 printed on work W (for example, a component or a packing box) carried on belt conveyor 91.
  • the image processing device 20 acquires an image from the imaging device 10 and reads the barcode 5.
  • the present embodiment can be used not only for barcodes but also for recognition of characters or two-dimensional codes.
  • According to the present embodiment, the focus position and the image pattern can be specified accurately and in a short time without the need for an external sensor or large-scale illumination. Therefore, bar codes or characters can be read accurately in a short time.
  • The above embodiment includes the following configurations.
  • Configuration 1: An image processing system (1) comprising: an imaging unit (10) configured to capture an image of an object (W) and generate an image including the object (W); and an image processing unit (20) configured to acquire the image from the imaging unit (10) and execute matching of an image pattern on the image. The image pattern is a pattern serving as a reference for the position and orientation of the object (W) in the image. The imaging unit (10) is capable of changing a focus position. The image processing unit (20) searches for the focus position and the position and orientation by executing the matching of the image pattern while narrowing the range of the focus position, and specifies the position and orientation and the focus position.
  • Configuration 2: The image processing system (1) according to Configuration 1, wherein the image processing unit (20) shifts the search for the focus position and the position and orientation from a coarse search to a fine search.
  • Configuration 3: The image processing system (1) according to Configuration 1 or 2, wherein the image processing unit (20) sets a search range for the matching in the image and resets the search range based on the degree of coincidence obtained from the result of the matching.
  • Configuration 4: The image processing system (1) according to Configuration 3, wherein the image processing unit (20) obtains at least one candidate point from the degree of coincidence and resets the search range so that the at least one candidate point is included in the search range and the search range is narrowed.
  • Configuration 5: The image processing system (1) according to any one of Configurations 1 to 4, wherein, in correcting the focus position, the image processing unit (20) performs the matching on a plurality of images having different focus positions and, based on the result of the matching, selects the image having the highest degree of focus from the plurality of images, and the image processing unit (20) reduces the change in the focus position between the plurality of images as the number of corrections of the focus position increases.
  • An image processing method by an image processing system (1) including an imaging unit (10) that captures an object (W) to generate an image including the object (W), and an image processing unit (20). The imaging unit (10) is capable of changing a focus position when imaging the object (W). The image processing method includes: searching for the focus position and the position and orientation by executing, while narrowing the range of the focus position, matching of an image pattern serving as a reference for the position and orientation of the object (W) in the image against the image including the object (W) generated by the imaging unit (10) (S25-S28); and specifying the position and orientation and the focus position from the result of the search in the searching step.
  • A program for processing an image obtained by imaging an object (W) with an imaging device (10). The imaging device (10) is capable of changing a focus position when imaging the object (W). The program causes a computer to execute: searching for the focus position and the position and orientation by executing, while narrowing the range of the focus position, matching of an image pattern serving as a reference for the position and orientation of the object (W) in the image against the image including the object (W) generated by the imaging device (S25-S28); and specifying the position and orientation and the focus position from the result of the search.
  • 1 image processing system, 5 barcode, 10 imaging device, 11 illumination unit, 12 lens module, 12a, 12c lens, 12b lens group, 12d movable unit, 12e focus adjustment unit, 12e1, 12e2 voltage source, 13 imaging element, 13a imaging surface, 14 imaging element control unit, 15, 17 register, 16 lens control unit, 18 communication I/F unit, 20 image processing device, 21 determination unit, 22 output unit, 23 storage unit, 24 command generation unit, 30 robot, 50 PLC, 60 input/display device, 70 translucent container, 71 conductive liquid, 72 insulating liquid, 73a, 73b, 74a, 74b electrode, 75a, 75b insulator, 76a, 76b insulating layer, 90 stage, 91 belt conveyor, 100 image, A1 search range, A2 inspection area, B1 reference area, F focal point, O principal point, S11 to S17, S21 to S28 steps, W work, W1, W2 region.

Abstract

This image processing system is provided with: an imaging device (10) which images an object and generates an image including the object; and an image processing device (20) which acquires the image from the imaging device and executes image pattern matching on the image. The image pattern is a pattern that serves as a reference for the position and orientation of the object in the image. The imaging device (10) can change the focal position. The image processing device (20) searches for the focal position and the position and orientation, and specifies the position and orientation and the focal position, by executing image pattern matching while narrowing the range of the focal position.

Description

Image processing system, image processing method and program

The present invention relates to an image processing system, an image processing method, and a program.

Generally, in the field of cameras, an autofocus function for determining the focus state and moving the focus lens of a lens unit so as to focus on a subject is known. For example, Japanese Patent Laying-Open No. 2018-84701 (Patent Document 1) discloses a focus adjustment device that determines focus within a limited range using a contrast detection method or a phase difference detection method.

Patent Document 1: JP 2018-84701 A

When capturing an image of an object placed at an unknown distance from the imaging device, it may be desired to specify the position and orientation of the object in the image in addition to the focus position (the position at which the object is in focus). As methods for this purpose, for example, the following two methods are conceivable.

One method is to search for an image pattern in a plurality of images having different focus positions. The search for the image pattern is executed by, for example, an image processing device. In this method, an image pattern must be searched for in each of the plurality of images. Therefore, there is a problem that the processing time becomes long. For example, in a visual inspection of products flowing on a manufacturing line, processing in a short time is required. A long processing time causes a reduction in inspection throughput.

Another method is to search for an image after determining the focus position. For example, the imaging device images the target object while changing the focus position. The image processing device evaluates the degree of focusing by image processing and determines an optimal focus position. The imaging device searches for an image pattern using an image obtained by imaging the target at the optimal focus position. In this method, in order to obtain the optimal focus position, the image processing apparatus needs to know in advance where the image of the target object is located in the image. However, since the position of the target object in the image cannot be determined in advance, it is not easy to find the optimum focus position.

An object of the present invention is to enable a focus position and a position and orientation in an image to be specified in a short time.

According to an example of the present disclosure, an image processing system includes an imaging unit that captures an image of an object and generates an image including the object, and an image processing unit that acquires the image from the imaging unit and executes image pattern matching on the image. The image pattern is a pattern serving as a reference for the position and orientation of the target object in the image. The imaging unit can change the focus position. The image processing unit searches for the focus position and the position and orientation by executing image pattern matching while narrowing the range of the focus position, and specifies the position and orientation and the focus position.

According to this disclosure, the focus position and the position and orientation in the image can be specified in a short time. The image processing system searches for the focus position and the position and orientation in the image in a search space having degrees of freedom in two directions: the focus position and the position and orientation in the image. By executing image pattern matching while narrowing the range of the focus position, the degree of coincidence in the matching process is improved. By obtaining the focus position in a state where the degree of coincidence in the matching is high, the accuracy of the focus position is further improved. Therefore, the focus position and the position and orientation in the image can be specified in a short time. Further, the focus position and the position and orientation in the image can be specified more accurately.
In the above disclosure, the image processing unit shifts the search for the focus position and the position and orientation from a coarse search to a fine search.

According to this disclosure, the focus position and the position and orientation can be specified in a short time. The transition from the coarse search to the fine search is characterized in that the amount of information is gradually increased and the search step width is gradually made finer.

In the above disclosure, the image processing unit sets a search range for the matching in the image and resets the search range based on the degree of coincidence obtained from the result of the matching.

According to this disclosure, the focus position and the position and orientation in the image can be specified in a short time by resetting the search range based on the degree of coincidence.

In the above disclosure, the image processing unit obtains at least one candidate point from the degree of coincidence and resets the search range so that the at least one candidate point is included in the search range and the search range is narrowed.

According to this disclosure, the focus position and the position and orientation in the image can be specified in a short time by gradually narrowing the search range for the matching.

In the above disclosure, in correcting the focus position, the image processing unit performs the matching on a plurality of images having different focus positions and, based on the result of the matching, selects the image having the highest degree of focus from the plurality of images. The image processing unit reduces the change in the focus position between the plurality of images as the number of corrections of the focus position increases.

According to this disclosure, the focus position and the position and orientation in the image can be specified in a short time by gradually narrowing the search range of the focus position.

In the above disclosure, the image processing unit selects the images used for correcting the focus position by thinning out a plurality of images captured by the imaging unit while changing the focus position.

According to this disclosure, the coarse search and the fine search can be executed based on a plurality of images acquired in advance.

According to an example of the present disclosure, an image processing method is an image processing method by an image processing system including an imaging unit that captures an object to generate an image including the object, and an image processing unit. The imaging unit can change the focus position when imaging the target object. The image processing method includes: searching for the focus position and the position and orientation by executing, while narrowing the range of the focus position, matching of an image pattern serving as a reference for the position and orientation of the target object in the image against the image including the target object generated by the imaging unit; and specifying the position and orientation and the focus position from the result of the search in the searching step.

According to this disclosure, the focus position and the position and orientation in the image can be specified in a short time.

According to an example of the present disclosure, a program is a program for processing an image obtained by imaging an object with an imaging device. The imaging device can change the focus position when imaging the object. The program causes a computer to execute: searching for the focus position and the position and orientation by executing, while narrowing the range of the focus position, matching of an image pattern serving as a reference for the position and orientation of the target object in the image against the image including the target object generated by the imaging device; and specifying the position and orientation and the focus position from the result of the search in the searching step.

According to the present invention, the focus position and the position and orientation in the image can be specified in a short time.
Brief description of the drawings:
FIG. 1 is a schematic diagram illustrating an outline of a visual inspection system that is one application example of an image processing system according to the present embodiment.
FIG. 2 is a diagram illustrating an example of the internal configuration of the imaging device.
FIG. 3 is a schematic diagram for explaining adjustment of the focus position.
FIG. 4 is a diagram illustrating an example of the configuration of a lens.
FIG. 5 is a diagram illustrating another example of the focus position adjusting lens.
FIG. 6 is a diagram schematically illustrating an image obtained by imaging by the imaging device.
FIG. 7 is a schematic diagram illustrating a first method for specifying the focus position and the position and orientation of a work.
FIG. 8 is a schematic diagram illustrating a second method for specifying the focus position and the position and orientation of a work.
FIG. 9 is a schematic diagram showing an area in an image designated to execute the search method according to the present embodiment.
FIG. 10 is a schematic diagram for explaining the search method according to the present embodiment.
FIG. 11 is a diagram showing a search route by the search method according to the present embodiment.
FIG. 12 is a flowchart showing the flow of the pre-setting for the search method according to the present embodiment.
FIG. 13 is a flowchart illustrating the search flow.
FIG. 14 is a diagram showing a first applied example of the search according to the present embodiment.
FIG. 15 is a schematic diagram illustrating an inspection area on a work surface.
FIG. 16 is a diagram illustrating an example of setting an inspection area and a reference area.
FIG. 17 is a diagram illustrating a second applied example of the search according to the present embodiment.
FIG. 18 is a diagram illustrating a third applied example of the search according to the present embodiment.
§1 Application Example
First, an example of a scene to which the present invention is applied will be described with reference to FIG. 1. FIG. 1 is a schematic diagram showing one application example of the image processing system according to the present embodiment.
As shown in FIG. 1, the image processing system 1 according to the present embodiment is realized, for example, as a visual inspection system. The visual inspection system images a plurality of inspection target positions on a work W placed on a stage 90 in, for example, a production line of industrial products, and performs a visual inspection of the work W using the obtained images. In the visual inspection, the work W is inspected for scratches, dirt, the presence or absence of foreign matter, dimensions, and the like.
When the visual inspection of the work W placed on the stage 90 is completed, the next work (not shown) is conveyed onto the stage 90. When the work W is imaged, the work W may be stationary at a predetermined position on the stage 90 in a predetermined posture. Alternatively, the work W may be imaged while moving on the stage 90.
As shown in FIG. 1, the image processing system 1 includes an imaging device 10 and an image processing device 20 as basic components. In this embodiment, the image processing system 1 further includes a PLC (Programmable Logic Controller) 50 and an input/display device 60.
The imaging device 10 is connected to the image processing device 20. In accordance with a command from the image processing device 20, the imaging device 10 images a subject (the work W) present in its imaging field of view and generates image data containing an image of the work W. In the present embodiment, the imaging device 10 is an imaging system capable of changing the focus position. The imaging device 10 and the image processing device 20 may be integrated.
The imaging device 10 includes a lens module whose focus position is variable. The focus position means the position at which the image of the work W is formed. The focus position of the lens module is changed according to the distance between the imaging device 10 and the work W, so that an image in which the work W is in focus can be captured. Note that the imaging device 10 has an autofocus function and can automatically focus on the work W.
The image processing device 20 acquires the image of the work W from the imaging device 10 and executes predetermined processing on the image. The image processing device 20 includes a determination unit 21, an output unit 22, a storage unit 23, and a command generation unit 24.
The determination unit 21 determines the quality of the appearance of the work W by executing predetermined processing on the image data generated by the imaging device 10. The output unit 22 outputs the determination result of the determination unit 21; for example, it causes the input/display device 60 to display the result.
The storage unit 23 stores various data, programs, and the like. For example, the storage unit 23 stores the image data acquired from the imaging device 10 and image data that has undergone predetermined processing. The storage unit 23 may also store the determination results of the determination unit 21. Further, the storage unit 23 stores programs for causing the image processing device 20 to execute various processes.
The command generation unit 24 receives a control command from the PLC 50 and outputs an imaging command (imaging trigger) to the imaging device 10.
The image processing device 20 is connected to the PLC 50, which controls the image processing device 20. For example, the PLC 50 controls the timing at which the image processing device 20 outputs the imaging command (imaging trigger) to the imaging device 10.
The input/display device 60 is connected to the image processing device 20. The input/display device 60 accepts user input regarding various settings of the image processing system 1. It also displays information on the settings of the image processing system 1 and the results of the image processing of the work W by the image processing device 20 (for example, the result of determining whether the appearance of a product is good or bad).
In the present embodiment, the image processing device 20 executes matching of an image pattern against the image acquired from the imaging device 10. This image pattern serves as a reference for the position and orientation of the work W in the image. By executing the pattern matching while narrowing the range of the focus position, the image processing device 20 searches for the focus position and the position and orientation, and specifies them. The image processing system searches a search space with two kinds of degrees of freedom: the focus position and the position and orientation in the image. Executing the pattern matching while narrowing the range of the focus position improves the degree of coincidence in the matching process, and obtaining the focus position in a state where the degree of coincidence is high further improves the accuracy of the focus position. Therefore, the focus position and the position and orientation in the image can be specified in a short time, and they can also be specified more accurately.
FIG. 2 is a diagram showing an example of the internal configuration of the imaging device 10. As shown in FIG. 2, the imaging device 10 includes an illumination unit 11, a lens module 12, an imaging element 13, an imaging element control unit 14, a lens control unit 16, registers 15 and 17, and a communication I/F unit 18.
The illumination unit 11 irradiates the work W with light. The light emitted from the illumination unit 11 is reflected by the surface of the work W and enters the lens module 12. The illumination unit 11 may be omitted.
The lens module 12 forms an image of the reflected light from the work W on the imaging surface 13a of the imaging element 13. The lens module 12 has a lens 12a, a lens group 12b, a lens 12c, a movable unit 12d, and a focus adjustment unit 12e. The lens 12a is a lens mainly for changing the focus position. The focus adjustment unit 12e controls the lens 12a to change the focus position.
The lens group 12b is a lens group for changing the focal length; the zoom magnification is controlled by changing the focal length. The lens group 12b is mounted on the movable unit 12d and can move along the optical axis direction. The lens 12c is a lens fixed at a predetermined position in the imaging device 10.
The imaging element 13 is a photoelectric conversion element such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and converts light from the imaging field of view into an image signal.
Upon receiving an imaging command from the image processing device 20 via the communication I/F unit 18, the imaging element control unit 14 opens a shutter (not shown) for exposure and generates image data based on the image signal from the imaging element 13. At this time, the imaging element control unit 14 opens and closes the shutter at the shutter speed (exposure time) corresponding to the imaging position and generates image data of a preset resolution. Information indicating the shutter speed and resolution corresponding to each imaging position is stored in advance in the register 15. The imaging element control unit 14 outputs the generated image data to the image processing device 20 via the communication I/F unit 18.
The lens control unit 16 adjusts the focus of the imaging device 10 according to the commands stored in the register 17. Specifically, the lens control unit 16 controls the focus adjustment unit 12e so that the focus position changes according to the region of the work W to be imaged. Under the control of the lens control unit 16, the focus adjustment unit 12e adjusts the position of the lens 12a. That is, the lens control unit 16 controls the lens 12a so that the imaging target region of the work W is in focus. "In focus" means that an image of the imaging target region of the work W is formed on the imaging surface 13a of the imaging element 13. The lens 12a will be described in detail later.
The lens control unit 16 may control the movable unit 12d to adjust the position of the lens group 12b so that the size of the region of the work W included in the imaging field of view is substantially constant. In other words, the lens control unit 16 can control the movable unit 12d so that the size of the region of the work W included in the imaging field of view falls within a predetermined range. The lens control unit 16 may adjust the position of the lens group 12b according to the distance between the imaging position and the work W. Note that zoom adjustment is not essential in this embodiment.
§2 Specific Example
<A. Configuration example for focus adjustment>
FIG. 3 is a schematic diagram for explaining the adjustment of the focus position. For simplicity, FIG. 3 shows only one lens (the lens 12a).
As shown in FIG. 3, let a be the distance from the principal point O of the lens 12a to the target surface (the surface of the work W), b the distance from the principal point O of the lens 12a to the imaging surface 13a, and f the distance (focal length) from the principal point O of the lens 12a to the focal point F of the lens 12a. When the position of the image of the work W coincides with the position of the imaging surface 13a, the following expression (1) holds.
1/a + 1/b = 1/f ... (1)
The working distance (WD) can change according to the height of the surface of the work W. The working distance is defined as the distance from the work W-side surface of the lens 12a to the work W. When expression (1) holds, an image in which the surface of the work W is in focus can be captured. The focus can be adjusted, for example, by changing the distance b.
The amount of change in the distance b can be expressed as an offset from a reference distance. In the present embodiment, this offset is called the "lens extension amount". By associating the distance from the work W-side surface of the lens 12a to the surface of the work W with the lens extension amount, the lens extension amount for obtaining an image focused on the surface of the work W can be determined. The reference value of the distance b can be set arbitrarily; for example, it may be set to the value of the focal length f.
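To make relation (1) concrete, the following is a minimal Python sketch of how the image distance b and the lens extension amount could be computed; the millimeter units, the function names, and the choice of f as the reference distance are illustrative assumptions, not part of the embodiment.

```python
def image_distance(a_mm: float, f_mm: float) -> float:
    """Distance b from the lens principal point to the in-focus image
    plane, from the thin-lens relation 1/a + 1/b = 1/f (expression (1))."""
    if a_mm <= f_mm:
        raise ValueError("object must be farther than the focal length")
    return 1.0 / (1.0 / f_mm - 1.0 / a_mm)


def lens_extension(a_mm: float, f_mm: float, b_ref_mm: float) -> float:
    """Lens extension amount: offset of b from a reference distance b_ref
    (for example, the focal length f)."""
    return image_distance(a_mm, f_mm) - b_ref_mm


# Example: f = 50 mm and an object at a = 500 mm give b ≈ 55.6 mm,
# so with b_ref = f the extension amount is ≈ 5.6 mm.
print(round(lens_extension(500.0, 50.0, 50.0), 1))  # 5.6
```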
As described above, the imaging device 10 has an autofocus function. Accordingly, the imaging device 10 determines the degree of focus from the image of the work W and adjusts the focus position. The configuration of the lens 12a for adjusting the focus position is not particularly limited. Examples of the configuration of the lens 12a are described below.
FIG. 4 is a diagram showing an example of the configuration of the lens 12a. In the configuration shown in FIG. 4, the focus adjustment unit 12e moves the lens 12a along the optical axis direction. Changing the position of the lens 12a changes the extension amount of the lens 12a. The lens 12a is therefore moved so that the image of the work W is formed on the imaging surface 13a as the working distance WD changes.
FIGS. 3 and 4 show an example with a single lens. In practice, an FA lens is usually composed of a plurality of lenses combined into a group. Even for such a combined lens, however, the combined focal length f and the position of the combined principal point can be obtained from the focal length of each lens and the positional relationship between the lenses, and the WD can be calculated using these characteristic values.
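As one illustration of such a combination, the standard formula for two thin lenses with focal lengths f1 and f2 separated by a distance d is 1/f = 1/f1 + 1/f2 − d/(f1·f2). The sketch below assumes this textbook relation and is not taken from the embodiment itself.

```python
def combined_focal_length(f1_mm: float, f2_mm: float, d_mm: float) -> float:
    """Effective focal length of two thin lenses separated by d_mm,
    using the standard relation 1/f = 1/f1 + 1/f2 - d/(f1*f2)."""
    return 1.0 / (1.0 / f1_mm + 1.0 / f2_mm - d_mm / (f1_mm * f2_mm))


# Two 100 mm lenses 20 mm apart behave like a single ~55.6 mm lens.
print(round(combined_focal_length(100.0, 100.0, 20.0), 1))  # 55.6
```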
FIG. 4 shows an example in which the focus position is adjusted mechanically. However, the method of adjusting the focus position is not limited to a mechanical one. FIG. 5 is a diagram showing another example of the focus position adjusting lens.
FIG. 5 illustrates a liquid lens. The lens 12a includes a translucent container 70, electrodes 73a, 73b, 74a, and 74b, insulators 75a and 75b, and insulating layers 76a and 76b.
The sealed space in the translucent container 70 is filled with a conductive liquid 71 such as water and an insulating liquid 72 such as oil. The conductive liquid 71 and the insulating liquid 72 do not mix and have different refractive indexes.
The electrodes 73a and 73b are fixed between the insulators 75a and 75b and the translucent container 70, respectively, and are located in the conductive liquid 71.
The electrodes 74a and 74b are arranged near the ends of the interface between the conductive liquid 71 and the insulating liquid 72. An insulating layer 76a is interposed between the electrode 74a and the two liquids, and an insulating layer 76b is interposed between the electrode 74b and the two liquids. The electrodes 74a and 74b are arranged at positions symmetric with respect to the optical axis of the lens 12a.
In the configuration shown in FIG. 5, the focus adjustment unit 12e includes voltage sources 12e1 and 12e2. The voltage source 12e1 applies a voltage Va between the electrode 74a and the electrode 73a, and the voltage source 12e2 applies a voltage Vb between the electrode 74b and the electrode 73b.
When the voltage Va is applied between the electrodes 74a and 73a, the conductive liquid 71 is pulled toward the electrode 74a. Similarly, when the voltage Vb is applied between the electrodes 74b and 73b, the conductive liquid 71 is pulled toward the electrode 74b. As a result, the curvature of the interface between the conductive liquid 71 and the insulating liquid 72 changes. Since the two liquids have different refractive indexes, the change in the curvature of the interface changes the focal length of the lens 12a (corresponding to the focal length f shown in FIG. 3). The curvature of the interface depends on the magnitudes of the voltages Va and Vb. Therefore, by changing Va and Vb, the focus position of the lens 12a can be adjusted so that an image is formed on the imaging surface 13a even when the working distance WD changes.
Normally, the voltages Va and Vb are controlled to the same value, so that the interface between the conductive liquid 71 and the insulating liquid 72 changes symmetrically with respect to the optical axis. However, Va and Vb may be controlled to different values. In that case, the interface between the two liquids becomes asymmetric with respect to the optical axis, and the direction of the imaging field of view of the imaging device 10 can be changed.
Furthermore, a liquid lens and a solid lens may be combined. In this case, the position of the principal point of the lens changes in addition to the focal length f. Since the distance b therefore changes, the focus adjustment may be performed in consideration of that change.
<B. Image processing device>
The image processing device 20 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), an auxiliary storage device, a communication I/F, and the like, and performs information processing. The auxiliary storage device is composed of, for example, a hard disk drive or a solid state drive, and stores programs executed by the CPU.
The determination unit 21 of the image processing device 20 may, for example, binarize a difference image between the captured image and an image of a non-defective work stored in advance in the storage unit 23, and determine the quality of the inspection target position by comparing the number of pixels exceeding a threshold value against a reference value. The output unit 22 of the image processing device 20 may display the determination result on the input/display device 60.
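A minimal sketch of such a quality check follows, assuming 8-bit grayscale images that have already been aligned to the same position and orientation; the parameter names and threshold values are illustrative, not taken from the embodiment.

```python
import numpy as np


def inspect(image: np.ndarray, good_image: np.ndarray,
            diff_threshold: int = 30, max_defect_pixels: int = 50) -> bool:
    """Binarize the difference from a stored non-defective image and
    compare the count of out-of-tolerance pixels against a reference."""
    diff = np.abs(image.astype(np.int16) - good_image.astype(np.int16))
    defect_pixels = int(np.count_nonzero(diff > diff_threshold))
    return defect_pixels <= max_defect_pixels  # True: judged good
```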
<C. Challenges in searching for the focus position and the position and orientation of the work>
FIG. 6 is a diagram schematically showing an image obtained by imaging with the imaging device. As shown in FIG. 6, the image 100 contains an image of the work W. Since the distance from the lens of the imaging device 10 to the work (the working distance) can differ from work to work, an optimum focus position must be specified for each work W in order to obtain an in-focus image of the work W.
Furthermore, for imaging, the work W is originally supposed to be placed at a predetermined position on the stage 90 in a predetermined posture. However, the work W is moved under the imaging device 10 by moving the stage 90. Because of this movement, the position of the work W may deviate from its intended imaging position, or the work W may have rotated on the stage 90. The degree of deviation in position and orientation can also differ from work to work.
Therefore, when imaging the work W, it is necessary to specify the position and orientation of the work W in the image 100 in addition to the focus position of the work W. For example, the following two methods are conceivable for this purpose.
FIG. 7 is a schematic diagram explaining a first method for specifying the focus position and the position and orientation of the work W. As shown in FIG. 7, the imaging device 10 images the work W while changing the focus position and acquires a plurality of (M) images. The image processing device 20 searches for the image pattern of the work W in each of the M images. The image processing device 20 then specifies, from the M images, the single image in which the work W is most in focus.
This method has the problem of a long processing time. For example, in the visual inspection of products flowing on a manufacturing line, processing in a short time is required, and a long processing time reduces the inspection throughput.
FIG. 8 is a schematic diagram explaining a second method for specifying the focus position and the position and orientation of the work W. In the second method, the image is searched after the focus position is determined. As in the first method, the imaging device 10 images the work W while changing the focus position and acquires M images. The image processing device 20 evaluates the degree of focus of the M images by image processing and determines the optimum focus position. The imaging device 10 selects, from the M images, the image obtained by imaging the work W at that optimum focus position, and the image pattern is searched for in that image.
With this method, in order to obtain the optimum focus position, the image processing device 20 needs to know in advance the position of the image of the work W in the image and the posture of the work W. However, as described above, the position and orientation of the work W can differ from work to work and therefore cannot be specified in advance. As a result, the search path becomes long; in other words, it takes time to specify the optimum focus position and the position and orientation of the work. In the present embodiment, therefore, a search method that shortens the search path is used.
<D. Search method according to the present embodiment>
FIG. 9 is a schematic diagram showing regions in an image designated for executing the search method according to the present embodiment. As shown in FIG. 9, a predetermined region in the image 100 is set as a reference region B1. The image pattern of the reference region B1 is registered in the image processing device 20 as a model. Next, a certain range in the image 100 including the reference region B1 is set as a search range A1. Furthermore, the search range A1 and the search range of the focus position are registered in the image processing device 20.
FIG. 10 is a schematic diagram for explaining the search method according to the present embodiment. As shown in FIG. 10, in the present embodiment, a group of images with different focus positions is first captured together within the focus position search range, yielding N images. The focus position search range is, for example, the range over which the focus of the imaging device 10 can be controlled, but it may also be narrower than that range.
Next, the image processing device 20 extracts M images by thinning out the group of N images; the M images are, so to speak, a subset of the N images. Pattern matching is executed on the extracted M images using the registered model. From the M images, the single image with the highest degree of focus and degree of pattern matching is thereby selected.
Subsequently, the search region is limited based on the result of the pattern matching, and a plurality of images with different focus positions are extracted. The images selected at this time include the single image selected previously. Moreover, the step in focus position between these images is smaller than the step between the images selected in the previous round; that is, the focus position changes more finely among them. The image processing device 20 determines the degree of focus of these images and selects the image with the highest degree of focus. On that image, the image processing device 20 executes pattern matching with a narrowed search range. In this way, pattern matching is repeated with a progressively narrower search range and a progressively smaller focus position step.
FIG. 11 is a diagram showing the search path of the search method according to the present embodiment. As shown in FIG. 11, pattern matching is repeated while reducing the step in focus position and limiting the search range. The search for the focus position and the position and orientation thus shifts from a coarse search to a fine search, and the search path becomes shorter with each iteration. According to the present embodiment, therefore, the focus position of the object and its position and orientation in the image can be specified in a shorter time.
<E. Flow of the search method according to the present embodiment>
The processing described below is executed by the determination unit 21 of the image processing device 20 reading out a program stored in the storage unit 23.
FIG. 12 is a flowchart showing the flow of the pre-setting for the search method according to the present embodiment. Referring to FIG. 12, first, in step S11, a sample of the inspection object (work) is prepared. In step S12, the sample is placed at a specified position (in front of the lens of the imaging device 10). In step S13, the lens of the imaging device 10 is controlled to focus on the object (the sample).
In step S14, the user designates a reference region in the image. For example, the image 100 of the sample is displayed on the display of the input/display device 60. When the user operates an input device such as a pointing device, the designation of a region in the image is accepted, and that region is set as the reference region B1. In step S15, the image processing device 20 stores the image pattern of the reference region B1 in the storage unit 23. The image pattern is thereby registered in the image processing device 20 as a model.
In step S16, the user designates a search range in the image. As in step S14, the image 100 of the sample is displayed on the display of the input/display device 60, for example. When the user operates an input device such as a pointing device, the designation of a search range in the image is accepted, and that region is set as the search range A1. The image processing device 20 stores the information of the search range A1 in the storage unit 23, whereby the designated search range is registered in the image processing device 20.
In step S17, the user designates a search range of the focus position. The input/display device 60 accepts the designation of the focus position search range. The image processing device 20 stores the information of the focus position search range in the storage unit 23, whereby the designated focus position search range is registered in the image processing device 20.
FIG. 13 is a flowchart showing the flow of the search. Referring to FIG. 13, in step S21, the imaging device 10 captures together a group of images with different focus positions within the focus position search range. As a result, N images (N is an integer of 2 or more) are obtained. The focus positions of the images are desirably arranged so that there are no gaps in depth-of-field coverage.
In step S22, the image processing device 20 extracts M images (a subset) by thinning out the group of N images, where M is an integer satisfying 1 < M < N. The extraction method is not particularly limited; for example, the images may be extracted at equal intervals (every predetermined number of images).
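For example, equal-interval thinning could be written as follows; this is a minimal illustrative helper, not a prescribed implementation.

```python
def thin_out(images: list, m: int) -> list:
    """Extract an M-image subset from the N captured images at
    (roughly) equal intervals."""
    step = max(1, len(images) // m)
    return images[::step][:m]


# 20 images thinned to M = 5 keeps indices 0, 4, 8, 12, 16.
```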
In step S23, the image processing device 20 executes pattern matching on the M images using the model registered in step S15 (see FIG. 12). The degrees of freedom of the pattern matching are X, Y, and θ, where X and Y represent the two orthogonal axes in the image and θ represents the rotation angle in the image.
In step S24, the image processing device 20 obtains a plurality of candidate points that give high correlation values (degrees of coincidence). The attributes of a candidate point are the correlation value, X, Y, and θ.
Various known methods can be applied to calculate the correlation value. One example is the normalized cross-correlation method, which is widely used for searching for image patterns and statistically processes the grayscale information of an image showing an object (see, for example, "Normalized cross-correlation" in the Image Analysis Handbook, edited by Mikio Takagi and Hirohisa Shimoda, University of Tokyo Press). Alternatively, a correlation method such as that disclosed in JP-A-7-78257 may be used. In that method, a grayscale image is captured by a camera, an evaluation value is calculated at each pixel position of the grayscale image, and the maximum evaluation value is detected within point sequences whose evaluation values are continuously at or above a threshold in the scanning line direction.
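As an illustration of the normalized cross-correlation statistic (zero-mean form) between the model and an equally sized image window, a minimal NumPy sketch follows; it shows the score only, not the full search over X, Y, and θ.

```python
import numpy as np


def ncc(window: np.ndarray, model: np.ndarray) -> float:
    """Zero-mean normalized cross-correlation between two equally sized
    grayscale patches; the result lies in [-1, 1]."""
    w = window.astype(np.float64) - window.mean()
    m = model.astype(np.float64) - model.mean()
    denom = np.sqrt((w * w).sum() * (m * m).sum())
    return float((w * m).sum() / denom) if denom > 0 else 0.0
```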
An edge-based correlation method may also be used as the correlation method. Edge-based correlation calculates the degree of coincidence using the features of edge (boundary) portions of an image showing the object; the degree of coincidence reflects the edge strength or edge gradient. For example, a method such as that disclosed in JP-A-10-171989 may be used, in which the density gradient direction at each pixel of the input image is calculated and a value evaluating the difference between the density gradient direction of the input image and that of each pixel of a predetermined pattern is obtained.
In step S25, the image processing device 20 resets the search range. In one embodiment, the image processing device 20 limits the search range to a region near the candidate point positions and determines the degree of focus of a plurality of images with different focus positions. That is, the search range is narrowed to a range containing the candidate points. The image processing device 20 selects the image with the highest degree of focus from among the plurality of images. For example, edges are extracted from the region near the candidate point position of each image, and the degree of focus is determined based on those edges.
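The embodiment does not fix a particular focus measure; one common edge-based choice is the mean squared Sobel gradient magnitude (the Tenengrad measure), sketched here with OpenCV as an illustrative assumption.

```python
import cv2
import numpy as np


def focus_score(gray_roi: np.ndarray) -> float:
    """Edge-based sharpness of the region near a candidate point:
    mean squared Sobel gradient magnitude (higher = better focused)."""
    gx = cv2.Sobel(gray_roi, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray_roi, cv2.CV_64F, 0, 1, ksize=3)
    return float(np.mean(gx * gx + gy * gy))
```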
In step S26, the image processing device 20 executes pattern matching on the selected image. At this time, the search range may be limited by the resetting of the search range. Representative parameters of pattern matching are the "search step" and "model thinning". The search step is the increment (resolution) of the search; model thinning represents the model image pattern approximately by thinned-out pixels instead of all pixels. In the coarse search at the initial stage of the search, these parameter values are kept large, and they are gradually reduced as the search moves toward the detailed (fine) search. Such a technique is standard in pattern matching; conventionally, however, it has been used only within the search of a single image. In the present embodiment, the technique is applied across images rather than within a single image, which can be expected to make the search more efficient (faster).
In step S27, the image processing device 20 obtains candidate points giving high correlation values. As described above, the attributes of a candidate point are the correlation value, X, Y, and θ.
In step S28, the image processing device 20 determines whether the termination condition is satisfied. The termination condition is not particularly limited. For example, a final search resolution may be defined in advance and the entire process terminated when that resolution is reached. For example, the final resolution may be 1 pixel for the position in the XY directions, 1 deg for the angle, and 1 mm of lens extension amount for the focus position.
After the search is completed, estimation finer than the step size (sub-pixel estimation) may be performed by various methods using the correlation values (for example, parabola fitting).
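For the parabola fitting mentioned above, the standard three-point peak interpolation is sketched below; it refines a peak found at integer steps to sub-step accuracy.

```python
def parabolic_peak_offset(y_prev: float, y_peak: float, y_next: float) -> float:
    """Sub-step offset of the true peak, from three correlation values
    sampled one step before, at, and one step after the discrete peak.
    Fitting a parabola gives an offset in (-0.5, +0.5) steps."""
    denom = y_prev - 2.0 * y_peak + y_next
    if denom == 0.0:
        return 0.0
    return 0.5 * (y_prev - y_next) / denom
```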
If the termination condition is not satisfied, the process returns to step S25, and steps S25 to S28 are repeated, which increases the accuracy of the focus position and the position and orientation in the image. When the termination condition is satisfied (YES in S28), the search process ends, with the focus position and the position and orientation of the work in the image in an optimized state.
In the present embodiment, a coarse search is executed in the early iterations, and a finer search is executed as the number of iterations increases. This speeds up the search process without lowering accuracy. When there are multiple candidate points, the correlation values may be used to finally keep only one point, or multiple points may be output.
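Putting steps S21 to S28 together, the overall loop might be sketched as follows. The interfaces capture_at, match, and focus_score, as well as the schedule of search steps and the number of rounds, are hypothetical; they stand in for the imaging and matching operations described above and do not reproduce the embodiment's exact control flow.

```python
def coarse_to_fine_search(capture_at, match, focus_score,
                          focus_range, model, m_subset=5, rounds=3):
    """Sketch of the search over focus position and image position/orientation.

    capture_at(z)                  -> image captured at focus position z
    match(image, model, region, s) -> (score, x, y, theta) at search step s
    focus_score(image, region)     -> sharpness near the candidate region
    """
    lo, hi = focus_range
    step = 8            # coarse search step (pixels); halved each round
    region = None       # None = the whole registered search range A1
    # Coarse stage (S21-S24): match on a thinned set of focus positions.
    zs = [lo + (hi - lo) * i / (m_subset - 1) for i in range(m_subset)]
    candidates = [(match(capture_at(z), model, region, step), z) for z in zs]
    (score, x, y, theta), z_best = max(candidates)
    # Refinement rounds (S25-S28): narrow both the focus range and the
    # search region around the best candidate, with a finer step each time.
    for _ in range(rounds):
        span = (hi - lo) / 4
        lo, hi = z_best - span, z_best + span
        zs = [lo + (hi - lo) * i / (m_subset - 1) for i in range(m_subset)]
        region = (x, y, theta)        # search only near the last candidate
        step = max(1, step // 2)
        images = [(z, capture_at(z)) for z in zs]
        z_best, img = max(images, key=lambda zi: focus_score(zi[1], region))
        score, x, y, theta = match(img, model, region, step)
    return z_best, x, y, theta, score
```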
<F. Effects of the present embodiment>
According to the present embodiment, the search can be performed in a short time. This effect is explained below with a specific example under the following preconditions.
・Image size: 1600 × 1200 pixels
・Reference pattern size: 300 × 300 pixels
・Search range: 1000 × 1000 pixels
・WD search range: 100 mm to 500 mm
・Time required to search for the pattern in one image: 20 ms ... (1) (generally the sum of the times (2) and (3) below)
・Time required to coarsely search for the pattern in one image: 10 ms ... (2) (candidate points are obtained)
・Time required to finely search for the pattern around a candidate point of an image: 10 ms ... (3)
・Time required for one image capture with the focus position changed from the current position: 10 ms
・Number of focus position variations in the WD search range: 20
・Time required to calculate the focus evaluation value: 5 ms
(In the case of full search)
・Time required for image capture while changing the focus position: 10 ms × 20 = 200 ms
・Image pattern search time: 20 ms × 20 images = 400 ms
・Total time: 600 ms
(In the case of the search according to the present embodiment)
・Time required for image capture at thinned-out focus positions: 10 ms × 5 = 50 ms
・Time to coarsely search for the image pattern in the thinned image set: 10 ms × 5 images = 50 ms
・Time required for image capture around the found candidate: 10 ms × 4 = 40 ms
・Time required for the focus position search: 5 ms × 4 = 20 ms
・Time to finely search for the image pattern around the single candidate point: 10 ms × 1 = 10 ms
・Total time: 170 ms
As described above, the search according to the present embodiment is about three to four times faster than the conventional full search. According to the above example, the number of captured images is nine (= 5 + 4). However, as in the case of full search, 20 images were captured by changing the focus position. Also in this case, the search according to the present embodiment is faster than the full search. The time required for image capturing is 10 ms × 20 times = 200 ms, but the time required for other processing is the same, so the total time is 320 ms. Therefore, the search according to the present embodiment is about twice as fast as the full search.
<G. Application examples of the search according to the present embodiment>
FIG. 14 is a diagram showing a first application example of the search according to the present embodiment. As shown in FIG. 14, the search according to the present embodiment can be used for the visual inspection of the work W. In the example shown in FIG. 14, the work W has a region W1 and a region W2. The region W1 is, for example, the surface of a transparent body (such as glass). The region W2 is a region surrounding the region W1, for example, the surface of a housing of an electronic device. An example of such a work W is an electronic device having a display (for example, a smartphone or a tablet); in that case, the region W1 can be the display screen.
FIG. 15 is a schematic diagram showing an inspection region on the work surface. Referring to FIG. 15, an inspection region A2 is set in the region W1 of the work W. In the visual inspection, a region including the inspection region A2 is imaged by the imaging device 10, and the inspection region is inspected for scratches, dirt, the presence or absence of foreign matter, and the like.
In this example, a region serving as the focus reference (the reference region B1) may be set separately from the inspection region. FIG. 16 is a diagram showing a setting example of the inspection region A2 and the reference region B1. Referring to FIG. 16, the reference region B1 is a region containing a clear pattern. In the example shown in FIG. 16, the reference region B1 includes part of the region W2; for example, it is a region including an edge of the region W2 and contains the outline of the work W. The reference region B1 therefore includes the boundary between the work W and its surroundings, and can thus contain a clear pattern. In this way, the image of the reference region B1 has enough contrast for the degree of focus to be determined, so the focus position and the position and orientation of the work W in the image can be obtained from the reference region B1.
The focus position is set prior to the actual inspection. A sample work W is imaged by the imaging device 10 to acquire an image of the work W. The reference region B1 and the inspection region A2 are designated in the image of the work W, and the images of the reference region B1 and the inspection region A2 are registered in the image processing device 20. Furthermore, the focus position at which the reference region B1 is in focus and the focus position at which the inspection region A2 is in focus are each registered in the image processing device 20. The search according to the present embodiment can be used to search for the focus position, and may also be used to correct deviations in the XY position of the work W in the image.
FIG. 17 is a diagram showing a second application example of the search according to the present embodiment. The second application example is the picking of works. As shown in FIG. 17, a work W (for example, a part) carried on a belt conveyor 91 is gripped and transferred by a robot 30. The robot is, for example, a parallel link robot, but is not limited to this.
Conventionally, parts have been recognized with a two-dimensional camera. However, when parts lie on their sides or overlap one another, gripping can fail. In such cases, it is desirable to avoid ungrippable parts and reliably grip only the grippable ones. A three-dimensional sensor can recognize the height and posture of a part and thus handle such situations, but three-dimensional sensors have the problem of being more expensive than two-dimensional cameras. According to the present embodiment, the position and height of a part can be recognized without using a special sensor, so only the grippable parts can be reliably gripped with a low-cost system.
FIG. 18 is a diagram showing a third application example of the search according to the present embodiment. The third application example is the reading of barcodes or characters on the surface of an article (not limited to a work). Referring to FIG. 18, the imaging device 10 images a barcode 5 printed on a work W (for example, a part or a packing box) carried on the belt conveyor 91. The image processing device 20 acquires the image from the imaging device 10 and reads the barcode 5. The present embodiment is not limited to barcodes and can also be used for the recognition of characters or two-dimensional codes.
When products of different types flow on the belt conveyor 91, the barcode 5 and the like must be read from works of various heights. In conventional imaging methods, to obtain an in-focus image, reading has been performed with the lens aperture stopped down (that is, with a large F-number) to deepen the depth of field. However, stopping down the lens aperture reduces the amount of light entering the imaging device, so large illumination equipment had to be prepared to compensate for the light quantity. Another method combines a height sensor with a variable focus lens, but in that case a height sensor must be prepared separately.
According to the present embodiment, the focus position and the image pattern can be specified quickly and accurately without requiring an external sensor or large illumination, and barcodes, characters, and the like can be read accurately in a short time.
<H. Appendix>
As described above, the present embodiment includes the following disclosures.
(Configuration 1)
An image processing system (1) comprising:
an imaging unit (10) that images an object (W) and generates an image containing the object (W); and
an image processing unit (20) that acquires the image from the imaging unit (10) and executes matching of an image pattern against the image,
wherein the image pattern is a pattern serving as a reference for the position and orientation of the object (W) in the image,
the imaging unit (10) is capable of changing a focus position, and
the image processing unit (20) searches for the focus position and the position and orientation by executing the matching of the image pattern while narrowing the range of the focus position, and specifies the position and orientation and the focus position.
(Configuration 2)
The image processing system (1) according to Configuration 1, wherein the image processing unit (20) shifts the search for the focus position and the position and orientation from a coarse search to a fine search.
(Configuration 3)
The image processing system (1) according to Configuration 1 or 2, wherein the image processing unit (20) sets a search range for the matching in the image and resets the search range based on the degree of coincidence obtained from the result of the matching.
(Configuration 4)
The image processing system (1) according to Configuration 3, wherein the image processing unit (20) obtains at least one candidate point from the degree of coincidence and resets the search range so that the at least one candidate point is included in the search range and the search range is narrowed.
(Configuration 5)
The image processing system (1) according to any one of Configurations 1 to 4, wherein, in correcting the focus position, the image processing unit (20) executes the matching on a plurality of images with different focus positions and, based on the result of the matching, selects from the plurality of images the image with the highest degree of focus, and
the image processing unit (20) reduces the change in the focus position between the plurality of images as the number of corrections of the focus position increases.
 (Configuration 6)
 The image processing system (1) according to Configuration 5, wherein the image processing unit (20) selects the images used for correcting the focus position by thinning out a plurality of images captured by the imaging unit (10) while changing the focus position.
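 Thinning in Configuration 6 can be as simple as keeping every k-th frame of a focus sweep; the factor k is an illustrative parameter, not taken from the disclosure.

# Hypothetical thinning of a focus sweep (Configuration 6).

def thin_sweep(frames, k=4):
    """frames: list of (focus, image) pairs from a sweep; keeps every k-th."""
    return frames[::k]

 Running the early, coarse corrections on the thinned subset keeps the number of matching operations small, which serves the short processing time targeted by the embodiment.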
 (Configuration 7)
 An image processing method performed by an image processing system (1) including an imaging unit (10) that images an object (W) to generate an image including the object (W), and an image processing unit (20),
 the imaging unit (10) being capable of changing a focus position when imaging the object (W),
 the image processing method comprising:
 searching for the focus position and the position and orientation by executing, while narrowing a range of the focus position, matching of an image pattern serving as a reference for the position and orientation of the object (W) in the image against the image including the object (W) generated by the imaging unit (10) (S25-S28); and
 specifying the position and orientation and the focus position from a result of the search in the searching step.
 (Configuration 8)
 A program for processing an image obtained by imaging an object (W) with an imaging device (10),
 the imaging device (10) being capable of changing a focus position when imaging the object (W),
 the program causing a computer to execute:
 searching for the focus position and the position and orientation by executing, while narrowing a range of the focus position, matching of an image pattern serving as a reference for the position and orientation of the object (W) in the image against the image including the object (W) generated by the imaging device (S25-S28); and
 specifying the position and orientation and the focus position from a result of the search in the searching step.
 Although the embodiments of the present invention have been described, the embodiments disclosed here are to be considered in all respects as illustrative and not restrictive. The scope of the present invention is defined by the claims, and is intended to include all modifications within the meaning and scope equivalent to the claims.
 1 image processing system, 5 bar code, 10 imaging device, 11 illumination unit, 12 lens module, 12a, 12c lens, 12b lens group, 12d movable unit, 12e focus adjustment unit, 12e1, 12e2 voltage source, 13 image sensor, 13a imaging surface, 14 image sensor control unit, 15, 17 register, 16 lens control unit, 18 communication I/F unit, 20 image processing device, 21 determination unit, 22 output unit, 23 storage unit, 24 command generation unit, 30 robot, 60 display device, 70 translucent container, 71 conductive liquid, 72 insulating liquid, 73a, 73b, 74a, 74b electrode, 75a, 75b insulator, 76a, 76b insulating layer, 90 stage, 91 belt conveyor, 100 image, A1 search range, A2 inspection region, B1 reference region, F focal point, O principal point, S11-S17, S21-S28 steps, W workpiece, W1, W2 region.

Claims (8)

  1.  An image processing system comprising:
     an imaging unit configured to image an object and generate an image including the object; and
     an image processing unit configured to acquire the image from the imaging unit and execute matching of an image pattern against the image, wherein
     the image pattern is a pattern serving as a reference for a position and orientation of the object in the image,
     the imaging unit is capable of changing a focus position, and
     the image processing unit searches for the focus position and the position and orientation by executing the matching of the image pattern while narrowing a range of the focus position, thereby specifying the position and orientation and the focus position.
  2.  The image processing system according to claim 1, wherein the image processing unit shifts the search for the focus position and the position and orientation from a coarse search to a fine search.
  3.  The image processing system according to claim 1 or 2, wherein the image processing unit sets a search range for the matching within the image and resets the search range based on a degree of matching obtained from a result of the matching.
  4.  The image processing system according to claim 3, wherein the image processing unit obtains at least one candidate point from the degree of matching and resets the search range such that the at least one candidate point is included in the search range and the search range is narrowed.
  5.  The image processing system according to any one of claims 1 to 4, wherein, in the correction of the focus position, the image processing unit executes the matching on a plurality of images having different focus positions and selects, based on results of the matching, an image with a highest degree of focus from the plurality of images, and
     the image processing unit reduces a change in the focus position between the plurality of images as a number of corrections of the focus position increases.
  6.  The image processing system according to claim 5, wherein the image processing unit selects images used for correcting the focus position by thinning out a plurality of images captured by the imaging unit while changing the focus position.
  7.  An image processing method performed by an image processing system including an imaging unit that images an object to generate an image including the object, and an image processing unit,
     the imaging unit being capable of changing a focus position when imaging the object,
     the image processing method comprising:
     searching for the focus position and the position and orientation by executing, while narrowing a range of the focus position, matching of an image pattern serving as a reference for the position and orientation of the object in the image against the image including the object generated by the imaging unit; and
     specifying the position and orientation and the focus position from a result of the search in the searching step.
  8.  A program for processing an image obtained by imaging an object with an imaging device,
     the imaging device being capable of changing a focus position when imaging the object,
     the program causing a computer to execute:
     searching for the focus position and the position and orientation by executing, while narrowing a range of the focus position, matching of an image pattern serving as a reference for the position and orientation of the object in the image against the image including the object generated by the imaging device; and
     specifying the position and orientation and the focus position from a result of the search in the searching step.
PCT/JP2019/031042 2018-08-23 2019-08-07 Image processing system, image processing method and program WO2020039922A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-156216 2018-08-23
JP2018156216A JP7087823B2 (en) 2018-08-23 2018-08-23 Image processing system, image processing method and program

Publications (1)

Publication Number Publication Date
WO2020039922A1 true WO2020039922A1 (en) 2020-02-27

Family

ID=69593023

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/031042 WO2020039922A1 (en) 2018-08-23 2019-08-07 Image processing system, image processing method and program

Country Status (2)

Country Link
JP (1) JP7087823B2 (en)
WO (1) WO2020039922A1 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04318509A (en) * 1991-04-17 1992-11-10 Sankyo Seiki Mfg Co Ltd Automatic focusing device
JPH10100415A (en) * 1996-10-02 1998-04-21 Fuji Xerox Co Ltd Method for inspection of nozzle head
JP2007303994A (en) * 2006-05-12 2007-11-22 Denso Corp Visual inspecting device and method
JP2014106388A (en) * 2012-11-28 2014-06-09 Hitachi High-Technologies Corp Automatic focusing detection device and charged particle beam microscope having the same provided

Also Published As

Publication number Publication date
JP7087823B2 (en) 2022-06-21
JP2020030339A (en) 2020-02-27

Similar Documents

Publication Publication Date Title
TWI538508B (en) Image capturing system obtaining scene depth information and focusing method thereof
JPH0736613B2 (en) Focus adjustment method for imaging device
WO2022126870A1 (en) Three-dimensional imaging method and method based on light field camera and three-dimensional imaging measuring production line
WO2020110712A1 (en) Inspection system, inspection method, and program
CN113513981A (en) Multi-target parallel measurement method, system, equipment and storage medium based on binocular stereo vision
CN106973199B (en) Multi-aperture camera system for improving depth accuracy by using focusing distance scanning
US10827114B2 (en) Imaging system and setting device
KR102601288B1 (en) Camera module and image operating method performed therein
WO2020039922A1 (en) Image processing system, image processing method and program
JP2019168479A (en) Controller, imaging device, method for control, program, and, and storage medium
JP2016053491A (en) Three-dimensional shape measuring apparatus and three-dimensional shape measuring method
JP5050282B2 (en) Focus detection device, focus detection method, and focus detection program
JP7047725B2 (en) Inspection system, inspection method and program
CN112866546B (en) Focusing method and device, electronic equipment and computer readable storage medium
JP7135586B2 (en) Image processing system, image processing method and program
JP5544894B2 (en) Wafer inspection apparatus and wafer inspection method
JP6312410B2 (en) Alignment apparatus, microscope system, alignment method, and alignment program
JP4651550B2 (en) Three-dimensional coordinate measuring apparatus and method
US11631194B2 (en) Image processing apparatus that performs recognition processing, control method thereof, and storage medium
CN110838107A (en) Method and device for intelligently detecting defects of 3C transparent component by variable-angle optical video
CN113096084B (en) Visual detection method, device and system based on array camera imaging
WO2016047220A1 (en) Imaging device and imaging method
JP2018200268A (en) Image recognition device
JP2018206063A (en) Image recognition device and image recognition method
JP7198731B2 (en) Imaging device and focus adjustment method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19852385

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19852385

Country of ref document: EP

Kind code of ref document: A1