CN111323369B - Optical detection device and correction method

Info

Publication number: CN111323369B
Authority: CN (China)
Prior art keywords: image, gray scale, light source, target, scale value
Legal status: Active (granted)
Application number: CN201811524912.3A
Other languages: Chinese (zh)
Other versions: CN111323369A
Inventors: 刘育鑫, 詹凱劭, 薛名凱
Assignee: Chroma ATE Suzhou Co Ltd
Priority/filing date: 2018-12-13 (application filed by Chroma ATE Suzhou Co Ltd)
Publication of CN111323369A: 2020-06-23
Grant publication of CN111323369B: 2023-08-29

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01N — INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 — Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/01 — Arrangements or apparatus for facilitating the optical investigation
    • G01N 21/84 — Systems specially adapted for particular applications
    • G01N 21/88 — Investigating the presence of flaws or contamination
    • G01N 21/95 — Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined

Abstract

The disclosure provides an optical detection device and a correction method. The optical detection device comprises an image capturing device and a processor coupled to a light source and to the image capturing device. The processor adjusts the light intensity with which the light source illuminates a correction object, so that the gray scale value of at least one image block of the correction object captured by the image capturing device matches a target correction value, and records the target light intensity at which the gray scale value matches the target correction value. The processor also controls the light source to illuminate an object to be detected at the target light intensity and controls the image capturing device to capture an image of the object to be detected. The processor further calculates the ratio of a target gray scale value to the gray scale values of a plurality of pixels of the image of the object to be detected, so as to obtain a mapping table, which enables rapid detection of whether an image contains defects.

Description

Optical detection device and correction method
Technical Field
The present disclosure relates to a detection device and a method, and more particularly, to an optical detection device and a calibration method.
Background
With technological progress, image recognition is increasingly used to determine whether an object has a defect, gradually replacing manual judgment. An image sensor need only capture an image of the object; accompanied by image processing techniques and algorithms, the system can quickly report whether the object is defective.
However, the output of automatic optical inspection depends largely on how well the algorithm and the image processing operations are designed. Poorly chosen image processing parameters, or simple carelessness, can therefore lead to erroneous judgments about whether an object has a defect.
Disclosure of Invention
This summary is intended to provide a simplified overview of the disclosure so that the reader may gain a basic understanding of it. This summary is not an extensive overview of the disclosure, and it is intended neither to identify key or critical elements of the embodiments of the invention nor to delineate the scope of the invention.
According to one embodiment of the present disclosure, an optical detection device is disclosed. The optical detection device comprises an image capturing device and a processor coupled to a light source and to the image capturing device. The processor adjusts the light intensity with which the light source illuminates a correction object, so that the gray scale value of at least one image block of the correction object captured by the image capturing device matches a target correction value, and records the target light intensity at which the gray scale value matches the target correction value. The processor also controls the light source to illuminate an object to be detected at the target light intensity and controls the image capturing device to capture an image of the object to be detected. The processor further calculates the ratio of a target gray scale value to the gray scale values of a plurality of pixels of the image of the object to be detected, so as to obtain a mapping table.
In an embodiment, the image capturing device further captures a smooth region of the object to be detected to obtain a region image, and the processor copies the region image and splices the copies together to form the image of the object to be detected, wherein the region image is smaller than the image of the object to be detected.
In an embodiment, the light source includes a red light emitting unit, a green light emitting unit and a blue light emitting unit, and the processor controls the red light emitting unit, the green light emitting unit and the blue light emitting unit to illuminate the object to be detected at their respective target light intensities, so that the image capturing device captures a corresponding image of the object for each unit, wherein the images of the object to be detected include a first light source image corresponding to the red light emitting unit, a second light source image corresponding to the green light emitting unit and a third light source image corresponding to the blue light emitting unit.
In one embodiment, the target gray scale value comprises a red light target gray scale value, a green light target gray scale value and a blue light target gray scale value, and the processor is further configured to: calculate, according to a plurality of pixel coordinates of the first light source image, the ratio of the red light target gray scale value to the gray scale value of the first light source image at each pixel coordinate, so as to obtain the corresponding mapping table; calculate, according to a plurality of pixel coordinates of the second light source image, the ratio of the green light target gray scale value to the gray scale value of the second light source image at each pixel coordinate, so as to obtain the corresponding mapping table; and calculate, according to a plurality of pixel coordinates of the third light source image, the ratio of the blue light target gray scale value to the gray scale value of the third light source image at each pixel coordinate, so as to obtain the corresponding mapping table.
In one embodiment, the processor is further configured to: control the light source to illuminate a detection object at the target light intensity and the image capturing device to capture the detection object, so as to generate a detection object image; and adjust the gray scale values of a plurality of pixels of the detection object image according to the mapping table, so that the gray scale value of at least one image block of the detection object image matches the target gray scale value.
According to another embodiment, a correction method is disclosed, comprising the following steps: adjusting the light intensity with which a light source illuminates a correction object, so that the gray scale value of at least one image block captured by an image capturing device matches a target correction value, and recording the target light intensity at which the gray scale value matches the target correction value; controlling the light source to illuminate an object to be detected at the target light intensity, and controlling the image capturing device to capture an image of the object to be detected; and calculating the ratio of a target gray scale value to the gray scale values of a plurality of pixels of the image of the object to be detected, so as to obtain a mapping table.
In an embodiment, the image capturing device captures a smooth region of the object to be detected to obtain a region image, and the region image is copied and the copies spliced together to form the image of the object to be detected, wherein the region image is smaller than the image of the object to be detected.
In an embodiment, the light source further includes a red light emitting unit, a green light emitting unit and a blue light emitting unit, and the correction method further includes: controlling the red light emitting unit, the green light emitting unit and the blue light emitting unit to illuminate the object to be detected at their respective target light intensities, so that the image capturing device captures a corresponding image of the object for each unit, wherein the images of the object to be detected include a first light source image corresponding to the red light emitting unit, a second light source image corresponding to the green light emitting unit and a third light source image corresponding to the blue light emitting unit.
In one embodiment, the target gray scale value includes a red light target gray scale value, a green light target gray scale value and a blue light target gray scale value, and the correction method further includes: calculating, according to a plurality of pixel coordinates of the first light source image, the ratio of the red light target gray scale value to the gray scale value of the first light source image at each pixel coordinate, so as to obtain the corresponding mapping table; calculating, according to a plurality of pixel coordinates of the second light source image, the ratio of the green light target gray scale value to the gray scale value of the second light source image at each pixel coordinate, so as to obtain the corresponding mapping table; and calculating, according to a plurality of pixel coordinates of the third light source image, the ratio of the blue light target gray scale value to the gray scale value of the third light source image at each pixel coordinate, so as to obtain the corresponding mapping table.
In one embodiment, the correction method further includes: controlling the light source to illuminate a detection object at the target light intensity and the image capturing device to capture the detection object, so as to generate a detection object image; and adjusting the gray scale values of a plurality of pixels of the detection object image according to the mapping table, so that the gray scale value of at least one image block of the detection object image matches the target gray scale value.
Drawings
FIG. 1 is a schematic block diagram of an optical inspection device according to some embodiments of the disclosure.
FIG. 2 is a flow chart illustrating steps of a calibration method according to some embodiments of the present disclosure.
Detailed Description
The following disclosure provides many different embodiments, or examples, for implementing different features of the invention. Specific examples of elements and arrangements are described below to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. For example, forming a first feature over or on a second feature in the description below may include embodiments in which the first and second features are formed in direct contact, and may also include embodiments in which additional features are formed between the first and second features such that the first and second features are not in direct contact. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
Further, spatially relative terms, such as "under," "below," "lower," "above," "higher," and the like, may be used herein for ease of description to describe one element or feature's relationship to another element (or elements) or feature (or features) illustrated in the figures. Spatially relative terms are intended to encompass different orientations of the device in use or operation in addition to the orientation depicted in the figures. The apparatus may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
Generally, optical detection is used to decide whether an object has a defect: after an image is captured, image processing is applied to evaluate the pixels or related parameters of the image and determine whether the image is abnormal. Whether the object is a defective product can then be known from whether its image is abnormal.
Referring to fig. 1, a schematic block diagram of an optical inspection apparatus 100 according to some embodiments of the disclosure is shown. The optical inspection device 100 includes an image capturing device 110 and a processor 120, coupled to each other. The image capturing device 110 is used for capturing an image of an object to be detected (not shown). The processor 120 may control the light source 500, which includes a red light emitting unit 510, a green light emitting unit 520 and a blue light emitting unit 530. The light source 500 illuminates the object to be detected, and the image capturing device 110 senses the reflected light to obtain the image of the object.
Referring to fig. 2, a flowchart illustrating the steps of a correction method according to some embodiments of the present disclosure is shown. The steps of the correction method of fig. 2 are described below together with the elements of the optical detection device 100 of fig. 1. As shown in fig. 2, in step S210 the processor 120 activates the light source 500 to illuminate the correction object and controls the image capturing device 110 to capture an image of the correction object. The correction object may be a gray card, for example an 18% gray card of the kind used in light metering to obtain accurate exposure values.
In one embodiment, the processor 120 activates the red light emitting unit 510 of the light source 500 to obtain a correction object image corresponding to red light, the green light emitting unit 520 to obtain a correction object image corresponding to green light, and the blue light emitting unit 530 to obtain a correction object image corresponding to blue light. For brevity, the following description uses the correction object image corresponding to red light; the green and blue cases follow in the same way. The correction object image may be a gray-scale image with a plurality of pixels. For example, an image of 100 pixels × 100 pixels has 10000 pixels in total, and each pixel records one gray scale value, so 10000 gray scale values are recorded in that image. Note that, to reduce the processing load on the processor 120, the correction object image is handled in units of image blocks: if a 10-pixel × 10-pixel region is taken as one image block, a 100-pixel × 100-pixel image contains 100 image blocks, and each image block serves as one unit of image processing for the processor 120.
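To make the block partition concrete, the following is a minimal sketch in Python/NumPy (not part of the patent). The helper name block_means, and the use of the block mean as the block's single gray scale value, are illustrative assumptions.

```python
import numpy as np

def block_means(image: np.ndarray, block: int = 10) -> np.ndarray:
    """Split an (H, W) gray-scale image into block x block tiles and
    return one gray-scale value (here: the mean) per tile."""
    h, w = image.shape
    assert h % block == 0 and w % block == 0, "image must tile evenly"
    tiles = image.reshape(h // block, block, w // block, block)
    return tiles.mean(axis=(1, 3))  # shape (H/block, W/block)

# A 100 x 100 image yields a 10 x 10 grid, i.e. 100 image blocks.
img = np.random.randint(0, 256, (100, 100)).astype(np.float64)
print(block_means(img).shape)  # (10, 10)
```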
In one embodiment, the gray scale value varies with the light intensity of the light source 500. In step S220, the processor 120 adjusts the light intensity with which the light source 500 illuminates the correction object so that the gray scale value of at least one image block of the correction object image matches the target correction value. For example, the processor 120 controls the drive current, which determines the intensity of the illumination emitted by the light source 500. The correction object reflects this illumination, and the image capturing device 110 senses the reflected light and captures the correction object image.
In one embodiment, the image capturing device 110 determines whether the gray scale value of the whole correction object image, or of at least one image block of it (e.g. 10 pixels × 10 pixels), matches the target gray scale value. If it does not, the current through the light source 500 is adjusted further to change the light intensity, until the gray scale value of at least one image block of the sensed image matches the target gray scale value. In a preferred embodiment, the processor 120 adjusts the light intensity until the gray scale value of the image block at the center of the image matches the target correction value. The target correction value may be, for example, the gray scale value 140.
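The feedback loop of step S220 can be sketched as follows, assuming hypothetical set_current() and capture_image() hardware hooks (the patent does not specify an interface) and a simple fixed-step search; the tolerance and step size are likewise assumptions.

```python
import numpy as np

TARGET = 140.0    # target correction value (gray scale value 140, per the text)
TOL = 1.0         # acceptable deviation -- an assumption

def calibrate_intensity(set_current, capture_image,
                        current=0.5, step=0.01, max_iters=500) -> float:
    """Adjust the light-source drive current until the center image block
    of the correction-object image matches the target correction value;
    return that current as the 'target light intensity'."""
    for _ in range(max_iters):
        set_current(current)                  # hypothetical hardware hook
        img = capture_image()                 # (H, W) gray-scale array
        h, w = img.shape
        center = img[h // 2 - 5:h // 2 + 5, w // 2 - 5:w // 2 + 5]  # 10x10 block
        gray = center.mean()                  # gray scale value of center block
        if abs(gray - TARGET) <= TOL:
            return current
        current += step if gray < TARGET else -step
    raise RuntimeError("calibration did not converge")
```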
In an embodiment, the processor 120 controls the light intensities of the red light emitting unit 510, the green light emitting unit 520 and the blue light emitting unit 530 of the light source 500 in turn, and for each unit determines the gray scale value of the image, or of at least one image block, until it matches the target gray scale value. For brevity, the following description treats the red light emitting unit 510 as the illumination source; the green light emitting unit 520 and the blue light emitting unit 530 are handled in the same way and are not described again.
In step S230, when the gray scale value of the whole correction object image, or of at least one image block, matches the target correction value, the processor 120 records the drive level of the light source 500 (e.g. a current value or a pulse-width modulation (PWM) signal) as the target light intensity. For example, the processor 120 records the light intensity used by the red light emitting unit 510 for subsequent use. In other words, at this light intensity the gray scale value of the correction object image, or of at least one of its image blocks, matches the target correction value; every time the image capturing device 110 captures an image of the gray card, the gray scale value of at least one image block is thus held at the target correction value.
In step S240, the processor 120 controls the light source 500 to illuminate an object to be detected (not shown) at the target light intensity. In an embodiment, the processor 120 controls the red light emitting unit 510, the green light emitting unit 520 and the blue light emitting unit 530 to illuminate the object at their respective target light intensities, so that the image capturing device 110 captures the image of the object corresponding to red light, to green light and to blue light, respectively. For brevity, only the image corresponding to red light is described below; the green and blue cases follow in the same way. In one embodiment, the object to be detected may be a solar panel.
The image capturing device 110 then captures the image of the object to be detected. That image has a plurality of pixels, each with a corresponding gray scale value; since these behave like the pixels and gray scale values of the correction object image, they are not described again.
Next, in step S250, the processor 120 obtains a mapping table by calculating the ratio of the target gray scale value to the gray scale value of each of a plurality of pixels of the object image. Take a 3 × 3-pixel image of the object to be detected as an example, shown in Table 1-1 below. Its pixel gray scale values, read left to right and top to bottom by pixel coordinate, are a1, b1, c1, d1, e1, f1, g1, h1 and i1. If the target gray scale value is T (e.g. the gray scale value 140), the entries of the mapping table, read in the same order, are T/a1, T/b1, T/c1, T/d1, T/e1, T/f1, T/g1, T/h1 and T/i1, as shown in Table 1-2 below (a code sketch of this computation follows Table 1-2). Note that the tables represent the gray scale values of an image in tabular form for exposition only; a table is simply one way of expressing the image, and its contents stand for the actual contents of the image.
Table 1-1: image gray scale value (3X 3 pixel image)
a1 b1 c1
d1 e1 f1
g1 h1 i1
Table 1-2: mapping table (image suitable for 3X 3 pixel)
T/a1 T/b1 T/c1
T/d1 T/e1 T/f1
T/g1 T/h1 T/i1
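A minimal sketch of the Table 1-2 computation in Python/NumPy (not part of the patent), assuming the object image is a NumPy gray-scale array; the zero-guard is an added safety measure not mentioned in the text.

```python
import numpy as np

def build_mapping_table(object_image: np.ndarray,
                        target: float = 140.0) -> np.ndarray:
    """Per-pixel mapping table: entry (x, y) is T / gray(x, y), as in Table 1-2."""
    img = object_image.astype(np.float64)
    img = np.where(img == 0, 1e-6, img)   # guard against division by zero
    return target / img

# For the 3 x 3 example above, entry (0, 0) equals T/a1, entry (1, 1) equals T/e1.
```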
In another embodiment, the image of the object to be detected is 90 pixels × 90 pixels. If one image block is 30 pixels × 30 pixels, the blocks are numbered B1 to B9 from left to right and top to bottom, as shown in Table 2-1 below; the image then has 9 image blocks (i.e. 3 × 3), each of 30 pixels × 30 pixels.
Table 2-1: numbering of image blocks
B1 B2 B3
B4 B5 B6
B7 B8 B9
Table 2-2 is the mapping table for image blocks (a sketch of the block-wise computation follows Table 2-2). In this embodiment, when the gray scale values of the pixels within an image block are the same, working block by block simplifies the computation performed by the processor 120. The mapping table may be a flat-field correction table (FFC table).
Table 2-2: mapping table (image suitable for 90X 90 pixels)
T/B1 T/B2 T/B3
T/B4 T/B5 T/B6
T/B7 T/B8 T/B9
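The block-wise variant of Table 2-2 can be sketched the same way, one ratio per 30 × 30-pixel block; taking the block mean as the block's gray scale value is an assumption, since the text treats all pixels in a block as sharing one value.

```python
import numpy as np

def build_block_mapping_table(object_image: np.ndarray, block: int = 30,
                              target: float = 140.0) -> np.ndarray:
    """One ratio per block: a 90 x 90 image with 30 x 30 blocks gives a
    3 x 3 table (T/B1 ... T/B9), as in Table 2-2."""
    h, w = object_image.shape
    tiles = object_image.reshape(h // block, block, w // block, block)
    means = tiles.mean(axis=(1, 3))        # one gray-scale value per block
    return target / means
```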
In some embodiments, the light source 500 illuminates the object to be detected with the red light emitting unit 510, the green light emitting unit 520 and the blue light emitting unit 530 in turn. The image capturing device 110 captures a first light source image corresponding to red light, a second light source image corresponding to green light, and a third light source image corresponding to blue light, each of which is a gray-scale image. The processor 120 generates a first mapping table from the gray scale values of the first light source image and the target gray scale value for the first light source, a second mapping table from the second light source image and its target gray scale value, and a third mapping table from the third light source image and its target gray scale value. Each table is calculated as described above; Tables 3 to 5 below show the results.
Table 3: the gray scale values of the first light source image, the red light target gray scale values, and the resulting first mapping table (reproduced as an image in the original publication).
As Table 3 shows, the first light source image is, for example, a 3 × 3-pixel image in which each pixel has a gray scale value. Each entry of the first mapping table is the red light target gray scale value at one position divided by the gray scale value of the corresponding pixel of the first light source image. The pixel coordinates of the first light source image, left to right and top to bottom, are (1, 1), (1, 2), (1, 3), (2, 1), (2, 2), … (3, 3). The gray scale value at pixel coordinate (1, 1) is 10 and the red light target gray scale value there is 20, so dividing 20 by 10 gives 2, which is stored at the position of the first mapping table corresponding to (1, 1). The gray scale value at (2, 2) is 20 and the target value there is 20, so dividing 20 by 20 gives 1, which is stored at the position corresponding to (2, 2). The remaining entries of the first mapping table are calculated in the same way.
Table 4: the gray scale values of the second light source image, the green light target gray scale values, and the resulting second mapping table (reproduced as an image in the original publication).
As with Table 3, in Table 4 the gray scale value at pixel coordinate (1, 1) of the second light source image is 20 and the target gray scale value there is 25, so dividing 25 by 20 gives 1.25, which is stored at the position of the second mapping table corresponding to (1, 1). The gray scale value at (2, 2) is 25 and the target value there is 25, so dividing 25 by 25 gives 1, stored at the position corresponding to (2, 2). The remaining entries of the second mapping table are calculated in the same way.
Table 5: the gray scale values of the third light source image, the blue light target gray scale values, and the resulting third mapping table (reproduced as an image in the original publication).
As with Table 3, in Table 5 the gray scale value at pixel coordinate (1, 1) of the third light source image is 25 and the target gray scale value there is 40, so dividing 40 by 25 gives 1.6, which is stored at the position of the third mapping table corresponding to (1, 1). The gray scale value at (2, 2) is 40 and the target value there is 40, so dividing 40 by 40 gives 1, stored at the position corresponding to (2, 2). The remaining entries of the third mapping table are calculated in the same way.
The values at corresponding pixel coordinates in Tables 3 to 5 may differ from one another. In practice, different objects to be detected absorb or reflect different wavelengths to different degrees. For example, a solar panel is made of a dark-blue rigid material, so blue light illuminating it is reflected more strongly than red or green light; accordingly, the gray scale value of the third light source image (e.g. 40) is higher than those of the first light source image (e.g. 20) and the second light source image (e.g. 25).
In some embodiments, the image capturing device 110 captures a smooth region of the object to be detected to obtain a region image. A solar wafer, for example, may have a slightly uneven surface, so when generating the image of the object to be detected, a smooth region can be extracted from the full image and its region image copied one or more times. The processor 120 then splices the copies together into a stitched image that serves as the object image. This ensures that the generated image of the object has uniform gray scale values, close to an ideal, flaw-free image. The size of the region image may be smaller than the size of the object image.
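A hedged sketch of this stitching, assuming the smooth region's position and size are already known (the patent does not say how the region is located):

```python
import numpy as np

def stitch_reference(full_image: np.ndarray, top: int, left: int,
                     size: int) -> np.ndarray:
    """Copy a smooth size x size region and tile the copies until they
    cover the full frame, then trim to the original image size."""
    patch = full_image[top:top + size, left:left + size]
    h, w = full_image.shape
    reps = (-(-h // size), -(-w // size))   # ceiling division
    return np.tile(patch, reps)[:h, :w]
```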
Next, the light source 500 is activated: the red light emitting unit 510 illuminates the correction object (e.g. a gray card) and its intensity is adjusted until the gray scale value of the correction object image equals a first target correction value (e.g. the gray scale value 140), and the corresponding target light intensity for red light is recorded; the green light emitting unit 520 illuminates the correction object and its intensity is adjusted until the gray scale value equals a second target correction value, and the target light intensity for green light is recorded; and the blue light emitting unit 530 illuminates the correction object and its intensity is adjusted until the gray scale value equals a third target correction value, and the target light intensity for blue light is recorded. The first, second and third target correction values are the same value. In one embodiment, the image capturing device 110 thus senses an image with gray scale value 140 under each of the red, green and blue units, which amounts to a white-balance correction procedure.
Next, in step S260, the processor 120 makes the light source 500 illuminate a detection object (not shown) at the target light intensity, and the image capturing device 110 captures an image of the detection object and obtains its gray scale values. The processor 120 then adjusts those gray scale values according to the mapping table: it multiplies the detection object image, element by element, by the corresponding mapping table. For example, the image produced under the red light emitting unit 510 is multiplied by the contents of Table 3, the image produced under the green light emitting unit 520 by the contents of Table 4, and the image produced under the blue light emitting unit 530 by the contents of Table 5. Taking a 3 × 3-pixel detection object image as an example, if its gray scale value at pixel coordinate (1, 1) is a2, then a2 is multiplied by the value of Table 3 at (1, 1), namely 2, giving the product 2 × a2. Performing this calculation for every pixel yields the adjusted detection object image, in which the gray scale value of at least one image block (e.g. the block at the center of the image) matches the target gray scale value. In one embodiment, the gray scale value of the center image block of the adjusted image is brought to the desired gray scale value 140.
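A minimal sketch of the step-S260 adjustment, assuming an 8-bit gray-scale range; clipping to [0, 255] is an added safeguard not stated in the text.

```python
import numpy as np

def apply_mapping_table(detect_image: np.ndarray,
                        mapping: np.ndarray) -> np.ndarray:
    """Element-wise multiply the detection-object image by its mapping
    table, then clip back to the 8-bit gray-scale range."""
    corrected = detect_image.astype(np.float64) * mapping
    return np.clip(corrected, 0, 255).astype(np.uint8)

# E.g. a pixel of gray scale value a2 at (1, 1) becomes 2 * a2
# when the mapping-table entry there is 2, matching the text above.
```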
Therefore, to inspect objects of the same or a similar type for defects (e.g. solar panels, or circuit boards of other specifications), one first sets the light intensity using the correction object, then builds the mapping table for that type of object at the recorded light intensity, and then sets the white balance. Afterwards, images of objects of that type, adjusted with the recorded light intensity and the mapping table, can be brought to the ideal gray scale value 140, reducing the spread of gray scale values across the pixels of the whole image.
In summary, compared with the conventional approach of using only a gray card to correct exposure values, the optical detection device 100 and correction method disclosed here build mapping tables that match the characteristics of different objects to be detected. Moreover, by adjusting with the mapping table and the target light intensity of the light source 500, the gray scale values of the detection object image, over the whole image or over a cross-section through its center, can be brought to a flat, linear distribution. The adjusted detection object image therefore avoids the misjudgments that excessive gray scale differences would otherwise cause in optical detection, improving the accuracy with which the optical detection device 100 detects flaws.
The foregoing outlines features of several embodiments so that those skilled in the art may better understand the aspects of the present disclosure. It should be appreciated by those skilled in the art that the present invention may be readily utilized as a basis for designing or modifying other processes and structures for carrying out the same purposes and/or achieving the same advantages of the embodiments presented herein. It should also be realized by those skilled in the art that such equivalent constructions do not depart from the spirit and scope of the invention, and that they may make various changes, substitutions and alterations herein without departing from the spirit and scope of the invention.

Claims (8)

1. An optical inspection device, comprising:
an image capturing device; and
a processor coupled to a light source and the image capturing device, wherein the processor is configured to:
adjusting a light intensity with which the light source illuminates a correction object, so that a gray scale value of at least one image block of the correction object captured by the image capturing device matches a target correction value, and recording a target light intensity at which the gray scale value matches the target correction value;
controlling the light source to irradiate an object to be detected with the target light intensity, and controlling the image capturing device to capture an image of the object to be detected;
calculating a ratio of a target gray scale value to the gray scale values of a plurality of pixels of the image of the object to be detected, so as to obtain a mapping table;
controlling the light source to irradiate a detection object with the target light intensity and the image capturing device to capture the detection object so as to generate a detection object image; and
adjusting the gray scale values of a plurality of pixels of the detection object image according to the mapping table, so that the gray scale value of at least one image block of the detection object image matches the target gray scale value.
2. The optical inspection device according to claim 1, wherein the image capturing device further captures a smooth region of the object to be detected to obtain a region image, and the processor copies the region image and splices the copies together to form the image of the object to be detected, wherein the region image is smaller than the image of the object to be detected.
3. The optical inspection device according to claim 1, wherein the light source comprises a red light emitting unit, a green light emitting unit and a blue light emitting unit, and the processor controls the red light emitting unit, the green light emitting unit and the blue light emitting unit to illuminate the object to be detected at their respective target light intensities, so that the image capturing device captures a corresponding image of the object for each unit, wherein the images of the object comprise a first light source image corresponding to the red light emitting unit, a second light source image corresponding to the green light emitting unit and a third light source image corresponding to the blue light emitting unit.
4. The optical inspection device of claim 3, wherein the target gray scale values comprise a red target gray scale value, a green target gray scale value, and a blue target gray scale value, wherein the processor is further configured to:
respectively calculating the ratio of the red light target gray scale value to the gray scale value of the first light source image at each pixel coordinate according to a plurality of pixel coordinates of the first light source image to obtain the corresponding mapping table;
respectively calculating the ratio of the green light target gray scale value to the gray scale value of the second light source image at each pixel coordinate according to a plurality of pixel coordinates of the second light source image to obtain the corresponding mapping table; and
respectively calculating the ratio of the blue light target gray scale value to the gray scale value of the third light source image at each pixel coordinate according to a plurality of pixel coordinates of the third light source image, so as to obtain the corresponding mapping table.
5. A correction method, comprising:
adjusting a light intensity with which a light source illuminates a correction object, so that a gray scale value of at least one image block of the correction object captured by an image capturing device matches a target correction value, and recording a target light intensity at which the gray scale value matches the target correction value;
controlling the light source to irradiate an object to be detected with the target light intensity, and controlling the image capturing device to capture an image of the object to be detected;
calculating a ratio of a target gray scale value to the gray scale values of a plurality of pixels of the image of the object to be detected, so as to obtain a mapping table;
controlling the light source to irradiate a detection object with the target light intensity and the image capturing device to capture the detection object so as to generate a detection object image; and
adjusting the gray scale values of a plurality of pixels of the detection object image according to the mapping table, so that the gray scale value of at least one image block of the detection object image matches the target gray scale value.
6. The correction method according to claim 5, characterized by further comprising:
capturing, by the image capturing device, a smooth region of the object to be detected to obtain a region image, and copying the region image and splicing the copies together to form the image of the object to be detected, wherein the region image is smaller than the image of the object to be detected.
7. The method of claim 5, wherein the light source further comprises a red light emitting unit, a green light emitting unit, and a blue light emitting unit, and wherein the method further comprises:
the red light emitting unit, the green light emitting unit and the blue light emitting unit are controlled to irradiate the object to be detected with the corresponding target light intensities respectively, so that the image capturing device captures the corresponding object to be detected images respectively, wherein the object to be detected images comprise a first light source image corresponding to the red light emitting unit, a second light source image corresponding to the red light emitting unit and a third light source image corresponding to the red light emitting unit.
8. The method of claim 7, wherein the target gray scale values comprise a red target gray scale value, a green target gray scale value, and a blue target gray scale value, and wherein the method further comprises:
respectively calculating the ratio of the red light target gray scale value to the gray scale value of the first light source image at each pixel coordinate according to a plurality of pixel coordinates of the first light source image to obtain the corresponding mapping table;
respectively calculating the ratio of the green light target gray scale value to the gray scale value of the second light source image at each pixel coordinate according to a plurality of pixel coordinates of the second light source image to obtain the corresponding mapping table; and
respectively calculating the ratio of the blue light target gray scale value to the gray scale value of the third light source image at each pixel coordinate according to a plurality of pixel coordinates of the third light source image, so as to obtain the corresponding mapping table.
CN201811524912.3A 2018-12-13 2018-12-13 Optical detection device and correction method Active CN111323369B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811524912.3A CN111323369B (en) 2018-12-13 2018-12-13 Optical detection device and correction method

Publications (2)

Publication Number Publication Date
CN111323369A CN111323369A (en) 2020-06-23
CN111323369B true CN111323369B (en) 2023-08-29

Family

ID=71170100

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811524912.3A Active CN111323369B (en) 2018-12-13 2018-12-13 Optical detection device and correction method

Country Status (1)

Country Link
CN (1) CN111323369B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1805546A (en) * 2005-01-14 2006-07-19 Lg电子株式会社 Apparatus and method for compensating images in display device
CN101242476A (en) * 2008-03-13 2008-08-13 北京中星微电子有限公司 Automatic correction method of image color and digital camera system
CN101556381A (en) * 2008-04-10 2009-10-14 东捷科技股份有限公司 Detection device and image illumination level compensation method
CN101604509A (en) * 2008-06-13 2009-12-16 胜华科技股份有限公司 Image-displaying method
TW201621297A (en) * 2014-12-04 2016-06-16 致茂電子股份有限公司 Light source calibration detecting system and light source calibration method using the same
WO2018159317A1 (en) * 2017-02-28 2018-09-07 日本精機株式会社 Display device, head-up display

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10469807B2 (en) * 2013-09-11 2019-11-05 Color Match, LLC Color measurement and calibration

Also Published As

Publication number Publication date
CN111323369A (en) 2020-06-23

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant