CN111504608B - Brightness uniformity detection system and brightness uniformity detection method - Google Patents


Info

Publication number
CN111504608B
CN111504608B
Authority
CN
China
Prior art keywords
image
verification
sample
detection
texture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910098660.0A
Other languages
Chinese (zh)
Other versions
CN111504608A (en)
Inventor
赖郁仁
姜皇成
Current Assignee
Coretronic Corp
Original Assignee
Coretronic Corp
Priority date
Filing date
Publication date
Application filed by Coretronic Corp filed Critical Coretronic Corp
Priority to CN201910098660.0A priority Critical patent/CN111504608B/en
Priority to TW108109555A priority patent/TWI757590B/en
Publication of CN111504608A publication Critical patent/CN111504608A/en
Application granted granted Critical
Publication of CN111504608B publication Critical patent/CN111504608B/en

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M 11/00: Testing of optical apparatus; Testing structures by optical methods not otherwise provided for

Landscapes

  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)

Abstract

The brightness uniformity detection system is used for detecting an object to be detected and includes an image sensor, a storage unit, and a processing device. The image sensor captures an image including the object to be detected so as to acquire a plurality of pieces of gray scale information in the image. The storage unit stores a plurality of luminance curves corresponding to detection position information. The processing device acquires, from the gray scale information of the image, a plurality of pieces of to-be-detected gray scale information corresponding to a plurality of detection positions on the object, according to the image and the detection position information. The processing device also obtains a plurality of pieces of estimated luminance information at the detection positions according to the luminance curves and the to-be-detected gray scale information, and judges whether the luminance of the object is uniform according to the differences in the estimated luminance information among the detection positions of the object. The brightness uniformity detection system and the brightness uniformity detection method do not need to measure each detection position one by one, which effectively saves detection time and reduces the manpower required for detection.

Description

Brightness uniformity detection system and brightness uniformity detection method
Technical Field
The present invention relates to a detection technology, and more particularly, to a luminance uniformity detection system and a luminance uniformity detection method.
Background
To ensure yield, panels must undergo luminance detection before shipping from the factory to confirm that their luminance is uniform. In the conventional detection method, an automatic optical inspection machine equipped with a luminance measurement device measures a plurality of detection positions on a panel, and the luminance measured at these positions is used to determine whether the luminance of the panel is uniform.
However, the luminance measurement device measures the luminance value of only a single position on the panel at a time, so measuring all detection positions on a single panel takes a long time. In particular, the luminance measurement device must repeat the measurement at fixed time intervals for each detection position and take the average of several readings as the luminance value at that position, resulting in a long overall measurement time. In addition, during luminance measurement, a tester must confirm the alignment between the automatic optical inspection machine and the detection positions of the panel under test, so as to ensure that the position measured by the machine is actually the intended detection position. Therefore, how to reduce the time and labor consumed in detection is a subject of interest to those skilled in the art.
The background section is provided only to aid in understanding the present invention, and the content disclosed therein may include art that is not known to those of ordinary skill in the art. The disclosure in the background section does not mean that the content, or the problems to be solved by one or more embodiments of the present invention, was known or appreciated by those skilled in the art before the filing of this specification.
Disclosure of Invention
The invention provides a brightness uniformity detection system and a brightness uniformity detection method, which are used for reducing time and labor cost consumed in detection.
To achieve one, some, or all of the above or other objects, an embodiment of the invention provides a luminance uniformity detection system. The brightness uniformity detection system is used for detecting an object to be detected and includes an image sensor, a storage unit, and a processing device. The image sensor captures an image including the object to be detected so as to acquire a plurality of pieces of gray scale information in the image. The storage unit stores a plurality of luminance curves corresponding to detection position information. The processing device is connected to the image sensor and the storage unit. The processing device acquires, from the gray scale information of the image, a plurality of pieces of to-be-detected gray scale information corresponding to a plurality of detection positions on the object, according to the image and the detection position information. The processing device also obtains a plurality of pieces of estimated luminance information at the detection positions according to the luminance curves and the to-be-detected gray scale information, and judges whether the luminance of the object is uniform according to the differences in the estimated luminance information among the detection positions of the object.
To achieve one, some, or all of the above or other objects, an embodiment of the invention provides a luminance uniformity detection method for detecting an object to be detected. The luminance uniformity detection method includes the following steps: capturing an image including the object to be detected to acquire a plurality of pieces of gray scale information in the image; acquiring, from the gray scale information of the image, a plurality of pieces of to-be-detected gray scale information corresponding to a plurality of detection positions on the object, according to the image and detection position information; obtaining a plurality of pieces of estimated luminance information at the detection positions according to a plurality of luminance curves corresponding to the detection position information and the to-be-detected gray scale information; and judging whether the luminance of the object to be detected is uniform according to the differences in the estimated luminance information among the detection positions of the object.
Based on the above, the luminance uniformity detection system and the luminance uniformity detection method of the present invention do not need to measure each detection position one by one, which effectively saves the time consumed by detection. Moreover, the luminance uniformity detection system and the luminance uniformity detection method do not need to position each detection position individually, which reduces the labor required for detection.
In order to make the aforementioned and other features and advantages of the invention more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
Fig. 1 is a system diagram of a luminance uniformity detection system according to an embodiment of the present invention.
Fig. 2 is a circuit connection diagram of a luminance uniformity detection system according to an embodiment of the invention.
FIG. 3 is a flowchart illustrating a luminance uniformity detection method according to an embodiment of the present invention.
FIG. 4 is a schematic diagram illustrating a detection position on an object to be detected according to an embodiment of the invention.
FIG. 5 is a detailed flowchart of a luminance uniformity detection method according to an embodiment of the present invention.
Fig. 6 is an image schematic diagram illustrating an object positioning procedure according to an embodiment of the invention.
FIG. 7 is a schematic diagram of an image of a texture analysis process according to an embodiment of the present invention.
FIG. 8 is a schematic diagram of associating a texture image and an image including an object under test according to an embodiment of the invention.
FIG. 9 is a flowchart illustrating a luminance uniformity detection method according to an embodiment of the present invention.
Fig. 10 is a detailed flowchart illustrating a luminance uniformity detection method according to an embodiment of the present invention.
List of reference numerals
10: test object
100: brightness uniformity detection system
110: image sensor
120: storage unit
130: processing apparatus
S310 to S340, S510 to S560, S910 to S960, S1010 to S1060: steps
Detailed Description
Fig. 1 is a system diagram of a luminance uniformity detection system according to an embodiment of the present invention. Referring to fig. 1, in an embodiment of the invention, a luminance uniformity detection system 100 is used for detecting an object 10 to be detected, for example, the object 10 to be detected may be a panel or a backlight module suitable for the panel, but the invention is not limited thereto. In an embodiment of the invention, the luminance uniformity detecting system 100 detects a plurality of detection positions of the object 10 to determine whether the luminance at the detection positions of the object 10 is uniform.
Fig. 2 is a circuit connection diagram of a luminance uniformity detection system according to an embodiment of the invention. The circuit connection of the luminance uniformity detection system of fig. 2 is at least suitable for the luminance uniformity detection system of fig. 1. The components of the luminance uniformity detection system according to an embodiment of the present invention will be described with reference to fig. 1 and fig. 2. Specifically, the luminance uniformity detection system 100 has an image sensor 110, a storage unit 120, and a processing device 130.
The image sensor 110 is used to capture an image including the object 10. In an embodiment of the invention, the image sensor 110 is, for example, a Charge-Coupled Device (CCD) or a Complementary Metal-Oxide-Semiconductor (CMOS) sensor, but the invention is not limited thereto. The image sensor 110 converts the sensed light into a current signal and then into a digital signal, where the converted digital signal corresponds to the gray scale information of the image. Specifically, when the object 10 is driven to display different gray scales, the image sensor 110 obtains the luminance values of a plurality of detection positions on the object 10 under the different gray scales, so as to obtain the gray scale information of those detection positions. In one embodiment, the image is displayed on a display of the processing device 130 according to the digital signal.
The storage unit 120 stores a plurality of luminance curves corresponding to the detection position information. Specifically, the detection position information describes a plurality of detection positions on the object 10; the detection positions are determined by inspection personnel and established in the luminance uniformity detection system 100 in advance, and the invention does not limit the number of detection positions or their actual locations on the object 10. The storage unit 120 stores a luminance curve corresponding to each detection position. Specifically, a luminance curve describes the correspondence between gray scale and luminance. Illustrated in two-dimensional coordinates, for example, the horizontal axis records gray scale and the vertical axis records luminance, so the luminance corresponding to each gray scale can be read off. In an embodiment of the present invention, the luminance curve can be expressed as a mathematical function or as a table recording the luminance corresponding to each gray level; the invention does not limit the way the luminance curve is represented. In an embodiment of the invention, the storage unit 120 may be any of various types of non-volatile storage, such as a Hard Disk Drive (HDD) or a Solid-State Drive (SSD), but the invention is not limited thereto.
The processing device 130 is connected to the image sensor 110 and the storage unit 120. The processing device 130 is used for performing various operations of the luminance uniformity detection system 100, and the details will be described later. In an embodiment of the invention, the Processing Device 130 is, for example, a Central Processing Unit (CPU), a Microprocessor (Microprocessor), a Digital Signal Processor (DSP), a Programmable controller, a Programmable Logic Device (PLD), or other similar devices or combinations thereof, but the invention is not limited thereto. In the embodiment of the present invention, the storage unit 120 may be integrated into the processing device 130, or may be independently built outside the processing device 130, and electrically connected or communicatively connected (e.g., via wi-fi, local area network, etc.) to the processing device 130 for the processing device 130 to access, which is not limited by the invention.
FIG. 3 is a flowchart illustrating a luminance uniformity detection method according to an embodiment of the present invention. The luminance uniformity detection method of fig. 3 is at least suitable for the luminance uniformity detection system of fig. 1 and fig. 2. The following describes details of the operation of the luminance uniformity detection system 100 and the luminance uniformity detection method according to an embodiment of the present invention with reference to fig. 1 to 3.
In step S310, an image including the object 10 is captured by the image sensor 110 to obtain a plurality of pieces of gray scale information in the image. Specifically, in an embodiment of the present invention, the image sensor 110 generates a digital signal from the sensed light and generates a corresponding image based on the digital signal. The digital signal represents the gray scale information in the image. The image captured by the image sensor 110 contains the object 10, but is not limited to the object 10. For example, the image captured by the image sensor 110 may also include a conveyor belt for transporting the object 10; the content of the image captured by the image sensor 110 is not limited in the present invention.
In step S320, the processing device 130 obtains, from the plurality of pieces of gray scale information of the image, a plurality of pieces of to-be-detected gray scale information corresponding to a plurality of detection positions on the object, according to the image and the detection position information. Referring to fig. 4, fig. 4 is a schematic view illustrating detection positions on an object to be detected according to an embodiment of the invention. In this embodiment, the number of detection positions is 13, evenly distributed over the object 10; however, the invention is not limited thereto, and in other embodiments the number of detection positions may be, for example, 9 or 25. In addition, the detection positions at least correspond to the positions that must be inspected for industrial quality control, and are established in advance by inspection personnel and stored in the storage unit 120. Accordingly, the processing device 130 can obtain the gray scale information corresponding to each detection position on the object as the to-be-detected gray scale information, according to the detection positions of the object and the plurality of pieces of gray scale information obtained in step S310.
In step S330, the processing device 130 obtains a plurality of pieces of estimated luminance information at the plurality of detection positions according to the plurality of luminance curves and the plurality of pieces of to-be-detected gray scale information, respectively. Due to errors of the image sensor 110, characteristics of the object 10, and other factors, different positions on the object 10 may exhibit different luminance under the same gray scale. Therefore, the storage unit 120 stores a luminance curve for each detection position, and the processing device 130 obtains the estimated luminance information corresponding to the to-be-detected gray scale information at each detection position according to that position's luminance curve and the gray scale information corresponding to that position.
As mentioned above, the luminance curve can be expressed as a mathematical function or as a table describing the luminance corresponding to each gray scale. If the luminance curve is recorded as a mathematical function, the processing device 130 calculates the estimated luminance information by inputting the to-be-detected gray scale information into the function. If the luminance curve is recorded as a table, the processing device 130 obtains the estimated luminance information by looking up the table, but the invention is not limited thereto.
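As a concrete illustration, the table form of a luminance curve can be sketched as a sorted list of (gray scale, luminance) pairs with linear interpolation between recorded entries. This is a minimal sketch under our own assumptions; the function name, the interpolation scheme, and the clamping behavior are not specified by this disclosure:

```python
def estimate_luminance(curve, gray):
    """curve: sorted list of (gray_level, luminance) pairs recorded for one
    detection position; gray: measured gray-scale value to look up."""
    # Clamp to the recorded range (assumption: no extrapolation).
    if gray <= curve[0][0]:
        return curve[0][1]
    if gray >= curve[-1][0]:
        return curve[-1][1]
    # Find the two surrounding table entries and interpolate linearly.
    for (g0, l0), (g1, l1) in zip(curve, curve[1:]):
        if g0 <= gray <= g1:
            t = (gray - g0) / (g1 - g0)
            return l0 + t * (l1 - l0)
```

A curve sampled every fourth gray level (as in the 64-level example later in this disclosure) would still answer queries for the intermediate levels through the interpolation step.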
In step S340, the processing device 130 determines whether the luminance of the object 10 is uniform according to the differences in the estimated luminance information among the detection positions of the object 10. In an embodiment of the invention, the processing device 130 determines whether the luminance of the object 10 is uniform according to the difference between the maximum estimated luminance and the minimum estimated luminance in the estimated luminance information of all the detection positions. If the difference between the maximum estimated luminance and the minimum estimated luminance does not exceed a certain threshold (e.g., the difference is less than 5%), the luminance of the object 10 is judged to be uniform; otherwise, it is judged to be non-uniform. However, the invention is not limited thereto.
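The uniformity judgment described above can be sketched as follows. The 5% threshold mirrors the example in this paragraph, and expressing the spread as a fraction of the maximum luminance is our assumption:

```python
def is_uniform(estimated_luminances, tolerance=0.05):
    """Judge uniformity from the spread between the maximum and minimum
    estimated luminance over all detection positions."""
    lum_max = max(estimated_luminances)
    lum_min = min(estimated_luminances)
    # Relative difference with respect to the maximum luminance (assumption).
    return (lum_max - lum_min) / lum_max <= tolerance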
It should be noted that, in the embodiment of the invention, the processing device 130 can obtain the to-be-detected gray scale information corresponding to several or all of the detection positions of the object 10 from a single image obtained by the image sensor 110, and accordingly obtain the corresponding estimated luminance information. There is no need to capture an image for each detection position, which effectively saves the time required for detection.
FIG. 5 is a detailed flowchart of a luminance uniformity detection method according to an embodiment of the present invention. With reference to fig. 1 to fig. 5, the following describes in more detail how, in the luminance uniformity detection system and the luminance uniformity detection method of the present invention, the processing device 130 obtains the gray scale information of the detection positions on the object to be detected from the gray scale information of the image according to the image and the detection position information.
In step S510, the processing device 130 executes a positioning procedure on the image to obtain a position of the object 10 in the image, and determines a position of the detection position in the image according to the position of the object 10 in the image and the detection position information. In detail, as mentioned above, besides the object 10, there may be other objects in the image sensed by the image sensor 110. Therefore, in this embodiment, the processing device 130 further performs an object positioning procedure on the image to find the position of the object 10 in the image.
Referring to fig. 5 and fig. 6, fig. 6 is a schematic image diagram illustrating an object positioning procedure according to an embodiment of the invention. The following describes a process of the processing device 130 performing an object positioning procedure on the image with reference to fig. 6.
First, as shown in (1) of fig. 6, the processing device 130 performs binarization processing on the image so that the image is represented by black and white.
As shown in fig. 6 (2), the processing device 130 performs noise removal on the binarized image to remove the portions not belonging to the object 10. In detail, in the image of fig. 6 (1), the panel corresponds to a complete quadrilateral block with a large area, while noise not belonging to the object 10 exists around it. The noise mostly consists of dots and does not form a complete, large quadrilateral region. Therefore, the processing device 130 can operate on the image to find the closed region (i.e., the complete quadrilateral area with a large area) and filter out the remaining portions belonging to noise (i.e., the four corners in fig. 6 (1)) to obtain the image region belonging to the object 10.
As shown in fig. 6 (3), the processing device 130 performs edge detection on a portion of the object 10 to obtain an edge of the corresponding object 10.
As shown in (4) of fig. 6, the processing device 130 further performs a straight line detection on the edge of the object 10 to find a straight line equation corresponding to the edge of the object 10. Since the object 10 is a panel, the processing device 130 obtains four straight line equations.
As shown in (5) of fig. 6, after obtaining the four line equations, the processing device 130 can obtain intersection positions by using the line equations corresponding to two adjacent edges, respectively, where the intersection positions are the vertex positions of the object 10 to be measured. Accordingly, the processing device 130 obtains the position of the object 10 in the image, and completes the object positioning procedure. At this time, since the position of the object 10 in the image is known, the processing device 130 can further obtain the positions of the detection positions on the object in the image according to the vertex position of the object 10. It should be noted that fig. 6 (1) to fig. 6 (5) are only schematic diagrams corresponding to the operation result of the processing device 130, and the image of fig. 6 is not necessarily shown in reality during the operation of the luminance uniformity detection system 100.
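The final step of the positioning procedure, intersecting the line equations of adjacent edges to obtain the vertex positions, can be sketched as below. The line representation a*x + b*y = c, the edge ordering, and the function names are our assumptions, not details from this disclosure:

```python
def line_intersection(l1, l2):
    """Intersect two lines given as (a, b, c) with a*x + b*y = c."""
    (a1, b1, c1), (a2, b2, c2) = l1, l2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-9:
        return None  # parallel edges: no vertex
    # Cramer's rule for the 2x2 linear system.
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return (x, y)

def panel_vertices(edges):
    """edges: four lines ordered around the panel (e.g. top, right,
    bottom, left). Each adjacent pair meets at one vertex."""
    return [line_intersection(edges[i], edges[(i + 1) % 4])
            for i in range(4)]
```

For an axis-aligned unit square, the four intersections recover the four corners, which matches the vertex positions described for fig. 6 (5).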
In step S520, a texture analysis procedure is performed on the image by the processing device 130 to obtain a texture image including the object 10. In detail, if the object 10 is contaminated during the manufacturing process, for example by fingerprints, flakes, or damage, the contamination will be removed by a cleaning process in subsequent manufacturing, but during detection it affects the detection result. Therefore, the processing device 130 executes the texture analysis procedure on the image to find the dirty positions on the panel, so as to prevent the processing device 130 from misjudging the gray scale information of the detection positions on the object 10 during detection. In the texture analysis procedure, the processing device 130 sequentially performs differential edge detection, sharpening, and binarization on the image.
In detail, the difference between dirt and its surroundings may appear on the object 10 as a gradual gradient, so the gray scale difference between the dirt and the surrounding area is not necessarily obvious. Accordingly, in the embodiment of the present invention, a Gaussian differential edge detector (based on a Gaussian filter) is used to obtain the edges of patterns present in the image. Since the principle of the Gaussian differential edge detector is known to those skilled in the art, it is not described herein. The processing device 130 further sharpens the image to highlight the pattern edges detected by the Gaussian differential edge detector. Finally, the processing device 130 binarizes the image so that it appears in black and white. Accordingly, the processing device 130 acquires a texture image including the object 10.
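A minimal sketch of this texture analysis chain is given below, using difference-of-Gaussians edge detection followed by binarization. The sigmas and the threshold are illustrative assumptions, and the separate sharpening step is folded into thresholding the DoG response, which is a simplification of ours rather than the procedure of this disclosure:

```python
import math

def gaussian_kernel(sigma):
    """Normalized 1-D Gaussian kernel with radius ~3*sigma."""
    radius = max(1, int(3 * sigma))
    k = [math.exp(-x * x / (2 * sigma * sigma))
         for x in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur(img, sigma):
    """Separable Gaussian blur on a 2-D list of gray values (edge-clamped)."""
    kern = gaussian_kernel(sigma)
    r = len(kern) // 2

    def conv_rows(src):
        n = len(src[0])
        return [[sum(kern[j + r] * row[min(max(x + j, 0), n - 1)]
                     for j in range(-r, r + 1)) for x in range(n)]
                for row in src]

    horiz = conv_rows(img)                                   # horizontal pass
    vert_t = conv_rows([list(c) for c in zip(*horiz)])       # vertical pass
    return [list(c) for c in zip(*vert_t)]

def texture_mask(img, sigma1=1.0, sigma2=2.0, thresh=1.0):
    """Binarized |DoG|: True marks pixels belonging to texture/dirt."""
    b1, b2 = blur(img, sigma1), blur(img, sigma2)
    return [[abs(p - q) > thresh for p, q in zip(r1, r2)]
            for r1, r2 in zip(b1, b2)]
```

On a perfectly uniform panel image the mask is empty, while a local dark spot (a gradual contamination) produces a nonzero response around it, which is the behavior the texture image in fig. 7 depicts.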
Referring to fig. 5 and 7, fig. 7 is an image schematic diagram illustrating the texture analysis procedure according to an embodiment of the invention. The left diagram of fig. 7 shows the gray scale information corresponding to the object 10 in the original image, and the right diagram of fig. 7 shows the texture image generated after the edge detection, sharpening, and binarization processes. The white blocks are portions where no defect exists, and the black portions indicate the texture corresponding to defects. It should be noted that, in this embodiment, the processing device 130 performs differential edge detection on the entire image, so other textures may also be present; for ease of understanding, the image shown in fig. 7 retains only the portion corresponding to the object 10, with no other texture. However, in other embodiments of the present invention, the processing device 130 may perform the texture analysis only on the block of the image occupied by the object 10, according to the position of the object 10 in the image obtained in step S510; the invention is not limited in this respect.
In step S530, the processing device 130 correlates the texture image and the image, and further obtains a texture pattern corresponding to the position of the object 10 in the image according to the position of the object in the image. FIG. 8 is a schematic diagram of associating a texture image and an image including an object under test according to an embodiment of the invention. Referring to fig. 5 and 8, after correlating the texture image and the image including the object to be tested, the processing device 130 can obtain the position of the dirt on the object to be tested 10.
In step S540, the processing device 130 determines whether there is texture at each detection position according to the texture pattern corresponding to the position of the object 10 in the image. In the embodiment of the present invention, the processing device 130 determines whether there is texture at the detection position itself. However, in other embodiments of the present invention, the processing device 130 may determine whether there is texture within a radius (e.g., within 10 pixels) centered on the detection position; the invention is not limited thereto.
If texture is present, measuring the luminance at that detection position could produce an error due to the defect. Accordingly, in step S550, the processing device 130 moves the detection position where the texture exists from a first position to a second position. For example, in this embodiment, the processing device 130 shifts such a detection position by one unit toward the center of the object 10. Taking the detection position at the top-left corner of fig. 8 as an example, if a defect is present there, the processing device 130 shifts the detection position one unit to the right and one unit downward, and performs step S540 again until there is no texture at the detection position. It should be noted that up, down, left, and right here refer only to directions in the drawings; the way the processing device 130 adjusts the detection position may vary with different embodiments and practical requirements, and the invention is not limited thereto.
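The relocation rule of step S550 can be sketched as an iterative one-unit shift toward the panel center. Here `has_texture` stands in for the texture-mask lookup of step S540, and the function names and the step cap are hypothetical:

```python
def relocate(pos, center, has_texture, max_steps=50):
    """Move a detection position one unit toward the panel center until the
    texture check is clear at that position (or the step budget runs out)."""
    x, y = pos
    cx, cy = center
    for _ in range(max_steps):
        if not has_texture(x, y):
            return (x, y)
        # Step one unit toward the center along each axis.
        x += 1 if cx > x else (-1 if cx < x else 0)
        y += 1 if cy > y else (-1 if cy < y else 0)
    return (x, y)
```

For a top-left detection position this yields exactly the "one unit right and one unit down, then re-check" behavior described above.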
However, if there is no texture, then in step S560 the processing device 130 obtains the average gray scale information within a predetermined radius of the texture-free detection position, and sets this average as the to-be-detected gray scale information corresponding to that detection position. Accordingly, the processing device 130 can further obtain the corresponding estimated luminance information according to the to-be-detected gray scale information.
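The averaging of step S560 can be sketched as follows; the circular neighborhood and the image-boundary clamping are our assumptions, since the disclosure only specifies "a predetermined radius":

```python
def average_gray(img, pos, radius):
    """Average gray value over the pixels within `radius` of `pos`.
    img: 2-D list of gray values; pos: (x, y) detection position."""
    cx, cy = pos
    total, count = 0.0, 0
    for y in range(max(0, cy - radius), min(len(img), cy + radius + 1)):
        for x in range(max(0, cx - radius), min(len(img[0]), cx + radius + 1)):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                total += img[y][x]
                count += 1
    return total / count
```

Averaging over a neighborhood rather than reading a single pixel makes the to-be-detected gray scale information less sensitive to sensor noise at the detection position.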
FIG. 9 is a flowchart illustrating a luminance uniformity detection method according to an embodiment of the present invention. Referring to fig. 1 to 3 and fig. 9, details of the luminance uniformity detection method and the luminance uniformity detection system for obtaining the luminance curve according to an embodiment of the present invention will be described below with reference to fig. 9.
In step S910, a sample image including a sample object is captured by the image sensor 110 to obtain a plurality of pieces of gray scale information in the sample image. Step S910 is the same as step S310, except that step S910 performs image sensing on the sample object while step S310 senses the object 10; details are therefore not repeated herein.
In step S920, the processing device 130 obtains actual luminance information of a plurality of detection positions on the sample object, and obtains a plurality of pieces of sample gray scale information in the sample image corresponding to the detection positions on the sample object, according to the sample image and the detection position information. In detail, in order to establish the real correspondence between the object under test and the image of it obtained by the image sensor 110, in an embodiment of the present invention the processing device 130 obtains the correspondence between gray scale and luminance under real conditions in advance. The processing device 130 must therefore first obtain the actual luminance information at a plurality of detection positions on the sample object. In this embodiment, the sample object is first measured by an automatic optical inspection machine equipped with a luminance measurement device to obtain the actual luminance information at each detection position on the sample object, and the actual luminance information of each detection position is transmitted to and established in the storage unit 120 in advance. Accordingly, the processing device 130 can access the storage unit 120, or directly perform calculations using the received actual luminance information of each detection position.
In addition, the processing device 130 captures a sample image including the sample object through the image sensor 110 to obtain sample gray scale information of the detection position on the corresponding sample object in the sample image. The process of acquiring the sample gray scale information of the corresponding detection position on the sample object by the processing device 130 is the same as the above steps S310 and S320, except that in steps S310 and S320, the to-be-detected gray scale information of the corresponding detection position is acquired for the object 10, and in step S920, the sample gray scale information of the corresponding detection position is acquired for the sample object. Accordingly, details are not set forth herein. In addition, in the present embodiment, the detection position of the sample object is consistent with the detection position of the object 10.
It should be noted that, in order to cope with the various luminance situations that may occur in the object to be tested 10, in the embodiment of the present invention, the actual luminance information of the sample object under a variety of different luminance situations and the corresponding sample gray scale information are collected. For example, in this embodiment, the sample object is driven at 64 sampled gray levels (i.e., the 0-to-255 gray scale range is sampled every four levels, and the corresponding driving current is applied for each sampled level), so that the real luminance information corresponding to the 64 gray levels is obtained through the automatic optical inspection machine. Meanwhile, through the cooperation of the image sensor 110 and the processing device 130, the sample gray scale information at each detection position corresponding to each piece of real luminance information is obtained.
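As a concrete illustration of the sampling described above, the 64 gray levels can be enumerated by stepping through the 0-to-255 range in steps of four; this sketch is ours, not part of the patent, and the function name is illustrative:

```python
# Enumerate the gray levels at which the sample object is driven and its
# real luminance measured: every fourth level of the full 0-255 range,
# which yields 256 / 4 = 64 sampled levels.
def sampled_gray_levels(full_range=256, step=4):
    """Return the list of sampled gray levels."""
    return list(range(0, full_range, step))

levels = sampled_gray_levels()  # [0, 4, 8, ..., 252], 64 levels in total
```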
In step S930, the processing device 130 executes a curve fitting procedure to determine, for each detection position on the sample object, luminance estimation curves relating the sample gray scale information to the real luminance information, and stores the luminance estimation curves corresponding to each detection position in the storage unit 120. Specifically, in the curve fitting procedure, the processing device 130 inputs the sample gray scale information and the corresponding real luminance information into candidate curve equations to find the luminance estimation curve equations relating the sample gray scale information to the real luminance information. Therefore, the correlation between the sample gray scale information and the real luminance information under various different luminance estimation curves can be found. The curve mathematical expressions used in the embodiment of the present invention may be, for example, polynomial functions (Polynomial), spline interpolation functions (Spline), exponential functions (Exponential), etc., but are not limited thereto. Examples of the polynomial functions include a linear function (Linear), a quadratic function (Quadratic), a cubic function (Cubic), a quartic function (Quartic), a quintic function (Quintic), and so on up to an nth-order polynomial function. Examples of the interpolation functions include a linear interpolation function (Linear), a Hermite interpolation function (Hermite), a Catmull-Rom interpolation function (Catmull-Rom), a cubic interpolation function (Cubic), an Akima interpolation function (Akima), and a monotone interpolation function (Monotone).
Examples of the exponential functions include a symmetric sigmoid function (Symmetric sigmoid), an asymmetric sigmoid function (Asymmetric sigmoid), a Michaelis-Menten kinetics function (Michaelis-Menten), a basic exponential function (Exponential basic), an exponential half-life decay function (Exponential half-life), an exponential growth function, a power function (Power curve), a normal distribution function (Gaussian curve), and the like. Moreover, in this embodiment, the processing device 130 can also use a deep learning method, such as but not limited to learning with a neural network, to estimate the luminance estimation curve relating the real luminance information to the sample gray scale information. In this embodiment, the processing device 130 uses all of the curve functions and the deep learning method described above to obtain curve mathematical expressions matching the sample gray scale information and the real luminance information at each detection position; that is, the processing device 130 obtains 29 kinds of luminance estimation curves. However, the invention is not limited to the above mathematical functions: any combination of the above curve functions, or other equations not described in detail herein, can be used as the mathematical function for estimating the relationship between the real luminance information and the sample gray scale information without departing from the present invention.
The processing device 130 obtains a plurality of luminance estimation curves at each detection position on the sample object. That is, in this embodiment, there are 29 kinds of luminance estimation curves for each detection position. The processing device 130 further stores the luminance estimation curve in the storage unit 120.
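As a rough sketch of the curve fitting procedure of step S930, the example below fits a few polynomial candidates with NumPy; the patent's full library of 29 candidate curves (splines, exponential families, and a neural network) is not reproduced, and all names are illustrative:

```python
import numpy as np

def fit_candidate_curves(gray, luminance, degrees=(1, 2, 3)):
    """Fit one candidate luminance estimation curve per polynomial degree.

    gray      -- sample gray scale values at one detection position
    luminance -- real luminance measured at the same gray levels
    Returns one fitted curve (a callable) per degree.
    """
    curves = []
    for d in degrees:
        coeffs = np.polyfit(gray, luminance, d)  # least-squares fit
        curves.append(np.poly1d(coeffs))
    return curves

# Toy data: luminance that is roughly quadratic in the gray level.
gray = np.array([0.0, 64.0, 128.0, 192.0, 255.0])
lum = 0.002 * gray ** 2 + 0.1 * gray
candidates = fit_candidate_curves(gray, lum)
```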
In step S940, a verification image including a verification object is captured by the image sensor 110 to obtain a plurality of gray scale information in the verification image including the verification object. Step S940 is the same as steps S310 and S910, except that image sensing is performed on the object to be tested 10 in step S310 and on the sample object in step S910, whereas here image sensing is performed on the verification object; details are therefore not repeated. It should be noted that, in the embodiment of the present invention, the object to be tested, the sample object and the verification object all belong to the same type of panel, backlight module or other type of object under test.
In step S950, the processing device 130 obtains the actual luminance information of the detection positions on the corresponding verification object, and obtains a plurality of verification gray scale information of the plurality of detection positions on the verification object according to the verification image, so as to determine a plurality of verification estimated luminance information of the detection positions on the corresponding verification object according to the plurality of luminance estimation curves of each detection position and the plurality of verification gray scale information. Step S950 is the same as step S920, except that step S920 obtains the actual luminance information and the sample gray scale information at each detection position for the sample object, whereas here the actual luminance information and the verification gray scale information at each detection position are obtained for the verification object; details are therefore not repeated. In the present embodiment, the detection positions of the verification object are consistent with the detection positions of the object to be tested 10 and of the sample object.
In step S960, the processing device 130 determines an error value of each luminance estimation curve corresponding to each detection position on the verification object according to the verification estimated luminance information and the real luminance information of the detection positions on the verification object, so as to set, for each detection position of the verification object, the luminance estimation curve having the minimum error value as the luminance curve corresponding to that detection position of the object to be tested.
In an embodiment of the invention, the process of obtaining the error value by the processing device 130 and the definition of the error value can be expressed as the following equation:
e_N = Σ_i ( L_Est,i − L_Gnd,i )²

where N denotes the N-th luminance estimation curve, e_N is the error value of that curve, L_Est,i is the estimated luminance information obtained by inputting the i-th verification gray scale value of the verification object at the detection position into the luminance estimation curve of the corresponding detection position, and L_Gnd,i is the real luminance information measured at the detection position by the automatic optical inspection machine for the same gray scale value of the verification object.
After obtaining the error value of each luminance estimation curve at a detection position, the processing device 130 determines the luminance estimation curve with the smallest error value to be the curve closest to the real behavior of that detection position, and accordingly adopts it as the luminance curve for that detection position. It should be noted that, at different detection positions, the luminance estimation curves obtained from the same mathematical function are not necessarily the same, and their corresponding error values are not necessarily the same either; the luminance estimation curve with the minimum error value may therefore differ from one detection position to another. The processing device 130 thus stores a corresponding luminance curve for each detection position, and during subsequent luminance measurement adopts the corresponding luminance curve to estimate the estimated luminance information at each detection position.
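The selection of step S960 can be sketched as follows; the sum-of-squared-differences error used here is an assumption consistent with the definitions above, and all names are illustrative:

```python
def select_luminance_curve(curves, verify_gray, verify_luminance):
    """Return the candidate curve with the smallest error on the
    verification data of one detection position.

    curves           -- candidate luminance estimation curves (callables)
    verify_gray      -- verification gray scale values at the position
    verify_luminance -- real luminance measured on the verification object
    """
    def error(curve):
        # Assumed metric: sum of squared differences between estimated
        # luminance (L_Est) and real luminance (L_Gnd).
        return sum((curve(g) - l) ** 2
                   for g, l in zip(verify_gray, verify_luminance))
    return min(curves, key=error)

# Two toy candidates; the first tracks the verification data more closely.
candidates = [lambda g: 2.0 * g, lambda g: 2.0 * g + 5.0]
best = select_luminance_curve(candidates, [10, 20, 30], [20.4, 39.8, 60.1])
```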
Referring to FIG. 10, FIG. 10 is a detailed flowchart illustrating a luminance uniformity detection method according to an embodiment of the invention. With reference to FIG. 10, the following describes in detail how the processing device 130 obtains the actual luminance information of the plurality of detection positions on the corresponding sample object and the sample gray scale information of those detection positions according to the sample image and the detection position information, and how the processing device 130 obtains the actual luminance information of the detection positions on the corresponding verification object and the verification gray scale information of those detection positions according to the verification image, so as to determine the verification estimated luminance information of the detection positions on the verification object according to the luminance estimation curves.
In step S1010, the processing device 130 executes an object positioning procedure to obtain the positions of the sample object and the verification object in the sample image and the verification image, respectively, and obtains, according to the detection position information, the positions in the sample image and the verification image of the detection positions on the sample object and on the verification object, respectively. The processing device 130 further performs binarization processing on the sample image and the verification image respectively to remove the portions not belonging to the sample object and the verification object, and performs edge detection on the portions corresponding to the sample object and the verification object respectively to obtain a plurality of vertex positions of the edges of the sample object and the verification object.
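A minimal NumPy sketch of this positioning idea: binarize the image and take the bounding-box corners of the bright region as the vertex positions. Real edge detection is more involved; this is an illustration under our own simplifying assumptions, not the patent's implementation:

```python
import numpy as np

def locate_object(image, threshold=128):
    """Binarize the image, then return the four corner positions of the
    bounding box of the bright (object) region as (row, col) tuples."""
    mask = image >= threshold            # binarization step
    rows, cols = np.nonzero(mask)        # coordinates of object pixels
    top, bottom = rows.min(), rows.max()
    left, right = cols.min(), cols.max()
    return [(top, left), (top, right), (bottom, left), (bottom, right)]

img = np.zeros((10, 10), dtype=np.uint8)
img[2:7, 3:8] = 200                      # a bright rectangular "object"
corners = locate_object(img)             # [(2, 3), (2, 7), (6, 3), (6, 7)]
```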
In step S1020, the processing device 130 performs a texture analysis procedure on the sample image and the verification image to obtain texture images including the sample object and the verification object, respectively. The processing device 130 further performs a differential edge detection procedure on the sample image and the verification image to obtain the image edges present in each, sharpens those image edges, and performs binarization processing on the sample image, the verification image and the sharpened image edges to obtain the respective texture images.
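The texture-analysis pipeline can be approximated as below with plain NumPy; the forward-difference edge response and the threshold value are our simplifications (the sharpening step is noted but omitted):

```python
import numpy as np

def texture_image(image, edge_thresh=30):
    """Approximate the texture analysis procedure: a differential edge
    response followed by binarization into a texture mask."""
    img = image.astype(float)
    # Differential edge detection: horizontal and vertical forward
    # differences (prepend keeps the output the same shape as the input).
    dx = np.abs(np.diff(img, axis=1, prepend=img[:, :1]))
    dy = np.abs(np.diff(img, axis=0, prepend=img[:1, :]))
    edges = dx + dy
    # (A fuller pipeline would also sharpen the edges before thresholding.)
    return (edges >= edge_thresh).astype(np.uint8)

img = np.zeros((5, 5), dtype=np.uint8)
img[:, 2:] = 100                         # a vertical edge at column 2
mask = texture_image(img)                # mask is 1 along that edge
```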
In step S1030, the processing device 130 associates the sample image with the texture image including the sample object and the verification image with the texture image including the verification object, and obtains the texture pattern of the corresponding sample object in the sample image and the texture pattern of the corresponding verification object in the verification image according to the positions of the sample object in the sample image and of the verification object in the verification image, respectively.
In step S1040, the processing device 130 determines whether a texture exists at each detection position of the sample object and at each detection position of the verification object according to the texture patterns of the sample object in the sample image and of the verification object in the verification image, respectively.
In step S1050, if a texture exists at one of the detection positions of the sample object or at one of the detection positions of the verification object, the processing device 130 moves that detection position of the sample object or the verification object from a first position to a second position. In one embodiment, when a texture exists at one of the detection positions of the sample object, that detection position of the sample object is moved from the first position to the second position. Meanwhile, the detection position of the verification object corresponding to the moved detection position of the sample object is also moved from the first position to the second position.
In step S1060, if no texture exists at one of the detection positions of the sample object or at one of the detection positions of the verification object, the processing device 130 further obtains the average gray scale information within a corresponding predetermined radius around the texture-free detection position of the sample object or of the verification object, so as to set the average gray scale information as the sample gray scale information of that detection position of the sample object or as the verification gray scale information of that detection position of the verification object. In one embodiment, when no texture exists at one of the detection positions of the sample object but a texture exists at the corresponding detection position of the verification object, only the detection position of the verification object where the texture exists is moved from the first position to the second position.
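The texture-free averaging of step S1060 can be sketched as follows, assuming a circular neighborhood of the predetermined radius; names are illustrative:

```python
import numpy as np

def average_gray(image, center, radius):
    """Mean gray value inside a circle of `radius` pixels around
    `center`, given as a (row, col) detection position."""
    rows, cols = np.indices(image.shape)
    inside = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius ** 2
    return image[inside].mean()

img = np.full((9, 9), 50, dtype=np.uint8)  # uniformly gray test image
avg = average_gray(img, center=(4, 4), radius=2)  # 50.0 for uniform input
```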
The details of steps S1010 to S1060 are the same as those of steps S510 to S560, the only difference being that steps S510 to S560 process the object to be tested 10 to obtain the gray scale information to be tested at the plurality of detection positions on the object 10, whereas steps S1010 to S1060 process the sample object and the verification object to obtain the sample gray scale information and the verification gray scale information at the plurality of detection positions on the sample object and the verification object. Therefore, the details of steps S1010 to S1060 are not repeated herein.
In summary, the luminance uniformity detection system and the luminance uniformity detection method of the present invention can obtain the gray scale information to be detected corresponding to the plurality of detection positions of the object to be detected according to the image obtained by the image sensor, and further obtain the corresponding estimated luminance information. Since the luminance of each detection position does not need to be measured one by one, the luminance uniformity detection method of the luminance uniformity detection system can effectively save the time consumed by detection. In addition, the luminance uniformity detection system and the luminance uniformity detection method can automatically position the object, without separately positioning each detection position and without requiring a tester to confirm the relationship between the automatic optical inspection machine and the detection positions of the panel to be detected, so the manpower required for detection is reduced. Moreover, the system and the method of the present invention adopt a variety of luminance estimation curves and, with the auxiliary verification of the verification object, find the luminance curve closest to each different detection position, thereby providing more accurate estimated luminance results.
Although the present invention has been described with reference to the above embodiments, it should be understood that various changes and modifications can be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (30)

1. A system for detecting brightness uniformity comprises an image sensor, a storage unit and a processing device,
the image sensor captures an image including an object to be detected so as to acquire a plurality of gray scale information in the image;
the storage unit stores a plurality of luminance curves corresponding to the detection position information;
the processing device is connected to the image sensor and the storage unit, acquires a plurality of pieces of gray scale information corresponding to a plurality of detection positions on the object to be detected from the plurality of pieces of gray scale information of the image according to the image and the detection position information, respectively acquires a plurality of pieces of estimated luminance information at the plurality of detection positions according to the plurality of luminance curves and the plurality of pieces of gray scale information to be detected, and judges whether the luminance of the object to be detected is uniform according to the difference of the plurality of pieces of estimated luminance information among the plurality of detection positions of the object to be detected, wherein the image sensor further captures a sample image including a sample object to acquire the plurality of pieces of gray scale information in the sample image including the sample object,
the processing device further obtains a plurality of real luminance information corresponding to a plurality of detection positions on the sample object, and the processing device obtains a plurality of sample gray scale information corresponding to the plurality of detection positions on the sample object in the sample image according to the sample image and the plurality of detection positions,
the processing device further executes a curve fitting procedure to determine a plurality of luminance estimation curves corresponding to the plurality of sample gray scale information and the plurality of real luminance information at each of the plurality of detection positions on the sample object, and stores the plurality of luminance estimation curves in the storage unit,
the image sensor further captures a verification image including a verification object to obtain the plurality of gray scale information in the verification image including the verification object,
the processing device further obtains the real luminance information corresponding to the detection positions on the verification object in the verification image, and the processing device obtains verification gray-scale information corresponding to the detection positions on the verification object in the verification image according to the verification image to determine verification estimated luminance information corresponding to the detection positions on the verification object according to the luminance estimation curves and the verification gray-scale information,
the processing device further determines an error value of each of the plurality of luminance estimation curves corresponding to each of the plurality of detection positions on the verification object according to the plurality of verification estimated luminance information and the plurality of real luminance information corresponding to the plurality of detection positions on the verification object, respectively, so as to set one of the plurality of luminance estimation curves having the smallest error value in each of the plurality of detection positions of the verification object as each of the plurality of luminance curves corresponding to each of the plurality of detection positions of the object to be tested.
2. The system according to claim 1, wherein the testing position information comprises the testing positions, the storage unit stores each of the plurality of luminance curves corresponding to each of the testing positions, and the processing device obtains each of the estimated luminance information corresponding to each of the testing positions according to each of the plurality of luminance curves corresponding to each of the testing positions and each of the gray scale information to be tested.
3. The luminance uniformity detection system according to claim 1,
the processing device further executes an object positioning program on the image to acquire the position of the object to be detected in the image, and judges the positions of the detection positions in the image according to the position of the object to be detected in the image and the detection position information.
4. The luminance uniformity detection system according to claim 3, wherein, in the object localization program,
the processing device also carries out binarization processing on the image so as to remove parts which do not belong to the object to be detected, and carries out edge detection on the parts corresponding to the object to be detected so as to obtain a plurality of vertex positions of the edge of the object to be detected.
5. The luminance uniformity detection system according to claim 3,
the processing device further executes a texture analysis program on the image to acquire a texture image including the object to be measured,
the processing device is also associated with the texture image and the image, and acquires a texture pattern corresponding to the position of the object to be detected in the image according to the position of the object to be detected in the image.
6. The luminance uniformity detection system according to claim 5, wherein, in the texture analysis program,
the processing device further performs a differential edge detection procedure on the image to obtain a graphic edge existing on the image,
and the processing device also sharpens the image edge and performs binarization processing on the image and the sharpened image edge to obtain the texture image.
7. The system according to claim 5, wherein the processing device further determines whether a texture exists at the plurality of detection positions according to the texture pattern corresponding to the position of the object in the image, and moves one of the plurality of detection positions where the texture exists from a first position to a second position when the texture exists at the one of the plurality of detection positions.
8. The system according to claim 7, wherein when the texture does not exist at one of the plurality of detection positions, average gray scale information with a corresponding predetermined radius is obtained according to each of the plurality of detection positions where the texture does not exist, so as to set the average gray scale information as each of the plurality of to-be-measured gray scale information corresponding to each of the plurality of detection positions where the texture does not exist.
9. The luminance uniformity detection system according to claim 1,
the processing device further executes an object positioning program on the sample image and the verification image to respectively acquire positions of the sample object and the verification object in the sample image and the verification image, and respectively acquires positions of the plurality of detection positions on the sample object and the plurality of detection positions on the verification object in the sample image and the verification image according to the detection position information.
10. The luminance uniformity detection system according to claim 9, wherein, in the object localization program,
the processing device further performs binarization processing on the sample image and the verification image respectively to remove parts which do not belong to the sample object and the verification object, and performs edge detection on the parts corresponding to the sample object and the verification object respectively to obtain a plurality of vertex positions of the edges of the sample object and the verification object.
11. The luminance uniformity detection system according to claim 9,
the processing device further executes a texture analysis program on the sample image and the verification image to respectively acquire texture images including the sample object and the verification object,
the processing device is further used for associating the sample image with the texture image comprising the sample object, the verification image with the texture image comprising the verification object, and respectively obtaining the corresponding sample object and the texture pattern of the corresponding verification object in the sample image and the verification image according to the position of the sample object in the sample image and the position of the verification object in the verification image.
12. The luminance uniformity detection system according to claim 11, wherein, in the texture analysis program,
the processing device further performs a differential edge detection procedure on the sample image and the verification image respectively to obtain the graphic edges existing on the sample image and the verification image respectively,
the processing device also sharpens the image edges of the sample image and the verification image, and performs binarization processing on the sample image, the verification image and the sharpened image edges to respectively obtain the texture images corresponding to the sample image and the verification image.
13. The system according to claim 11, wherein the processing device further determines whether there is a texture at the plurality of detection positions of the sample object and at the plurality of detection positions of the verification object according to the texture patterns corresponding to the positions of the sample object in the sample image and of the verification object in the verification image, respectively, and moves one of the plurality of detection positions where the texture exists in the sample object or the verification object from a first position to a second position when the texture exists at one of the plurality of detection positions of the sample object or at one of the plurality of detection positions of the verification object.
14. The system according to claim 13, wherein the processing device further obtains average gray-scale information with a predetermined radius from the plurality of detection positions of the sample object where the texture is not present or from the plurality of detection positions of the verification object where the texture is not present, when the texture is not present at one of the plurality of detection positions of the sample object or at one of the plurality of detection positions of the verification object where the texture is not present, to set the average gray-scale information as the sample gray-scale information corresponding to the plurality of detection positions of the sample object where the texture is not present or the verification gray-scale information corresponding to the plurality of detection positions of the verification object where the texture is not present.
15. The system according to claim 1, wherein the plurality of luminance estimation curves corresponding to each of the plurality of detection positions on the sample object are plural.
16. A method for detecting uniformity of luminance is used for detecting an object to be detected, and comprises the following steps:
capturing an image comprising the object to be detected to acquire a plurality of gray scale information in the image;
acquiring a plurality of pieces of to-be-detected gray scale information corresponding to a plurality of detection positions on the to-be-detected object from the plurality of pieces of gray scale information of the image according to the image and the detection position information;
respectively acquiring a plurality of estimated luminance information at a plurality of detection positions according to a plurality of luminance curves corresponding to the detection position information and the plurality of to-be-detected gray scale information;
judging whether the brightness of the object to be detected is uniform or not according to the difference of the estimated brightness information among the detection positions of the object to be detected; and
capturing a sample image including a sample object to obtain the plurality of gray scale information in the sample image including the sample object;
acquiring a plurality of real luminance information corresponding to a plurality of detection positions on the sample object, and acquiring a plurality of sample gray scale information corresponding to the plurality of detection positions on the sample object in the sample image according to the sample image and the detection positions;
executing a curve fitting program to determine a plurality of luminance estimation curves corresponding to the plurality of sample gray scale information and the plurality of real luminance information at each of the plurality of detection positions on the sample object, and storing the plurality of luminance estimation curves;
capturing a verification image comprising a verification object to acquire the plurality of gray scale information in the verification image comprising the verification object;
obtaining the real luminance information of the detection positions on the verification object in the verification image, and obtaining verification gray scale information of the detection positions on the verification object in the verification image according to the verification image, so as to determine verification estimated luminance information of the detection positions on the verification object according to the luminance estimation curves and the verification gray scale information; and
determining an error value of each of the plurality of luminance estimation curves corresponding to each of the plurality of detection positions on the verification object according to the plurality of verification estimated luminance information and the plurality of real luminance information corresponding to the plurality of detection positions on the verification object, respectively, so as to set one of the plurality of luminance estimation curves having the smallest error value in each of the plurality of detection positions of the verification object as each of the plurality of luminance curves corresponding to each of the plurality of detection positions of the object to be tested.
17. The method as claimed in claim 16, wherein the detection position information comprises the plurality of detection positions, and at least one of the plurality of luminance curves is associated with each of the plurality of detection positions, and the step of obtaining the estimated luminance information at the detection positions according to the luminance curves and the gray scale information comprises:
and obtaining each estimated brightness information of each detection position according to each brightness curve corresponding to each detection position and each gray scale information to be detected.
18. The luminance uniformity detection method according to claim 16, further comprising:
executing an object positioning program on the image to acquire the position of the object to be detected in the image; and
determining the positions of the plurality of detection positions in the image according to the position of the object to be detected in the image and the detection position information.
19. The method according to claim 18, wherein the step of performing the object-locating procedure on the image comprises:
performing binarization processing on the image to remove portions that do not belong to the object to be detected; and
performing edge detection on the portion corresponding to the object to be detected, so as to obtain a plurality of vertex positions of the edge of the object to be detected.
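A minimal sketch of the object positioning program of this claim, assuming an 8-bit grayscale image in which the object to be detected is brighter than the background (a lit panel on a dark stage). Reducing vertex detection to the corners of the foreground bounding box is a simplification of the edge detection the claim describes; the threshold and image contents are hypothetical.

```python
import numpy as np

# Hypothetical captured image with a bright rectangular object region.
image = np.zeros((100, 100), dtype=np.uint8)
image[20:80, 30:90] = 180

# Binarization: keep only pixels bright enough to belong to the object.
mask = image > 128

# Vertex extraction, reduced here to the corners of the foreground's
# bounding box (a real system would trace the contour of the object edge).
rows, cols = np.nonzero(mask)
top, bottom = rows.min(), rows.max()
left, right = cols.min(), cols.max()
vertices = [(top, left), (top, right), (bottom, left), (bottom, right)]
```

Once the vertices are known, the detection position information (typically expressed relative to the object) can be mapped into image coordinates, as the next claim step describes.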
20. The luminance uniformity detection method according to claim 18, further comprising:
executing a texture analysis program on the image to obtain a texture image comprising the object to be detected; and
associating the texture image with the image, and acquiring a texture pattern corresponding to the position of the object to be detected according to the position of the object to be detected in the image.
21. The luminance uniformity detection method according to claim 20, wherein the step of performing the texture analysis procedure on the image further comprises:
performing a differential edge measurement program on the image to obtain the pattern edges present in the image; and
sharpening the pattern edges, and performing binarization processing on the image together with the sharpened pattern edges to obtain the texture image.
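The texture analysis steps can be sketched under simplifying assumptions: here the differential edge measurement is taken as a first-difference gradient magnitude, sharpening as re-adding the edge response onto the image, and the binarization threshold is an arbitrary illustrative value, none of which is specified by the patent.

```python
import numpy as np

# Hypothetical image with a vertical texture boundary at column 32.
image = np.zeros((64, 64), dtype=float)
image[:, 32:] = 200.0

# Differential edge measurement: gradient magnitude via first differences.
gy, gx = np.gradient(image)
edges = np.hypot(gx, gy)

# Sharpen by re-adding the edge response, then binarize the edge
# response to obtain a texture image marking where texture exists.
sharpened = image + edges
texture_image = (edges > 50.0).astype(np.uint8)
```

The resulting `texture_image` is the binary map that later claim steps consult to decide whether a detection position lands on texture.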
22. The method as claimed in claim 20, wherein the step of associating the texture image with the image and obtaining the texture pattern corresponding to the position of the object in the image according to the position of the object in the image further comprises:
determining whether the texture exists at each of the plurality of detection positions according to the texture pattern corresponding to the position of the object to be detected in the image; and
when the texture exists in one of the plurality of detection positions, moving the one of the plurality of detection positions where the texture exists from a first position to a second position.
23. The luminance uniformity detection method according to claim 22, further comprising:
when the texture does not exist at one of the detection positions, acquiring average gray scale information within a corresponding preset radius of each detection position where the texture does not exist; and
setting each piece of the average gray scale information as the gray scale information to be detected corresponding to each detection position where the texture does not exist.
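The averaging step can be sketched as below; the function name, the circular neighborhood, and the radius value are illustrative assumptions rather than details fixed by the patent, which only specifies a "preset radius".

```python
import numpy as np

def average_gray_in_radius(image, center, radius):
    """Mean gray level of all pixels within `radius` of `center` (row, col)."""
    rows, cols = np.ogrid[:image.shape[0], :image.shape[1]]
    mask = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius ** 2
    return float(image[mask].mean())

# Hypothetical texture-free neighborhood of uniform gray level 120.
image = np.full((50, 50), 120.0)
avg = average_gray_in_radius(image, center=(25, 25), radius=5)
```

Averaging over a neighborhood rather than reading a single pixel makes the gray scale information robust to sensor noise, which is presumably why the claim specifies a radius at all.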
24. The luminance uniformity detection method according to claim 16, further comprising:
executing an object positioning program on the sample image and the verification image to respectively obtain the positions of the sample object and the verification object in the sample image and the verification image, and respectively obtaining the positions of the plurality of detection positions on the sample object and the plurality of detection positions on the verification object in the sample image and the verification image according to the detection position information.
25. The method according to claim 24, wherein the step of performing the object-locating procedure on the sample image and the verification image further comprises:
performing binarization processing on the sample image and the verification image respectively, so as to remove portions that do not belong to the sample object and the verification object; and
performing edge detection on the portions corresponding to the sample object and the verification object respectively, so as to obtain a plurality of vertex positions of the edges of the sample object and the verification object.
26. The luminance uniformity detection method according to claim 24, further comprising:
executing a texture analysis program on the sample image and the verification image to respectively obtain texture images comprising the sample object and the verification object; and
associating the sample image with the texture image comprising the sample object, and the verification image with the texture image comprising the verification object, and respectively acquiring the texture patterns corresponding to the sample object in the sample image and to the verification object in the verification image according to the position of the sample object in the sample image and the position of the verification object in the verification image.
27. The luminance uniformity detection method according to claim 26, wherein the step of performing the texture analysis procedure on the sample image and the verification image further comprises:
performing a differential edge measurement program on the sample image and the verification image respectively, so as to obtain the pattern edges in each image; and
sharpening the pattern edges of the sample image and the verification image, and performing binarization processing on the sample image, the verification image, and the sharpened pattern edges, so as to obtain the texture images corresponding to the sample image and the verification image, respectively.
28. The luminance uniformity detection method according to claim 26, further comprising:
judging whether textures exist at the plurality of detection positions of the sample object and at the plurality of detection positions of the verification object according to the texture patterns corresponding to the positions of the sample object in the sample image and the positions of the verification object in the verification image respectively; and
when the texture exists at one of the plurality of detection positions in the sample object or the verification object, moving that detection position from a first position to a second position.
29. The luminance uniformity detection method according to claim 28, further comprising:
when the texture does not exist at one of the plurality of detection positions of the sample object or at one of the plurality of detection positions of the verification object, acquiring average gray scale information within a corresponding preset radius of each such detection position where the texture does not exist; and
setting the average gray scale information as the sample gray scale information corresponding to the detection positions of the sample object where the texture does not exist, or as the verification gray scale information corresponding to the detection positions of the verification object where the texture does not exist.
30. The method as claimed in claim 29, wherein a plurality of the luminance estimation curves correspond to each of the plurality of detection positions on the sample object.
CN201910098660.0A 2019-01-31 2019-01-31 Brightness uniformity detection system and brightness uniformity detection method Active CN111504608B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910098660.0A CN111504608B (en) 2019-01-31 2019-01-31 Brightness uniformity detection system and brightness uniformity detection method
TW108109555A TWI757590B (en) 2019-01-31 2019-03-20 Luminance uniform detection system and luminance uniform detection method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910098660.0A CN111504608B (en) 2019-01-31 2019-01-31 Brightness uniformity detection system and brightness uniformity detection method

Publications (2)

Publication Number Publication Date
CN111504608A CN111504608A (en) 2020-08-07
CN111504608B true CN111504608B (en) 2022-10-04

Family

ID=71877458

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910098660.0A Active CN111504608B (en) 2019-01-31 2019-01-31 Brightness uniformity detection system and brightness uniformity detection method

Country Status (2)

Country Link
CN (1) CN111504608B (en)
TW (1) TWI757590B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113634509A (en) * 2021-07-01 2021-11-12 苏州博测泰自动化设备有限公司 Automatic testing method of LED luminance machine

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1705380A (en) * 2004-05-31 2005-12-07 光道视觉科技股份有限公司 Video detection device
TW200931000A (en) * 2008-01-02 2009-07-16 Univ Nat Kaohsiung Applied Sci Neural network-based two-dimensional luminance measurement method
CN101833398A (en) * 2009-03-09 2010-09-15 李锦峰 Position detecting device and method thereof

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1141665C (en) * 2002-06-07 2004-03-10 清华大学 Micro image characteristic extracting and recognizing method
KR100505365B1 (en) * 2003-07-03 2005-08-03 주식회사 한택 Apparatus and method for inspecting display panel using pixel interpolation
JPWO2010119588A1 (en) * 2009-04-16 2012-10-22 シャープ株式会社 Luminance characteristic measuring method, luminance characteristic measuring apparatus, signal processing system thereof, and display device including the signal processing system
CN103136518A (en) * 2013-03-06 2013-06-05 安徽云盛信息科技有限公司 Preprocessing algorithm of fingerprint image automatic identification system
JP6377011B2 (en) * 2014-06-19 2018-08-22 株式会社イクス Luminance measuring method, luminance measuring apparatus, and image quality adjustment technology using them
CN105241638A (en) * 2015-09-09 2016-01-13 重庆平伟光电科技有限公司 Vision-based quick LED module brightness uniformity detection method
CN105301810A (en) * 2015-11-24 2016-02-03 上海斐讯数据通信技术有限公司 Screen defect detecting method and screen defect detecting device
CN106846313B (en) * 2017-01-23 2020-03-10 广东工业大学 Workpiece surface defect detection method and device
CN107845087B (en) * 2017-10-09 2020-07-03 深圳市华星光电半导体显示技术有限公司 Method and system for detecting uneven brightness defect of liquid crystal panel
CN108510965B (en) * 2018-05-03 2019-10-11 武汉天马微电子有限公司 Display brightness compensation method, device and system
CN109101854A (en) * 2018-06-25 2018-12-28 华南理工大学 A kind of multiple barcode localization method
CN108877652A (en) * 2018-06-27 2018-11-23 武汉华星光电半导体显示技术有限公司 Optical compensation method and OLED display

Also Published As

Publication number Publication date
CN111504608A (en) 2020-08-07
TWI757590B (en) 2022-03-11
TW202030465A (en) 2020-08-16

Similar Documents

Publication Publication Date Title
CN111970506B (en) Lens dirt detection method, device and equipment
US7012701B2 (en) Measuring for device for contactless measurement of tires
WO2017020829A1 (en) Resolution testing method and resolution testing device
US11158039B2 (en) Using 3D vision for automated industrial inspection
CN113850749B (en) Method for training defect detector
EP2985566B1 (en) Data generation method and data generation apparatus
CN103760165A (en) Defect detecting method and device of display panel
CN112070751A (en) Wood floor defect detection method and device
CN106504231A (en) Component defect detection method and system
CN103852034A (en) Elevator guide rail perpendicularity detection method
KR102242996B1 (en) Method for atypical defects detect in automobile injection products
CN115375610A (en) Detection method and device, detection equipment and storage medium
TWI618926B (en) Method and system for improving wafer surface inspection sensitivity
TWI429900B (en) A method of detecting a bright spot defect and a threshold value generating method and the device thereof
CN117589109B (en) Quality detection and control method for quartz center tray manufacturing process
CN111504608B (en) Brightness uniformity detection system and brightness uniformity detection method
CN117830251A (en) Defect analysis method, defect analysis device and electronic equipment
CN112802022A (en) Method for intelligently detecting defective glass image, electronic device and storage medium
CN105572133A (en) Flaw detection method and device
CN108629813A (en) A kind of acquisition methods, the device of projection device elevation information
KR20160097651A (en) Apparatus and Method for Testing Pattern of Sample using Validity Image Processing Technique, and computer-readable recording medium with program therefor
JP6681082B2 (en) Information processing apparatus, information system, information processing method, and program
CN115494659A (en) Liquid crystal panel detection method and system
TWI493177B (en) Method of detecting defect on optical film with periodic structure and device thereof
JP2007081513A (en) Blot defect detecting method for solid-state imaging element

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant