WO2020253827A1 - Method and apparatus for evaluating image acquisition accuracy, electronic device and storage medium - Google Patents

Method and apparatus for evaluating image acquisition accuracy, electronic device and storage medium

Info

Publication number
WO2020253827A1
Authority
WO
WIPO (PCT)
Prior art keywords
test point
test
image
demura
center
Prior art date
Application number
PCT/CN2020/097122
Other languages
English (en)
French (fr)
Inventor
王斌
Original Assignee
京东方科技集团股份有限公司
成都京东方光电科技有限公司
Application filed by 京东方科技集团股份有限公司 (BOE Technology Group Co., Ltd.) and 成都京东方光电科技有限公司 (Chengdu BOE Optoelectronics Technology Co., Ltd.)
Priority to EP20824412.9A (EP3989542A4)
Priority to US17/256,072 (US11314979B2)
Publication of WO2020253827A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757Matching configurations of points or features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/211Selection of the most significant subset of features
    • G06F18/2113Selection of the most significant subset of features by ranking or filtering the set of features, e.g. using a measure of variance or of feature cross-correlation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/217Validation; Performance evaluation; Active pattern learning techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/001Industrial image inspection using an image reference approach
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/28Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/758Involving statistics of pixels or of feature values, e.g. histogram matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761Proximity, similarity or dissimilarity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/776Validation; Performance evaluation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/98Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V10/993Evaluation of the quality of the acquired pattern
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00Diagnosis, testing or measuring for television systems or their details
    • H04N17/004Diagnosis, testing or measuring for television systems or their details for digital television systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection
    • G06T2207/30121CRT, LCD or plasma display
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker
    • G06T2207/30208Marker matrix
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/62Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/02Recognising information on displays, dials, clocks

Definitions

  • the present disclosure relates to the field of display technology, and in particular to a method and device for evaluating the image acquisition accuracy of a Demura device, an electronic device, and a non-transitory computer-readable storage medium.
  • the driving circuit of an Organic Light-Emitting Diode (OLED) display device may include a plurality of thin film transistors. Due to the limitations of the crystallization process, the thin film transistors at different positions often have non-uniformities in electrical parameters such as threshold voltage and mobility, which makes the display panel of the display device prone to local mura. In order to improve the display effect, it is necessary to perform Demura compensation on the display panel.
  • the embodiments of the present disclosure provide a method and apparatus for evaluating the image acquisition accuracy of a Demura device, an electronic device, and a non-transitory computer-readable storage medium.
  • the first aspect of the present disclosure provides a method for evaluating the image acquisition accuracy of a Demura device, including:
  • controlling a display panel to display a detection picture, the detection picture including a plurality of spaced test point patterns; performing image acquisition on the detection picture with the Demura device to obtain a preprocessed image corresponding to the detection picture, the preprocessed image being identical in size and shape to the corresponding detection picture; and
  • determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image.
  • in some embodiments, determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image includes: performing low-pass filtering on the preprocessed image; binarizing the low-pass filtered image to obtain a binarized image including a plurality of test spots in one-to-one correspondence with the plurality of test point patterns; and determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the position of the corresponding test spot in the binarized image.
  • in some embodiments, determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the position of the corresponding test spot in the binarized image includes: obtaining the coordinates of the center of each test point pattern in a preset coordinate system and the coordinates of the center of the corresponding test spot in the preset coordinate system; calculating the offset distance between each test point pattern and its corresponding test spot from these coordinates; forming a set of offset distances and calculating the average value and standard deviation of the set; and determining the image acquisition accuracy of the Demura device according to the average value and standard deviation of the set of offset distances.
  • the offset distance D between each test point pattern and the corresponding test spot is calculated according to the following formula: D = √(x² + y²), where
  • x is the difference between the abscissa of the center of the test point pattern in the preset coordinate system and the abscissa of the center of the corresponding test spot in the preset coordinate system;
  • y is the difference between the ordinate of the center of the test point pattern in the preset coordinate system and the ordinate of the center of the corresponding test spot in the preset coordinate system.
  • the gray scale of each of the plurality of test point patterns is between 95 and 255; the gray scale of positions in the detection picture other than the plurality of test point patterns is between 0 and 50.
  • each of the test point patterns is a single pixel point.
  • the multiple test point patterns are uniformly arranged in an array.
  • the interval between each adjacent two test point patterns of the plurality of test point patterns included in the detection frame is equal.
  • the gray level of each of the plurality of test point patterns is 225; the gray level of other positions in the inspection frame except for the plurality of test point patterns is 31.
  • a second aspect of the present disclosure provides an apparatus for evaluating the image acquisition accuracy of a Demura device, including:
  • a controller configured to control a display panel to display a detection picture, the detection picture including a plurality of spaced test point patterns;
  • an acquiring component configured to acquire a preprocessed image corresponding to the detection picture, the preprocessed image being identical in size and shape to the corresponding detection picture; and
  • a determining component configured to determine the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image.
  • the determining component includes:
  • a filtering unit configured to perform low-pass filtering on the preprocessed image
  • the binarization unit is configured to perform binarization processing on the low-pass filtered image to obtain a binarized image, the binarized image including a plurality of test spots in one-to-one correspondence with the plurality of test point patterns;
  • the determining unit is configured to determine the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection frame and the position of the corresponding test spot in the binarized image.
  • the determining unit includes:
  • a coordinate obtaining subunit configured to obtain the coordinates of the center of each test point pattern in a preset coordinate system and the coordinates of the center of the test spot corresponding to each test point pattern in the preset coordinate system, wherein the detection picture and the binarized image have the same coverage in the preset coordinate system;
  • a first calculation subunit configured to calculate the offset distance between each of the plurality of test point patterns and its corresponding test spot according to the coordinates of the center of each test point pattern and the coordinates of the center of the corresponding test spot;
  • a second calculation subunit configured to form a set of offset distances from the offset distances between the test point patterns and their corresponding test spots, and to calculate the average value and standard deviation of the set of offset distances; and
  • a determining subunit configured to determine the image acquisition accuracy of the Demura device according to the average value and standard deviation of the set of offset distances.
  • the first calculation subunit is configured to calculate the offset distance D between each test point pattern and the corresponding test spot according to the following formula: D = √(x² + y²), where
  • x is the difference between the abscissa of the center of the test point pattern in the preset coordinate system and the abscissa of the center of the corresponding test spot in the preset coordinate system;
  • y is the difference between the ordinate of the center of the test point pattern in the preset coordinate system and the ordinate of the center of the corresponding test spot in the preset coordinate system.
  • the gray scale of each of the plurality of test point patterns is between 95 and 255; the gray scale of positions in the detection picture other than the plurality of test point patterns is between 0 and 50.
  • each of the test point patterns is a single pixel point.
  • the multiple test point patterns are uniformly arranged in an array.
  • the interval between each adjacent two test point patterns of the plurality of test point patterns included in the detection frame is equal.
  • the gray level of each of the plurality of test point patterns is 225; the gray level of other positions in the inspection frame except for the plurality of test point patterns is 31.
  • a third aspect of the present disclosure provides an electronic device, including:
  • one or more processors; and
  • a storage device having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of the embodiments of the first aspect of the present disclosure.
  • the fourth aspect of the present disclosure provides a non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of the embodiments of the first aspect of the present disclosure.
  • FIG. 1 is a flowchart of a method for evaluating the image acquisition accuracy of a Demura device according to an embodiment of the disclosure
  • 1A is a schematic diagram of a plurality of test point patterns in a detection screen provided by an embodiment of the disclosure
  • FIG. 1B is a schematic diagram of a preprocessed image obtained after image collection of a detection screen by a Demura device according to an embodiment of the disclosure
  • FIG. 2 is a flowchart of an alternative implementation manner of step S3 shown in FIG. 1;
  • 2A is a schematic diagram of a test spot in a binarized image provided by an embodiment of the disclosure
  • FIG. 3 is a flowchart of an optional implementation manner of step S33 in an embodiment of the disclosure.
  • Fig. 3A is a schematic diagram of the position offset between each test point pattern and the respective test spot provided by an embodiment of the present disclosure
  • 3B is a schematic diagram of the offset distance between a test point pattern and a corresponding test spot provided in an embodiment of the present disclosure
  • Figure 4 is the distance histogram corresponding to two Demura devices
  • FIG. 5 is a schematic structural diagram of an apparatus for evaluating the image acquisition accuracy of a Demura device according to an embodiment of the disclosure.
  • FIG. 6 is a schematic diagram of an optional structure of a determining component in an embodiment of the disclosure.
  • when performing Demura compensation, a Demura device (which may be a camera used for Demura) may be used to acquire images of the display picture of the display panel; Demura compensation data are calculated from the acquired images, and the compensation data are then used to perform Demura compensation on the display panel. Therefore, the image acquisition accuracy of the Demura device directly affects the effect of Demura compensation.
  • at present, the Demura compensation process includes pre-processing and compensation processing. In the pre-processing, the Demura device is used to collect images of the display picture of the display panel. In the compensation processing, the Demura compensation data are calculated according to the collected images, and Demura compensation is performed on the display panel based on the Demura compensation data.
  • for the pre-processing, the image acquisition accuracy differs among vendors, and the image acquisition accuracy (that is, the effect of the pre-processing) is usually evaluated by human visual observation. However, this method cannot accurately determine the pros and cons of different pre-processing effects, and thus cannot accurately identify the Demura device with a better compensation effect, thereby affecting the Demura compensation effect.
  • FIG. 1 is a method for evaluating the image acquisition accuracy of a Demura device provided by an embodiment of the disclosure. As shown in Fig. 1, the method may include the following steps S1 to S3.
  • Step S1: Control the display panel to display a detection picture, the detection picture including a plurality of spaced test point patterns. Optionally, the intervals between adjacent test point patterns among the plurality of test point patterns included in the detection picture may be equal.
  • the detection screen includes multiple pixel points, and the size of the test point pattern may be a single pixel point.
  • the test point patterns can be evenly arranged in an array.
  • to facilitate subsequent detection, the regions of the detection picture other than the test point patterns may be set to a solid color, for example black or dark gray, and the gray scale of the test point patterns and the gray scale of the other regions can be set to differ greatly.
  • the gray level of the test point pattern is above 200, and the gray level of other areas is below 40; or the gray level of the test point pattern is below 40, and the gray level of other areas is above 200.
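  • as a concrete illustration only (a minimal sketch under assumed values, not part of the disclosed embodiments), such a detection picture can be built as a grayscale array in Python; the 100×100 panel size, the 10-pixel pitch, and the function name make_detection_picture are illustrative assumptions, while the gray levels 225 and 31 follow the example above:

        import numpy as np

        def make_detection_picture(height=100, width=100, pitch=10,
                                   point_gray=225, background_gray=31):
            # Detection picture: single-pixel test points (gray 225) arranged
            # in a uniform array on a dark background (gray 31).
            picture = np.full((height, width), background_gray, dtype=np.uint8)
            rows = np.arange(pitch // 2, height, pitch)
            cols = np.arange(pitch // 2, width, pitch)
            for r in rows:
                for c in cols:
                    picture[r, c] = point_gray
            # The grid nodes are also the known test point pattern centers.
            centers = [(int(r), int(c)) for r in rows for c in cols]
            return picture, centers

        picture, centers = make_detection_picture()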
  • Step S2 Use the Demura device to perform image collection on the detection screen to obtain a preprocessed image corresponding to the detection screen.
  • the size and shape of the preprocessed image and the corresponding detection screen are the same.
  • FIG. 1B shows a schematic diagram of a preprocessed image obtained by a Demura device of an embodiment of the present disclosure after image collection of a detection screen.
  • Commonly used Demura equipment may be Demura cameras and the like.
  • Step S3 Determine the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection frame and the corresponding position in the preprocessed image.
  • the position of the test point pattern in the detection picture can specifically be the position of the center of the test point pattern in the detection picture.
  • correspondingly, the position corresponding to the test point pattern in the preprocessed image is the position, in the preprocessed image, of the center of the image formed by the test point pattern.
  • the image acquisition accuracy of the Demura device is inversely related to the difference between the position of the test point pattern in the detection screen and the corresponding position in the preprocessed image.
  • the smaller the difference (for example, distance) between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image, the higher the image acquisition accuracy of the Demura device; the larger the difference, the lower the accuracy.
  • the image acquisition accuracy of the Demura device is determined based on the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image, and does not rely on human visual judgment, thereby excluding uncertain factors such as the environment and human subjectivity. Therefore, the accuracy with which the Demura device acquires images can be evaluated accurately, which in turn helps select a Demura device with higher accuracy, thereby improving the effect of Demura compensation.
  • the display panel can be controlled to display one detection picture, and based on that detection picture, the image acquisition accuracy of the Demura device is determined through steps S2 and S3. It is also possible to control the display panel to display multiple detection pictures; for each detection picture, an image acquisition accuracy is determined through steps S2 and S3, and the lower accuracy is used as the final image acquisition accuracy of the Demura device. For example, among multiple detection pictures, the gray scale of the test points in one detection picture is between 95 and 255 and the gray scale of other positions is 31, while the gray scale of the test points in the other detection picture is between 0 and 50 and the gray scale of other positions is 225.
  • typically, when the positions of the detection picture other than the test points are dark, the accuracy of image acquisition is lower. Therefore, the display panel can be controlled to display a detection picture in which the gray scale of the positions other than the test points is lower than that of the test points.
  • in some embodiments, the gray scale of the test points is between 95 and 255, and the gray scale of other positions in the detection picture is between 0 and 50.
  • for example, the gray scale of the test points is 225 and the gray scale of other positions in the detection picture is 31. In this way, the accuracy of the image acquisition is higher, so that a more accurate evaluation can be performed.
  • FIG. 2 is a flowchart of an optional implementation manner of step S3 provided by an embodiment of the disclosure. As shown in Fig. 2, step S3 may include steps S31 to S33.
  • Step S31 Perform low-pass filtering on the preprocessed image.
  • the image formed by the test point pattern often has blurred boundaries. After low-pass filtering, the boundary of the image formed by the test point pattern can be made clearer, which is beneficial to accurately detecting the position, in the preprocessed image, of the image formed by the test point pattern.
  • the image boundary is a part of the local image where the gray scale changes sharply and the change is discontinuous, and the number of pixels in the transition area is small.
  • Image acquisition is easily affected by factors such as imaging system aberration, depth of field, defocus, or weak illumination.
  • the boundary image will degenerate into a fuzzy boundary, and the fuzzy boundary will cause inaccurate boundary positioning due to the above-mentioned image noise.
  • Image noise is often the high-frequency component of the image, and low-pass filters can be used to remove image noise.
  • the low-pass filter can be a low-pass filter commonly used in the prior art, such as an arithmetic mean filter, geometric mean filter, harmonic mean filter, inverse harmonic mean filter, alpha mean filter or Gaussian low-pass filter.
  • Step S32 Binarize the low-pass filtered image to obtain a binarized image.
  • the binarized image includes test spots corresponding to the test point patterns one-to-one.
  • the test spot is the image formed by the test point pattern after low-pass filtering and binarization.
  • FIG. 2A shows a schematic diagram of a test spot in a binarized image provided by an embodiment of the present disclosure.
  • the process of binarization is to compare the gray level of each pixel in the image with a preset threshold. If the gray level of the pixel is greater than the preset threshold, adjust the gray level of the pixel to 255. If the gray level is not greater than the preset threshold, the gray level of the pixel is adjusted to 0; for example, the preset threshold can be set according to actual needs. In the case where the gray level of the test point pattern is 220 and the gray level of other positions is 31, the preset threshold may be set to 150.
  • binarization processing can facilitate the display and separation of test spots. At the same time, binarization can also increase the speed of image processing and simplify the calculation of image processing.
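  • as a rough sketch of steps S31 and S32 (assuming a Gaussian low-pass filter and the example threshold of 150; neither choice is mandated by the disclosure), the preprocessed image can be smoothed and then binarized as follows:

        import numpy as np
        from scipy import ndimage

        def binarize_preprocessed(preprocessed, sigma=1.0, threshold=150):
            # Low-pass filtering suppresses high-frequency image noise;
            # sigma is an assumed filter width.
            smoothed = ndimage.gaussian_filter(preprocessed.astype(np.float32), sigma)
            # Binarization: pixels above the preset threshold become 255, others 0.
            return np.where(smoothed > threshold, 255, 0).astype(np.uint8)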
  • Step S33 Determine the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection frame and the position of the corresponding test spot in the binary image.
  • FIG. 3 is a flowchart of an optional implementation manner of step S33 in an embodiment of the disclosure. As shown in FIG. 3, step S33 may include the following steps S331 to S334.
  • Step S331 Obtain the coordinates of the center of each test point pattern in the preset coordinate system and the coordinates of the center of the test spot corresponding to each test point pattern in the preset coordinate system.
  • the coverage area of the detection screen and the binarized image in the preset coordinate system is the same.
  • Step S331 can be regarded as taking the vertex corresponding to the position in the detection screen and the binarized image as the origin of the preset coordinate system, taking the row direction as the horizontal axis direction and the column direction as the vertical axis direction. For example, take the vertex of the lower left corner of the detection screen and the vertex of the lower left corner of the binarized image as the origin of the preset coordinate system, the horizontal and right direction as the positive horizontal axis, and the vertical upward direction as the positive vertical axis.
  • a preset coordinate system is established, and the coverage area of the detection screen and the binarized image in the preset coordinate system is the same.
  • the origin of the preset coordinate system may be the lower-left corner of the left part of FIG. 3A,
  • the horizontal axis may be along the bottom edge of the left part of FIG. 3A, and
  • the vertical axis may be along the left edge of the left part of FIG. 3A.
  • the center coordinates of a test point pattern can be obtained from the driving signal used when the detection picture is displayed.
  • in addition, when the test point pattern is a single pixel, the coordinates of that pixel can be used as the center coordinates of the test point pattern.
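  • one possible way to obtain the test spot centers of step S331 (a sketch assuming that the test spots are the connected bright regions of the binarized image) is to label the connected components and take the centroid of each; note that array indices run from the top-left corner, so mapping to the lower-left-origin coordinate system described above would flip the vertical axis:

        from scipy import ndimage

        def test_spot_centers(binary):
            # Label connected bright regions; each region is one test spot.
            labels, count = ndimage.label(binary > 0)
            # Return one (row, column) centroid per test spot.
            return ndimage.center_of_mass(binary > 0, labels, list(range(1, count + 1)))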
  • Step S332 Calculate the offset distance between each test point pattern and its corresponding test spot according to the coordinates of each test point pattern and the coordinates of the corresponding test spot.
  • FIG. 3A shows a schematic diagram of the position offset between each test point pattern and its corresponding test spot provided by an embodiment of the present disclosure.
  • point A in FIG. 3A is the test point pattern.
  • Point B is the test spot.
  • in theory, the position coordinates of each test point pattern A and the position coordinates of the corresponding test spot B should be the same, but the coordinate position and extent actually obtained after the pre-processing algorithm always exhibit a small offset; the smaller the offset, the higher the alignment accuracy. In other words, the smaller the offset, the higher the image acquisition accuracy of the Demura device.
  • the offset distance D between each test point pattern A and its corresponding test spot B is calculated according to the following formula: D = √(x² + y²), where
  • x is the difference between the abscissa of the center of the test point pattern in the preset coordinate system and the abscissa of the center of the corresponding test spot in the preset coordinate system;
  • y is the difference between the ordinate of the center of the test point pattern in the preset coordinate system and the ordinate of the center of the corresponding test spot in the preset coordinate system.
  • FIG. 3B shows a top view of the offset distance between a test point pattern and the corresponding test spot provided in an embodiment of the present disclosure and a side view of the offset distance corresponding to the top view, where point A is the test point pattern and point B is the test spot.
  • Step S333 According to the offset distance between each test point pattern and the corresponding test spot, form an offset distance set, and calculate the average value and standard deviation of the offset distance set.
  • the present disclosure not only calculates the average value of the distances between the test point patterns and their corresponding test spots, but also uses the standard deviation as a measure.
  • the standard deviation measures the dispersion of the distances between the test point patterns and their corresponding test spots about their average value. A larger standard deviation means that most values differ considerably from the average; a smaller standard deviation means that the values are close to the average.
  • Step S334 Determine the image acquisition accuracy of the Demura device according to the average value and standard deviation of the offset distance.
  • the image acquisition accuracy of the Demura device is inversely related to the average value and standard deviation, that is, the larger the average value and standard deviation, the lower the image acquisition accuracy of the Demura device, and the smaller the average value and standard deviation, the higher the image acquisition accuracy of the Demura device.
  • the sum of the product of the average value and the first weight and the product of the standard deviation and the second weight can be calculated, and the image acquisition accuracy can be determined according to the sum.
  • the first weight may be a larger value and the second weight a smaller value, for example: the first weight is 0.9 and the second weight is 0.1.
  • the image acquisition accuracy can be expressed in terms of specific numerical values or grades. For example, a value that is inversely related to the above sum is calculated according to a preset formula as the image acquisition accuracy. For example, the sum of the first weight and the second weight is equal to 1.
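  • putting steps S332 to S334 together, the following sketch computes the offset distances D = √(x² + y²), their average value and standard deviation, and the weighted sum using the example weights 0.9 and 0.1; matching each test point pattern to its nearest test spot is an assumption made here for illustration, since the disclosure only states a one-to-one correspondence between test point patterns and test spots:

        import numpy as np

        def offset_distances(point_centers, spot_centers):
            # For every test point pattern center, use the nearest test spot
            # centroid and return the Euclidean offset distance.
            points = np.asarray(point_centers, dtype=float)
            spots = np.asarray(spot_centers, dtype=float)
            diff = points[:, None, :] - spots[None, :, :]
            return np.sqrt((diff ** 2).sum(axis=-1)).min(axis=1)

        def acquisition_score(distances, w_mean=0.9, w_std=0.1):
            # Weighted sum of average and standard deviation of the offsets;
            # a smaller score corresponds to a higher image acquisition accuracy.
            d = np.asarray(distances, dtype=float)
            return w_mean * d.mean() + w_std * d.std()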
  • Figure 4 is a histogram of the offset distances corresponding to two Demura devices.
  • the histogram on the left in Figure 4 is: a histogram of the offset distance between each test point pattern and its corresponding test spot calculated according to the image collected by the Demura device of manufacturer A
  • the histogram on the right is: A histogram of the offset distance between each test point pattern and its corresponding test spot calculated according to the images collected by the Demura equipment of manufacturer B.
  • the total number of test points N is 2616.
  • the horizontal axis represents the offset distance
  • the vertical axis represents the number of test point patterns.
  • the average offset distance corresponding to the images collected by the Demura device of manufacturer A is 1.461 with a standard deviation of 0.7289; the average offset distance corresponding to the images collected by the Demura device of manufacturer B is 0.7328 with a standard deviation of 0.5343. Therefore, it can be determined that the Demura device of manufacturer B has higher image acquisition accuracy, and manufacturer B has better technical capability, than manufacturer A.
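  • as a worked check of the weighting example above (assuming the weights 0.9 and 0.1), the score for manufacturer A would be 0.9 × 1.461 + 0.1 × 0.7289 ≈ 1.388 and for manufacturer B 0.9 × 0.7328 + 0.1 × 0.5343 ≈ 0.713, consistent with B having the higher accuracy. A histogram like FIG. 4 could be tabulated from the offset distances, for example (the bin width of 0.25 is an assumption; FIG. 4 itself is not reproduced here):

        import numpy as np

        def offset_histogram(distances, bin_width=0.25):
            # Counts of test point patterns per offset-distance bin
            # (horizontal axis: offset distance; vertical axis: counts).
            d = np.asarray(distances, dtype=float)
            bins = np.arange(0.0, d.max() + bin_width, bin_width)
            return np.histogram(d, bins=bins)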
  • FIG. 5 is a schematic structural diagram of a device for evaluating the image acquisition accuracy of a Demura device provided by an embodiment of the disclosure, and the device can be used to implement the above-mentioned method for evaluating the image acquisition accuracy of a Demura device.
  • the device includes: a controller 10, an acquiring component 20, and a determining component 30.
  • the controller 10 is used to control the display panel to display a detection screen, and the detection screen includes a plurality of test point patterns with intervals.
  • the interval between two adjacent test point patterns among the plurality of test point patterns included in the detection screen may be equal.
  • the obtaining component 20 is used to obtain a preprocessed image corresponding to the detection screen, and the preprocessed image and the corresponding detection screen have the same size and shape.
  • the acquisition component 20 may be a Demura device.
  • the pre-processed image can be obtained by the Demura device performing image acquisition on the detection screen.
  • the determining component 30 is used to determine the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection screen and the corresponding position in the preprocessed image.
  • the gray scale of the test point is between 95 and 255; the gray scale of other positions in the detection screen is between 0 and 50.
  • the gray scale of the test point is 225, and the gray scale of other positions is 31.
  • FIG. 6 is a schematic diagram of an optional structure of the determining component 30 in an embodiment of the disclosure.
  • the determining component 30 may include: a filtering unit 31, a binarization unit 32 and a determining unit 33.
  • the filtering unit 31 is used to perform low-pass filtering on the preprocessed image.
  • the binarization unit 32 is configured to perform binarization processing on the low-pass filtered image to obtain a binarized image.
  • the binarized image includes test spots corresponding to the test point patterns one-to-one.
  • the determining unit 33 is configured to determine the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection frame and the position of the corresponding test spot in the binary image.
  • the determination unit 33 may include, for example, a coordinate acquisition subunit 331, a first calculation subunit 332, a second calculation subunit 333, and a determination subunit 334.
  • the coordinate acquisition subunit 331 is configured to acquire the coordinates of the center of each test point pattern in the preset coordinate system and the coordinates of the center of the test spot corresponding to each test point pattern in the preset coordinate system.
  • the detection screen and the binarized image have the same coverage in the preset coordinate system.
  • the first calculation subunit 332 is configured to calculate the offset distance between each test point pattern and the corresponding test spot according to the coordinates of each test point pattern and the coordinates of the corresponding test spot.
  • the first calculation subunit 332 is used to calculate the offset distance D between each test point pattern and the corresponding test spot, for example, according to the following formula: D = √(x² + y²), where
  • x is the difference between the abscissa of the center of the test point pattern in the preset coordinate system and the abscissa of the center of the corresponding test spot in the preset coordinate system; and
  • y is the difference between the ordinate of the center of the test point pattern in the preset coordinate system and the ordinate of the center of the corresponding test spot in the preset coordinate system.
  • the second calculation subunit 333 is configured to form an offset distance set according to the offset distance between each test point pattern and the corresponding test spot, and calculate the average value and standard deviation of the offset distance set.
  • the determining subunit 334 is used to determine the image acquisition accuracy of the Demura device according to the average value and standard deviation of the offset distance set.
  • the device shown in FIG. 5 may further include a memory, which may be connected to the controller 10 and used to store the coordinates of the test point patterns, the coordinates of the test spots, the offset distances, the average value, the standard deviation, and other related data and computer programs.
  • the various components of the apparatus shown in FIG. 5 and FIG. 6 may be implemented in a hardware manner, or may be implemented in a combination of hardware and software.
  • the various components of the device shown in FIG. 5 and FIG. 6 may be a central processing unit (CPU), an application processor (AP), a digital signal processor (DSP), Field programmable logic circuit (FPGA), microprocessor (MCU), filter, integrated circuit (IC) or application specific integrated circuit (ASIC).
  • the various components of the apparatus shown in FIG. 5 and FIG. 6 may be implemented by a combination of a processor, a memory, and a computer program.
  • the computer program is stored in the memory, and the processor obtains data from the memory.
  • the computer program is read and executed so as to be used as various components of the apparatus shown in FIGS. 5 and 6.
  • the embodiments of the present disclosure also provide an electronic device, including: one or more processors and a storage device; for example, one or more programs are stored on the storage device, and when the one or more programs are executed by the one or more processors, the one or more processors implement the method for evaluating the image acquisition accuracy of the Demura device described in the foregoing embodiments.
  • the embodiments of the present disclosure also provide a non-transitory computer-readable storage medium on which a computer program is stored; for example, when the computer program is executed, it implements the method for evaluating the image acquisition accuracy of the Demura device described in the foregoing embodiments.
  • the method for evaluating the image acquisition accuracy of a Demura device, the apparatus for evaluating the image acquisition accuracy of a Demura device, the electronic device, and the non-transitory computer-readable storage medium provided by the embodiments of the present disclosure can objectively and accurately evaluate the image acquisition accuracy of the Demura device, which helps to improve the Demura compensation effect.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Quality & Reliability (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Signal Processing (AREA)
  • Image Analysis (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Testing Of Optical Devices Or Fibers (AREA)

Abstract

The present disclosure provides a method for evaluating the image acquisition accuracy of a Demura device, including: controlling a display panel to display a detection picture, the detection picture including a plurality of spaced test point patterns; performing image acquisition on the detection picture with the Demura device to obtain a preprocessed image corresponding to the detection picture, the preprocessed image being identical in size and shape to the corresponding detection picture; and determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image. The present disclosure further provides an apparatus for evaluating the image acquisition accuracy of a Demura device, an electronic device, and a non-transitory computer-readable storage medium.

Description

Method and apparatus for evaluating image acquisition accuracy, electronic device and storage medium
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 201910543098.8, filed on June 21, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of display technology, and in particular to a method and apparatus for evaluating the image acquisition accuracy of a Demura device, an electronic device, and a non-transitory computer-readable storage medium.
Background
The driving circuit of an organic light-emitting diode (OLED) display device may include a plurality of thin film transistors. Owing to the limitations of the crystallization process, thin film transistors at different positions often exhibit non-uniformity in electrical parameters such as threshold voltage and mobility, which makes the display panel of the display device prone to local mura. In order to improve the display effect, Demura (mura-removal) compensation needs to be performed on the display panel.
Summary
Embodiments of the present disclosure provide a method and apparatus for evaluating the image acquisition accuracy of a Demura device, an electronic device, and a non-transitory computer-readable storage medium.
A first aspect of the present disclosure provides a method for evaluating the image acquisition accuracy of a Demura device, including:
controlling a display panel to display a detection picture, the detection picture including a plurality of spaced test point patterns;
performing image acquisition on the detection picture with the Demura device to obtain a preprocessed image corresponding to the detection picture, the preprocessed image being identical in size and shape to the corresponding detection picture; and
determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image.
In an embodiment, determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image includes:
performing low-pass filtering on the preprocessed image;
binarizing the low-pass filtered image to obtain a binarized image, the binarized image including a plurality of test spots in one-to-one correspondence with the plurality of test point patterns; and
determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the position of the corresponding test spot in the binarized image.
In an embodiment, determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the position of the corresponding test spot in the binarized image includes:
obtaining the coordinates of the center of each test point pattern in a preset coordinate system and the coordinates, in the preset coordinate system, of the center of the test spot corresponding to each test point pattern, wherein the detection picture and the binarized image have the same coverage in the preset coordinate system;
calculating the offset distance between each of the plurality of test point patterns and its corresponding test spot according to the coordinates of the center of each test point pattern and the coordinates of the center of the corresponding test spot;
obtaining a set of offset distances from the offset distances between the plurality of test point patterns and their corresponding test spots, and calculating the average value and standard deviation of the set of offset distances; and
determining the image acquisition accuracy of the Demura device according to the average value and standard deviation of the set of offset distances.
In an embodiment, the offset distance D between each test point pattern and the corresponding test spot is calculated according to the following formula:
D = √(x² + y²)
where x is the difference between the abscissa of the center of the test point pattern in the preset coordinate system and the abscissa of the center of the corresponding test spot in the preset coordinate system; and
y is the difference between the ordinate of the center of the test point pattern in the preset coordinate system and the ordinate of the center of the corresponding test spot in the preset coordinate system.
In an embodiment, the gray scale of each of the plurality of test point patterns is between 95 and 255, and the gray scale of positions in the detection picture other than the plurality of test point patterns is between 0 and 50.
In an embodiment, each test point pattern is a single pixel.
In an embodiment, the plurality of test point patterns are uniformly arranged in an array.
In an embodiment, the intervals between adjacent test point patterns of the plurality of test point patterns included in the detection picture are equal.
In an embodiment, the gray scale of each of the plurality of test point patterns is 225, and the gray scale of positions in the detection picture other than the plurality of test point patterns is 31.
A second aspect of the present disclosure provides an apparatus for evaluating the image acquisition accuracy of a Demura device, including:
a controller configured to control a display panel to display a detection picture, the detection picture including a plurality of spaced test point patterns;
an acquiring component configured to acquire a preprocessed image corresponding to the detection picture, the preprocessed image being identical in size and shape to the corresponding detection picture; and
a determining component configured to determine the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image.
In an embodiment, the determining component includes:
a filtering unit configured to perform low-pass filtering on the preprocessed image;
a binarization unit configured to binarize the low-pass filtered image to obtain a binarized image, the binarized image including a plurality of test spots in one-to-one correspondence with the plurality of test point patterns; and
a determining unit configured to determine the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the position of the corresponding test spot in the binarized image.
In an embodiment, the determining unit includes:
a coordinate obtaining subunit configured to obtain the coordinates of the center of each test point pattern in a preset coordinate system and the coordinates, in the preset coordinate system, of the center of the test spot corresponding to each test point pattern, wherein the detection picture and the binarized image have the same coverage in the preset coordinate system;
a first calculation subunit configured to calculate the offset distance between each of the plurality of test point patterns and its corresponding test spot according to the coordinates of the center of each test point pattern and the coordinates of the center of the corresponding test spot;
a second calculation subunit configured to form a set of offset distances from the offset distances between the plurality of test point patterns and their corresponding test spots, and to calculate the average value and standard deviation of the set of offset distances; and
a determining subunit configured to determine the image acquisition accuracy of the Demura device according to the average value and standard deviation of the set of offset distances.
In an embodiment, the first calculation subunit is configured to calculate the offset distance D between each test point pattern and the corresponding test spot according to the following formula:
D = √(x² + y²)
where x is the difference between the abscissa of the center of the test point pattern in the preset coordinate system and the abscissa of the center of the corresponding test spot in the preset coordinate system; and
y is the difference between the ordinate of the center of the test point pattern in the preset coordinate system and the ordinate of the center of the corresponding test spot in the preset coordinate system.
In an embodiment, the gray scale of each of the plurality of test point patterns is between 95 and 255, and the gray scale of positions in the detection picture other than the plurality of test point patterns is between 0 and 50.
In an embodiment, each test point pattern is a single pixel.
In an embodiment, the plurality of test point patterns are uniformly arranged in an array.
In an embodiment, the intervals between adjacent test point patterns of the plurality of test point patterns included in the detection picture are equal.
In an embodiment, the gray scale of each of the plurality of test point patterns is 225, and the gray scale of positions in the detection picture other than the plurality of test point patterns is 31.
A third aspect of the present disclosure provides an electronic device, including:
one or more processors; and
a storage device having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of the embodiments of the first aspect of the present disclosure.
A fourth aspect of the present disclosure provides a non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of the embodiments of the first aspect of the present disclosure.
Brief Description of the Drawings
The accompanying drawings are provided for a further understanding of the present disclosure and constitute a part of the specification. Together with the following exemplary embodiments, they serve to explain the present disclosure, but do not limit the present disclosure. In the drawings:
FIG. 1 is a flowchart of a method for evaluating the image acquisition accuracy of a Demura device according to an embodiment of the present disclosure;
FIG. 1A is a schematic diagram of a plurality of test point patterns in a detection picture according to an embodiment of the present disclosure;
FIG. 1B is a schematic diagram of a preprocessed image obtained after a Demura device performs image acquisition on a detection picture according to an embodiment of the present disclosure;
FIG. 2 is a flowchart of an optional implementation of step S3 shown in FIG. 1;
FIG. 2A is a schematic diagram of test spots in a binarized image according to an embodiment of the present disclosure;
FIG. 3 is a flowchart of an optional implementation of step S33 in an embodiment of the present disclosure;
FIG. 3A is a schematic diagram of the position offset between each test point pattern and its corresponding test spot according to an embodiment of the present disclosure;
FIG. 3B is a schematic diagram of the offset distance between a test point pattern and the corresponding test spot according to an embodiment of the present disclosure;
FIG. 4 shows offset distance histograms corresponding to two Demura devices;
FIG. 5 is a schematic structural diagram of an apparatus for evaluating the image acquisition accuracy of a Demura device according to an embodiment of the present disclosure; and
FIG. 6 is a schematic diagram of an optional structure of the determining component in an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described in detail below with reference to the accompanying drawings. It should be understood that the exemplary embodiments described herein are merely intended to illustrate and explain the present disclosure, not to limit it.
The inventors of the present inventive concept found that, when performing Demura compensation, a Demura device (which may be a camera used for Demura) may be used to acquire images of the display picture of a display panel; Demura compensation data are calculated from the acquired images, and Demura compensation is then performed on the display panel using the Demura compensation data. Therefore, the image acquisition accuracy of the Demura device directly affects the effect of Demura compensation.
At present, the Demura compensation process includes pre-processing and compensation processing. For example, in the pre-processing, the Demura device acquires images of the display picture of the display panel. In the compensation processing, Demura compensation data are calculated from the acquired images, and Demura compensation is performed on the display panel according to the Demura compensation data. For the pre-processing, the image acquisition accuracy differs among vendors, and at present the image acquisition accuracy (that is, the effect of the pre-processing) is usually evaluated by human visual observation. However, this approach cannot accurately determine which pre-processing effect is better, and therefore cannot accurately identify the Demura device with the better compensation effect, which in turn affects the Demura compensation effect.
FIG. 1 shows a method for evaluating the image acquisition accuracy of a Demura device according to an embodiment of the present disclosure. As shown in FIG. 1, the method may include the following steps S1 to S3.
Step S1: the display panel is controlled to display a detection picture, the detection picture including a plurality of spaced test point patterns. Optionally, the intervals between adjacent test point patterns among the plurality of test point patterns included in the detection picture may be equal.
For example, as shown in FIG. 1A, the detection picture includes a plurality of pixels, and the size of a test point pattern may be a single pixel. The test point patterns may be uniformly arranged in an array.
To facilitate subsequent detection, the regions of the detection picture other than the test point patterns are set to a solid color, for example black or dark gray. Furthermore, the gray scale of the test point patterns and the gray scale of the other regions may be set to differ greatly. For example, the gray scale of the test point patterns is above 200 and the gray scale of the other regions is below 40; or the gray scale of the test point patterns is below 40 and the gray scale of the other regions is above 200.
Step S2: image acquisition is performed on the detection picture with the Demura device to obtain a preprocessed image corresponding to the detection picture. The preprocessed image is identical in size and shape to the corresponding detection picture.
For example, FIG. 1B shows a schematic diagram of a preprocessed image obtained after a Demura device of an embodiment of the present disclosure performs image acquisition on a detection picture. A commonly used Demura device may be a Demura camera or the like.
Step S3: the image acquisition accuracy of the Demura device is determined according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image.
For example, the position of a test point pattern in the detection picture may specifically be the position of the center of the test point pattern in the detection picture; correspondingly, the position corresponding to the test point pattern in the preprocessed image is the position, in the preprocessed image, of the center of the image formed by the test point pattern.
The image acquisition accuracy of the Demura device is inversely related to the difference between the position of the test point pattern in the detection picture and the corresponding position in the preprocessed image. The larger the difference (for example, distance) between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image, the lower the accuracy of the Demura device; the smaller the difference, the higher the image acquisition accuracy of the Demura device.
In the present disclosure, the image acquisition accuracy of the Demura device is determined according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image, without relying on human visual judgment, thereby excluding uncertain factors such as the environment and human subjectivity. Therefore, the accuracy with which the Demura device acquires images can be evaluated accurately, which in turn helps select a Demura device of higher accuracy and thus improves the compensation effect of Demura compensation.
When evaluating the image acquisition accuracy of the Demura device, the display panel may be controlled to display a single detection picture, and the image acquisition accuracy of the Demura device is determined through steps S2 and S3 on the basis of this picture. The display panel may also be controlled to display a plurality of detection pictures; for each detection picture, an image acquisition accuracy is determined through steps S2 and S3, and the lower accuracy is taken as the final image acquisition accuracy of the Demura device. For example, among the plurality of detection pictures, in one detection picture the gray scale of the test points is between 95 and 255 and the gray scale of the other positions is 31, while in another detection picture the gray scale of the test points is between 0 and 50 and the gray scale of the other positions is 225.
Typically, when the positions of the detection picture other than the test points are dark, the accuracy of image acquisition is lower. Therefore, the display panel may be controlled to display a single detection picture in which the gray scale of the positions other than the test points is relatively low. In some embodiments of the present disclosure, the gray scale of the test points is between 95 and 255, and the gray scale of the other positions in the detection picture is between 0 and 50. For example, the gray scale of the test points is 225 and the gray scale of the other positions in the detection picture is 31. In this way, the accuracy of the image acquisition is higher, so that a more accurate evaluation can be performed.
FIG. 2 is a flowchart of an optional implementation of step S3 according to an embodiment of the present disclosure. As shown in FIG. 2, step S3 may include steps S31 to S33.
Step S31: low-pass filtering is performed on the preprocessed image.
For example, when the Demura device performs image acquisition, the image formed by a test point pattern often has a blurred boundary. After low-pass filtering, the boundary of the image formed by the test point pattern can be made clearer, which facilitates accurately detecting the position, in the preprocessed image, of the image formed by the test point pattern.
Specifically, an image boundary is a portion of the local image where the gray scale changes sharply and discontinuously, and the transition region contains only a few pixels. Image acquisition is easily affected by factors such as imaging system aberration, depth of field, defocus or weak illumination; in such cases the boundary image degenerates into a blurred boundary, and the blurred boundary leads to inaccurate boundary positioning because of the image noise mentioned above. Image noise usually belongs to the high-frequency components of the image, and a low-pass filter can be used to remove it. The low-pass filter may be a low-pass filter commonly used in the prior art, for example an arithmetic mean filter, a geometric mean filter, a harmonic mean filter, an inverse harmonic mean filter, an alpha mean filter, a Gaussian low-pass filter or the like.
Step S32: the low-pass filtered image is binarized to obtain a binarized image, the binarized image including test spots in one-to-one correspondence with the test point patterns. A test spot is the image formed by a test point pattern after low-pass filtering and binarization.
For example, FIG. 2A shows a schematic diagram of test spots in a binarized image according to an embodiment of the present disclosure. The binarization process is as follows: the gray scale of each pixel in the image is compared with a preset threshold; if the gray scale of the pixel is greater than the preset threshold, the gray scale of the pixel is set to 255; if the gray scale of the pixel is not greater than the preset threshold, the gray scale of the pixel is set to 0. For example, the preset threshold may be set according to actual needs. In the case where the gray scale of the test point patterns is 220 and the gray scale of the other positions is 31, the preset threshold may be set to 150.
Binarization facilitates the display and separation of the test spots. At the same time, binarization can also increase the speed of image processing and simplify the computation involved.
Step S33: the image acquisition accuracy of the Demura device is determined according to the difference between the position of each test point pattern in the detection picture and the position of the corresponding test spot in the binarized image.
FIG. 3 is a flowchart of an optional implementation of step S33 in an embodiment of the present disclosure. As shown in FIG. 3, step S33 may include the following steps S331 to S334.
Step S331: the coordinates of the center of each test point pattern in a preset coordinate system, and the coordinates, in the preset coordinate system, of the center of the test spot corresponding to each test point pattern, are obtained, wherein the detection picture and the binarized image have the same coverage in the preset coordinate system.
Step S331 may be regarded as taking corresponding vertices of the detection picture and the binarized image as the origin of the preset coordinate system, with the row direction as the horizontal axis direction and the column direction as the vertical axis direction. For example, the lower-left vertex of the detection picture and the lower-left vertex of the binarized image are taken as the origin of the preset coordinate system, the horizontal rightward direction as the positive horizontal axis and the vertical upward direction as the positive vertical axis, so that the preset coordinate system is established and the detection picture and the binarized image have the same coverage in the preset coordinate system. For example, the origin of the preset coordinate system may be the lower-left corner of the left part of FIG. 3A, the horizontal axis may be along the bottom edge of the left part of FIG. 3A, and the vertical axis may be along the left edge of the left part of FIG. 3A.
It can be understood that the center coordinates of a test point pattern can be obtained from the driving signal used when the detection picture is displayed.
In addition, when the test point pattern is a single pixel, the coordinates of that pixel may be used as the center coordinates of the test point pattern.
Step S332: the offset distance between each test point pattern and its corresponding test spot is calculated according to the coordinates of each test point pattern and the coordinates of the corresponding test spot.
For example, FIG. 3A shows a schematic diagram of the position offset between each test point pattern and its corresponding test spot according to an embodiment of the present disclosure. For ease of distinction, point A in FIG. 3A is a test point pattern and point B is a test spot. In theory, the position coordinates of each test point pattern A and those of its corresponding test spot B should coincide, but in practice the coordinate position and extent obtained after the pre-processing algorithm always exhibit a small offset; the smaller this offset, the higher the alignment accuracy. In other words, the smaller this offset, the higher the image acquisition accuracy of the Demura device.
In some embodiments, the offset distance D between each test point pattern A and its corresponding test spot B is calculated according to the following formula:
D = √(x² + y²)
For example, x is the difference between the abscissa of the center of the test point pattern in the preset coordinate system and the abscissa of the center of the corresponding test spot in the preset coordinate system, and y is the difference between the ordinate of the center of the test point pattern in the preset coordinate system and the ordinate of the center of the corresponding test spot in the preset coordinate system. FIG. 3B shows a top view of the offset distance between a test point pattern and the corresponding test spot according to an embodiment of the present disclosure and a side view of the offset distance corresponding to the top view, where point A is the test point pattern and point B is the test spot.
Step S333: a set of offset distances is formed from the offset distances between the test point patterns and their corresponding test spots, and the average value and standard deviation of the set of offset distances are calculated.
It should be noted that the present disclosure not only calculates the average value of the distances between the test point patterns and their corresponding test spots, but also uses the standard deviation as a measure; the standard deviation measures the dispersion of the distances between the test point patterns and their corresponding test spots about their average value. A larger standard deviation means that most values differ considerably from the average; a smaller standard deviation means that the values are close to the average. Combining the average distance with the standard deviation reflects the dispersion of the test spots more precisely, so that the image acquisition accuracy of the Demura device can be evaluated more accurately.
For example, the offset distance between each test point pattern and its corresponding test spot, and the average value and standard deviation of the offset distances, may be calculated with conventional image processing tools such as MATLAB or C++.
Step S334: the image acquisition accuracy of the Demura device is determined according to the average value and standard deviation of the offset distances.
For example, the image acquisition accuracy of the Demura device is inversely related to the average value and standard deviation; that is, the larger the average value and standard deviation, the lower the image acquisition accuracy of the Demura device, and the smaller the average value and standard deviation, the higher the image acquisition accuracy of the Demura device. In some embodiments, the sum of the product of the average value and a first weight and the product of the standard deviation and a second weight may be calculated, and the image acquisition accuracy determined according to this sum. For example, the first weight may take a larger value and the second weight a smaller value, for example a first weight of 0.9 and a second weight of 0.1. The image acquisition accuracy may be expressed as a specific numerical value or as a grade. For example, a value inversely related to the above sum is calculated according to a preset formula and used as the image acquisition accuracy. For example, the sum of the first weight and the second weight is equal to 1.
In practical applications, when the image acquisition accuracy of two Demura devices needs to be compared, the display panel may be controlled to display the detection picture, and, using the preprocessed images acquired by the two Demura devices respectively, the average value and standard deviation of the offset distances between the test spots and their corresponding test point patterns are calculated for each Demura device according to steps S331 to S333 above. By directly comparing the two average values and standard deviations, the relative image acquisition accuracy of the two Demura devices can be determined. For example, the Demura device with the smaller average offset distance may be considered to have the higher image acquisition accuracy. Where the average offset distances of the two Demura devices are the same, the Demura device with the smaller standard deviation of the offset distances may be considered to have the higher image acquisition accuracy.
FIG. 4 shows offset distance histograms corresponding to two Demura devices. For example, the histogram on the left of FIG. 4 is the histogram of the offset distances between the test point patterns and their corresponding test spots calculated from the images acquired by the Demura device of vendor A, and the histogram on the right is the histogram of the offset distances between the test point patterns and their corresponding test spots calculated from the images acquired by the Demura device of vendor B. The total number of test points N is 2616. In both histograms, the horizontal axis represents the offset distance and the vertical axis represents the number of test point patterns. From the histograms it can be calculated that the average offset distance corresponding to the images acquired by the Demura device of vendor A is 1.461 with a standard deviation of 0.7289, while the average offset distance corresponding to the images acquired by the Demura device of vendor B is 0.7328 with a standard deviation of 0.5343. It can therefore be determined that the Demura device of vendor B has higher image acquisition accuracy, and vendor B better technical capability, than vendor A.
FIG. 5 is a schematic structural diagram of an apparatus for evaluating the image acquisition accuracy of a Demura device according to an embodiment of the present disclosure, and the apparatus may be used to implement the above method for evaluating the image acquisition accuracy of a Demura device. As shown in FIG. 5, the apparatus includes a controller 10, an acquiring component 20 and a determining component 30.
For example, the controller 10 is configured to control the display panel to display a detection picture, the detection picture including a plurality of spaced test point patterns. Optionally, the intervals between adjacent test point patterns among the plurality of test point patterns included in the detection picture may be equal.
The acquiring component 20 is configured to acquire a preprocessed image corresponding to the detection picture, the preprocessed image being identical in size and shape to the corresponding detection picture. For example, the acquiring component 20 may be a Demura device. In other words, the preprocessed image may be obtained by the Demura device performing image acquisition on the detection picture.
The determining component 30 is configured to determine the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image.
In some embodiments, the gray scale of the test points is between 95 and 255, and the gray scale of the other positions in the detection picture is between 0 and 50. For example, the gray scale of the test points is 225 and the gray scale of the other positions is 31.
FIG. 6 is a schematic diagram of an optional structure of the determining component 30 in an embodiment of the present disclosure. As shown in FIG. 6, the determining component 30 may include a filtering unit 31, a binarization unit 32 and a determining unit 33.
For example, the filtering unit 31 is configured to perform low-pass filtering on the preprocessed image.
The binarization unit 32 is configured to binarize the low-pass filtered image to obtain a binarized image, the binarized image including test spots in one-to-one correspondence with the test point patterns.
The determining unit 33 is configured to determine the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the position of the corresponding test spot in the binarized image.
In some embodiments, the determining unit 33 may include, for example, a coordinate obtaining subunit 331, a first calculation subunit 332, a second calculation subunit 333 and a determining subunit 334.
For example, the coordinate obtaining subunit 331 is configured to obtain the coordinates of the center of each test point pattern in a preset coordinate system and the coordinates, in the preset coordinate system, of the center of the test spot corresponding to each test point pattern. For example, the detection picture and the binarized image have the same coverage in the preset coordinate system.
The first calculation subunit 332 is configured to calculate the offset distance between each test point pattern and its corresponding test spot according to the coordinates of each test point pattern and the coordinates of the corresponding test spot.
In some embodiments, the first calculation subunit 332 is configured to calculate the offset distance D between each test point pattern and the corresponding test spot, for example, according to the following formula:
D = √(x² + y²)
where x is the difference between the abscissa of the center of the test point pattern in the preset coordinate system and the abscissa of the center of the corresponding test spot in the preset coordinate system, and y is the difference between the ordinate of the center of the test point pattern in the preset coordinate system and the ordinate of the center of the corresponding test spot in the preset coordinate system.
The second calculation subunit 333 is configured to form a set of offset distances from the offset distances between the test point patterns and their corresponding test spots, and to calculate the average value and standard deviation of the set of offset distances.
The determining subunit 334 is configured to determine the image acquisition accuracy of the Demura device according to the average value and standard deviation of the set of offset distances.
In addition, for the implementation details and technical effects of the above components, units and subunits, reference may be made to the description of the foregoing method embodiments, which is not repeated here.
Furthermore, the apparatus shown in FIG. 5 may further include a memory, which may, for example, be connected to the controller 10 and used to store the coordinates of the test point patterns, the coordinates of the test spots, the offset distances, the average value, the standard deviation, and other related data and computer programs.
It should be understood that the components of the apparatus shown in FIG. 5 and FIG. 6 may be implemented in hardware, or in a combination of hardware and software. For example, the components of the apparatus shown in FIG. 5 and FIG. 6 may be a central processing unit (CPU), an application processor (AP), a digital signal processor (DSP), a field-programmable logic circuit (FPGA), a microprocessor (MCU), a filter, an integrated circuit (IC) or an application-specific integrated circuit (ASIC) having the corresponding functions described in the embodiments of the present disclosure. For example, the components of the apparatus shown in FIG. 5 and FIG. 6 may be implemented by a combination of a processor, a memory and a computer program, wherein the computer program is stored in the memory, and the processor reads the computer program from the memory and executes it so as to serve as the components of the apparatus shown in FIG. 5 and FIG. 6.
An embodiment of the present disclosure further provides an electronic device, including one or more processors and a storage device. For example, one or more programs are stored on the storage device, and when the one or more programs are executed by the one or more processors, the one or more processors implement the method for evaluating the image acquisition accuracy of a Demura device described in the foregoing embodiments.
An embodiment of the present disclosure further provides a non-transitory computer-readable storage medium having a computer program stored thereon; for example, when executed, the computer program implements the method for evaluating the image acquisition accuracy of a Demura device described in the foregoing embodiments.
The method for evaluating the image acquisition accuracy of a Demura device, the apparatus for evaluating the image acquisition accuracy of a Demura device, the electronic device and the non-transitory computer-readable storage medium provided by the embodiments of the present disclosure can objectively and accurately evaluate the image acquisition accuracy of a Demura device, which helps improve the Demura compensation effect.
It should be noted that, in the absence of explicit conflict, the above embodiments of the present disclosure may be combined with one another.
It should be understood that the above embodiments are merely exemplary embodiments adopted to illustrate the principles of the present disclosure, and the present disclosure is not limited thereto. Various modifications and improvements can be made by those of ordinary skill in the art without departing from the scope of protection of the present disclosure defined by the appended claims, and such modifications and improvements also fall within the scope of protection of the present disclosure.

Claims (20)

  1. A method for evaluating the image acquisition accuracy of a Demura device, comprising:
    controlling a display panel to display a detection picture, the detection picture comprising a plurality of spaced test point patterns;
    performing image acquisition on the detection picture with the Demura device to obtain a preprocessed image corresponding to the detection picture, the preprocessed image being identical in size and shape to the corresponding detection picture; and
    determining the image acquisition accuracy of the Demura device according to a difference between a position of each test point pattern in the detection picture and a corresponding position in the preprocessed image.
  2. The method according to claim 1, wherein determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the corresponding position in the preprocessed image comprises:
    performing low-pass filtering on the preprocessed image;
    binarizing the low-pass filtered image to obtain a binarized image, the binarized image comprising a plurality of test spots in one-to-one correspondence with the plurality of test point patterns; and
    determining the image acquisition accuracy of the Demura device according to a difference between the position of each test point pattern in the detection picture and a position of the corresponding test spot in the binarized image.
  3. The method according to claim 2, wherein determining the image acquisition accuracy of the Demura device according to the difference between the position of each test point pattern in the detection picture and the position of the corresponding test spot in the binarized image comprises:
    obtaining coordinates of a center of each test point pattern in a preset coordinate system and coordinates, in the preset coordinate system, of a center of the test spot corresponding to each test point pattern, wherein the detection picture and the binarized image have the same coverage in the preset coordinate system;
    calculating an offset distance between each of the plurality of test point patterns and its corresponding test spot according to the coordinates of the center of each test point pattern and the coordinates of the center of the corresponding test spot;
    obtaining a set of offset distances from the offset distances between the plurality of test point patterns and their corresponding test spots, and calculating an average value and a standard deviation of the set of offset distances; and
    determining the image acquisition accuracy of the Demura device according to the average value and the standard deviation of the set of offset distances.
  4. The method according to claim 3, wherein the offset distance D between each test point pattern and the corresponding test spot is calculated according to the following formula:
    D = √(x² + y²)
    where x is the difference between the abscissa of the center of the test point pattern in the preset coordinate system and the abscissa of the center of the corresponding test spot in the preset coordinate system; and
    y is the difference between the ordinate of the center of the test point pattern in the preset coordinate system and the ordinate of the center of the corresponding test spot in the preset coordinate system.
  5. The method according to any one of claims 1 to 4, wherein the gray scale of each of the plurality of test point patterns is between 95 and 255, and the gray scale of positions in the detection picture other than the plurality of test point patterns is between 0 and 50.
  6. The method according to any one of claims 1 to 5, wherein each test point pattern is a single pixel.
  7. The method according to any one of claims 1 to 6, wherein the plurality of test point patterns are uniformly arranged in an array.
  8. The method according to any one of claims 1 to 7, wherein the intervals between adjacent test point patterns of the plurality of test point patterns included in the detection picture are equal.
  9. The method according to claim 5, wherein the gray scale of each of the plurality of test point patterns is 225, and the gray scale of positions in the detection picture other than the plurality of test point patterns is 31.
  10. An apparatus for evaluating the image acquisition accuracy of a Demura device, comprising:
    a controller configured to control a display panel to display a detection picture, the detection picture comprising a plurality of spaced test point patterns;
    an acquiring component configured to acquire a preprocessed image corresponding to the detection picture, the preprocessed image being identical in size and shape to the corresponding detection picture; and
    a determining component configured to determine the image acquisition accuracy of the Demura device according to a difference between a position of each test point pattern in the detection picture and a corresponding position in the preprocessed image.
  11. The apparatus according to claim 10, wherein the determining component comprises:
    a filtering unit configured to perform low-pass filtering on the preprocessed image;
    a binarization unit configured to binarize the low-pass filtered image to obtain a binarized image, the binarized image comprising a plurality of test spots in one-to-one correspondence with the plurality of test point patterns; and
    a determining unit configured to determine the image acquisition accuracy of the Demura device according to a difference between the position of each test point pattern in the detection picture and a position of the corresponding test spot in the binarized image.
  12. The apparatus according to claim 11, wherein the determining unit comprises:
    a coordinate obtaining subunit configured to obtain coordinates of a center of each test point pattern in a preset coordinate system and coordinates, in the preset coordinate system, of a center of the test spot corresponding to each test point pattern, wherein the detection picture and the binarized image have the same coverage in the preset coordinate system;
    a first calculation subunit configured to calculate an offset distance between each of the plurality of test point patterns and its corresponding test spot according to the coordinates of the center of each test point pattern and the coordinates of the center of the corresponding test spot;
    a second calculation subunit configured to form a set of offset distances from the offset distances between the plurality of test point patterns and their corresponding test spots, and to calculate an average value and a standard deviation of the set of offset distances; and
    a determining subunit configured to determine the image acquisition accuracy of the Demura device according to the average value and the standard deviation of the set of offset distances.
  13. The apparatus according to claim 12, wherein the first calculation subunit is configured to calculate the offset distance D between each test point pattern and the corresponding test spot according to the following formula:
    D = √(x² + y²)
    where x is the difference between the abscissa of the center of the test point pattern in the preset coordinate system and the abscissa of the center of the corresponding test spot in the preset coordinate system; and
    y is the difference between the ordinate of the center of the test point pattern in the preset coordinate system and the ordinate of the center of the corresponding test spot in the preset coordinate system.
  14. The apparatus according to any one of claims 10 to 13, wherein the gray scale of each of the plurality of test point patterns is between 95 and 255, and the gray scale of positions in the detection picture other than the plurality of test point patterns is between 0 and 50.
  15. The apparatus according to any one of claims 10 to 14, wherein each test point pattern is a single pixel.
  16. The apparatus according to any one of claims 10 to 17, wherein the plurality of test point patterns are uniformly arranged in an array.
  17. The apparatus according to any one of claims 10 to 16, wherein the intervals between adjacent test point patterns of the plurality of test point patterns included in the detection picture are equal.
  18. The apparatus according to claim 14, wherein the gray scale of each of the plurality of test point patterns is 225, and the gray scale of positions in the detection picture other than the plurality of test point patterns is 31.
  19. An electronic device, comprising:
    one or more processors; and
    a storage device having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method according to any one of claims 1 to 9.
  20. A non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 9.
PCT/CN2020/097122 2019-06-21 2020-06-19 Method and apparatus for evaluating image acquisition accuracy, electronic device and storage medium WO2020253827A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20824412.9A EP3989542A4 (en) 2019-06-21 2020-06-19 METHOD AND DEVICE FOR EVALUATION OF IMAGE ACCURACY, AND ELECTRONIC DEVICE AND STORAGE MEDIA
US17/256,072 US11314979B2 (en) 2019-06-21 2020-06-19 Method and apparatus for evaluating image acquisition accuracy, electronic device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910543098.8A CN110225336B (zh) 2019-06-21 2019-06-21 评估图像采集精度的方法及装置、电子设备、可读介质
CN201910543098.8 2019-06-21

Publications (1)

Publication Number Publication Date
WO2020253827A1 true WO2020253827A1 (zh) 2020-12-24

Family

ID=67814327

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/097122 WO2020253827A1 (zh) 2019-06-21 2020-06-19 评估图像采集精度的方法及装置、电子设备和存储介质

Country Status (4)

Country Link
US (1) US11314979B2 (zh)
EP (1) EP3989542A4 (zh)
CN (1) CN110225336B (zh)
WO (1) WO2020253827A1 (zh)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11663493B2 (en) 2019-01-30 2023-05-30 Intuit Inc. Method and system of dynamic model selection for time series forecasting
CN110225336B (zh) 2019-06-21 2022-08-26 京东方科技集团股份有限公司 评估图像采集精度的方法及装置、电子设备、可读介质
US11657302B2 (en) 2019-11-19 2023-05-23 Intuit Inc. Model selection in a forecasting pipeline to optimize tradeoff between forecast accuracy and computational cost
US11423250B2 (en) 2019-11-19 2022-08-23 Intuit Inc. Hierarchical deep neural network forecasting of cashflows with linear algebraic constraints
CN113393811B (zh) * 2020-03-12 2022-06-28 咸阳彩虹光电科技有限公司 亮度不均匀补偿方法、装置和显示面板
CN113256700B (zh) * 2021-05-26 2023-05-23 长江存储科技有限责任公司 图层厚度检测方法及装置、电子设备、可读存储介质
CN113538393A (zh) * 2021-07-26 2021-10-22 中冶京诚工程技术有限公司 棒线材原料坯弯曲检测方法、装置、设备及可读存储介质
CN113645464A (zh) * 2021-08-27 2021-11-12 优奈柯恩(北京)科技有限公司 用于检测摄像头的测试系统和用于检测摄像头的方法
CN114323585B (zh) * 2021-12-28 2024-04-12 梅卡曼德(北京)机器人科技有限公司 批量计算调制传递函数的方法、电子设备及存储介质
CN114445402B (zh) * 2022-04-02 2022-06-24 深圳市龙图光电有限公司 半导体芯片用掩模版贴膜精度检测方法及检测装置
CN116309447B (zh) * 2023-03-17 2024-01-05 水利部交通运输部国家能源局南京水利科学研究院 一种基于深度学习的水坝斜坡裂缝检测方法

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5917935A (en) * 1995-06-13 1999-06-29 Photon Dynamics, Inc. Mura detection apparatus and method
US7193196B2 (en) * 2003-10-29 2007-03-20 Lockheed Martin Corporation Methods and systems for evaluating optical systems
CN103024426A (zh) * 2011-09-23 2013-04-03 亚旭电子科技(江苏)有限公司 影像拍摄设备的测试方法
CN106101695B (zh) * 2016-06-15 2017-12-15 上海木爷机器人技术有限公司 摄像头识别精度检测系统
CN109859155A (zh) * 2017-11-30 2019-06-07 京东方科技集团股份有限公司 影像畸变检测方法和系统
CN108234998A (zh) * 2018-01-09 2018-06-29 南京华捷艾米软件科技有限公司 体感摄像头的精度测量方法和体感摄像头的精度测量装置
CN108470334B (zh) * 2018-03-20 2020-10-30 上海顺久电子科技有限公司 一种采集屏幕亮度和色度的方法及装置
CN108548655B (zh) * 2018-03-28 2020-05-15 华勤通讯技术有限公司 成像精度的测量系统及方法、测试主机
CN108924544A (zh) * 2018-06-29 2018-11-30 上海与德通讯技术有限公司 摄像头畸变测量方法与测试装置
CN109883654B (zh) * 2019-01-25 2021-11-09 武汉精立电子技术有限公司 一种用于oled亚像素定位的棋盘格图、生成方法及定位方法

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120062609A1 (en) * 2010-09-14 2012-03-15 Dae-Sick Jeon Luminance correction system for organic light emitting display
CN105120258A (zh) * 2015-07-20 2015-12-02 深圳市航盛电子股份有限公司 一种摄像头畸变率测试方法及系统
CN105389809A (zh) * 2015-10-26 2016-03-09 广州视源电子科技股份有限公司 显示性能测试方法、系统和装置
CN107240384A (zh) * 2017-08-11 2017-10-10 芯颖科技有限公司 显示器亮度补偿方法及装置
CN107358935A (zh) * 2017-08-25 2017-11-17 惠科股份有限公司 亮度补偿数据量的优化方式及设备
CN108831358A (zh) * 2018-06-13 2018-11-16 武汉精测电子集团股份有限公司 一种用于评估DeMura设备亮度测量精度的方法
CN110225336A (zh) * 2019-06-21 2019-09-10 京东方科技集团股份有限公司 评估图像采集精度的方法及装置、电子设备、可读介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3989542A4 *

Also Published As

Publication number Publication date
EP3989542A1 (en) 2022-04-27
US20210150257A1 (en) 2021-05-20
CN110225336B (zh) 2022-08-26
CN110225336A (zh) 2019-09-10
US11314979B2 (en) 2022-04-26
EP3989542A4 (en) 2022-08-10

Similar Documents

Publication Publication Date Title
WO2020253827A1 (zh) 评估图像采集精度的方法及装置、电子设备和存储介质
KR101590831B1 (ko) 기판의 이물질 검사방법
CN105812790B (zh) 图像传感器感光面与光轴垂直度的评测方法及光学测试卡
WO2017067342A1 (zh) 板卡位置检测方法及装置
US20120207379A1 (en) Image Inspection Apparatus, Image Inspection Method, And Computer Program
CN105718931B (zh) 用于确定采集图像中的杂斑的系统和方法
Flesia et al. Sub-pixel straight lines detection for measuring through machine vision
JP2005172559A (ja) パネルの線欠陥検出方法及び装置
CN111242888A (zh) 一种基于机器视觉的图像处理方法及系统
CN109406539B (zh) 一种透明药瓶底部积料缺陷检测系统与方法
KR101966075B1 (ko) 표시장치의 얼룩 검출 장치 및 방법
JP2005345290A (ja) 筋状欠陥検出方法及び装置
TW201326735A (zh) 寬度量測方法及系統
JP2009036582A (ja) 平面表示パネルの検査方法、検査装置及び検査プログラム
CN110225335A (zh) 相机稳定性评估方法及装置
CN113375555A (zh) 一种基于手机影像的电力线夹测量方法及系统
JP6647903B2 (ja) 画像検査装置、画像検査プログラム及びコンピュータで読み取り可能な記録媒体並びに記録した機器
CN117252915A (zh) 一种基于改进梯度加权的零件图像高精度聚焦方法及装置
CN114964032B (zh) 基于机器视觉的盲孔深度测量方法及装置
JP2005326323A (ja) 画質検査装置
CN108428250B (zh) 一种应用于视觉定位和标定的x角点检测方法
JP2019120644A (ja) 表面検査装置、及び表面検査方法
JP2004219291A (ja) 画面の線欠陥検出方法及び装置
CN204649642U (zh) 一种检测平面缺陷的装置
JP5603964B2 (ja) 平面表示パネルの検査方法、検査装置及び検査プログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20824412

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2020824412

Country of ref document: EP