CN117871047A - Light mixing effect evaluation method and device and terminal equipment - Google Patents

Light mixing effect evaluation method and device and terminal equipment

Info

Publication number
CN117871047A
CN117871047A (application CN202311852848.2A)
Authority
CN
China
Prior art keywords
image
tested
pixel point
display screen
light mixing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311852848.2A
Other languages
Chinese (zh)
Inventor
张旗
徐梦梦
石昌金
丁崇彬
丁彦辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Absen Optoelectronic Co Ltd
Huizhou Absen Optoelectronic Co Ltd
Original Assignee
Shenzhen Absen Optoelectronic Co Ltd
Huizhou Absen Optoelectronic Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Absen Optoelectronic Co Ltd, Huizhou Absen Optoelectronic Co Ltd filed Critical Shenzhen Absen Optoelectronic Co Ltd
Priority to CN202311852848.2A priority Critical patent/CN117871047A/en
Publication of CN117871047A publication Critical patent/CN117871047A/en
Pending legal-status Critical Current


Abstract

The application is applicable to the technical field of display screen detection, and provides a light mixing effect evaluation method, a light mixing effect evaluation device and terminal equipment, wherein the light mixing effect evaluation method comprises the following steps: shooting a display screen to be tested to obtain an image to be tested; determining a target pixel point from the image to be tested, wherein the target pixel point comprises pixel points with at least two overlapped single-base color groups or comprises pixel points with the difference between UV color coordinates and corresponding standard UV color coordinates smaller than a threshold value; and determining the target light mixing rate of the display screen to be tested according to the duty ratio of the target pixel point in the image to be tested. The method and the device can improve the evaluation accuracy of the light mixing effect of the display screen.

Description

Light mixing effect evaluation method and device and terminal equipment
Technical Field
The application belongs to the technical field of display screen detection, and particularly relates to a method and a device for evaluating a light mixing effect, terminal equipment and a computer readable storage medium.
Background
An LED display screen is an electronic display screen composed of an LED lattice, and each pixel point of the LED display screen is provided with LED lamp beads capable of emitting monochromatic light; for example, each pixel point of the most common full-color LED display screen is usually provided with three LED lamps, which respectively emit red, green and blue light, the three primary colors. The LED display screen has the advantages of high brightness, bright colors, high luminous efficiency, high contrast ratio and the like, so it is widely applied in various display devices such as stage display devices, advertisement display devices, data visualization display devices and commercial display devices.
The display effect of an LED display screen is generally affected by factors such as the luminous brightness and the color purity of the LED lamps, and the light mixing effect of the LED lamps is an important factor affecting the display effect of the LED display screen. However, the light mixing effect of an LED display screen is at present usually evaluated only by manual inspection, so the evaluation is easily affected by personal experience and subjective judgment, and its accuracy is low.
Disclosure of Invention
The embodiment of the application provides a method, a device and terminal equipment for evaluating a light mixing effect, which can improve the accuracy of evaluating the light mixing effect of a display screen.
In a first aspect, an embodiment of the present application provides a method for evaluating a light mixing effect, including:
shooting a display screen to be tested to obtain an image to be tested;
determining a target pixel point from the image to be detected, wherein the target pixel point comprises pixel points with at least two overlapped single-base color groups or comprises pixel points with the difference between UV color coordinates and corresponding standard UV color coordinates smaller than a threshold value;
and determining the target light mixing rate of the display screen to be tested according to the duty ratio of the target pixel point in the image to be tested.
In a second aspect, an embodiment of the present application provides a light mixing effect evaluation device, including:
The image acquisition module to be measured is used for shooting the display screen to be measured to obtain an image to be measured;
the target pixel point determining module is used for determining a target pixel point from the image to be detected, wherein the target pixel point comprises pixel points with at least two overlapped single-base color groups or comprises pixel points with the difference between UV color coordinates and corresponding standard UV color coordinates smaller than a threshold value;
and the evaluation module is used for determining the target light mixing rate of the display screen to be tested according to the duty ratio of the target pixel point in the image to be tested.
In a third aspect, an embodiment of the present application provides a terminal device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the steps of the method for evaluating a light mixing effect according to the first aspect are implemented when the processor executes the computer program.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program, which when executed by a processor, implements the steps of the light mixing effect evaluation method described in the first aspect.
In a fifth aspect, embodiments of the present application provide a computer program product, which when run on a terminal device, causes the terminal device to perform the light mixing effect evaluation method according to the first aspect.
Compared with the prior art, the embodiment of the application has the beneficial effects that:
in the embodiment of the present application, after the image to be measured corresponding to the display screen to be measured is obtained, target pixel points are determined from the image to be measured. A target pixel point is either a pixel point in the image to be measured where at least two single-primary color groups overlap (overlapping single-primary color groups means that at least two single-primary color lights are mixed, i.e. the pixel point is a mixed-light pixel point), or a pixel point whose UV color coordinate differs from the corresponding standard UV color coordinate by less than a threshold value, i.e. a pixel point whose color difference during actual display is small. The target light mixing rate of the display screen to be measured is then determined according to the ratio of the target pixel points in the image to be measured, so that the light mixing effect of the display screen to be measured can be evaluated objectively and accurately.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments or the prior art will be briefly described below.
Fig. 1 is a flow chart of a method for evaluating a light mixing effect according to an embodiment of the present application;
fig. 2 is a schematic diagram of an image of a target area in images to be processed before and after binarization processing according to an embodiment of the present application;
fig. 3 is a schematic diagram of an image to be processed and an image to be detected corresponding to each single primary color provided in the embodiment of the present application;
FIG. 4 is a schematic diagram of a target area image and a sub-image to be measured provided in an embodiment of the present application;
FIG. 5 is a schematic diagram of another target area image and a sub-image to be measured provided in an embodiment of the present application;
fig. 6 is a schematic structural diagram of a light mixing effect evaluation device according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a terminal device provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system configurations, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It should be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
Furthermore, the terms first, second and the like in the description and in the claims, are used for distinguishing between the descriptions and not necessarily for indicating or implying relative importance.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise.
Embodiment one:
fig. 1 shows a flow chart of a light mixing effect evaluation method according to an embodiment of the present invention, which is described in detail below:
step S101, shooting a display screen to be tested to obtain an image to be tested.
Optionally, when the display screen to be tested is photographed to obtain the image to be tested, the optical imaging device (used for photographing to obtain the image to be tested) may be placed right in front of the display screen to be tested and right opposite to the center point of the display screen to be tested, so as to better photograph the display screen to be tested.
In some embodiments, the distance between the lens of the optical imaging device and the display screen to be measured may be determined according to the light intensity distribution of the display screen to be measured. For example, assume that the luminous intensity of an LED received along the optical axis of the lens of the optical imaging device is required to differ from its intensity at the 0° viewing angle by less than 5%, and that, from the angular intensity distribution of the green LED, the angle at which the LED lamp at the edge of the display surface (i.e., the display area) of the display screen to be measured is viewed relative to the optical axis of the lens must therefore be less than 2° (i.e., the angle threshold corresponding to the 5% intensity difference threshold is 2°). The distance between the lens of the optical imaging device and the display screen to be measured can then be calculated from this geometry, where h denotes the height of the display surface of the display screen to be tested and φ denotes the angle threshold corresponding to the difference threshold, here 2°.
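The relationship between the shooting distance, h and φ is not written out above; as an illustrative sketch only, the following assumes the lens faces the center of the display surface, so that the edge LED sits half the display height off the optical axis and the minimum distance follows from a right triangle, d = (h/2)/tan(φ). The function name and the numeric example are illustrative assumptions, not values from the disclosure.

```python
import math

def min_shooting_distance(display_height_m: float, angle_threshold_deg: float = 2.0) -> float:
    """Estimate the minimum lens-to-screen distance so that the LED at the edge of the
    display surface is viewed within the angle threshold (camera centred on the screen).

    Assumed geometry: d = (h / 2) / tan(phi), with h the display-surface height."""
    phi = math.radians(angle_threshold_deg)
    return (display_height_m / 2.0) / math.tan(phi)

# Illustrative values: a 1.0 m high display surface and the 2 degree threshold above.
print(round(min_shooting_distance(1.0, 2.0), 2))  # ≈ 14.32 m
```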
In some embodiments, the optical imaging device may be a single-lens reflex camera, and its lens may be a non-wide-angle zoom lens. When the image to be measured is collected by the optical imaging device, the imaging area of the display screen to be measured may need to be larger than an area threshold (for example, 4/5 of the imaging area of the photosensitive element of the optical imaging device). By adjusting parameters such as the exposure time, aperture size, ISO value and focusing ring, the gray scale of the captured image to be measured can be kept within the linear working region of the photosensitive element, and when the display screen to be measured displays a single primary color, an obvious boundary exists between the pixel points without overexposure or under-exposure, so that the subsequent evaluation of the light mixing effect of the display screen to be measured can be better performed based on the image to be measured.
Step S102, determining a target pixel point from the image to be detected, wherein the target pixel point comprises a pixel point with at least two overlapped single-base color groups, or comprises a pixel point with a difference between a UV color coordinate and a corresponding standard UV color coordinate smaller than a threshold value.
It can be understood that three LED lamps are generally disposed in each pixel of the display screen, and each LED lamp emits light of one of three single primary colors (such as red, green and blue in the RGB color space). A single primary color group is the light group formed when the LED lamp of that single primary color in a pixel of the display screen to be tested is turned on, that is, the light group is equivalent to the display range of that LED lamp. When two or more single primary color groups overlap, it indicates that the light emitted by LED lamps of two or more different single primary colors is mixed.
The UV color coordinates are a color space representation method based on the chromaticity diagram of the CIE 1976 standard, which represents the chromaticity information of colors by UV coordinates while ignoring luminance information. In the UV color coordinate system, each color may be represented by a point in a two-dimensional coordinate system, where the U coordinate and the V coordinate together describe the chromaticity of the color; the coordinates represent chromaticity information only and cannot additionally represent luminance information. It can be understood that when the display screen to be tested displays a certain color, if the UV color coordinates of pixel points in the image to be tested differ from the standard UV color coordinates corresponding to that color, it indicates that, due to factors such as the screen body packaging and the brightness of the LED lamps, a color difference exists between the color actually displayed after the different primary colors are mixed and the color that should be displayed.
Specifically, when light emitted by two or more LED lamps with different single primary colors is mixed, display ranges corresponding to at least two LED lamps with single primary colors overlap, so in order to better evaluate the light mixing effect of the display screen to be tested, a pixel point with at least two overlapping single primary color groups can be found out from the image to be tested as a target pixel point, so that the evaluation of the light mixing effect is performed based on the target pixel point.
Alternatively, the UV color coordinates of each pixel point in the image to be measured can be obtained, the difference value between the UV color coordinates of each pixel point and the corresponding standard UV color coordinates is determined, and whether the difference value corresponding to each pixel point is greater than a threshold (e.g., 0.01) is judged. If the difference value of the UV color coordinates corresponding to a pixel point is less than or equal to the threshold, the color difference between the color of the pixel point and the color to be displayed is considered small, i.e., the light mixing effect is good, so the pixel points whose difference between the UV color coordinates and the standard UV color coordinates is less than the threshold are used as target pixel points to evaluate the light mixing effect.
Optionally, when a pixel point with a difference between the UV color coordinates and the standard UV color coordinates being smaller than a threshold value is obtained as a target pixel point, an image of the display screen to be measured under the condition that a certain mixed color (i.e., a color formed by mixing at least two primary colors, such as white, purple, etc.) is displayed can be obtained as an image to be measured, so that the standard UV color coordinates corresponding to each pixel point can be well determined and the difference of the UV color coordinates can be calculated, and the evaluation effect can be improved. In order to further improve the accuracy of evaluation, an image of the display screen to be evaluated under the condition of displaying the color formed by mixing the three primary colors (such as R, G, B) can be obtained to serve as an image to be evaluated, and the LED lamps corresponding to the three primary colors are ensured to be lightened so as to evaluate the light mixing effect of the display screen to be evaluated better.
In the embodiment of the application, the pixel point with at least two overlapped single-primary color groups in the image to be detected is determined to be the target pixel point, or the pixel point with the difference between the UV color coordinates and the corresponding standard UV color coordinates smaller than or equal to the threshold value is determined to be the target pixel point, and the determined target pixel point is the pixel point for mixing the light with different single primary colors, so that the light mixing effect of the display screen to be detected can be well evaluated based on the target pixel point.
And step S103, determining the target light mixing rate of the display screen to be tested according to the duty ratio of the target pixel point in the image to be tested.
Specifically, since the determined target pixel point includes a pixel point where at least two overlapping single-primary color light groups exist, or includes a pixel point where a difference between a UV color coordinate and a standard UV color coordinate is less than or equal to a threshold value, that is, a pixel point where single-primary color light is mixed, the light mixing effect of the display screen to be tested can be evaluated according to the ratio of the target pixel point in the image to be tested, that is, the ratio of the number of the target pixel points to the number of the pixel points in the image to be tested is taken as the target light mixing rate of the display screen to be tested, and quantization of the light mixing effect is achieved while the evaluation of the light mixing effect is achieved.
For example, assuming that the image to be measured of the display screen a to be measured includes 1000 pixels, where the number of target pixels is 780, the target light mixing rate of the display screen a to be measured is 78% (780/1000×100%).
It can be understood that after the target light mixing rate of the display screen to be tested is obtained, the light mixing effect of the display screen to be tested can be evaluated according to the target light mixing rate, for example, when the target light mixing rate is smaller than a light mixing threshold (such as 70%), the light mixing effect of the display screen to be tested is determined to be unqualified, at this time, the factor affecting the light mixing effect of the display screen to be tested can be detected, so as to adjust the display screen to be tested, when the target light mixing rate is greater than or equal to the light mixing threshold (such as 70%), the light mixing effect of the display screen to be tested can be determined to be qualified, and when the target light mixing rate is greater than or equal to a light mixing good threshold (such as 90%), the light mixing effect of the display screen to be tested can be determined to be better.
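As a minimal sketch of this quantification step, the following assumes the target pixel points have already been counted; the 70% and 90% grading thresholds are the example values mentioned above, and the function name is an assumption.

```python
def evaluate_light_mixing(num_target_pixels: int, num_total_pixels: int,
                          pass_threshold: float = 0.70,
                          good_threshold: float = 0.90) -> tuple[float, str]:
    """Compute the target light mixing rate and grade it against example thresholds."""
    rate = num_target_pixels / num_total_pixels
    if rate < pass_threshold:
        grade = "unqualified"
    elif rate < good_threshold:
        grade = "qualified"
    else:
        grade = "good"
    return rate, grade

# Matches the worked example: 780 target pixels out of 1000 gives 78%.
print(evaluate_light_mixing(780, 1000))  # (0.78, 'qualified')
```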
In the embodiment of the present application, after the image to be measured corresponding to the display screen to be measured is obtained, target pixel points are determined from the image to be measured. A target pixel point is either a pixel point where at least two single-primary color groups overlap (overlapping single-primary color groups means that at least two single-primary color lights are mixed, i.e. the pixel point is a mixed-light pixel point), or a pixel point whose UV color coordinate differs from the corresponding standard UV color coordinate by less than a threshold value, i.e. a pixel point whose color difference during actual display is small. The target light mixing rate of the display screen to be measured is then determined according to the ratio of the target pixel points in the image to be measured, so that the light mixing effect of the display screen to be measured can be evaluated objectively and accurately.
In some embodiments, the step S101 includes:
a1, when the display screens to be tested are respectively displayed in single basic colors, shooting the display screens to be tested, and obtaining the images to be processed corresponding to the single basic colors.
A2, carrying out composite processing on each image to be processed to obtain the image to be detected.
Specifically, when the light mixing effect is required to be evaluated based on the existence of at least two overlapped single-base color groups, in order to better determine the display range of each single-base color LED lamp in each pixel point of the display screen to be tested, so as to accurately evaluate the light mixing rate of the display screen to be tested, when the image to be tested is obtained, the display screen to be tested can be made to display three single-base colors (such as red, green and blue are sequentially displayed), and when the display screen to be tested is displayed in a single-base color mode, the display screen to be tested is photographed, the image to be processed corresponding to the single-base color is obtained, and therefore the image to be processed corresponding to each single-base color when the display screen to be tested is displayed in each single-base color mode is obtained.
After the to-be-processed images corresponding to the single primary colors are obtained, the to-be-processed images of the single primary colors can be subjected to compound processing, and the three to-be-processed images of the single primary colors are combined into one image through the compound processing, so that the to-be-detected image is obtained.
In some embodiments, the to-be-processed images corresponding to the single basic colors can be first converted into gray images in a unified manner, so that the to-be-processed images corresponding to the single basic colors are better compounded into the to-be-detected image.
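A minimal sketch of steps A1-A2 under the assumption that the three single-primary-color images have already been captured as NumPy arrays; reducing each image to a gray channel by taking the per-pixel maximum over the color channels is an illustrative choice, not a requirement of the method.

```python
import numpy as np

def to_gray(img: np.ndarray) -> np.ndarray:
    """Reduce a single-primary-color image (H x W x 3) to one gray channel; the
    per-pixel maximum over the color channels is used here as a simple choice."""
    return img.max(axis=2).astype(np.uint16)

def composite_images(img_r: np.ndarray, img_g: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Composite the three to-be-processed images into one image to be tested."""
    composite = to_gray(img_r) + to_gray(img_g) + to_gray(img_b)
    return np.clip(composite, 0, 255).astype(np.uint8)
```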
In the embodiment of the application, the image of the display screen to be tested when displaying one single primary color is respectively obtained as the image to be processed corresponding to the single primary color, so that the display range of each single primary color in each pixel point of the display screen to be tested (namely, the pixel point corresponding to the single primary color group in the image to be processed) is well determined, and then the image to be processed corresponding to each single primary color is compounded into the image to be tested containing three single primary colors, so that the pixel point with overlapped single primary color groups, namely, the target pixel point, can be well determined based on the image to be tested obtained by compounding, and the evaluation accuracy is improved.
In some embodiments, in the case of photographing based on the side view angle of the display screen to be tested, the step A1 includes:
and when the display screen to be tested is displayed in each single base color, respectively lighting the pixel points of each column of the display screen to be tested, wherein under the condition that each column of the pixel points of the display screen to be tested are lighted, acquiring an image of the currently lighted column of the pixel points, and obtaining a candidate image corresponding to the currently lighted column of the pixel points.
And respectively splicing the candidate images corresponding to the single basic colors to obtain the image to be processed corresponding to the single basic colors.
Optionally, when the angle between the viewing direction (i.e., the direction of the lens of the optical imaging device, that is, the shooting direction) and the normal of the display screen to be measured is greater than an angle threshold (e.g., 45°), the viewing angle may be considered a side view angle. In other embodiments, any viewing angle that is not a front view angle may be considered a side view angle.
Specifically, in order to more comprehensively and accurately evaluate the light mixing effect of the display screen to be tested, an image of the display screen to be tested can be collected from a side view angle (namely, a side view angle of the display screen to be tested) to serve as the image to be tested, and the light mixing effect of the display screen to be tested when the display screen to be tested is observed from the side view angle is evaluated.
When the optical imaging device shoots the display screen to be tested at a side view angle, different columns of pixels are at different distances from the lens and therefore correspond to different focus settings, which affects the display range of each single-primary color group in the shot image. Therefore, in order to improve the accuracy of the evaluation of the light mixing effect, when the image of the display screen to be tested displaying each single primary color is obtained from the side view angle, the display screen to be tested is made to display each single primary color in turn; while the display screen to be tested displays one single primary color, the columns of pixel points of the display screen to be tested (namely, the LED lamps corresponding to that single primary color in each column of pixel points) are lit one column at a time, and each time a column of pixel points is lit, an image of the currently lit column is captured to obtain the candidate image corresponding to that column of pixel points.
After the candidate image corresponding to each column of pixel points is obtained under the condition that the display screen to be tested displays a certain single primary color, the pixel points in each column are spliced in sequence according to the sequence (namely the sequence of the columns) to obtain the image to be processed corresponding to the single primary color, so that the image to be processed corresponding to each single primary color is obtained.
For example, assuming that the display screen A to be tested includes 1000 columns of pixels, when the to-be-processed image of the display screen A to be tested when displaying green is obtained, the 1st column of pixels of the display screen A to be tested is first lit and photographed to obtain candidate image 1 corresponding to the 1st column of pixels; then the 2nd column of pixels is lit and photographed to obtain candidate image 2 corresponding to the 2nd column of pixels; then the 3rd column of pixels is lit and photographed to obtain candidate image 3 corresponding to the 3rd column of pixels; and so on until candidate image 1000 corresponding to the 1000th column of pixels is obtained, giving 1000 candidate images. Finally, the 1000 candidate images are spliced according to the order of the corresponding columns of pixels to obtain the to-be-processed image corresponding to green, that is, candidate image 1, candidate image 2, ..., candidate image 1000 are spliced in sequence to obtain the to-be-processed image containing the candidate images corresponding to the 1000 columns of pixels.
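A sketch of the column-by-column acquisition and splicing described above; `light_column` and `capture_column_image` are assumed hooks for the screen controller and the camera (they are not part of the disclosure), and each captured candidate image is assumed to be the strip covering the currently lit column.

```python
import numpy as np

def build_image_to_be_processed(num_columns: int, light_column, capture_column_image) -> np.ndarray:
    """Light one column of pixel points at a time, capture its candidate image,
    and splice the candidates in column order to form the image to be processed."""
    candidates = []
    for col in range(num_columns):
        light_column(col)                             # assumed to light only this column and switch the previous one off
        candidates.append(capture_column_image(col))  # assumed to return the strip for this column
    return np.hstack(candidates)                      # splice candidate images in column order
```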
It should be noted that, when the columns of pixel points of the display screen to be tested are lit in turn, only one column of pixel points is in the lit state each time a candidate image is captured, and the other columns of pixel points are unlit; that is, before the next column of pixel points is lit, the LED lamps in the currently lit column are turned off, so that when the candidate image is shot only one column of pixel points is lit and the candidate image corresponding to that column is obtained accurately. Meanwhile, when the next column of pixel points is lit, the optical imaging device needs to be focused again, and the candidate images corresponding to the columns of pixel points are all shot at the same focal length, so that the accuracy of the obtained image to be processed is improved.
In the embodiment of the application, when the to-be-processed image of the display screen to be tested under the side view angle is obtained, each column of pixel points is lit in turn while the display screen displays a single primary color, and the candidate images corresponding to the columns of pixel points are obtained and spliced into the to-be-processed image corresponding to that single primary color under the side view angle; the to-be-processed image corresponding to each single primary color is obtained in this way. This avoids the influence on the to-be-processed image caused by the different focus distances of different columns of pixel points under the side view angle, improves the accuracy of the to-be-processed image, and thereby improves the accuracy of the evaluation of the light mixing effect.
In some embodiments, before the step A2, the method further includes:
and carrying out the same dividing processing on each image to be processed to obtain the divided images to be processed, wherein the dividing processing is used for dividing the images to be processed into a plurality of target areas, and the number of pixel points in each target area is equal.
And determining effective pixel points in the target area on the basis of gray threshold values corresponding to the target areas respectively for each divided image to be processed, wherein the gray threshold values are determined according to the maximum gray value of the pixel points in the target area, and the effective pixel points are pixel points with gray values larger than or equal to the gray threshold values.
And respectively carrying out binarization processing on the pixel points in each target area on each divided image to be processed to obtain the image to be processed after the binarization processing, wherein the gray value of the effective pixel point in the image to be processed after the binarization processing is a preset value.
Correspondingly, the step A2 includes:
and carrying out composite processing on the images to be processed after each binarization processing to obtain the images to be detected.
It can be understood that, because the gray value of the pixel point in the image to be processed obtained by shooting can reflect the brightness of the pixel point, the larger the gray value of the pixel point in the image to be processed is, the higher the brightness of the pixel point (i.e. the pixel point in the image) is, and the brightness of the pixel point is influenced by the display range (i.e. the single-primary color group) of the single-primary color LED lamp in the display screen to be detected, the brightness of the pixel point in the edge area of the single-primary color group and the pixel point in the area not corresponding to the single-primary color group is low, therefore, the pixel point corresponding to each single-primary color group in the image to be processed can be determined according to the gray value of the pixel point, i.e. the position of each single-primary color group in the target area is determined.
In order to better determine the pixel points where at least two single-primary color groups overlap and to improve the efficiency and accuracy of the evaluation of the display screen to be measured, a plurality of areas (namely target areas) can be divided in each image to be processed through the dividing processing, and the number of pixel points contained in each target area is equal. Optionally, at least two or at least three target areas may be divided in the image to be processed.
After a plurality of target areas are respectively divided in each image to be processed, for each target area in each divided image to be processed, determining an effective pixel point in the target area according to a gray threshold corresponding to the target area, wherein the effective pixel point is a pixel point with a gray value greater than or equal to the gray threshold, namely, the effective pixel point is a pixel point corresponding to a single-base color light group. The gray threshold corresponding to the target area is determined according to the maximum gray value of the pixel points in the target area.
For example, as shown in fig. 2, in a target area in an image to be processed corresponding to a single base color, it is assumed that the range of gray values of each pixel point in the target area is 0-5, the larger the gray value of the pixel point is, the higher the brightness of the pixel point is, the black part indicates that the gray value of the pixel point is 0, when the binarization processing is performed on the pixel point, 1 is used as a gray threshold value, the pixel point with the gray value greater than or equal to 1 is used as an effective pixel point, and when the binarization processing is performed, the gray value of the effective pixel point is set to be the maximum gray value of 5.
After determining the effective pixel points in each target area in each divided image to be processed, performing binarization processing on the pixel points in each target area respectively, so as to obtain a binarized image to be processed, wherein the gray value of the effective pixel point in the binarized image to be processed is a preset value.
For example, as shown in fig. 3, assume that binarized processed images corresponding to respective single base colors are obtained: and (3) compositing the single-primary-color image R, the single-primary-color image G and the single-primary-color image B to obtain an image A to be detected, and intuitively determining the positions corresponding to all the single-primary-color groups in each target area and whether all the single-primary-color groups overlap or not based on the image A to be detected, namely whether at least two overlapped single-primary-color groups exist in a pixel point or not.
It should be noted that, in some embodiments, when performing the binarization processing, the preset value is less than or equal to 85 (255/3), so that after each image to be processed is composited to obtain the image to be detected, the gray value of the pixel point where three overlapping mono-basic groups exist is not greater than the maximum gray value 255.
For example, assuming that 50 pixels are included in the target area a, and the maximum gray value of the 50 pixels is 200, the gray threshold value corresponding to the target area a may be 20 (200×0.1, i.e., a value 0.1 times the maximum gray value is taken as the gray threshold value). When the binarization process is performed on the pixel points in the target area a based on the gray threshold 20, the gray value of the pixel points having a gray value smaller than the gray threshold 20 may be set to 0, and the gray value of the pixel points having a gray value greater than or equal to the gray threshold 20 may be set to a preset value (e.g., 80).
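A sketch of the dividing and binarization steps, assuming the image divides evenly into the requested grid, the gray threshold of each target area is 0.1 times its maximum gray value, and the preset value is 80, as in the example above.

```python
import numpy as np

def binarize_by_regions(gray: np.ndarray, rows: int, cols: int,
                        threshold_ratio: float = 0.1, preset_value: int = 80) -> np.ndarray:
    """Divide the gray image into rows x cols equal target areas; in each area, set
    pixels with gray value >= threshold_ratio * (area maximum) to the preset value
    and all other pixels to 0."""
    out = np.zeros_like(gray)
    h, w = gray.shape
    rh, cw = h // rows, w // cols
    for i in range(rows):
        for j in range(cols):
            area = gray[i * rh:(i + 1) * rh, j * cw:(j + 1) * cw]
            if area.max() == 0:        # nothing lit in this area, leave it all zero
                continue
            thresh = threshold_ratio * area.max()
            out[i * rh:(i + 1) * rh, j * cw:(j + 1) * cw] = np.where(area >= thresh, preset_value, 0)
    return out
```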
In other embodiments, when binarizing the pixel points in the target areas of the image to be processed of each single primary color, the preset values corresponding to different single primary colors may be different; that is, when the gray value of an effective pixel point is set to the preset value, the single primary color groups of different single primary colors are represented by different gray values. After the image to be detected is obtained by compositing, the positions of the single primary color groups of each single primary color can then be determined well according to the gray values, so that whether single primary color groups of different single primary colors overlap, and which single primary color groups overlap, can be judged well, which improves the accuracy and efficiency of determining the target pixel points. In this case, it should be noted that the sum of any two preset values should differ from the remaining preset value, so as to avoid interference between the preset values; meanwhile, the maximum gray value (i.e., brightness) is usually 255, that is, the sum of the three preset values is at most 255. For example, assume that the preset values corresponding to the individual single primary colors include, from small to large, a first preset value, a second preset value and a third preset value; then the sum of the first preset value and the second preset value is not equal to the third preset value, and the sum of the first preset value, the second preset value and the third preset value is at most 255.
In the embodiment of the application, a plurality of target areas are divided in each image to be processed, the number of pixel points contained in each target area is equal, effective pixel points in each target area are respectively determined, the effective pixel points are pixel points corresponding to single-color light groups, the gray value of the effective pixel points is set to be a preset value through binarization processing, so that the pixel points corresponding to the single-color light groups in the image to be processed can be determined better based on the gray value of the pixel points, and the pixel points with at least two overlapped single-color light groups can be determined based on the pixel points corresponding to the single-color light groups.
In some embodiments, the step S102 includes:
for each pixel point in the image to be detected, when the gray value of the pixel point is at least twice the preset value, determining that at least two overlapped single-base color groups exist in the pixel point whose gray value is at least twice the preset value.
And determining the target pixel point according to the pixel points with at least two overlapped single-base color groups.
Specifically, since the gray value of the effective pixel point in the image to be processed, that is, the pixel point corresponding to the single-color group, is set to a preset value through gray binarization in advance, that is, the position of each single-color group in each image to be processed is determined, when the target pixel point is determined from the image to be processed, whether at least two overlapped single-color groups exist in the pixel point can be determined according to the gray value of each pixel point in the image to be processed.
And judging whether the gray value of each pixel point of the image to be tested is a multiple of a preset value, if the gray value of the pixel point is at least twice the preset value, indicating that at least two overlapped single-base color groups exist in the pixel point, so that the gray value of the pixel point is a multiple of the preset value and the multiple is at least 2, and therefore, determining the pixel point (namely, the pixel point with the gray value being at least twice the preset value) as a target pixel point, and evaluating the light mixing rate of the display screen to be tested.
It should be noted that, when the binarization processing is performed, when the gray value of the effective pixel point in the to-be-processed image corresponding to each single primary color is set to the same preset value, the pixel point with the gray value being at least twice of the preset value, that is, the pixel point with at least two overlapped single primary color groups, and when the gray value of the to-be-processed image corresponding to different single primary colors is set to different preset values, the pixel point with the gray value being the sum of at least two preset values is the pixel point with at least two overlapped single primary color groups.
For example, assume that, when the binarization processing is performed, the preset values for red, green and blue are all 80, and that a divided target area includes 25 pixel points; the images corresponding to the binarized target area for the three single primary colors are as shown in fig. 4: the target area image A, the target area image B and the target area image C. These are composited to obtain the image corresponding to the target area in the image to be detected (referred to here as the sub-image to be detected). In the sub-image to be detected, a pixel point with a gray value of 0 can be regarded as having no single primary color group, while a pixel point whose gray value is at least twice 80 is a pixel point where at least two single primary color groups overlap, since its gray value is two or three times the preset value. Therefore, the pixel points in the sub-image to be detected whose gray value is at least twice 80 can be determined as the target pixel points in the sub-image to be detected, i.e., the sub-image to be detected contains 15 target pixel points.
For another example, assume that, when the binarization processing is performed, the preset value for red is 50, the preset value for green is 80 and the preset value for blue is 110, and that the divided target area includes 25 pixel points; the images corresponding to the binarized target area for the three single primary colors are as shown in fig. 5: the target area image A, the target area image B and the target area image C. These are composited to obtain the image corresponding to the target area in the image to be detected (referred to here as the sub-image to be detected). In the sub-image to be detected, a pixel point with a gray value of 0 can be regarded as having no single primary color group, a pixel point whose gray value equals one of the preset values corresponds to a single primary color group, and a pixel point whose gray value is neither 0 nor any of the three preset values is a pixel point where at least two single primary color groups overlap, since its gray value is the sum of two or more preset values. Therefore, the pixel points in the sub-image to be detected whose gray value is neither 0 nor one of the preset values can be determined as the target pixel points, i.e., the sub-image to be detected contains 15 target pixel points.
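A sketch of how the target pixel points might be counted from the composited image, covering both variants described above (equal preset values versus distinct preset values); the concrete preset values are those from the examples.

```python
import numpy as np

def count_targets_equal_preset(composite: np.ndarray, preset_value: int = 80) -> int:
    """Equal-preset variant: a gray value of at least twice the preset value means
    at least two single-primary-color light groups overlap at that pixel."""
    return int(np.count_nonzero(composite >= 2 * preset_value))

def count_targets_distinct_presets(composite: np.ndarray, presets=(50, 80, 110)) -> int:
    """Distinct-preset variant: a gray value that is neither 0 nor a single preset
    value must be a sum of two or more presets, i.e. an overlap."""
    non_overlap_values = list(presets) + [0]
    return int(np.count_nonzero(~np.isin(composite, non_overlap_values)))
```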
In the embodiment of the present application, since the gray value of the effective pixel point in the to-be-processed image corresponding to each single-primary color is set to the preset value in advance, and the effective pixel point is the pixel point corresponding to the single-primary color group, when determining the target pixel point according to the to-be-detected image obtained by compositing each to-be-processed image, it can be better determined whether the pixel point where at least two single-primary color groups overlap exists according to whether the gray value of the pixel point in the to-be-detected image is a multiple of the preset value, thereby obtaining the target pixel point.
In some embodiments, the step S103 includes:
and respectively calculating the duty ratio of the target pixel point in the target area of each target area of the image to be detected to obtain the light mixing rate corresponding to each target area.
And screening out abnormal light mixing rates in the light mixing rates based on an abnormal threshold value, and obtaining the screened light mixing rates, wherein the abnormal threshold value is determined according to the average value of the light mixing rates.
And determining the target light mixing rate according to the screened light mixing rates.
Specifically, in order to improve accuracy of the obtained target light mixing rate, when determining the target light mixing rate of the display screen to be tested based on the target pixel points, after determining the light mixing rate (i.e., the first light mixing rate) corresponding to the target area for the target pixel points in each target area, the target light mixing rate of the whole display screen to be tested may be determined according to the light mixing rate corresponding to each target area. The number of the pixel points contained in each target area obtained by dividing is equal.
When determining the light mixing rate corresponding to the target area, the light mixing rate corresponding to the target area may be determined according to a ratio of the number of target pixels included in the target area to the number of pixels included in the target area.
After the light mixing rates corresponding to the target areas are obtained, the average value of the light mixing rates can be calculated first, and an abnormal threshold value is then determined according to the average value (for example, 0.2 times the average value). The light mixing rates that deviate greatly from the average value are screened out through the abnormal threshold value, i.e., the abnormal light mixing rates are removed, and the screened light mixing rates are obtained; it can be understood that the screened light mixing rates no longer contain the light mixing rates judged to be abnormal.
After the abnormal light mixing rate is screened out, and each screened light mixing rate is obtained, the target light mixing rate of the whole display screen to be tested can be determined according to the average value of each screened light mixing rate.
For example, assume that the light mixing rates corresponding to 10 target areas are calculated as (0.8, 0.75, 0.85, 0.78, 0.6, 0.95, 0.3, 0.8, 0.82, 0.7), so the average value of the 10 light mixing rates is 0.735. Assume that the anomaly threshold is set to 0.5 times the mean value and that a light mixing rate smaller than the anomaly threshold is an abnormal light mixing rate; the anomaly threshold corresponding to the 10 light mixing rates is then 0.3675 (0.735 × 0.5). Since the value 0.3 among the 10 light mixing rates is smaller than the anomaly threshold, it is determined to be an abnormal light mixing rate and removed, giving 9 screened light mixing rates: (0.8, 0.75, 0.85, 0.78, 0.6, 0.95, 0.8, 0.82, 0.7). According to the average value of the 9 screened light mixing rates, the target light mixing rate of the display screen to be detected is 0.783 (i.e., 78.3%).
Optionally, when determining the abnormal threshold based on the average value of the light mixing rates, an abnormal threshold upper limit (for example, 1.2 times the average value) and an abnormal threshold lower limit (for example, 0.4 times the average value) may be determined respectively; when screening out abnormal light mixing rates, the light mixing rates smaller than the lower limit or larger than the upper limit are removed, so as to obtain the screened light mixing rates.
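A sketch of the screening and averaging step; the lower-bound coefficient follows the worked example above, the optional upper bound follows the variant just described, and both coefficients are example values rather than fixed parameters of the method.

```python
def target_mixing_rate(region_rates: list[float], low_factor: float = 0.5,
                       high_factor: float | None = None) -> float:
    """Screen out per-area light mixing rates that fall outside bounds derived from
    the mean, then average the remaining rates to get the target light mixing rate."""
    mean = sum(region_rates) / len(region_rates)
    low = low_factor * mean
    high = high_factor * mean if high_factor is not None else float("inf")
    kept = [r for r in region_rates if low <= r <= high]
    return sum(kept) / len(kept)

rates = [0.8, 0.75, 0.85, 0.78, 0.6, 0.95, 0.3, 0.8, 0.82, 0.7]
print(round(target_mixing_rate(rates), 3))  # 0.783, as in the worked example
```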
In the embodiment of the application, firstly, the light mixing rate corresponding to each divided target area is calculated, then, an abnormal threshold value is set according to the average value of each light mixing rate, the abnormal light mixing rate is screened out based on the abnormal threshold value, and then, the target light mixing rate of the display screen to be tested is calculated according to each screened light mixing rate which does not contain the abnormal light mixing rate, so that the interference of abnormal data is reduced, and the accuracy of evaluation of the light mixing effect is improved.
In some embodiments, before the step S101, the method further includes:
and obtaining the tristimulus values corresponding to the single basic colors of the display screen to be tested.
And determining the brightness proportion corresponding to each single basic color when the display screen to be tested needs to display standard white light according to the tristimulus values corresponding to each single basic color.
Correspondingly, the step S101 includes:
And when the display screen to be tested is displayed in the standard white light based on the brightness proportion corresponding to each single basic color, shooting the display screen to be tested to obtain the image to be tested.
Tristimulus values are defined by the International Commission on Illumination (CIE), also known as the CIE XYZ tristimulus values, and describe the color perception of any spectral color by the human eye. The tristimulus values represent a color through the three parameters X, Y and Z, which correspond to the three axes of the color space and, respectively, to the three single primary colors red, green and blue: the X value represents the intensity of the red light component, the Y value the intensity of the green light component, and the Z value the intensity of the blue light component. It can be understood that any color can be represented by the proportion of its tristimulus values in the three XYZ directions.
Specifically, when the light mixing effect is required to be evaluated based on the pixel point that the difference between the UV color coordinates and the corresponding standard UV color coordinates is smaller than the threshold value, the tristimulus values corresponding to the single primary colors of the display screen to be tested can be obtained according to measurement, such as a color brightness meter, and then the brightness proportion corresponding to the single primary colors of the display screen to be tested when standard white light (such as 6500K white light) is displayed can be obtained through calculation according to the tristimulus values corresponding to the single primary colors and the corresponding color matching formula. Alternatively, the luminance ratio may be a luminance ratio of three primary colors of RGB.
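The disclosure refers to "the corresponding color matching formula" without writing it out; one common way to perform this computation, shown below purely as an assumed sketch, is to arrange the measured tristimulus values of the three primaries (each at full drive) as the columns of a 3x3 matrix and solve a linear system for the drive scales that reproduce the target white point.

```python
import numpy as np

def luminance_ratio_for_white(xyz_r, xyz_g, xyz_b, xyz_white) -> np.ndarray:
    """Solve s_r*XYZ_r + s_g*XYZ_g + s_b*XYZ_b = XYZ_white for the drive scales, then
    return each primary's share of the total luminance (the brightness proportion)."""
    primaries = np.column_stack([xyz_r, xyz_g, xyz_b])        # measured XYZ of R, G, B as columns
    scales = np.linalg.solve(primaries, np.asarray(xyz_white, dtype=float))
    luminances = scales * primaries[1, :]                      # the Y row carries luminance
    return luminances / luminances.sum()
```

The target white point (for example a 6500 K white) would itself be expressed as tristimulus values before calling such a routine.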
After the brightness proportion corresponding to each single basic color is calculated, the standard white light can be displayed on the display screen to be tested according to the calculated brightness proportion and the corresponding single basic color, and when the display screen to be tested is displayed by the standard white light, the display screen to be tested is shot, and an image of the display screen to be tested when the standard white light is actually displayed is obtained as an image to be tested.
In the embodiment of the application, since the tristimulus values are the basic standard from which any color space is derived, when the target pixel points are to be determined based on the UV coordinates of the image pixel points, the brightness proportion of each single primary color required for the display screen to be measured to display standard white can be calculated from the measured tristimulus values of the display screen to be measured, and the display screen to be measured is then made to actually display standard white according to this brightness proportion. In this way, the effect of displaying white at the standard brightness proportion, as in actual use of the display screen to be measured, is reproduced, and the color difference in actual display can subsequently be analyzed to accurately determine the target pixel points.
In some embodiments, the step S102 includes:
and acquiring a tristimulus value corresponding to each pixel point in the image to be detected.
And determining the color coordinates corresponding to the tristimulus values corresponding to the pixel points, and converting the color coordinates corresponding to the tristimulus values corresponding to the pixel points into the UV color coordinates corresponding to the pixel points.
And determining the difference value between the UV color coordinates corresponding to the pixel points and the UV color coordinates corresponding to the standard white light for each pixel point.
And determining the target pixel point from the pixel points according to the difference value corresponding to the pixel point.
Specifically, since the acquired image to be measured is usually an RGB image, in order to analyze the color difference of each pixel point in the image to be measured more accurately, the tristimulus values corresponding to each pixel point in the image to be measured can be acquired pixel by pixel; the color coordinates corresponding to the tristimulus values of each pixel point are then determined and converted into the UV color coordinates of that pixel point, so that the UV color coordinates corresponding to each pixel point are obtained.
After the UV color coordinates corresponding to each pixel point are obtained, since the image to be measured is an image taken when the display screen to be measured displays the standard white light based on the calculated brightness ratio, when the color difference corresponding to each pixel point is determined, the difference between the UV color coordinates corresponding to each pixel point and the UV color coordinates corresponding to the standard white light is calculated for each pixel point.
The color of the white light is a color obtained by mixing three primary colors of light, and a color difference (i.e., a difference value of UV color coordinates) between the color actually displayed by the display screen to be tested and the standard white light can reflect a light mixing effect of the display screen to be tested, so that a target pixel point can be determined from all pixel points of the image to be tested according to the difference value corresponding to the pixel point, for example, a pixel point with the difference value less than or equal to a threshold value (e.g., 0.012) can be determined as the target pixel point.
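A sketch of the per-pixel chromaticity comparison, assuming the "UV color coordinates" are the CIE 1976 UCS coordinates (u', v') computed from the tristimulus values, and that the "difference" is the Euclidean distance between the pixel's chromaticity and that of standard white; the 0.012 threshold is the example value above.

```python
import math

def xyz_to_uv(X: float, Y: float, Z: float) -> tuple[float, float]:
    """CIE 1976 UCS chromaticity coordinates (u', v') from tristimulus values."""
    denom = X + 15 * Y + 3 * Z
    return 4 * X / denom, 9 * Y / denom

def is_target_pixel(pixel_xyz: tuple[float, float, float],
                    white_uv: tuple[float, float], threshold: float = 0.012) -> bool:
    """A pixel is a target pixel when its chromaticity lies within the threshold
    of the standard-white chromaticity."""
    u, v = xyz_to_uv(*pixel_xyz)
    return math.hypot(u - white_uv[0], v - white_uv[1]) <= threshold
```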
Optionally, when determining the target light mixing rate of the display screen to be measured according to the duty ratio of the target pixel points in the image to be measured, the image to be measured may first be divided (a second dividing processing) into a plurality of target areas (second target areas) containing equal numbers of pixel points. The duty ratio of the target pixel points in each target area of the image to be measured is then calculated to obtain the light mixing rate corresponding to each target area, the abnormal light mixing rates among them are screened out based on an abnormal threshold to obtain the screened light mixing rates, and the target light mixing rate is finally determined according to the screened light mixing rates. That is, the abnormal light mixing rates are filtered out through the abnormal threshold, and the target light mixing rate of the display screen to be measured is then calculated from the screened light mixing rates, which no longer contain the abnormal ones, so as to reduce the interference of abnormal data and improve the accuracy of the evaluation of the light mixing effect.
In the embodiment of the application, the difference value corresponding to a pixel point is the difference of the UV color coordinates corresponding to that pixel point, and it reflects the color difference between the color actually displayed by the display screen to be tested and the color expected to be displayed, that is, the light mixing effect of the display screen to be tested. Therefore, the target pixel points can be determined rapidly and accurately from the pixel points of the image to be tested according to the difference values corresponding to the pixel points, which improves the efficiency and the accuracy of the light mixing effect evaluation.
It should be understood that the sequence numbers of the steps in the foregoing embodiments do not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
Embodiment two:
Fig. 6 shows a block diagram of a light mixing effect evaluation device according to an embodiment of the present application, and for convenience of explanation, only the portions related to the embodiments of the present application are shown.
Referring to fig. 6, the apparatus includes: the image to be measured acquisition module 61, the target pixel point determination module 62 and the evaluation module 63. Wherein,
the image to be measured acquisition module 61 is used for shooting the display screen to be measured to obtain an image to be measured.
The target pixel point determining module 62 is configured to determine a target pixel point from the image to be tested, where the target pixel point includes a pixel point having at least two overlapping single-base color groups, or includes a pixel point for which the difference between its UV color coordinates and the corresponding standard UV color coordinates is smaller than a threshold value.
And the evaluation module 63 is configured to determine a target light mixing rate of the display screen to be tested according to the duty ratio of the target pixel point in the image to be tested.
In the embodiment of the present application, after an image to be tested corresponding to a display screen to be tested is obtained, a target pixel point is determined from the image to be tested. The target pixel point is either a pixel point in the image to be tested where at least two single-base color groups overlap (overlapping single-base color groups mean that single-base color light is mixed, so such a pixel point is a mixed-light pixel point), or a pixel point whose difference between its UV color coordinates and the corresponding standard UV color coordinates, that is, its color difference during actual display, is smaller than a threshold value. The target light mixing rate of the display screen to be tested is then determined according to the duty ratio of the determined target pixel points in the image to be tested, so that the light mixing effect of the display screen to be tested can be evaluated well.
In some embodiments, the image to be measured acquisition module 61 comprises:
and the to-be-processed image acquisition unit is used for shooting the display screen to be tested when the display screen to be tested is displayed in each single base color respectively, so as to obtain the images to be processed corresponding to each single base color.
And the compound processing unit is used for carrying out compound processing on each image to be processed to obtain the image to be detected.
In some embodiments, the image to be measured acquisition module 61 includes:
the dividing processing unit is used for carrying out the same dividing processing on each image to be processed to obtain the images to be processed after the dividing processing, wherein the dividing processing is used for dividing the images to be processed to obtain a plurality of target areas, and the number of pixel points in each target area is equal.
And an effective pixel point determining unit, configured to determine, for each of the divided images to be processed, an effective pixel point in the target area based on a gray level threshold corresponding to each of the target areas, where the gray level threshold is determined according to a maximum gray level value of the pixel point in the target area, and the effective pixel point is a pixel point having a gray level value greater than or equal to the gray level threshold.
And the binarization processing unit is used for respectively carrying out binarization processing on the pixel points in each target area for each divided image to be processed to obtain the image to be processed after the binarization processing, wherein the gray value of the effective pixel point in the image to be processed after the binarization processing is a preset value.
And the composite processing unit is used for carrying out composite processing on the images to be processed after each binarization processing to obtain the images to be detected.
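The division, effective-pixel and binarization units above can be sketched as follows, assuming grayscale images gray_r, gray_g and gray_b captured for the three single base colors, a gray threshold taken as half of each target area's maximum gray value (the exact relation to the maximum is not specified, so the 0.5 factor is an assumption), and a preset value of 1 so that the composite image marks overlaps as values of 2 or more.

```python
import numpy as np

def binarize_by_region(gray, rows=8, cols=8, frac=0.5, preset=1):
    """Binarize one single-base-color image area by area: pixels at or above frac * the area's
    maximum gray value are effective pixels and are set to the preset value."""
    h, w = gray.shape
    h, w = h - h % rows, w - w % cols
    bh, bw = h // rows, w // cols
    out = np.zeros((h, w), dtype=np.int32)
    for r in range(rows):
        for c in range(cols):
            block = gray[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            thr = frac * block.max()                     # gray threshold derived from the area's max gray value
            out[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw] = (block >= thr) * preset
    return out

# Composite processing: summing the binarized single-base-color images makes overlapping
# single-base color groups show up as composite values of at least 2 * preset.
composite = binarize_by_region(gray_r) + binarize_by_region(gray_g) + binarize_by_region(gray_b)
```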
In some embodiments, in the case of photographing based on the side view angle of the display screen to be tested, the image to be measured acquisition module 61 includes:
and the candidate image acquisition unit is used for lighting up each column of pixel points of the display screen to be tested in turn when the display screen to be tested is displayed in each single base color, wherein, when one column of pixel points of the display screen to be tested is lit, an image of the currently lit column of pixel points is acquired to obtain the candidate image corresponding to the currently lit column of pixel points.
And the splicing unit is used for respectively splicing the candidate images corresponding to the single basic colors to obtain the image to be processed corresponding to the single basic colors.
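A minimal sketch of the splicing unit, assuming each candidate image has already been cropped to a strip of equal height containing the currently lit column; the variable names are illustrative.

```python
import numpy as np

def stitch_columns(candidate_strips):
    """Concatenate per-column candidate images (each H x w_i) side by side into one image to be processed."""
    return np.concatenate(candidate_strips, axis=1)

# candidate_strips: list of arrays, one per lit column, captured from the side view angle
# image_to_process = stitch_columns(candidate_strips)
```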
In some embodiments, the target pixel point determining module 62 includes:
and a gray value judging unit, configured to judge, for each pixel point in the image to be tested, that at least two overlapping single-base color groups exist at a pixel point when the gray value of that pixel point is at least twice the preset value.
And the target pixel point determining unit is used for determining the target pixel points according to the pixel points with at least two overlapping single-base color groups.
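Given the composite image from the earlier binarization sketch (a sum of binarized single-base-color images with a preset value of 1), the judgment performed by these two units reduces to a simple threshold on the composite gray value; this is only an illustrative sketch.

```python
# composite comes from the binarization/composite sketch above; preset is the assumed preset value of 1.
preset = 1
target_mask = composite >= 2 * preset        # at least two single-base color groups overlap here
target_mix_rate = target_mask.mean()         # duty ratio of target pixels in the whole image to be tested
```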
In some embodiments, the image to be measured acquisition module 61 includes:
and the tristimulus value acquisition unit is used for acquiring tristimulus values corresponding to each single basic color of the display screen to be tested.
And the brightness proportion determining unit is used for determining the brightness proportion corresponding to each single base color when the display screen to be tested needs to display standard white light according to the tristimulus values corresponding to each single base color.
And the shooting unit is used for shooting the display screen to be detected to obtain the image to be detected when the display screen to be detected is displayed in the standard white light based on the brightness proportion corresponding to each single primary color.
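The brightness proportion determination can be sketched as solving a small linear system, assuming the XYZ tristimulus values of each single base color at full drive are known and that the three primaries mix additively and linearly (the description does not state the exact computation, so this is an assumption).

```python
import numpy as np

def white_balance_ratios(xyz_r, xyz_g, xyz_b, xyz_white):
    """Solve k_r*XYZ_r + k_g*XYZ_g + k_b*XYZ_b = XYZ_white for the per-primary scale factors k,
    then return each primary's share of the total luminance (Y) as its brightness proportion."""
    M = np.column_stack([xyz_r, xyz_g, xyz_b])            # 3x3 matrix of primary tristimulus values
    k = np.linalg.solve(M, np.asarray(xyz_white, float))  # relative drive levels of the primaries
    lum = k * M[1, :]                                     # luminance contribution of each primary
    return lum / lum.sum()

# Usage (D65 white normalized to Y = 1 as an assumed standard white):
# ratios = white_balance_ratios(xyz_r, xyz_g, xyz_b, (0.95047, 1.0, 1.08883))
```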
In some embodiments, the target pixel point determining module 62 includes:
And the image tristimulus value acquisition unit is used for acquiring the tristimulus value corresponding to each pixel point in the image to be detected.
And the UV color coordinate conversion unit is used for determining the color coordinates corresponding to the tristimulus values corresponding to the pixel points and converting the color coordinates corresponding to the tristimulus values corresponding to the pixel points into the UV color coordinates corresponding to the pixel points.
And the difference value calculation unit is used for determining the difference value between the UV color coordinates corresponding to the pixel points and the UV color coordinates corresponding to the standard white light for each pixel point.
And the target pixel point determining unit is used for determining the target pixel point from all the pixel points according to the difference value corresponding to the pixel point.
In some embodiments, the evaluation module 63 includes:
and a light mixing rate calculation unit, configured to calculate, for each target area of the image to be tested, the duty ratio of the target pixel points within that target area, so as to obtain the light mixing rate corresponding to each target area.
And the screening unit is used for screening out abnormal light mixing rates in the light mixing rates based on an abnormal threshold value to obtain the screened light mixing rates, wherein the abnormal threshold value is determined according to the average value of the light mixing rates.
And the target light mixing rate calculation unit is used for determining the target light mixing rate according to the screened light mixing rates.
It should be noted that, because the content of information interaction and execution process between the above devices/units is based on the same concept as the method embodiment of the present application, specific functions and technical effects thereof may be referred to in the method embodiment section, and will not be described herein again.
Embodiment III:
fig. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 7, the terminal device 7 of this embodiment includes: at least one processor 70 (only one processor is shown in fig. 7), a memory 71, and a computer program 72 stored in the memory 71 and executable on the at least one processor 70, the processor 70 implementing the steps in any of the various method embodiments described above when executing the computer program 72.
The terminal device 7 may be a computing device such as a desktop computer, a notebook computer, a palm computer, a cloud server, etc. The terminal device may include, but is not limited to, a processor 70, a memory 71. It will be appreciated by those skilled in the art that fig. 7 is merely an example of the terminal device 7 and is not limiting of the terminal device 7, and may include more or fewer components than shown, or may combine certain components, or different components, such as may also include input-output devices, network access devices, etc.
The processor 70 may be a central processing unit (Central Processing Unit, CPU), and may also be another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 71 may in some embodiments be an internal storage unit of the terminal device 7, such as a hard disk or a memory of the terminal device 7. The memory 71 may in other embodiments also be an external storage device of the terminal device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal device 7. Further, the memory 71 may also include both an internal storage unit and an external storage device of the terminal device 7. The memory 71 is used for storing an operating system, application programs, boot loader (BootLoader), data, other programs, etc., such as program codes of the computer program. The memory 71 may also be used for temporarily storing data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
The embodiment of the application also provides a network device, which comprises: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, which when executed by the processor performs the steps of any of the various method embodiments described above.
Embodiments of the present application also provide a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the various method embodiments described above.
Embodiments of the present application also provide a computer program product which, when run on a terminal device, causes the terminal device to perform the steps of the various method embodiments described above.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the above embodiments may be implemented by a computer program instructing relevant hardware; the computer program may be stored in a computer readable storage medium, and when the computer program is executed by a processor, the steps of each of the method embodiments described above may be implemented. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing device/terminal apparatus, a recording medium, a computer memory, a read-only memory (ROM, Read-Only Memory), a random access memory (RAM, Random Access Memory), an electrical carrier signal, a telecommunications signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In some jurisdictions, according to legislation and patent practice, computer readable media may not include electrical carrier signals and telecommunications signals.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or detailed in a certain embodiment, reference may be made to the related descriptions of the other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.

Claims (10)

1. A method of evaluating a light mixing effect, comprising:
shooting a display screen to be tested to obtain an image to be tested;
determining a target pixel point from the image to be detected, wherein the target pixel point comprises pixel points with at least two overlapped single-base color groups or comprises pixel points with the difference between UV color coordinates and corresponding standard UV color coordinates smaller than a threshold value;
And determining the target light mixing rate of the display screen to be tested according to the duty ratio of the target pixel point in the image to be tested.
2. The method for evaluating a light mixing effect according to claim 1, wherein the step of photographing the display screen to be tested to obtain the image to be tested comprises:
shooting the display screen to be tested when the display screen to be tested is displayed in each single basic color, and obtaining images to be processed corresponding to each single basic color;
and carrying out composite processing on each image to be processed to obtain the image to be detected.
3. The method for evaluating a light mixing effect according to claim 2, wherein in the case of photographing based on a side view angle of the display screen to be tested, photographing the display screen to be tested to obtain an image to be tested, comprising:
when the display screen to be tested is displayed in each single basic color, lighting up each column of pixel points of the display screen to be tested in turn, wherein, when one column of pixel points of the display screen to be tested is lit, obtaining an image of the currently lit column of pixel points to obtain a candidate image corresponding to the currently lit column of pixel points;
and respectively splicing the candidate images corresponding to each single basic color to obtain the image to be processed corresponding to the single basic color.
4. The method of evaluating a light mixing effect according to claim 2, further comprising, before said subjecting each of said images to be processed to a composite process to obtain said image to be measured:
carrying out the same dividing treatment on each image to be treated to obtain the images to be treated after the dividing treatment, wherein the dividing treatment is used for dividing the images to be treated to obtain a plurality of target areas, and the number of pixel points in each target area is equal;
for each divided image to be processed, determining effective pixel points in the target area based on gray threshold values corresponding to the target areas respectively, wherein the gray threshold values are determined according to the maximum gray value of the pixel points in the target area, and the effective pixel points are pixel points with gray values larger than or equal to the gray threshold values;
respectively carrying out binarization processing on the pixel points in each target area on each divided image to be processed to obtain the image to be processed after the binarization processing, wherein the gray value of the effective pixel point in the image to be processed after the binarization processing is a preset value;
Correspondingly, the step of performing composite processing on each image to be processed to obtain the image to be detected includes:
and carrying out composite processing on the images to be processed after each binarization processing to obtain the images to be detected.
5. The method for evaluating a light mixing effect according to claim 4, wherein determining the target pixel point from the image to be tested comprises:
for each pixel point in the image to be tested, when the gray value of the pixel point is at least twice the preset value, judging that at least two overlapping single-base color groups exist at the pixel point whose gray value is at least twice the preset value;
and determining the target pixel point according to the pixel points with at least two overlapping single-base color groups.
6. The method for evaluating a light mixing effect according to claim 4 or 5, wherein determining the target light mixing rate of the display screen to be tested according to the duty ratio of the target pixel point in the image to be tested comprises:
the ratio of the target pixel points in each target area of the image to be detected in the target area is calculated respectively, and the light mixing rate corresponding to each target area is obtained;
Screening out abnormal light mixing rates in the light mixing rates based on an abnormal threshold value, and obtaining the screened light mixing rates, wherein the abnormal threshold value is determined according to the average value of the light mixing rates;
and determining the target light mixing rate according to the screened light mixing rates.
7. The method for evaluating a light mixing effect according to claim 1, further comprising, before the capturing the to-be-tested display screen to obtain the to-be-tested image:
acquiring tristimulus values corresponding to each single basic color of the display screen to be tested;
determining the brightness proportion corresponding to each single basic color when the display screen to be tested needs to display standard white light according to the tristimulus values corresponding to each single basic color;
shooting the display screen to be tested to obtain an image to be tested, wherein the shooting comprises the following steps:
and shooting the display screen to be tested when the display screen to be tested is displayed in the standard white light based on the brightness proportion corresponding to each single basic color, so as to obtain the image to be tested.
8. The method of evaluating a light mixing effect according to claim 7, wherein determining a target pixel point from the image to be measured comprises:
Acquiring a tristimulus value corresponding to each pixel point in the image to be detected;
determining color coordinates corresponding to tristimulus values corresponding to the pixel points, and converting the color coordinates corresponding to the tristimulus values corresponding to the pixel points into UV color coordinates corresponding to the pixel points;
for each pixel point, determining a difference value between the UV color coordinate corresponding to the pixel point and the UV color coordinate corresponding to the standard white light;
and determining the target pixel point from each pixel point according to the difference value corresponding to the pixel point.
9. A light mixing effect evaluation device, characterized by comprising:
the image acquisition module to be measured is used for shooting the display screen to be measured to obtain an image to be measured;
the target pixel point determining module is used for determining a target pixel point from the image to be detected, wherein the target pixel point comprises pixel points with at least two overlapped single-base color groups or comprises pixel points with the difference between UV color coordinates and corresponding standard UV color coordinates being smaller than a threshold value;
and the evaluation module is used for determining the target light mixing rate of the display screen to be tested according to the duty ratio of the target pixel point in the image to be tested.
10. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 8 when executing the computer program.