CN112767472B - Method for positioning lamp beads in display screen image, computing equipment and storage medium - Google Patents

Method for positioning lamp beads in display screen image, computing equipment and storage medium

Info

Publication number
CN112767472B
CN112767472B (application CN201911076342.0A)
Authority
CN
China
Prior art keywords
determining
point
target image
image
row
Prior art date
Legal status
Active
Application number
CN201911076342.0A
Other languages
Chinese (zh)
Other versions
CN112767472A (en)
Inventor
周强
Current Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201911076342.0A priority Critical patent/CN112767472B/en
Publication of CN112767472A publication Critical patent/CN112767472A/en
Application granted granted Critical
Publication of CN112767472B publication Critical patent/CN112767472B/en

Classifications

    • G06T 7/70 — Image analysis; determining position or orientation of objects or cameras
    • G06T 7/13 — Image analysis; segmentation; edge detection
    • G06T 7/136 — Segmentation; edge detection involving thresholding
    • G06T 7/187 — Segmentation; edge detection involving region growing, region merging or connected component labelling
    • G06T 7/66 — Analysis of geometric attributes of image moments or centre of gravity
    • G06T 2207/10004 — Image acquisition modality; still image, photographic image
    • Y02B 20/40 — Energy efficient lighting technologies; control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The application discloses a method for positioning lamp beads in a display screen image, a computing device and a storage medium. The method of locating lamp beads in an image of a display screen comprises: determining a target image of the lamp beads to be positioned; determining the gray value of each row of pixel points and the gray value of each column of pixel points in the target image; determining a plurality of peaks of a first sequence formed by the gray values of the rows of pixel points in the target image; determining a plurality of peaks of a second sequence formed by the gray values of the columns of pixel points in the target image; generating a point array from the ordinates of the peaks of the first sequence and the abscissas of the peaks of the second sequence; determining the centroid of each lamp bead in the target image; establishing a correspondence between the centroid of each lamp bead and the points in the point array; and, for any lamp bead in the target image, taking the row and column information of the point in the point array corresponding to the centroid of that lamp bead as the row and column information of that lamp bead.

Description

Method for positioning lamp beads in display screen image, computing equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a method for positioning a light bead in an image of a display screen, a computing device, and a storage medium.
Background
LED displays are widely used in various indoor and outdoor settings. An LED display screen is composed of an array of lamp beads, and the scale of the lamp bead array can be flexibly adjusted. Different beads in an LED display typically differ in brightness and chromaticity. To make the displayed picture more uniform in color, the lamp beads in the display screen need to undergo color correction during display. Before the color of the beads is corrected, the beads of the display screen need to be positioned, that is, the row and column information of each bead (its row number and column number) must be determined.
At present, lamp beads are mainly positioned by shooting an image of the display screen, determining the center point of each bright spot (i.e., each lamp bead) in the image, and finally determining the row and column information of the lamp beads by sorting and arranging these center points.
However, when some lamp beads are damaged, this positioning approach cannot locate the lamp beads accurately.
Disclosure of Invention
Therefore, the present application provides a new scheme for positioning the lamp beads in a display screen image, which can improve the accuracy of lamp bead positioning.
According to one aspect of the present application, there is provided a method for positioning a light bead in an image of a display screen, including:
determining a target image of a lamp bead to be positioned;
determining the gray value of each row of pixel points and the gray value of each column of pixel points in the target image, wherein the gray value of each row of pixel points is the cumulative sum of the gray values of the pixels in the corresponding row, and the gray value of each column of pixel points is the cumulative sum of the gray values of the pixels in the corresponding column;
determining a plurality of peaks of a first sequence consisting of gray values of each row of pixel points in the target image;
determining a plurality of peaks of a second sequence consisting of gray values of each column of pixel points in the target image;
generating a point array according to the ordinate of the plurality of peaks of the first sequence and the abscissa of the plurality of peaks of the second sequence;
determining the centroid of each lamp bead in the target image;
establishing a correspondence between the centroid of each lamp bead and the points in the point array;
and, for any one of the lamp beads in the target image, taking the row and column information of the point in the point array corresponding to the centroid of that lamp bead as the row and column information of that lamp bead.
In some embodiments, the determining the target image of the lamp beads to be positioned includes: acquiring an original image of a display screen; performing binarization processing on the original image to obtain a binary image; performing dilation processing on the binary image to obtain a dilated image; determining the largest connected region in the dilated image; and extracting the region corresponding to the largest connected region from the original image and taking the extracted region as the target image.
In some embodiments, the determining the target image of the lamp beads to be positioned includes: acquiring an original image of a display screen; performing binarization processing on the original image to obtain a binary image; performing dilation processing on the binary image to obtain a dilated image; determining the largest connected region in the dilated image; determining a plurality of edge points of the largest connected region based on edge detection; performing a straight-line fitting operation on the plurality of edge points to obtain four straight lines, and taking the four intersection points of the four straight lines as four corner points; extracting from the original image the image area corresponding to the quadrilateral region determined by the four corner points; and performing distortion correction processing on the image area to obtain a rectangular region, and taking the rectangular region as the target image.
In some embodiments, the above method further comprises: performing posture adjustment on the rectangular region to obtain a posture-corrected target image.
In some embodiments, the determining, based on edge detection, a plurality of edge points of the largest connected region includes: scanning a plurality of rows of the largest connected region from left to right, determining the first white point scanned in each row, and taking it as an edge point on the left side of the largest connected region; scanning a plurality of rows of the largest connected region from right to left, determining the first white point scanned in each row, and taking it as an edge point on the right side of the largest connected region; scanning a plurality of columns of the largest connected region from top to bottom, determining the first white point scanned in each column, and taking it as an edge point on the upper side of the largest connected region; and scanning a plurality of columns of the largest connected region from bottom to top, determining the first white point scanned in each column, and taking it as an edge point on the lower side of the largest connected region.
In some embodiments, the performing a straight-line fitting operation on the plurality of edge points to obtain four straight lines and taking the four intersection points of the four straight lines as four corner points includes: performing straight-line fitting on the edge points on the left side of the largest connected region to obtain a first straight line; performing straight-line fitting on the edge points on the upper side of the largest connected region to obtain a second straight line; performing straight-line fitting on the edge points on the right side of the largest connected region to obtain a third straight line; performing straight-line fitting on the edge points on the lower side of the largest connected region to obtain a fourth straight line; and taking the intersection points of the first, second, third and fourth straight lines as the four corner points.
In some embodiments, the establishing a correspondence between the centroid of each bead and the points in the point array includes: determining the row spacing and column spacing of the lamp beads in the target image; determining a row threshold according to the row spacing and a column threshold according to the column spacing, wherein the row threshold is less than the row spacing and the column threshold is less than the column spacing; for any point in the point array, selecting the lamp bead in the target image whose centroid is nearest to that point; and establishing a correspondence between the centroid of the selected lamp bead and that point when the lateral distance between the centroid of the selected lamp bead and that point is smaller than the column threshold and the longitudinal distance between them is smaller than the row threshold.
In some embodiments, the above method further comprises: determining, according to the correspondence between the centroids of the lamp beads in the target image and the points in the point array, the points in the point array that do not correspond to any bead centroid; and taking each such point as the position of an unlit lamp bead in the display screen.
In some embodiments, the determining a plurality of peaks of the first sequence consisting of gray values of each row of pixels in the target image includes: establishing a first graph of the first sequence, and taking the peaks in the first graph as the plurality of peaks of the first sequence; and the determining a plurality of peaks of the second sequence consisting of gray values of each column of pixels in the target image includes: establishing a second graph of the second sequence, and taking the peaks in the second graph as the plurality of peaks of the second sequence.
In some embodiments, the determining the centroid of each bead in the target image comprises: determining a binary image of the target image; determining the center point of each white area in the binary image of the target image, and determining the centroid of each lamp bead in the target image according to the center point of each white area.
According to one aspect of the present application, there is provided a computing device comprising a processor and a storage device, wherein the processor is configured to execute the method for positioning lamp beads in a display screen image.
According to one aspect of the present application, there is provided a storage medium storing one or more programs, the one or more programs comprising instructions which, when executed by a computing device, cause the computing device to perform the method of locating lamp beads in a display screen image.
In summary, according to the scheme of the embodiments of the application, the brightness variation across the rows of pixels (i.e., the first sequence) and across the columns of pixels (i.e., the second sequence) of the target image can be obtained from the gray value of each row and the gray value of each column of pixel points. On this basis, the scheme can determine a plurality of peaks of the first sequence and a plurality of peaks of the second sequence. Moreover, for any row (or column) of beads in the target image, the ordinates (or abscissas) of the bead centroids in that row (or column) deviate only slightly from one another, so the pixel rows (or columns) corresponding to the peaks of the first (or second) sequence are the ones most likely to pass through the bead centroids. Therefore, the point array built from the abscissas and ordinates corresponding to the peaks can accurately represent the row-column layout of the display screen. On this basis, the scheme can position the lamp beads by establishing a correspondence between the point array and the lamp beads in the target image. In particular, the lamp beads can be positioned accurately even when the image is distorted or some lamp beads are damaged.
Drawings
FIG. 1 illustrates a flow chart of a method 100 of locating light beads in a display screen image according to some implementations of the present application;
FIG. 2A illustrates a binary image of a target image according to some embodiments of the present application;
FIG. 2B illustrates a schematic diagram of determining peaks of a first sequence according to some embodiments of the present application;
FIG. 2C illustrates a schematic diagram of determining peaks of a second sequence according to some embodiments of the present application;
FIG. 2D illustrates a schematic diagram of a spot array according to some embodiments of the present application;
FIG. 2E illustrates a schematic diagram of the correspondence between bead centroids and points in the point array according to some embodiments of the present application;
FIG. 3A illustrates a flowchart of a method 300 of determining a target image of the lamp beads to be positioned, according to some embodiments of the present application;
FIG. 3B illustrates an original image according to some embodiments of the present application;
FIG. 3C shows the binary image of FIG. 3B;
FIG. 3D shows the dilated image of FIG. 3C;
fig. 3E illustrates a target image according to some embodiments of the present application.
FIG. 4 illustrates a flow chart of a method 400 of determining a target image of the lamp beads to be positioned according to some embodiments of the present application;
FIG. 5A illustrates a flowchart of a method 500 of determining edge points according to some embodiments of the present application;
FIG. 5B illustrates a schematic view of edge points of the largest connected region according to some embodiments of the present application;
fig. 5C shows a schematic diagram of determining 4 corner points according to some embodiments of the present application;
fig. 5D shows an image area 503 extracted from an original image;
FIG. 6 illustrates a flow chart of a method 600 of establishing a correspondence between the centroid of a lamp bead and points in the point array according to some embodiments of the present application;
FIG. 7 illustrates a component block diagram of a computing device according to some embodiments of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail below by referring to the accompanying drawings and examples.
In some application scenarios, in order to correct the brightness and chromaticity of the lamp beads in an LED display screen, the embodiments of the application may control the LED display screen to display a monochromatic image. On this basis, an image of the display screen can be captured and the brightness information of each lamp bead in the image determined. In addition, each lamp bead in the image can be located, that is, the row and column information of each lamp bead in the image is determined. Therefore, a correspondence between the lamp beads in the image and the lamp beads in the LED display screen can be established from the row and column information of each lamp bead in the image, and display correction can be applied to the lamp beads of the LED display screen according to this correspondence and the color information of each lamp bead in the image. In other words, a correction coefficient can be generated for each lamp bead so that, after correction, the brightness and chromaticity of the display screen are uniform. The positioning of the lamp beads in the image is described below with reference to FIG. 1.
FIG. 1 illustrates a flow chart of a method 100 of locating light beads in a display screen image according to some implementations of the present application. The method 100 may be performed, for example, by a computing device. The computing device may be, for example, a notebook computer, tablet computer, mobile terminal, server, etc.
As shown in fig. 1, in step S101, a target image of a lamp bead to be positioned is determined. For example, the display screen may be in a lit state, and step S101 may acquire an original image of the display screen and take the original image as a target image. For another example, step S101 may perform a process of filtering out a background image on the original image, extract an image area corresponding to the display screen, and use the image area as the target image.
In step S102, a gray value of each row of pixels and a gray value of each column of pixels in the target image are determined. The gray value of each row of pixel points is the cumulative sum of the gray values of the pixel points in the corresponding row, and the gray value of each column of pixel points is the cumulative sum of the gray values of the pixel points in the corresponding column. In some embodiments, step S102 may determine a gray value of each row of pixels and a gray value of each column of pixels in the binary image corresponding to the target image. In some embodiments, step S102 may determine a gray value of each row of pixels and a gray value of each column of pixels in the gray map corresponding to the target image.
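As a minimal illustrative sketch of step S102 (not code from the patent), the per-row and per-column gray values are simply sums of the pixel gray values along each axis:

```python
import cv2
import numpy as np

def row_col_gray_sums(target_image):
    """Cumulative gray value of each row and of each column (step S102)."""
    gray = cv2.cvtColor(target_image, cv2.COLOR_BGR2GRAY) if target_image.ndim == 3 else target_image
    row_sums = gray.sum(axis=1).astype(np.float64)  # first sequence: one value per image row
    col_sums = gray.sum(axis=0).astype(np.float64)  # second sequence: one value per image column
    return row_sums, col_sums
```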
In step S103, a plurality of peaks of a first sequence composed of the gray values of the rows of pixel points in the target image are determined. For example, step S103 may establish a first graph of the first sequence and take the peaks in the first graph as the peaks of the first sequence. Fig. 2A illustrates a binary image of a target image according to some embodiments of the present application. Fig. 2B illustrates a schematic diagram of determining the peaks of the first sequence according to some embodiments of the present application. As shown in fig. 2B, the first graph S1 generated from the first sequence has a pulse-like shape. The abscissa of the first graph S1 (i.e., the coordinate on X2) represents the gray value of each row of pixels in the binary image. The ordinate of the first graph S1 (i.e., the coordinate on Y2) represents the ordinate of each row of pixel points in the binary image. There are 8 peaks in total (i.e., peak values on X2) in the first graph S1. The ordinates corresponding to the 8 peaks are y1 to y8. That is, the ordinates of the 8 rows of pixel points corresponding to the peaks in the target image are y1 to y8 in order.
In step S104, a plurality of peaks of a second sequence composed of the gray values of the columns of pixel points in the target image are determined. For example, step S104 may establish a second graph of the second sequence and take the peaks in the second graph as the peaks of the second sequence. Fig. 2C illustrates a schematic diagram of determining the peaks of the second sequence according to some embodiments of the present application. As shown in fig. 2C, the ordinate of the second graph S2 (i.e., the coordinate on Y3) represents the gray value of each column of pixels in the binary image. The abscissa of the second graph S2 (i.e., the coordinate on X3) represents the abscissa of each column of pixel points in the binary image. The second graph S2 has a total of 12 peaks (i.e., peak values on Y3). The abscissas corresponding to the 12 peaks are x1 to x12 in order.
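Steps S103 and S104 can be sketched with scipy.signal.find_peaks; the distance and height arguments below are assumed tuning parameters rather than values given by the patent, and row_sums / col_sums refer to the illustrative helper sketched above.

```python
from scipy.signal import find_peaks

def sequence_peaks(sequence, min_distance=5, min_height=None):
    """Indices of the peaks of a row-sum or column-sum sequence (steps S103/S104)."""
    peaks, _ = find_peaks(sequence, distance=min_distance, height=min_height)
    return peaks

# row_peaks_y = sequence_peaks(row_sums)  # e.g. y1..y8 as in FIG. 2B
# col_peaks_x = sequence_peaks(col_sums)  # e.g. x1..x12 as in FIG. 2C
```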
In step S105, a dot array is generated from the ordinate of the plurality of peaks of the first sequence and the abscissa of the plurality of peaks of the second sequence. Fig. 2D shows a schematic diagram of a spot array according to some embodiments of the present application. The dot array in fig. 2D is an array of intersections of white dashed lines. The coordinates of the dot array in the coordinate system (X1, Y1) of the target image are, for example, (X1, Y1) to (X12, Y8).
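A sketch of step S105, assuming the peak indices from the helper sketched above: the point array is simply every combination of a peak abscissa with a peak ordinate.

```python
import numpy as np

def build_point_array(col_peaks_x, row_peaks_y):
    """Point array of step S105: points[i, j] = (x_j, y_i)."""
    xs, ys = np.meshgrid(col_peaks_x, row_peaks_y)
    return np.stack([xs, ys], axis=-1)  # shape: (num_rows, num_cols, 2)
```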
In step S106, the centroid of each bead in the target image is determined. Each lamp bead in the target image appears circular, and its centroid is the center of the circle. In some embodiments, step S106 may determine a binary image of the target image, determine the center point of each white region (i.e., the image of each bead) in the binary image, and determine the centroid of each bead in the target image according to the center point of each white region.
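Step S106 can be sketched with OpenCV connected components on the binary image of the target image; the connectivity value is an assumed choice.

```python
import cv2

def bead_centroids(binary_image):
    """Centroid (cx, cy) of each white region of the binary target image (step S106)."""
    _, _, _, centroids = cv2.connectedComponentsWithStats(binary_image, connectivity=8)
    return centroids[1:]  # drop label 0, which is the background
```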
In step S107, a correspondence between the centroid of each lamp bead and the points in the dot array is established. In some embodiments, for any point in the array, step S107 may associate the point with the lamp bead whose centroid lies within a predetermined range of the point and is closest to it. The centroid of a lamp bead within the predetermined range satisfies the condition that its lateral distance from the point is less than the column threshold and its longitudinal distance from the point is less than the row threshold. Here, the column threshold is smaller than the column pitch of the dot array, and the row threshold is smaller than the row pitch of the dot array. For example, fig. 2E illustrates a schematic diagram of the correspondence between bead centroids and points in the point array according to some embodiments of the present application. As shown in fig. 2E, the bead 201 may be associated with the point 203 in the point array, and the bead 202 may be associated with the point 204. The points 205-208 have no corresponding beads and may be regarded as the positions of unlit lamp beads in the display screen.
In step S108, for any one of the beads in the target image, row and column information of a point corresponding to the centroid of the bead in the point array is set as row and column information of the bead. Here, the row and column information of one lamp bead includes, for example, a row number and a column number of the lamp bead in the display screen.
In summary, according to the method 100 of the embodiments of the present application, the brightness variation across the rows of pixels (i.e., the first sequence) and across the columns of pixels (i.e., the second sequence) of the target image can be obtained from the gray value of each row and the gray value of each column of pixel points. On this basis, the method 100 can determine a plurality of peaks of the first sequence and a plurality of peaks of the second sequence. Moreover, for any row (or column) of beads in the target image, the ordinates (or abscissas) of the bead centroids in that row (or column) deviate only slightly from one another, so the pixel rows (or columns) corresponding to the peaks are the ones most likely to pass through the bead centroids. Thus, the point array built from the abscissas and ordinates corresponding to the peaks accurately represents the row-column layout of the display screen, and the method 100 can position the beads by establishing a correspondence between the point array and the beads in the target image. For example, due to image distortion, the beads 201 and 209, which lie in the same row of the display screen, show a large deviation in the vertical (Y1) direction, and the bead 201 deviates noticeably from the point 203. Without the method 100, the bead 201 might be judged to lie in a different row from the bead 209. By establishing the correspondence between the point array and the beads, the method 100 can still associate the bead 201 with the point 203 (and the bead 202 with the point 204), and can therefore correctly determine that the beads 201 and 209 belong to the same row.
In some embodiments, the method 100 may further perform step S109, in which the points in the point array that do not correspond to any bead centroid in the target image are determined according to the correspondence between the bead centroids and the points in the point array.
In step S110, each point determined not to correspond to any bead centroid in the target image is taken as the position of an unlit lamp bead in the display screen. In other words, even when some beads in the display screen are not lit, step S110 can still locate the positions of the unlit beads. For example, the points 205-208 may be regarded as the positions of unlit lamp beads in the display screen.
In summary, the method 100 according to embodiments of the present application can accurately locate a light bead in a scene where there is image distortion or damage to the light bead.
In some embodiments, step S101 may directly use the original image of the display screen as the target image of the lamp bead to be positioned.
In some embodiments, step S101 may be implemented as the method 300 shown in fig. 3A.
Fig. 3A illustrates a flowchart of a method 300 of determining a target image of the lamp beads to be positioned, according to some embodiments of the present application.
As shown in fig. 3A, in step S301, an original image of a display screen is acquired. For example, fig. 3B illustrates an original image according to some embodiments of the present application. Fig. 3B, although shown as a black and white image, may actually be a color image.
In step S302, binarization processing is performed on the original image to obtain a binary image. For example, fig. 3C shows the binary image of fig. 3B.
In step S303, the binary image is subjected to dilation processing to obtain a dilated image. For example, fig. 3D shows the dilated image of fig. 3C.
In step S304, the largest connected region in the dilated image is determined. Here, step S304 may select the largest connected region according to the area of each connected region. The largest connected region determined in step S304 may be represented by its outer contour points. Taking fig. 3D as an example, the dilated image includes 3 connected regions. Step S304 may determine, by area, that the connected region 301 is the largest connected region; the remaining two connected regions correspond to background content (e.g., stray light near the display screen).
In step S305, the area corresponding to the largest connected region is extracted from the original image, and the extracted area is taken as the target image. For example, based on the largest connected region 301, step S305 may extract the target image shown in fig. 3E from the original image of fig. 3B.
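A minimal sketch of method 300 (steps S301 to S305) with OpenCV follows; the binarization threshold and dilation kernel size are assumed values, not taken from the patent.

```python
import cv2
import numpy as np

def extract_target_image(original_bgr, thresh=60, kernel_size=15):
    """Crop the display-screen region from the original image (steps S302-S305)."""
    gray = cv2.cvtColor(original_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)               # S302: binary image
    dilated = cv2.dilate(binary, np.ones((kernel_size, kernel_size), np.uint8))   # S303: dilated image
    _, _, stats, _ = cv2.connectedComponentsWithStats(dilated, connectivity=8)
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))                     # S304: largest connected region
    x, y, w, h = stats[largest, :4]                                               # left, top, width, height
    return original_bgr[y:y + h, x:x + w]                                         # S305: crop from the original image
```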
In summary, according to the method 300 of the embodiments of the present application, by applying binarization and dilation and determining the target image from the largest connected region, the background region of the original image can be removed, so that stray light in the background does not interfere with the positioning of the lamp beads, which improves positioning accuracy.
In some embodiments, step S101 may be implemented as method 400.
As shown in fig. 4, in step S401, an original image of a display screen is acquired. In step S402, binarization processing is performed on the original image to obtain a binary image. In step S403, the binary image is subjected to dilation processing to obtain a dilated image. In step S404, the largest connected region in the dilated image is determined.
In step S405, a plurality of edge points of the largest connected region are determined based on edge detection. In some embodiments, step S405 may be implemented as method 500.
As shown in fig. 5A, in step S501, a plurality of rows of the largest connected region are scanned from left to right, and the first white point scanned in each row is determined and used as an edge point on the left side of the largest connected region. Here, step S501 may scan all or part of the rows of the largest connected region.
In step S502, a plurality of rows of the largest connected region are scanned from right to left, and the first white point scanned in each row is determined and used as an edge point on the right side of the largest connected region. Here, step S502 may scan all or part of the rows of the largest connected region.
In step S503, a plurality of columns of the largest connected region are scanned from top to bottom, and the first white point scanned in each column is determined and used as an edge point on the upper side of the largest connected region. Step S503 may scan all or part of the columns of the largest connected region.
In step S504, a plurality of columns of the largest connected region are scanned from bottom to top, and the first white point scanned in each column is determined and used as an edge point on the lower side of the largest connected region. Step S504 may scan all or part of the columns of the largest connected region.
For example, fig. 5B shows a schematic diagram of the edge points of the largest connected region according to some embodiments of the present application. As shown in fig. 5B, step S501 may determine the edge points a1 to am on the left side of the largest connected region 501, step S502 the edge points c1 to cm on the right side, step S503 the edge points b1 to bn on the upper side, and step S504 the edge points d1 to dn on the lower side, where m and n are positive integers.
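A sketch of method 500: each row and each column of the mask of the largest connected region is scanned for the first white pixel from each direction (the mask and all names here are illustrative assumptions).

```python
import numpy as np

def edge_points(mask):
    """Left/top/right/bottom edge points of the largest connected region (mask: white = 255)."""
    left, right, top, bottom = [], [], [], []
    for y in range(mask.shape[0]):              # S501/S502: scan each row from both ends
        xs = np.flatnonzero(mask[y])
        if xs.size:
            left.append((xs[0], y))
            right.append((xs[-1], y))
    for x in range(mask.shape[1]):              # S503/S504: scan each column from both ends
        ys = np.flatnonzero(mask[:, x])
        if ys.size:
            top.append((x, ys[0]))
            bottom.append((x, ys[-1]))
    return left, top, right, bottom
```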
In step S406, a straight-line fitting operation is performed on the plurality of edge points to obtain four straight lines, and the four intersection points of the four straight lines are used as four corner points. For example, step S406 performs straight-line fitting on the edge points on the left side of the largest connected region to obtain a first straight line, on the edge points on the upper side to obtain a second straight line, on the edge points on the right side to obtain a third straight line, and on the edge points on the lower side to obtain a fourth straight line. Step S406 then uses the intersection points of the first, second, third and fourth straight lines as the four corner points. For example, fig. 5C shows a schematic diagram of determining the 4 corner points according to some embodiments of the present application. As shown in fig. 5C, the first straight line is L1, the second straight line is L2, the third straight line is L3, the fourth straight line is L4, and the 4 corner points are e1, e2, e3 and e4. The 4 corner points determine a quadrilateral region 502.
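Step S406 can be sketched as a least-squares line fit per edge-point set followed by pairwise intersections; this is one of several possible fitting choices (cv2.fitLine would work equally well), not necessarily the patent's.

```python
import numpy as np

def fit_line(points):
    """Fit a*x + b*y + c = 0 to a set of (x, y) edge points via their principal axis."""
    pts = np.asarray(points, dtype=np.float64)
    mean = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - mean)        # first right-singular vector = line direction
    dx, dy = vt[0]
    a, b = -dy, dx                              # line normal is perpendicular to the direction
    c = -(a * mean[0] + b * mean[1])
    return a, b, c

def intersect(l1, l2):
    """Intersection point of two lines given in a*x + b*y + c = 0 form."""
    (a1, b1, c1), (a2, b2, c2) = l1, l2
    x, y = np.linalg.solve([[a1, b1], [a2, b2]], [-c1, -c2])
    return x, y

# e1..e4 = intersections of (left, top), (top, right), (right, bottom), (bottom, left) lines
```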
In step S407, the image area corresponding to the quadrilateral region determined by the four corner points is extracted from the original image. For example, fig. 5D shows the image area 503 extracted from the original image. It should be noted that some lamp beads in the display screen may not be lit; for example, beads near a corner of the display screen may be unlit due to damage. In that case, the largest connected region determined from the original image of the display screen is a pattern region with a missing corner. With the method 400, the missing corner points can still be recovered by performing edge detection on the largest connected region and fitting straight lines to the edge points.
In some embodiments, the captured original image is prone to distortion due to factors such as the shooting angle. For example, the display screen itself is rectangular, but it appears as a non-rectangular area in the captured original image. The method 400 may therefore perform step S408: perform distortion correction processing on the image area to obtain a rectangular region, and use the rectangular region as the target image. In summary, according to the method 400 of the embodiments of the present application, by determining the corner points and applying distortion correction to the image area they define, the distortion in the image can be corrected, which improves the accuracy of lamp bead positioning.
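One common way to realize the distortion correction of step S408 is a perspective (homography) warp of the four corner points onto an upright rectangle; this is an assumed realization, and the output rectangle size and the corner ordering below are illustrative choices.

```python
import cv2
import numpy as np

def rectify(original_bgr, corners, out_w=1200, out_h=800):
    """Warp the quadrilateral bounded by the four corner points to an upright rectangle (step S408)."""
    src = np.float32(corners)  # assumed order: top-left, top-right, bottom-right, bottom-left
    dst = np.float32([[0, 0], [out_w - 1, 0], [out_w - 1, out_h - 1], [0, out_h - 1]])
    H = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(original_bgr, H, (out_w, out_h))
```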
In some embodiments, the edge (e.g., the lower edge) of the original image is at a deflection angle to the corresponding edge (e.g., the lower edge) of the display screen. Accordingly, the lower edge of the target image determined in step S408 is at an angle to the horizontal direction. The method 400 may further execute step S409 to perform posture adjustment on the rectangular region and obtain a posture-corrected target image. Here, step S409 may rotate the rectangular region until its lower edge is parallel to the horizontal direction. Since steps S102 to S104 accumulate gray values per row and per column, performing step S409 improves the accuracy of the resulting point array (i.e., its consistency with the actual row-column distribution of the beads) and thus the positioning accuracy of the lamp beads.
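Step S409 can be sketched as a rotation about the image center; the rotation angle would be the measured deflection of the lower edge from the horizontal, which is taken here as an assumed input.

```python
import cv2

def level_image(rect_bgr, angle_deg):
    """Rotate the rectified region so that its lower edge becomes horizontal (step S409)."""
    h, w = rect_bgr.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2.0, h / 2.0), angle_deg, 1.0)
    return cv2.warpAffine(rect_bgr, M, (w, h))
```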
In some embodiments, step S107 may be implemented as method 600.
As shown in fig. 6, in step S601, the row pitch and column pitch of the beads in the target image are determined. In some embodiments, step S601 may use the row pitch and column pitch of the dot array as the row pitch and column pitch of the light beads.
In some embodiments, step S601 may determine the row pitch and column pitch from the distances between corner points and the number of rows and columns of the dot array. For example, the distance between the corner points e1 and e4 in fig. 5C is f, and the dot array has 8 rows in total, so the row pitch g = f/(8-1). The distance between the corner points e1 and e2 in fig. 5C is h, and the dot array has 12 columns in total, so the column pitch i = h/(12-1).
In step S602, a row threshold is determined from the row spacing and a column threshold is determined from the column spacing. Wherein the row threshold is less than the row spacing and the column threshold is less than the column spacing. For example, row threshold j=0.8×g, and column threshold k=0.6×i.
In step S603, for any one point in the array of points, one lamp bead whose centroid is closest to the point is selected in the target image.
In step S604, when the lateral distance between the centroid of the selected lamp bead and the point is smaller than the column threshold and the longitudinal distance between the centroid of the selected lamp bead and the point is smaller than the row threshold, the correspondence between the centroid of the selected lamp bead and the point is established. In summary, the method 600 determines the correspondence between lamp beads and points according to the positional relationship between the bead centroids and the points in the dot array.
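A sketch of method 600 (steps S601 to S604), assuming the point-array and centroid helpers sketched earlier; the 0.8 and 0.6 factors mirror the example values j = 0.8*g and k = 0.6*i given above.

```python
import numpy as np

def match_beads_to_points(points, centroids, row_pitch, col_pitch,
                          row_factor=0.8, col_factor=0.6):
    """Map each point (row index, column index) to the index of its matched bead centroid."""
    row_thr, col_thr = row_factor * row_pitch, col_factor * col_pitch     # S602: thresholds
    centroids = np.asarray(centroids, dtype=np.float64)
    matches = {}
    num_rows, num_cols, _ = points.shape
    for i in range(num_rows):
        for j in range(num_cols):
            p = points[i, j]
            k = int(np.argmin(np.linalg.norm(centroids - p, axis=1)))     # S603: nearest centroid
            dx, dy = abs(centroids[k, 0] - p[0]), abs(centroids[k, 1] - p[1])
            if dx < col_thr and dy < row_thr:                             # S604: accept within thresholds
                matches[(i, j)] = k
    return matches  # points absent from `matches` correspond to unlit beads (steps S109/S110)
```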
FIG. 7 illustrates a block diagram of the components of a computing device. As shown in fig. 7, the computing device includes one or more processors (CPUs) 702, a communication module 704, a memory 706, a user interface 710, and a communication bus 708 for interconnecting these components.
The processor 702 may receive and transmit data via the communication module 704 to enable network communication and/or local communication.
The user interface 710 includes one or more output devices 712, including one or more speakers and/or one or more displays. The user interface 710 also includes one or more input devices 714, including, for example but not limited to, a keyboard, a mouse, and a touch input device.
Memory 706 may be a high-speed random access memory such as DRAM, SRAM, DDR RAM, or other random access solid state memory devices; or non-volatile memory such as one or more magnetic disk storage devices, optical disk storage devices, flash memory devices, or other non-volatile solid state storage devices.
The memory 706 stores a set of instructions executable by the processor 702, including:
an operating system 716 including programs for handling various basic system services and for performing hardware related tasks;
application 718 includes various programs for implementing the method 100 of locating lamp beads in display screen images described above, and can implement the process flows described in the examples above.
In addition, each example of the present application may be implemented by a data processing program executed by a data processing apparatus such as a computer; such a data processing program obviously constitutes the invention. A data processing program is typically stored in a storage medium and is executed either by reading the program directly out of the storage medium or by installing or copying it into a storage device (such as a hard disk and/or a memory) of the data processing apparatus; such a storage medium therefore also constitutes the present invention. The storage medium may use any type of recording means, such as a paper storage medium (e.g., paper tape), a magnetic storage medium (e.g., floppy disk, hard disk, flash memory), an optical storage medium (e.g., CD-ROM), or a magneto-optical storage medium (e.g., MO).
The present application thus also discloses a non-volatile storage medium having stored therein a data processing program for performing any one of the examples of the method 100 of locating light beads in display screen images described herein above.
In addition, the method steps described herein may be implemented not only by data processing programs but also by hardware, such as logic gates, switches, Application Specific Integrated Circuits (ASICs), programmable logic controllers, and embedded microcontrollers. Hardware that can implement the methods described herein may therefore also constitute the present application.
The foregoing description of the preferred embodiments of the present invention is not intended to limit the invention to the precise form disclosed, and any modifications, equivalents, and variations which fall within the spirit and principles of the invention are intended to be included within the scope of the present invention.

Claims (12)

1. A method of locating a light bead in an image of a display screen, comprising:
determining a target image of a lamp bead to be positioned;
determining the gray value of each row of pixel points and the gray value of each column of pixel points in the target image, wherein the gray value of each row of pixel points is the cumulative sum of the gray values of the pixels in the corresponding row, and the gray value of each column of pixel points is the cumulative sum of the gray values of the pixels in the corresponding column;
determining a plurality of peaks of a first sequence consisting of gray values of each row of pixel points in the target image;
determining a plurality of peaks of a second sequence consisting of gray values of each column of pixel points in the target image;
generating a point array according to the ordinate of the plurality of peaks of the first sequence and the abscissa of the plurality of peaks of the second sequence;
determining the centroid of each lamp bead in the target image;
establishing a correspondence between the centroid of each lamp bead and the points in the point array, wherein, for any point in the point array, the correspondence is established with the lamp bead whose centroid lies within a preset range of the point and is nearest to the point;
and, for any one of the lamp beads in the target image, taking the row and column information of the point in the point array corresponding to the centroid of that lamp bead as the row and column information of that lamp bead.
2. The method of claim 1, wherein determining the target image of the lamp bead to be positioned comprises:
acquiring an original image of a display screen;
performing binarization processing on the original image to obtain a binary image;
performing dilation processing on the binary image to obtain a dilated image;
determining the largest connected region in the dilated image;
extracting the region corresponding to the largest connected region from the original image, and taking the extracted region as the target image.
3. The method of claim 1, wherein determining the target image of the lamp bead to be positioned comprises:
acquiring an original image of a display screen;
performing binarization processing on the original image to obtain a binary image;
performing dilation processing on the binary image to obtain a dilated image;
determining the largest connected region in the dilated image;
determining a plurality of edge points of the largest connected region based on edge detection;
performing straight line fitting operation according to the plurality of edge points to obtain four straight lines, and taking four intersection points of the four straight lines as four corner points;
extracting an image area corresponding to the quadrilateral area determined by the four corner points from the original image;
and carrying out distortion correction processing on the image area to obtain a rectangular area, and taking the rectangular area as the target image.
4. The method as recited in claim 3, further comprising:
and carrying out posture adjustment on the rectangular region to obtain the target image subjected to posture correction.
5. The method of claim 3, wherein determining a plurality of edge points of the largest connected region based on edge detection comprises:
scanning a plurality of rows of the largest connected region from left to right, determining the first white point scanned in each row, and taking it as an edge point on the left side of the largest connected region;
scanning a plurality of rows of the largest connected region from right to left, determining the first white point scanned in each row, and taking it as an edge point on the right side of the largest connected region;
scanning a plurality of columns of the largest connected region from top to bottom, determining the first white point scanned in each column, and taking it as an edge point on the upper side of the largest connected region;
and scanning a plurality of columns of the largest connected region from bottom to top, determining the first white point scanned in each column, and taking it as an edge point on the lower side of the largest connected region.
6. A method according to claim 3, wherein the performing a straight line fitting operation according to the plurality of edge points to obtain four straight lines, and taking four intersection points of the four straight lines as four corner points comprises:
performing straight-line fitting on the edge points on the left side of the largest connected region to obtain a first straight line;
performing straight-line fitting on the edge points on the upper side of the largest connected region to obtain a second straight line;
performing straight-line fitting on the edge points on the right side of the largest connected region to obtain a third straight line;
performing straight-line fitting on the edge points on the lower side of the largest connected region to obtain a fourth straight line;
and taking the intersection points of the first straight line, the second straight line, the third straight line and the fourth straight line as the four corner points.
7. The method of claim 3, wherein said establishing a correspondence between the centroid of each of said beads and the points in said point array comprises:
determining row spacing and column spacing of the lamp beads in the target image;
determining a row threshold according to the row spacing and determining a column threshold according to the column spacing, wherein the row threshold is less than the row spacing and the column threshold is less than the column spacing;
for any point in the point array, selecting the lamp bead in the target image whose centroid is nearest to that point;
and establishing a correspondence between the centroid of the selected lamp bead and that point when the lateral distance between the centroid of the selected lamp bead and that point is smaller than the column threshold and the longitudinal distance between them is smaller than the row threshold.
8. The method as recited in claim 1, further comprising:
determining, according to the correspondence between the centroids of the lamp beads in the target image and the points in the point array, the points in the point array that do not correspond to any bead centroid in the target image;
and taking each such determined point as the position of an unlit lamp bead in the display screen.
9. The method of claim 1, wherein the determining the plurality of peaks of the first sequence consisting of gray values of each row of pixels in the target image comprises: establishing a first graph of the first sequence, and taking the peaks in the first graph as the plurality of peaks of the first sequence;
and the determining the plurality of peaks of the second sequence consisting of gray values of each column of pixels in the target image comprises: establishing a second graph of the second sequence, and taking the peaks in the second graph as the plurality of peaks of the second sequence.
10. The method of claim 1, wherein the determining the centroid of each bead in the target image comprises:
determining a binary image of the target image;
and determining the center point of each white area in the binary image of the target image, and determining the centroid of each lamp bead in the target image according to the center point of each white area.
11. A computing device, comprising:
a processor;
a storage device;
wherein the processor is configured to perform the method of locating light beads in a display screen image as claimed in any one of claims 1 to 10.
12. A storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a computing device, cause the computing device to perform the method of locating light beads in a display screen image of any of claims 1-10.
CN201911076342.0A 2019-11-06 2019-11-06 Method for positioning lamp beads in display screen image, computing equipment and storage medium Active CN112767472B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911076342.0A CN112767472B (en) 2019-11-06 2019-11-06 Method for positioning lamp beads in display screen image, computing equipment and storage medium


Publications (2)

Publication Number Publication Date
CN112767472A CN112767472A (en) 2021-05-07
CN112767472B (en) 2023-07-14

Family

ID=75692774

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911076342.0A Active CN112767472B (en) 2019-11-06 2019-11-06 Method for positioning lamp beads in display screen image, computing equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112767472B (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0889438A3 (en) * 1997-07-04 1999-12-22 Agfa-Gevaert N.V. Method of determining a symmetry line in a radiation image
US20050200291A1 (en) * 2004-02-24 2005-09-15 Naugler W. E.Jr. Method and device for reading display pixel emission and ambient luminance levels
CN102723054A (en) * 2012-06-18 2012-10-10 西安电子科技大学 Online calibration system and online calibration method for ununiformity of LED (light-emitting diode) display screen
CN106384355B (en) * 2016-09-21 2019-05-31 安徽慧视金瞳科技有限公司 A kind of automatic calibration method in projection interactive system

Also Published As

Publication number Publication date
CN112767472A (en) 2021-05-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant