CN115474306A - Atmosphere lamp video data processing method, device, equipment and storage medium - Google Patents

Atmosphere lamp video data processing method, device, equipment and storage medium

Info

Publication number
CN115474306A
Authority
CN
China
Prior art keywords
subarea
pixel range
display terminal
areas
data processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211086299.8A
Other languages
Chinese (zh)
Inventor
孙铭梁
郭延东
李洪海
林芳芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xinlongpeng Technology Co ltd
Original Assignee
Shenzhen Xinlongpeng Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xinlongpeng Technology Co ltd filed Critical Shenzhen Xinlongpeng Technology Co ltd
Priority to CN202211086299.8A priority Critical patent/CN115474306A/en
Publication of CN115474306A publication Critical patent/CN115474306A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/10Controlling the intensity of the light
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/67Circuits for processing colour signals for matrixing
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00Circuit arrangements for operating light-emitting diodes [LED]
    • H05B45/20Controlling the colour of the light

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

The invention discloses an atmosphere lamp video data processing method, device, equipment and storage medium. A display signal input source of a display terminal is received, and the pixel point region of the display terminal is divided into a plurality of partition areas according to the number of lamp beads of the atmosphere lamp; a row pixel range and a column pixel range of each partition area are acquired; each partition area is divided into a plurality of partition sub-areas according to its row pixel range and column pixel range; within one frame of display picture data, RGB basic hue values of the display signal input source in the partition sub-areas are acquired; and the RGB basic hue values of the partition sub-areas in each partition area are mapped to the corresponding LED lamp beads to generate a spatial gradient effect. The atmosphere lamp thus follows the display picture with a spatially graded visual effect and smooth transitions, improving the user experience.

Description

Atmosphere lamp video data processing method, device, equipment and storage medium
Technical Field
The invention relates to the technical field of light control, in particular to an atmosphere lamp video data processing method, device, equipment and storage medium.
Background
The atmosphere lamp, also called an LED atmosphere lamp, is widely used for theme parks, hotels, homes, exhibitions, commercial spaces and artistic lighting, creating the desired ambience for everyday life. As living standards rise, people have higher aesthetic expectations for electronic equipment, pursuing audio-visual enjoyment and expecting a stronger sense of atmosphere while watching videos and playing games. Higher-end desktop computers, chassis and keyboards on the market therefore add atmosphere lamp effects alongside quality and feel, so that users can better immerse themselves in the atmosphere.
A television atmosphere lamp is a device that acquires the video input signal of a television and generates a signal for controlling a light strip according to changes in that signal, so that the strip emits light whose hue follows the hue of the television picture. The LED light strip is formed by connecting n LED lamp beads with red (R), green (G) and blue (B) channels in series. As shown in fig. 1, a typical LED strip is placed around the frame on the back of a television, and each lamp bead in the strip corresponds to a region of the television screen. In the prior art, most television atmosphere lamp schemes are based on statistics of the input pixels. With such schemes the LED strip cannot respond quickly to picture changes, lags by several frames, and transitions harshly, so that adjacent video frames show obvious colour or luminance differences and the strip flickers visibly, degrading the user's impression and experience. How to process atmosphere lamp video data reliably and efficiently has therefore become an urgent problem for those skilled in the art.
Disclosure of Invention
Aiming at the defects, the embodiment of the invention provides an atmosphere lamp video data processing method, device, equipment and storage medium, which are used for solving the problems in the prior art.
In order to solve the above technical problem, an embodiment of the present invention provides an ambience light video data processing method, where the method includes:
receiving a display signal input source of a display terminal, and dividing a pixel point region of the display terminal into a plurality of partition areas according to the number of lamp beads of an atmosphere lamp;
acquiring a row pixel range and a column pixel range of each partition area;
dividing each partition area into a plurality of partition sub-areas according to the row pixel range and the column pixel range of each partition area; the partition areas radiate outwards across the display picture from the middle point of the display terminal, which serves as the picture centre point;
in one frame of display picture data, acquiring RGB basic hue values of the display signal input source in the partition sub-areas;
and mapping the RGB basic hue values of the partition sub-areas in each partition area to the corresponding LED lamp beads to generate a spatial gradient effect.
Preferably, the atmosphere lamp video data processing method further comprises:
in multi-frame display picture data, acquiring RGB basic tone values of the display signal input source in a plurality of subarea subregions;
acquiring a mean value of the RGB basic hue values;
and mapping the RGB basic tone mean values of a plurality of subarea areas in each subarea area to corresponding LED lamp beads to generate a smooth transition effect.
Preferably, the receiving a display signal input source of the display terminal, and dividing the pixel point region of the display terminal into a plurality of subareas according to the number of the bulbs of the atmosphere lamp comprises:
establishing a plane rectangular coordinate system by taking the central point of the display terminal as the origin of coordinates;
dividing the long edge and the wide edge of the display terminal into a plurality of equal parts according to the number of lamp beads of the atmosphere lamp distributed on the long edge and the wide edge of the display terminal;
obtaining coordinates of each critical point which divides the long edge and the wide edge of the display terminal into a plurality of equal parts;
and connecting each critical point coordinate with the central point coordinate of the display terminal, and dividing the pixel point region of the display terminal into a plurality of subarea regions.
Preferably, the acquiring a row pixel range and a column pixel range of each of the plurality of subarea areas comprises:
acquiring the row pixel range and the column pixel range of the plurality of subarea areas in a first quadrant;
acquiring the row pixel range and the column pixel range of the plurality of subarea areas in a second quadrant;
acquiring the row pixel range and the column pixel range of the plurality of subarea areas in a third quadrant;
and acquiring the row pixel range and the column pixel range of the plurality of subarea areas in a fourth quadrant.
Preferably, the acquiring the row pixel range and the column pixel range of the plurality of subarea areas in the first quadrant comprises:
the column pixel range is: 0 ≤ Y ≤ hp/2;
the row pixel range is: X1 + zp/2 < X < X0 + zp/2, wherein
X0 = (zp·k/m − zp/2)·(hp/2 − Y)/(hp/2),
X1 = (zp·(k−1)/m − zp/2)·(hp/2 − Y)/(hp/2).
The video size of the display terminal is zp multiplied by hp, zp is the long side of the video size, hp is the wide side of the video size, k is the serial number of the atmosphere lamp beads on the long side or the wide side of the display terminal, and m is the total number of the lamp beads on the long side or the wide side of the display terminal.
Preferably, the dividing each of the plurality of partition areas into a plurality of partition sub-areas according to the row pixel range and the column pixel range of each of the plurality of partition areas comprises:
dividing the range of the row pixels according to preset weight, and dividing each of the plurality of subarea areas into a plurality of subarea subareas.
Preferably, the mapping the RGB basic tone values of a plurality of subareas in each of the plurality of subareas into the corresponding LED lamp beads to generate the spatial gradient effect includes:
and after multiplying the RGB basic tone values of the plurality of subarea sub-areas in each subarea area by a preset weight coefficient, mapping the RGB basic tone values to the corresponding LED lamp beads to generate a spatial gradient effect.
To solve the above technical problem, an embodiment of the present invention provides an ambience light video data processing apparatus, including:
the first dividing module is used for receiving a display signal input source of the display terminal and dividing a pixel point area of the display terminal into a plurality of subareas according to the number of lamp beads of the atmosphere lamp;
the range determining module is used for acquiring the row pixel range and the column pixel range of each partition area;
the second dividing module is used for dividing each partition area into a plurality of partition sub-areas according to the row pixel range and the column pixel range of each partition area;
the first RGB tone value extraction module is used for acquiring RGB basic tone values of the display signal input source in a plurality of subarea subareas in one frame of display picture data;
the first mapping module is used for mapping the RGB basic tone values of the plurality of subarea subareas in each subarea area to the corresponding LED lamp beads so as to generate a spatial gradient effect.
To solve the foregoing technical problem, an embodiment of the present invention provides an ambience light video data processing device, including: at least one processor, at least one memory and computer program instructions stored in the memory, which when executed by the processor, implement the method of the first aspect of the embodiments described above.
To solve the above technical problem, an embodiment of the present invention provides a storage medium having computer program instructions stored thereon, which when executed by a processor implement the method according to the first aspect of the foregoing embodiments.
In summary, the embodiments of the present invention provide an atmosphere lamp video data processing method, apparatus, device and storage medium. A display signal input source of a display terminal is received, and the pixel point region of the display terminal is divided into a plurality of partition areas according to the number of lamp beads of the atmosphere lamp; a row pixel range and a column pixel range of each partition area are acquired; each partition area is divided into a plurality of partition sub-areas according to its row pixel range and column pixel range; within one frame of display picture data, RGB basic hue values of the display signal input source in the partition sub-areas are acquired; and the RGB basic hue values of the partition sub-areas in each partition area are mapped to the corresponding LED lamp beads to generate a spatial gradient effect. The atmosphere lamp video data processing method therefore allows the atmosphere lamp to follow the display picture with a spatially graded visual effect and to transition smoothly, giving a good user experience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments of the present invention will be briefly described below, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic diagram illustrating an atmosphere lamp distribution principle of an atmosphere lamp video data processing method according to an embodiment of the present invention.
Fig. 2 is a flowchart of an ambience lamp video data processing method according to an embodiment of the present invention.
FIG. 3 is a flowchart of an ambience lamp video data processing method according to another embodiment of the present invention.
FIG. 4 is a schematic diagram of an ambience lamp video data processing method according to another embodiment of the present invention.
Fig. 5 is a schematic diagram illustrating a partitioning principle of an atmosphere light video data processing method according to an embodiment of the present invention.
FIG. 6 is a schematic diagram of the pixel range determination of the ambience lamp video data processing method according to an embodiment of the invention.
FIG. 7 is a schematic diagram of a pixel range determination method for atmosphere light video data processing according to another embodiment of the present invention.
Fig. 8 is a schematic diagram illustrating the principle of partitioning sub-areas of an atmosphere light video data processing method according to an embodiment of the present invention.
Fig. 9 is a schematic structural diagram of an ambience lamp video data processing device according to an embodiment of the present invention.
Fig. 10 is a schematic structural diagram of another atmosphere lamp video data processing apparatus according to an embodiment of the present invention.
Fig. 11 is a schematic structural diagram of an atmosphere lamp video data processing device according to an embodiment of the present invention.
Detailed Description
Features and exemplary embodiments of various aspects of the present invention will be described in detail below, and in order to make objects, technical solutions and advantages of the present invention more apparent, the present invention will be further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not to be construed as limiting the invention. It will be apparent to one skilled in the art that the present invention may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present invention by illustrating examples of the present invention.
It should be noted that, in this document, relational terms such as first and second are used solely to distinguish one entity or action from another and do not necessarily require or imply any actual such relationship or order between those entities or actions. Moreover, the terms "comprises", "comprising" or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article or apparatus. Without further limitation, an element preceded by "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or apparatus that comprises the element.
Referring to fig. 2, fig. 2 is a method for processing ambience lamp video data according to the present application, the method including the following steps:
s1, receiving a display signal input source of a display terminal, and dividing a pixel point region of the display terminal into a plurality of subarea regions according to the number of lamp beads of an atmosphere lamp; the plurality of subarea areas take the middle point of the display terminal as a picture center point to show a radial type and expand towards the outside of a display picture;
specifically, in the application, a display signal input source is received from a display terminal, so that R, G and B pixel values of the display terminal are acquired as an atmosphere lamp display data source. It is understood that the interface for acquiring the display signal input source includes one or more of an HDMI interface, a DP interface, or an LVDS interface, and is not limited in particular herein.
Specifically, referring to fig. 1, the atmosphere lamp in the present application is formed by connecting n LED lamp beads with red (R), green (G) and blue (B) channels in series, distributed along the upper, lower, left and right edges on the back of the display terminal. The present application divides the pixel point region of the display terminal into a plurality of partition areas according to the number of lamp beads of the atmosphere lamp; the number of lamp beads can be set according to the length and width of the display terminal and is not specifically limited here.
Specifically, the present application divides the pixel point region of the display terminal into a plurality of partition areas according to the number of lamp beads, processes the data in each partition area, and maps the R, G and B pixel values of each partition area to the corresponding LED lamp bead, so that the atmosphere lamp follows the display picture of the display terminal with a regular, coordinated lighting effect.
S2, acquiring a row pixel range and a column pixel range of each partition area;
specifically, the present application obtains a row pixel range and a column pixel range of each of the plurality of partition areas through pixel scanning. The pixel scanning process specifically comprises the steps of refreshing from left to right in each of the plurality of partition areas, determining the row pixel range of a row of pixels after completing the scanning of the row of pixels, scanning the next row, and continuing refreshing from left to right until the last pixel point in the partition area is refreshed. It is to be understood that the pixel scanning process in this application may also be a line-by-line scanning from right to left, from top to bottom, or a line-by-line scanning from right to left, from bottom to top, which is not limited herein.
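For illustration only, a minimal sketch of this raster-scan step might look as follows (Python; the in_partition predicate and the function name are illustrative assumptions, not part of the disclosed embodiments):

```python
def scan_partition(width, height, in_partition):
    """Raster-scan left to right, top to bottom, and collect the pixel range of one partition."""
    row_ranges = {}           # y -> (x_min, x_max) of this partition's pixels in row y
    col_range = [None, None]  # overall (y_min, y_max) covered by the partition
    for y in range(height):
        xs = [x for x in range(width) if in_partition(x, y)]  # in_partition is an assumed predicate
        if xs:
            row_ranges[y] = (xs[0], xs[-1])
            if col_range[0] is None:
                col_range[0] = y
            col_range[1] = y
    return row_ranges, tuple(col_range)
```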
S3, dividing each partition area into a plurality of partition sub-areas according to the row pixel range and the column pixel range of each partition area;
specifically, after the row pixel range and the column pixel range of each of the plurality of partition areas are obtained, each of the plurality of partition areas is divided into a plurality of partition sub-areas according to the present application. For example, a subarea area corresponding to a long and peripheral first LED lamp bead on the display terminal is divided into three subarea sub-areas, a subarea area corresponding to a long and peripheral second LED lamp bead on the display terminal is divided into three subarea sub-areas, and a subarea area corresponding to a long and peripheral third LED lamp bead on the display terminal is divided into three subarea sub-areas.
It should be noted that, in the present application, it is preferable to divide each of the plurality of partitioned areas into a plurality of partitioned sub-areas with the same number, so as to implement a smooth transition of the atmosphere lamp following the flicker of the display terminal, and improve the visual display effect.
S4, in one frame of display picture data, acquiring RGB basic tone values of the display signal input source in a plurality of subarea subregions;
and S5, mapping the RGB basic tone values of the plurality of subarea subareas in the plurality of subarea areas to the corresponding LED lamp beads to generate a spatial gradient effect.
Specifically, after each partition area is divided into the same number of partition sub-areas, the basic hue values ΣR1, ΣG1, ΣB1 through ΣRn, ΣGn, ΣBn of each frame of display picture data are obtained through spatial filtering, and the RGB basic hue values of the partition sub-areas in each partition area are mapped to the corresponding LED lamp beads, so that the atmosphere lamp flickers hierarchically with each frame of picture data and changes in picture content produce a spatial gradient effect.
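A simplified sketch of this per-frame statistics and mapping step is given below; the frame array, the sub-area masks and the set_bead_color driver call are illustrative assumptions, and the equal weighting of sub-areas is a simplification of the weighted variant described later:

```python
import numpy as np

def map_frame_to_beads(frame, subareas, set_bead_color):
    """frame: H x W x 3 RGB array; subareas[k]: list of boolean masks for bead k's sub-areas."""
    for k, masks in enumerate(subareas):
        # Base hue value of each sub-area: mean R, G, B over its pixels (spatial filtering)
        sub_rgb = [frame[m].reshape(-1, 3).mean(axis=0) for m in masks]
        # Combine the sub-area values into one colour for the k-th LED bead
        # (equal weighting here; preset weight coefficients are applied in a later step)
        rgb = np.mean(sub_rgb, axis=0)
        set_bead_color(k, tuple(int(c) for c in rgb))  # stands in for the strip driver call
```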
In summary, the present application provides an atmosphere lamp video data processing method. In this scheme, a display signal input source of a display terminal is received, and the pixel point region of the display terminal is divided into a plurality of partition areas according to the number of lamp beads of the atmosphere lamp; a row pixel range and a column pixel range of each partition area are acquired; each partition area is divided into a plurality of partition sub-areas according to its row pixel range and column pixel range; within one frame of display picture data, RGB basic hue values of the display signal input source in the partition sub-areas are acquired; and the RGB basic hue values of the partition sub-areas in each partition area are mapped to the corresponding LED lamp beads to generate a spatial gradient effect. The method therefore allows the atmosphere lamp to follow the display picture with a spatially graded visual effect, enriches the display effect, and gives a good user experience.
On the basis of the above-described embodiment:
referring to fig. 3, fig. 3 is a flowchart of another atmosphere lamp video data processing method provided in the present application.
Referring to fig. 4, fig. 4 is a schematic diagram of another atmosphere lamp video data processing method provided in the present application.
As a preferred embodiment, the ambience light video data processing method further includes:
s6, in multi-frame display picture data, RGB basic tone values of the display signal input source in the plurality of subarea subregions are obtained;
s7, obtaining a mean value of the RGB basic hue values;
and S8, mapping the RGB basic tone mean value of the plurality of subarea areas in each of the plurality of subarea areas to the corresponding LED lamp beads to generate a smooth transition effect.
Specifically, in this embodiment, the data finally used to drive each LED lamp bead in the atmosphere light strip is obtained by taking a weighted average of the statistics produced from several adjacent frames of picture data. By weighted-averaging the RGB basic hue values of the partition sub-areas before mapping them to the light strip, the atmosphere lamp transitions more smoothly as the display terminal's picture changes, and high-frequency flicker is avoided.
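A possible sketch of this multi-frame weighted averaging is shown below; the window length and the increasing weights are illustrative choices, since the description only requires a weighted average over adjacent frames:

```python
from collections import deque
import numpy as np

class TemporalSmoother:
    """Weighted average of the per-bead RGB statistics over the last few frames."""
    def __init__(self, window=4):
        self.history = deque(maxlen=window)
        self.weights = np.arange(1, window + 1, dtype=float)  # newer frames weighted more (assumption)

    def update(self, bead_rgb):
        """bead_rgb: array of shape (n_beads, 3) for the current frame; returns the smoothed values."""
        self.history.append(np.asarray(bead_rgb, dtype=float))
        w = self.weights[-len(self.history):]
        stacked = np.stack(list(self.history))                # (frames, n_beads, 3)
        return (stacked * w[:, None, None]).sum(axis=0) / w.sum()
```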
Referring to fig. 5, fig. 5 is a schematic diagram illustrating a partition division principle of another atmosphere lamp video data processing method provided in the present application.
As a preferred embodiment, the receiving a display signal input source of a display terminal, and dividing a pixel point region of the display terminal into a plurality of subareas according to the number of lamp beads of an atmosphere lamp includes:
establishing a plane rectangular coordinate system by taking the central point of the display terminal as the origin of coordinates;
specifically, in this embodiment, a planar rectangular coordinate system is established with the central point of the display terminal as the origin of coordinates, so as to divide the display frame of the display terminal into four parts, namely a first quadrant area, a second quadrant area, a third quadrant area and a fourth quadrant area.
Dividing the long edge and the wide edge of the display terminal into a plurality of equal parts according to the number of lamp beads of the atmosphere lamp distributed on the long edge and the wide edge of the display terminal;
specifically, as shown in fig. 5, taking the second quadrant as an example, if the number of LED lamp beads of the atmosphere lamp distributed on the X axis of the second quadrant, that is, on one half of the long side of the display terminal, is N, the left long side of the display terminal is divided into N equal parts.
Obtaining coordinates of each critical point which divides the long edge and the wide edge of the display terminal into a plurality of equal parts;
and connecting each critical point coordinate with the central point coordinate of the display terminal, and dividing the pixel point region of the display terminal into a plurality of subarea regions.
Specifically, each critical point coordinate is connected with the centre point coordinate of the display terminal, so that the partition areas expand radially across the display picture from the coordinate origin as the centre point, i.e. each partition area is triangular. It can be understood that the partition areas do not overlap, so the data mapped to each LED lamp bead is distinct; the atmosphere lamp can therefore be controlled more comprehensively to follow the display picture, improving the effect of synchronous display.
It should be noted that, in an embodiment, the display screen may be equally divided into N equal parts according to the number of LED lamp beads of the atmosphere lamp, with the serial numbers of the LED lamp beads corresponding in sequence to the serial numbers of the partition areas; this is not specifically limited here.
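For illustration, the radial partitioning of one edge could be sketched as follows; the vertex-triple representation and the function name are assumptions, and the other three edges would be handled analogously:

```python
def radial_partitions(zp, hp, beads_on_top):
    """One triangular partition per bead on the top edge, as (origin, point, point) vertex triples."""
    origin = (0.0, 0.0)  # screen centre = origin of the rectangular coordinate system
    # Critical points splitting the top edge into equal parts, from -zp/2 to +zp/2
    pts = [(-zp / 2 + i * zp / beads_on_top, hp / 2) for i in range(beads_on_top + 1)]
    # Each partition is the triangle between the centre and two adjacent critical points
    return [(origin, a, b) for a, b in zip(pts, pts[1:])]
```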
Referring to fig. 6, fig. 6 is a schematic diagram illustrating a pixel range determination method of an ambience light video data processing method according to the present application.
As a preferred embodiment, the acquiring the row pixel range and the column pixel range of each of the plurality of subarea areas comprises:
acquiring the row pixel range and the column pixel range of the plurality of subarea areas in a first quadrant;
acquiring the row pixel range and the column pixel range of the plurality of subarea areas in a second quadrant;
acquiring the row pixel range and the column pixel range of the plurality of subarea areas in a third quadrant;
and acquiring the row pixel range and the column pixel range of the plurality of subarea areas in the fourth quadrant.
Specifically, the method and the device perform line-by-line scanning on the row pixels of each partition area in the first quadrant, the second quadrant, the third quadrant and the fourth quadrant in sequence to determine the row pixel range and the column pixel range of each partition area.
As a preferred embodiment, the acquiring the row pixel range and the column pixel range of the plurality of partition areas in the first quadrant comprises:
the range of the pixels in the column is as follows:
0 ≤ Y ≤ hp/2;
the row pixel range is: X1 + zp/2 < X < X0 + zp/2, wherein
X0 = (zp·k/m − zp/2)·(hp/2 − Y)/(hp/2),
X1 = (zp·(k−1)/m − zp/2)·(hp/2 − Y)/(hp/2);
the video size of the display terminal is zp × hp, zp is the long side of the video size, hp is the wide side of the video size, k is the serial number of the atmosphere lamp bead on the long side or the wide side of the display terminal, and m is the total number of lamp beads on the long side or the wide side of the display terminal.
Specifically, taking the first quadrant as an example, as shown in fig. 6, the shaded area is a range of the sub-area corresponding to the kth LED lamp bead in the first quadrant. According to the method and the device, the shadow area is scanned line by line, and the ranges of the left side and the right side of the shadow area are calculated to determine the coordinate range of the partition area corresponding to the Kth LED lamp bead in the first quadrant.
Specifically, assume that the screen ratio of the display terminal is 16:9, so the video picture size can be set to 16p × 9p. The number of LED lamp beads at the top of the display screen in the first quadrant is m. The input pixels correspond to rows X and columns Y; since the k-th partition area falls completely within the first quadrant, the column pixel range, i.e. the range of the Y coordinate, is half of the wide side of the video picture size, namely 0 ≤ Y ≤ 4.5p.
Specifically, as shown in fig. 6, the partition area corresponding to the k-th LED lamp bead is the shaded area, whose end points are the three points B1, C1 and O. The segment A1G1 is half of the long side of the display terminal, i.e. A1G1 = L1 = 16p/2. If the total number of lamp beads on the long side or the wide side of the display terminal is m, the length occupied by each LED lamp bead is 16p/m; the length occupied by the k-th LED lamp bead is L2 = 16pk/m, that is, B1G1 = L7 = L2 − 16p/2 = 16pk/m − 16p/2; the length occupied by the (k−1)-th LED lamp bead is L3 = 16p(k−1)/m, that is, G1C1 = L6 = L3 − 16p/2 = 16p(k−1)/m − 16p/2. The line connecting the points D1, E1 and F1 is parallel to G1A1, and the variation range in the Y direction is 0 to 4.5p; assuming the length of G1F1 is Y, then OF1 = L4 = 9p/2 − Y and OG1 = L5 = 9p/2.
Specifically, in the triangle OB1G1, the following relationship can be obtained according to the trigonometric function:
D1F1/B1G1 = OF1/OG1;
E1F1/C1G1 = OF1/OG1;
namely: X0/L7 = L4/L5;
X1/L6 = L4/L5;
substituting the formula yields:
X0 = L7·L4/L5 = (16pk/m − 8p)·(9p/2 − Y)/(9p/2);
X1 = L6·L4/L5 = (16p(k−1)/m − 8p)·(9p/2 − Y)/(9p/2);
thus:
X0 = (144pk − 32kY − 72pm + 16mY)/9m;
X1 = (144pk − 32kY − 144p + 32Y − 72pm + 16mY)/9m;
the row pixel range of the first quadrant is then: X1 + 16p/2 < X < X0 + 16p/2.
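The first-quadrant range above can be evaluated directly; the following sketch simply encodes the expressions for X0, X1 and the row pixel range, with variable names following the text's notation:

```python
def first_quadrant_row_range(p, k, m, y):
    """Row pixel range of the k-th bead's partition in the first quadrant of a 16p x 9p picture."""
    x0 = (144 * p * k - 32 * k * y - 72 * p * m + 16 * m * y) / (9.0 * m)
    x1 = (144 * p * k - 32 * k * y - 144 * p + 32 * y - 72 * p * m + 16 * m * y) / (9.0 * m)
    # X1 + 16p/2 < X < X0 + 16p/2 at column offset y (0 <= y <= 4.5p)
    return x1 + 16 * p / 2.0, x0 + 16 * p / 2.0
```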
Referring to fig. 7, fig. 7 is a schematic diagram illustrating a pixel range determination method for atmosphere video data processing according to another embodiment of the present disclosure.
Specifically, taking the second quadrant as an example, as shown in fig. 7, the shaded area is a range of the partition area corresponding to the kth LED lamp bead in the second quadrant. According to the method and the device, the shadow area is scanned line by line, and the ranges of the left side and the right side of the shadow area are calculated to determine the coordinate range of the partition area corresponding to the Kth LED lamp bead in the second quadrant.
Specifically, assume that the screen ratio of the display terminal is 16:9, so the video picture size can be set to 16p × 9p. The number of LED lamp beads at the top of the display screen in the second quadrant is m. The input pixels correspond to rows X and columns Y; since the k-th partition area falls completely within the second quadrant, the column pixel range, i.e. the range of the Y coordinate, is half of the wide side of the video picture size, namely 0 ≤ Y ≤ 4.5p.
Specifically, as shown in fig. 7, the partition area corresponding to the k-th LED lamp bead is the shaded area, whose end points are the three points B, C and O. The segment AG is half of the long side of the display terminal, i.e. AG = L1 = 16p/2. If the total number of lamp beads on the long side or the wide side of the display terminal is m, the length occupied by each LED lamp bead is 16p/m; the length occupied by the k-th LED lamp bead is 16pk/m, namely AC = L2 = 16pk/m; the length occupied by the (k−1)-th LED lamp bead is 16p(k−1)/m, namely AB = L3 = 16p(k−1)/m. The line connecting the points D, E and F is parallel to GA, and the variation range in the Y direction is 0 to 4.5p; assuming the length of GF is Y, then OF = L4 = 9p/2 − Y, OG = L5 = 9p/2, GC = L6 = L1 − L2, and BG = L7 = L1 − L3.
Specifically, in the triangular OBG, the following relationship can be obtained according to the trigonometric function:
DF/BG=OF/OG;
EF/CG=OF/OG;
namely: X0/L7= L4/L5;
X1/L6=L4/L5;
substituting the formula yields:
X0 = L7·L4/L5 = (8p − 16p(k−1)/m)·(9p/2 − Y)/(9p/2);
X1 = L6·L4/L5 = (8p − 16pk/m)·(9p/2 − Y)/(9p/2);
thus:
X0 = 8p − (144pk + 144p + 32kY − 16mY − 32Y)/9m;
X1 = 8p + (32kY − 144pk)/9m − 16Y/9;
the row pixel range of the second quadrant is then: 16p/2 − X0 < X < 16p/2 − X1.
Specifically, the row pixel range and the column pixel range corresponding to the LED lamp beads in the third quadrant are obtained in the same manner and are not described again here; the row pixel range and the column pixel range corresponding to each LED lamp bead in the fourth quadrant are likewise obtained in the same manner and are not described again here.
Referring to fig. 8, fig. 8 is a schematic diagram illustrating a partition sub-area division of another atmosphere lamp video data processing method provided in the present application.
As a preferred embodiment, said dividing each of said plurality of partition areas into a plurality of partition sub-areas according to a row pixel range and a column pixel range of each of said plurality of partition areas comprises:
dividing the row pixel range according to preset weight, and dividing each of the plurality of subarea areas into a plurality of subarea subareas.
Specifically, the present application scans each partition area line by line, determines which region each pixel falls into, accumulates the pixel into the corresponding region, and finally outputs the result. The row pixel range is then divided according to preset weights, so that each partition area is split into a plurality of partition sub-areas. For example, for the k-th lamp bead, the partition sub-areas Σk_1, Σk_2 and Σk_3 are obtained by restricting Y to the value ranges y1, y2 and y3 and deriving the corresponding row ranges x0, x1 and x2 from the formulas above, thereby determining the extent of Σk_1, Σk_2 and Σk_3.
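A sketch of this sub-area division, reusing the first_quadrant_row_range helper from the earlier sketch, is given below; the equal Y bands are only an illustrative choice for y1, y2 and y3:

```python
def partition_subareas(p, k, m, n_sub=3):
    """Membership tests for the n_sub sub-areas of the k-th first-quadrant partition."""
    band = (9 * p / 2.0) / n_sub          # split the column range 0..4.5p into equal bands (assumption)

    def make_test(y_lo, y_hi):
        def contains(x, y):
            lo, hi = first_quadrant_row_range(p, k, m, y)   # row bounds at this y (sketch above)
            return y_lo <= y < y_hi and lo < x < hi
        return contains

    return [make_test(i * band, (i + 1) * band) for i in range(n_sub)]
```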
As a preferred embodiment, the mapping the RGB basic tone values of the plurality of subareas in each of the plurality of subareas into the corresponding LED lamp beads to generate the spatial gradient effect includes:
and multiplying the RGB basic tone values of the plurality of subarea areas in each subarea area by a preset weight coefficient, and mapping the RGB basic tone values to the corresponding LED lamp beads to generate a spatial gradual change effect.
Specifically, the RGB basic hue values of the partition sub-areas can be multiplied by corresponding coefficients r1, r2 and r3 according to actual requirements; for example, the coefficients r1, r2 and r3 may follow an arithmetic progression, which is not specifically limited here.
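For example, the weighting step could be sketched as follows; the arithmetic-progression coefficients shown are only an illustration:

```python
def weighted_bead_color(sub_rgb, r=(0.5, 1.0, 1.5)):
    """sub_rgb: three (R, G, B) tuples, one per partition sub-area; r: example arithmetic progression."""
    total = sum(r)
    return tuple(
        sum(ri * c[ch] for ri, c in zip(r, sub_rgb)) / total
        for ch in range(3)
    )
```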
Referring to fig. 9, fig. 9 is a block diagram of an ambience light video data processing apparatus according to the present application.
The embodiment of the invention provides an atmosphere lamp video data processing device, which comprises:
the first partitioning module 1 is used for receiving a display signal input source of a display terminal and partitioning a pixel point region of the display terminal into a plurality of partitioning regions according to the number of lamp beads of an atmosphere lamp;
the range determining module 2 is used for acquiring a row pixel range and a column pixel range of each of the plurality of subarea areas;
the second dividing module 3 is configured to divide each of the plurality of partition areas into a plurality of partition sub-areas according to a row pixel range and a column pixel range of each of the plurality of partition areas;
a first RGB tone value extraction module 4, configured to obtain, in a frame of display image data, RGB basic tone values of the display signal input source in the plurality of partition sub-areas;
the first mapping module 5 is configured to map RGB basic hue values of a plurality of sub-regions in each of the plurality of sub-regions into corresponding LED lamp beads to generate a spatial gradient effect.
Referring to fig. 10, fig. 10 is a block diagram of an ambience lamp video data processing device according to the present application.
The embodiment of the invention provides an atmosphere lamp video data processing device, which further comprises:
a second RGB hue value extraction module 6, configured to obtain, in the multi-frame display picture data, RGB basic hue values of the display signal input source in the multiple partition sub-areas;
a calculating module 7, configured to obtain a mean value of the RGB basic hue values;
and the second mapping module 8 is configured to map the RGB-based color mean values of the plurality of subareas in each of the plurality of subareas to the corresponding LED lamp beads to generate a smooth transition effect.
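Purely as an illustration of how these modules could be wired together, a minimal pipeline sketch is given below; all class, attribute and method names are assumptions, not the patent's implementation:

```python
class AmbientLightProcessor:
    """Illustrative composition of the dividing, range, extraction, mapping and smoothing modules."""
    def __init__(self, divider, range_finder, sub_divider, extractor, mapper, smoother=None):
        self.divider = divider            # first dividing module
        self.range_finder = range_finder  # range determining module
        self.sub_divider = sub_divider    # second dividing module
        self.extractor = extractor        # RGB hue value extraction module
        self.mapper = mapper              # mapping module
        self.smoother = smoother          # optional multi-frame smoothing (modules 6-8)

    def process(self, frame, n_beads):
        parts = self.divider(frame.shape, n_beads)
        ranges = self.range_finder(parts)
        subareas = self.sub_divider(parts, ranges)
        rgb = self.extractor(frame, subareas)
        if self.smoother is not None:
            rgb = self.smoother(rgb)
        return self.mapper(rgb)
```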
In addition, the atmosphere lamp video data processing method according to the embodiment of the present invention described in conjunction with fig. 1 may be implemented by an atmosphere lamp video data processing apparatus. Fig. 11 is a schematic diagram illustrating a hardware structure of an ambience lamp video data processing device according to an embodiment of the present invention.
The ambience lamp video data processing device may include a processor 401 and a memory 402 storing computer program instructions.
In particular, the processor 401 may include a Central Processing Unit (CPU), or an Application Specific Integrated Circuit (ASIC), or may be configured as one or more Integrated circuits implementing embodiments of the present invention.
Memory 402 may include a mass storage for data or instructions. By way of example, and not limitation, memory 402 may include a Hard Disk Drive (HDD), floppy Disk Drive, flash memory, optical Disk, magneto-optical Disk, magnetic tape, or Universal Serial Bus (USB) Drive or a combination of two or more of these. Memory 402 may include removable or non-removable (or fixed) media, where appropriate. The memory 402 may be internal or external to the data processing apparatus, where appropriate. In a particular embodiment, the memory 402 is non-volatile solid-state memory. In a particular embodiment, the memory 402 includes Read Only Memory (ROM). Where appropriate, the ROM may be mask-programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically Erasable PROM (EEPROM), electrically rewritable ROM (EAROM), or flash memory, or a combination of two or more of these.
Processor 401 may implement any of the above described embodiments of the ambience lamp video data processing method by reading and executing computer program instructions stored in memory 402.
In one example, the ambience light video data processing device may further comprise a communication interface 403 and a bus 410. As shown in fig. 11, the processor 401, the memory 402, and the communication interface 403 are connected via a bus 410 to complete communication therebetween.
The communication interface 403 is mainly used for implementing communication between modules, apparatuses, units and/or devices in the embodiment of the present invention.
Bus 410 comprises hardware, software, or both coupling the components of the ambience light video data processing device to each other. By way of example, and not limitation, a bus may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a Hyper Transport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an infiniband interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCI-X) bus, a Serial Advanced Technology Attachment (SATA) bus, a video electronics standards association local (VLB) bus, or other suitable bus, or a combination of two or more of these. Bus 410 may include one or more buses, where appropriate. Although embodiments of the invention describe and illustrate a particular bus, the invention contemplates any suitable bus or interconnect.
In addition, in combination with the ambient light video data processing method in the foregoing embodiment, the embodiment of the present invention may be implemented by providing a computer-readable storage medium. The computer readable storage medium having stored thereon computer program instructions; the computer program instructions, when executed by a processor, implement any one of the above-described methods of ambience light video data processing.
It should also be noted that the exemplary embodiments mentioned in this patent describe some methods or systems based on a series of steps or devices. However, the present invention is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from the order in the embodiments, or may be performed simultaneously.
As described above, only the specific embodiments of the present invention are provided, and it can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the system, the module and the unit described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. It should be understood that the scope of the present invention is not limited thereto, and any equivalent modifications or substitutions can be easily made by those skilled in the art within the technical scope of the present invention.

Claims (10)

1. An ambience light video data processing method, characterized in that the method comprises:
receiving a display signal input source of a display terminal, and dividing a pixel point region of the display terminal into a plurality of subareas according to the number of lamp beads of the atmosphere lamp; the middle point of the display terminal is used as a picture center point of the plurality of subarea areas to show a radial type to expand outside the display picture;
acquiring a row pixel range and a column pixel range of each partition area;
dividing each of the plurality of subarea areas into a plurality of subarea subareas according to the row pixel range and the column pixel range of each of the plurality of subarea areas;
in one frame of display picture data, acquiring RGB basic tone values of the display signal input source in a plurality of subarea subareas;
and mapping the RGB basic tone values of a plurality of subarea areas in each subarea area to corresponding LED lamp beads to generate a spatial gradient effect.
2. The ambience lamp video data processing method of claim 1, wherein the ambience lamp video data processing method further comprises:
in multi-frame display picture data, acquiring RGB basic tone values of the display signal input source in a plurality of subarea subareas;
acquiring a mean value of the RGB basic hue values;
and mapping the RGB basic tone mean values of a plurality of subarea areas in each subarea area to corresponding LED lamp beads to generate a smooth transition effect.
3. The ambience lamp video data processing method of claim 1, wherein the receiving of the display signal input source of the display terminal and the dividing of the pixel area of the display terminal into the plurality of subareas according to the number of the lamp beads of the ambience lamp comprises:
establishing a plane rectangular coordinate system by taking the central point of the display terminal as the origin of coordinates;
dividing the long edge and the wide edge of the display terminal into a plurality of equal parts according to the number of lamp beads of the atmosphere lamp distributed on the long edge and the wide edge of the display terminal;
obtaining coordinates of each critical point which divides the long edge and the wide edge of the display terminal into a plurality of equal parts;
and connecting each critical point coordinate with the central point coordinate of the display terminal, and dividing the pixel point region of the display terminal into a plurality of subarea regions.
4. The ambience lamp video data processing method of claim 3, wherein the obtaining of the range of row pixels and the range of column pixels for each of the plurality of partitioned areas comprises:
acquiring the row pixel range and the column pixel range of the plurality of subarea areas in a first quadrant;
acquiring the row pixel range and the column pixel range of the plurality of subarea areas in a second quadrant;
acquiring the row pixel range and the column pixel range of the plurality of subarea areas in a third quadrant;
and acquiring the row pixel range and the column pixel range of the plurality of subarea areas in the fourth quadrant.
5. The ambience lamp video data processing method of claim 4, wherein the obtaining of the row pixel range and the column pixel range of the plurality of divided areas in the first quadrant comprises:
the range of the pixels in the column is as follows:
0 ≤ Y ≤ hp/2;
the row pixel range is: X1 + zp/2 < X < X0 + zp/2, wherein
X0 = (zp·k/m − zp/2)·(hp/2 − Y)/(hp/2),
X1 = (zp·(k−1)/m − zp/2)·(hp/2 − Y)/(hp/2);
The video size of the display terminal is zp multiplied by hp, zp is the long side of the video size, hp is the wide side of the video size, k is the serial number of the atmosphere lamp beads on the long side or the wide side of the display terminal, and m is the total number of the lamp beads on the long side or the wide side of the display terminal.
6. The ambience light video data processing method of claim 1, wherein the dividing each of the plurality of partition regions into a plurality of partition sub-regions in dependence on a row pixel range and a column pixel range of each of the plurality of partition regions comprises:
dividing the row pixel range according to a preset weight, and dividing each of the plurality of subarea areas into a plurality of subarea subareas.
7. The ambience lamp video data processing method of claim 1, wherein the mapping of the RGB basic hue values of the plurality of subareas of each of the plurality of subareas into corresponding LED lamp beads to generate the spatial gradient effect comprises:
and after the RGB basic tone values of a plurality of subarea subregions in each subarea region are multiplied by a preset weight coefficient, mapping the RGB basic tone values to corresponding LED lamp beads to generate a spatial gradient effect.
8. An ambience light video data processing device, characterized in that the device comprises:
the first dividing module is used for receiving a display signal input source of the display terminal and dividing a pixel point region of the display terminal into a plurality of divided regions according to the number of lamp beads of the atmosphere lamp;
the range determining module is used for acquiring a row pixel range and a column pixel range of each partition area;
the second dividing module is used for dividing each of the plurality of subarea areas into a plurality of subarea subareas according to the row pixel range and the column pixel range of each of the plurality of subarea areas;
the first RGB tone value extraction module is used for acquiring RGB basic tone values of the display signal input source in a plurality of subarea subareas in one frame of display picture data;
the first mapping module is used for mapping the RGB basic tone values of the plurality of subarea sub-areas in each subarea area to the corresponding LED lamp beads to generate a spatial gradient effect.
9. An ambience light video data processing device, characterized by comprising: at least one processor, at least one memory, and computer program instructions stored in the memory that, when executed by the processor, implement the method of any of claims 1-7.
10. A storage medium having computer program instructions stored thereon, which when executed by a processor implement the method of any one of claims 1-7.
CN202211086299.8A 2022-09-06 2022-09-06 Atmosphere lamp video data processing method, device, equipment and storage medium Pending CN115474306A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211086299.8A CN115474306A (en) 2022-09-06 2022-09-06 Atmosphere lamp video data processing method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211086299.8A CN115474306A (en) 2022-09-06 2022-09-06 Atmosphere lamp video data processing method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115474306A true CN115474306A (en) 2022-12-13

Family

ID=84371045

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211086299.8A Pending CN115474306A (en) 2022-09-06 2022-09-06 Atmosphere lamp video data processing method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115474306A (en)


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115810338A (en) * 2023-02-03 2023-03-17 深圳市蔚来芯科技有限公司 Display processing method and system based on image scene
CN117202451A (en) * 2023-11-07 2023-12-08 深圳市千岩科技有限公司 Atmosphere lamp equipment, and light-emitting control method, device and medium thereof
CN117202451B (en) * 2023-11-07 2024-01-30 深圳市千岩科技有限公司 Atmosphere lamp equipment, and light-emitting control method, device and medium thereof
CN117412451A (en) * 2023-12-13 2024-01-16 深圳市千岩科技有限公司 Atmosphere lamp equipment, mapping color matching method thereof, corresponding device and medium
CN117412451B (en) * 2023-12-13 2024-03-15 深圳市千岩科技有限公司 Atmosphere lamp equipment, mapping color matching method thereof, corresponding device and medium

Similar Documents

Publication Publication Date Title
CN115474306A (en) Atmosphere lamp video data processing method, device, equipment and storage medium
KR100791375B1 (en) Apparatus and method for color correction
US8003927B2 (en) Image projection apparatus which projects an image corrected according to a projection surface
TWI511559B (en) Image processing method
CN110910333B (en) Image processing method and image processing apparatus
US20160329027A1 (en) Image processing device with image compensation function and image processing method thereof
KR20080015101A (en) Color transformation luminance correction method and device
CN104811677A (en) Display control method and device of mobile terminal
KR101985880B1 (en) Display device and control method thereof
CN113053324A (en) Backlight control method, device, equipment, system and storage medium
WO2022156129A1 (en) Image processing method, image processing apparatus, and computer device
US11889083B2 (en) Image display method and device, image recognition method and device, storage medium, electronic apparatus, and image system
CN113597061A (en) Method, apparatus and computer readable storage medium for controlling a magic color light strip
WO2016110094A1 (en) Method and device for projection control and projection device
KR20190080732A (en) Estimation of illumination chromaticity in automatic white balancing
US7532773B2 (en) Directional interpolation method and device for increasing resolution of an image
WO2024050699A1 (en) Atmosphere lamp video data processing method, apparatus and device, and storage medium
CN106709959B (en) method and device for recognizing chocolate plate and electronic equipment
CN117062282A (en) Light control method, device, equipment, storage medium and vehicle
CN113507572A (en) Video picture display method, device, terminal and storage medium
CN108765502B (en) Color appearance obtaining method in complex environment
CN103986922B (en) Image processing method
KR20160078023A (en) Apparatus and method for controlling display
CN109996017B (en) Image adjusting method and terminal thereof
CN112488972A (en) Method and device for synthesizing green screen image and virtual image in real time

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination