CN114040535A - Method and system for converting video image change data into LED three-dimensional multicolor effect - Google Patents

Method and system for converting video image change data into LED three-dimensional multicolor effect

Info

Publication number
CN114040535A
CN114040535A (application CN202111347432.6A)
Authority
CN
China
Prior art keywords
value, pix, data, formula, difference
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111347432.6A
Other languages
Chinese (zh)
Other versions
CN114040535B
Inventor
顾培德
苏亚飞
喻姜文
Current Assignee
Shenzhen Wanunicom Intelligent Technology Co ltd
Original Assignee
Shenzhen Wanunicom Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Wanunicom Intelligent Technology Co ltd filed Critical Shenzhen Wanunicom Intelligent Technology Co ltd
Priority to CN202111347432.6A
Priority to PCT/CN2021/133943 (WO2023082361A1)
Publication of CN114040535A
Application granted
Publication of CN114040535B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B: ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B45/00: Circuit arrangements for operating light-emitting diodes [LED]
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/01: Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0125: Conversion of standards, one of the standards being a high definition standard
    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B: ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00: Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10: Controlling the light source
    • H05B47/105: Controlling the light source in response to determined parameters
    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B: ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00: Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10: Controlling the light source
    • H05B47/155: Coordinated control of two or more light sources
    • H: ELECTRICITY
    • H05: ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B: ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00: Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10: Controlling the light source
    • H05B47/165: Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The invention is applicable to the field of atmosphere lamp control, and provides a method and a system for converting video image change data into an LED three-dimensional multicolor effect, wherein the method comprises the following steps: step S1: acquiring video stream information from an HDMI (high-definition multimedia interface); step S2: re-encoding the video stream to a fixed resolution; step S3: comparing frame changes within the corresponding range according to the refresh rate of the video stream; step S4: comparing the acquired data difference with that of the previous picture frame; step S5: packing the changed YUV key data together with the corresponding position information and sending it; step S6: sending the changed data to the LED control device in a wired or wireless manner and controlling the corresponding LED lamps to make a matching change in the displayed atmosphere. This solves the technical problem in the prior art that an atmosphere lamp cannot be linked with the television picture so as to produce an ambient environment matching that picture.

Description

Method and system for converting video image change data into LED three-dimensional multicolor effect
Technical Field
The invention belongs to the field of atmosphere lamp control, and particularly relates to a method and a system for converting video image change data into an LED three-dimensional multicolor effect.
Background
As life becomes richer, people's demand for home screen atmosphere keeps growing. At present, atmosphere lighting is created by stand-alone controllers, and the resulting atmosphere is monotonous or changes only in simple ways; it cannot follow the picture broadcast on the television, so the atmosphere cannot change along with the televised picture.
Disclosure of Invention
The invention aims to provide a method and a system for converting video image change data into an LED three-dimensional multicolor effect, so as to solve the technical problem in the prior art that an atmosphere lamp cannot be linked with the television picture to produce an ambient environment matching that picture.
The invention is realized by a method for converting video image change data into an LED three-dimensional multicolor effect, which comprises the following steps:
step S1: acquiring video stream information from an HDMI (high-definition multimedia interface);
step S2: re-encoding the video stream to a fixed size resolution;
step S3: comparing the frame change in the corresponding range according to the refresh rate of the video stream;
step S4: comparing the acquired data difference with the data difference of the previous picture frame;
step S5: packing the changed YUV key data together with the corresponding position information and sending it;
step S6: and sending the changed data to the LED control equipment in a wired/wireless manner and controlling the corresponding LED lamp to make corresponding display atmosphere change.
The further technical scheme of the invention is as follows: the specific step of step S1 is to send the video stream to the HDMI HUB through the HDMI interface, the HDMI HUB outputs the video stream in two paths, one path is output to the external display for video display, and the other path is output to the CPU for video stream decoding.
The further technical scheme of the invention is as follows: the specific step of step S2 is to compress the decoded video streams with different resolutions to a specified size, and sharpen the video stream image after compression is completed, so that the edge portion of the difference is highlighted.
The further technical scheme of the invention is as follows: the step S3 further includes the steps of:
step S31: acquiring all data of a current picture, and distinguishing the atmosphere of the current video stream according to the numerical value of the whole color;
acquiring the whole-picture average value through the formula APV = (Pix(0) + Pix(1) + … + Pix(n))/n, wherein APV is the whole-picture average value, Pix is the pixel value of the corresponding point, the point index starts at 0 and runs from left to right and from top to bottom, and n is the total number of pixels;
obtaining the difference between a coordinate and its surrounding pixels through the formula ΔDPix(a, b) = (Pix(a-1, b) - Pix(a, b)) + (Pix(a, b-1) - Pix(a, b)) + (Pix(a+1, b) - Pix(a, b)) + (Pix(a, b+1) - Pix(a, b)), wherein ΔDPix(a, b) is the difference between the coordinate and its surrounding pixels, a is the abscissa and b is the ordinate;
calculating the relevant proportion of the picture change rate through the formula ΔAPV = (w × h)/Σ(ΔDPix(a, b) < P0), counting over all coordinates from (0, 0) to (w, h), wherein ΔAPV is the relevant proportion of the picture change rate, w is the width, h is the height and P0 is a coefficient;
calculating the overall atmosphere value through the formula CPV = APV × ΔAPV, wherein CPV is the overall atmosphere value;
step S32: acquiring a current frame refresh rate VFRE and recording a related value;
step S33: acquiring the brightest and darkest arrays of the current picture, wherein the array elements include coordinate information;
obtaining the brightest array through the formula WPT = [Pixtop(1), Pixtop(2), …, Pixtop(100)], recording the 100 brightest groups of data, wherein Pixtop is a pixel value with the largest H value among the HSV values of the whole picture, together with its coordinate information, and WPT is the brightest 100 sets of data;
obtaining the darkest array through the formula DPT = [Pixbottom(1), Pixbottom(2), …, Pixbottom(100)], recording the 100 darkest groups of data, wherein Pixbottom is a pixel value with the smallest H value among the HSV values, together with its coordinate information, and DPT is the darkest 100 sets of data values.
The further technical scheme of the invention is as follows: the step S4 further includes the steps of:
step S41: acquiring an array set with a large difference ratio;
obtaining the absolute value of the change between two frames through the formula PFPix(a, b) = |PF(a, b) - CF(a, b)|, wherein PFPix(a, b) is the absolute value of the change between the two frames, PF(a, b) is the pixel value of the previous frame and CF(a, b) is the pixel value of the current frame;
calculating the specified pixel value through the formula ΔRFPix(a, b) = PFPix(a, b) × VFRE × P, wherein ΔRFPix(a, b) is the specified pixel value and P is a change coefficient;
step S42: increasing the display effect in the corresponding coordinates in the array;
calculating the pixel value of the frame data through the formula RPix(a, b) = Pix(a, b) × P1 + ΔRFPix(a, b) × P2, wherein RPix(a, b) is the pixel value of the frame data, a is the abscissa, b is the ordinate, and P1 and P2 are coefficients.
The further technical scheme of the invention is as follows: the step S5 further includes the steps of:
step S51: taking horizontal coordinate data;
obtaining the pixel difference of the frame through the formula ΔHPix(a, b) = Pix(a, b+1) - Pix(a, b), wherein ΔHPix(a, b) is the pixel difference of the frame, a is the abscissa and b is the ordinate;
calculating the average difference of all elements under a specified a value through the formula ΔAHPix = (ΔHPix(a, 0) + … + ΔHPix(a, n))/n, wherein ΔAHPix is the average difference of all elements under the specified a value, P4 and P5 are coefficients, and differences smaller than P4 and P5 are ignored;
obtaining the calculated value of the a coordinate through the formula VH(a) = RPix(a, b) × P1 + ΔAHPix × P2 + Pixtop(0) × P3 + Pixbottom(0) × P4, wherein VH(a) is the calculated value of the a coordinate, and P1, P2, P3 and P4 are correlation coefficients;
step S52: taking ordinate data;
obtaining the pixel difference under the ordinate b through the formula ΔVPix(a, b) = Pix(a+1, b) - Pix(a, b), wherein ΔVPix(a, b) is the pixel difference under the ordinate b, a is the abscissa and b is the ordinate;
calculating the average difference of all elements under a specified b value through the formula ΔAVPix = (ΔVPix(0, b) + … + ΔVPix(n, b))/n, wherein ΔAVPix is the average abscissa difference under the ordinate b, P is a coefficient, and differences smaller than P are ignored;
obtaining the calculated value of the b coordinate through the formula VV(b) = RPix(a, b) × P1 + ΔAVPix × P2 + Pixtop(0) × P3 + Pixbottom(0) × P4, wherein VV(b) is the calculated value of the b coordinate, and P1, P2, P3 and P4 are correlation coefficients;
step S53: taking intermediate data;
obtaining the pixel difference at the middle value through the formula CPix(a, b) = |Pix(a-1, b) - Pix(a, b)| + |Pix(a, b-1) - Pix(a, b)| + |Pix(a+1, b) - Pix(a, b)| + |Pix(a, b+1) - Pix(a, b)|, wherein CPix(a, b) is the pixel difference at the middle value, a is the abscissa and b is the ordinate;
obtaining the intermediate value difference through the formula ΔCVPix = (n × n)/Σ(CPix(a, b) > P), counting over all coordinates from (0, 0) to (n, n), wherein ΔCVPix is the intermediate value difference, P is a coefficient, and differences smaller than P are ignored;
obtaining the calculated value of the middle coordinate through the formula VC(a, b) = RPix(a, b) × P1 + ΔCVPix × P2 + Pixtop(0) × P3 + Pixbottom(0) × P4, wherein VC(a, b) is the calculated value of the middle coordinate, P1, P2, P3 and P4 are correlation coefficients, a is the abscissa and b is the ordinate.
Another object of the present invention is to provide a system for converting video image change data into an LED three-dimensional multicolor effect, the system comprising an HDMI HUB having an HDMI-IN interface and an HDMI-OUT interface, a CPU connected to the HDMI HUB, a controller unit connected to the CPU, and a lamp group unit connected to the controller unit.
The further technical scheme of the invention is as follows: the controller unit comprises a wireless transmitting LED controller and an MCU-LED controller which are connected with the CPU.
The further technical scheme of the invention is as follows: the lamp group unit comprises a wireless lamp group and a wired lamp group, the wireless lamp group is wirelessly connected with the wireless transmitting LED controller, and the wired lamp group is connected with the MCU-LED controller in a wired mode.
The further technical scheme of the invention is as follows: the wireless lamp group comprises a wireless receiving LED controller which is wirelessly connected with the wireless sending LED controller, and a combined lamp, a top lamp, a bottom lamp, a left lamp, a right lamp and a vertical/horizontal middle lamp which are connected with the wireless receiving LED controller; the wired lamp group comprises a combined lamp, a top lamp, a bottom lamp, a left lamp, a right lamp and a vertical/horizontal middle lamp which are connected with the MCU-LED controller.
The invention has the beneficial effects that: through the method and system for converting video image change data into an LED three-dimensional multicolor effect, home atmosphere lighting scenes become richer and are linked with the content. The display of televisions, projectors and similar devices is extended to the surrounding space, so the picture gains more derivative effects and viewing is improved. With content playback coordinated over wireless transmission, viewing becomes more immersive, and more surrounding atmosphere is rendered than by the single picture of an existing cinema; in some scenes the immersion is more three-dimensional. Algorithms are used to synchronize the three-dimensional atmosphere data, lamps can be connected to the same system in a wireless or wired manner, and users can DIY more scene effects of their own.
Drawings
Fig. 1 is a flow diagram of a method for converting video image change data into an LED three-dimensional multicolor effect according to an embodiment of the present invention;
Fig. 2 is a block diagram of a system for converting video image change data into an LED three-dimensional multicolor effect according to an embodiment of the invention.
Detailed Description
Fig. 1 shows a method for converting video image change data into an LED three-dimensional multicolor effect, which comprises the following steps:
step S1: acquiring video stream information from an HDMI (high-definition multimedia interface); specifically, the video stream is sent to the HDMI HUB through the HDMI interface, and the HDMI HUB outputs it in two paths: one to an external display for video display, and the other to the CPU for video stream decoding.
Step S2: re-encoding the video stream to a fixed resolution; the decoded video streams of different resolutions are compressed to a specified size, and the image is sharpened after compression so that the differing edge portions stand out.
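As an illustration only (the patent does not specify the implementation), the compress-then-sharpen stage of step S2 can be sketched in plain Python on a single luma plane; the box average-pooling and the 3x3 sharpening kernel used here are assumptions:

```python
# Illustrative sketch of step S2: box-downsample a luma frame (list of
# rows of 0..255 ints) to a fixed grid, then apply a simple 3x3
# sharpening kernel so edges stand out before frame comparison.

def downscale(frame, out_w, out_h):
    """Average-pool a 2D luma frame down to out_w x out_h."""
    in_h, in_w = len(frame), len(frame[0])
    out = []
    for j in range(out_h):
        row = []
        y0 = j * in_h // out_h
        y1 = max((j + 1) * in_h // out_h, y0 + 1)
        for i in range(out_w):
            x0 = i * in_w // out_w
            x1 = max((i + 1) * in_w // out_w, x0 + 1)
            block = [frame[y][x] for y in range(y0, y1) for x in range(x0, x1)]
            row.append(sum(block) // len(block))
        out.append(row)
    return out

def sharpen(frame):
    """3x3 Laplacian-style sharpen; border pixels are left unchanged."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            v = (5 * frame[y][x] - frame[y - 1][x] - frame[y + 1][x]
                 - frame[y][x - 1] - frame[y][x + 1])
            out[y][x] = min(255, max(0, v))  # clamp to valid luma range
    return out
```

A flat frame passes through the sharpener unchanged, while isolated bright pixels are amplified, which is the "highlight the differing edge portions" behaviour described above.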
Step S3: comparing the frame change in the corresponding range according to the refresh rate of the video stream;
the step S3 further includes the steps of:
step S31: acquiring all data of a current picture, and distinguishing the atmosphere of the current video stream according to the numerical value of the whole color;
acquiring the whole-picture average value through the formula APV = (Pix(0) + Pix(1) + … + Pix(n))/n, wherein APV is the whole-picture average value, Pix is the pixel value of the corresponding point, the point index starts at 0 and runs from left to right and from top to bottom, and n is the total number of pixels;
obtaining the difference between a coordinate and its surrounding pixels through the formula ΔDPix(a, b) = (Pix(a-1, b) - Pix(a, b)) + (Pix(a, b-1) - Pix(a, b)) + (Pix(a+1, b) - Pix(a, b)) + (Pix(a, b+1) - Pix(a, b)), wherein ΔDPix(a, b) is the difference between the coordinate and its surrounding pixels, a is the abscissa and b is the ordinate;
calculating the relevant proportion of the picture change rate through the formula ΔAPV = (w × h)/Σ(ΔDPix(a, b) < P0), counting over all coordinates from (0, 0) to (w, h), wherein ΔAPV is the relevant proportion of the picture change rate, w is the width, h is the height and P0 is a coefficient;
calculating the overall atmosphere value through the formula CPV = APV × ΔAPV, wherein CPV is the overall atmosphere value;
step S32: acquiring a current frame refresh rate VFRE and recording a related value;
step S33: acquiring the brightest and darkest arrays of the current picture, wherein the array elements include coordinate information;
obtaining the brightest array through the formula WPT = [Pixtop(1), Pixtop(2), …, Pixtop(100)], recording the 100 brightest groups of data, wherein Pixtop is a pixel value with the largest H value among the HSV values of the whole picture, together with its coordinate information, and WPT is the brightest 100 sets of data;
obtaining the darkest array through the formula DPT = [Pixbottom(1), Pixbottom(2), …, Pixbottom(100)], recording the 100 darkest groups of data, wherein Pixbottom is a pixel value with the smallest H value among the HSV values, together with its coordinate information, and DPT is the darkest 100 sets of data values.
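The step S31/S33 statistics can be sketched on a tiny luma frame, under one reading of the formulas: apv is the whole-picture average, dpix the four-neighbour difference, delta_apv the total pixel count divided by the count of interior pixels whose difference stays below P0, and cpv the product APV × ΔAPV; brightest_darkest stands in for the WPT/DPT arrays (with k groups instead of 100). The function names, the interior-only scan and the product form of CPV are assumptions, not the patented implementation:

```python
# Illustrative per-frame statistics for step S3 on a 2D luma frame.

def apv(frame):
    """Whole-picture average pixel value (APV)."""
    return sum(sum(row) for row in frame) / (len(frame) * len(frame[0]))

def dpix(frame, a, b):
    """Signed difference of pixel (a, b) against its four neighbours."""
    return sum(frame[b + db][a + da] - frame[b][a]
               for da, db in ((-1, 0), (1, 0), (0, -1), (0, 1)))

def delta_apv(frame, p0):
    """w*h divided by the count of low-difference interior pixels."""
    h, w = len(frame), len(frame[0])
    quiet = sum(1 for b in range(1, h - 1) for a in range(1, w - 1)
                if abs(dpix(frame, a, b)) < p0)
    return (w * h) / quiet if quiet else float(w * h)

def cpv(frame, p0):
    """Overall atmosphere value, assumed here to be APV * delta-APV."""
    return apv(frame) * delta_apv(frame, p0)

def brightest_darkest(frame, k=3):
    """Top-k / bottom-k pixel values with coordinates (WPT / DPT analogue)."""
    pts = sorted((frame[b][a], a, b)
                 for b in range(len(frame)) for a in range(len(frame[0])))
    return pts[-k:][::-1], pts[:k]
```

Each entry of the returned arrays carries its coordinates, matching the requirement above that the brightest/darkest records contain coordinate information.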
Step S4: comparing the acquired data difference with the data difference of the previous picture frame;
the step S4 further includes the steps of:
step S41: acquiring an array set with a large difference ratio;
obtaining the absolute value of the change between two frames through the formula PFPix(a, b) = |PF(a, b) - CF(a, b)|, wherein PFPix(a, b) is the absolute value of the change between the two frames, PF(a, b) is the pixel value of the previous frame and CF(a, b) is the pixel value of the current frame;
calculating the specified pixel value through the formula ΔRFPix(a, b) = PFPix(a, b) × VFRE × P, wherein ΔRFPix(a, b) is the specified pixel value and P is a change coefficient;
step S42: increasing the display effect in the corresponding coordinates in the array;
calculating the pixel value of the frame data through the formula RPix(a, b) = Pix(a, b) × P1 + ΔRFPix(a, b) × P2, wherein RPix(a, b) is the pixel value of the frame data, a is the abscissa, b is the ordinate, and P1 and P2 are coefficients, which are mainly used to adjust the difference of local display colors.
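The step S4 blending can be sketched as follows; the numeric values of the change coefficient p and the blend coefficients p1, p2 are illustrative defaults, since the patent does not specify them:

```python
# Sketch of step S4: weight each pixel by how much it changed between
# frames. pf/cf are previous/current luma frames, vfre the refresh rate.

def pfpix(pf, cf, a, b):
    """Absolute change of pixel (a, b) between two frames (PFPix)."""
    return abs(pf[b][a] - cf[b][a])

def rfpix(pf, cf, a, b, vfre, p):
    """Change scaled by refresh rate and change coefficient (delta-RFPix)."""
    return pfpix(pf, cf, a, b) * vfre * p

def rpix(pf, cf, a, b, vfre, p=0.01, p1=0.7, p2=0.3):
    """Blend the current pixel with its scaled frame-to-frame change (RPix)."""
    return cf[b][a] * p1 + rfpix(pf, cf, a, b, vfre, p) * p2
```

A fast-changing pixel thus contributes more to the output value, which is what lets the lamps react more strongly to moving picture regions.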
Step S5: packing the changed YUV key data together with the corresponding position information and sending it;
the step S5 further includes the steps of:
step S51: taking horizontal coordinate data;
obtaining the pixel difference of the frame through the formula ΔHPix(a, b) = Pix(a, b+1) - Pix(a, b), wherein ΔHPix(a, b) is the pixel difference of the frame, a is the abscissa and b is the ordinate;
calculating the average difference of all elements under a specified a value through the formula ΔAHPix = (ΔHPix(a, 0) + … + ΔHPix(a, n))/n, wherein ΔAHPix is the average difference of all elements under the specified a value, P4 and P5 are coefficients, and differences smaller than P4 and P5 are ignored; this mainly obtains all ordinate differences under the abscissa a;
obtaining the calculated value of the a coordinate through the formula VH(a) = RPix(a, b) × P1 + ΔAHPix × P2 + Pixtop(0) × P3 + Pixbottom(0) × P4, wherein VH(a) is the calculated value of the a coordinate, and P1, P2, P3 and P4 are correlation coefficients; it is mainly used as the abscissa data sent to the LED control device.
Step S52: taking ordinate data;
obtaining the pixel difference under the ordinate b through the formula ΔVPix(a, b) = Pix(a+1, b) - Pix(a, b), wherein ΔVPix(a, b) is the pixel difference under the ordinate b, a is the abscissa and b is the ordinate;
calculating the average difference of all elements under a specified b value through the formula ΔAVPix = (ΔVPix(0, b) + … + ΔVPix(n, b))/n, wherein ΔAVPix is the average abscissa difference under the ordinate b, P is a coefficient, and differences smaller than P are ignored; this mainly obtains all abscissa differences under the ordinate b;
obtaining the calculated value of the b coordinate through the formula VV(b) = RPix(a, b) × P1 + ΔAVPix × P2 + Pixtop(0) × P3 + Pixbottom(0) × P4, wherein VV(b) is the calculated value of the b coordinate, and P1, P2, P3 and P4 are correlation coefficients; it is mainly used as the ordinate data sent to the LED control device.
Step S53: taking intermediate data;
obtaining the pixel difference at the middle value through the formula CPix(a, b) = |Pix(a-1, b) - Pix(a, b)| + |Pix(a, b-1) - Pix(a, b)| + |Pix(a+1, b) - Pix(a, b)| + |Pix(a, b+1) - Pix(a, b)|, wherein CPix(a, b) is the pixel difference at the middle value, a is the abscissa and b is the ordinate;
obtaining the intermediate value difference through the formula ΔCVPix = (n × n)/Σ(CPix(a, b) > P), counting over all coordinates from (0, 0) to (n, n), wherein ΔCVPix is the intermediate value difference, P is a coefficient, and differences smaller than P are ignored; this mainly obtains the intermediate value difference;
obtaining the calculated value of the middle coordinate through the formula VC(a, b) = RPix(a, b) × P1 + ΔCVPix × P2 + Pixtop(0) × P3 + Pixbottom(0) × P4, wherein VC(a, b) is the calculated value of the middle coordinate, P1, P2, P3 and P4 are correlation coefficients, a is the abscissa and b is the ordinate; it is mainly used as the middle-coordinate data sent to the LED control device.
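One possible reading of the step S51 horizontal-coordinate computation, sketched for a single column: the directional pixel differences down the column are averaged (values below a threshold ignored) and blended with the zone pixel and the brightest/darkest reference values. The threshold and the coefficients p1…p4 are placeholders, not values from the patent:

```python
# Sketch of one step-S5 per-zone value (the VH(a) analogue).

def hdiffs(frame, a):
    """Vertical-neighbour differences down column a (delta-HPix series)."""
    return [frame[b + 1][a] - frame[b][a] for b in range(len(frame) - 1)]

def avg_above(diffs, thresh):
    """Average magnitude of differences above thresh; 0.0 when none qualify."""
    kept = [abs(d) for d in diffs if abs(d) > thresh]
    return sum(kept) / len(kept) if kept else 0.0

def vh(frame, a, b, top, bottom, thresh=2, p=(0.4, 0.3, 0.2, 0.1)):
    """Blend zone pixel, column activity and bright/dark references into VH(a)."""
    p1, p2, p3, p4 = p
    return (frame[b][a] * p1 + avg_above(hdiffs(frame, a), thresh) * p2
            + top * p3 + bottom * p4)
```

The VV(b) and VC(a, b) values above follow the same pattern with row differences and four-neighbour differences respectively, so they are omitted here.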
Step S6: and sending the changed data to the LED control equipment in a wired/wireless manner and controlling the corresponding LED lamp to make corresponding display atmosphere change.
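The patent does not define a wire format for step S6, so the following packing-and-sending sketch is hypothetical: each changed zone's YUV key bytes are packed behind a magic byte and a zone id, and sent as a UDP datagram to the LED controller. The packet layout, host address and port are all assumptions:

```python
# Hypothetical step-S6 transport: pack changed zone data, send via UDP.
import socket
import struct

MAGIC = 0xA5  # assumed frame marker so the controller can resynchronise

def pack_zone(zone_id, y, u, v):
    """One update record: magic byte, zone id, then the YUV key bytes."""
    return struct.pack("BBBBB", MAGIC, zone_id, y, u, v)

def send_updates(updates, host="192.168.1.50", port=7777):
    """Send each (zone_id, y, u, v) update as its own UDP datagram."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for zone_id, y, u, v in updates:
            sock.sendto(pack_zone(zone_id, y, u, v), (host, port))
```

Sending only changed zones keeps the datagrams small, which matches the method's idea of transmitting differences rather than whole frames.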
Fig. 2 shows a system for converting video image change data into an LED three-dimensional multicolor effect. The system comprises an HDMI HUB having an HDMI-IN interface and an HDMI-OUT interface, a CPU connected to the HDMI HUB, a controller unit connected to the CPU, and a lamp group unit connected to the controller unit. The video signal enters the HDMI HUB through the HDMI-IN interface and is passed to the television for playback through the HDMI-OUT interface. In addition, the video signal is passed to the CPU for processing; the processed signal is sent to the controller unit, which drives the lamp group unit to emit light, so that the lamp group unit creates an atmosphere effect that changes in step with the atmosphere of the video and is linked with the picture.
The controller unit comprises a wireless transmitting LED controller and an MCU-LED controller, both connected with the CPU. The controller unit thus comes in a wired type and a wireless type; either may be selected according to the requirements and environment, or both may be used together or in combination.
The lamp group unit comprises a wireless lamp group and a wired lamp group; the wireless lamp group is wirelessly connected with the wireless transmitting LED controller, and the wired lamp group is connected with the MCU-LED controller by wire. As with the controller unit, the appropriate lamp group can be selected according to the requirements.
The wireless lamp group comprises a wireless receiving LED controller wirelessly connected with the wireless transmitting LED controller, together with a combined lamp, a top lamp, a bottom lamp, a left lamp, a right lamp and a vertical/horizontal middle lamp connected with the wireless receiving LED controller; the wired lamp group comprises the same set of lamps connected with the MCU-LED controller. The advantage of the wireless lamp group is that no wiring is needed and installation is convenient, though the wirelessly received signal can be unstable; the wired lamp group receives a stable signal but is more troublesome to wire. Each has its own advantages.
Through the method and system for converting video image change data into an LED three-dimensional multicolor effect, home atmosphere lighting scenes become richer and are linked with the content. The display of televisions, projectors and similar devices is extended to the surrounding space, so the picture gains more derivative effects and viewing is improved. With content playback coordinated over wireless transmission, viewing becomes more immersive, and more surrounding atmosphere is rendered than by the single picture of an existing cinema; in some scenes the immersion is more three-dimensional. Algorithms are used to synchronize the three-dimensional atmosphere data, lamps can be connected to the same system in a wireless or wired manner, and users can DIY more scene effects of their own.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (10)

1. A method for converting video image change data into an LED three-dimensional multicolor effect, the method comprising the steps of:
step S1: acquiring video stream information from an HDMI (high-definition multimedia interface);
step S2: re-encoding the video stream to a fixed size resolution;
step S3: comparing the frame change in the corresponding range according to the refresh rate of the video stream;
step S4: comparing the acquired data difference with the data difference of the previous picture frame;
step S5: packing the changed YUV key data together with the corresponding position information and sending it;
step S6: and sending the changed data to the LED control equipment in a wired/wireless manner and controlling the corresponding LED lamp to make corresponding display atmosphere change.
2. The method according to claim 1, wherein the specific step of step S1 is to send the video stream to an HDMI HUB via an HDMI interface, the HDMI HUB sends the video stream in two paths, one path is sent to an external display for video display, and the other path is sent to a CPU for video stream decoding.
3. The method according to claim 2, wherein the step S2 is specifically performed by compressing the decoded video streams with different resolutions to a specified size, and sharpening the video stream image after the compression is completed to highlight the edge portion of the difference.
4. The method according to claim 3, wherein the step S3 further comprises the steps of:
step S31: acquiring all data of a current picture, and distinguishing the atmosphere of the current video stream according to the numerical value of the whole color;
acquiring the whole-picture average value through the formula APV = (Pix(0) + Pix(1) + … + Pix(n))/n, wherein APV is the whole-picture average value, Pix is the pixel value of the corresponding point, the point index starts at 0 and runs from left to right and from top to bottom, and n is the total number of pixels;
obtaining the difference between a coordinate and its surrounding pixels through the formula ΔDPix(a, b) = (Pix(a-1, b) - Pix(a, b)) + (Pix(a, b-1) - Pix(a, b)) + (Pix(a+1, b) - Pix(a, b)) + (Pix(a, b+1) - Pix(a, b)), wherein ΔDPix(a, b) is the difference between the coordinate and its surrounding pixels, a is the abscissa and b is the ordinate;
calculating the relevant proportion of the picture change rate through the formula ΔAPV = (w × h)/Σ(ΔDPix(a, b) < P0), counting over all coordinates from (0, 0) to (w, h), wherein ΔAPV is the relevant proportion of the picture change rate, w is the width, h is the height and P0 is a coefficient;
calculating the overall atmosphere value through the formula CPV = APV × ΔAPV, wherein CPV is the overall atmosphere value;
step S32: acquiring the current frame refresh rate VFRE and recording the related value;
step S33: acquiring the brightest and darkest arrays of the current picture, the arrays containing ordered relations;
acquiring the brightest array by the formula WPT = [Pixtop(1), Pixtop(2), …, Pixtop(100)], recording the brightest 100 groups of data, wherein Pixtop is the pixel value with the largest H value among the HSV values of the whole picture and contains coordinate information, and WPT is the brightest 100 sets of data;
acquiring the darkest array by the formula DPT = [Pixbottom(1), Pixbottom(2), …, Pixbottom(100)], recording the darkest 100 groups of data, wherein Pixbottom is the pixel value with the smallest H value among the HSV values and contains coordinate information, and DPT is the darkest 100 sets of data.
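For illustration only, the atmosphere metrics of claim 4 (APV, ΔDPix, ΔAPV, CPV) can be sketched as follows, assuming a 2-D grayscale frame; the edge-replication padding and the reconstructed relation CPV = APV × ΔAPV are assumptions:

```python
import numpy as np

def atmosphere_values(frame, p0=10.0):
    """Sketch of claim 4's atmosphere metrics on a 2-D grayscale frame.

    APV   - average pixel value of the whole picture
    dDPix - per-pixel difference against the four neighbours
    dAPV  - ratio of total pixels to pixels whose difference stays below p0
    CPV   - overall atmosphere value (APV scaled by dAPV, reconstructed)
    """
    f = frame.astype(np.float64)
    h, w = f.shape
    apv = f.mean()  # APV = (Pix(0) + ... + Pix(n)) / n

    # Neighbour differences; edges are replicated so every pixel has four
    # neighbours (an assumption -- the claim is silent on edge handling).
    pad = np.pad(f, 1, mode="edge")
    ddpix = ((pad[:-2, 1:-1] - f) + (pad[1:-1, :-2] - f)
             + (pad[2:, 1:-1] - f) + (pad[1:-1, 2:] - f))

    low_change = np.count_nonzero(np.abs(ddpix) < p0)
    dapv = (w * h) / max(low_change, 1)  # guard against division by zero
    cpv = apv * dapv
    return apv, dapv, cpv
```

A static frame yields ΔAPV = 1, so CPV reduces to the plain average; fast-changing content inflates CPV.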
5. The method according to claim 4, wherein the step S4 further comprises the steps of:
step S41: acquiring the array set with a large difference ratio;
obtaining the absolute value of the change between two frames by the formula PFPix(a, b) = |PF(a, b) - CF(a, b)|, wherein PFPix(a, b) is the absolute value of the change between the two frames, PF(a, b) is the pixel value of the previous frame, and CF(a, b) is the pixel value of the current frame;
calculating the specified pixel value by the formula ΔRFPix(a, b) = PFPix(a, b) × VFRE × P, wherein ΔRFPix(a, b) is the specified pixel value and P is a change coefficient;
step S42: enhancing the display effect at the corresponding coordinates in the array;
calculating the pixel value of the frame data by the formula RPix(a, b) = Pix(a, b) × P1 + ΔRFPix(a, b) × P2, wherein RPix(a, b) is the pixel value of the frame data, a is the abscissa, b is the ordinate, and P1 and P2 are coefficients.
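Illustratively, the inter-frame boost of claim 5 can be written in a few lines of NumPy; the default coefficient values, and the use of the current frame as Pix(a, b) in the final blend, are assumptions:

```python
import numpy as np

def highlight_changes(prev, cur, vfre, p=0.25, p1=0.5, p2=0.5):
    """Sketch of claim 5: boost pixels that changed between two frames.

    vfre is the frame refresh rate recorded in step S32; p is the change
    coefficient and p1/p2 the blend coefficients (illustrative values).
    """
    pf = prev.astype(np.float64)
    cf = cur.astype(np.float64)
    pfpix = np.abs(pf - cf)      # PFPix(a,b) = |PF(a,b) - CF(a,b)|
    rfpix = pfpix * vfre * p     # dRFPix(a,b) = PFPix(a,b) * VFRE * P
    # RPix(a,b) = Pix(a,b) * P1 + dRFPix(a,b) * P2
    return cf * p1 + rfpix * p2
```

Unchanged regions pass through scaled by P1, while regions that moved between frames receive an extra VFRE-weighted contribution.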
6. The method according to claim 5, wherein the step S5 further comprises the steps of:
step S51: taking abscissa data;
obtaining the pixel difference of the frame under the abscissa (a) by the formula ΔHPix(a, b) = Pix(a, b+1) - Pix(a, b), wherein ΔHPix(a, b) is the pixel difference of the frame, a is the abscissa, and b is the ordinate;
calculating the average difference of all elements under the specified a value by the formula ΔAHPix = n/Σ(ΔHPix(a, 0) > P4, …, ΔHPix(a, n) > P5), wherein ΔAHPix is the average difference of all elements under the specified a value, P4 and P5 are coefficients, and differences smaller than the coefficients P4 and P5 are ignored;
obtaining the calculated value of the a coordinate by the formula VH(a) = RPix(a, b) × P1 + ΔAHPix × P2 + Pixtop(0) × P3 + Pixbottom(0) × P4, wherein VH(a) is the calculated value of the a coordinate and P1, P2, P3 and P4 are correlation coefficients;
step S52: taking ordinate data;
obtaining the pixel difference under the ordinate (b) by the formula ΔVPix(a, b) = Pix(a+1, b) - Pix(a, b), wherein ΔVPix(a, b) is the pixel difference under the ordinate (b), a is the abscissa, and b is the ordinate;
calculating all abscissa differences under the ordinate by the formula ΔAVPix = n/Σ(ΔVPix(0, b) > P, …, ΔVPix(n, b) > P), wherein ΔAVPix is the set of all abscissa differences under the ordinate, P is a coefficient, and differences smaller than the coefficient P are ignored;
obtaining the calculated value of the ordinate (b) by the formula VV(b) = RPix(a, b) × P1 + ΔAVPix × P2 + Pixtop(0) × P3 + Pixbottom(0) × P4, wherein VV(b) is the calculated value of the ordinate (b) and P1, P2, P3 and P4 are correlation coefficients;
step S53: taking middle data;
obtaining the pixel difference at the middle value by the formula ΔCPix(a, b) = |Pix(a-1, b) - Pix(a, b)| + |Pix(a, b-1) - Pix(a, b)| + |Pix(a+1, b) - Pix(a, b)| + |Pix(a, b+1) - Pix(a, b)|, wherein ΔCPix(a, b) is the pixel difference at the middle value, a is the abscissa, and b is the ordinate;
obtaining the middle-value difference by the formula ΔCVPix = (n × n)/Σ(ΔCPix(0, 0) > P, …, ΔCPix(n, n) > P), wherein ΔCVPix is the middle-value difference, P is a coefficient, and differences smaller than the coefficient P are ignored;
obtaining the calculated value of the middle coordinate by the formula VC(a, b) = RPix(a, b) × P1 + ΔCVPix × P2 + Pixtop(0) × P3 + Pixbottom(0) × P4, wherein VC(a, b) is the calculated value of the middle coordinate, P1, P2, P3 and P4 are correlation coefficients, a is the abscissa, and b is the ordinate.
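For illustration only, step S51 of claim 6 (the per-abscissa value VH(a)) might be sketched as below. The array layout `rpix[b, a]`, the use of the column mean for the RPix term, and the single threshold `p4` standing in for both P4 and P5 are assumptions:

```python
import numpy as np

def column_value(rpix, a, pixtop0, pixbottom0,
                 coeffs=(0.25, 0.25, 0.25, 0.25), p4=5.0):
    """Sketch of claim 6 step S51: the calculated value VH(a).

    rpix is the frame produced by claim 5; pixtop0 / pixbottom0 are the
    brightest / darkest reference pixels from claim 4's WPT / DPT arrays.
    All coefficient values are illustrative.
    """
    # Column at abscissa a, assuming the array is indexed [b, a]
    # (row = ordinate) -- an assumption about layout.
    col = rpix[:, a].astype(np.float64)
    # dHPix(a, b) = Pix(a, b+1) - Pix(a, b): difference along the column
    dh = np.diff(col)
    kept = np.count_nonzero(np.abs(dh) > p4)  # differences below P4 ignored
    dahpix = len(dh) / max(kept, 1)           # dAHPix for this column
    p1, p2, p3, p4c = coeffs
    # VH(a) = RPix * P1 + dAHPix * P2 + Pixtop(0) * P3 + Pixbottom(0) * P4
    return col.mean() * p1 + dahpix * p2 + pixtop0 * p3 + pixbottom0 * p4c
```

Steps S52 and S53 follow the same pattern along the other axis and around the centre, so one helper per direction covers the whole of claim 6.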
7. The system of any one of claims 1-6, wherein the system comprises an HDMI-HUB with an HDMI-IN interface and an HDMI-OUT interface, a CPU connected to the HDMI-HUB, a controller unit connected to the CPU, and a light group unit connected to the controller unit.
8. The system of claim 7, wherein the controller unit comprises a wireless transmitting LED controller and an MCU-LED controller, both connected to the CPU.
9. The system of claim 8, wherein the light group unit comprises a wireless light group wirelessly connected to the wireless transmitting LED controller and a wired light group connected to the MCU-LED controller by wire.
10. The system of claim 9, wherein the wireless light set comprises a wireless receiving LED controller wirelessly connected to the wireless transmitting LED controller, and a combination light, a top light, a bottom light, a left light, a right light, and a vertical/horizontal middle light connected to the wireless receiving LED controller; the wired lamp group comprises a combined lamp, a top lamp, a bottom lamp, a left lamp, a right lamp and a vertical/horizontal middle lamp which are connected with the MCU-LED controller.
CN202111347432.6A 2021-11-15 2021-11-15 Method and system for converting video image change data into LED stereoscopic multicolor effect Active CN114040535B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202111347432.6A CN114040535B (en) 2021-11-15 2021-11-15 Method and system for converting video image change data into LED stereoscopic multicolor effect
PCT/CN2021/133943 WO2023082361A1 (en) 2021-11-15 2021-12-31 Method and system for converting video image change data into led three-dimensional magic color effect

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111347432.6A CN114040535B (en) 2021-11-15 2021-11-15 Method and system for converting video image change data into LED stereoscopic multicolor effect

Publications (2)

Publication Number Publication Date
CN114040535A true CN114040535A (en) 2022-02-11
CN114040535B CN114040535B (en) 2024-02-09

Family

ID=80144360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111347432.6A Active CN114040535B (en) 2021-11-15 2021-11-15 Method and system for converting video image change data into LED stereoscopic multicolor effect

Country Status (2)

Country Link
CN (1) CN114040535B (en)
WO (1) WO2023082361A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010092692A (en) * 2008-10-07 2010-04-22 Sharp Corp Lighting device
CN102104057A (en) * 2009-12-18 2011-06-22 精工爱普生株式会社 Display device
CN105632412A (en) * 2015-01-20 2016-06-01 常州市武进区半导体照明应用技术研究院 Method, apparatus and system capable of providing background light display and synchronous video playing
CN105872748A (en) * 2015-12-07 2016-08-17 乐视网信息技术(北京)股份有限公司 Lamplight adjusting method and device based on video parameter
CN111683439A (en) * 2020-06-08 2020-09-18 Tcl华星光电技术有限公司 Control method of color light strip and display device
CN111787671A (en) * 2020-07-15 2020-10-16 江门市征极光兆科技有限公司 Control method based on movie and television picture synchronous light atmosphere
CN113597061A (en) * 2021-07-16 2021-11-02 深圳市传视界电子科技有限公司 Method, apparatus and computer readable storage medium for controlling a magic color light strip

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8928811B2 (en) * 2012-10-17 2015-01-06 Sony Corporation Methods and systems for generating ambient light effects based on video content
CN106658134A (en) * 2016-10-28 2017-05-10 青岛海信电器股份有限公司 Time synchronization method for ambient light television and ambient light television
WO2019233800A1 (en) * 2018-06-08 2019-12-12 Signify Holding B.V. Adjusting parameters of light effects specified in a light script
CN108933961B (en) * 2018-06-26 2021-03-23 深圳市韵阳科技有限公司 Method and system for controlling LED color development according to image edge data
WO2022121114A1 (en) * 2020-12-11 2022-06-16 萤火虫(深圳)灯光科技有限公司 Lighting module control method, lighting module, electronic device, and storage medium
CN113613370B (en) * 2021-08-30 2024-03-19 江苏惠通集团有限责任公司 Atmosphere lamp control method and device, computer readable storage medium and terminal


Also Published As

Publication number Publication date
WO2023082361A1 (en) 2023-05-19
CN114040535B (en) 2024-02-09

Similar Documents

Publication Publication Date Title
JP5992997B2 (en) Method and apparatus for generating a video encoded signal
US11183143B2 (en) Transitioning between video priority and graphics priority
US9161023B2 (en) Method and system for response time compensation for 3D video processing
KR101972748B1 (en) Apparatus and method for dynamic range transforming of images
KR102061349B1 (en) High dynamic range image signal generation and processing
US9277196B2 (en) Systems and methods for backward compatible high dynamic range/wide color gamut video coding and rendering
US10368105B2 (en) Metadata describing nominal lighting conditions of a reference viewing environment for video playback
US11032579B2 (en) Method and a device for encoding a high dynamic range picture, corresponding decoding method and decoding device
EP2819414A2 (en) Image processing device and image processing method
KR102176398B1 (en) A image processing device and a image processing method
KR20160070720A (en) Decoding device and decoding method, and coding device and coding method
US9161030B1 (en) Graphics overlay system for multiple displays using compressed video
CN107197266B (en) HDR video coding method
US9053752B1 (en) Architecture for multiple graphics planes
US8483389B1 (en) Graphics overlay system for multiple displays using compressed video
US20130258051A1 (en) Apparatus and method for processing 3d video data
WO2008122184A1 (en) Video device and method for adjusting the video image characteristics
CN111406404B (en) Compression method, decompression method, system and storage medium for obtaining video file
CN114040535A (en) Method and system for converting video image change data into LED three-dimensional multicolor effect
CN116389794A (en) Techniques for enabling ultra high definition alliance specified reference mode (UHDA-SRM)
KR20040036116A (en) Controlling method and apparatus for transmitting moving picture color by using color space
US20220201159A1 (en) Image output device, image receiving device, and image transmission method
Boitard et al. Evaluation of color pixel representations for high dynamic range digital cinema
GB2539760A (en) HDR compression method and application to viewers

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant