CN113727501B - Sound-based light dynamic control method, device, system and storage medium - Google Patents

Sound-based light dynamic control method, device, system and storage medium

Info

Publication number
CN113727501B
CN113727501B
Authority
CN
China
Prior art keywords
color
value
sound
primary color
primary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110820192.0A
Other languages
Chinese (zh)
Other versions
CN113727501A (en)
Inventor
杨伟展
魏彬
朱奕光
谢姜
张良良
曾滔滔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Foshan Electrical and Lighting Co Ltd
Original Assignee
Foshan Electrical and Lighting Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Foshan Electrical and Lighting Co Ltd filed Critical Foshan Electrical and Lighting Co Ltd
Priority to CN202110820192.0A priority Critical patent/CN113727501B/en
Publication of CN113727501A publication Critical patent/CN113727501A/en
Application granted granted Critical
Publication of CN113727501B publication Critical patent/CN113727501B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • H05B47/115Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B47/12Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by detecting audible sound
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Spectrometry And Color Measurement (AREA)

Abstract

The invention discloses a sound-based light dynamic control method, which comprises the following steps: acquiring sound data; converting the sound data into a spectrum signal, and performing region-based color processing on the spectrum signal to generate target base color values; converting the sound data into a brightness value; generating color control parameters according to the brightness value and the target base color values; and outputting the color control parameters to each pixel point of the light group in sequence, so that the lights of the light group are displayed dynamically in a running-water (flowing) mode. The invention also discloses a computer device, a system and a computer-readable storage medium. By combining sound with color and brightness so that both change simultaneously with the sound, the invention can effectively reflect the characteristics of the sound, create a good atmosphere and improve the user experience.

Description

Sound-based light dynamic control method, device, system and storage medium
Technical Field
The present invention relates to the field of light control technologies, and in particular, to a sound-based light dynamic control method, a computer device, a light dynamic control system, and a computer-readable storage medium.
Background
As living standards improve, people's demand for entertainment activities such as concerts, large-scale stage plays, music exhibition halls and musical fountain squares keeps growing, and stage-art workers mostly adopt colorful lighting effects to strengthen the performance and increase its artistic appeal.
However, existing lighting effects are all built from decorative lamps with fixed effects; the light effects displayed in this way are fixed in type, single in color and fixed in their changing pattern, cannot be matched with external sounds, and are therefore limited in the scenes they can present and weak in entertainment value.
For example, patent CN104244529A discloses a sound-induction beat lighting system, which can design special-effect tags for the corresponding time nodes of each piece of music according to its beat and activate the corresponding tags at those time nodes while the music is playing, so as to control the display combination, brightness and color of a lighting display unit; when a specific prop sound is detected, the overall brightness of the light effect can be increased; and when a specific voice is detected, a preset sequence of light effects can be activated. However, the system can only process preset music, for which tags and special effects must be configured in advance; the form of the light effect is fixed, the light cannot be adjusted in real time for arbitrary sounds, and the system is therefore weak in flexibility and single in effect.
Disclosure of Invention
The invention aims to solve the technical problem of providing a sound-based lamplight dynamic control method, computer equipment, a system and a computer-readable storage medium, which can combine sound with color and brightness to form the effect that the color and the brightness of lamplight change simultaneously with the sound.
In order to solve the technical problems, the invention provides a sound-based lamplight dynamic control method, which comprises the following steps: acquiring sound data; converting the sound data into a frequency spectrum signal, and carrying out color processing on the frequency spectrum signal in a subarea mode to generate a target base color value; converting the sound data into a brightness value; generating color control parameters according to the brightness value and the target primary color value; and outputting the color control parameters to each pixel point of the light group in sequence so as to enable the light of the light group to be dynamically displayed in a running water mode.
As an improvement of the above solution, the step of performing color processing on the spectral signal sub-areas to generate a target base color value includes: dividing the spectrum signal into at least two primary color areas and calculating a primary color value of each primary color area, wherein the primary color areas correspond to the primary color values one by one; performing color transition processing on each primary color value respectively; respectively carrying out color correction processing on each primary color value after the color transition processing; and respectively carrying out color compensation processing on each primary color value after the color correction processing.
As an improvement of the above solution, when the spectrum signal is divided into at least three primary color areas, the step of performing color processing on the spectrum signal divided areas to generate target primary color values further includes: and respectively performing color highlighting processing on each primary color value after the color compensation processing.
As an improvement of the above-described scheme, the step of dividing the spectrum signal into at least two primary color regions and calculating the primary color value of each primary color region includes: dividing the spectrum signal into at least two primary color areas according to frequency values; the frequency correspondence values in each base region are summed separately to generate a base color value.
As an improvement of the above solution, the step of performing color transition processing on each of the primary color values includes: extracting a current base color value and N corresponding historical base color values, wherein the current base color value is a base color value of the sound data obtained by calculation at the current moment, the N historical base color values are base color values of the sound data obtained by calculation for the previous N times at the current moment, one current base color value corresponds to the N historical base color values, and N is a positive integer; and calculating the average value of the current base color value and the corresponding N historical base color values, and taking the average value as the base color value after color transition processing.
As an improvement of the above-described aspect, the step of performing color correction processing on each of the primary color values after the color transition processing includes: and multiplying the primary color value after the color transition treatment with a corresponding preset correction coefficient to carry out color correction, wherein the preset correction coefficient corresponds to the primary color value after the color transition treatment one by one.
As an improvement of the above-described aspect, the step of performing color compensation processing on each of the color-corrected primary color values, respectively, includes: detecting whether the primary color value after the color correction processing is lost or not in a preset time; counting the detection times and the loss times of the base color value after the color correction treatment respectively; and calculating to obtain the primary color value after the color compensation processing according to the detection times, the loss times and the primary color value after the color correction processing.
As an improvement of the above solution, the step of detecting whether the color-corrected primary color values are lost within a preset time includes: comparing each color-corrected primary color value with a preset compensation value within the preset time and judging whether it is smaller than the preset compensation value; if so, the color-corrected primary color value is lost, and if not, it is not lost. The step of calculating the color-compensated primary color value according to the detection times, the loss times and the color-corrected primary color value includes: performing color compensation according to the formula RGB_i = RGB_j · RGB_c, wherein RGB_i is the color-compensated primary color value, RGB_j is the color-corrected primary color value, RGB_c = 1 + (RGB_s / S) · K is the compensation coefficient, RGB_s is the loss times, S is the detection times, and K is a preset proportion.
As an improvement of the above-described aspect, the step of performing color highlighting processing on each of the color-compensated primary color values respectively includes: extracting the minimum primary color value from all the color-compensated primary color values; subtracting the minimum primary color value from each color-compensated primary color value to generate reference primary color values, wherein the reference primary color values correspond one by one to the color-compensated primary color values; extracting the maximum reference primary color value from all the reference primary color values; when the maximum reference primary color value is not zero, performing a nonlinear operation for color highlighting according to the formula RGB_o = D · [(RGB_i − V_min) / V_max]^8, wherein RGB_o is the primary color value after color highlighting processing, D is a preset maximum intensity value, RGB_i is the color-compensated primary color value, V_min is the minimum primary color value, V_max is the maximum reference primary color value, and the primary color values after color highlighting processing correspond one by one to the color-compensated primary color values; when the maximum reference primary color value is zero, performing the color highlighting processing according to the formula RGB_o = D.
As an improvement of the above-described aspect, the step of converting the sound data into the luminance value includes: calculating a sound average value of the sound data; processing the sound average value by adopting an automatic gain algorithm to obtain a reference average value at the current moment; and generating a brightness value according to the sound average value and the reference average value of the current moment.
As an improvement of the above solution, the step of processing the sound average value by using an automatic gain algorithm to obtain a reference average value at the current time includes: calculating the reference average value at the current time according to the sound average value, a last reference average value and a last count value, wherein the last reference average value is the reference average value of sound data acquired before the current time, and the last count value is the count value of sound data acquired before the current time; when the sound average value is larger than the last reference average value, taking the sound average value as the reference average value at the current time and as the maximum reference sound value, and resetting the count value of the current time; when the sound average value is smaller than or equal to the last reference average value, and the last reference average value and the last count value are both larger than zero, calculating the reference average value at the current time according to the formula A_agc_o = V_base · (V_count_0 / T), wherein A_agc_o is the reference average value at the current time, V_base is the maximum reference sound value, V_count_0 = V_count_i − 1 is the count value at the current time, V_count_i is the last count value, and T is the number of acquisitions of sound data within a preset time; otherwise, setting the reference average value at the current time to zero. The step of generating a brightness value according to the sound average value and the reference average value at the current time includes: when the reference average value at the current time is not zero, performing a nonlinear operation according to the formula V_bright = D · (A_avg / A_agc_o)^8 to generate the brightness value, wherein V_bright is the brightness value, A_avg is the sound average value, and D is a preset maximum intensity value; when the reference average value at the current time is zero, generating the brightness value according to the formula V_bright = 0.
As an improvement of the above solution, the step of generating the color control parameters according to the brightness value and the target base color values includes: calculating the color control factor corresponding to each target base color value according to the formula RGB_K = RGB_q · (V_bright / D), wherein RGB_K is the color control factor, RGB_q is the target base color value, V_bright is the brightness value, D is the preset maximum intensity value, and the color control factors are in one-to-one correspondence with the target base color values; and combining all color control factors to form the color control parameters.
Correspondingly, the invention also provides computer equipment, which comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the steps of the light dynamic control method when executing the computer program.
Correspondingly, the invention also provides a lamplight dynamic control system which comprises the lighting equipment and the computer equipment, wherein the computer equipment is connected with the lighting equipment.
Correspondingly, the invention further provides a computer readable storage medium, on which a computer program is stored, wherein the computer program realizes the steps of the light dynamic control method when being executed by a processor.
The implementation of the invention has the following beneficial effects:
according to the invention, the sound is subjected to regional color conversion processing according to the sound spectrum, so that the color matched with the sound is generated; meanwhile, the invention also carries out brightness conversion according to the sound, thereby generating brightness matched with the sound; therefore, the sound, the color and the brightness are combined to form the effect that the color and the brightness synchronously change along with the sound, the particularity of the sound can be effectively reflected, a good atmosphere is created, and the experience effect of a user is improved.
Further, the invention processes the spectrum signal by adopting processing methods such as color transition, color correction, color compensation, color highlighting and the like, so that the color change is softer, the transition is richer, and the conversion of RGB colors can be matched better.
In addition, the invention also adds an automatic gain algorithm and a nonlinear algorithm, so that the pixel points in the light group run rhythmically with the sound, and the light group maintains a basically consistent running effect regardless of the volume at which the music is set.
Drawings
FIG. 1 is a flow chart of a first embodiment of a sound-based light dynamic control method of the present invention;
FIG. 2 is a flow chart of a second embodiment of the sound-based light dynamic control method of the present invention;
fig. 3 is a flow chart of a third embodiment of the sound-based light dynamic control method of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings, for the purpose of making the objects, technical solutions and advantages of the present invention more apparent.
Referring to fig. 1, fig. 1 shows a first embodiment of the sound-based light dynamic control method of the present invention, comprising:
s101, acquiring sound data.
The acquisition equipment acquires point sound information at regular intervals, combines the point sound information acquired within a preset time period into sound data, and sends the sound data to the computer equipment; the computer equipment processes the sound data after receiving it. That is, the sound data includes a plurality of pieces of point sound information, where the point sound information is an analog signal.
In order for the lights of the light group to present a good running-water dynamic display effect, the invention may acquire point sound information every 92 μs, with every 256 pieces of point sound information (A_1, A_2, …, A_256) forming one input of sound data (i.e., sound data is input once every 23.5 ms); however, the invention is not limited thereto, and this may be adjusted according to actual conditions.
S102, converting the sound data into a frequency spectrum signal, and carrying out color processing on the frequency spectrum signal in areas to generate a target base color value.
In particular, a fast fourier transform algorithm may be employed to convert sound data into a spectral signal for processing. In practical application, the obtained 256-point sound information is input into a fast Fourier transform algorithm to output 128 frequency points to form a frequency spectrum signal, wherein the frequency points respectively correspond to frequencies of 0-5.435 KHz.
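For illustration only (not part of the patent text), the following Python sketch shows one way the 256 sampled points could be converted into a 128-point magnitude spectrum with a fast Fourier transform; the 92 μs sampling interval and the 256-point block size come from the description above, while the use of NumPy and the function name are assumptions.

```python
import numpy as np

SAMPLE_INTERVAL_S = 92e-6   # one point of sound information every 92 us
N_POINTS = 256              # points per block of sound data (A_1 ... A_256)

def sound_block_to_spectrum(samples: np.ndarray) -> np.ndarray:
    """Convert one block of 256 sound samples into a 128-point magnitude spectrum."""
    assert samples.shape == (N_POINTS,)
    spectrum = np.abs(np.fft.rfft(samples))   # 129 bins for a 256-point real FFT
    return spectrum[1:]                       # drop the DC bin, keeping 128 frequency points

# Centre frequencies of the 128 bins, spanning roughly 0 to 5.435 kHz
freqs = np.fft.rfftfreq(N_POINTS, d=SAMPLE_INTERVAL_S)[1:]
```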
Meanwhile, aiming at the frequency characteristics of the spectrum signal, the invention carries out regional processing on the spectrum signal and carries out color conversion on the spectrum signal in the regions, wherein each region can correspond to a target base color value. For example, when divided into three areas, three primary colors of red (R), green (G), blue (B) can be correspondingly converted; for another example, when divided into two regions, two primary colors of red (R) and blue (B) can be correspondingly converted.
S103, converting the sound data into a luminance value.
Correspondingly, the invention converts the sound data into brightness according to the characteristics of the sound data. Specifically, the step of converting sound data into a luminance value includes:
(1) A sound average value of the sound data is calculated.
When the sound data includes 256 pieces of point sound information, the sound average value A_avg of the sound data can be calculated according to the formula A_avg = (A_1 + A_2 + … + A_256) / 256, thereby providing a more averaged sound reference value for the sound data.
(2) The sound average value is processed by an automatic gain algorithm (AGC, Automatic Gain Control) to obtain a reference average value at the current time.
Specifically, a reference average value of the current time is calculated according to the sound average value, a last reference average value and a last count value, wherein the last reference average value is a reference average value of sound data acquired before the current time, and the last count value is a count value of sound data acquired before the current time, and the method comprises the steps of:
when the sound average value is larger than the previous reference average value, taking the sound average value as the reference average value and the maximum reference sound value of the current moment, and resetting the count value of the current moment;
when the sound average value is smaller than or equal to the last reference average value, and the last reference average value and the last count value are both larger than zero, calculating the reference average value at the current time according to the formula A_agc_o = V_base · (V_count_0 / T), wherein A_agc_o is the reference average value at the current time, V_base is the maximum reference sound value, V_count_0 = V_count_i − 1 is the count value at the current time, V_count_i is the last count value, and T is the number of acquisitions of sound data within a preset time;
otherwise, the reference average value of the current moment is set to zero.
That is to say: when A_avg > A_agc_i, set A_agc_o = A_avg, reset V_base = A_avg and reset V_count_0 = T; when A_avg ≤ A_agc_i, A_agc_i > 0 and V_count_i > 0, keep V_base unchanged and set A_agc_o = V_base · (V_count_i − 1) / T; in all other cases, A_agc_o = 0 (a code sketch of this update follows the symbol definitions below).
Wherein:
A_avg is the sound average value;
A_agc_i is the last reference average value, with an initial value of 0;
A_agc_o is the reference average value at the current time;
V_base is the maximum reference sound value;
V_count_i is the last count value;
T is the number of acquisitions of sound data within the preset time; preferably, T takes the value 200 (roughly the number of acquisitions of sound data within 5 s), but the invention is not limited thereto, and it can be adjusted according to actual conditions.
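A minimal Python sketch of this automatic gain update, written directly from the rules above; the state variables and the value T = 200 come from the description, while the class structure and attribute names are illustrative assumptions.

```python
class AutoGain:
    """Tracks the reference average value A_agc_o for the incoming sound average."""

    def __init__(self, t: int = 200):
        self.t = t           # T: number of sound-data acquisitions in the preset time
        self.a_agc = 0.0     # A_agc_i: last reference average value (initial value 0)
        self.v_base = 0.0    # V_base: maximum reference sound value
        self.v_count = 0     # V_count_i: last count value

    def update(self, a_avg: float) -> float:
        """Return A_agc_o, the reference average value at the current time."""
        if a_avg > self.a_agc:
            # New peak: adopt it as the reference average and as the maximum
            # reference sound value, and reset the count value to T.
            self.v_base = a_avg
            self.v_count = self.t
            a_agc_o = a_avg
        elif self.a_agc > 0 and self.v_count > 0:
            # Below the peak: let the reference average decay linearly over T updates.
            self.v_count -= 1
            a_agc_o = self.v_base * self.v_count / self.t
        else:
            a_agc_o = 0.0
        self.a_agc = a_agc_o   # becomes the "last reference average value" next time
        return a_agc_o
```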
(3) And generating a brightness value according to the sound average value and the reference average value of the current moment.
Specifically, when the reference average value at the current time is not zero, a nonlinear operation is performed according to the formula V_bright = D · (A_avg / A_agc_o)^8 to generate the brightness value, wherein V_bright is the brightness value, A_avg is the sound average value, and D is a preset maximum intensity value; correspondingly, when A_agc_o = 0, V_bright = 0.
Step (3) performs nonlinear operation processing on the sound average value A_avg from step (1) and the reference average value A_agc_o from step (2) to generate a brightness value within a specific range.
For example, when RGB three primary colors are employed, the luminance range may be set to [0, 255], and the value of D corresponds to 255. Accordingly, when the calculated luminance value is 0, it means that the luminance value is 0%, and when the calculated luminance value is 255, it means that the luminance value is 100%.
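A corresponding Python sketch of the brightness conversion, assuming D = 255 and the exponent 8 from the text; clamping the ratio at 1 is an added safety assumption for the brief moments when A_avg sits between the old and the newly decayed reference.

```python
def brightness_from_average(a_avg: float, a_agc_o: float, d: int = 255) -> int:
    """V_bright = D * (A_avg / A_agc_o)^8, mapped onto the range 0..D."""
    if a_agc_o == 0:
        return 0
    ratio = min(a_avg / a_agc_o, 1.0)   # clamp (assumption, not stated in the patent)
    return round(d * ratio ** 8)
```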
Therefore, the brightness corresponding to the sound data can be effectively calculated through the step S103, so as to form the brightness value conforming to the sound characteristics.
S104, generating color control parameters according to the brightness value and the target base color value.
It should be noted that one piece of sound data correspondingly generates one set of color control parameters. Each set of color control parameters includes at least two color control factors, which are used to represent brightness and color; the number of color control factors is consistent with the number of partitions of the spectrum signal, each partition corresponds to the color of one color control factor, and the brightness of the color control factors within the same set is consistent.
For example, when divided into three regions, the three primary colors red (R), green (G) and blue (B) can be correspondingly converted; at this time, the color control parameters include three color control factors, respectively: factor R (brightness A + color R), factor G (brightness A + color G), and factor B (brightness A + color B).
Specifically, the step of generating the color control parameter according to the luminance value and the target base color value includes:
(1) The color control factor corresponding to each target base color value is calculated according to the formula RGB_K = RGB_q · (V_bright / D).
It should be noted that RGB_K is the color control factor, RGB_q is the target base color value, V_bright is the brightness value, D is the preset maximum intensity value, and the color control factors are in one-to-one correspondence with the target base color values. Accordingly, when the RGB three primary colors are employed, the preset maximum intensity value may be set to 255.
(2) All color control factors are combined to form color control parameters.
For example, when the spectrum is divided into three regions corresponding to the three primary colors red (R), green (G) and blue (B), the color control parameters include three color control factors R_K, G_K and B_K, where R_K = R_q · (V_bright / 255), G_K = G_q · (V_bright / 255) and B_K = B_q · (V_bright / 255); accordingly, the color control parameter is (R_K, G_K, B_K).
Therefore, step S104 fuses the color portion and the brightness portion of the sound data to form a unique color control parameter.
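A small Python sketch of this fusion step, assuming the three target base color values and the brightness value have already been computed; the function name is illustrative.

```python
def color_control_parameter(rgb_q: tuple[int, int, int], v_bright: int, d: int = 255) -> tuple[int, ...]:
    """Scale each target base color value by the brightness ratio V_bright / D."""
    return tuple(round(c * v_bright / d) for c in rgb_q)

# Example: target base colors (255, 0, 169) at roughly half brightness
print(color_control_parameter((255, 0, 169), 128))   # -> (128, 0, 85)
```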
S105, outputting the color control parameters to each pixel point of the light group in sequence so as to enable the light of the light group to be dynamically displayed in a running water mode.
For example, when the light group is a light strip, a plurality of pixel points (LED1 … LED100) are provided in sequence (e.g., from left to right) on each light strip. In the control process, the color control parameters are output first to the 1st pixel point LED1, then to the 2nd pixel point LED2, and so on, and finally to the 100th pixel point LED100, finally forming the continuously moving, flowing effect of the light.
Accordingly, in the process of continuously acquiring sound data, color control parameters are also continuously output to the light group, so that a flowing water effect that the light group continuously changes along with sound is formed.
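One common way to realize the running-water output is sketched below in Python: each newly generated color control parameter is pushed into the first pixel while earlier values shift along the strip. The 100-pixel strip length comes from the example; the shift-register interpretation is an assumption about how the sequential output is implemented.

```python
from collections import deque

NUM_PIXELS = 100   # e.g. LED1 ... LED100 on one light strip
strip = deque([(0, 0, 0)] * NUM_PIXELS, maxlen=NUM_PIXELS)

def push_color(color_param: tuple[int, int, int]) -> list[tuple[int, int, int]]:
    """Feed the newest color control parameter into LED1 and shift the previous
    values towards LED100, producing the continuously moving, flowing effect."""
    strip.appendleft(color_param)
    return list(strip)   # frame to send to the light group, index 0 = LED1
```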
Therefore, unlike the prior art, the invention combines the frequency, the color and the brightness of the sound to form the effect that the color, the brightness and the sound are changed simultaneously, has strong flexibility, can effectively reflect the specificity of the sound, creates good atmosphere and improves the experience effect of the user.
Referring to fig. 2, fig. 2 shows a second embodiment of the sound-based light dynamic control method of the present invention, comprising:
s201, sound data is acquired.
The sound data includes a plurality of pieces of point sound information, where the point sound information is an analog signal.
S202, converting the sound data into a spectrum signal.
In particular, a fast fourier transform algorithm may be employed to convert sound data into a spectral signal for processing.
S203, dividing the spectrum signal into at least two primary color areas and calculating the primary color value of each primary color area.
In this embodiment, the spectrum signal is divided into three regions.
It should be noted that the primary color areas are in one-to-one correspondence with the primary color values. Specifically, the step of dividing the spectrum signal into at least two primary color regions and calculating the primary color value of each primary color region includes:
(1) The spectral signal is divided into at least two primary color regions according to the frequency values.
For example, when the frequency point range of the spectrum signal is 0 to 5.435KHz, the extracted spectrum signal may be divided into 3 primary color areas (0 to 1KHz,1K to 2.5KHz,2.5K to 5.435 KHz), wherein the primary color areas 0 to 1KHz are used for conversion to red (R) in three primary colors, the primary color areas 1K to 2.5KHz are used for conversion to green (G) in three primary colors, and the primary color areas 2.5K to 5.435KHz are used for conversion to blue (B) in three primary colors.
(2) The frequency correspondence values in each base region are summed separately to generate a base color value.
By summing the frequency corresponding values in each base color region (i.e., the values corresponding to the frequencies of the abscissa in the spectrogram), a more comprehensive value can be generated as the corresponding base color value. Accordingly, red (R) in three primary colors can be generated by summing the corresponding values of frequencies in the primary color region 0 to 1KHz, green (G) in three primary colors can be generated by summing the corresponding values of frequencies in the primary color region 1K to 2.5KHz, and blue (B) in three primary colors can be generated by summing the corresponding values of frequencies in the primary color region 2.5K to 5.435KHz, and thus, initial extraction of colors can be achieved by the present method.
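Continuing the earlier FFT sketch, the initial extraction of the three primary color values could look as follows in Python; the band edges come from the example above, and the function name is an assumption.

```python
import numpy as np

def spectrum_to_primaries(spectrum: np.ndarray, freqs: np.ndarray) -> tuple[float, float, float]:
    """Sum the spectral magnitudes inside each primary color region:
    0-1 kHz -> red, 1-2.5 kHz -> green, 2.5-5.435 kHz -> blue."""
    r = spectrum[freqs < 1_000].sum()
    g = spectrum[(freqs >= 1_000) & (freqs < 2_500)].sum()
    b = spectrum[freqs >= 2_500].sum()
    return float(r), float(g), float(b)
```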
S204, performing color transition processing on each primary color value.
Specifically, the step of performing color transition processing on each primary color value includes:
(1) And extracting the current base color value and N corresponding historical base color values.
It should be noted that, the current base color value is a base color value of the sound data calculated at the current time, and the N historical base color values are base color values of the sound data calculated at the previous N times at the current time, where one current base color value corresponds to the N historical base color values, and N is a positive integer.
For example, for the red color of the three primary colors, the current red primary color value and the corresponding N historical red primary color values need to be extracted; for the blue color in the three primary colors, the current blue color value and the corresponding N historical blue color values are required to be extracted.
(2) And calculating the average value of the current base color value and the corresponding N historical base color values, and taking the average value as the base color value after color transition processing.
Preferably, after calculating the average of each current base color value and its corresponding N historical base color values, the three base color values after color transition processing can be obtained as: R_e = (R_a + R_(a-1) + R_(a-2)) / 3, G_e = (G_a + G_(a-1) + G_(a-2)) / 3, B_e = (B_a + B_(a-1) + B_(a-2)) / 3, wherein R_a, G_a, B_a are the current base color values, R_(a-1), R_(a-2) are the historical base color values corresponding to R_a, G_(a-1), G_(a-2) are the historical base color values corresponding to G_a, and B_(a-1), B_(a-2) are the historical base color values corresponding to B_a.
Therefore, by respectively carrying out sliding buffer processing on each primary color value, the change among all pixel points in the lamplight group can be softer.
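A possible Python sketch of this sliding-average transition, with N = 2 historical values as in the worked formulas above; keeping one instance per primary color region and the buffer implementation are assumptions.

```python
from collections import deque

class ColorTransition:
    """Smooth a primary color value with the average of the current value
    and up to N most recent historical values (N = 2 here, as in the example)."""

    def __init__(self, n_history: int = 2):
        self.history = deque(maxlen=n_history)

    def smooth(self, current: float) -> float:
        values = [current, *self.history]   # current value plus stored history
        self.history.appendleft(current)
        return sum(values) / len(values)
```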
S205, color correction processing is performed on each of the primary color values after the color transition processing.
Specifically, the step of performing color correction processing on each of the primary color values after the color transition processing includes: and multiplying the primary color value after the color transition treatment with a corresponding preset correction coefficient to carry out color correction, wherein the preset correction coefficient corresponds to the primary color value after the color transition treatment one by one.
For example, the color-transition-processed values R_e, G_e, B_e are multiplied by the correction coefficients R_d, G_d, B_d respectively (i.e., R_j = R_e · R_d, G_j = G_e · G_d, B_j = B_e · B_d), where in general R_d > G_d > B_d. Preferably, the correction coefficients used in the present invention are R_d = 6, G_d = 5, B_d = 4, but the invention is not limited thereto, and they can be adjusted according to actual situations.
Therefore, by multiplying each primary color value after the color transition processing by a correction coefficient, respectively, the data of the three primary color areas can be made more indicative of RGB colors.
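For completeness, a one-step Python sketch of the correction with the preferred coefficients R_d = 6, G_d = 5, B_d = 4; the function name is illustrative.

```python
def color_correct(rgb_e: tuple[float, float, float],
                  coeffs: tuple[float, float, float] = (6.0, 5.0, 4.0)) -> tuple[float, ...]:
    """Multiply each transition-processed primary color value by its preset correction coefficient."""
    return tuple(v * c for v, c in zip(rgb_e, coeffs))
```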
S206, performing color compensation processing on each primary color value after the color correction processing.
Specifically, the step of performing color compensation processing on each of the color correction processed primary color values includes:
(1) And detecting whether the primary color values subjected to color correction processing are lost or not in a preset time.
The corresponding inspection method is as follows:
comparing the primary color value after the color correction with a preset compensation value in a preset time, judging whether the primary color value after the color correction is smaller than the preset compensation value,
if yes, the primary color value after the color correction processing is lost,
if not, the primary color value after the color correction processing is not lost;
(2) The detection times and the loss times of the base color value after the color correction processing are respectively counted.
(3) And calculating to obtain the primary color value after the color compensation processing according to the detection times, the loss times and the primary color value after the color correction processing.
Specifically, color compensation is performed according to the formula RGB_i = RGB_j · RGB_c, wherein RGB_i is the color-compensated primary color value, RGB_j is the color-corrected primary color value, RGB_c = 1 + (RGB_s / S) · K is the compensation coefficient, RGB_s is the loss times, S is the detection times, and K is a preset ratio.
For example, let the preset time be the previous M seconds, with M = 8 and K = 20%, and determine for each of the three color-corrected primary color values R_j, G_j, B_j whether it is smaller than the preset compensation value; if so, it is judged to be lost; if not, it is judged not to be lost. Then, by counting the loss times R_s, G_s, B_s over the previous 8 seconds and the detection times S over the previous 8 seconds, the compensation coefficients R_c = 1 + (R_s / S) · 20%, G_c = 1 + (G_s / S) · 20%, B_c = 1 + (B_s / S) · 20% are obtained, and the compensated color values are finally calculated as R_i = R_j · R_c, G_i = G_j · G_c, B_i = B_j · B_c.
Preferably, the compensation coefficient ranges from 1 to 1.2, but is not limited thereto, and can be adjusted according to practical situations.
Therefore, each color-corrected primary color value is subjected to color compensation through a color-loss compensation algorithm, so that the mixed colors between R and G, G and B, and R and B are richer at critical (transition) moments.
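A Python sketch of this loss-statistics compensation, with one instance kept per primary color; K = 20% comes from the example, while the threshold value and the omission of the periodic 8-second window reset are assumptions made to keep the sketch short.

```python
class ColorCompensator:
    """Boost a color-corrected primary color value according to how often it fell
    below the preset compensation value during the recent detection window."""

    def __init__(self, threshold: float, k: float = 0.20):
        self.threshold = threshold   # preset compensation value
        self.k = k                   # preset ratio K
        self.detections = 0          # S: detection times in the window
        self.losses = 0              # RGB_s: loss times in the window

    def compensate(self, rgb_j: float) -> float:
        self.detections += 1
        if rgb_j < self.threshold:
            self.losses += 1
        rgb_c = 1 + (self.losses / self.detections) * self.k   # compensation coefficient, 1.0 .. 1.2
        return rgb_j * rgb_c
```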
S207, performing color highlighting processing on each primary color value after the color compensation processing.
Specifically, the step of performing color highlighting processing on each of the color-compensated primary color values, respectively, includes:
(1) Extracting the minimum primary color value from all the primary color values after the color compensation processing;
(2) Subtracting the minimum primary color value from each primary color value after the color compensation processing to generate a reference primary color value, wherein the reference primary color value corresponds to the primary color value after the color compensation processing one by one;
(3) Extracting the maximum reference base color value from all the reference base color values;
(4) When the maximum reference base color value is not zero, a nonlinear operation is performed for color highlighting according to the formula RGB_o = D · [(RGB_i − V_min) / V_max]^8, wherein RGB_o is the primary color value after color highlighting processing, D is a preset maximum intensity value, RGB_i is the primary color value after color compensation processing, V_min is the minimum primary color value, V_max is the maximum reference base color value, and the primary color values after color highlighting processing correspond one by one to the primary color values after color compensation processing. Correspondingly, when V_max = 0, RGB_o = D. Preferably, the value of D is 255, but the invention is not limited thereto.
In this embodiment, the base color value after the color highlighting process is taken as the target base color value.
Specifically, among the three primary color values R_i, G_i, B_i generated in step S206, the minimum is selected as V_min; V_min is subtracted from each of R_i, G_i, B_i, and the maximum of the resulting reference base color values is selected as V_max; a nonlinear operation is then performed, finally converting the values into the range 0 to 255, namely R_o = 255 · [(R_i − V_min) / V_max]^8, G_o = 255 · [(G_i − V_min) / V_max]^8, B_o = 255 · [(B_i − V_min) / V_max]^8.
For example, if step S206 generates R_i = 150, G_i = 50 and B_i = 145, then the minimum primary color value is V_min = G_i = 50 and the maximum reference base color value is V_max = R_i − V_min = 100; at this time, the formulas give R_o = 255, G_o = 0 and B_o = 169.
Therefore, the invention can make the color output be mostly red, green and blue three colors by respectively carrying out color highlighting processing on each primary color value, and the color change is more obvious.
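The color highlighting step can be sketched in Python as below; it reproduces the worked example (150, 50, 145) → (255, 0, 169), and the function name is an assumption.

```python
def highlight(rgb_i: tuple[float, float, float], d: int = 255) -> tuple[int, ...]:
    """Suppress the weakest channel and stretch the others towards D with an 8th-power curve."""
    v_min = min(rgb_i)
    refs = [v - v_min for v in rgb_i]          # reference base color values
    v_max = max(refs)                          # maximum reference base color value
    if v_max == 0:
        return (d,) * len(rgb_i)               # RGB_o = D when all channels are equal
    return tuple(round(d * (r / v_max) ** 8) for r in refs)

print(highlight((150, 50, 145)))               # -> (255, 0, 169)
```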
S208, the sound data is converted into a luminance value.
S209, generating color control parameters according to the brightness value and the target base color value.
S210, sequentially outputting the color control parameters to all pixel points of the light group so as to enable the light of the light group to be dynamically displayed in a running water mode.
The steps S208 to S210 of the present embodiment may refer to the steps S103 to S105 of the first embodiment, and will not be described in detail herein.
According to the invention, sound frequency and color are combined, so that the light is bright red at low frequencies, bright green at middle frequencies and bright blue at high frequencies; meanwhile, other colors (colors obtained by mixing RGB at different brightness) appear near the color/frequency transitions, making the colors richer.
In addition, the invention also adds an automatic gain algorithm and a nonlinear algorithm, so that the pixel points in the light group run rhythmically with the sound, and the light group maintains a basically consistent running effect regardless of the volume at which the music is set.
Referring to fig. 3, fig. 3 shows a third embodiment of the sound-based light dynamic control method of the present invention, comprising:
s301, acquiring sound data.
S302, converting the sound data into a spectrum signal.
S303, dividing the spectrum signal into at least two primary color areas and calculating the primary color value of each primary color area.
In this embodiment, the spectrum signal is divided into two primary color regions.
S304, performing color transition processing on each primary color value.
S305, performing color correction processing on each primary color value after the color transition processing.
S306, performing color compensation processing on each primary color value after the color correction processing.
In this embodiment, the color-compensated primary color value is used as the target primary color value.
S307, the sound data is converted into a luminance value.
S308, generating color control parameters according to the brightness value and the target base color value.
S309, outputting the color control parameters to each pixel point of the light group in sequence, so that the lights of the light group are dynamically displayed in a running water mode.
Unlike the second embodiment described in fig. 2, in this embodiment, it is not necessary to perform color highlighting processing separately for each of the color-compensated primary color values.
When the spectrum signal is divided into two primary color areas for color processing, the primary color value after the color compensation processing can be used as the target primary color value, and the color highlighting processing is not needed.
Correspondingly, the invention also discloses a computer device which comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the steps of the light dynamic control method when executing the computer program. Meanwhile, the invention also discloses a dynamic light control system, which comprises lighting equipment and the computer equipment, wherein the computer equipment is connected with the lighting equipment; it should be noted that, the computer device and the lighting device may be connected by wireless or wired connection; preferably, the lighting device of the present invention is a light strip, but not limited to this, and the type of the lighting device may be selected according to actual use situations. In addition, the invention also discloses a computer readable storage medium, on which a computer program is stored, wherein the computer program realizes the steps of the light dynamic control method when being executed by a processor.
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that changes and modifications may be made without departing from the principles of the invention; such changes and modifications are also intended to fall within the scope of the invention.

Claims (13)

1. A sound-based dynamic control method for light, comprising:
acquiring sound data;
converting the sound data into a frequency spectrum signal, and carrying out color processing on the frequency spectrum signal in a subarea mode to generate a target base color value; wherein the step of performing color processing on the spectral signal sub-regions to generate target base color values comprises: dividing the spectrum signal into at least two primary color areas and calculating a primary color value of each primary color area, wherein the primary color areas correspond to the primary color values one by one; performing color transition processing on each primary color value respectively; respectively carrying out color correction processing on each primary color value after the color transition processing; respectively carrying out color compensation processing on each primary color value after the color correction processing;
converting the sound data into a brightness value;
generating color control parameters according to the brightness value and the target primary color value;
sequentially outputting the color control parameters to each pixel point of the light group so as to enable the lights of the light group to be dynamically displayed in a running water mode;
the step of performing color transition processing on each primary color value comprises the following steps: extracting a current base color value and N corresponding historical base color values, wherein the current base color value is a base color value of the sound data obtained by calculation at the current moment, the N historical base color values are base color values of the sound data obtained by calculation for the previous N times at the current moment, one current base color value corresponds to the N historical base color values, and N is a positive integer; and calculating the average value of the current base color value and the corresponding N historical base color values, and taking the average value as the base color value after color transition processing.
2. The sound-based light dynamic control method of claim 1, wherein when dividing the spectrum signal into at least three primary color regions, the step of color processing the spectrum signal sub-regions to generate target primary color values further comprises: and respectively performing color highlighting processing on each primary color value after the color compensation processing.
3. The sound-based light dynamic control method of claim 2, wherein the step of dividing the spectrum signal into at least two primary color regions and calculating a primary color value of each primary color region comprises:
dividing the spectrum signal into at least two primary color areas according to frequency values;
the frequency correspondence values in each base region are summed separately to generate a base color value.
4. The sound-based light dynamic control method as set forth in claim 2, wherein the step of performing color correction processing on each of the color transition processed primary color values, respectively, comprises:
and multiplying the primary color value after the color transition treatment with a corresponding preset correction coefficient to carry out color correction, wherein the preset correction coefficient corresponds to the primary color value after the color transition treatment one by one.
5. The sound-based light dynamic control method as set forth in claim 2, wherein the step of performing color compensation processing on each of the color-corrected primary color values, respectively, comprises:
detecting whether the primary color value after the color correction processing is lost or not in a preset time;
counting the detection times and the loss times of the base color value after the color correction treatment respectively;
and calculating to obtain the primary color value after the color compensation processing according to the detection times, the loss times and the primary color value after the color correction processing.
6. The sound-based light dynamic control method as set forth in claim 5, wherein the step of detecting whether the color-corrected primary color value is lost within a preset time comprises:
comparing the primary color value after the color correction with a preset compensation value in a preset time, judging whether the primary color value after the color correction is smaller than the preset compensation value,
if yes, the primary color value after the color correction processing is lost,
if not, the primary color value after the color correction processing is not lost;
the step of calculating the color-compensated primary color value according to the detection times, the loss times and the color-corrected primary color value comprises the following steps:
according to the formula RGB i =RGB j ·RGB c Color compensation is performed, wherein RGB i RGB is the color compensated primary color value j RGB, which is the color corrected primary color value c =1+(RGB s /S)·K,RGB c To compensate the coefficients, RGB s And for the loss times, S is the detection times, and K is a preset proportion.
7. The sound-based light dynamic control method as set forth in claim 2, wherein the step of performing color highlighting processing on each of the color-compensated primary color values, respectively, comprises:
extracting the minimum primary color value from all the primary color values after the color compensation processing;
subtracting the minimum primary color value from each primary color value after the color compensation processing to generate a reference primary color value, wherein the reference primary color value corresponds to the primary color value after the color compensation processing one by one;
extracting the maximum reference base color value from all the reference base color values;
when the maximum reference base color value is not zero, performing a nonlinear operation for color highlighting according to the formula RGB_o = D · [(RGB_i − V_min) / V_max]^8, wherein RGB_o is the primary color value after color highlighting processing, D is a preset maximum intensity value, RGB_i is the primary color value after color compensation processing, V_min is the minimum primary color value, V_max is the maximum reference base color value, and the primary color values after color highlighting processing correspond one by one to the primary color values after color compensation processing;
when the maximum reference base color value is zero, performing the color highlighting processing according to the formula RGB_o = D.
8. The sound-based light dynamic control method of claim 1, wherein the step of converting sound data into a luminance value comprises:
calculating a sound average value of the sound data;
processing the sound average value by adopting an automatic gain algorithm to obtain a reference average value at the current moment;
and generating a brightness value according to the sound average value and the reference average value of the current moment.
9. The sound-based light dynamic control method of claim 8, wherein the step of processing the sound average using an automatic gain algorithm to obtain a reference average at a current time comprises:
calculating a reference average value of the current time according to the sound average value, a last reference average value and a last count value, wherein the last reference average value is a reference average value of sound data acquired before the current time, the last count value is a count value of sound data acquired before the current time,
when the sound average value is larger than the last reference average value, the sound average value is used as the reference average value and the maximum reference sound value of the current moment, the count value of the current moment is reset,
when the sound average value is smaller than or equal to the last reference average value, and the last reference average value and the last count value are both larger than zero, calculating the reference average value at the current time according to the formula A_agc_o = V_base · (V_count_0 / T), wherein A_agc_o is the reference average value at the current time, V_base is the maximum reference sound value, V_count_0 = V_count_i − 1 is the count value at the current time, V_count_i is the last count value, and T is the number of acquisitions of sound data within a preset time,
otherwise, setting the reference average value at the current moment to be zero;
the step of generating a luminance value according to the sound average value and the reference average value of the current time includes:
when the reference average value at the current time is not zero, performing a nonlinear operation according to the formula V_bright = D · (A_avg / A_agc_o)^8 to generate a luminance value, wherein V_bright is the luminance value, A_avg is the sound average value, and D is a preset maximum intensity value;
when the reference average value at the current time is zero, generating a luminance value according to the formula V_bright = 0.
10. The sound-based dynamic light control method of claim 1, wherein the step of generating color control parameters according to the luminance value and the target base color value comprises:
according to the formula RGB K =RGB q ·(V bright Respectively calculating color control factors corresponding to each target base color value, wherein RGB K RGB as color control factors q For the target base color value, V bright The color control factors are in one-to-one correspondence with the target base color values;
all color control factors are combined to form color control parameters.
11. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any of claims 1-10 when the computer program is executed.
12. A light dynamic control system comprising a lighting device and the computer device of claim 11, the computer device being connected to the lighting device.
13. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1-10.
CN202110820192.0A 2021-07-20 2021-07-20 Sound-based light dynamic control method, device, system and storage medium Active CN113727501B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110820192.0A CN113727501B (en) 2021-07-20 2021-07-20 Sound-based light dynamic control method, device, system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110820192.0A CN113727501B (en) 2021-07-20 2021-07-20 Sound-based light dynamic control method, device, system and storage medium

Publications (2)

Publication Number Publication Date
CN113727501A CN113727501A (en) 2021-11-30
CN113727501B true CN113727501B (en) 2023-11-24

Family

ID=78673573

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110820192.0A Active CN113727501B (en) 2021-07-20 2021-07-20 Sound-based light dynamic control method, device, system and storage medium

Country Status (1)

Country Link
CN (1) CN113727501B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114375083B (en) * 2021-12-17 2024-04-05 广西世纪创新显示电子有限公司 Light rhythm method, device, terminal equipment and storage medium
CN114466495B (en) * 2022-04-12 2022-06-21 深圳市国晨光电科技有限公司 Method for controlling high matching change of light brightness and color through sound change
CN115520090A (en) * 2022-11-04 2022-12-27 北京经纬恒润科技股份有限公司 Control method and device of light information, storage medium and equipment

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005091918A (en) * 2003-09-18 2005-04-07 Friend Spring Industrial Co Ltd Method and apparatus for controlling full-color led acousto-optical system
CN102123546A (en) * 2010-12-21 2011-07-13 广州杰赛科技股份有限公司 Full-color acousto-optic conversion control method and system
CN103686144A (en) * 2012-08-30 2014-03-26 苹果公司 Correction factor for color response calibration
JP2014085386A (en) * 2012-10-19 2014-05-12 Jvc Kenwood Corp Voice information display device, voice information display method and program
CN104270866A (en) * 2014-10-17 2015-01-07 孟庆云 Method, device and system for implementing horse race lamp
KR20160130122A (en) * 2015-05-01 2016-11-10 지용규 Light emitting speaker, light emitting speaker system, and driving method of light emitting speaker
CN107889323A (en) * 2017-09-27 2018-04-06 杭州古北电子科技有限公司 The control method and device that a kind of light is shown
CN108289357A (en) * 2018-03-27 2018-07-17 淮阴师范学院 A kind of LED landscape lamp control system and its working method
JP2019109314A (en) * 2017-12-16 2019-07-04 達也 宮崎 Sound-light conversion display method and display unit
CN110099485A (en) * 2019-04-29 2019-08-06 北京利合世旺教育科技有限公司 Acousto-optic regular movements control system and control method
CN212785706U (en) * 2020-08-19 2021-03-23 深圳市亿牛国际科技有限公司 Sound box with colour lamp

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW571266B (en) * 2002-02-27 2004-01-11 Friend Spring Ind Co Ltd Control method and device of full-color LED audio/visual generating system
JPWO2006100980A1 (en) * 2005-03-18 2008-09-04 パイオニア株式会社 Audio signal processing apparatus and computer program therefor

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005091918A (en) * 2003-09-18 2005-04-07 Friend Spring Industrial Co Ltd Method and apparatus for controlling full-color led acousto-optical system
CN102123546A (en) * 2010-12-21 2011-07-13 广州杰赛科技股份有限公司 Full-color acousto-optic conversion control method and system
CN103686144A (en) * 2012-08-30 2014-03-26 苹果公司 Correction factor for color response calibration
JP2014085386A (en) * 2012-10-19 2014-05-12 Jvc Kenwood Corp Voice information display device, voice information display method and program
CN104270866A (en) * 2014-10-17 2015-01-07 孟庆云 Method, device and system for implementing horse race lamp
KR20160130122A (en) * 2015-05-01 2016-11-10 지용규 Light emitting speaker, light emitting speaker system, and driving method of light emitting speaker
CN107889323A (en) * 2017-09-27 2018-04-06 杭州古北电子科技有限公司 The control method and device that a kind of light is shown
JP2019109314A (en) * 2017-12-16 2019-07-04 達也 宮崎 Sound-light conversion display method and display unit
CN108289357A (en) * 2018-03-27 2018-07-17 淮阴师范学院 A kind of LED landscape lamp control system and its working method
CN110099485A (en) * 2019-04-29 2019-08-06 北京利合世旺教育科技有限公司 Acousto-optic regular movements control system and control method
CN212785706U (en) * 2020-08-19 2021-03-23 深圳市亿牛国际科技有限公司 Sound box with colour lamp

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jin Wencan. A multi-lamp linkage music lighting system based on spectrum analysis. Engineering Science and Technology II. 2019, pp. 9-26. *

Also Published As

Publication number Publication date
CN113727501A (en) 2021-11-30

Similar Documents

Publication Publication Date Title
CN113727501B (en) Sound-based light dynamic control method, device, system and storage medium
CN107889323B (en) Control method and device for light display
CN107103917B (en) Music rhythm detection method and system
CN101252801B (en) Method and apparatus for controlling light
CN113613369A (en) Light effect control method, device, equipment and storage medium
KR101657975B1 (en) music-generation method based on real-time image
KR20010020900A (en) Method and apparatus for harmonizing colors by harmonics and converting sound into colors mutually
CN106132040B (en) Sing the lamp light control method and device of environment
CN103455790A (en) Skin identification method based on skin color model
CN102123546A (en) Full-color acousto-optic conversion control method and system
RU2007140985A (en) COLOR TRANSFORMATION BLOCK FOR REDUCING RIM
CN110337158B (en) Light emitting control method and device of light emitting diode
CN103607823A (en) LED light emitting device
CN114828359A (en) Music-based atmosphere lamp display method, device, equipment and storage medium
CN117794030B (en) Concert scene lamp group coordination control method, device, equipment and medium
KR101693109B1 (en) Light emitting speaker, light emitting speaker system, and driving method of light emitting speaker
CN113853047A (en) Light control method and device, storage medium and electronic equipment
CN105810226A (en) Method and system for controlling expression effects of music
CN105611693B (en) The control method and device of sound equipment and desk lamp with dimmer switch
CN115520090A (en) Control method and device of light information, storage medium and equipment
CN115767844A (en) Lamp effect control method and related equipment thereof
CN107607781B (en) Frequency display method and device of electromagnetic equipment
CN112286349B (en) Visual interaction control method based on sound, intelligent terminal and storage device
US20110213477A1 (en) Method for controlling in particular lighting technology by audio signal and a device for performing this method
CN110730526A (en) Site illumination control device and illumination control method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant