CN116133197A - Light control method, device, system, equipment and storage medium - Google Patents

Light control method, device, system, equipment and storage medium Download PDF

Info

Publication number
CN116133197A
Authority
CN
China
Prior art keywords
audio
parameters
light effect
distance
experimenter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211669133.9A
Other languages
Chinese (zh)
Inventor
崔为之
张鹏
周凌翔
唐杰
贾巨涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai, Zhuhai Lianyun Technology Co Ltd filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202211669133.9A priority Critical patent/CN116133197A/en
Publication of CN116133197A publication Critical patent/CN116133197A/en
Pending legal-status Critical Current

Classifications

    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/105Controlling the light source in response to determined parameters
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/165Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The application relates to a light control method, device, system, equipment and storage medium. The method includes: obtaining audio parameters of the audio and video data being played by a playing device, and the distance between the experimenter and the playing device; predicting, with a pre-trained model, the target light effect parameters corresponding to the audio parameters and the distance; and controlling a lamp that illuminates the experimenter and/or the playing device according to the target light effect parameters. With this scheme, the light effect of the lamp can be controlled by the audio parameters of the audio and video data, so that the experimenter has an immersive experience while enjoying the audio and video data.

Description

Light control method, device, system, equipment and storage medium
Technical Field
The present disclosure relates to the field of data processing, and in particular, to a light control method, device, system, apparatus, and storage medium.
Background
With the development of the internet and the arrival of the big-data era, people have gradually moved from an age of information scarcity into an age of information overload. Intelligent products permeate every aspect of daily life, such as smartphones, smart tablets and the smart appliances found in homes. As expectations for intelligence keep rising, smart appliances have become the first choice when furnishing new homes: smart speakers, smart refrigerators, smart televisions, smart air conditioners and so on. These intelligent products make life more convenient and allow people's time to be used more effectively.
At present, people expect an immersive feeling when watching a home theater, so building an intelligent home theater is a technical problem to be solved by those skilled in the art.
Disclosure of Invention
The application provides a light control method, a device, a system, equipment and a storage medium, which are used for realizing the construction of an intelligent home theater.
In a first aspect, a light control method is provided, including:
acquiring audio parameters of audio and video data being played by a playing device and the distance between an experimenter and the playing device;
predicting target light effect parameters corresponding to the audio parameters and the distance by adopting a pre-trained model;
and controlling a lamp for illuminating the experimenter and/or the playing equipment according to the target light effect parameters.
Optionally, after controlling the lamps that illuminate the experimenter and/or the playing device according to the target light effect parameters, the method further includes:
acquiring current light effect parameters of the lamp; the current light effect parameters are provided by the experimenter;
calculating a training error of the model based on the current light effect parameter and the target light effect parameter;
optimizing the model based on the training error.
Optionally, acquiring audio parameters of audio and video data being played by the playing device includes:
converting the audio and video data into logic pulses by adopting a PID algorithm;
and taking the logic pulse as an audio parameter of the audio-video data.
Optionally, acquiring the distance between the experimenter and the playing device includes:
acquiring a first geographic position of the experimenter and a second geographic position of the playing device;
based on the first geographic position and the second geographic position, a distance between the experimenter and the playing device is calculated.
Optionally, the light effect parameters include:
light brightness and/or light color.
Optionally, before predicting the target light effect parameter corresponding to the audio parameter and the distance by using a pre-trained model, the method further comprises:
acquiring at least one training sample, wherein any training sample comprises an input audio parameter and an input distance which are input by the model, and an output light effect parameter corresponding to the input audio parameter and the input distance;
training the model using the at least one training sample.
In a second aspect, there is provided a light control system comprising:
the device comprises a controller, playing equipment and a lamp;
the controller is used for acquiring audio parameters of audio and video data being played by the playing device and the distance between the experimenter and the playing device; predicting target light effect parameters corresponding to the audio parameters and the distance by adopting a pre-trained model; and controlling a lamp for illuminating the experimenter and/or the playing equipment according to the target light effect parameters.
In a third aspect, there is provided a light control apparatus comprising:
the acquisition module is used for acquiring the audio parameters of the audio and video data being played by the playing equipment and the distance between the experimenter and the playing equipment;
the prediction module is used for predicting target light effect parameters corresponding to the audio parameters and the distance by adopting a pre-trained model;
and the control module is used for controlling the lamps for illuminating the experimenters and/or the playing equipment according to the target light effect parameters.
In a fourth aspect, there is provided an electronic device comprising: the device comprises a processor, a memory and a communication bus, wherein the processor and the memory are communicated with each other through the communication bus;
the memory is used for storing a computer program;
the processor is configured to execute the program stored in the memory, and implement the light control method according to the first aspect.
In a fifth aspect, a computer readable storage medium is provided, in which a computer program is stored, where the computer program, when executed by a processor, implements the light control method according to the first aspect.
Compared with the prior art, the technical scheme provided by the embodiments of the application has the following advantages: according to the method provided by the embodiments, the audio parameters of the audio and video data being played by the playing device and the distance between the experimenter and the playing device are obtained; a pre-trained model is used to predict the target light effect parameters corresponding to the audio parameters and the distance; and a lamp illuminating the experimenter and/or the playing device is controlled according to the target light effect parameters. With this scheme, the light effect of the lamp can be controlled by the audio parameters of the audio and video data, so that the experimenter has an immersive experience while enjoying the audio and video data.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the invention or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, and it will be obvious to a person skilled in the art that other drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic flow chart of a light control method in an embodiment of the present application;
FIG. 2 is a schematic flow chart of a light control method according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a light control device according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a light control system according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a light control device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
For the purposes of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present application based on the embodiments herein.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
In order to solve the technical problems in the related art, the embodiment of the application provides a light control method which can be applied to electronic equipment.
The electronic device described in the embodiments of the present application may include a terminal or a server, which is not limited in the embodiments of the present application. The terminal device may be a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a PDA (Personal Digital Assistant), a PMP (Portable Media Player), or a navigation apparatus, or a stationary terminal such as a digital TV or a desktop computer.
As shown in fig. 1, the method comprises the steps of:
step 101, obtaining audio parameters of audio and video data being played by the playing device and the distance between the experimenter and the playing device.
In this embodiment, the playing device may be a television, a projector, a handheld computer, etc., which is not limited in this embodiment.
In this embodiment, the experimenter is a user who experiences the audio and video data of the playing device; taking a television as the playing device, for example, the experimenter may be a user watching the television.
In application, the audio and video data may be video data or audio data, which is not particularly limited in this embodiment. For example, the audio and video data may be a film the user is watching, or an audio novel.
In this embodiment, the audio parameters may be parameters such as volume and tone, which are not particularly limited in this embodiment.
In this embodiment, the distance between the experimenter and the playing device may be determined by the geographic location.
In a specific implementation, in an alternative embodiment, a first geographic position of the experimenter and a second geographic position of the playing device are obtained; based on the first geographic position and the second geographic position, the distance between the experimenter and the playing device is calculated.
In this embodiment, the first geographic location and the second geographic location may both be represented by longitude and latitude. Wherein the formula adopted in calculating the distance between two points represented by longitude and latitude is as follows:
d = r * arccos[cos(Y1) * cos(Y2) * cos(x1 - x2) + sin(Y1) * sin(Y2)]
where d is the distance between the two points, which in this embodiment corresponds to the distance between the experimenter and the playing device; (x1, Y1) is the longitude and latitude of one point and (x2, Y2) is the longitude and latitude of the other point, so in this embodiment (x1, Y1) may be the first geographic position of the experimenter and (x2, Y2) the second geographic position of the playing device; r is the earth radius, typically r = 6371.0 km.
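By way of illustration only, a minimal Python sketch of this spherical-law-of-cosines calculation is given below; the function name and the sample coordinates are assumptions made for the example, not part of the embodiment.

```python
import math

EARTH_RADIUS_KM = 6371.0  # value of r used in the formula above

def experimenter_device_distance(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between (x1, Y1) and (x2, Y2), given in degrees,
    using d = r * arccos(cos(Y1)cos(Y2)cos(x1 - x2) + sin(Y1)sin(Y2))."""
    y1, y2 = math.radians(lat1), math.radians(lat2)
    dx = math.radians(lon1 - lon2)
    cos_d = math.cos(y1) * math.cos(y2) * math.cos(dx) + math.sin(y1) * math.sin(y2)
    cos_d = max(-1.0, min(1.0, cos_d))  # guard against rounding slightly outside [-1, 1]
    return EARTH_RADIUS_KM * math.acos(cos_d)

# example: first geographic position of the experimenter, second of the playing device
d_km = experimenter_device_distance(22.2700, 113.5770, 22.2701, 113.5771)
```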
In this embodiment, the positions of the playing device and the experimenter in the smart home space may be obtained through image recognition, so as to obtain the first geographic position of the experimenter and the second geographic position of the playing device.
In this embodiment, the audio parameters of the audio/video data may be extracted by a PID algorithm.
In a specific implementation, in an alternative embodiment, a PID algorithm is used to convert audio and video data into logic pulses; the logic pulses are used as audio parameters of the audio-video data.
The formula adopted by the PID algorithm is as follows:
U(t) = k_p * [err(t) + (1/T_t) * ∫ err(t) dt + T_d * d(err(t))/dt]
where k_p represents the proportional gain, T_d is the differential time constant, T_t is the integration time constant, and U(t) is the output signal; in this embodiment, U(t) is the audio parameter of the audio and video data; t is time, and err(t) is the error of the audio and video frames at different moments.
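As an illustration only, a discrete-time Python sketch of such a PID-based conversion is given below; the gains, sampling period, reference level and the rule for thresholding the output into a logic pulse are assumptions, since the embodiment does not specify them.

```python
class AudioPIDConverter:
    """Discrete approximation of U(t) = k_p*(err(t) + (1/T_t)*∫err dt + T_d*d(err)/dt),
    followed by thresholding the output signal U(t) into a logic pulse."""

    def __init__(self, k_p=1.0, t_t=0.5, t_d=0.05, dt=0.02, threshold=0.5):
        self.k_p, self.t_t, self.t_d, self.dt = k_p, t_t, t_d, dt
        self.threshold = threshold
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, audio_level, reference_level=0.0):
        err = audio_level - reference_level        # error of the current audio frame
        self.integral += err * self.dt
        derivative = (err - self.prev_err) / self.dt
        self.prev_err = err
        u = self.k_p * (err + self.integral / self.t_t + self.t_d * derivative)
        return 1 if u >= self.threshold else 0     # logic pulse used as the audio parameter
```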
And 102, predicting target lamplight effect parameters corresponding to the audio parameters and the distance by adopting a pre-trained model.
The algorithm used by the model in this embodiment includes, but is not limited to, the Apollo algorithm.
In application, the model may be trained in advance with training samples derived from big data. Each training sample is labelled with the input audio parameters and the input distance to be fed to the model, together with the output light effect parameters the model should produce for that input. After the model has been trained with these samples, inputting a specific audio parameter and a specific distance into the model causes it to output light effect parameters matched to that audio parameter and distance.
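For illustration, the sketch below trains a generic regressor to map (audio parameters, distance) to light effect parameters; the feature layout, the units, the sample values and the use of scikit-learn's MLPRegressor as a stand-in for the Apollo algorithm named above are all assumptions made only for the example.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Assumed layout of a training sample:
#   input  = [volume, pitch_hz, distance_m]
#   output = [light_brightness (0-1), light_color_temperature_K]
X = np.array([[0.80, 440.0, 2.5],
              [0.35, 220.0, 4.0],
              [0.95, 880.0, 1.5],
              [0.50, 330.0, 3.0]])
y = np.array([[0.90, 2700.0],
              [0.40, 4000.0],
              [1.00, 2200.0],
              [0.60, 3300.0]])

model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
model.fit(X, y)

# step 102: predict the target light effect for a specific audio parameter set and distance
target_effect = model.predict([[0.60, 350.0, 2.8]])[0]
```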
The light effect parameters in this embodiment may be light brightness, light color, etc., which is not specifically limited in this embodiment.
And 103, controlling the lamps for illuminating the experimenters and/or the playing equipment according to the target light effect parameters.
It should be understood that when the target light effect parameter is light brightness, the brightness of the lamp illuminating the experimenter and/or the playing device is adjusted according to that brightness; when the target light effect parameter is light color, the color of the lamp illuminating the experimenter and/or the playing device is adjusted according to that color; and when the target light effect parameters are light brightness and light color, both the brightness and the color of the lamp illuminating the experimenter and/or the playing device are adjusted.
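A minimal sketch of this control step is shown below; the lamp interface (set_brightness, set_color) is hypothetical and only adjusts whichever parameters are present.

```python
def control_lamp(lamp, brightness=None, color=None):
    """Adjust only the light effect parameters contained in the target
    light effect parameters; the lamp methods are assumed names."""
    if brightness is not None:
        lamp.set_brightness(brightness)  # target parameter includes light brightness
    if color is not None:
        lamp.set_color(color)            # target parameter includes light color
```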
In application, because the model is trained on big data, the target light effect it outputs may, owing to individual differences, fail to meet the actual needs of a particular experimenter. In that case, the experimenter can change the light effect parameters of the lamp according to his or her own needs. To make the model fit the experimenter better, the model may also be optimized based on the current light effect parameters of the lamp provided by the experimenter.
In a specific implementation, in an optional embodiment, after the lamp illuminating the experimenter and/or the playing device has been controlled according to the target light effect parameters, the current light effect parameters of the lamp are obtained; the current light effect parameters are provided by the experimenter; a training error of the model is calculated based on the current light effect parameters and the target light effect parameters; and the model is optimized based on the training error.
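Continuing the earlier regression sketch, one possible form of this feedback update is shown below; treating the experimenter-provided light effect as the desired output and using MLPRegressor.partial_fit as the optimization step are assumptions of the example, not requirements of the embodiment.

```python
import numpy as np

def update_model(model, audio_params, distance, target_effect, current_effect):
    """Compute the training error between the experimenter's current light effect
    and the predicted target light effect, then nudge the model toward the former."""
    x = np.array([[*audio_params, distance]])
    training_error = np.array(current_effect) - np.array(target_effect)  # per-parameter error
    model.partial_fit(x, np.array([current_effect]))  # one incremental optimization step
    return training_error
```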
In the technical scheme provided by this embodiment, the audio parameters of the audio and video data being played by the playing device and the distance between the experimenter and the playing device are obtained; a pre-trained model is used to predict the target light effect parameters corresponding to the audio parameters and the distance; and a lamp illuminating the experimenter and/or the playing device is controlled according to the target light effect parameters. With this scheme, the light effect of the lamp can be controlled by the audio parameters of the audio and video data, so that the experimenter has an immersive experience while enjoying the audio and video data.
Referring to fig. 2, fig. 2 is a schematic diagram of a light control method according to an embodiment of the present application, where the method may include the following steps:
step 201, acquiring audio and video data being played by a playing device;
step 202, converting audio and video data into logic pulses by adopting a PID algorithm to obtain audio parameters of the audio and video data;
step 203, acquiring a first geographic position of an experienter and a second geographic position of a playing device;
step 204, obtaining a distance between the experimenter and the playing device based on the first geographic position and the second geographic position;
step 205, predicting target lamplight effect parameters corresponding to the audio parameters and the distance by adopting a pre-trained model;
step 206, controlling the lamps for illuminating the experimenters and/or the playing equipment according to the target light effect parameters;
step 207, obtaining current lamplight effect parameters of the lamp; the current light effect parameters are provided by the experimenter;
step 208, calculating training errors of the model based on the current light effect parameters and the target light effect parameters;
step 209, optimizing the model based on the training error.
It should be understood that this example does not limit the execution order between steps 201-202 and steps 203-204: steps 201-202 may be performed before steps 203-204, or after them. When steps 201-202 are performed after steps 203-204, the step numbers of the two groups are simply swapped.
In the technical scheme provided by this embodiment, the audio parameters in the audio and video data are obtained by a PID algorithm, the distance between the experimenter and the playing device is obtained from geographic positions, a pre-trained model is then used to obtain the target light effect parameters corresponding to the audio parameters and the distance, and finally the lamp is controlled according to the target light effect parameters. The model may also be optimized based on the current light effect parameters provided by the experimenter. With this scheme, the light effect of the lamp can be controlled by the audio parameters of the audio and video data, so that the experimenter has an immersive experience while enjoying the audio and video data. Optimizing the model with the current light effect parameters also makes the model match the experimenter better, so the target light effect parameters it outputs give the user a more realistic feeling.
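Putting the pieces together, steps 201-209 could be driven by a loop like the sketch below, which reuses the hypothetical helpers from the earlier sketches; the device interfaces and the exact audio-parameter layout fed to the model are likewise assumptions, and, as noted above, the audio branch and the distance branch may run in either order.

```python
def light_control_cycle(playing_device, experimenter, lamp, model, pid):
    """One pass through steps 201-209 (all interface names are hypothetical)."""
    audio_level = playing_device.current_audio_level()                     # step 201
    audio_params = [pid.step(audio_level)]                                 # step 202
    x1, y1 = experimenter.geographic_position()                            # step 203 (lon, lat)
    x2, y2 = playing_device.geographic_position()
    distance = experimenter_device_distance(y1, x1, y2, x2)                # step 204
    target = model.predict([[*audio_params, distance]])[0]                 # step 205
    control_lamp(lamp, brightness=target[0], color=target[1])              # step 206
    current = lamp.current_light_effect()                                  # step 207 (as set by the experimenter)
    update_model(model, audio_params, distance, target, current)           # steps 208-209
```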
Based on the same inventive concept, an embodiment of the present application provides a light control apparatus, as shown in fig. 3, which may include:
the system comprises a video and audio signal acquisition module 301, a central control module 302, a bus adaptation module 303, a node execution module 304 and a light effect acquisition and feedback module 305.
The video and audio signal acquisition module 301 directly analyzes the audio and video input by a TV (television), converts the audio signal and/or video signal strength into synchronously transmitted logic pulses through PID processing, and transmits the logic pulses to the central control module 302. The light effect acquisition and feedback module 305 collects the light effect near the viewer and synchronously feeds the signal back to the central control module 302. The central control module 302 selects an optimal light path for processing using the Apollo algorithm and transmits it to the node execution module 304 through the bus adaptation module 303. After receiving the control command, the node execution module 304 adjusts the light so that it best matches the picture and the sound effect, giving the viewer the feeling of being in the scene.
In this embodiment, the video and audio signal acquisition module 301 needs no dedicated sensor to acquire signals: it directly intercepts and analyzes the TV audio and video input, discriminates the strength of the resulting analog signal, and processes it with a PID algorithm. A combination of a decoder and logic circuits converts the audio/video signal strength into logic pulses for synchronous transmission, and the logic pulses are transmitted to the central control module 302.
In this embodiment, the central control module 302 uses a single-chip microcomputer to perform the central control. The uplink uses a parallel data interface, and the same uplink parallel interface also receives downlink signal feedback. The Apollo algorithm is built into the controller, and the downlink interface adopts a serial node-feedback control mode.
In this embodiment, the central control module 302 may be implemented by a single chip microcomputer, which is not limited in this embodiment.
The bus adaptation module 303 is responsible for communication and signal conversion between the central control module 302 and the downlink nodes. It forwards downlink control commands from the central control module 302, transmits status commands after comparing the status feedback of the lower nodes with the uplink interface signals of the central control module 302, allocates addresses among the downlink nodes, and provides CAN-mode RS232 communication.
After a node receives a control command, the node execution module 304 controls the lamp attached to that node. The node uplink adopts serial synchronous reception, decoding and interpretation, control command sending, control command execution, execution state feedback, and communication on the buses between nodes.
The light effect collection and feedback module 305 collects light effects near the viewer, directly communicates with the central control module 302, and feeds back synchronous light effects for the viewer.
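Purely to illustrate the data flow between these modules, a simplified software sketch follows; in the actual design the modules are hardware (single-chip microcomputer, bus adapter, execution nodes), so every class and method name here is an assumption.

```python
class BusAdaptationModule:
    """Forwards downlink commands to the node execution modules and collects
    their status feedback (simplified stand-in for the CAN/RS232 adaptation)."""

    def __init__(self, nodes):
        self.nodes = nodes

    def send(self, command):
        return [node.execute(command) for node in self.nodes]


class CentralControlModule:
    """Receives logic pulses from the acquisition module and light-effect
    feedback, chooses a lighting command, and dispatches it via the bus adapter."""

    def __init__(self, bus_adapter, choose_command):
        self.bus_adapter = bus_adapter
        self.choose_command = choose_command  # e.g. the trained model acting as selector

    def on_signals(self, logic_pulses, light_feedback):
        command = self.choose_command(logic_pulses, light_feedback)
        return self.bus_adapter.send(command)
```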
Based on the same inventive concept, the embodiments of the present application provide a light control system; for its specific implementation, reference may be made to the description of the method embodiments, which is not repeated here. As shown in fig. 4, the system may include:
a controller 401, a playback device 402 and a luminaire 403;
the controller 401 is configured to obtain audio parameters of audio and video data being played by the playing device 402, and a distance between the experimenter and the playing device 402; predicting target light effect parameters corresponding to the audio parameters and the distance by adopting a pre-trained model; the luminaires 403 illuminating the experimenters and/or the playback devices are controlled according to the target light effect parameters.
Based on the same inventive concept, the embodiments of the present application provide a light control device; for its specific implementation, reference may be made to the description of the method embodiments, which is not repeated here. As shown in fig. 5, the device may include:
the obtaining module 501 is configured to obtain audio parameters of audio and video data being played by the playing device, and a distance between an experimenter and the playing device;
the prediction module 502 is configured to predict a target lighting effect parameter corresponding to the audio parameter and the distance by using a pre-trained model;
a control module 503, configured to control a lamp that illuminates the experimenter and/or the playing device according to the target lighting effect parameter.
The device is also used for:
according to the target light effect parameters, after controlling the lamps for illuminating the experimenters and/or the playing equipment, obtaining the current light effect parameters of the lamps; the current light effect parameters are provided by the experimenter;
calculating training errors of the model based on the current light effect parameters and the target light effect parameters;
the model is optimized based on the training error.
The acquisition module 501 is configured to:
converting the audio and video data into logic pulses by adopting a PID algorithm;
the logic pulses are used as audio parameters of the audio-video data.
The acquisition module 501 is configured to:
acquiring a first geographic position of an experienter and a second geographic position of playing equipment;
based on the first geographic location and the second geographic location, a distance between the experimenter and the playing device is calculated.
The light effect parameters include:
light brightness and/or light color.
The device is also used for
Before a pre-trained model is adopted to predict target light effect parameters corresponding to the audio parameters and the distances, at least one training sample is obtained, wherein any training sample comprises input audio parameters and input distances input by the model, and output light effect parameters corresponding to the input audio parameters and the input distances;
the model is trained using at least one training sample.
Based on the same concept, the embodiment of the application also provides an electronic device, as shown in fig. 6, where the electronic device mainly includes: processor 601, memory 602, and communication bus 606, wherein processor 601 and memory 602 communicate with each other via communication bus 606. The memory 602 stores a program executable by the processor 601, and the processor 601 executes the program stored in the memory 602 to implement the following steps:
acquiring audio parameters of audio and video data being played by the playing device and the distance between an experimenter and the playing device; predicting target light effect parameters corresponding to the audio parameters and the distance by adopting a pre-trained model; and controlling a lamp for illuminating the experimenter and/or the playing equipment according to the target light effect parameters.
The communication bus 606 mentioned for the above electronic device may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus 606 may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 6, but this does not mean there is only one bus or only one type of bus.
The memory 602 may include random access memory (Random Access Memory, simply RAM) or may include non-volatile memory (non-volatile memory), such as at least one disk memory. Alternatively, the memory may be at least one memory device located remotely from the aforementioned processor 601.
The processor 601 may be a general-purpose processor including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), a digital signal processor (Digital Signal Processing, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a Field programmable gate array (Field-Programmable Gate Array, FPGA), or other programmable logic device, discrete gate or transistor logic device, or discrete hardware components.
In a further embodiment of the present application, there is also provided a computer-readable storage medium having stored therein a computer program which, when run on a computer, causes the computer to perform the light control method described in the above embodiments.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, by a wired (e.g., coaxial cable, optical fiber, digital Subscriber Line (DSL)), or wireless (e.g., infrared, microwave, etc.) means from one website, computer, server, or data center to another. The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains an integration of one or more available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape, etc.), an optical medium (e.g., DVD), or a semiconductor medium (e.g., solid state disk), etc.
It should be noted that in this document, relational terms such as "first" and "second" and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The foregoing is only a specific embodiment of the invention to enable those skilled in the art to understand or practice the invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. A method of controlling light comprising:
acquiring audio parameters of audio and video data being played by a playing device and the distance between an experimenter and the playing device;
predicting target light effect parameters corresponding to the audio parameters and the distance by adopting a pre-trained model;
and controlling a lamp for illuminating the experimenter and/or the playing equipment according to the target light effect parameters.
2. The method of claim 1, wherein after controlling the light fixtures illuminating the experimenter and/or the playback device according to the target light effect parameters, further comprising:
acquiring current light effect parameters of the lamp; the current light effect parameters are provided by the experimenter;
calculating a training error of the model based on the current light effect parameter and the target light effect parameter;
optimizing the model based on the training error.
3. The method of claim 1, wherein obtaining audio parameters of audio-visual data being played by the playback device comprises:
converting the audio and video data into logic pulses by adopting a PID algorithm;
and taking the logic pulse as an audio parameter of the audio-video data.
4. The method of claim 1, wherein obtaining a distance between an experimenter and the playback device comprises:
acquiring a first geographic position of the experimenter and a second geographic position of the playing device;
based on the first geographic location and the second geographic location, a distance between the experimenter and the playing device is calculated.
5. A method according to any one of claims 1-4, wherein the light effect parameters comprise:
light brightness and/or light color.
6. The method of claim 1, wherein prior to predicting the target light effect parameter corresponding to the audio parameter and the distance using a pre-trained model, further comprising:
acquiring at least one training sample, wherein any training sample comprises an input audio parameter and an input distance which are input by the model, and an output light effect parameter corresponding to the input audio parameter and the input distance;
training the model using the at least one training sample.
7. A light control system, comprising:
the device comprises a controller, playing equipment and a lamp;
the controller is used for acquiring audio parameters of audio and video data being played by the playing device and the distance between the experimenter and the playing device; predicting target light effect parameters corresponding to the audio parameters and the distance by adopting a pre-trained model; and controlling a lamp for illuminating the experimenter and/or the playing equipment according to the target light effect parameters.
8. A light control apparatus, comprising:
the acquisition module is used for acquiring the audio parameters of the audio and video data being played by the playing equipment and the distance between the experimenter and the playing equipment;
the prediction module is used for predicting target light effect parameters corresponding to the audio parameters and the distance by adopting a pre-trained model;
and the control module is used for controlling the lamps for illuminating the experimenters and/or the playing equipment according to the target light effect parameters.
9. An electronic device, comprising: the device comprises a processor, a memory and a communication bus, wherein the processor and the memory are communicated with each other through the communication bus;
the memory is used for storing a computer program;
the processor is configured to execute a program stored in the memory, and implement the light control method according to any one of claims 1 to 6.
10. A computer readable storage medium storing a computer program, wherein the computer program when executed by a processor implements the light control method of any one of claims 1-6.
CN202211669133.9A 2022-12-23 2022-12-23 Light control method, device, system, equipment and storage medium Pending CN116133197A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211669133.9A CN116133197A (en) 2022-12-23 2022-12-23 Light control method, device, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211669133.9A CN116133197A (en) 2022-12-23 2022-12-23 Light control method, device, system, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116133197A true CN116133197A (en) 2023-05-16

Family

ID=86309259

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211669133.9A Pending CN116133197A (en) 2022-12-23 2022-12-23 Light control method, device, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116133197A (en)

Similar Documents

Publication Publication Date Title
US10966044B2 (en) System and method for playing media
US10991462B2 (en) System and method of controlling external apparatus connected with device
KR102177830B1 (en) System and method for controlling external apparatus connenced whth device
US11217241B2 (en) Method for providing content and electronic device supporting the same
RU2479018C2 (en) Method, system and user interface for automatically creating atmosphere, particularly lighting atmosphere based on keyword input
CN102845076B (en) Display apparatus, control apparatus, television receiver, method of controlling display apparatus, program, and recording medium
US20080046944A1 (en) Ubiquitous home media service apparatus and method based on smmd, and home media service system and method using the same
TW201719333A (en) A voice controlling system and method
KR20160127737A (en) Information processing apparatus, information processing method, and program
JP2017010516A (en) Method, apparatus, and terminal device for human-computer interaction based on artificial intelligence
CN103929662A (en) Electronic Apparatus And Method Of Controlling The Same
KR102175165B1 (en) System and method for controlling external apparatus connenced whth device
CN112136102B (en) Information processing apparatus, information processing method, and information processing system
CN114067798A (en) Server, intelligent equipment and intelligent voice control method
KR102535152B1 (en) Display device and method for controlling thereof
US11012780B2 (en) Speaker system with customized audio experiences
KR20190059509A (en) Electronic apparatus and the control method thereof
CN116133197A (en) Light control method, device, system, equipment and storage medium
WO2022268136A1 (en) Terminal device and server for voice control
KR101625900B1 (en) Smart Audio apparatus based on IoT
CN113608449B (en) Speech equipment positioning system and automatic positioning method in smart home scene
US20230261897A1 (en) Display device
CN105981479A (en) Balance adjustment control method for sound/illumination devices
US11818820B2 (en) Adapting a lighting control interface based on an analysis of conversational input
CN109819297A (en) A kind of method of controlling operation thereof and set-top box

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination