CN115038210A - Indoor light adjusting method and adjusting system - Google Patents


Publication number
CN115038210A
Authority
CN
China
Prior art keywords: information, state, emotional, emotion, person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210739561.8A
Other languages
Chinese (zh)
Inventor
张彩兰
张昊川
涂中秋
Current Assignee
Shenzhen Cl Lighting Technology Co ltd
Original Assignee
Shenzhen Cl Lighting Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Cl Lighting Technology Co ltd filed Critical Shenzhen Cl Lighting Technology Co ltd
Priority to CN202210739561.8A
Publication of CN115038210A
Legal status: Pending

Classifications

    • H05B 45/20: Circuit arrangements for operating light-emitting diodes [LED]; controlling the colour of the light
    • H05B 47/115: Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B 47/12: Controlling the light source by detecting audible sound
    • H05B 47/125: Controlling the light source by using cameras
    • Y02B 20/40: Energy-efficient lighting technologies; control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The invention relates to the technical field of light control, and in particular to an indoor light adjusting method and an indoor light adjusting system. The method comprises the following steps: acquiring voice information, face information, infrared scanning information and user behavior information of indoor personnel; analyzing the voice information, the face information, the infrared scanning information and the user behavior information to obtain the emotional state of the person, wherein the emotional state comprises a normal emotional state and an abnormal emotional state; generating an illumination mode according to the abnormal emotional state when the emotional state of the person is an abnormal emotional state; and controlling the lamp to illuminate according to the illumination mode. With the method and system, different illumination modes are adjusted automatically according to the user's emotion, improving the user experience.

Description

Indoor light adjusting method and adjusting system
Technical Field
The invention relates to the technical field of light control, in particular to an indoor light adjusting method and an indoor light adjusting system.
Background
At present, there are LED lamps with adjustable emotion modes: such a lamp offers several preset emotion modes, emits light of a different color in each mode, and lets the user select a mode according to their own needs. However, these products cannot actively adjust the light for the user; the user has to select the lamp mode manually, which makes for a poor user experience.
Disclosure of Invention
In order to automatically adjust different illumination modes according to the emotion of a user and improve the user experience, the present application provides an indoor light adjusting method and an indoor light adjusting system.
The above object of the present invention is achieved by the following technical solutions:
an indoor light adjusting method, comprising:
acquiring voice information, face information, infrared scanning information and user behavior information of indoor personnel;
analyzing the voice information, the face information, the infrared scanning information and the user behavior information to obtain the emotional state of the person, wherein the emotional state comprises a normal emotional state and an abnormal emotional state;
generating an illumination mode according to the abnormal emotional state when the emotional state of the person is an abnormal emotional state;
and controlling the lamp to illuminate according to the illumination mode.
By adopting the above technical solution, when a person is indoors, the system analyzes the person's emotional state from the acquired voice information, face information, infrared scanning information and user behavior information. When the person's emotion is abnormal, a corresponding illumination mode is generated according to the specific abnormal emotional state detected, and the lamp is then controlled to illuminate accordingly, so that different illumination modes are adjusted automatically according to the user's emotion and the user experience is improved.
The present application may be further configured in a preferred example to: the emotional state of the person is obtained by analyzing the voice information, the face information, the infrared scanning information and the user behavior information, wherein the emotional state of the person comprises a normal emotional state and an abnormal emotional state, and the method comprises the following steps:
analyzing the voice information to obtain a voice emotion state;
analyzing the face information to obtain the face emotion state;
analyzing the infrared scanning information to obtain an infrared temperature emotional state;
and inputting the voice emotion state, the face emotion state, the infrared temperature emotion state and the user behavior information into an emotion weight analysis model, and reasoning to obtain the emotion state of the person.
By adopting the above technical solution, the three collected sensor streams are analyzed separately to obtain a corresponding voice emotional state, face emotional state and infrared temperature emotional state. These three states, together with the user behavior information, are then input into a pre-trained emotion weight analysis model, which infers the person's actual emotional state. In this way the emotion reflected by each kind of information is taken into account, and a second round of model inference over all of the information, combined with the user behavior information, yields the person's true emotion, which is used to control the lighting of the lamp.
The application may be further configured in a preferred example to: further comprising:
and generating an illumination mode according to the user behavior information under the condition that the emotion state of the person is a normal emotion state.
By adopting the above technical solution, the system also acquires the user behavior information of indoor personnel; when the person's emotional state is normal, the system matches a suitable illumination mode for the person according to the specific user behavior information, such as dancing or running.
The application may be further configured in a preferred example to: further comprising:
when the emotional state of the person is abnormal, analyzing whether a child-crying state exists according to the voice information and the infrared scanning information;
when a child-crying state exists, controlling the lamp to illuminate according to a preset rule.
By adopting the above technical solution, when the system detects a child crying in the room, it controls the lamp to illuminate according to the preset rule, so as to comfort the child and attract the child's attention to stop the crying.
The present application may be further configured in a preferred example to: the controlling the lamp to illuminate according to the preset rule comprises:
and controlling the lamp to flash and illuminate within a preset time according to a preset frequency, a preset brightness and a preset color.
The present application may be further configured in a preferred example to: controlling the lighting of the luminaire according to the illumination pattern, comprising:
when the emotional state of the person is an abnormal emotional state, controlling the lighting of the lamp to change suddenly to the illumination mode.
By adopting the above technical solution, when the person's emotion is abnormal, the sudden change of light reminds the person to relax and recover from the abnormal emotional state.
The present application may be further configured in a preferred example to: and under the condition that the emotion state of the person is an emotion normal state, gradually controlling the lighting of the lamp according to the illumination mode.
The second object of the invention is achieved by the following technical solution:
an indoor light regulating system comprising:
the acquisition module is used for acquiring voice information, face information, infrared scanning information and user behavior information of indoor personnel;
the analysis module is used for analyzing according to the voice information, the face information, the infrared scanning information and the user behavior information to obtain the emotional states of the personnel, wherein the emotional states of the personnel comprise normal emotional states and abnormal emotional states;
the illumination mode generation module is used for generating an illumination mode according to the emotional abnormal state under the condition that the emotional state of the person is the emotional abnormal state;
and the control module is used for controlling the illumination of the lamp according to the illumination mode.
The application may be further configured in a preferred example to: the analysis module includes:
the first analysis unit is used for analyzing the voice information to obtain a voice emotion state;
the second analysis unit is used for analyzing the face information to obtain the face emotion state;
the third analysis unit is used for analyzing the infrared scanning information to obtain an infrared temperature emotional state;
and the weight analysis unit is used for inputting the speech emotion state, the face emotion state, the infrared temperature emotion state and the user behavior information into the emotion weight analysis model and deducing to obtain the emotion state of the person.
In summary, the present application includes at least one of the following beneficial technical effects:
1. when a person is indoors, the system obtains the person's emotional state by analyzing the voice information, face information, infrared scanning information and user behavior information of the person; when the person's emotion is abnormal, a corresponding illumination mode is generated according to the detected abnormal emotional state and the lamp is controlled accordingly, so that different illumination modes are adjusted automatically according to the user's emotion and the user experience is improved;
2. the three collected sensor streams are analyzed separately to obtain the corresponding voice, face and infrared temperature emotional states, which are then input, together with the user behavior information, into a pre-trained emotion weight analysis model that infers the person's actual emotional state; the emotion reflected by each kind of information is thereby taken into account, and a second round of model inference yields the person's true emotion to control the lighting of the lamp;
3. when the system detects a child crying indoors, it controls the lamp to illuminate according to a preset rule, so as to comfort the child and attract the child's attention to stop the crying; when a person's emotion is abnormal, a sudden change of light reminds the person to recover from the abnormal emotional state.
Drawings
Fig. 1 is a flowchart illustrating an implementation of an indoor light adjusting method according to an embodiment of the present disclosure;
fig. 2 is a flowchart illustrating an implementation of step S13 in the method for adjusting indoor light according to an embodiment of the present application;
FIG. 3 is a flow chart of an implementation of a method for adjusting indoor lighting in another embodiment of the present application;
fig. 4 is a schematic diagram of modules of an indoor light adjusting system according to an embodiment of the present disclosure.
Detailed Description
The following description of the exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments of the application for the understanding of the same, which are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
It should be noted that the terms "first", "second", etc. in the present invention are used for distinguishing similar objects, and are not necessarily used for describing a particular order or sequence. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in other sequences than those illustrated or described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure.
In addition, the term "and/or" herein is only one kind of association relationship describing an associated object, and means that there may be three kinds of relationships, for example, a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship, unless otherwise specified.
Fig. 1 is a flowchart illustrating an implementation of an indoor light adjusting method according to an embodiment of the present application, where the indoor light adjusting method includes:
s11, acquiring voice information, face information, infrared scanning information and user behavior information of indoor personnel;
specifically, a voice collecting device such as a sound receiver, a sound pickup and the like is installed indoors, a shooting module such as a camera and an infrared scanner are installed indoors, the voice collecting device is used for acquiring voice information, the shooting module is used for acquiring face information of people, the infrared scanner is used for acquiring infrared scanning information of the people indoors, the voice collecting device, the shooting module and the infrared scanner are all in communication connection with a server of the system and send the acquired information to the server of the system, and particularly, communication is preferably achieved in a wireless communication mode such as a LoRa technology, a WiFi/IEEE 802.11 protocol, a ZigBee/802.15.4 protocol, a Thread/IEEE 802.15.4, a Z-Wave protocol and the like.
The user behavior information can be acquired by using the shooting module to obtain indoor video stream data and then running a behavior model over a preset number of seconds of that data to infer the corresponding indoor user behavior. Specifically, the behavior model is obtained by the following training procedure:
labeling each video stream data sample in a video stream data sample training set with its user behavior information, where the user behavior information is associated with all or part of the information in the sample; and training a neural network on the labeled training set to obtain the behavior model.
The user behavior information includes, but is not limited to, eating, sleeping, chatting, running, singing, washing dishes, sitting, and fighting, and the behavior inferred by the model may comprise several behaviors at once; for example, the user behavior information inferred from the video stream data may include both washing dishes and chatting.
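The multi-label behavior inference described above can be sketched as follows. This is a hypothetical illustration, not the patent's implementation: the trained video classifier is stubbed out as a dictionary of per-label scores, and the threshold value is an assumption.

```python
# Hypothetical sketch of multi-label behavior inference over a short
# video clip. The classifier itself is stubbed out; in the patent it
# would be a neural network trained on labeled video-stream samples.

def infer_behaviors(clip_scores, threshold=0.5):
    """Return every behavior whose model score reaches the threshold.

    clip_scores: dict mapping a behavior label to a score in [0, 1],
    e.g. the sigmoid outputs of a multi-label video classifier.
    """
    return sorted(label for label, score in clip_scores.items()
                  if score >= threshold)

# Example: the clip shows someone washing dishes while chatting.
scores = {"washing_dishes": 0.91, "chatting": 0.78, "running": 0.03}
print(infer_behaviors(scores))  # ['chatting', 'washing_dishes']
```

Returning a list rather than a single label matches the text's point that one clip can carry several simultaneous behaviors.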
S13, analyzing according to the voice information, the face information, the infrared scanning information and the user behavior information to obtain the emotional state of the person;
specifically, the emotional state of the person includes a normal emotional state and a disordered emotional state, and the emotional state of the person may include: anger, sadness, joy, panic, fear, thoughts, excitement, etc.; wherein, joy, thoughts and excitement are normal states of emotion, anger, sadness, panic and fear are abnormal states of emotion;
referring to fig. 2, S13 includes:
s131, analyzing the voice information to obtain a voice emotion state;
specifically, the speech emotion state is obtained by reasoning through a pre-trained speech model, and the speech model is obtained by training in the following way:
labeling each voice information sample in the voice information sample training set to label the voice emotion state of each voice information sample, wherein the voice emotion state is associated with all or part of information in the voice information samples; and training the neural network through the labeled voice information sample training set to obtain a voice model.
Wherein the voice emotion state comprises angry voice emotion, sad voice emotion, happy voice emotion, panic voice emotion, fear voice emotion, thoughtful voice emotion and the like;
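The label-then-train pattern just described is shared by the voice, face and infrared models. A minimal runnable sketch follows; the neural network is replaced by a nearest-centroid classifier, and the two-dimensional features (standing in for raw audio) are invented for illustration.

```python
# Hypothetical illustration of the label-then-train pattern. A
# nearest-centroid classifier stands in for the neural network; the
# (pitch, energy) features stand in for the raw voice samples.

from collections import defaultdict

def train(labeled_samples):
    """labeled_samples: list of ((x0, x1), emotion_label) pairs."""
    sums = defaultdict(lambda: [0.0, 0.0])
    counts = defaultdict(int)
    for (x0, x1), label in labeled_samples:
        sums[label][0] += x0
        sums[label][1] += x1
        counts[label] += 1
    # One centroid per labeled emotion.
    return {lab: (s[0] / counts[lab], s[1] / counts[lab])
            for lab, s in sums.items()}

def predict(model, x):
    # Choose the emotion whose centroid is nearest to the input.
    return min(model, key=lambda lab: (model[lab][0] - x[0]) ** 2 +
                                       (model[lab][1] - x[1]) ** 2)

# Labeled training set: each sample is tagged with its emotion state.
samples = [((0.9, 0.8), "angry"), ((0.8, 0.9), "angry"),
           ((0.2, 0.1), "sad"), ((0.1, 0.2), "sad")]
model = train(samples)
print(predict(model, (0.85, 0.85)))  # angry
```

The patent's actual models operate on audio, images and infrared scans; only the labeling-then-training structure is illustrated here.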
s133, analyzing the face information to obtain the face emotion state;
specifically, the face information is a face image, the emotion state of the face is obtained by reasoning through a pre-trained face model, and the face model is obtained by training in the following way:
labeling each face information sample in the face information sample training set to label the face emotion state of each face information sample, wherein the face emotion state is associated with all or part of information in the face information samples; and training the neural network through the labeled human face information sample training set to obtain a human face model. The human face emotional state comprises angry expression emotion, sad expression emotion, joyful expression emotion, panic expression emotion, fear expression emotion, thoughtful expression emotion and the like;
s135, analyzing the infrared scanning information to obtain an infrared temperature emotional state;
specifically, the infrared scanning information is an infrared scanning image of a person, and the temperature of each part in the infrared scanning image obtained by scanning with an infrared scanner is different, so that the colors displayed by the image for representing the temperature are different, the emotional state of the infrared temperature is obtained by reasoning through a pre-trained infrared discrimination model, and the infrared discrimination model is obtained by training in the following way:
labeling each infrared scanning image sample in the infrared scanning image sample training set to label the infrared temperature emotional state of each infrared scanning image sample, wherein the infrared temperature emotional state is associated with all or part of information in the infrared scanning image sample; and training the neural network through the infrared scanning image sample training set subjected to labeling processing to obtain an infrared discrimination model. Further, displaying colors of different areas in the infrared scanning image sample, and corresponding to each infrared temperature emotion state of the training result, wherein the infrared temperature emotion states comprise angry infrared temperature emotion, sad infrared temperature emotion, joy infrared temperature emotion, panic infrared temperature emotion, fear infrared temperature emotion, thoughts infrared temperature emotion and the like;
and S137, inputting the voice emotional state, the face emotional state, the infrared temperature emotional state and the user behavior information into an emotional weight analysis model, and reasoning to obtain the emotional state of the person.
The emotion weight analysis model is obtained by training in the following way:
labeling each emotion weight sample in an emotion weight information sample training set with the actual emotional state of the person, where each group of emotion weight information samples comprises a voice emotional state, a face emotional state, an infrared temperature emotional state and user behavior information, and the actual emotional state is associated with all or part of the information in the sample; and training a neural network on the labeled training set to obtain the emotion weight analysis model.
The emotional state of the person includes, but is not limited to, normal, anger, sadness, joy, panic, fear, thoughtfulness, excitement, and so on. For example, one group of emotion weight information samples might contain a fearful voice emotion, an angry expression emotion, an angry infrared temperature emotion, and the behaviors chatting and running, while the actual emotional state of the person is excitement; that group is then labeled with excitement, and the neural network is trained on many such labeled groups to obtain the emotion weight analysis model.
In this way, two-level model inference is used: the results inferred by the four first-level models are input into the second-level model, the emotion reflected by each kind of information is taken into account, and all of the information is summarized and inferred over again to obtain the person's true emotion for controlling the lighting of the lamp.
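The second-level fusion step can be sketched as a weighted vote. This is a hypothetical simplification: the weights below are invented constants, whereas in the patent the combination is learned by the emotion weight analysis model.

```python
# Hypothetical sketch of the two-level inference: three first-level
# emotional states plus a behavior-derived hint feed a second-level
# fusion step. A fixed weighted vote stands in for the trained
# emotion weight analysis model.

from collections import Counter

# Assumed channel weights; in the patent these are learned, not fixed.
WEIGHTS = {"voice": 0.3, "face": 0.4, "infrared": 0.2, "behavior": 0.1}

def fuse(voice, face, infrared, behavior_hint):
    votes = Counter()
    votes[voice] += WEIGHTS["voice"]
    votes[face] += WEIGHTS["face"]
    votes[infrared] += WEIGHTS["infrared"]
    votes[behavior_hint] += WEIGHTS["behavior"]
    # The emotion with the largest accumulated weight wins.
    return votes.most_common(1)[0][0]

# Face and infrared agree on anger, outvoting the voice channel.
print(fuse("fear", "anger", "anger", "anger"))  # anger
```

A learned model can capture cases like the excitement example above, where the fused label differs from every single channel; a fixed vote cannot, which is why the patent trains this stage on labeled sample groups.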
S15, generating an illumination mode according to the abnormal emotional state when the emotional state of the person is an abnormal emotional state;
That is, the corresponding illumination mode is matched from a preset lookup table, a table prepared in advance for querying illumination modes. The illumination mode is stored as the value of a key-value pair, recorded in the table beforehand, so that subsequently only the key is needed to look up the corresponding illumination mode. Here each abnormal emotional state is a key and the illumination mode is its value, so each abnormal emotional state can be used as a key to look up the corresponding illumination mode. The illumination mode comprises an illumination brightness and an illumination color.
And S17, generating an illumination mode according to the user behavior information under the condition that the emotion state of the person is normal.
When the emotional state of the person is normal, for example, the emotional state of the person is happy, a proper illumination mode can be matched for the person according to specific user behavior information such as dancing, running and the like; specifically, the matching of the illumination mode through the user behavior information is also queried through a preset query table, and the preset query table also stores the corresponding relationship between each user behavior information and the illumination mode in advance. In this case, the user behavior information is the key in the key value pair, and the illumination mode is the value in the key value pair. Therefore, the corresponding illumination mode can be searched by taking the behavior information of each user as a key.
And S19, controlling the lamp to illuminate according to the illumination mode.
Specifically, the light fixture may include a plurality of light bulbs, and S19 includes:
and S191, under the condition that the emotional state of the person is an emotional disorder state, controlling the lamp to illuminate according to the abrupt change of the illumination mode.
And S193, under the condition that the emotion state of the person is an emotion normal state, gradually controlling the lamp to illuminate according to the illumination mode.
That is, after the corresponding illumination mode is matched: when the emotional state of the person is abnormal, for example an angry state, the lamp is controlled to change suddenly, i.e. instantaneously, from the original illumination mode to the matched illumination mode; when the emotional state of the person is normal, for example a happy state, the lamp is controlled to change gradually from the original illumination mode to the matched illumination mode, where gradually means a slow, gradient change over a time span, for example reaching the matched illumination mode within 10 s;
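The sudden-versus-gradual behavior can be sketched as follows. The 10 s span comes from the text; the step rate is an assumption, and only brightness is interpolated for brevity.

```python
# Sketch of the sudden vs. gradual transition in S191/S193. "Sudden"
# applies the target mode instantaneously; "gradual" interpolates
# brightness over 10 s. The steps-per-second value is an assumption.

def transition_steps(current, target, seconds=10, steps_per_second=2):
    """Yield intermediate brightness values for a gradual change."""
    n = seconds * steps_per_second
    delta = (target - current) / n
    return [round(current + delta * (i + 1), 2) for i in range(n)]

def apply_mode(current_brightness, target_brightness, abnormal):
    if abnormal:
        # Abnormal emotional state: switch to the target at once.
        return [target_brightness]
    # Normal emotional state: fade to the target over 10 s.
    return transition_steps(current_brightness, target_brightness)

print(apply_mode(200, 100, abnormal=True))       # [100]
print(len(apply_mode(200, 100, abnormal=False))) # 20
```

A real controller would also interpolate color and push each step to the lamp over its control bus; this sketch only computes the brightness schedule.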
with reference to fig. 3, in an embodiment, the indoor light adjusting method further includes:
s21, analyzing whether the crying state of the child exists or not according to the voice information and the infrared scanning information under the condition that the emotional state of the person is abnormal;
specifically, the voice information and the infrared scanning information are input into a crying analysis model of the child, and whether the crying state of the child exists or not is obtained through reasoning.
The crying analysis model of the children is obtained by training in the following way:
labeling each child-crying analysis sample in a child-crying analysis sample training set to indicate whether a child-crying state exists, where each group of samples comprises a voice information sample and an infrared scanning information sample, and the presence of a child-crying state is associated with all or part of the information in the sample; and training a neural network on the labeled training set to obtain the child-crying analysis model.
The voice information reflects the child's sound and the infrared scanning information reflects the child's body size; the child-crying analysis model is trained on both jointly, so that inputting the collected voice information and infrared scanning information into the model determines whether a child is crying indoors.
And S23, controlling the lamp to illuminate according to the preset rule when a child-crying state exists.
Specifically, controlling the lamp to illuminate according to the preset rule comprises controlling the lamp to flash within a preset time at a preset frequency, brightness and color. For example, the lamp may be preset to flash every 0.5 s for a preset time of 20 s, with a brightness of 150 lx and a pink color for each flash; the brightness and color of each flash can also be varied to make the effect more interesting, so as to comfort the child and attract the child's attention to stop the crying.
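The example rule above (flash every 0.5 s for 20 s at 150 lx in pink) can be sketched as a computed schedule. The schedule is built rather than executed, so no hardware interface or timing calls are assumed.

```python
# Sketch of the preset flashing rule: flash every 0.5 s for 20 s at
# 150 lx in pink. The schedule is computed up front; a real controller
# would play it back against the lamp hardware.

def flash_schedule(duration_s=20, interval_s=0.5,
                   brightness_lx=150, color="pink"):
    """Return one (time_s, brightness_lx, color) tuple per flash."""
    n = int(duration_s / interval_s)
    return [(round(i * interval_s, 1), brightness_lx, color)
            for i in range(n)]

schedule = flash_schedule()
print(len(schedule))  # 40
print(schedule[0])    # (0.0, 150, 'pink')
```

Varying brightness and color per flash, as the text suggests, would amount to passing lists instead of constants and indexing them per step.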
The present application further provides an indoor light adjusting system, which, referring to fig. 4, comprises:
the acquisition module is used for acquiring voice information, face information, infrared scanning information and user behavior information of indoor personnel;
the analysis module is used for analyzing according to the voice information, the face information, the infrared scanning information and the user behavior information to obtain the emotional states of the personnel, wherein the emotional states of the personnel comprise normal emotional states and abnormal emotional states;
the illumination mode generation module is used for generating an illumination mode according to the abnormal emotional state under the condition that the emotional state of the person is an abnormal emotional state;
and the control module is used for controlling the illumination of the lamp according to the illumination mode.
In one embodiment, the analysis module comprises:
the first analysis unit is used for analyzing the voice information to obtain a voice emotion state;
the second analysis unit is used for analyzing the face information to obtain the face emotion state;
the third analysis unit is used for analyzing the infrared scanning information to obtain an infrared temperature emotional state;
and the weight analysis unit is used for inputting the speech emotion state, the face emotion state, the infrared temperature emotion state and the user behavior information into the emotion weight analysis model to infer the emotional state of the person.
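The weighted fusion performed by the weight analysis unit can be illustrated as follows. The weights, threshold, and score convention are assumptions for the sketch; the patent's actual emotion weight analysis model is not specified at this level of detail:

```python
# Hypothetical sketch of the emotion weight analysis: per-channel emotion
# scores (voice, face, infrared temperature, user behavior) are fused with
# fixed weights into a single normal/abnormal decision.
def fuse_emotion(voice_score, face_score, infrared_score, behavior_score,
                 weights=(0.35, 0.35, 0.15, 0.15), threshold=0.5):
    """Each score is in [0, 1], where 1 means strongly abnormal emotion.
    Returns 'abnormal' if the weighted sum reaches the threshold."""
    scores = (voice_score, face_score, infrared_score, behavior_score)
    total = sum(w * s for w, s in zip(weights, scores))
    return "abnormal" if total >= threshold else "normal"
```

A learned model would fit these weights from labeled data rather than fixing them by hand, but the fused-decision structure is the same.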
For the specific definition of the indoor light adjusting system, reference may be made to the definition of the indoor light adjusting method above, which is not repeated here. The steps of the indoor light adjusting method can be implemented wholly or partially in software, hardware, or a combination thereof.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, and which receives data and instructions from, and transmits data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, and are not limited herein as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (9)

1. An indoor light adjusting method, comprising:
acquiring voice information, face information, infrared scanning information and user behavior information of indoor personnel;
analyzing according to the voice information, the face information, the infrared scanning information and the user behavior information to obtain the emotional states of the personnel, wherein the emotional states of the personnel comprise normal emotional states and abnormal emotional states;
generating an illumination mode according to the abnormal emotional state under the condition that the emotional state of the person is an abnormal emotional state;
and controlling the lamp to illuminate according to the illumination mode.
2. The indoor light adjusting method according to claim 1, wherein the analyzing according to the voice information, the face information, the infrared scanning information and the user behavior information to obtain the emotional state of the person, the emotional state of the person comprising a normal emotional state and an abnormal emotional state, comprises:
analyzing the voice information to obtain a voice emotion state;
analyzing the face information to obtain the face emotion state;
analyzing the infrared scanning information to obtain an infrared temperature emotional state;
and inputting the voice emotion state, the face emotion state, the infrared temperature emotion state and the user behavior information into an emotion weight analysis model to infer the emotional state of the person.
3. The indoor light adjusting method according to claim 1, further comprising:
and generating an illumination mode according to the user behavior information under the condition that the emotion state of the person is a normal emotion state.
4. The indoor light adjusting method according to claim 3, further comprising:
analyzing whether a child crying state exists according to the voice information and the infrared scanning information under the condition that the emotional state of the person is an abnormal emotional state;
and controlling the lamp to illuminate according to a preset rule under the condition that the child crying state exists.
5. The indoor light adjusting method according to claim 4, wherein controlling the lamp to illuminate according to the preset rule comprises:
and controlling the lamp to flash and illuminate within a preset time according to a preset frequency, a preset brightness and a preset color.
6. The indoor light adjusting method according to claim 4, wherein controlling the lamp to illuminate according to the illumination mode comprises:
controlling the lamp to illuminate with an abrupt change according to the illumination mode under the condition that the emotional state of the person is an abnormal emotional state.
7. The indoor light adjusting method according to claim 1, wherein, under the condition that the emotional state of the person is a normal emotional state, the lamp is controlled to illuminate with a gradual change according to the illumination mode.
8. An indoor light adjusting system, comprising:
the acquisition module is used for acquiring voice information, face information, infrared scanning information and user behavior information of indoor personnel;
the analysis module is used for analyzing and obtaining the emotional states of the personnel according to the voice information, the face information, the infrared scanning information and the user behavior information, wherein the emotional states of the personnel comprise normal emotional states and abnormal emotional states;
the illumination mode generation module is used for generating an illumination mode according to the abnormal emotional state under the condition that the emotional state of the person is an abnormal emotional state;
and the control module is used for controlling the illumination of the lamp according to the illumination mode.
9. The indoor light adjusting system according to claim 8, wherein the analysis module comprises:
the first analysis unit is used for analyzing the voice information to obtain a voice emotion state;
the second analysis unit is used for analyzing the face information to obtain the face emotional state;
the third analysis unit is used for analyzing the infrared scanning information to obtain an infrared temperature emotional state;
and the weight analysis unit is used for inputting the speech emotion state, the face emotion state, the infrared temperature emotion state and the user behavior information into the emotion weight analysis model to infer the emotional state of the person.
CN202210739561.8A 2022-06-28 2022-06-28 Indoor light adjusting method and adjusting system Pending CN115038210A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210739561.8A CN115038210A (en) 2022-06-28 2022-06-28 Indoor light adjusting method and adjusting system

Publications (1)

Publication Number Publication Date
CN115038210A 2022-09-09

Family

ID=83127587

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210739561.8A Pending CN115038210A (en) 2022-06-28 2022-06-28 Indoor light adjusting method and adjusting system

Country Status (1)

Country Link
CN (1) CN115038210A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination