CN109116978B - Light emitting device, interaction method thereof, electronic device and storage medium - Google Patents


Info

Publication number
CN109116978B
Authority
CN
China
Prior art keywords
light
effect
sensing module
interaction
determining
Prior art date
Legal status
Active
Application number
CN201810575523.7A
Other languages
Chinese (zh)
Other versions
CN109116978A (en)
Inventor
李阳 (Li Yang)
顾嘉唯 (Gu Jiawei)
Current Assignee
Luka Beijing Intelligent Technology Co ltd
Original Assignee
Beijing Ling Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Ling Technology Co ltd
Priority to CN201810575523.7A
Priority to PCT/CN2018/113439 (published as WO2019233029A1)
Publication of CN109116978A
Application granted
Publication of CN109116978B
Legal status: Active


Classifications

    • H - ELECTRICITY
    • H05 - ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B - ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 - Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 - Controlling the light source
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00 - Speech recognition
    • G10L15/22 - Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223 - Execution procedure of a spoken command
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 - Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 - Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The invention discloses an interaction method for a light-emitting device, comprising the following steps: acquiring the activity of one or more activity factors, where each activity factor is selected from at least one of the following parameters: the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction, and the usage duration within a preset interaction period; determining an active state according to the activity of the activity factors; receiving an interactive operation instruction; determining the required lighting effect by combining the interactive operation instruction with the active state; and outputting a lighting control instruction to control the light-emitting device to emit light with that lighting effect. The invention also discloses a light-emitting device, an electronic device and a storage medium. The light-emitting device, its interaction method, the electronic device and the storage medium can, to a certain extent, solve the prior-art problem that the decision logic of a light-emitting device is overly simple.

Description

Light emitting device, interaction method thereof, electronic device and storage medium
Technical Field
The present invention relates to the field of interaction technologies, and in particular, to a light emitting device, an interaction method thereof, an electronic device, and a storage medium.
Background
In the prior art, there are smart night lights that switch on automatically when a human body is detected in a dim environment, so that the user does not need to operate the light manually.
However, in the course of realizing the present invention, the inventors found at least the following problems in the prior art:
existing solutions implement each function through a single mechanism. For example, a smart night light that can only switch itself on by detecting a person's movement covers a single application scenario, handles complex situations poorly, has a low fault tolerance and a high false-trigger rate, and offers no linkage between functions. At the same time, its simple decision logic makes the device feel rigid and lifeless; it can hardly shake off the impression of being a mere electronic appliance, and struggles to convey an experience of perceiving the user, understanding the user, and responding appropriately.
Disclosure of Invention
In view of the above, an objective of the embodiments of the present invention is to provide a light emitting device, an interaction method thereof, an electronic device, and a storage medium that can, to a certain extent, solve the prior-art problem that the decision logic of a light-emitting device is overly simple.
In view of the above object, according to a first aspect of the embodiments of the present invention, there is provided an interaction method of a light emitting device, including:
acquiring the activity of the activity factor; wherein the activity factor is selected from at least one of the following parameters: the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction and the use duration in a preset interaction period;
determining an active state according to the activity of the activity factor;
receiving an interactive operation instruction;
determining a required lighting effect by combining the interactive operation instruction and the active state;
and outputting a light-emitting control instruction to control the light-emitting device to emit light according to the light-emitting effect.
Optionally, in the step of obtaining the activity of the activity factor, the obtained activity factor at least includes a time interval between the current time and the last interaction and a duration of use in a preset interaction period.
Optionally, the obtained liveness factor further includes light intensity.
Optionally, the light-emitting device further comprises a sound-emitting device; the interaction method further comprises the following steps:
determining a required sound effect by combining the interactive operation instruction and the active state;
and outputting a sound production control instruction to control the sound production device to produce sound according to the sound effect.
Optionally, the light-emitting device further includes an infrared sensing module, a brightness sensing module, and a pickup module;
the receiving of the interoperation instruction includes:
receiving data collected by the infrared sensing module, the brightness sensing module and/or the pickup module;
and determining the interactive operation instruction according to the data collected by the infrared sensing module, the brightness sensing module and/or the pickup module.
Optionally, the method further includes:
and if none of the infrared sensing module, the brightness sensing module and the pickup module acquires data, determining the light-emitting effect and the sound effect according to the active state.
Optionally, the method further includes:
if the brightness data acquired by the brightness sensing module is lower than a first brightness threshold value and the infrared data acquired by the infrared sensing module is within a preset range, determining that the interactive operation instruction is a voice awakening instruction and starting voice interaction;
or if the voice data collected by the pickup module is detected to be a wake-up word, determining that the interactive operation instruction is a voice wake-up instruction, and starting voice interaction.
Optionally, after the voice interaction is started, the method further includes:
determining the required light emitting effect as a wake-up light effect by combining the voice wake-up instruction and the active state;
and outputting a light-emitting control instruction to control the light-emitting device to emit light according to the awakening light effect.
Optionally, the method further includes:
determining whether the current time is in a sleep period;
if the current time is within a sleep period, the brightness data collected by the brightness sensing module is lower than a second brightness threshold, and a light-on instruction is received, determining the light-emitting effect to be a night-light effect, and outputting a light-emitting control instruction to control the light-emitting device to emit light according to the night-light effect.
In a second aspect of the embodiments of the present invention, there is provided a light emitting device including:
the interactive triggering unit is used for receiving an interactive operation instruction;
a light emitting unit for emitting light according to the light emission control instruction;
a processing unit configured to:
acquiring the activities of at least two activity factors; wherein each activity factor is selected from: the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction, and the usage duration within a preset interaction period;
determining an active state according to the activity of the at least two activity factors;
determining a required lighting effect by combining the interactive operation instruction and the active state;
and outputting a light-emitting control instruction to control the light-emitting unit to emit light according to the light-emitting effect.
Optionally, the light-emitting device further includes a sound-generating unit, configured to generate sound according to the sound-generating control instruction;
the processing unit configured to:
determining a required sound effect by combining the interactive operation instruction and the active state;
and outputting a sounding control instruction to control the sounding unit to sound according to the sound effect.
Optionally, the interaction triggering unit includes an infrared sensing module, a brightness sensing module and a pickup module;
the processing unit configured to:
receiving data collected by the infrared sensing module, the brightness sensing module and/or the pickup module;
and determining the interactive operation instruction according to the data collected by the infrared sensing module, the brightness sensing module and/or the pickup module.
In a third aspect of the embodiments of the present invention, there is provided an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to cause the at least one processor to perform any one of the methods described above.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the method of any one of the preceding claims.
As can be seen from the foregoing, in the light-emitting device, its interaction method, the electronic device and the storage medium provided by the embodiments of the present invention, the active state is determined according to the activity of the activity factors and is then combined with the received interactive operation instruction to determine the corresponding lighting effect. The decision logic for the lighting effect is therefore no longer single-path, and misoperation is less likely than with a light-emitting device whose decision logic is simple.
Drawings
Fig. 1 is a schematic flowchart illustrating an interaction method of a light emitting device according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a light-emitting device according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an embodiment of an electronic device provided in the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
It should be noted that the expressions "first" and "second" in the embodiments of the present invention are used only to distinguish two entities or parameters that share the same name. "First" and "second" are merely for convenience of description and should not be construed as limiting the embodiments; this will not be repeated in the embodiments that follow.
In a first aspect of the embodiments of the present invention, an interaction method for a light emitting device is provided, which can solve the problem of single judgment logic of the light emitting device in the prior art to a certain extent. Fig. 1 is a schematic flow chart of an embodiment of an interaction method of a light emitting device according to the present invention.
The interaction method of the light-emitting device comprises the following steps:
step 101: acquiring the activity of the activity factor; wherein the activity factor is selected from at least one of the following parameters: the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction and the use duration in the preset interaction period.
Step 102: and determining the active state according to the activity of the activity factor.
Here, the active state depends on the activity of the selected activity factors; the details are given in the tables below.
As an embodiment of the present invention, when the activity factor selects the current time, the activity distribution for different time periods (hr) is as follows:
TABLE 1 relationship between Current time and Activity
| Time period (hr) | 00-02 | 02-04 | 04-06 | 06-08 | 08-10 | 10-12 | 12-14 | 14-16 | 16-18 | 18-20 | 20-22 | 22-24 |
| Activity         |  -10  |  -20  |  -10  |   0   |  +5   |  +5   |  +5   |  +10  |  +10  |  +10  |  +20  |   0   |
At this time, according to the activity corresponding to the time period, the active states may be classified into six types: extremely inactive (-20), very inactive (-10), inactive (0), active (+5), very active (+10), and extremely active (+20). For example, when the current time is 23:00, the corresponding activity is 0, and the current active state is "inactive".
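The Table 1 lookup and the six-way classification above can be sketched as follows; this is an illustrative reading of the table, and all function names and state labels are assumptions rather than terms from the patent.

```python
# Illustrative sketch of Table 1: current hour -> activity score -> one of the
# six active-state labels. Names and labels are assumed, not the patent's.

TIME_ACTIVITY = {  # (start_hour, end_hour): activity score, per Table 1
    (0, 2): -10, (2, 4): -20, (4, 6): -10, (6, 8): 0,
    (8, 10): 5, (10, 12): 5, (12, 14): 5,
    (14, 16): 10, (16, 18): 10, (18, 20): 10,
    (20, 22): 20, (22, 24): 0,
}

STATE_LABELS = {  # the six classes named in the text (labels assumed)
    -20: "extremely inactive", -10: "very inactive", 0: "inactive",
    5: "active", 10: "very active", 20: "extremely active",
}

def time_activity(hour):
    """Return the Table 1 activity score for an hour in [0, 24)."""
    for (start, end), score in TIME_ACTIVITY.items():
        if start <= hour < end:
            return score
    raise ValueError("hour out of range")
```

For example, `time_activity(23)` returns 0, which `STATE_LABELS` maps to "inactive", matching the worked example in the text.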
As an embodiment of the present invention, when the light intensity is selected as the activity factor, the activity distribution over different light-intensity percentages is as follows:
TABLE 2 relationship between light intensity and liveness
| Light intensity (%) | 0-10 | 10-30 | 30-50 | 50-70 | 70-90 | 90-100 |
| Activity            | +20  |  +10  |   0   |  +5   |  -10  |  -20   |
At this time, according to the activity corresponding to the light-intensity range, the active states may be classified into six types: extremely inactive (-20), very inactive (-10), inactive (0), active (+5), very active (+10), and extremely active (+20). For example, if the current light-intensity percentage is 5%, the corresponding activity is +20, and the current active state is "extremely active".
It should be noted that the light intensity in Table 2 is a percentage: the ratio of the current light intensity to a reference light intensity. The reference may be the light intensity sampled at 12:00 noon on the previous day; when 12:00 noon arrives on the current day, the value detected in real time replaces it as the new reference, so that the percentage is refreshed daily and better matches actual conditions. Alternatively, the reference light intensity may be a predetermined value, such as a factory default or a user-defined value.
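The daily reference-intensity update and the Table 2 lookup described above might be sketched like this; the class, the handling of range boundaries, and all names are illustrative assumptions.

```python
# Hedged sketch of the reference-intensity scheme: the percentage fed to
# Table 2 is the current reading divided by a reference captured at noon,
# which is refreshed once per day. All names are illustrative.

class LightIntensityTracker:
    def __init__(self, reference):
        self.reference = reference  # e.g. intensity sampled at noon yesterday

    def update_reference(self, noon_reading):
        """Called once at noon each day to refresh the reference."""
        self.reference = noon_reading

    def percentage(self, current):
        """Current intensity as a percentage of the reference, for Table 2."""
        return 100.0 * current / self.reference

def light_activity(pct):
    """Table 2: light-intensity percentage -> activity score."""
    bands = [(10, 20), (30, 10), (50, 0), (70, 5), (90, -10)]
    for upper, score in bands:
        if pct <= upper:
            return score
    return -20  # 90-100% band
```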
As an embodiment of the present invention, when the weather condition (current or for the day) is selected as the activity factor, the activity distribution over different weather conditions is shown in Table 3:
TABLE 3 weather conditions vs. liveness
(Table 3 appears only graphically in the original publication; its activity values are summarized in the paragraph below.)
At this time, according to the activity corresponding to the weather condition, the active states may be classified into five types: extremely inactive (-20), very inactive (-15), inactive (-10), normal (0), and very active (+10). For example, when the current or present-day weather is heavy rain, the corresponding activity is -15, and the current active state is "very inactive".
As an embodiment of the present invention, when the time interval between the current time and the last interaction is selected as the activity factor, the activity distribution over different time intervals is as follows:
TABLE 4 Interactive time Interval and Activity relationship
| Interaction interval | >48h | 48h-24h | 24h-12h | 12h-4h | 4h-30min | 30-15min | <15min |
| Activity             | +30  |   +20   |   +10   |   0    |   -10    |   -15    |  -20   |
At this time, according to the activity corresponding to the interaction time interval, the active states may be classified into seven types: extremely inactive (-20), very inactive (-15), inactive (-10), normal (0), active (+10), very active (+20), and extremely active (+30). For example, if more than 48 hours have passed since the last interaction, the corresponding activity is +30, and the current active state is "extremely active".
As an embodiment of the present invention, when the usage duration within the preset interaction period (e.g. 24 hours or 48 hours) is selected as the activity factor, the activity distribution over different usage durations is as follows:
TABLE 5 relationship of duration of use and liveness
| Cumulative usage | 0   | 0-1min | 1min-5min | 5min-30min | 30min-2hr | >2hr |
| Activity         | +20 |  +15   |    +10    |     0      |    -10    | -20  |
At this time, according to the activity corresponding to the usage duration within the preset interaction period, the active states may be classified into six types: extremely inactive (-20), inactive (-10), normal (0), active (+10), very active (+15), and extremely active (+20). For example, when the usage duration within the preset interaction period is 31 minutes, the corresponding activity is -10, and the current active state is "inactive".
It should be noted that where the endpoints of two ranges in the tables above coincide, a left-open or right-open convention may be used to disambiguate. For example, the cumulative-usage ranges 0-1min and 1min-5min share the endpoint of 1 minute; either 0-1min excludes 1 minute, or 1min-5min excludes it, so the two ranges do not overlap. The ranges in the other tables can be disambiguated by similar conventions, which are not repeated here.
Preferably, two or more activity factors may be selected from the five parameters (the current time, the light intensity, the weather condition, the time interval since the last interaction, and the usage duration within the preset interaction period) to determine the active state together.
For example, when the current time and the light intensity are selected, the activity values from Tables 1 and 2 above may be added, and the sum then compared against the active-state thresholds. For instance, when the current time is 21:00 the activity is +20, and when the current light-intensity percentage is 5% the activity is +20; the sum of +40 may then be defined as "very active". Other sums are classified correspondingly according to the active-state division, which is not repeated here.
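The combination rule in this example is a plain sum of the per-table scores, which can be sketched as follows; the function name is illustrative.

```python
# Minimal sketch of the multi-factor combination described above: the scores
# from the selected tables are summed before classification. Name assumed.

def combined_activity(scores):
    """Sum the activity scores of all selected activity factors."""
    return sum(scores)

# 21:00 gives +20 (Table 1); a 5% light level gives +20 (Table 2).
total = combined_activity([20, 20])  # classified against the state thresholds
```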
Furthermore, in the step of obtaining the activity of the activity factors, the acquired factors include at least the time interval since the last interaction and the usage duration within the preset interaction period, so that the active state is better reflected. Preferably, the acquired factors further include the light intensity, to reflect the active state still more accurately.
Preferably, as an embodiment of the present invention, the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction, and the usage time duration in the preset interaction period are all used as activity factors to determine the activity state.
Specifically, the calculation formula of the total activity a is as follows:
A=50+T+L+W+I+P
wherein the initial-state activity is 50, and T, L, W, I and P respectively denote the activities of the current time, the light intensity, the weather condition, the time interval since the last interaction, and the usage duration within the preset interaction period; the activity values of these factors are given in Tables 1 to 5 and are not repeated here.
Thus, after calculating the total activity A, the activity status may be determined according to Table 6 below.
TABLE 6 relationship of Total Activity to active State
| Active state   | Extremely inactive | Very inactive | Inactive | Normal | Active | Very active | Extremely active |
| Total activity |       -∞~10        |     10~20     |  20~40   | 40~60  | 60~80  |    80~90    |      90~+∞       |
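The total-activity formula and the Table 6 classification can be sketched as follows; the band boundaries are treated as left-closed here, which is only one of the conventions the text allows, and all function and label names are illustrative assumptions.

```python
# Hedged sketch of A = 50 + T + L + W + I + P and the Table 6 mapping from
# total activity to one of seven state labels. Labels and names assumed.

def total_activity(t, l, w, i, p):
    """T, L, W, I, P are the scores from Tables 1-5; 50 is the base value."""
    return 50 + t + l + w + i + p

def activity_state(a):
    """Table 6: total activity -> active-state label."""
    bands = [
        (10, "extremely inactive"), (20, "very inactive"), (40, "inactive"),
        (60, "normal"), (80, "active"), (90, "very active"),
    ]
    for upper, label in bands:
        if a < upper:
            return label
    return "extremely active"  # 90 ~ +inf
```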
Optionally, the active state in the interaction method may be refreshed on a timer, for example once per hour. A refresh may also be triggered when a parameter changes: for example, when a rainstorm suddenly begins, the weather activity value is updated so that the active state is refreshed accordingly, and so on.
In summary, under the active-state calculation rules of the above embodiments, the light-emitting device has a well-defined active state at every moment.
Step 103: receiving an interactive operation instruction. Here, the interactive operation instruction may be any instruction generated to interact with the light-emitting device; through such an instruction the device is mainly controlled to emit light with a certain lighting effect.
Optionally, the light-emitting device further includes an infrared sensing module, a brightness sensing module, and a pickup module; preferably, the infrared sensing module is a pyroelectric infrared sensor;
the receiving of the interoperation instruction includes:
receiving data collected by the infrared sensing module, the brightness sensing module and/or the pickup module;
and determining the interactive operation instruction according to the data collected by the infrared sensing module, the brightness sensing module and/or the pickup module.
Different interactive operation instructions can be obtained according to data acquired by different modules or any combination of the data.
For example, if the brightness data acquired by the brightness sensing module is lower than a first brightness threshold and the infrared data acquired by the infrared sensing module is within a first preset range, it is determined that the interactive operation instruction is a voice wake-up instruction. Of course, this is only an example, and the specific combination may be set according to the actual situation.
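One possible reading of how the sensor data is combined into an interactive operation instruction is sketched below; the thresholds, the returned labels, and the priority order are all assumptions for illustration, not the patent's definitions.

```python
# Illustrative decision sketch for step 103: combining brightness, infrared
# and microphone data into a hypothetical instruction label.

def classify_instruction(brightness, infrared_in_range, heard_wake_word,
                         first_brightness_threshold=30):
    """Return an assumed instruction label from raw sensor data."""
    if heard_wake_word:
        return "voice_wake"   # wake word detected by the pickup module
    if brightness < first_brightness_threshold and infrared_in_range:
        return "voice_wake"   # dark room + person detected: implicit wake
    if infrared_in_range:
        return "presence"     # someone passed by in normal light
    return "idle"             # no sensor data: silent standby (Table 7)
```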
Step 104: and determining the required lighting effect by combining the interactive operation instruction and the active state.
As an embodiment of the present invention, the interaction method of the light-emitting device further includes:
if none of the infrared sensing module, the brightness sensing module and the pickup module acquires data, determining that the device is in an unused standby state (which is also treated as one type of interactive operation instruction), and determining the lighting effect according to the active state.
Depending on the active state, the light-emitting device adopts different lighting effects as its silent standby behavior when idle, as shown in Table 7 below.
TABLE 7 distribution of luminous efficacy in Standby State
| Classification | Extremely inactive | Very inactive | Inactive | Normal | Active | Very active | Extremely active |
(Table 7 assigns each of lighting effects 1 through 9 to one or more of these active states; the assignment marks appear only graphically in the original publication.)
As shown in Table 7, lighting effects 1-9 are distinct effects used to express the active state of the light-emitting device: for example, effects 1-3 present a cheerful appearance in highly active states, while effects 7-9 present a quiet, subdued appearance in highly inactive states.
As an embodiment of the present invention, the interaction method of the light-emitting device further includes:
if the brightness data collected by the brightness sensing module is lower than a first brightness threshold and the infrared data collected by the infrared sensing module is within a preset range, determining that the interactive operation instruction is a voice wake instruction and starting voice interaction. Here, the first brightness threshold may be set as needed; a suitable criterion is the brightness below which the ambient light is considered dark under normal conditions. The preset range may be the range of infrared data detectable when a person walks by under normal conditions; its value may be chosen according to the actual situation and is not limited here. In this case, the combination of brightness below the first threshold and infrared data within the preset range takes the place of a wake word, so voice interaction can start without one being spoken first; this saves an operation step and is convenient for the user.
As an alternative, if the voice data collected by the pickup module is detected to contain a wake word, the interactive operation instruction is determined to be a voice wake instruction and voice interaction is started. Optionally, the wake word may be a default or set by the user; its content is not specifically limited here.
After voice interaction is started, the light-emitting device can be controlled by voice to emit light with a corresponding lighting effect, or other interactions with the device can be carried out under voice control.
Further, after the voice interaction is started, the interaction method of the light-emitting device may further include the steps of:
determining the required light emitting effect as a wake-up light effect by combining the voice wake-up instruction and the active state;
and outputting a light-emitting control instruction to control the light-emitting device to emit light according to the awakening light effect.
In this way, once a voice wake instruction is confirmed, the light-emitting device emits light with a wake light effect chosen in combination with the active state, so the user can tell from this distinctive effect that the device is currently ready for voice interaction. On the one hand, this reminds the user that the voice interaction function is active and the device can be commanded by voice at any time; on the other hand, if the function was triggered by mistake, the user can close it promptly to avoid further misoperation.
As an embodiment of the present invention, the interaction method of the light-emitting device further includes:
determining whether the current time is within a sleep period (e.g., from 23:00 at night to 5:00 in the morning);
if the current time is within the sleep period, the brightness data collected by the brightness sensing module is lower than a second brightness threshold, and a light-on instruction is received, determining the lighting effect to be the night-light effect, and outputting a lighting control instruction to control the light-emitting device to emit light according to the night-light effect.
Here, the second brightness threshold may be set as needed; a suitable criterion is the light intensity typical of a room where people are asleep. In some cases the second brightness threshold may equal the first, and in others not; the specific value depends on the situation. The night-light effect is warm yellow and dim overall; its specific color temperature and brightness may be chosen according to actual needs and are not limited here.
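The night-light rule above can be sketched as follows; the sleep-period bounds, threshold value, and effect names are illustrative assumptions.

```python
# Hedged sketch of the night-light rule: during the sleep period, if the room
# is dark (below the second brightness threshold) and a light-on command
# arrives, emit the night-light effect. All names and values are assumed.

SLEEP_START, SLEEP_END = 23, 5      # e.g. 23:00 to 05:00, as in the text
SECOND_BRIGHTNESS_THRESHOLD = 10    # assumed units

def in_sleep_period(hour):
    """True for hours in the wrap-around range [23:00, 05:00)."""
    return hour >= SLEEP_START or hour < SLEEP_END

def select_effect(hour, brightness, light_on_cmd):
    if (in_sleep_period(hour) and brightness < SECOND_BRIGHTNESS_THRESHOLD
            and light_on_cmd):
        return "night_light"        # warm, dim night-light effect
    return "default"
```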
Depending on its active state, the light-emitting device presents different lighting effects (including bright illumination and color-changing light) when interacting with a person under different conditions, as shown in Table 8 below.
TABLE 8 relationship of Interactive operation Instructions, active State and Lighting effects
(Table 8 appears only graphically in the original publication; the four cases it covers are described below.)
As can be seen from Table 8, the interactive operation instructions include at least the following:
1) if the brightness data acquired by the brightness sensing module is higher than the third brightness threshold value and the infrared data acquired by the infrared sensing module is within the preset range, it is indicated that the current light intensity is in a bright state and the infrared signal detects that someone passes through, and at this time, one of the corresponding light-emitting effects 10-16 is determined in combination with the corresponding active state. Here, the third brightness threshold may be a value set as needed, and the selection criterion may be a brightness value considered as a brightness degree that the light reaches brighter degree under normal conditions; in some cases, the specific value may be the same as the first brightness threshold, which depends on the specific situation and is not limited herein.
2) If the brightness data collected by the brightness sensing module is lower than a first brightness threshold value, and the infrared data collected by the infrared sensing module is in a preset range, it is indicated that the current light intensity is in a dim state and the infrared signal detects that someone passes through, it is determined that the interaction operation instruction is a voice awakening instruction, voice interaction is started, and meanwhile, one of corresponding awakening light colors c1-c7 is determined by combining with an active state.
3) And if the voice data collected by the pickup module is detected to be a wake-up word, which indicates that the user is waking up the voice interaction function through voice, determining that the interaction operation instruction is a voice wake-up instruction, starting voice interaction, and determining one of corresponding wake-up light colors c1-c7 by combining with an active state.
4) If the voice data collected by the pickup module are detected to be continuous three times of sounds within a certain time interval, which indicates that the user claps hands three times, at this time, one of the corresponding light-emitting effects 17-23 is determined in combination with the corresponding active state.
5) If the brightness data collected by the brightness sensing module is detected to be shaking (e.g. brightness suddenly decreases or the brightness repeats), it indicates that the user is waving his/her hand at the top of the device, and at this time, determines one of the corresponding light-emitting effects 24-30 in combination with the corresponding active state.
6) If the voice data collected by the pickup module is detected to have data shock (e.g. a sudden change in frequency or amplitude), indicating that the user is currently blowing against the top of the device, then one of the corresponding lighting effects 31-37 is determined in combination with the corresponding activity status.
7) If the current time is in the sleep time period, the brightness data collected by the brightness sensing module is lower than a second brightness threshold value, and a light-on instruction is received, so that the small night light needs to be turned on at present, the light-emitting effect is the small night light effect, and the light-emitting control instruction is output to control the light-emitting device to emit light according to the small night light effect. Optionally, the light-on instruction may be implemented by operations such as voice control, physical switch key or mobile phone control.
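The seven triggers above can be sketched as a single classifier. The detection heuristics, thresholds, labels and the checking order below are illustrative assumptions; the disclosure only fixes which observation maps to which instruction, not how the raw signals are processed.

```python
def detect_triple_clap(timestamps, max_interval=1.5):
    """Item 4: three consecutive sounds, each within `max_interval` seconds."""
    if len(timestamps) < 3:
        return False
    a, b, c = timestamps[-3:]
    return (b - a) <= max_interval and (c - b) <= max_interval

def detect_brightness_shock(samples, drop_ratio=0.5):
    """Item 5: a sudden drop between adjacent brightness samples (hand wave)."""
    return any(cur < prev * drop_ratio for prev, cur in zip(samples, samples[1:]))

def classify_instruction(brightness, infrared_in_range, heard_wake_word,
                         clap_timestamps, brightness_samples, blow_detected,
                         first_threshold=10.0, third_threshold=200.0):
    """Map raw observations to one interactive operation instruction label.

    The final light effect is then chosen from this label together with the
    current active state (Table 8).
    """
    if heard_wake_word:                                     # item 3
        return "voice_wake"
    if infrared_in_range and brightness < first_threshold:  # item 2: dim + presence
        return "voice_wake"
    if infrared_in_range and brightness > third_threshold:  # item 1: bright + presence
        return "bright_presence"
    if detect_triple_clap(clap_timestamps):                 # item 4
        return "triple_clap"
    if detect_brightness_shock(brightness_samples):         # item 5
        return "hand_wave"
    if blow_detected:                                       # item 6
        return "blow"
    return "none"
```

Note that items 2 and 3 deliberately resolve to the same "voice_wake" instruction, matching the text: dim light plus presence starts voice interaction just as a spoken wake-up word does.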
Step 105: outputting a light-emitting control instruction to control the light-emitting device to emit light according to the light-emitting effect. Here, from the correspondence among the interactive operation instruction, the active state and the light-emitting effect described above, a corresponding light-emitting control instruction can be obtained to control the light-emitting device. The light-emitting control instruction contains light parameters such as light intensity and color temperature, and different light-emitting effects correspond to different light parameters.
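One way to model a light-emitting control instruction carrying per-effect parameters is a small record; the field names, units and the two sample entries below are illustrative assumptions, not values from this disclosure.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LightControlInstruction:
    effect_id: str
    intensity: float         # relative brightness, assumed 0.0-1.0 scale
    color_temp_kelvin: int   # warm (low) to cool (high)

# Hypothetical lookup from a resolved effect name to its light parameters.
EFFECT_PARAMS = {
    "night_light": LightControlInstruction("night_light", 0.05, 2200),  # dim, warm
    "wake_c1":     LightControlInstruction("wake_c1", 0.60, 4000),
}

def build_control_instruction(effect_name: str) -> LightControlInstruction:
    """Different effects carry different light parameters, as the step notes."""
    return EFFECT_PARAMS[effect_name]
```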
As an embodiment of the present invention, the light-emitting device further includes a sound-generating device, optionally a loudspeaker, and the sound pickup module is optionally a microphone pickup module. The interaction method of the light-emitting device further includes the following steps:
determining a required sound effect by combining the interactive operation instruction and the active state;
and outputting a sound production control instruction to control the sound production device to produce sound according to the sound effect.
Specifically, the sound effect can be combined with the aforementioned light-emitting effect to provide a richer interactive experience.
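Combining the two feedback channels can be sketched as one lookup keyed by (instruction, active state), in the spirit of Tables 8 and 10; every concrete pairing below is a made-up illustration, not content of those tables.

```python
# Hypothetical (instruction, active_state) -> (light_effect, sound_effect) pairs.
ACOUSTO_OPTIC_TABLE = {
    ("voice_wake", "very_active"): ("wake_color_c1", "chime_bright"),
    ("voice_wake", "inactive"):    ("wake_color_c5", "chime_soft"),
    ("triple_clap", "normal"):     ("light_effect_20", "clap_reply"),
}

def determine_feedback(instruction: str, active_state: str):
    """Return the (light, sound) pair for the current instruction and state."""
    return ACOUSTO_OPTIC_TABLE.get((instruction, active_state),
                                   ("default_light", "silent"))
```

Because the table is keyed by both fields, the same operation yields different sound and light under different active states, which is the behavior the embodiments describe.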
For example, different sound-and-light effects may serve as the silent standby state when the device is idle, as shown in Table 9 below.
TABLE 9 Distribution of acousto-optic effects in the standby state
[Table 9's cells are not reproduced in this text version; it assigns each of acousto-optic effects 1-9 to one of seven activity levels: very very inactive, very inactive, inactive, normal, active, very active and very very active.]
As shown in Table 9, acousto-optic effects 1-9 are different combinations of sound effects and light that express the activity of the light-emitting device; for example, acousto-optic effects 1-3, used when very active, may express cheerful, passionate sound and light, while acousto-optic effects 7-9, used when very inactive, may express quiet, melancholy sound and light.
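One possible reading of Table 9 — cheerful effects clustered at the active end and quiet ones at the inactive end — can be sketched as below. The exact assignment of effects to levels is not recoverable from the text, so this mapping is purely illustrative.

```python
import random

# Hypothetical distribution of the nine standby acousto-optic effects over the
# seven activity levels; only effects 1-3 (very active) and 7-9 (very
# inactive) are hinted at by the description.
STANDBY_EFFECTS = {
    "very_very_active":   [1, 2, 3],
    "very_active":        [2, 3],
    "active":             [4],
    "normal":             [5],
    "inactive":           [6],
    "very_inactive":      [7, 8],
    "very_very_inactive": [7, 8, 9],
}

def pick_standby_effect(active_state, rng=random.Random(0)):
    """Choose one acousto-optic effect for silent standby under the given state."""
    return rng.choice(STANDBY_EFFECTS[active_state])
```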
As another example, depending on its active state, the light-emitting device may show different acousto-optic effects when interacting with a person under different conditions, as shown in Table 10 below.
TABLE 10 Relationship between interactive operation instructions, active states and acousto-optic effects
[Table 10 is rendered as an image in the original; it maps each interactive operation instruction and active state to one of acousto-optic effects 10-37 or wake-up light colors c1-c7 with matching sound effects.]
As can be seen from Table 10, the interactive operation instructions include at least the following:
1) If the brightness data collected by the brightness sensing module is above the third brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, the current light is bright and the infrared signal indicates that someone is passing by; one of the corresponding acousto-optic effects 10-16 is then determined in combination with the current active state. Here, the third brightness threshold may be set as needed; a typical criterion is a brightness value at which the ambient light is considered fairly bright. In some cases it may be the same as the first brightness threshold, depending on the specific situation, and is not limited here.
2) If the brightness data collected by the brightness sensing module is below the first brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, the current light is dim and the infrared signal indicates that someone is passing by; the interactive operation instruction is then determined to be a voice wake-up instruction, voice interaction is started, one of the corresponding wake-up light colors c1-c7 is determined in combination with the active state, and a matching sound effect may be provided alongside the wake-up light color.
3) If the voice data collected by the sound pickup module is detected to contain a wake-up word, the user is waking up the voice interaction function by voice; the interactive operation instruction is then determined to be a voice wake-up instruction, voice interaction is started, one of the corresponding wake-up light colors c1-c7 is determined in combination with the active state, and a matching sound effect may be provided alongside the wake-up light color.
4) If the voice data collected by the sound pickup module is detected to contain three consecutive sounds within a certain time interval, the user has clapped three times; one of the corresponding acousto-optic effects 17-23 is then determined in combination with the active state.
5) If the brightness data collected by the brightness sensing module shows a data shock (e.g., the brightness drops suddenly or alternates repeatedly between high and low), the user is waving a hand over the top of the device; one of the corresponding acousto-optic effects 24-30 is then determined in combination with the active state.
6) If the voice data collected by the sound pickup module shows a data shock (e.g., a sudden change in frequency or amplitude), the user is currently blowing at the top of the device; one of the corresponding acousto-optic effects 31-37 is then determined in combination with the active state.
7) If the current time is within the sleep period, the brightness data collected by the brightness sensing module is below the second brightness threshold, and a light-on instruction is received, the night light needs to be turned on; the light-emitting effect is then the night-light effect, and a light-emitting control instruction is output to control the light-emitting device accordingly. Optionally, the light-on instruction may be issued by voice control, a physical switch key, mobile-phone control or similar operations; optionally, a corresponding sound effect may also accompany the night light to enrich the interaction.
It can be seen that, according to its active state, the device shows different acousto-optic feedback when the user interacts with it, so that the same operation receives different feedback at different times, under different light conditions, under different weather conditions, after different interaction intervals and after different accumulated usage times. The device thus appears to genuinely understand the user and express itself richly, which surprises the user rather than boring them.
It can be seen from the foregoing embodiments that, in the interaction method of the light-emitting device according to the embodiments of the present invention, the active state is determined from the activity of the activity factors and then combined with the received interactive operation instruction to determine the corresponding light-emitting effect. The determination logic of the light-emitting effect is therefore no longer single-valued, and compared with a light-emitting device with single determination logic, this one is less prone to misoperation.
Furthermore, when two or more of the current time, the light intensity, the weather conditions, the time interval since the last interaction and the usage duration within a preset interaction period are selected as activity factors, the active state can be determined more accurately, enabling better interaction with the user. The active-state judgment mechanism provided by the embodiments of the present invention gives the light-emitting device its own way of judging activity: it shows different feedback for different interaction triggers, is no longer a fixed mode of interactive expression, escapes the rigid feel of ordinary electronic devices, enriches and enlivens its expression, and lets the user feel it exceeds expectations.
Furthermore, when the interactive operation instruction is obtained by analyzing data collected by the infrared sensing module, the brightness sensing module and/or the sound pickup module, a multi-modal interactive light-emitting device is formed, that is, an intelligent device that can recognize, understand and interact with human behavior. Under different activity levels, different acousto-optic feedback and different judgment mechanisms are provided for each trigger: the user waking the device by voice, infrared sensing detecting a person in bright light, infrared sensing detecting a person in dim light, the user clapping three times, the user waving a hand over the top of the device, the user blowing at the top of the device, the user turning on the light during the sleep period in dim conditions, and so on, with a logic decision made for each case. This overall multi-modal interactive judgment framework can handle the logic of complex conditions and is no longer a one-to-one judgment; through comprehensive judgment of ambient light, the current time and the user's behavior, it gives the most appropriate feedback for the current conditions, improves the fault tolerance of the device, keeps the false-trigger rate very low, and coordinates its responses well.
According to the interaction method of the light-emitting device provided by the embodiments of the present invention, the input information (the current time, the current light conditions, whether the infrared sensor currently detects a person, the current weather conditions, the time of the device's last interaction with the user, the accumulated usage time within roughly the last 24 hours, the current voice command and the user's current behavior) is comprehensively analyzed to determine an appropriate interactive feedback, expressed as combined sound and light effects, thereby meeting interaction needs in a variety of scenarios.
In a second aspect of the embodiments of the present invention, a light emitting device is provided, which can solve the problem of single judgment logic of the light emitting device in the prior art to a certain extent. Fig. 2 is a schematic structural diagram of a light-emitting device according to an embodiment of the present invention.
The light emitting device includes:
an interaction triggering unit 201, configured to receive an interaction operation instruction;
a light emitting unit 203 for emitting light according to the light emission control instruction;
a processing unit 202 configured to:
acquiring the activity degrees of at least two activity degree factors; wherein the activity factor is selected from: the current time, the light intensity, the weather condition, the time interval between the current time and the last interaction and the use duration in a preset interaction period;
determining an active state according to the activity of the at least two activity factors;
determining a required lighting effect by combining the interactive operation instruction and the active state;
outputting a light-emitting control instruction according to the light-emitting effect to control the light emitting unit 203 to emit light.
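End to end, the processing unit's flow above can be sketched as below; the per-factor scores, thresholds and the seven state boundaries are illustrative assumptions, since the disclosure leaves these to the implementer.

```python
def activity_score(current_hour=None, light_intensity=None, weather_ok=None,
                   minutes_since_last=None, usage_minutes_in_period=None):
    """Average per-factor activity scores in [0, 1]; at least two are required."""
    scores = []
    if current_hour is not None:
        scores.append(1.0 if 8 <= current_hour <= 22 else 0.2)   # daytime = active
    if light_intensity is not None:
        scores.append(min(light_intensity / 300.0, 1.0))         # brighter = active
    if weather_ok is not None:
        scores.append(1.0 if weather_ok else 0.4)
    if minutes_since_last is not None:
        scores.append(1.0 if minutes_since_last < 30 else 0.3)   # recent use
    if usage_minutes_in_period is not None:
        scores.append(min(usage_minutes_in_period / 120.0, 1.0))
    if len(scores) < 2:
        raise ValueError("at least two activity factors are required")
    return sum(scores) / len(scores)

def active_state(score):
    """Bucket the averaged score into one of seven states (cf. Table 9)."""
    bounds = [(0.15, "very_very_inactive"), (0.30, "very_inactive"),
              (0.45, "inactive"), (0.60, "normal"), (0.75, "active"),
              (0.90, "very_active")]
    for upper, name in bounds:
        if score < upper:
            return name
    return "very_very_active"
```

The resulting state would then be combined with the interactive operation instruction to select a light-emitting effect, as the processing unit 202 describes.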
It can be seen from the foregoing embodiments that, in the light-emitting device provided by the embodiments of the present invention, the active state is determined from the activity of the activity factors and then combined with the received interactive operation instruction to determine the corresponding light-emitting effect. The determination logic of the light-emitting effect is therefore no longer single-valued, and compared with a light-emitting device with single determination logic, this one is less prone to misoperation.
As an embodiment of the present invention, the light-emitting device further includes a sound-emitting unit 304 for emitting sound according to the sound-emission control instruction;
the processing unit 202 is configured to:
determining a required sound effect by combining the interactive operation instruction and the active state;
outputting a sound production control instruction according to the sound effect to control the sound production unit 304 to produce sound, so that the sound effect can be combined with the aforementioned light-emitting effect to provide richer interaction.
As an embodiment of the present invention, the interaction triggering unit 201 includes an infrared sensing module, a brightness sensing module and a sound pickup module;
the processing unit 202 is configured to:
receiving data collected by the infrared sensing module, the brightness sensing module and/or the pickup module;
and determining the interactive operation instruction according to the data collected by the infrared sensing module, the brightness sensing module and/or the pickup module.
When the interactive operation instruction is obtained by analyzing data collected by the infrared sensing module, the brightness sensing module and/or the sound pickup module, a multi-modal interactive light-emitting device is formed: an intelligent device that can recognize, understand and interact with human behavior. Under different activity levels, different acousto-optic feedback and different judgment mechanisms are provided for each trigger: the user waking the device by voice, infrared sensing detecting a person in bright light, infrared sensing detecting a person in dim light, the user clapping three times, the user waving a hand over the top of the device, the user blowing at the top of the device, the user turning on the light during the sleep period in dim conditions, and so on, with a logic decision made for each case. This overall multi-modal interactive judgment framework can handle the logic of complex conditions and is no longer a one-to-one judgment; through comprehensive judgment of ambient light, the current time and the user's behavior, it gives the most appropriate feedback for the current conditions, improves the fault tolerance of the device, keeps the false-trigger rate very low, and coordinates its responses well.
In view of the above object, a third aspect of the embodiments of the present invention provides an embodiment of an apparatus for performing the interaction method. Fig. 3 is a schematic hardware structure diagram of an embodiment of an apparatus for performing the interaction method according to the present invention.
As shown in fig. 3, the apparatus includes:
one or more processors 301 and a memory 302, with one processor 301 being illustrated in fig. 3.
The apparatus for performing the interaction method may further include: an input device 303 and an output device 304.
The processor 301, the memory 302, the input device 303 and the output device 304 may be connected by a bus or other means, and fig. 3 illustrates the connection by a bus as an example.
The memory 302 is a non-volatile computer-readable storage medium and can be used for storing non-volatile software programs, non-volatile computer-executable programs, and modules, such as program instructions/modules corresponding to the interaction method in the embodiment of the present application (for example, the interaction triggering unit 201, the light emitting unit 203, and the processing unit 202 shown in fig. 2). The processor 301 executes various functional applications of the server and data processing by running nonvolatile software programs, instructions and modules stored in the memory 302, that is, implements the interaction method of the above-described method embodiment.
The memory 302 may include a program storage area and a data storage area, wherein the program storage area may store an operating system and an application required for at least one function, and the data storage area may store data created according to the use of the light-emitting device, and the like. Further, the memory 302 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 302 may optionally include memory located remotely from the processor 301, which may be connected to the light-emitting device via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 303 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the light emitting device. The output means 304 may comprise a display device such as a display screen.
The one or more modules are stored in the memory 302 and, when executed by the one or more processors 301, perform the interaction method of any of the method embodiments described above. The technical effect of the embodiment of the device for executing the interaction method is the same as or similar to that of any method embodiment.
Embodiments of the present application provide a non-transitory computer storage medium storing computer-executable instructions that can perform the interaction method in any of the above method embodiments. The technical effect of the embodiments of the non-transitory computer storage medium is the same as or similar to that of any of the method embodiments described above.
Finally, it should be noted that, as will be understood by those skilled in the art, all or part of the processes in the methods of the above embodiments may be implemented by a computer program that can be stored in a computer-readable storage medium and that, when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like. The technical effect of the embodiment of the computer program is the same as or similar to that of any of the method embodiments described above.
Furthermore, the apparatuses, devices, etc. described in the present disclosure may be various electronic terminal devices, such as a mobile phone, a Personal Digital Assistant (PDA), a tablet computer (PAD), a smart television, etc., and may also be large terminal devices, such as a server, etc., and therefore the scope of protection of the present disclosure should not be limited to a specific type of apparatus, device. The client disclosed by the present disclosure may be applied to any one of the above electronic terminal devices in the form of electronic hardware, computer software, or a combination of both.
Furthermore, the method according to the present disclosure may also be implemented as a computer program executed by a CPU, which may be stored in a computer-readable storage medium. The computer program, when executed by the CPU, performs the above-described functions defined in the method of the present disclosure.
Further, the above method steps and system elements may also be implemented using a controller and a computer readable storage medium for storing a computer program for causing the controller to implement the functions of the above steps or elements.
Further, it should be appreciated that the computer-readable storage media (e.g., memory) described herein can be either volatile memory or nonvolatile memory, or can include both volatile and nonvolatile memory. By way of example, and not limitation, nonvolatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which can act as external cache memory. By way of example and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The storage devices of the disclosed aspects are intended to comprise, without being limited to, these and other suitable types of memory.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as software or hardware depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein may be implemented or performed with the following components designed to perform the functions described herein: a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination of these components. A general purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the disclosure herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary designs, the functions may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media includes both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
It should be noted, however, that various changes and modifications could be made to the disclosed exemplary embodiments without departing from the scope of the disclosure as defined by the appended claims. The functions, steps and/or actions of the method claims in accordance with the disclosed embodiments described herein need not be performed in any particular order. Furthermore, although elements of the disclosure may be described or claimed in the singular, the plural is contemplated unless limitation to the singular is explicitly stated.
It should be understood that, as used herein, the singular forms "a," "an," "the" are intended to include the plural forms as well, unless the context clearly supports the exception. It should also be understood that "and/or" as used herein is meant to include any and all possible combinations of one or more of the associated listed items.
The above-mentioned serial numbers of the embodiments of the present disclosure are merely for description and do not represent the merits of the embodiments.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program instructing relevant hardware, where the program may be stored in a computer-readable storage medium, and the above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
Those of ordinary skill in the art will understand that the discussion of any embodiment above is meant to be exemplary only and is not intended to imply that the scope of the disclosure, including the claims, is limited to these examples; within the spirit of the embodiments of the present invention, technical features of the above embodiment or of different embodiments may also be combined, and many other variations of the different aspects of the embodiments of the invention exist, which are not provided in detail for the sake of brevity. Therefore, any omissions, modifications, substitutions, improvements, and the like that may be made without departing from the spirit and principles of the embodiments of the present invention are intended to be included within the scope of the embodiments of the present invention.

Claims (13)

1. An interaction method of a night lamp, characterized in that the night lamp comprises an infrared sensing module and a brightness sensing module, and the method comprises the following steps:
acquiring the activity of the activity factor; the acquired activity factor at least comprises light intensity, a time interval between the current time and the last interaction and the use duration in a preset interaction period;
determining an active state according to the activity of the activity factor;
receiving an interactive operation instruction, specifically comprising: receiving data collected by the infrared sensing module and the brightness sensing module; determining the interactive operation instruction according to the data acquired by the infrared sensing module and the brightness sensing module;
determining a required lighting effect by combining the interactive operation instruction and the active state;
outputting a light-emitting control instruction to control the night lamp to emit light according to the light-emitting effect;
the method further comprises the following steps:
if the brightness data acquired by the brightness sensing module is lower than a first brightness threshold value and the infrared data acquired by the infrared sensing module is within a preset range, determining that the interactive operation instruction is a voice awakening instruction and starting voice interaction;
determining the required light emitting effect as a wake-up light effect by combining the voice wake-up instruction and the active state; wherein the wake-up light effect differs depending on the active state;
and outputting a light emitting control instruction to control the night lamp to emit light according to the awakening light effect.
2. The method of claim 1, wherein the activity factors further comprise the current time and/or weather conditions.
3. The method of claim 1, wherein the night light further comprises a sound-emitting device, and the interaction method further comprises:
determining a required sound effect by combining the interactive operation instruction and the active state; and
outputting a sound control instruction to control the sound-emitting device to produce sound according to the sound effect.
4. The method of claim 3, wherein the night light further comprises a sound pickup module, and receiving the interactive operation instruction further comprises:
receiving data collected by the sound pickup module; and
determining the interactive operation instruction according to the data collected by the sound pickup module.
5. The method of claim 4, further comprising:
if none of the infrared sensing module, the brightness sensing module, and the sound pickup module collects data, determining the light-emitting effect and the sound effect according to the active state alone.
6. The method of claim 4, further comprising:
if the voice data collected by the sound pickup module is detected to be a wake-up word, determining that the interactive operation instruction is a voice wake-up instruction and starting voice interaction.
7. The method of claim 4, further comprising:
determining whether the current time falls within a sleep period; and
if the current time falls within the sleep period, the brightness data collected by the brightness sensing module is below a second brightness threshold, and a light-on instruction is received, setting the light-emitting effect to a dim night-light effect and outputting a light-emitting control instruction to control the night light to emit light according to that effect.
8. A night light, comprising:
an interaction triggering unit configured to receive an interactive operation instruction, the interaction triggering unit comprising an infrared sensing module and a brightness sensing module;
a light-emitting unit configured to emit light according to a light-emitting control instruction; and
a processing unit configured to:
acquire the activity levels of at least two activity factors, the acquired activity factors comprising at least a light intensity, a time interval between the current time and the last interaction, and a usage duration within a preset interaction period;
determine an active state according to the activity levels of the at least two activity factors;
receive data collected by the infrared sensing module and the brightness sensing module, and determine the interactive operation instruction according to the collected data;
determine a required light-emitting effect by combining the interactive operation instruction and the active state; and
output a light-emitting control instruction to control the light-emitting unit to emit light according to the light-emitting effect;
the processing unit being further configured to:
if the brightness data collected by the brightness sensing module is below a first brightness threshold and the infrared data collected by the infrared sensing module is within a preset range, determine that the interactive operation instruction is a voice wake-up instruction and start voice interaction;
determine the required light-emitting effect to be a wake-up light effect by combining the voice wake-up instruction and the active state, wherein the wake-up light effect differs depending on the active state; and
output a light-emitting control instruction to control the night light to emit light according to the wake-up light effect.
9. The night light of claim 8, wherein the activity factors further comprise the current time and/or weather conditions.
10. The night light of claim 8, further comprising a sound-emitting unit configured to produce sound according to a sound control instruction;
the processing unit being configured to:
determine a required sound effect by combining the interactive operation instruction and the active state; and
output a sound control instruction to control the sound-emitting unit to produce sound according to the sound effect.
11. The night light of claim 10, wherein the interaction triggering unit further comprises a sound pickup module, and the processing unit is configured to:
receive data collected by the sound pickup module; and
determine the interactive operation instruction according to the data collected by the sound pickup module.
12. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of any one of claims 1-7.
13. A computer-readable storage medium storing a computer program which, when executed by a processor, carries out the steps of the method of any one of claims 1-7.
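The control logic recited in claims 1 and 5-7 can be illustrated with a minimal Python sketch. It is not part of the patent text: every threshold value, the wake word, the sleep-period boundaries, and the scoring rule inside `determine_active_state` are hypothetical values chosen only for illustration, since the claims name these quantities without fixing them.

```python
from datetime import time

# Hypothetical constants; the claims name the thresholds but do not fix their values.
FIRST_BRIGHTNESS_THRESHOLD = 50      # lux, claim 1
SECOND_BRIGHTNESS_THRESHOLD = 10     # lux, claim 7
IR_PRESET_RANGE = (0.2, 1.5)         # metres, "preset range" of the infrared data
WAKE_WORD = "hi luka"                # claim 6 wake-up word (illustrative)
SLEEP_START, SLEEP_END = time(22, 0), time(7, 0)   # claim 7 sleep period

def determine_active_state(light_intensity, seconds_since_last_interaction,
                           usage_seconds_in_period):
    """Claim 1: fold the three named activity factors into a coarse active state.
    The majority-vote scoring below is an arbitrary stand-in for the patented weighting."""
    score = (int(light_intensity > 100)
             + int(seconds_since_last_interaction < 600)
             + int(usage_seconds_in_period > 1800))
    return "active" if score >= 2 else "idle"

def determine_instruction(brightness, ir_distance, voice_text):
    """Claims 1, 5, 6: map collected sensor data to an interactive operation instruction."""
    if brightness is None and ir_distance is None and voice_text is None:
        return "none"            # claim 5: no module collected any data
    if voice_text is not None and voice_text.strip().lower() == WAKE_WORD:
        return "voice_wakeup"    # claim 6: wake-up word detected
    if (brightness is not None and brightness < FIRST_BRIGHTNESS_THRESHOLD
            and ir_distance is not None
            and IR_PRESET_RANGE[0] <= ir_distance <= IR_PRESET_RANGE[1]):
        return "voice_wakeup"    # claim 1: dark room + infrared presence => voice wake-up
    return "other"

def in_sleep_period(now):
    # The assumed sleep period crosses midnight, hence the union of two ranges.
    return now >= SLEEP_START or now < SLEEP_END

def determine_light_effect(instruction, active_state, now=None,
                           brightness=None, light_on_requested=False):
    """Claims 1, 5, 7: choose the light-emitting effect from instruction and state."""
    if (now is not None and in_sleep_period(now)
            and brightness is not None
            and brightness < SECOND_BRIGHTNESS_THRESHOLD
            and light_on_requested):
        return "night_light_effect"             # claim 7: dim night-light effect
    if instruction == "voice_wakeup":
        return f"wakeup_effect_{active_state}"  # claim 1: differs with the active state
    if instruction == "none":
        return f"idle_effect_{active_state}"    # claim 5: driven by the active state alone
    return "default_effect"
```

For example, a dark room with a person in infrared range resolves to a wake-up effect whose variant depends on the computed active state, while a light-on request in a dark room during the sleep period resolves to the dim night-light effect instead.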
CN201810575523.7A 2018-06-06 2018-06-06 Light emitting device, interaction method thereof, electronic device and storage medium Active CN109116978B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810575523.7A CN109116978B (en) 2018-06-06 2018-06-06 Light emitting device, interaction method thereof, electronic device and storage medium
PCT/CN2018/113439 WO2019233029A1 (en) 2018-06-06 2018-11-01 Light-emitting device and interaction method therefor, and electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810575523.7A CN109116978B (en) 2018-06-06 2018-06-06 Light emitting device, interaction method thereof, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN109116978A CN109116978A (en) 2019-01-01
CN109116978B true CN109116978B (en) 2021-03-23

Family

ID=64821796

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810575523.7A Active CN109116978B (en) 2018-06-06 2018-06-06 Light emitting device, interaction method thereof, electronic device and storage medium

Country Status (2)

Country Link
CN (1) CN109116978B (en)
WO (1) WO2019233029A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110291995A (en) * 2019-05-24 2019-10-01 丁韩 Lamp light control method and device applied to pet supplies
CN113709953A (en) * 2021-09-03 2021-11-26 上海蔚洲电子科技有限公司 LED light interactive control system and method and interactive display system

Citations (5)

Publication number Priority date Publication date Assignee Title
CN105511623A (en) * 2015-12-15 2016-04-20 深圳先进技术研究院 Interaction method and device
CN105764188A (en) * 2016-03-23 2016-07-13 北京百度网讯科技有限公司 Illumination controller and illumination control system and method
CN107277989A (en) * 2017-06-16 2017-10-20 深圳市盛路物联通讯技术有限公司 Intelligent House Light control method and device
CN107295193A * 2017-07-14 2017-10-24 广东欧珀移动通信有限公司 Ringtone control method, device, storage medium and electronic device
CN108093526A * 2017-12-28 2018-05-29 美的智慧家居科技有限公司 Control method and device for LED light, and readable storage medium

Family Cites Families (12)

Publication number Priority date Publication date Assignee Title
JP4605486B2 (en) * 2009-03-31 2011-01-05 八洲電業株式会社 LED lighting control system
CN102346436B (en) * 2010-08-05 2013-04-03 深圳市超维实业有限公司 Natural awakening device and method
CN103889091A (en) * 2012-12-19 2014-06-25 海尔集团公司 Street lamp control method and control device
CN105848374A (en) * 2015-01-12 2016-08-10 芋头科技(杭州)有限公司 Light control system and method
CN105444007A (en) * 2016-01-08 2016-03-30 福州智能小白电子科技有限公司 Intelligent LED lamp
CN105873321B * 2016-05-09 2018-08-31 国网山东省电力公司巨野县供电公司 Lighting system control method using an intelligent switch control system
CN105792485A (en) * 2016-05-17 2016-07-20 南宁市茂宏信息技术有限公司 Intelligent lighting control switch
US9781794B1 (en) * 2016-11-17 2017-10-03 Echelon Corporation System and method for optimizing lighting in response to online weather data
CN106793306A * 2016-12-28 2017-05-31 郑州北斗七星通讯科技有限公司 Intelligent indoor mini night light
CN106878118A * 2017-01-03 2017-06-20 美的集团股份有限公司 Intelligent home appliance voice control method and system
CN106912150B * 2017-03-17 2019-06-21 青岛亿联客信息技术有限公司 Method and system for automatic illumination according to the user's usage habits
CN107484308B * 2017-07-31 2019-09-17 北京小米移动软件有限公司 Control method and device for lighting apparatus, and storage medium


Also Published As

Publication number Publication date
WO2019233029A1 (en) 2019-12-12
CN109116978A (en) 2019-01-01

Similar Documents

Publication Publication Date Title
US20220171451A1 (en) Techniques for adjusting computing device sleep states
JP6490675B2 (en) Smart home hazard detector that gives a non-alarm status signal at the right moment
KR102444165B1 (en) Apparatus and method for providing a meeting adaptively
CN106791215B (en) Alarm clock setting method and mobile terminal with alarm clock function
CN108111948A (en) The visual output that server at speech interface equipment provides
US20190230775A1 (en) Illumination control
CN111282126A (en) Sleep wake-up method and device
WO2018129716A1 (en) Intelligent light control system and method
JP2022534338A (en) Smart display panel device and related method
CN109116978B (en) Light emitting device, interaction method thereof, electronic device and storage medium
US9874933B1 (en) Device shutdown routine optimization
EP3574713B1 (en) Recommendation engine for a lighting system
CN106453867A (en) Alarm clock control method and device
US10642231B1 (en) Switch terminal system with an activity assistant
US20180210700A1 (en) Contextual user interface based on environment
CN109429415A (en) Illumination control method, apparatus and system
US10754611B2 (en) Filtering sound based on desirability
KR100809659B1 (en) System and method for offering intelligent home service
CN110740550B (en) Control method and device for story accompanying lamp, story accompanying lamp and storage medium
CN111028908A (en) Sleep state monitoring method, device, equipment and computer readable storage medium
US20210011614A1 (en) Method and apparatus for mood based computing experience
WO2019058673A1 (en) Information processing device, information processing terminal, information processing method, and program
JPWO2018147455A1 (en) Wake-up method and device using the same
CN109766040B (en) Control method and control device
US10401805B1 (en) Switch terminal system with third party access

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 100000 Room D529, No. 501, Floor 5, Building 2, Fourth District, Wangjing Dongyuan, Chaoyang District, Beijing

Patentee after: Beijing Wuling Technology Co.,Ltd.

Address before: 100102 room 3602, 36 / F, building 101, building 13, District 4, Wangjing East Garden, Chaoyang District, Beijing

Patentee before: BEIJING LING TECHNOLOGY Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20221221

Address after: 100000 Room 815, Floor 8, Building 6, Yard 33, Guangshun North Street, Chaoyang District, Beijing

Patentee after: Luka (Beijing) Intelligent Technology Co.,Ltd.

Address before: 100000 Room D529, No. 501, Floor 5, Building 2, Fourth District, Wangjing Dongyuan, Chaoyang District, Beijing

Patentee before: Beijing Wuling Technology Co.,Ltd.