WO2019233029A1 - Light-Emitting Device and Interaction Method Thereof, Electronic Device, and Storage Medium - Google Patents

Light-Emitting Device and Interaction Method Thereof, Electronic Device, and Storage Medium

Info

Publication number
WO2019233029A1
WO2019233029A1 (PCT/CN2018/113439)
Authority
WO
WIPO (PCT)
Prior art keywords
light
sensing module
interaction
emitting device
effect
Prior art date
Application number
PCT/CN2018/113439
Other languages
English (en)
French (fr)
Inventor
李阳
顾嘉唯
Original Assignee
北京物灵智能科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京物灵智能科技有限公司
Publication of WO2019233029A1 publication Critical patent/WO2019233029A1/zh

Classifications

    • H - ELECTRICITY
    • H05 - ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B - ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00 - Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 - Controlling the light source
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G10 - MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L - SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 - Speech recognition
    • G10L 15/22 - Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L 2015/223 - Execution procedure of a spoken command
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B - CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00 - Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40 - Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present invention relates to the field of interaction technology, and particularly to a light-emitting device, an interaction method thereof, an electronic device, and a storage medium.
  • in existing light-emitting devices, the corresponding functions are implemented in a single mode.
  • for example, a smart night light can only be turned on automatically by detecting human presence; its application scenario is therefore limited, it is difficult for it to handle complex conditions, its fault tolerance is very low, its false-trigger rate is high, and it has no linkage effect.
  • such a device feels rigid and offers no surprises; it is difficult for it to go beyond being a mere electronic appliance, and difficult to convey the experience of perceiving the user, understanding the user, and giving the user appropriate feedback.
  • one object of the embodiments of the present invention is to provide a light-emitting device and an interaction method thereof, an electronic device, and a storage medium, which can, to a certain extent, solve the problem of the single judgment logic of light-emitting devices in the prior art.
  • a first aspect of an embodiment of the present invention provides an interaction method of a light emitting device, including:
  • the activity factor is selected from at least one of the following parameters: current time, light intensity, weather conditions, the time interval between the current time and the last interaction, and the duration of use within a preset interaction period;
  • a light-emitting control instruction is output to control the light-emitting device to emit light.
  • the obtained activity factor includes at least the time interval between the current time and the last interaction and the duration of use within a preset interaction period.
  • the obtained activity factor further includes light intensity.
  • the light emitting device further includes a sound generating device; the interaction method further includes:
  • a sounding control instruction is output to control the sounding device to sound.
  • the light emitting device further includes an infrared sensing module, a brightness sensing module, and a sound pickup module;
  • the receiving an interactive operation instruction includes:
  • the interactive operation instruction is determined according to data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
  • the method further includes:
  • the lighting effect and sound effect are determined according to the active state.
  • the method further includes:
  • if the brightness data collected by the brightness sensing module is lower than a first brightness threshold and the infrared data collected by the infrared sensing module is within a preset range, it is determined that the interactive operation instruction is a voice wake-up instruction, and voice interaction is started;
  • if the wake-up word is detected in the voice data collected by the sound pickup module, it is determined that the interactive operation instruction is a voice wake-up instruction, and voice interaction is started.
  • the method further includes:
  • a light emission control instruction is output to control the light emitting device to emit light.
  • the method further includes:
  • if the brightness data collected by the brightness sensing module is lower than a second brightness threshold and a light-on instruction is received, the lighting effect is a night-light effect, and according to the night-light effect, a light-emission control instruction is output to control the light-emitting device to emit light.
  • a light emitting device including:
  • An interaction triggering unit for receiving an interactive operation instruction
  • a light emitting unit for emitting light according to a light emitting control instruction
  • the processing unit is configured as:
  • the activity factors are selected from the group consisting of: current time, light intensity, weather conditions, the time interval between the current time and the last interaction, and the duration of use within a preset interaction period;
  • a light-emitting control instruction is output to control the light-emitting unit to emit light.
  • the light emitting device further includes a sound emitting unit for sounding according to a sounding control instruction;
  • the processing unit is configured to:
  • the interaction triggering unit includes an infrared sensing module, a brightness sensing module, and a sound pickup module;
  • the processing unit is configured to:
  • the interactive operation instruction is determined according to data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
  • an electronic device including:
  • at least one processor; and
  • a memory connected in communication with the at least one processor; wherein,
  • the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor, so that the at least one processor can execute the method according to any one of the foregoing.
  • a computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of the foregoing.
  • the light-emitting device and the interaction method, electronic device, and storage medium provided by the embodiments of the present invention determine the active state according to the activity degree of the activity factors, and then combine the active state with the received interactive operation instruction to determine the corresponding lighting effect, so that the judgment logic of the lighting effect is no longer single; compared with a light-emitting device with single judgment logic, misoperation is less likely to occur.
  • FIG. 1 is a schematic flowchart of an embodiment of an interaction method of a light emitting device provided by the present invention
  • FIG. 2 is a schematic structural diagram of an embodiment of a light emitting device provided by the present invention.
  • FIG. 3 is a schematic structural diagram of an embodiment of an electronic device provided by the present invention.
  • FIG. 1 it is a schematic flowchart of an embodiment of an interaction method of a light emitting device provided by the present invention.
  • the interaction method of the light emitting device includes:
  • Step 101 Obtain the activity degree of the activity factor, wherein the activity factor is selected from at least one of the following parameters: current time, light intensity, weather conditions, the time interval between the current time and the last interaction, and the duration of use within a preset interaction period.
  • Step 102 Determine an active state according to the activity of the activity factor.
  • the active state is determined according to the activity degree of the selected activity factor; specifically, refer to the following tables.
  • the activity distributions in different time periods (hr) are shown in Table 1:
  • the active states for a time period can be divided into six levels: extremely inactive (-20), very inactive (-10), inactive (0), active (+5), very active (+10), and extremely active (+20). For example, when the current time is 23:00 and the corresponding activity degree is 0, the current active state is "inactive".
  • similarly, the active states for a light-intensity range can be divided into six levels: extremely inactive (-20), very inactive (-10), inactive (0), active (+5), very active (+10), and extremely active (+20).
  • [Table 2: light-intensity ranges mapped to the six activity degrees listed above.]
  • the value of the light intensity in Table 2 is a percentage, which is the ratio of the current light intensity to the reference light intensity.
  • the reference light intensity may be the light intensity collected at 12:00 noon on the previous day; at 12:00 noon each day, the light intensity detected in real time may be stored as the new reference light intensity, so that the daily light-intensity percentage is updated in real time and better reflects actual conditions.
  • the reference light intensity may also be a predetermined light intensity value, such as a value set by default or a user-defined value, and so on.
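As an illustrative sketch only (the class, names, and sample values below are our own assumptions, not from the patent), the percentage-of-reference light intensity with a daily noon refresh might be computed as:

```python
from datetime import datetime


class LightIntensityTracker:
    """Tracks a reference light intensity, refreshed daily at noon."""

    def __init__(self, reference: float):
        self.reference = reference        # e.g. yesterday's noon reading
        self.last_refresh_day = None

    def update(self, lux: float, now: datetime) -> float:
        # At 12:00 each day, store the live reading as the new reference.
        if now.hour == 12 and self.last_refresh_day != now.date():
            self.reference = lux
            self.last_refresh_day = now.date()
        # The "light intensity" fed into the activity tables is a
        # percentage of the reference intensity.
        return 100.0 * lux / self.reference


tracker = LightIntensityTracker(reference=800.0)
pct = tracker.update(lux=40.0, now=datetime(2019, 6, 1, 21, 0))
print(round(pct))  # 5 -> a dim evening reading, 5% of the noon reference
```

A fixed or user-defined reference, as the text allows, would simply skip the noon refresh branch.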
  • the active states for the corresponding weather conditions can be divided into five levels: extremely inactive (-20), very inactive (-15), inactive (-10), normal (0), and active (+10).
  • [Table 3: weather conditions mapped to the five activity degrees listed above.]
  • for example, when the activity degree corresponding to the current weather condition is -15, the current active state is "very inactive".
  • the active states for the corresponding time interval can be divided into seven levels: extremely inactive (-20), very inactive (-15), inactive (-10), normal (0), active (+10), very active (+20), and extremely active (+30). For example, if the time interval between the current time and the last interaction is greater than 48 hours and the corresponding activity degree is +30, the current active state is "extremely active".
  • the activity factor may also be the duration of use within a preset interaction period (for example, 24 hours or 48 hours);
  • the active states for the corresponding use duration can be divided into six levels: extremely inactive (-20), inactive (-10), normal (0), active (+10), very active (+15), and extremely active (+20).
  • [Table 5: cumulative use duration within the interaction period mapped to the six activity degrees listed above.]
  • when the endpoints of two adjacent parameter ranges in the above tables overlap, the ranges can be distinguished by a left-open or right-open convention.
  • for example, the cumulative use-duration ranges 0-1 min and 1-5 min overlap at 1 minute; they can be distinguished either by excluding the 1-minute endpoint from 0-1 min, or by excluding it from 1-5 min.
  • the parameter ranges of other tables can also be distinguished by similar rules, which will not be repeated here.
  • alternatively, two or more of the five parameters (current time, light intensity, weather conditions, the time interval between the current time and the last interaction, and the duration of use within the preset interaction period) may be selected together to determine the active state.
  • for example, when both the current time and the light intensity are selected, the activity degrees in the foregoing Tables 1 and 2 can be added, and the sum then mapped to an active state. For example, when the current time is 21:00 the activity degree is +20, and when the current light intensity is 5 the activity degree is +20; the sum of the two is +40, and the current active state is "very active".
  • when the sum of the two activity degrees is some other value, the active state may be divided accordingly following the foregoing division method, and details are not repeated here.
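A minimal sketch of this additive scoring scheme; since the patent's tables are not reproduced here, the cut-off values and labels below are illustrative assumptions:

```python
# Activity degrees contributed by each selected factor, per Tables 1-5.
# Example: 21:00 contributes +20 (Table 1), light intensity 5 contributes +20 (Table 2).
factor_scores = {"time": +20, "light": +20}


def active_state(total: int) -> str:
    """Map a summed activity degree to an active-state label (illustrative cut-offs)."""
    if total >= 30:
        return "very active"
    if total >= 5:
        return "active"
    if total >= 0:
        return "normal"
    if total >= -15:
        return "inactive"
    return "very inactive"


total = sum(factor_scores.values())
print(total, active_state(total))  # 40 very active
```

Adding more factors (weather, interaction interval, use duration) only extends the dictionary; the mapping step is unchanged.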
  • optionally, the obtained activity factors include at least the time interval between the current time and the last interaction and the duration of use within a preset interaction period, so that the active state can be better reflected.
  • the obtained activity factor further includes light intensity, thereby further accurately reflecting the active state.
  • optionally, the current time, light intensity, weather conditions, the time interval between the current time and the last interaction, and the duration of use within a preset interaction period may all be used as activity factors to determine the active state.
  • for example, the activity degree of the initial state may be 50;
  • T, L, W, I, and P respectively represent the activity degrees corresponding to the current time, light intensity, weather conditions, the time interval between the current time and the last interaction, and the duration of use within the preset interaction period.
  • the active state can be determined according to Table 6 below.
  • optionally, the active state in the interaction method of the light-emitting device may be updated by timed refresh, for example refreshing once every hour; alternatively, a refresh may be triggered when a parameter changes, for example when heavy rain suddenly begins, the activity degree for heavy-rain weather is applied and the active state is refreshed, and so on.
  • the light emitting device has an active state at any time.
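The two refresh paths (timer-driven and parameter-change-driven) could be sketched as follows; the class structure, interval, and placeholder scoring are our own assumptions:

```python
import time


class ActiveStateManager:
    """Keeps the active state fresh via timed and event-triggered refreshes."""

    def __init__(self, refresh_interval_s: float = 3600.0):
        self.refresh_interval_s = refresh_interval_s   # e.g. refresh hourly
        self.last_refresh = 0.0
        self.weather = "clear"
        self.state = "normal"

    def recompute(self) -> None:
        # Placeholder: the real device would re-read all activity factors
        # and re-run the table-based scoring described above.
        self.state = "inactive" if self.weather == "heavy rain" else "normal"
        self.last_refresh = time.monotonic()

    def tick(self) -> None:
        # Timed refresh: recompute once per interval.
        if time.monotonic() - self.last_refresh >= self.refresh_interval_s:
            self.recompute()

    def on_weather_change(self, weather: str) -> None:
        # Event-triggered refresh: a parameter changed, so refresh immediately.
        if weather != self.weather:
            self.weather = weather
            self.recompute()


mgr = ActiveStateManager()
mgr.on_weather_change("heavy rain")
print(mgr.state)  # inactive
```

Either path ends in the same recompute step, which is what guarantees the device has a current active state at all times.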
  • Step 103 Receive an interactive operation instruction.
  • the interactive operation instruction may refer to an instruction generated for interacting with the light-emitting device, and mainly uses the instruction to control the light-emitting device to emit light with a certain lighting effect.
  • the light emitting device further includes an infrared sensing module, a brightness sensing module, and a sound pickup module; preferably, the infrared sensing module is a pyroelectric infrared sensor;
  • the receiving an interactive operation instruction includes:
  • the interactive operation instruction is determined according to data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
  • for example, the interactive operation instruction may be a voice wake-up instruction.
  • the specific combination mode can be set according to the actual situation.
  • Step 104 Combine the interactive operation instruction and the active state to determine a required lighting effect.
  • the interaction method of the light emitting device further includes:
  • when the infrared sensing module, the brightness sensing module, and the sound pickup module collect no data, the device is confirmed to be in an unused standby state (which is also treated as a type of interactive operation instruction), and the lighting effect is determined from the active state.
  • in the silent standby state, the light-emitting device presents different lighting effects depending on the active state, as shown in Table 7 below.
  • lighting effects 1-9 are distinct lighting effects used to express the active state of the light-emitting device.
  • lighting effects 1-3, used in very active states, appear enthusiastic and cheerful, while lighting effects 7-9, used in very inactive states, appear quiet and melancholic.
  • the interaction method of the light emitting device further includes:
  • the first brightness threshold may be set as needed; one selection criterion is a brightness value that would normally be considered relatively dim. The preset range may be selected according to the actual situation as the range of infrared data normally detected when someone walks nearby; neither is limited here.
  • when the brightness data collected by the brightness sensing module is lower than the first brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, this condition takes over the role of the wake-up word, so the corresponding voice interaction function need not be awakened by a wake-up word before voice interaction; this saves operation steps and makes the device easier to use.
  • if the wake-up word is detected in the voice data collected by the sound pickup module, it is determined that the interactive operation instruction is a voice wake-up instruction and voice interaction is started; optionally, the wake-up word may be a default or set by the user, and its content is not specifically limited here.
  • the light-emitting device may be controlled to emit light with a corresponding light-emitting effect by voice, or perform other interactions with the light-emitting device by voice control.
  • the interaction method of the light emitting device may further include the following steps:
  • a light emission control instruction is output to control the light emitting device to emit light.
  • after determining that a voice wake-up instruction has been received, the light-emitting device, in combination with the active state, emits light with a wake-up lighting effect, so that the user knows from this special effect that the device can currently be interacted with by voice. On the one hand, this reminds the user that the voice interaction function is active and that the device can be commanded by voice at any time; on the other hand, if the wake-up was a misoperation, it prompts the user to turn the function off in time and avoid further misoperation.
  • the interaction method of the light emitting device further includes:
  • if the brightness data collected by the brightness sensing module is lower than the second brightness threshold and a light-on instruction is received, the lighting effect is a night-light effect, and according to the night-light effect, a light-emission control instruction is output to control the light-emitting device to emit light.
  • the second brightness threshold may be set as needed; one selection criterion is the light intensity normally present when people fall asleep.
  • in some cases the second brightness threshold may be the same as the first brightness threshold, or it may differ; the specific value depends on the situation.
  • the night-light effect is overall yellowish and dim; its specific color temperature and brightness can be selected according to actual needs and are not limited here.
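The night-light branch just described might look like the following sketch; the threshold, color temperature, and effect encoding are illustrative assumptions, not values from the patent:

```python
def choose_effect(brightness: float, light_on_cmd: bool,
                  second_threshold: float = 10.0) -> dict:
    """Pick the night-light effect when a light-on command arrives in the dark."""
    if brightness < second_threshold and light_on_cmd:
        # Night light: overall yellowish and dim.
        return {"effect": "night_light", "color_temp_k": 2200, "brightness_pct": 15}
    # Otherwise fall back to an ordinary illumination effect.
    return {"effect": "normal_light", "color_temp_k": 4000, "brightness_pct": 80}


print(choose_effect(brightness=3.0, light_on_cmd=True)["effect"])  # night_light
```

The returned dictionary stands in for the light-emission control instruction, whose light parameters (intensity, color temperature) differ per effect.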
  • the light-emitting device displays different light-emitting effects (including bright illumination light and color-changing light) when interacting with people in different situations, as shown in Table 8 below.
  • the interactive operation instructions include at least the following types:
  • the third brightness threshold may be set as needed; one selection criterion is a brightness value that would normally be considered relatively bright. In some cases its specific value may be the same as the first brightness threshold, depending on the situation; it is not specifically limited here.
  • when the brightness data collected by the brightness sensing module is lower than the first brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, this means the current light is dim and the infrared sensor has detected someone; it is then determined that the interactive operation instruction is a voice wake-up instruction, voice interaction is started, and one of the corresponding wake-up light colors c1-c7 is selected in combination with the active state.
  • if the wake-up word is detected in the voice data collected by the sound pickup module, this indicates the user is waking the voice interaction function by voice; it is then determined that the interactive operation instruction is a voice wake-up instruction, voice interaction is started, and one of the corresponding wake-up light colors c1-c7 is selected in combination with the active state.
  • if three consecutive sharp sounds are detected within a certain time interval in the voice data collected by the sound pickup module, this means the user has clapped three times; one of the corresponding lighting effects 17-23 is then selected in combination with the active state.
  • if the brightness data collected by the brightness sensing module shows a jitter (such as a sudden drop in brightness or repeated alternation between high and low levels), this means the user is waving a hand above the device; one of the corresponding lighting effects 24-30 is then selected in combination with the active state.
  • if the audio data collected by the sound pickup module shows an oscillation (such as a sudden change in frequency or amplitude), this means the user is blowing at the top of the device; the corresponding lighting effect is then selected in combination with the active state.
  • the light-on instruction may be implemented by operations such as voice control, physical switch buttons, or mobile phone control.
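Pulling the branches above together, the mapping from raw sensor observations to an interactive operation instruction could be sketched as follows; every name, threshold, and instruction label here is an illustrative assumption:

```python
def classify_instruction(brightness: float, infrared_in_range: bool,
                         wake_word_heard: bool, beep_count: int,
                         brightness_jitter: bool, audio_oscillation: bool,
                         first_threshold: float = 20.0) -> str:
    """Map sensor observations to an interactive operation instruction."""
    if wake_word_heard:
        return "voice_wake"            # explicit wake word in pickup audio
    if brightness < first_threshold and infrared_in_range:
        return "voice_wake"            # dim light + someone present: implicit wake
    if beep_count == 3:
        return "clap_three_times"      # three sharp sounds in a short window
    if brightness_jitter:
        return "hand_wave"             # brightness shakes: hand waved over device
    if audio_oscillation:
        return "blow"                  # frequency/amplitude swing: user blowing
    return "standby"                   # no sensor activity: silent standby


print(classify_instruction(5.0, True, False, 0, False, False))  # voice_wake
```

The returned instruction would then be combined with the active state to index into the effect tables (e.g. effects 17-23 for claps, 24-30 for a hand wave).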
  • Step 105 According to the lighting effect, output a light-emission control instruction to control the light-emitting device to emit light. Here, by associating the foregoing interactive operation instruction, active state, and lighting effect, the corresponding light-emission control instruction is obtained to control the light-emitting device to emit light; the light-emission control instruction includes light parameters such as light intensity and color temperature, and different lighting effects correspond to different light parameters.
  • the light emitting device further includes a sound emitting device.
  • optionally, the sound emitting device is a loudspeaker, and the sound pickup module is a microphone pickup module.
  • the method for interacting with the light emitting device further includes:
  • a sounding control instruction is output to control the sounding device to sound.
  • the sound effect may be combined with the aforementioned lighting effect to provide more interactive effects.
  • the acousto-optic effects 1-9 are different combinations of sound effects and light used to express the active state of the light-emitting device.
  • acousto-optic effects 1-3, used in very active states, appear enthusiastic and cheerful, while acousto-optic effects 7-9, used in very inactive states, appear quiet and melancholic.
  • the light-emitting device may exhibit different acousto-optic effects when interacting with people in different situations, as shown in Table 10 below.
  • the interactive operation instructions include at least the following types:
  • the third brightness threshold may be set as needed; one selection criterion is a brightness value that would normally be considered relatively bright. In some cases its specific value may be the same as the first brightness threshold, depending on the situation; it is not specifically limited here.
  • when the brightness data collected by the brightness sensing module is lower than the first brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, this means the current light is dim and the infrared sensor has detected someone; it is then determined that the interactive operation instruction is a voice wake-up instruction, voice interaction is started, one of the corresponding wake-up light colors c1-c7 is selected in combination with the active state, and a corresponding sound effect may be provided to match the wake-up light color.
  • if the wake-up word is detected in the voice data collected by the sound pickup module, this indicates the user is waking the voice interaction function by voice; it is then determined that the interactive operation instruction is a voice wake-up instruction, voice interaction is started, one of the corresponding wake-up light colors c1-c7 is selected in combination with the active state, and a corresponding sound effect may also be provided to match the wake-up light color.
  • if three consecutive sharp sounds are detected within a certain time interval in the voice data collected by the sound pickup module, this means the user has clapped three times; one of the corresponding acousto-optic effects 17-23 is then selected in combination with the active state.
  • if the brightness data collected by the brightness sensing module shows a jitter (such as a sudden drop in brightness or repeated alternation between high and low levels), this means the user is waving a hand above the device; one of the corresponding acousto-optic effects 24-30 is then selected in combination with the active state.
  • if the audio data collected by the sound pickup module shows an oscillation (such as a sudden change in frequency or amplitude), this means the user is blowing at the top of the device; the corresponding acousto-optic effect is then selected in combination with the active state.
  • the light-on instruction may be issued by voice control, a physical switch button, mobile-phone control, or similar operations; optionally, a corresponding sound effect may also be provided to accompany the night light, enriching the interaction.
  • the method for interacting with a light emitting device determines an active state according to the activity of an activity factor, and then combines the active state with the received interactive operation instruction to determine the corresponding light emission. Effect, the judgment logic of the light-emitting effect is no longer single, and compared with the light-emitting device with a single judgment logic, the problem of misoperation is less likely to occur.
  • the active state can be better performed. OK to better interact with users.
  • the active state directly affects various performance states of the light-emitting device.
  • the active-state judgment mechanism provided by the embodiments of the present invention gives the light-emitting device its own judgment of activity, so it can give different feedback to different interaction triggers rather than an invariant interactive expression; it escapes the dull feel of an ordinary electronic device, enriching and enlivening its expression and giving users an experience beyond expectations.
  • the interactive operation instruction is an instruction obtained by analyzing data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module, so that a multi-modal interactive light-emitting device is formed.
  • the device can thus be a smart device that recognizes, understands, and interacts with human behavior.
  • under different activity levels, events such as a user voice wake-up, the infrared sensor detecting someone in bright light, the infrared sensor detecting someone in dim light, the user clapping three times, the user waving above the device, the user blowing at the top of the device, or the user turning on the light during a sleep period in dim conditions each receive different sound-and-light feedback through a wake-up judgment mechanism, with a logical decision made for each.
  • the overall multi-modal interactive logical judgment framework can handle complex situations; it is no longer a one-to-one judgment method. Through comprehensive judgment of ambient light, current time, and user behavior, the most appropriate feedback is given for the current situation, which improves the device's fault tolerance, keeps the false-trigger rate very low, and yields a good linkage effect.
  • the interaction method of the light-emitting device comprehensively analyzes each input (current time, current light conditions, whether the infrared sensor currently detects someone, current weather conditions, the time of the last interaction between the device and the user, the cumulative use duration of the device over roughly the last 24 hours, the current user's voice commands, and the current user's behavior) to make an overall judgment and determine the appropriate interactive feedback, expressed through sound and light effects, to meet the interaction needs of various scenarios.
  • FIG. 2 is a schematic structural diagram of an embodiment of a light-emitting device provided by the present invention.
  • the light-emitting device includes:
  • an interaction triggering unit 201 configured to receive an interaction instruction;
  • a light-emitting unit 203 configured to emit light according to a lighting control instruction;
  • the processing unit 202 is configured to:
  • obtain the activity levels of at least two activity factors selected from: the current time, light intensity, weather conditions, the time interval since the last interaction, and the duration of use within a preset interaction period;
  • determine the required lighting effect, and output a lighting control instruction according to it to control the light-emitting unit 203 to emit light.
  • the light-emitting device provided by this embodiment of the present invention determines an active state from the activity levels of the activity factors and then combines that state with the received interaction instruction to determine the corresponding lighting effect. The judgment logic for the lighting effect is therefore no longer single-track; compared with a device using single judgment logic, misoperation problems are less likely to occur.
  • the light-emitting device further includes a sound-emitting unit 304 for producing sound according to a sound control instruction;
  • the processing unit 202 is configured to:
  • output a sound control instruction to control the sound-emitting unit 304, so that the sound effect can be combined with the aforementioned lighting effect to provide richer interaction.
  • the interaction triggering unit 201 includes an infrared sensing module, a brightness sensing module, and a sound pickup module;
  • the processing unit 202 is configured to:
  • the interaction instruction is determined according to data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
  • in this way, a multi-modal interactive light-emitting device is formed.
  • that is, a smart device that recognizes and understands human behavior. Under different activity levels, each trigger — the user waking the device by voice, the infrared sensor detecting a person in bright light, the infrared sensor detecting a person in dim light, the user clapping three times, the user waving over the top of the device, the user blowing on the top of the device, or the user turning on the light during the sleep period in a dim environment — has its own sound-and-light feedback and its own wake-up judgment mechanism, with a one-to-one logical decision made for each.
  • the overall multi-modal interaction judgment framework can handle complex situations. It is no longer a one-to-one judgment method; through a comprehensive judgment of ambient light, the current time, and user behavior, the most appropriate feedback is given for the current situation, which improves the device's fault tolerance, keeps the false-trigger rate very low, and yields good linkage effects.
  • FIG. 3 is a schematic diagram of the hardware structure of an embodiment of an apparatus for performing the interaction method provided by the present invention.
  • the device includes:
  • the device for executing the interaction method may further include: an input device 303 and an output device 304.
  • the processor 301, the memory 302, the input device 303, and the output device 304 may be connected through a bus or other methods. In FIG. 3, the connection through the bus is taken as an example.
  • the memory 302 is a non-volatile computer-readable storage medium and can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the interaction methods in the embodiments of the present application (for example, the interaction triggering unit 201, the light-emitting unit 203, and the processing unit 202 shown in FIG. 2).
  • the processor 301 executes the various functional applications and data processing of the server by running the non-volatile software programs, instructions, and modules stored in the memory 302, thereby implementing the interaction method of the foregoing method embodiments.
  • the memory 302 may include a storage program area and a storage data area, where the storage program area may store an operating system and application programs required for at least one function; the storage data area may store data created according to the use of the light emitting device, and the like.
  • the memory 302 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
  • the memory 302 may optionally include memory set remotely relative to the processor 301; these remote memories may be connected to the light-emitting device through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • the input device 303 may receive inputted numeric or character information and generate key signal inputs related to user settings and function control of the light emitting device.
  • the output device 304 may include a display device such as a display screen.
  • the one or more modules are stored in the memory 302, and when executed by the one or more processors 301, execute the interaction method in any of the method embodiments described above.
  • the technical effect of the embodiment of the apparatus for executing the interaction method is the same as or similar to that of any of the foregoing method embodiments.
  • An embodiment of the present application provides a non-transitory computer storage medium storing computer-executable instructions, which can perform the interaction method in any of the foregoing method embodiments.
  • the technical effect of the embodiment of the non-transitory computer storage medium is the same as or similar to that of any of the foregoing method embodiments.
  • the program can be stored in a computer-readable storage medium; when executed, the program may include the processes of the method embodiments described above.
  • the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
  • the technical effect of the embodiment of the computer program is the same as or similar to that of any of the foregoing method embodiments.
  • the apparatuses and devices described in the present disclosure may be various electronic terminal devices, such as mobile phones, personal digital assistants (PDAs), tablet computers (PADs), and smart TVs, or large terminal devices such as servers; therefore the scope of protection of this disclosure should not be limited to any particular type of apparatus or device.
  • the client described in the present disclosure may be applied to any one of the foregoing electronic terminal devices in the form of electronic hardware, computer software, or a combination of the two.
  • the method according to the present disclosure may also be implemented as a computer program executed by a CPU, and the computer program may be stored in a computer-readable storage medium.
  • the computer program is executed by the CPU, the above-mentioned functions defined in the method of the present disclosure are performed.
  • the above method steps and system units may also be implemented using a controller and a computer-readable storage medium for storing a computer program that causes the controller to implement the above steps or unit functions.
  • non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory may include random access memory (RAM), which may act as external cache memory.
  • RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous-link DRAM (SLDRAM), and direct Rambus RAM (DRRAM).
  • the storage devices of the disclosed aspects are intended to include, without being limited to, these and other suitable types of memory.
  • the various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
  • a general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
  • a processor may also be implemented as a combination of computing devices, eg, a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
  • a software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, removable disk, CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium.
  • the storage medium may be integrated with the processor.
  • the processor and the storage medium may reside in an ASIC.
  • the ASIC can reside in a user terminal.
  • the processor and the storage medium may reside as discrete components in a user terminal.
  • the functions may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium.
  • Computer-readable media includes computer storage media and communication media including any medium that facilitates transfer of a computer program from one place to another.
  • a storage media may be any available media that can be accessed by a general purpose or special purpose computer.
  • the computer-readable medium may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer or processor. Also, any connection is properly termed a computer-readable medium.
  • disk and disc, as used here, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
  • the program may be stored in a computer-readable storage medium.
  • the storage medium mentioned may be a read-only memory, a magnetic disk or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

A light-emitting device, an interaction method therefor, an electronic device, and a storage medium. The method comprises: obtaining the activity level of one or more activity factors (101), the activity factors being selected from at least one of the following parameters: the current time, light intensity, weather conditions, the time interval since the last interaction, and the duration of use within a preset interaction period; determining an active state according to the activity levels (102); receiving an interaction instruction (103); determining the required lighting effect by combining the interaction instruction with the active state (104); and outputting a lighting control instruction according to the lighting effect to control the light-emitting device to emit light (105). The light-emitting device, interaction method, electronic device, and storage medium can, to some extent, solve the single-judgment-logic problem of prior-art light-emitting devices.

Description

Light-emitting device, interaction method therefor, electronic device, and storage medium — Technical Field
The present invention relates to the field of interaction technology, and in particular to a light-emitting device, an interaction method therefor, an electronic device, and a storage medium.
Background
In the prior art, there is a smart night light that detects a human body in a dim environment and lights up automatically, so the user does not need to control the light manually.
However, in implementing the present invention, the inventors found at least the following problems in the prior art:
Existing solutions implement their functions in a single-modal way; for example, the smart night light can only light up automatically upon detecting human behavior. The application scenario is therefore narrow and complex situations are hard to handle: fault tolerance is low, the false-trigger rate is high, and there is no linkage effect. Moreover, the simple judgment logic makes the device feel rigid and unsurprising; it is difficult for it to go beyond the feel of a mere electronic gadget or to convey an experience of sensing, understanding, and appropriately responding to the user.
Summary
In view of this, one object of the embodiments of the present invention is to provide a light-emitting device, an interaction method therefor, an electronic device, and a storage medium that can, to some extent, solve the single-judgment-logic problem of prior-art light-emitting devices.
Based on the above object, a first aspect of the embodiments of the present invention provides an interaction method for a light-emitting device, comprising:
obtaining the activity level of one or more activity factors, the activity factors being selected from at least one of: the current time, light intensity, weather conditions, the time interval since the last interaction, and the duration of use within a preset interaction period;
determining an active state according to the activity levels of the activity factors;
receiving an interaction instruction;
determining the required lighting effect by combining the interaction instruction with the active state;
outputting a lighting control instruction according to the lighting effect to control the light-emitting device to emit light.
Optionally, in the step of obtaining the activity levels, the obtained activity factors include at least the time interval since the last interaction and the duration of use within the preset interaction period.
Optionally, the obtained activity factors further include light intensity.
Optionally, the light-emitting device further comprises a sound-emitting device, and the interaction method further comprises:
determining the required sound effect by combining the interaction instruction with the active state;
outputting a sound control instruction according to the sound effect to control the sound-emitting device to produce sound.
Optionally, the light-emitting device further comprises an infrared sensing module, a brightness sensing module, and a sound pickup module;
receiving the interaction instruction comprises:
receiving data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module;
determining the interaction instruction according to the data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
Optionally, the method further comprises:
if the infrared sensing module, the brightness sensing module, and the sound pickup module collect no data, determining the lighting effect and the sound effect according to the active state.
Optionally, the method further comprises:
if the brightness data collected by the brightness sensing module is below a first brightness threshold and the infrared data collected by the infrared sensing module is within a preset range, determining that the interaction instruction is a voice wake-up instruction and starting voice interaction;
alternatively, if a wake word is detected in the voice data collected by the sound pickup module, determining that the interaction instruction is a voice wake-up instruction and starting voice interaction.
Optionally, after starting voice interaction, the method further comprises:
determining, by combining the voice wake-up instruction with the active state, that the required lighting effect is a wake-up light effect;
outputting a lighting control instruction according to the wake-up light effect to control the light-emitting device to emit light.
Optionally, the method further comprises:
determining whether the current time is within a sleep period;
if the current time is within the sleep period, the brightness data collected by the brightness sensing module is below a second brightness threshold, and a light-on instruction is received, setting the lighting effect to a night-light effect and outputting a lighting control instruction according to the night-light effect to control the light-emitting device to emit light.
A second aspect of the embodiments of the present invention provides a light-emitting device, comprising:
an interaction triggering unit configured to receive an interaction instruction;
a light-emitting unit configured to emit light according to a lighting control instruction;
a processing unit configured to:
obtain the activity levels of at least two activity factors selected from: the current time, light intensity, weather conditions, the time interval since the last interaction, and the duration of use within a preset interaction period;
determine an active state according to the activity levels of the at least two activity factors;
determine the required lighting effect by combining the interaction instruction with the active state;
output a lighting control instruction according to the lighting effect to control the light-emitting unit to emit light.
Optionally, the light-emitting device further comprises a sound-emitting unit configured to produce sound according to a sound control instruction;
the processing unit is configured to:
determine the required sound effect by combining the interaction instruction with the active state;
output a sound control instruction according to the sound effect to control the sound-emitting unit to produce sound.
Optionally, the interaction triggering unit comprises an infrared sensing module, a brightness sensing module, and a sound pickup module;
the processing unit is configured to:
receive data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module;
determine the interaction instruction according to the data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
A third aspect of the embodiments of the present invention provides an electronic device, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor; wherein
the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to perform any of the methods described above.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of any of the methods described above.
As can be seen from the above, the light-emitting device, interaction method, electronic device, and storage medium provided by the embodiments of the present invention determine an active state from the activity levels of the activity factors and then combine that state with the received interaction instruction to determine the corresponding lighting effect, so the judgment logic for the lighting effect is no longer single-track; compared with a device using single judgment logic, misoperation problems are less likely to occur.
Description of Drawings
FIG. 1 is a schematic flowchart of an embodiment of the interaction method for a light-emitting device provided by the present invention;
FIG. 2 is a schematic structural diagram of an embodiment of the light-emitting device provided by the present invention;
FIG. 3 is a schematic structural diagram of an embodiment of the electronic device provided by the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the invention is described in further detail below with reference to specific embodiments and the accompanying drawings.
It should be noted that all uses of "first" and "second" in the embodiments of the present invention serve to distinguish two non-identical entities or parameters sharing the same name; "first" and "second" are used only for convenience of expression and should not be understood as limiting the embodiments, which the subsequent embodiments will not restate one by one.
A first aspect of the embodiments of the present invention provides an interaction method for a light-emitting device, which can to some extent solve the single-judgment-logic problem of prior-art light-emitting devices. FIG. 1 is a schematic flowchart of an embodiment of the interaction method provided by the present invention.
The interaction method for the light-emitting device comprises:
Step 101: obtain the activity level of one or more activity factors, the activity factors being selected from at least one of: the current time, light intensity, weather conditions, the time interval since the last interaction, and the duration of use within a preset interaction period.
Step 102: determine an active state according to the activity levels of the activity factors.
Here, the active state depends on the activity levels of the selected activity factors; see the following tables for specifics.
As an embodiment of the present invention, when the current time is chosen as an activity factor, the activity levels of the different time periods (hr) are distributed as in Table 1:
Table 1: Relationship between current time and activity level
Time period 00-02 02-04 04-06 06-08 08-10 10-12 12-14 14-16 16-18 18-20 20-22 22-24
Activity level -10 -20 -10 0 +5 +5 +5 +10 +10 +10 +20 0
Here, according to the activity level of each time period, the active state within that period can be divided into six levels: extremely inactive (-20), very inactive (-10), inactive (0), active (+5), very active (+10), and extremely active (+20). For example, at 23:00 the corresponding activity level is 0, so the current active state is "inactive".
As an embodiment of the present invention, when the current light intensity is chosen as an activity factor, its contribution (%) to the activity level is distributed as in Table 2:
Table 2: Relationship between light intensity and activity level
Light intensity 0-10 10-30 30-50 50-70 70-90 90-100
Activity level +20 +10 0 +5 -10 -20
Here, according to the activity level of each light-intensity range, the active state within that range can be divided into six levels: extremely inactive (-20), very inactive (-10), inactive (0), active (+5), very active (+10), and extremely active (+20). For example, when the current light-intensity percentage is 5%, the corresponding activity level is +20, so the current active state is "extremely active".
It should be noted that the light-intensity values in Table 2 are percentages: the ratio of the current light intensity to a baseline intensity. The baseline may be the light intensity sampled at noon the previous day; at noon each day, the live reading replaces the baseline, so the daily percentage stays up to date and better matches actual conditions. Alternatively, the baseline may be a predetermined value, such as a factory default or a user-defined value.
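The baseline-relative reading described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class and attribute names, the default baseline value, and the clamping to 0-100% are all assumptions.

```python
from datetime import datetime

class LightIntensityTracker:
    """Illustrative sketch of a baseline-relative light reading.

    The baseline defaults to a fixed reference value and is replaced by
    the live reading once per day at noon, so the daily percentage tracks
    changing conditions. All names and values here are assumptions.
    """

    def __init__(self, baseline: float = 1000.0):
        self.baseline = baseline          # reference light intensity
        self._last_refresh_day = None     # date of the last noon refresh

    def percent(self, raw: float, now: datetime) -> float:
        # At 12:00 each day, adopt the live reading as the new baseline.
        if now.hour == 12 and self._last_refresh_day != now.date():
            self.baseline = raw
            self._last_refresh_day = now.date()
        # Express the reading as a percentage of the baseline, clamped.
        return max(0.0, min(100.0, raw / self.baseline * 100.0))
```

A reading of one-twentieth of the baseline thus yields 5%, which Table 2 maps to an activity contribution of +20.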
As an embodiment of the present invention, when the current or same-day weather conditions are chosen as an activity factor, their contribution to the activity level is distributed as in Table 3:
Table 3: Relationship between weather conditions and activity level
(Table 3 is reproduced as image PCTCN2018113439-appb-000001 in the original filing.)
Here, according to the activity level of each weather condition, the active state can be divided into five levels: extremely inactive (-20), very inactive (-15), inactive (-10), normal (0), and very active (+10). For example, when the current or same-day weather is a rainstorm, the corresponding activity level is -15, so the current active state is "very inactive".
As an embodiment of the present invention, when the time interval since the last interaction is chosen as an activity factor, the contributions of the different intervals are distributed as in Table 4:
Table 4: Relationship between interaction interval and activity level
Interaction interval >48h 48h-24h 24h-12h 12h-4h 4h-30min 30-15min <15min
Activity level +30 +20 +10 0 -10 -15 -20
Here, according to the activity level of each interval, the active state can be divided into seven levels: extremely inactive (-20), very inactive (-15), inactive (-10), normal (0), active (+10), very active (+20), and extremely active (+30). For example, when more than 48 hours have passed since the last interaction, the corresponding activity level is +30, so the current active state is "extremely active".
As an embodiment of the present invention, when the duration of use within a preset interaction period (for example, 24 or 48 hours) is chosen as an activity factor, the contributions of the different usage durations are distributed as in Table 5:
Table 5: Relationship between usage duration and activity level
Accumulated usage 0 0-1min 1min-5min 5min-30min 30min-2hr >2hr
Activity level +20 +15 +10 0 -10 -20
Here, according to the activity level of each usage duration, the active state can be divided into six levels: extremely inactive (-20), inactive (-10), normal (0), active (+10), very active (+15), and extremely active (+20). For example, when the usage duration within the preset interaction period is 31 minutes, the corresponding activity level is -10, so the current active state is "inactive".
It should be noted that where two ranges in the above tables share an endpoint, a "belongs-left" or "belongs-right" rule can disambiguate them. For example, the usage ranges 0-1min and 1min-5min overlap at 1 minute: either 0-1min excludes the 1-minute endpoint to distinguish it from 1min-5min, or 1min-5min excludes the endpoint to distinguish it from 0-1min. The ranges in the other tables can be disambiguated by similar rules, which are not restated here.
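The endpoint rule above amounts to making the intervals half-open, which in code can be expressed with `bisect`: `bisect_left` assigns a shared endpoint to the left interval, `bisect_right` to the right one. The sketch below applies the "belongs-left" variant to Table 5; treating exactly-zero usage as its own bucket, and all names, are illustrative assumptions.

```python
import bisect

# Interior boundaries of the Table 5 buckets, in minutes:
# 0 | 0-1min | 1min-5min | 5min-30min | 30min-2hr | >2hr
BOUNDS = [0, 1, 5, 30, 120]
SCORES = [20, 15, 10, 0, -10, -20]

def usage_score(minutes: float) -> int:
    """Activity contribution of accumulated usage (Table 5)."""
    if minutes == 0:
        return SCORES[0]          # never used in this period: +20
    # bisect_left makes a shared endpoint belong to the left bucket,
    # e.g. exactly 1 minute falls in 0-1min rather than 1min-5min.
    return SCORES[bisect.bisect_left(BOUNDS, minutes)]
```

Swapping in `bisect.bisect_right` would implement the "belongs-right" reading instead.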
Preferably, when selecting activity factors, two or more of the five parameters — the current time, light intensity, weather conditions, the time interval since the last interaction, and the duration of use within the preset interaction period — may be chosen to determine the active state.
For example, when the current time and light intensity are both selected, the activity levels from Tables 1 and 2 are added and the sum is mapped to an active state. For instance, at 21:00 the activity level is +20, and at a light intensity of 5 the activity level is also +20; their sum of +40 can then be defined as "extremely active". Other sums can be mapped analogously using the state divisions described above, which are not restated here.
Further, in the step of obtaining the activity levels, the obtained activity factors include at least the time interval since the last interaction and the usage duration within the preset interaction period, which together reflect the active state well. Preferably, light intensity is also obtained, further improving accuracy.
Preferably, as one embodiment of the present invention, all five parameters — the current time, light intensity, weather conditions, the time interval since the last interaction, and the usage duration within the preset interaction period — are used as activity factors to determine the active state.
Specifically, the total activity level A is computed as:
A = 50 + T + L + W + I + P
where the initial activity level is 50, and T, L, W, I, and P are the activity levels contributed by the current time, light intensity, weather conditions, the time interval since the last interaction, and the usage duration within the preset interaction period, respectively, as listed in Tables 1 through 5 and not restated here.
After the total activity level A is computed, the active state can be determined from Table 6 below.
Table 6: Relationship between total activity level and active state
Active state: extremely inactive / very inactive / inactive / normal / active / very active / extremely active
Total activity range: -∞~10 / 10~20 / 20~40 / 40~60 / 60~80 / 80~90 / 90~+∞
Optionally, the active state in the interaction method may be refreshed on a timer, for example once per hour; a refresh may also be triggered when a parameter changes, for example updating the rainstorm contribution when a storm suddenly begins. The concrete refresh scheme can be adjusted as needed and is not restated here.
In summary, under the embodiments of the active-state computation rules above, the light-emitting device has an active state at every moment.
Step 103: receive an interaction instruction. Here, an interaction instruction is an instruction generated for interacting with the light-emitting device, chiefly used to make the device emit light with a particular lighting effect.
Optionally, the light-emitting device further comprises an infrared sensing module, a brightness sensing module, and a sound pickup module; preferably, the infrared sensing module is a pyroelectric infrared sensor.
Receiving the interaction instruction comprises:
receiving data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module;
determining the interaction instruction according to the data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
Different interaction instructions can be derived from the data collected by the different modules, or from any combination of such data.
For example, if the brightness data collected by the brightness sensing module is below a first brightness threshold and the infrared data collected by the infrared sensing module is within a first preset range, the interaction instruction is determined to be a voice wake-up instruction. This is only an example; concrete combinations can be set according to the actual situation.
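The mapping from raw sensor data to an interaction instruction can be sketched as below. The threshold values, the infrared range, and the instruction labels are placeholders; the patent leaves their concrete values to the implementation.

```python
def classify_instruction(brightness: float, ir_reading: float,
                         heard_wake_word: bool,
                         first_threshold: float = 10.0,
                         ir_range: tuple = (0.5, 5.0)) -> str:
    """Illustrative sensor-to-instruction mapping (all values assumed).

    - A detected wake word is a voice wake-up instruction outright.
    - Dim light plus an in-range infrared reading stands in for the
      wake word, as the embodiment describes.
    - Bright light plus a person yields a person-passing instruction.
    """
    person_detected = ir_range[0] <= ir_reading <= ir_range[1]
    if heard_wake_word:
        return "voice_wake"
    if brightness < first_threshold and person_detected:
        return "voice_wake"            # dim room + person replaces wake word
    if person_detected:
        return "person_bright"         # bright room + person passing by
    return "standby"                   # no module collected usable data
```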
Step 104: determine the required lighting effect by combining the interaction instruction with the active state.
As an embodiment of the present invention, the interaction method further comprises:
if the infrared sensing module, the brightness sensing module, and the sound pickup module collect no data, treating this as an idle standby state (which is also a kind of interaction instruction) and determining the lighting effect according to the active state.
Depending on the active state, the device uses different lighting effects as its silent standby display when idle, as shown in Table 7.
Table 7: Distribution of lighting effects in the standby state
Category: extremely inactive / very inactive / inactive / normal / active / very active / extremely active
Lighting effects 1 through 9 are each assigned to a subset of these states. (The per-state assignment marks appear only in the original table; broadly, low-numbered effects cover the more active states and high-numbered effects the less active ones.)
As Table 7 shows, lighting effects 1-9 are distinct effects expressing the device's activity: effects 1-3, used when extremely active, appear warm and lively, while effects 7-9, used when extremely inactive, appear quiet and subdued.
As an embodiment of the present invention, the interaction method further comprises:
if the brightness data collected by the brightness sensing module is below the first brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, determining that the interaction instruction is a voice wake-up instruction and starting voice interaction. Here, the first brightness threshold may be set as needed; a suitable criterion is a brightness that would normally be considered fairly dim. The preset range may be the range of infrared readings normally detected when someone walks by; its value can be chosen according to the actual situation and is not limited here. It can be seen that in this case the combination of dim light and an in-range infrared reading takes the place of a wake word, so the voice interaction function need not be woken by a wake word before voice interaction — saving an operation step and making the device easier to use.
As an alternative, if a wake word is detected in the voice data collected by the sound pickup module, the interaction instruction is determined to be a voice wake-up instruction and voice interaction is started. Optionally, the wake word may be a default or user-defined; its content is not specifically limited here.
Once voice interaction is started, the user can control the device by voice to emit light with a given effect, or interact with the device by voice in other ways.
Further, after starting voice interaction, the interaction method may further comprise the following steps:
determining, by combining the voice wake-up instruction with the active state, that the required lighting effect is a wake-up light effect;
outputting a lighting control instruction according to the wake-up light effect to control the light-emitting device to emit light.
In this way, upon receiving a voice wake-up instruction the device, in combination with the active state, emits light with a distinctive wake-up effect, so the user knows from that special effect that the device is currently ready for voice interaction. This both reminds the user that voice interaction is active and available at any time, and, in the case of accidental activation, prompts the user to turn the function off promptly to avoid further misoperation.
As an embodiment of the present invention, the interaction method further comprises:
determining whether the current time is within a sleep period (for example, 23:00 to 05:00);
if the current time is within the sleep period, the brightness data collected by the brightness sensing module is below a second brightness threshold, and a light-on instruction is received, setting the lighting effect to a night-light effect and outputting a lighting control instruction according to the night-light effect to control the light-emitting device to emit light.
Here, the second brightness threshold may be set as needed; a suitable criterion is the light level normally present while people are asleep. The second brightness threshold may in some cases equal the first brightness threshold, or it may differ, depending on circumstances. The night-light effect is overall warm-yellow and dim; its exact color temperature and brightness can be chosen according to actual needs and are not limited here.
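The night-light rule above can be sketched as a single predicate. The sleep window, the second brightness threshold, and the function names are illustrative assumptions; note the window wraps past midnight.

```python
from datetime import time

def night_light_needed(now: time, brightness: float, light_on_command: bool,
                       sleep_start: time = time(23, 0),
                       sleep_end: time = time(5, 0),
                       second_threshold: float = 5.0) -> bool:
    """True when the night-light effect should be emitted (sketch).

    All three conditions from the embodiment must hold: the current time
    is inside the sleep window, the room is darker than the second
    brightness threshold, and a light-on instruction was received.
    """
    # A window like 23:00-05:00 wraps past midnight, so it is the union
    # of [sleep_start, 24:00) and [00:00, sleep_end).
    in_sleep_period = now >= sleep_start or now < sleep_end
    return in_sleep_period and brightness < second_threshold and light_on_command
```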
Depending on the active state, the device shows different lighting effects (including plain illumination and color-changing light) when interacting with people in different situations, as shown in Table 8.
Table 8: Relationship among interaction instruction, active state, and lighting effect
(Table 8 is reproduced as image PCTCN2018113439-appb-000002 in the original filing.)
As Table 8 shows, the interaction instructions include at least the following:
1) If the brightness data collected by the brightness sensing module is above a third brightness threshold and the infrared data collected by the infrared sensing module is within the preset range, the current light is bright and the infrared signal indicates someone passing by; one of lighting effects 10-16 is then determined according to the corresponding active state. Here, the third brightness threshold may be set as needed; a suitable criterion is a brightness that would normally be considered fairly bright. In some cases it may equal the first brightness threshold, depending on circumstances, and it is not specifically limited here.
2) If the brightness data is below the first brightness threshold and the infrared data is within the preset range, the current light is dim and someone is passing by; the interaction instruction is determined to be a voice wake-up instruction, voice interaction is started, and one of wake-up light colors c1-c7 is determined according to the active state.
3) If a wake word is detected in the voice data collected by the sound pickup module, the user is waking the voice interaction function by voice; the interaction instruction is determined to be a voice wake-up instruction, voice interaction is started, and one of wake-up light colors c1-c7 is determined according to the active state.
4) If three successive loud sounds within a certain time interval are detected in the voice data, the user has clapped three times; one of lighting effects 17-23 is then determined according to the corresponding active state.
5) If oscillation is detected in the brightness data (such as a sudden drop or repeated swings), the user is waving over the top of the device; one of lighting effects 24-30 is then determined according to the corresponding active state.
6) If oscillation is detected in the voice data (such as a sudden change in frequency or amplitude), the user is blowing on the top of the device; one of lighting effects 31-37 is then determined according to the corresponding active state.
7) If the current time is within the sleep period, the brightness data is below the second brightness threshold, and a light-on instruction is received, the night light should now be turned on; the lighting effect is the night-light effect, and a lighting control instruction is output accordingly to control the light-emitting device to emit light. Optionally, the light-on instruction may be issued by voice, a physical switch button, or phone control.
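The triple-clap trigger in item 4) above can be sketched as a sliding check over timestamps of loud-sound events. The two-second window is an assumption; the patent only says "within a certain time interval".

```python
def detect_triple_clap(event_times: list, window: float = 2.0) -> bool:
    """Return True if any three successive loud-sound events fall within
    `window` seconds — the "user clapped three times" trigger (sketch).

    `event_times` are ascending timestamps, in seconds, of detected
    loud sounds; the window length is an assumed parameter.
    """
    # Pair each event with the one two positions later: if they are
    # close enough, three claps occurred inside the window.
    for first, third in zip(event_times, event_times[2:]):
        if third - first <= window:
            return True
    return False
```

Analogous detectors for items 5) and 6) would watch for oscillation in the brightness and audio streams rather than discrete events.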
Step 105: output a lighting control instruction according to the lighting effect to control the light-emitting device to emit light. Here, through the correspondence among interaction instructions, active states, and lighting effects described above, the appropriate lighting control instruction is obtained to control the device; the instruction includes light parameters such as intensity and color temperature, which differ for different lighting effects.
As an embodiment of the present invention, the light-emitting device further comprises a sound-emitting device; optionally, the sound-emitting device is a speaker and the sound pickup module is a microphone pickup module. The interaction method further comprises:
determining the required sound effect by combining the interaction instruction with the active state;
outputting a sound control instruction according to the sound effect to control the sound-emitting device to produce sound.
Specifically, the sound effect can be combined with the aforementioned lighting effect to provide richer interaction.
For example, different sound-and-light effects are used as the silent standby display when idle, as shown in Table 9.
Table 9: Distribution of sound-and-light effects in the standby state
Category: extremely inactive / very inactive / inactive / normal / active / very active / extremely active
Sound-and-light effects 1 through 9 are each assigned to a subset of these states. (The per-state assignment marks appear only in the original table; broadly, low-numbered effects cover the more active states and high-numbered effects the less active ones.)
As Table 9 shows, sound-and-light effects 1-9 are distinct combinations of sound plus light expressing the device's activity: effects 1-3, used when extremely active, sound and look warm and lively, while effects 7-9, used when extremely inactive, sound and look quiet and subdued.
As another example, depending on the active state, the device presents different sound-and-light effects when interacting with people in different situations, as shown in Table 10.
Table 10: Relationship among interaction instruction, active state, and sound-and-light effect
(Table 10 is reproduced as images PCTCN2018113439-appb-000003 and PCTCN2018113439-appb-000004 in the original filing.)
As Table 10 shows, the interaction instructions include at least the following:
1) If the brightness data is above the third brightness threshold and the infrared data is within the preset range, the current light is bright and the infrared signal indicates someone passing by; one of sound-and-light effects 10-16 is then determined according to the active state. The third brightness threshold may be set as needed, with a suitable criterion being a brightness normally considered fairly bright; in some cases it may equal the first brightness threshold, depending on circumstances, and it is not specifically limited here.
2) If the brightness data is below the first brightness threshold and the infrared data is within the preset range, the current light is dim and someone is passing by; the interaction instruction is determined to be a voice wake-up instruction, voice interaction is started, one of wake-up light colors c1-c7 is determined according to the active state, and a matching sound effect may also accompany the wake-up light color.
3) If a wake word is detected in the voice data, the user is waking the voice interaction function by voice; the interaction instruction is determined to be a voice wake-up instruction, voice interaction is started, one of wake-up light colors c1-c7 is determined according to the active state, and a matching sound effect may also accompany the wake-up light color.
4) If three successive loud sounds within a certain time interval are detected in the voice data, the user has clapped three times; one of sound-and-light effects 17-23 is then determined according to the active state.
5) If oscillation is detected in the brightness data (such as a sudden drop or repeated swings), the user is waving over the top of the device; one of sound-and-light effects 24-30 is then determined according to the active state.
6) If oscillation is detected in the voice data (such as a sudden change in frequency or amplitude), the user is blowing on the top of the device; one of sound-and-light effects 31-37 is then determined according to the active state.
7) If the current time is within the sleep period, the brightness data is below the second brightness threshold, and a light-on instruction is received, the night light should now be turned on; the lighting effect is the night-light effect, and a lighting control instruction is output accordingly. Optionally, the light-on instruction may be issued by voice, a physical switch button, or phone control; optionally, a matching sound effect may also accompany the night light to enrich the interaction.
It can be seen that, given different active states, the device gives different sound-and-light feedback when the user interacts with it: the same operation receives different feedback at different times, under different light and weather conditions, and with different interaction intervals and accumulated usage — truly understanding the user and responding expressively, so the device feels surprising rather than rigid.
As the above embodiments show, the interaction method provided by the embodiments of the present invention determines an active state from the activity levels of the activity factors and then combines that state with the received interaction instruction to determine the corresponding lighting effect, so the judgment logic for the lighting effect is no longer single-track; compared with a device using single judgment logic, misoperation problems are less likely to occur.
Further, when two or more of the current time, light intensity, weather conditions, the time interval since the last interaction, and the usage duration within the preset interaction period are selected as activity factors, the active state can be determined more accurately, improving interaction with the user. The active state directly affects all of the device's expressive behavior; the activity-judgment mechanism provided by the embodiments gives the device its own way of judging activity, so it responds differently to different interaction triggers rather than behaving identically every time — escaping the rigid feel of an electronic gadget, enriching its expression and appeal, and exceeding user expectations.
Furthermore, when the interaction instruction is derived by analyzing data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module, a multi-modal interactive light-emitting device is formed: a smart device that recognizes and understands human behavior and interacts accordingly. Under different activity levels, each trigger — voice wake-up, infrared detection of a person in bright light, infrared detection of a person in dim light, three claps, waving over the top of the device, blowing on the top of the device, or turning on the light during the sleep period in a dim environment — has its own sound-and-light feedback and wake-up judgment mechanism, with a one-to-one logical decision made for each. The overall multi-modal judgment framework can handle complex situations; it is no longer one-to-one judgment but a comprehensive judgment over ambient light, the current time, and user behavior that gives the most appropriate feedback for the current situation, improving the device's fault tolerance, keeping the false-trigger rate very low, and yielding good linkage effects.
The interaction method provided by the embodiments of the present invention comprehensively analyzes every input (the current time, current light conditions, whether the infrared sensor currently detects a person, current weather conditions, the time of the device's last interaction with the user, the accumulated usage over roughly the last 24 hours, the current user's voice commands, and the current user's actions) to determine an appropriate interactive feedback expressed as combined sound and light effects, meeting the interaction needs of a variety of scenarios.
A second aspect of the embodiments of the present invention provides a light-emitting device that can, to some extent, solve the single-judgment-logic problem of prior-art light-emitting devices. FIG. 2 is a schematic structural diagram of an embodiment of the light-emitting device provided by the present invention.
The light-emitting device comprises:
an interaction triggering unit 201 configured to receive an interaction instruction;
a light-emitting unit 203 configured to emit light according to a lighting control instruction;
a processing unit 202 configured to:
obtain the activity levels of at least two activity factors selected from: the current time, light intensity, weather conditions, the time interval since the last interaction, and the duration of use within a preset interaction period;
determine an active state according to the activity levels of the at least two activity factors;
determine the required lighting effect by combining the interaction instruction with the active state;
output a lighting control instruction according to the lighting effect to control the light-emitting unit 203 to emit light.
As the above embodiment shows, the light-emitting device provided by the embodiments of the present invention determines an active state from the activity levels of the activity factors and then combines that state with the received interaction instruction to determine the corresponding lighting effect, so the judgment logic for the lighting effect is no longer single-track; compared with a device using single judgment logic, misoperation problems are less likely to occur.
As an embodiment of the present invention, the light-emitting device further comprises a sound-emitting unit 304 configured to produce sound according to a sound control instruction;
the processing unit 202 is configured to:
determine the required sound effect by combining the interaction instruction with the active state;
output a sound control instruction according to the sound effect to control the sound-emitting unit 304 to produce sound, so that the sound effect can be combined with the aforementioned lighting effect to provide richer interaction.
As an embodiment of the present invention, the interaction triggering unit 201 comprises an infrared sensing module, a brightness sensing module, and a sound pickup module;
the processing unit 202 is configured to:
receive data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module;
determine the interaction instruction according to the data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
When the interaction instruction is derived by analyzing data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module, a multi-modal interactive light-emitting device is formed: a smart device that recognizes and understands human behavior and interacts accordingly. Under different activity levels, each trigger — voice wake-up, infrared detection of a person in bright light, infrared detection of a person in dim light, three claps, waving over the top of the device, blowing on the top of the device, or turning on the light during the sleep period in a dim environment — has its own sound-and-light feedback and wake-up judgment mechanism, with a one-to-one logical decision made for each. The overall multi-modal judgment framework can handle complex situations; it is no longer one-to-one judgment but a comprehensive judgment over ambient light, the current time, and user behavior that gives the most appropriate feedback for the current situation, improving the device's fault tolerance, keeping the false-trigger rate very low, and yielding good linkage effects.
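The processing unit's final step — selecting an effect from an (instruction, active state) pair — can be sketched as a small dispatch table. The effect-number ranges follow the description (effects 10-16, 17-23, 24-30, 31-37, colors c1-c7), but the exact per-state assignment in Tables 8 and 10 is an image in the original filing, so the one-effect-per-state mapping below is an assumption, as are all the labels.

```python
STATES = ["extremely inactive", "very inactive", "inactive",
          "normal", "active", "very active", "extremely active"]

# First effect number of each trigger's range, per the description.
EFFECT_BASE = {
    "person_bright": 10,   # bright room, person detected: effects 10-16
    "clap_three":    17,   # three claps: effects 17-23
    "wave_on_top":   24,   # wave over the top: effects 24-30
    "blow_on_top":   31,   # blow on the top: effects 31-37
}

def choose_effect(instruction: str, state: str) -> str:
    """Illustrative (instruction, active state) -> effect dispatch."""
    if instruction == "night_light_on":
        return "night-light effect"            # fixed, state-independent
    if instruction == "voice_wake":
        return f"wake color c{STATES.index(state) + 1}"   # colors c1-c7
    return f"lighting effect {EFFECT_BASE[instruction] + STATES.index(state)}"
```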
Based on the above object, a third aspect of the embodiments of the present invention provides an embodiment of an apparatus for performing the interaction method. FIG. 3 is a schematic diagram of the hardware structure of an embodiment of the apparatus for performing the interaction method provided by the present invention.
As shown in FIG. 3, the apparatus comprises:
one or more processors 301 and a memory 302; FIG. 3 takes one processor 301 as an example.
The apparatus for performing the interaction method may further comprise an input device 303 and an output device 304.
The processor 301, the memory 302, the input device 303, and the output device 304 may be connected by a bus or otherwise; FIG. 3 takes a bus connection as an example.
The memory 302, as a non-volatile computer-readable storage medium, can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the interaction method in the embodiments of the present application (for example, the interaction triggering unit 201, the light-emitting unit 203, and the processing unit 202 shown in FIG. 2). The processor 301 executes the various functional applications and data processing of the server by running the non-volatile software programs, instructions, and modules stored in the memory 302, thereby implementing the interaction method of the foregoing method embodiments.
The memory 302 may include a program storage area and a data storage area, where the program storage area may store an operating system and the application programs required for at least one function, and the data storage area may store data created through use of the light-emitting device and the like. In addition, the memory 302 may include high-speed random access memory and may further include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some embodiments, the memory 302 may optionally include memory set remotely relative to the processor 301; these remote memories may be connected to the light-emitting device via a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 303 may receive inputted numeric or character information and generate key-signal inputs related to the user settings and function control of the light-emitting device. The output device 304 may include a display device such as a display screen.
The one or more modules are stored in the memory 302 and, when executed by the one or more processors 301, perform the interaction method of any of the method embodiments described above. The technical effect of the embodiment of the apparatus for performing the interaction method is the same as or similar to that of any of the foregoing method embodiments.
An embodiment of the present application provides a non-transitory computer storage medium storing computer-executable instructions, which can perform the interaction method in any of the foregoing method embodiments. The technical effect of the embodiment of the non-transitory computer storage medium is the same as or similar to that of any of the foregoing method embodiments.
Finally, it should be noted that those of ordinary skill in the art will understand that all or part of the processes of the above method embodiments can be implemented by a computer program instructing related hardware; the program can be stored in a computer-readable storage medium, and when executed it may include the processes of the method embodiments described above. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM). The technical effect of the embodiment of the computer program is the same as or similar to that of any of the foregoing method embodiments.
In addition, typically, the apparatuses and devices described in the present disclosure may be various electronic terminal devices, such as mobile phones, personal digital assistants (PDAs), tablet computers (PADs), and smart TVs, or large terminal devices such as servers; therefore the scope of protection of the present disclosure should not be limited to any particular type of apparatus or device. The client described in the present disclosure may be applied to any of the above electronic terminal devices in the form of electronic hardware, computer software, or a combination of the two.
In addition, the method according to the present disclosure may also be implemented as a computer program executed by a CPU, and the computer program may be stored in a computer-readable storage medium. When the computer program is executed by the CPU, the functions defined in the method of the present disclosure are performed.
In addition, the above method steps and system units may also be implemented using a controller and a computer-readable storage medium storing a computer program that causes the controller to implement the above steps or unit functions.
In addition, it should be understood that the computer-readable storage media (for example, memory) described herein may be volatile memory, non-volatile memory, or both. By way of example and not limitation, non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM), which may act as external cache memory. By way of example and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous-link DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The storage devices of the disclosed aspects are intended to include, without being limited to, these and other suitable types of memory.
Those skilled in the art will also understand that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the disclosure herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the functions of various illustrative components, blocks, modules, circuits, and steps have been described in general terms. Whether such functions are implemented as software or hardware depends on the specific application and the design constraints imposed on the overall system. Those skilled in the art may implement the described functions in various ways for each specific application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein may be implemented or performed with the following components designed to perform the functions described herein: a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination of these components. A general-purpose processor may be a microprocessor, but in the alternative the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, for example a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of the methods or algorithms described in connection with the disclosure herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integrated with the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary designs, the functions may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on, or transmitted over, a computer-readable medium as one or more instructions or code. Computer-readable media include computer storage media and communication media, the latter including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer. By way of example and not limitation, the computer-readable medium may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer or processor. Also, any connection is properly termed a computer-readable medium. For example, if software is transmitted from a website, server, or other remote source using coaxial cable, fiber-optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then that coaxial cable, fiber-optic cable, twisted pair, DSL, or wireless technology is included in the definition of medium. Disk and disc, as used here, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Exemplary embodiments have been disclosed, but it should be noted that various changes and modifications may be made without departing from the scope of the present disclosure as defined by the claims. The functions, steps, and/or actions of the method claims according to the disclosed embodiments need not be performed in any particular order. Furthermore, although elements of the present disclosure may be described or claimed in the singular, the plural is also contemplated unless limitation to the singular is explicitly stated.
It should be understood that, as used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
The serial numbers of the above embodiments of the present disclosure are for description only and do not represent the relative merits of the embodiments.
Those of ordinary skill in the art will understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing related hardware; the program may be stored in a computer-readable storage medium, which may be a read-only memory, a magnetic disk, an optical disk, or the like.
Those of ordinary skill in the art should understand that the discussion of any of the above embodiments is merely exemplary and is not intended to imply that the scope of the present disclosure (including the claims) is limited to these examples. Within the spirit of the embodiments of the present invention, the technical features of the above embodiments or of different embodiments may also be combined, and many other variations of the different aspects of the embodiments exist as described above, which are not provided in detail for the sake of brevity. Therefore, any omission, modification, equivalent substitution, or improvement made within the spirit and principles of the embodiments of the present invention shall be included within the scope of protection of the embodiments of the present invention.

Claims (14)

  1. An interaction method for a light-emitting device, characterized by comprising:
    obtaining the activity level of one or more activity factors, the activity factors being selected from at least one of the following parameters: the current time, light intensity, weather conditions, the time interval since the last interaction, and the duration of use within a preset interaction period;
    determining an active state according to the activity levels of the activity factors;
    receiving an interaction instruction;
    determining the required lighting effect by combining the interaction instruction with the active state;
    outputting a lighting control instruction according to the lighting effect to control the light-emitting device to emit light.
  2. The method according to claim 1, characterized in that, in the step of obtaining the activity levels, the obtained activity factors include at least the time interval since the last interaction and the duration of use within the preset interaction period.
  3. The method according to claim 2, characterized in that the obtained activity factors further include light intensity.
  4. The method according to claim 1, characterized in that the light-emitting device further comprises a sound-emitting device, and the interaction method further comprises:
    determining the required sound effect by combining the interaction instruction with the active state;
    outputting a sound control instruction according to the sound effect to control the sound-emitting device to produce sound.
  5. The method according to claim 4, characterized in that the light-emitting device further comprises an infrared sensing module, a brightness sensing module, and a sound pickup module;
    receiving the interaction instruction comprises:
    receiving data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module;
    determining the interaction instruction according to the data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
  6. The method according to claim 5, characterized by further comprising:
    if the infrared sensing module, the brightness sensing module, and the sound pickup module collect no data, determining the lighting effect and the sound effect according to the active state.
  7. The method according to claim 5, characterized by further comprising:
    if the brightness data collected by the brightness sensing module is below a first brightness threshold and the infrared data collected by the infrared sensing module is within a preset range, determining that the interaction instruction is a voice wake-up instruction and starting voice interaction;
    alternatively, if a wake word is detected in the voice data collected by the sound pickup module, determining that the interaction instruction is a voice wake-up instruction and starting voice interaction.
  8. The method according to claim 7, characterized in that, after starting voice interaction, the method further comprises:
    determining, by combining the voice wake-up instruction with the active state, that the required lighting effect is a wake-up light effect;
    outputting a lighting control instruction according to the wake-up light effect to control the light-emitting device to emit light.
  9. The method according to claim 5, characterized by further comprising:
    determining whether the current time is within a sleep period;
    if the current time is within the sleep period, the brightness data collected by the brightness sensing module is below a second brightness threshold, and a light-on instruction is received, setting the lighting effect to a night-light effect and outputting a lighting control instruction according to the night-light effect to control the light-emitting device to emit light.
  10. A light-emitting device, characterized by comprising:
    an interaction triggering unit configured to receive an interaction instruction;
    a light-emitting unit configured to emit light according to a lighting control instruction;
    a processing unit configured to:
    obtain the activity levels of at least two activity factors selected from: the current time, light intensity, weather conditions, the time interval since the last interaction, and the duration of use within a preset interaction period;
    determine an active state according to the activity levels of the at least two activity factors;
    determine the required lighting effect by combining the interaction instruction with the active state;
    output a lighting control instruction according to the lighting effect to control the light-emitting unit to emit light.
  11. The device according to claim 10, characterized in that the light-emitting device further comprises a sound-emitting unit configured to produce sound according to a sound control instruction;
    the processing unit is configured to:
    determine the required sound effect by combining the interaction instruction with the active state;
    output a sound control instruction according to the sound effect to control the sound-emitting unit to produce sound.
  12. The device according to claim 11, characterized in that the interaction triggering unit comprises an infrared sensing module, a brightness sensing module, and a sound pickup module;
    the processing unit is configured to:
    receive data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module;
    determine the interaction instruction according to the data collected by the infrared sensing module, the brightness sensing module, and/or the sound pickup module.
  13. An electronic device, comprising:
    at least one processor; and
    a memory communicatively connected to the at least one processor; wherein
    the memory stores instructions executable by the at least one processor, the instructions being executed by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-9.
  14. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the method of any one of claims 1-9.
PCT/CN2018/113439 2018-06-06 2018-11-01 发光装置及其交互方法、电子设备、存储介质 WO2019233029A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201810575523.7 2018-06-06
CN201810575523.7A CN109116978B (zh) 2018-06-06 2018-06-06 发光装置及其交互方法、电子设备、存储介质

Publications (1)

Publication Number Publication Date
WO2019233029A1 true WO2019233029A1 (zh) 2019-12-12

Family

ID=64821796

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/113439 WO2019233029A1 (zh) 2018-06-06 2018-11-01 发光装置及其交互方法、电子设备、存储介质

Country Status (2)

Country Link
CN (1) CN109116978B (zh)
WO (1) WO2019233029A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110291995A (zh) * 2019-05-24 2019-10-01 丁韩 应用于宠物用品的灯光控制方法及装置
CN113709953A (zh) * 2021-09-03 2021-11-26 上海蔚洲电子科技有限公司 一种led灯光互动控制系统、方法及互动显示系统

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010238572A (ja) * 2009-03-31 2010-10-21 Yashima Dengyo Co Ltd Led照明制御システム
CN103889091A (zh) * 2012-12-19 2014-06-25 海尔集团公司 路灯控制方法及控制装置
CN105792485A (zh) * 2016-05-17 2016-07-20 南宁市茂宏信息技术有限公司 一种智能照明控制开关
CN106793306A (zh) * 2016-12-28 2017-05-31 郑州北斗七星通讯科技有限公司 一种智能室内小夜灯
CN107484308A (zh) * 2017-07-31 2017-12-15 北京小米移动软件有限公司 照明设备的控制方法、装置及存储介质
WO2018093803A1 (en) * 2016-11-17 2018-05-24 Echelon Corporation System and method for optimizing lighting in response to online weather data

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102346436B (zh) * 2010-08-05 2013-04-03 深圳市超维实业有限公司 一种自然唤醒装置和方法
CN105848374A (zh) * 2015-01-12 2016-08-10 芋头科技(杭州)有限公司 一种灯光控制系统及方法
CN105511623B (zh) * 2015-12-15 2018-11-20 深圳先进技术研究院 交互方法及装置
CN105444007A (zh) * 2016-01-08 2016-03-30 福州智能小白电子科技有限公司 一种智能led灯具
CN105764188A (zh) * 2016-03-23 2016-07-13 北京百度网讯科技有限公司 照明控制器、照明控制系统和方法
CN105873321B (zh) * 2016-05-09 2018-08-31 国网山东省电力公司巨野县供电公司 一种使用智能开关控制系统的照明系统控制方法
CN106878118A (zh) * 2017-01-03 2017-06-20 美的集团股份有限公司 一种智能家电语音控制方法及系统
CN106912150B (zh) * 2017-03-17 2019-06-21 青岛亿联客信息技术有限公司 自动依据用户的使用习惯照明的方法、系统
CN107277989B (zh) * 2017-06-16 2019-08-13 深圳市盛路物联通讯技术有限公司 智能家居照明控制方法及装置
CN107295193B (zh) * 2017-07-14 2020-06-02 Oppo广东移动通信有限公司 响铃控制方法、装置、存储介质及电子设备
CN108093526A (zh) * 2017-12-28 2018-05-29 美的智慧家居科技有限公司 Led灯的控制方法、装置和可读存储介质

Also Published As

Publication number Publication date
CN109116978B (zh) 2021-03-23
CN109116978A (zh) 2019-01-01

Similar Documents

Publication Publication Date Title
US11269393B2 (en) Techniques for adjusting computing device sleep states
US10381010B2 (en) Voice control user interface during low power mode
US20170289766A1 (en) Digital Assistant Experience based on Presence Detection
US11119723B2 (en) User-adaptive volume selection
US20200034115A1 (en) Systems and methods for communicating notifications and textual data associated with applications
US9094539B1 (en) Dynamic device adjustments based on determined user sleep state
US20180082684A1 (en) Voice Control User Interface with Progressive Command Engagement
US20170213553A1 (en) Voice Control User Interface with Progressive Command Engagement
US11423899B2 (en) Controlling device output according to a determined condition of a user
US11733761B2 (en) Methods and apparatus to manage power and performance of computing devices based on user presence
JP2022534338A (ja) Smart display panel apparatus and related methods
US20170206901A1 (en) Voice Control User Interface with Progressive Command Engagement
WO2019233029A1 (zh) Light-emitting device and interaction method therefor, electronic device, and storage medium
JP7108885B2 (ja) Wakefulness-induction control device and wakefulness-induction system
US10237390B2 (en) Intelligent notification device and intelligent notification method
CN110235525B (zh) Recommendation engine for a lighting system
CN106412313A (zh) Method, system and intelligent terminal for automatically adjusting screen display parameters
US20210090562A1 (en) Speech recognition control method and apparatus, electronic device and readable storage medium
TWI521384B (zh) Electronic device with eye-tracking function and control method thereof
US20230239208A1 (en) Methods and systems for customizing devices in an iot environment using self-adaptive mechanism
JP7231535B2 (ja) Smart device control method, smart device control apparatus, electronic device, and storage medium
CN111028908A (zh) Sleep state monitoring method, apparatus, device, and computer-readable storage medium
CN106020416B (zh) Screen display method, device, and smart device
CN112269322A (zh) Wake-up method and apparatus for smart device, electronic device, and medium
WO2017166645A1 (zh) Health prompting method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 18921706
    Country of ref document: EP
    Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 18921706
    Country of ref document: EP
    Kind code of ref document: A1