CN112583673A - Control method and device for awakening equipment - Google Patents
- Publication number
- CN112583673A (application number CN202011412965.3A)
- Authority
- CN
- China
- Prior art keywords
- user
- wake
- state
- awakening
- execute
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L12/00—Data switching networks
- H04L12/28—Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
- H04L12/2803—Home automation networks
- H04L12/2816—Controlling appliance services of a home automation network by calling their functionalities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/3331—Query processing
- G06F16/3332—Query translation
- G06F16/3334—Selection or weighting of terms from queries, including natural language queries
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/3331—Query processing
- G06F16/334—Query execution
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
- G06F40/289—Phrasal analysis, e.g. finite state techniques or chunking
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
- G10L25/63—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for estimating an emotional state
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y10/00—Economic sectors
- G16Y10/80—Homes; Buildings
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16Y—INFORMATION AND COMMUNICATION TECHNOLOGY SPECIALLY ADAPTED FOR THE INTERNET OF THINGS [IoT]
- G16Y40/00—IoT characterised by the purpose of the information processing
- G16Y40/30—Control
Abstract
The invention provides a control method and device for waking up a device. The method includes: acquiring voice data of a user; determining an emotional state of the user from the voice data; and controlling the wake-up device to execute a wake-up policy according to that emotional state. This addresses the problem that existing wake-up methods ignore the user's real-time emotion, which makes the wake-up manner abrupt and impersonal.
Description
Technical Field
The application belongs to the field of smart homes and relates to a control method and device for waking up a device.
Background
With the development of the Internet of Things and of hardware products, many smart devices offer wake-up functions, for example smart speakers or household appliances with audio playback, for which users can set a wake-up time.
However, existing wake-up methods do not take the user's real-time emotion into account, which makes the wake-up manner abrupt and impersonal.
Disclosure of Invention
The invention provides a control method and device for waking up a device, aiming to solve the problem that existing wake-up methods ignore the user's real-time emotion and therefore wake the user in an abrupt, impersonal manner.
According to a first aspect of the present invention, there is provided a control method of waking up a device, the method comprising: acquiring voice data of a user; determining an emotional state of the user according to the voice data; and controlling the awakening equipment to execute the awakening strategy according to the emotional state of the user.
Further, controlling the wake-up device to execute the wake-up policy according to the emotional state of the user includes: controlling the wake-up device to execute a first wake-up policy when the user's emotional state is a negative state; and controlling the wake-up device to execute a second wake-up policy when the user's emotional state is a non-negative state.
Further, after controlling the wake-up device to execute the wake-up policy according to the emotional state of the user, the method further comprises: determining whether the user is in a waking state according to the acquired physiological data of the user; and controlling the wake-up device to execute a third wake-up strategy in case the user is not in the awake state.
Further, before acquiring the voice data of the user, the method further comprises: acquiring physiological data of a user; determining the sleep state of the user according to the physiological data of the user; and controlling the awakening equipment to execute the awakening instruction according to the sleep state.
Further, determining the emotional state of the user from the speech data includes: converting the voice data into text data; performing emotion analysis on the text data to obtain emotion vocabularies; and determining the emotional state of the user according to the matching result of the emotional vocabulary and the emotional vocabulary feature library.
According to a second aspect of the present invention, there is provided a control apparatus for waking up a device, the apparatus comprising: the first acquisition unit is used for acquiring voice data of a user; a first determination unit for determining an emotional state of the user based on the voice data; and the first control unit is used for controlling the awakening equipment to execute the awakening strategy according to the emotional state of the user.
Further, the first control unit includes: a first control module for controlling the wake-up device to execute a first wake-up policy when the user's emotional state is a negative state; and a second control module for controlling the wake-up device to execute a second wake-up policy when the user's emotional state is a non-negative state.
Further, the apparatus further comprises: the second determining unit is used for determining whether the user is in a waking state according to the acquired physiological data of the user; and the second control unit is used for controlling the awakening device to execute the third awakening strategy under the condition that the user is not in the awakening state.
Further, the apparatus further comprises: a second acquisition unit for acquiring physiological data of a user; the third determining unit is used for determining the sleep state of the user according to the physiological data of the user; and the third control unit is used for controlling the awakening equipment to execute the awakening instruction according to the sleep state.
Further, the first determination unit includes: the conversion module is used for converting the voice data into text data; the analysis module is used for carrying out emotion analysis on the text data to obtain emotion vocabularies; and the determining module is used for determining the emotional state of the user according to the matching result of the emotional vocabulary and the emotional vocabulary feature library.
In summary, by acquiring a user's voice data, determining the user's emotional state from it, and controlling the wake-up device to execute a wake-up policy matched to that state, the invention solves the problem that existing wake-up methods ignore the user's real-time emotion and wake the user abruptly.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
To illustrate the embodiments of the present application or prior-art solutions more clearly, the drawings used in their description are briefly introduced below. The drawings described here show only some embodiments of the application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a control method for waking up a device according to a first embodiment of the present invention;
fig. 2 is a schematic diagram of an alternative wake-up apparatus control method according to a first embodiment of the present invention; and
fig. 3 is a schematic diagram of a control device of a wake-up apparatus according to a second embodiment of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the technical solutions are described in detail below. The described embodiments are only some, not all, of the possible embodiments of the application. All other embodiments that a person skilled in the art can derive from these examples without creative effort fall within the protection scope of the application.
Example one
As shown in fig. 1, the present scheme provides a control method for waking up a device, where the method includes:
in step S11, voice data of the user is acquired.
Step S13, the emotional state of the user is determined from the voice data.
And step S15, controlling the wake-up device to execute a wake-up strategy according to the emotional state of the user.
Specifically, the executing body of the method may be the wake-up device itself or a server. The wake-up device is a smart-home device with a voice function, such as a smart speaker, which is used as the example below. The smart speaker acquires the user's voice data, determines the user's emotional state through speech recognition, and then executes the wake-up policy corresponding to that emotional state.
Thus, the method takes the user's emotional state into account at wake-up time. Compared with the prior art, which considers only the wake-up time and then wakes the user forcefully, this approach is gentler: it solves the problem that existing wake-up methods ignore the user's real-time emotion and wake the user abruptly, and it reduces the user's negative mood on getting up.
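The three steps S11 to S15 can be sketched as a small pipeline. This is a minimal illustration, not the patented implementation: the keyword check stands in for real speech recognition and emotion analysis, and the word list and policy names are invented placeholders.

```python
def wake_pipeline(voice_text: str) -> str:
    """S11 -> S13 -> S15 sketch: recognized voice text in, chosen wake-up policy out."""
    negative_words = {"tired", "annoyed", "exhausted"}  # placeholder lexicon, not from the patent
    # S13: a trivial stand-in for emotion analysis
    state = ("negative"
             if any(w in negative_words for w in voice_text.lower().split())
             else "non_negative")
    # S15: map the emotional state to a wake-up policy
    policies = {"negative": "first (snooze)", "non_negative": "second (gentle ringtone)"}
    return policies[state]
```

A negative utterance selects the first (snooze) policy; anything else selects the gentler second policy.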
Optionally, the step S15 of controlling the wake up device to execute the wake up policy according to the emotional state of the user includes:
in step S151, in case that the emotional state of the user is a negative state, the wake-up device is controlled to execute a first wake-up policy.
And S152, controlling the awakening device to execute a second awakening strategy when the emotion of the user is in a non-negative state.
Specifically, the scheme determines whether the user's emotional state is negative or non-negative. In the negative state, it controls the wake-up device to execute a first wake-up policy, which may be a repeated-wake (snooze) policy: after the first alarm, the device wakes the user again every preset interval (for example, three minutes) until the user gets up. In the non-negative state, the scheme controls the wake-up device to execute a light-sleep-style wake-up: it plays the user's preset ringtone, or relaxing music if no ringtone is preset, and gradually increases the volume from the user's preset starting level.
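The two strategies reduce to simple schedules. The sketch below is illustrative only; the step size, interval, and repeat count are assumptions, not values from the patent.

```python
def gentle_wake_volumes(start: int, target: int, step: int = 5) -> list[int]:
    """Second policy: volume levels for the gradual volume-increase strategy."""
    vols = list(range(start, target, step))
    if not vols or vols[-1] != target:
        vols.append(target)  # always finish at the target volume
    return vols

def snooze_times(first_alarm_min: int, interval_min: int = 3, repeats: int = 3) -> list[int]:
    """First policy: alarm offsets (minutes) for the repeated-wake (snooze) strategy."""
    return [first_alarm_min + i * interval_min for i in range(repeats + 1)]
```

For example, ramping from volume 10 to 30 in steps of 5 plays at 10, 15, 20, 25, 30, while the snooze schedule fires every three minutes until the user gets up.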
Optionally, after controlling the wake up device to execute the wake up policy according to the emotional state of the user at step S15, the method further includes:
in step S17, it is determined whether the user is awake according to the acquired physiological data of the user.
In step S19, the wake-up device is controlled to execute a third wake-up strategy in case the user is not in an awake state.
Specifically, after the wake-up policy has been executed, the scheme acquires the user's physiological data (body movement, blink frequency, heart rate, respiratory rate, and so on) to determine whether the user is already awake. If not, it controls the wake-up device to execute a third wake-up policy, which may be a forceful wake-up, such as a ring tone exceeding a preset decibel level.
It should be noted that the user's physiological data may be collected by a sleep monitor (such as a smart band), with which the wake-up device establishes a communication link to obtain the data.
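The awake-state check and the third policy can be sketched as follows. All thresholds here are illustrative assumptions; the patent does not specify numeric values.

```python
def is_awake(body_movements_per_min: float, blink_rate_hz: float,
             heart_rate_bpm: float, movement_thresh: float = 2.0,
             hr_thresh: float = 65.0) -> bool:
    """Rough awake test over the monitored physiological signals:
    clear body movement, or blinking combined with an elevated heart rate."""
    if body_movements_per_min >= movement_thresh:
        return True
    return blink_rate_hz > 0 and heart_rate_bpm >= hr_thresh

def third_policy_volume(preset_db: float, boost_db: float = 10.0) -> float:
    """Third wake-up policy: ring at a level exceeding the preset decibel value."""
    return preset_db + boost_db
```

If `is_awake` returns `False` after the first or second policy has run, the device escalates to the louder third policy.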
Optionally, before acquiring the voice data of the user in step S11, the method further includes:
in step S07, physiological data of the user is acquired.
In step S08, the sleep state of the user is determined based on the physiological data of the user.
And step S09, controlling the wake-up device to execute the wake-up command according to the sleep state.
Specifically, before the emotion analysis of the user's voice data, near the user's preset wake-up time the scheme uses the user's physiological data (body movement, blink frequency, heart rate, respiratory rate, and so on) to judge the sleep state two minutes before that time. If the user is in light sleep, the device plays the user's preset ringtone, or relaxing music if none is preset, and gradually raises the volume from the preset level to wake the user. If the user is detected to be in deep sleep two minutes before the preset time, a gradual volume-increase strategy performs a first wake-up. The method of step S11 above is executed after this first wake-up.
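A light-versus-deep classification over the two-minute pre-alarm window might look like the sketch below. The rule and its thresholds are illustrative assumptions (deep sleep generally shows lower heart and respiratory rates and little body movement); the patent does not define a concrete classifier.

```python
def classify_sleep(heart_rate_bpm: float, resp_rate_per_min: float,
                   movements_in_window: int) -> str:
    """Illustrative light/deep classifier for the 2-minute window before the alarm."""
    if movements_in_window == 0 and heart_rate_bpm < 55 and resp_rate_per_min < 13:
        return "deep"
    return "light"
```

A "light" result triggers the gentle ringtone wake-up directly; a "deep" result triggers the gradual volume-increase first wake-up before step S11 runs.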
Optionally, the step S13 of determining the emotional state of the user according to the voice data includes:
step S131 converts the voice data into text data.
And step S132, performing emotion analysis on the text data to obtain emotion vocabularies.
And step S133, determining the emotional state of the user according to the matching result of the emotional vocabulary and the emotional vocabulary feature library.
Specifically, the scheme can detect the user's voice in real time, convert the detected voice to text, analyze the text, and run emotion analysis on the extracted text to obtain emotion vocabulary. If the extracted words match negative words in the emotional vocabulary feature library, the user is judged to be in a negative state. If the user utters positive or neutral words expressing non-negative emotion, for example a neutral phrase such as "I should get up", and these match non-negative words in the feature library, the user is judged to be in a non-negative state.
It should be noted that, after converting the voice data into text data, the scheme may preprocess the text, for example by removing redundant material, before performing the emotion analysis. The emotional vocabulary feature library divides the user's emotions into two categories, negative and non-negative, according to a predetermined standard of emotion words, with each category mapped to corresponding emotion words in the database. The feature library can also be learned and adjusted from the user's interaction habits.
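The conversion, preprocessing, and library-matching steps can be sketched as below. The two-category library, the token filter, and every word in it are invented placeholders; the real library per the description is learned and adjusted from user interaction.

```python
import re

# Illustrative two-category emotional vocabulary feature library (placeholder words)
FEATURE_LIBRARY = {
    "negative": {"tired", "annoying", "exhausted", "hate"},
    "non_negative": {"should", "fine", "okay", "ready"},
}

def preprocess(text: str) -> list[str]:
    """Lowercase, strip punctuation, and drop one-character filler tokens."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return [t for t in tokens if len(t) > 1]

def match_emotion(text: str) -> str:
    """Match the extracted words against the library; any negative hit wins."""
    words = set(preprocess(text))
    if words & FEATURE_LIBRARY["negative"]:
        return "negative"
    return "non_negative"
```

Treating "negative" as the default-winning match mirrors the description's two-way split: anything that does not hit a negative entry falls through to the non-negative state.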
A specific embodiment of the present solution is described below with reference to fig. 2:
the method comprises the steps of firstly detecting the sleep state of a user through a sleep detection instrument, judging whether the user is in a shallow sleep state or not according to the sleep state, if so, controlling an awakening device to play a gradually-intensified relaxed music sound to awaken the user, if the user is not in the shallow sleep state and is in a deep sleep state, realizing first awakening by adopting a strategy of gradually-intensified volume, then carrying out voice recognition on the user, converting voice data into text data, then preprocessing, carrying out emotion analysis on the preprocessed text, judging whether the user is in a negative emotion or not, and executing a snooze strategy in the negative state, wherein the snooze strategy is executed, for example, awakening every preset time (for example, three minutes) after awakening until the user gets up. When the client is in a non-passive state, the scheme controls the awakening device to execute a strategy of waking up in a shallow sleep mode, for example, a ring preset by the user is played, if the ring is not preset, a relaxed music is played, and the volume is awakened by adopting a strategy of gradually increasing the volume preset by the user.
Example two
As shown in fig. 3, the present solution provides a control apparatus for waking up a device, where the apparatus includes: a first acquisition unit 30 for acquiring voice data of a user; a first determination unit 32 for determining an emotional state of the user from the voice data; a first control unit 34 for controlling the wake-up device to execute a wake-up strategy according to the emotional state of the user.
Thus, the apparatus takes the user's emotional state into account at wake-up time. Compared with the prior art, which considers only the wake-up time and then wakes the user forcefully, this approach is gentler: it solves the problem that existing wake-up methods ignore the user's real-time emotion and wake the user abruptly, and it reduces the user's negative mood on getting up.
Optionally, the first control unit includes: a first control module for controlling the wake-up device to execute a first wake-up policy when the user's emotional state is a negative state; and a second control module for controlling the wake-up device to execute a second wake-up policy when the user's emotional state is a non-negative state.
Optionally, the apparatus further comprises: the second determining unit is used for determining whether the user is in a waking state according to the acquired physiological data of the user; and the second control unit is used for controlling the awakening device to execute the third awakening strategy under the condition that the user is not in the awakening state.
Optionally, the apparatus further comprises: a second acquisition unit for acquiring physiological data of a user; the third determining unit is used for determining the sleep state of the user according to the physiological data of the user; and the third control unit is used for controlling the awakening equipment to execute the awakening instruction according to the sleep state.
Optionally, the first determining unit includes: the conversion module is used for converting the voice data into text data; the analysis module is used for carrying out emotion analysis on the text data to obtain emotion vocabularies; and the determining module is used for determining the emotional state of the user according to the matching result of the emotional vocabulary and the emotional vocabulary feature library.
It is understood that the same or similar parts in the above embodiments may be mutually referred to, and the same or similar parts in other embodiments may be referred to for the content which is not described in detail in some embodiments.
It should be noted that, in the description of the present application, the terms "first", "second", and so on are used for description only and are not to be construed as indicating or implying relative importance. In addition, unless otherwise specified, "plurality" means at least two.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or intervening elements may be present. When an element is referred to as being "connected" to another element, it can be directly connected or connected through intervening elements; as used herein, "connected" may include wireless connection. The term "and/or" includes any and all combinations of one or more of the associated listed items.
Any process or method description in a flow chart or otherwise described herein may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present application also includes implementations in which functions are executed out of the order shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.
Claims (10)
1. A method of controlling a wake-up device, the method comprising:
acquiring voice data of a user;
determining an emotional state of the user according to the voice data;
and controlling the awakening equipment to execute an awakening strategy according to the emotional state of the user.
2. The method of claim 1, wherein controlling the wake-up device to implement a wake-up policy according to the emotional state of the user comprises:
controlling the wake-up device to execute a first wake-up policy if the emotional state of the user is a negative state;
controlling the wake-up device to execute a second wake-up policy in the case that the emotional state of the user is a non-negative state.
3. Method according to claim 1 or 2, wherein after controlling the wake-up device to execute a wake-up policy according to the emotional state of the user, the method further comprises:
determining whether the user is in a waking state according to the acquired physiological data of the user;
controlling the wake-up device to execute a third wake-up strategy if the user is not in an awake state.
4. The method of claim 1, wherein prior to obtaining voice data for a user, the method further comprises:
acquiring physiological data of a user;
determining a sleep state of the user according to the physiological data of the user;
and controlling the awakening equipment to execute an awakening instruction according to the sleep state.
5. The method of claim 1 or 4, wherein said determining an emotional state of the user from the speech data comprises:
converting the voice data into text data;
performing emotion analysis on the text data to obtain emotion vocabularies;
and determining the emotional state of the user according to the matching result of the emotional vocabulary and the emotional vocabulary feature library.
6. A control apparatus for waking up a device, the apparatus comprising:
the first acquisition unit is used for acquiring voice data of a user;
a first determining unit for determining an emotional state of the user according to the voice data;
and the first control unit is used for controlling the awakening equipment to execute an awakening strategy according to the emotional state of the user.
7. The apparatus of claim 6, wherein the first control unit comprises:
the first control module is used for controlling the awakening equipment to execute a first awakening strategy under the condition that the emotional state of the user is a negative state;
and the second control module is used for controlling the wake-up device to execute a second wake-up policy under the condition that the emotional state of the user is a non-negative state.
8. The apparatus of claim 6 or 7, further comprising:
the second determining unit is used for determining whether the user is in a waking state according to the acquired physiological data of the user;
a second control unit, configured to control the wake-up device to execute a third wake-up policy when the user is not in a wake state.
9. The apparatus of claim 6, further comprising:
a second acquisition unit for acquiring physiological data of a user;
a third determining unit, configured to determine a sleep state of the user according to the physiological data of the user;
and the third control unit is used for controlling the awakening equipment to execute the awakening instruction according to the sleep state.
10. The apparatus according to claim 6 or 9, wherein the first determining unit comprises:
the conversion module is used for converting the voice data into text data;
the analysis module is used for carrying out emotion analysis on the text data to obtain emotion vocabularies;
and the determining module is used for determining the emotional state of the user according to the matching result of the emotional vocabulary and the emotional vocabulary feature library.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011412965.3A CN112583673B (en) | 2020-12-04 | 2020-12-04 | Control method and device for awakening equipment |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011412965.3A CN112583673B (en) | 2020-12-04 | 2020-12-04 | Control method and device for awakening equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112583673A true CN112583673A (en) | 2021-03-30 |
CN112583673B CN112583673B (en) | 2021-10-22 |
Family
ID=75127465
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011412965.3A Active CN112583673B (en) | 2020-12-04 | 2020-12-04 | Control method and device for awakening equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112583673B (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104267923A (en) * | 2014-09-16 | 2015-01-07 | Huizhou TCL Mobile Communication Co., Ltd. | Music wake-up method and mobile device |
CN104932924A (en) * | 2015-06-30 | 2015-09-23 | Shanghai Haiyang Software Technology Co., Ltd. | Alarm-clock-based interactive reminding method, device, and terminal equipment |
CN105244023A (en) * | 2015-11-09 | 2016-01-13 | Shanghai Yuzhiyi Information Technology Co., Ltd. | System and method for reminding a teacher of emotion in classroom teaching |
CN108175921A (en) * | 2017-12-28 | 2018-06-19 | Shenzhen Saiyi Technology Development Co., Ltd. | Human-body wake-up method and apparatus |
CN108227932A (en) * | 2018-01-26 | 2018-06-29 | Shanghai Zhizhen Intelligent Network Technology Co., Ltd. | Interaction intention determination method and device, computer equipment, and storage medium |
CN108712557A (en) * | 2018-03-27 | 2018-10-26 | Zhejiang University | Emotion-cultivating music wake-up method |
CN108882454A (en) * | 2018-07-20 | 2018-11-23 | Foshan University | Intelligent voice recognition interactive lighting method and system based on emotion judgment |
CN110558933A (en) * | 2019-07-26 | 2019-12-13 | Shenzhen Launch Technology Co., Ltd. | Information prompting method and wearable device |
CN111282126A (en) * | 2020-01-19 | 2020-06-16 | Gree Electric Appliances, Inc. of Zhuhai | Sleep wake-up method and device |
Also Published As
Publication number | Publication date |
---|---|
CN112583673B (en) | 2021-10-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11017765B2 (en) | Intelligent assistant with intent-based information resolution | |
US20230352022A1 (en) | Voice activated device for use with a voice-based digital assistant | |
US11100384B2 (en) | Intelligent device user interactions | |
WO2019142427A1 (en) | Information processing device, information processing system, information processing method, and program | |
US10083397B2 (en) | Personalized intelligent wake-up system and method based on multimodal deep neural network | |
CN109036393A (en) | Wake-up word training method and device for a household appliance, and household appliance | |
Qian et al. | Automatic detection, segmentation and classification of snore related signals from overnight audio recording | |
CN109036395A (en) | Personalized smart speaker control method and system, smart speaker, and storage medium | |
CN109166571A (en) | Wake-up word training method and device for a household appliance, and household appliance | |
CN110491373A (en) | Model training method, device, storage medium and electronic equipment | |
CN112583673B (en) | Control method and device for awakening equipment | |
US11133020B2 (en) | Assistive technology | |
CN108170452A (en) | Robot growth method | |
WO2019221894A1 (en) | Intelligent device user interactions | |
Noh et al. | Smart Home with Biometric System Recognition | |
CN113883669A (en) | Sleep-assisting control method and device for air conditioner, electronic equipment and storage medium | |
JP2023535341A (en) | Computer-implemented method for providing data for automatic baby cry determination | |
Nagy et al. | An anytime voice controlled ambient assisted living system for motion disabled persons | |
KR20170087225A (en) | Apparatus, method, and recording medium for providing animal sound analysis information | |
Sharma et al. | Acoustic analysis of infant cry signal towards automatic detection of the cause of crying | |
CN109858334A (en) | Baby demand recognition method and device, and computer-readable storage medium | |
Ayu et al. | Audio detection (Audition): Android based sound detection application for hearing-impaired using AdaBoostM1 classifier with REPTree weaklearner | |
CN115104548B (en) | Pet behavior adjustment and human-pet interaction method and device based on multimedia information technology | |
WO2021082196A1 (en) | Monitoring method using household appliance, device, and computer storage medium | |
WO2019242312A1 (en) | Wakeup word training method and device of household appliance, and household appliance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |