CN103729193A - Method and device for man-machine interaction - Google Patents

Method and device for man-machine interaction

Info

Publication number: CN103729193A
Authority: CN (China)
Prior art keywords: language, user, password, voiceprint, authentication
Legal status: Pending
Application number: CN201410012205.1A
Other languages: Chinese (zh)
Inventors: 王艳龙, 雷雄国, 俞凯, 李力
Current assignee: AI Speech Ltd; Suzhou Speech Information Technology Co Ltd
Original assignee: Suzhou Speech Information Technology Co Ltd
Filing date: 2014-01-11
Publication date: 2014-04-16
Application filed by Suzhou Speech Information Technology Co Ltd
Priority to CN201410012205.1A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention belongs to the technical field of computers and discloses a method and device for man-machine interaction. The method comprises the following steps: when a terminal is in sleep mode, the terminal monitors the user's speech; when the monitored speech contains a wake command word, the terminal switches to working mode; and when an interaction request is received, man-machine interaction is carried out by means of natural language. Because a wake command word is set, the terminal automatically enters working mode when the user's speech contains it, and interaction requires only natural language rather than manual operation of a touch screen, so the method and device are highly flexible.

Description

Man-machine interaction method and device
Technical field
The present invention relates to the field of computer technology, and in particular to a man-machine interaction method and device.
Background technology
With the spread of intelligent terminals, more and more smart devices and smart wearable devices have appeared and come into wide use, for example smart glasses, smart watches, smart bracelets, smart rings, smart necklaces and smart body scales. On these devices, man-machine interaction is a core part of the experience and functionality.
In the prior art, smart devices and smart wearable devices are equipped with touch screens, and the user carries out man-machine interaction in the traditional, vision-based way by triggering buttons on the touch screen.
In the course of implementing the present invention, the inventors found that the prior art has at least the following problem:
Because smart devices and smart wearable devices are small, the touch screens they carry are also very small. During interaction the user cannot see the screen clearly and cannot control the interactive process precisely, so the approach has clear limitations.
Summary of the invention
To solve the problems of the prior art, embodiments of the present invention provide a man-machine interaction method and device. The technical solution is as follows:
In one aspect, a man-machine interaction method is provided, the method comprising:
when the terminal is in sleep mode, the terminal monitors the user's speech;
when the monitored user speech contains a wake command word, switching to working mode;
when an interaction request is received, carrying out man-machine interaction by means of natural language.
Optionally, the method further comprises:
when no interaction request is received within a preset time, the terminal switches back to sleep mode.
Optionally, switching to working mode when the monitored user speech contains the wake command word comprises:
performing identity authentication when the monitored user speech contains the wake command word;
switching to working mode when authentication passes.
Optionally, performing identity authentication when the monitored user speech contains the wake command word comprises:
when the monitored user speech contains the wake command word, obtaining the voiceprint password of the user speech;
judging whether the voiceprint password of the user speech is among the at least one stored voiceprint password;
confirming that authentication passes when the at least one stored voiceprint password contains the voiceprint password of the user speech;
confirming that authentication fails when the at least one stored voiceprint password does not contain the voiceprint password of the user speech.
Optionally, the voiceprint password comprises a fixed password text or a non-fixed password text.
Optionally, performing identity authentication when the monitored user speech contains the wake command word comprises:
when the monitored user speech contains the wake command word, prompting the user to input a spoken password;
confirming that authentication passes when the spoken password input by the user is identical to the preset password stored on the terminal;
confirming that authentication fails when the spoken password input by the user is not identical to the preset password stored on the terminal.
Optionally, the wake command word comprises one or more words.
Optionally, carrying out man-machine interaction by means of natural language when an interaction request is received comprises:
when an interaction request is received, obtaining the natural language input by the user;
processing the natural language input by the user by means of language understanding and intelligent dialogue, and obtaining the natural language to be output according to the environment information of the terminal.
In another aspect, a man-machine interaction device is provided, the device comprising:
a monitoring module, configured to monitor the user's speech when the terminal is in sleep mode;
a mode switching module, configured to switch to working mode when the monitored user speech contains the wake command word;
an interaction module, configured to carry out man-machine interaction by means of natural language when an interaction request is received.
Optionally, the mode switching module is further configured to switch the terminal back to sleep mode when no interaction request is received within a preset time.
Optionally, the mode switching module comprises:
an authentication unit, configured to perform identity authentication when the monitored user speech contains the wake command word;
the mode switching module is further configured to switch to working mode when authentication passes.
Optionally, the authentication unit comprises:
a voiceprint password obtaining subunit, configured to obtain the voiceprint password of the user speech when the monitored user speech contains the wake command word;
a judging subunit, configured to judge whether the voiceprint password of the user speech is among the at least one stored voiceprint password;
the authentication unit is further configured to confirm that authentication passes when the at least one stored voiceprint password contains the voiceprint password of the user speech, and to confirm that authentication fails when it does not.
Optionally, the voiceprint password comprises a fixed password text or a non-fixed password text.
Optionally, the authentication unit comprises:
a prompting subunit, configured to prompt the user to input a spoken password when the monitored user speech contains the wake command word;
the authentication unit is further configured to confirm that authentication passes when the spoken password input by the user is identical to the preset password stored on the terminal, and to confirm that authentication fails when it is not.
Optionally, the wake command word comprises one or more words.
Optionally, the interaction module comprises:
a natural language obtaining unit, configured to obtain the natural language input by the user when an interaction request is received;
a natural language output unit, configured to process the natural language input by the user by means of language understanding and intelligent dialogue, and to obtain the natural language to be output according to the environment information of the terminal.
The technical solution provided by the embodiments of the present invention brings the following beneficial effect:
By setting a wake command word, the terminal enters working mode automatically when the user's speech contains the wake command word, and man-machine interaction can then be carried out with natural language alone, without manually operating a touch screen, so the solution is highly flexible.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present invention more clearly, the accompanying drawings needed for the description of the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a man-machine interaction method provided by an embodiment of the present invention;
Fig. 2 is a flowchart of a man-machine interaction method provided by an embodiment of the present invention;
Fig. 3 is a flowchart of a man-machine interaction method provided by an embodiment of the present invention;
Fig. 4 is a flowchart of a man-machine interaction method provided by an embodiment of the present invention;
Fig. 5 is a flowchart of a man-machine interaction method provided by an embodiment of the present invention;
Fig. 6 is a flowchart of a man-machine interaction method provided by an embodiment of the present invention;
Fig. 7 is a schematic structural diagram of a man-machine interaction device provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions and advantages of the present invention clearer, the embodiments of the present invention are described in further detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of a man-machine interaction method provided by an embodiment of the present invention. Referring to Fig. 1, this embodiment comprises:
101: when the terminal is in sleep mode, the terminal monitors the user's speech;
102: when the monitored user speech contains a wake command word, the terminal switches to working mode;
103: when an interaction request is received, man-machine interaction is carried out by means of natural language.
In the method provided by this embodiment of the present invention, a wake command word is set; when the user's speech contains the wake command word, the terminal enters working mode automatically, and man-machine interaction can be carried out with natural language alone, without manually operating a touch screen, so the method is highly flexible.
Optionally, the method further comprises: when no interaction request is received within a preset time, the terminal switches back to sleep mode.
Optionally, switching to working mode when the monitored user speech contains the wake command word comprises:
performing identity authentication when the monitored user speech contains the wake command word;
switching to working mode when authentication passes.
Optionally, performing identity authentication when the monitored user speech contains the wake command word comprises:
when the monitored user speech contains the wake command word, obtaining the voiceprint password of the user speech;
judging whether the voiceprint password of the user speech is among the at least one stored voiceprint password;
confirming that authentication passes when the at least one stored voiceprint password contains the voiceprint password of the user speech;
confirming that authentication fails when the at least one stored voiceprint password does not contain the voiceprint password of the user speech.
Optionally, the voiceprint password comprises a fixed password text or a non-fixed password text.
Optionally, performing identity authentication when the monitored user speech contains the wake command word comprises:
when the monitored user speech contains the wake command word, prompting the user to input a spoken password;
confirming that authentication passes when the spoken password input by the user is identical to the preset password stored on the terminal;
confirming that authentication fails when the spoken password input by the user is not identical to the preset password stored on the terminal.
Optionally, the wake command word comprises one or more words.
Optionally, carrying out man-machine interaction by means of natural language when an interaction request is received comprises:
when an interaction request is received, obtaining the natural language input by the user;
processing the natural language input by the user by means of language understanding and intelligent dialogue, and obtaining the natural language to be output according to the environment information of the terminal.
Fig. 2 is a flowchart of a man-machine interaction method provided by an embodiment of the present invention. Referring to Fig. 2, this embodiment comprises:
201: when the terminal is in sleep mode, the terminal monitors the user's speech.
In this embodiment of the present invention, the terminal may be a fixed terminal or a mobile terminal. The fixed terminal may be a PC (Personal Computer) or a display device; the mobile terminal may be a smart wearable device, a smartphone, a tablet computer, an MP3 player (MPEG Audio Layer III), a PDA (Personal Digital Assistant), and so on.
In this embodiment of the present invention, the terminal may be equipped with a smart microphone. The smart microphone stays in a listening state at all times; it segments the incoming speech into sections by a VAD (Voice Activity Detection) method and processes them, continuously detecting whether the user's speech contains the preset wake command word. The VAD method detects the beginning and end of speech.
In this embodiment of the present invention, the terminal collects audio without interruption, and every speech segment goes through wake-word detection, which checks whether the segment contains the wake command word. A minimal sketch of this monitoring loop follows.
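As an illustration only (not the patented implementation), the following Python sketch shows one way the always-listening behaviour could look: a simple energy-threshold VAD splits the incoming audio into voiced segments, and each segment is handed to a wake-word detector. The frame length, threshold value and the contains_wake_word() callback are assumptions introduced for the example.

```python
import numpy as np

FRAME_LEN = 320          # 20 ms frames at 16 kHz (assumed)
ENERGY_THRESHOLD = 1e-3  # assumed tuning value

def vad_segments(samples):
    """Split a mono 16 kHz float signal into voiced segments
    using a simple energy-based VAD (illustrative only)."""
    frames = [samples[i:i + FRAME_LEN]
              for i in range(0, len(samples) - FRAME_LEN + 1, FRAME_LEN)]
    voiced = [np.mean(f ** 2) > ENERGY_THRESHOLD for f in frames]

    segments, start = [], None
    for i, v in enumerate(voiced):
        if v and start is None:
            start = i                                   # speech begins
        elif not v and start is not None:
            segments.append(np.concatenate(frames[start:i]))
            start = None                                # speech ends
    if start is not None:
        segments.append(np.concatenate(frames[start:]))
    return segments

def monitor(audio_stream, contains_wake_word):
    """Continuously check each voiced segment for the wake command word."""
    for chunk in audio_stream:                          # chunk: np.ndarray of samples
        for segment in vad_segments(chunk):
            if contains_wake_word(segment):
                return segment                          # hand over to authentication
```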
202: when the monitored user speech contains the wake command word, perform identity authentication.
The wake command word comprises one or more words. It is used so that, when the terminal hears the command word, the device moves from the dormant state into the working state; the wake command word is, in effect, the device's name. It can be set by a technician or by the user during use, and this embodiment of the present invention places no specific restriction on it. For example, the wake command word may be "little assistant".
This embodiment of the present invention is described taking a terminal on which an unlock permission is set as an example. In another embodiment of the present invention, the terminal may have no unlock permission set, in which case the identity authentication step can be skipped.
In this embodiment of the present invention, identity authentication is performed in either of the following ways:
Mode one: when the monitored user speech contains the wake command word, obtain the voiceprint password of the user speech; judge whether the voiceprint password of the user speech is among the at least one stored voiceprint password; confirm that authentication passes when it is, and confirm that authentication fails and continue monitoring when it is not.
The voiceprint password comprises a fixed password text or a non-fixed password text. When the voiceprint password is a non-fixed password text, it may be provided at random by a server when the user registers, or set by the user during use; this embodiment of the present invention places no specific restriction on this.
In this embodiment of the present invention, voiceprint verification is divided into text-dependent and text-independent methods. The text-dependent method requires the user to set a fixed password, and the verification stage uses this fixed password text. The text-independent method does not require a fixed password and allows the user to verify with any speech.
In this embodiment of the present invention, a threshold can be set for the security level, so that different degrees of security are achieved. With a low security level, verification is passed more easily; with a high security level, a genuine user is more likely to be rejected.
In this embodiment of the present invention, both methods require a voiceprint password model to be trained in advance. The model is a GMM (Gaussian Mixture Model), and the model file is stored in a local database.
In this embodiment of the present invention, the terminal collects the user's speech, extracts PLP features from it, and computes a matching score against the pre-trained GMM. If the similarity score is greater than the configured security threshold, verification is considered successful (a sketch follows the example below).
For example, user: Little assistant.
Device (after verification passes): Welcome, master. What can I do for you?
In a concrete implementation of this mode, the audio of the user's spoken wake command is used as the voiceprint verification data, and voiceprint password verification is performed on it.
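To illustrate the voiceprint check of mode one, the sketch below enrolls a speaker by fitting a GMM on acoustic feature vectors (for example the PLP features mentioned above, computed elsewhere) and accepts a new utterance when its average log-likelihood exceeds a security threshold. The use of scikit-learn, the function names and the threshold value are assumptions for illustration, not part of the patent.

```python
from sklearn.mixture import GaussianMixture

def enroll_voiceprint(features, n_components=16):
    """Train a speaker GMM from enrollment features (n_frames x n_dims),
    e.g. PLP or MFCC vectors computed by a separate front end."""
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag")
    gmm.fit(features)
    return gmm

def verify_voiceprint(gmm, features, security_threshold=-45.0):
    """Return True when the utterance matches the enrolled voiceprint.
    security_threshold is an assumed value; raising it increases security."""
    score = gmm.score(features)      # average log-likelihood per frame
    return score > security_threshold
```

Raising security_threshold corresponds to the higher security level described above: genuine users are more likely to be rejected, impostors less likely to be accepted.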
Mode two: when the monitored user speech contains the wake command word, prompt the user to input a spoken password (sketched after the example below); confirm that authentication passes when the spoken password input by the user is identical to the preset password stored on the terminal; confirm that authentication fails and continue monitoring when it is not.
For example, user: Little assistant.
Device: Hello, master. Please say the password you have set.
User: Open sesame.
Device: Welcome, master. What can I do for you?
In a concrete implementation of this mode, the user's spoken-password audio additionally needs to be collected, and the password verification is carried out on it.
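Mode two reduces to comparing the recognized spoken password with the preset password stored on the terminal. The sketch below assumes hypothetical prompt() and recognize() helpers standing in for the terminal's speech synthesis and speech recognition.

```python
def authenticate_by_spoken_password(prompt, recognize, stored_password):
    """Mode two: prompt for a spoken password and compare the recognized
    text with the password preset on the terminal (illustrative sketch)."""
    prompt("Please say the password you have set")   # e.g. played via TTS
    spoken = recognize()                             # ASR result as text
    if spoken.strip() == stored_password:
        return True                                  # authentication passes
    return False                                     # authentication fails; keep monitoring
```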
Voice wake-up is implemented on the basis of a speech recognition method. It divides all speech into two classes: the wake command word and non-wake speech.
In the wake-command setup stage, a GMM model of the wake command word is generated.
In the voice wake-up detection stage, the user's speech is fed to the recognizer, and the recognition result is either "wake word" or "non-wake word", as sketched below.
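The two-class decision above (wake command word versus everything else) can be sketched with two GMMs trained offline, one on wake-word segments and one on other speech; at detection time the segment goes to whichever model scores higher. This is a simplified stand-in with assumed model and function names, and it could serve as the contains_wake_word() placeholder used in the earlier monitoring sketch (here operating on feature matrices rather than raw samples).

```python
from sklearn.mixture import GaussianMixture

def train_wake_models(wake_features, background_features, n_components=8):
    """Fit one GMM on wake-word segments and one on non-wake speech."""
    wake_gmm = GaussianMixture(n_components=n_components).fit(wake_features)
    bg_gmm = GaussianMixture(n_components=n_components).fit(background_features)
    return wake_gmm, bg_gmm

def contains_wake_word(segment_features, wake_gmm, bg_gmm):
    """Classify a segment as 'wake word' when the wake model scores higher."""
    return wake_gmm.score(segment_features) > bg_gmm.score(segment_features)
```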
203: when authentication passes, switch to working mode.
204: when an interaction request is received, carry out man-machine interaction by means of natural language.
In this embodiment of the present invention, after the terminal completes speech recognition it performs semantic parsing of the text to determine the user's intention, completes an information query or an operation according to that intention, and then produces the text to be fed back to the user. The intelligent dialogue module uses data local to the phone, or the environment information it has collected, converts the feedback text into sound by speech synthesis, and plays it to the user. The user hears the smart device's answer and can then continue with further rounds of spoken dialogue, as shown in Fig. 3. A sketch of one such round follows.
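One round of the interaction described above (recognition, semantic parsing, intent-based query or operation, then a synthesized reply) could be organized roughly as in the sketch below. The asr(), parse_intent(), tts() and handler callbacks are hypothetical placeholders; the patent does not prescribe a specific API.

```python
def handle_request(audio, asr, parse_intent, handlers, environment, tts):
    """One round of dialogue: speech in, spoken answer out (sketch)."""
    text = asr(audio)                          # speech recognition
    intent, slots = parse_intent(text)         # semantic parsing of the text
    handler = handlers.get(intent)             # e.g. "query_temperature", "set_alarm"
    if handler is None:
        reply = "Sorry, I did not understand that."
    else:
        # Handlers may use local data or collected environment information.
        reply = handler(slots, environment)
    tts(reply)                                 # synthesize and play the answer
    return reply
```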
Example 1:
User: Little assistant, what are the temperature and humidity here?
Device: The measurement here shows 30 degrees Celsius and 30% humidity. (The terminal uses data it has collected itself.)
User: Set an alarm for 30 minutes from now. (The terminal uses locally stored data.)
Device: Done.
This scenario is suitable for simple device control, simple information queries and similar functions.
Example 2:
With a network connection, the wearable device carries out intelligent voice interaction using cloud computing resources and data.
A wearable device connected to the Internet has much stronger computing and data-processing capability behind it, and can therefore offer more accurate speech recognition and richer dialogue interaction, as shown in Fig. 4.
User: Little assistant, will it snow in Beijing tomorrow?
Device: Beijing will be fine tomorrow, 20 degrees Celsius. (Network data is accessed.)
User: Book seats at the Sichuan restaurant for 5 people at 6 o'clock tomorrow evening; I'm getting together with friends.
Device: I found the Sichuan restaurant you often go to and booked the seats. (Network data is accessed.)
This embodiment is suitable for services with complex functions and makes the smart device's capabilities much richer and more powerful.
Example 3:
The embodiment of the present invention can also use the user's other smart devices for collaborative computing and dialogue, for example using the user's mobile phone for computation and interaction, as shown in Fig. 5.
Combining example 1 and example 2, a smart device can access local data and cloud data, and can at the same time access the user's other smart devices, letting multiple smart devices work together.
Possible working method 1: device A measures some of the parameters and device B measures the others; all parameters are transferred to device C, which uses them in the dialogue interaction (sketched after this passage).
Based on this embodiment, personal smart devices are no longer isolated: they can work together and provide services to one another.
The techniques for connecting multiple devices include interconnection over Bluetooth Low Energy (BLE) and interconnection over a Wi-Fi local area network.
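Working method 1 can be pictured as a simple aggregation step in which device C merges the parameter readings reported by devices A and B (transported, for example, over BLE or the Wi-Fi LAN mentioned above). The device classes and reading names below are illustrative assumptions, not part of the patent.

```python
def collect_parameters(devices):
    """Merge parameter readings reported by several cooperating devices.
    Each device object is assumed to expose read_parameters() -> dict."""
    merged = {}
    for device in devices:
        merged.update(device.read_parameters())   # delivered e.g. over BLE or Wi-Fi
    return merged

class StubDevice:
    """Stand-in for a remote device that reports a fixed set of readings."""
    def __init__(self, readings):
        self.readings = readings
    def read_parameters(self):
        return dict(self.readings)

# Example: a wristband reports pulse and temperature, an air monitor reports
# room temperature and humidity; device C uses the merged values in the dialogue.
wristband = StubDevice({"pulse_bpm": 72, "temp_c": 36.6})
air_monitor = StubDevice({"room_temp_c": 30.0, "humidity_pct": 30})
print(collect_parameters([wristband, air_monitor]))
```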
Example 4:
User: Little assistant, I have just finished exercising and want to rest.
Device A (a wristband) collects pulse data and temperature data.
Device B (a living-room air monitor) collects ambient-temperature and ambient-humidity data.
Device C (an air conditioner) adjusts the temperature and switches to cool-air mode.
Device: The environment has been set to rest mode!
205: when no interaction request is received within a preset time, the terminal switches back to sleep mode.
The preset time can be set by a technician or by the user during use; this embodiment of the present invention places no specific restriction on it.
The man-machine interaction process is shown in Fig. 6. The terminal monitors the user's speech; when it hears the wake command word it performs identity authentication; when verification passes it receives the user's spoken query and processes the request. When verification fails, the terminal continues monitoring. When the terminal has been idle for a long time, it automatically enters sleep mode. The sketch below shows this flow as a simple loop.
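The flow of Fig. 6 amounts to a small loop: monitor in sleep mode, authenticate on the wake word, serve requests in working mode, and fall back to sleep after the preset idle time. The sketch below is one possible way to express it; the default idle time and the callback functions are assumptions.

```python
import time

def interaction_loop(listen, authenticate, poll_request, handle_request,
                     preset_idle_s=60.0):
    """Sleep-mode monitoring, wake-word authentication, interaction,
    and automatic return to sleep after a preset idle time (sketch)."""
    while True:
        # Sleep mode: only the wake-word monitor runs.
        segment = listen()                       # blocks until the wake word is heard
        if not authenticate(segment):
            continue                             # verification failed; keep monitoring

        # Working mode: serve requests until the preset idle time expires.
        last_active = time.monotonic()
        while time.monotonic() - last_active < preset_idle_s:
            request = poll_request()             # returns None when nothing is pending
            if request is not None:
                handle_request(request)
                last_active = time.monotonic()
            time.sleep(0.1)
        # No request within the preset time: fall back to sleep mode.
```

Here listen, authenticate, poll_request and handle_request would be wired to the monitoring, authentication and dialogue pieces sketched earlier.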
In the method provided by this embodiment of the present invention, a wake command word is set; when the user's speech contains the wake command word, the terminal enters working mode automatically, and man-machine interaction can be carried out with natural language alone, without manually operating a touch screen, so the method is highly flexible.
Fig. 7 is a schematic structural diagram of a man-machine interaction device provided by an embodiment of the present invention. Referring to Fig. 7, the device comprises a monitoring module 701, a mode switching module 702 and an interaction module 703.
The monitoring module 701 is configured to monitor the user's speech when the terminal is in sleep mode. The monitoring module 701 is connected to the mode switching module 702, which is configured to switch to working mode when the monitored user speech contains the wake command word. The mode switching module 702 is connected to the interaction module 703, which is configured to carry out man-machine interaction by means of natural language when an interaction request is received.
Optionally, the mode switching module 702 is further configured to switch the terminal back to sleep mode when no interaction request is received within a preset time.
Optionally, the mode switching module 702 comprises an authentication unit, configured to perform identity authentication when the monitored user speech contains the wake command word; the mode switching module is further configured to switch to working mode when authentication passes.
Optionally, the authentication unit comprises: a voiceprint password obtaining subunit, configured to obtain the voiceprint password of the user speech when the monitored user speech contains the wake command word; and a judging subunit, configured to judge whether the voiceprint password of the user speech is among the at least one stored voiceprint password. The authentication unit is further configured to confirm that authentication passes when the at least one stored voiceprint password contains the voiceprint password of the user speech, and to confirm that authentication fails when it does not.
Optionally, the voiceprint password comprises a fixed password text or a non-fixed password text.
Optionally, the authentication unit comprises a prompting subunit, configured to prompt the user to input a spoken password when the monitored user speech contains the wake command word. The authentication unit is further configured to confirm that authentication passes when the spoken password input by the user is identical to the preset password stored on the terminal, and to confirm that authentication fails when it is not.
Optionally, the wake command word comprises one or more words.
Optionally, the interaction module 703 comprises: a natural language obtaining unit, configured to obtain the natural language input by the user when an interaction request is received; and a natural language output unit, configured to process the natural language input by the user by means of language understanding and intelligent dialogue and to obtain the natural language to be output according to the environment information of the terminal.
The device provided by this embodiment of the present invention supports users with impaired eyesight, frees the hands, and is convenient to use in particular situations. In addition, the device is equipped with access control: only the terminal's owner can unlock it, and unlocking requires only speech, which avoids cumbersome unlocking operations.
In the device provided by this embodiment of the present invention, a wake command word is set; when the user's speech contains the wake command word, the terminal enters working mode automatically, and man-machine interaction can be carried out with natural language alone, without manually operating a touch screen, so the device is highly flexible.
It should be noted that when the man-machine interaction device provided by the above embodiment performs man-machine interaction, the division into the functional modules described above is only an example. In practical applications, the above functions can be assigned to different functional modules as needed, that is, the internal structure of the device can be divided into different functional modules to complete all or part of the functions described above. In addition, the man-machine interaction device provided by the above embodiment and the embodiments of the man-machine interaction method belong to the same concept; for its specific implementation, refer to the method embodiments, which are not repeated here.
A person of ordinary skill in the art will understand that all or part of the steps of the above embodiments can be implemented in hardware, or by a program instructing the relevant hardware. The program can be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above description covers only preferred embodiments of the present invention and is not intended to limit the present invention. Any modification, equivalent replacement, improvement and the like made within the spirit and principles of the present invention shall be included in the protection scope of the present invention.

Claims (10)

1. A man-machine interaction method, characterized in that the method comprises:
when the terminal is in sleep mode, the terminal monitors the user's speech;
when the monitored user speech contains a wake command word, switching to working mode;
when an interaction request is received, carrying out man-machine interaction by means of natural language.
2. The method according to claim 1, characterized in that switching to working mode when the monitored user speech contains the wake command word comprises:
performing identity authentication when the monitored user speech contains the wake command word;
switching to working mode when authentication passes.
3. The method according to claim 1, characterized in that performing identity authentication when the monitored user speech contains the wake command word comprises:
when the monitored user speech contains the wake command word, obtaining the voiceprint password of the user speech;
judging whether the voiceprint password of the user speech is among the at least one stored voiceprint password;
confirming that authentication passes when the at least one stored voiceprint password contains the voiceprint password of the user speech;
confirming that authentication fails when the at least one stored voiceprint password does not contain the voiceprint password of the user speech;
wherein the voiceprint password comprises a fixed password text or a non-fixed password text.
4. The method according to claim 1, characterized in that performing identity authentication when the monitored user speech contains the wake command word comprises:
when the monitored user speech contains the wake command word, prompting the user to input a spoken password;
confirming that authentication passes when the spoken password input by the user is identical to the preset password stored on the terminal;
confirming that authentication fails when the spoken password input by the user is not identical to the preset password stored on the terminal.
5. The method according to claim 1, characterized in that carrying out man-machine interaction by means of natural language when an interaction request is received comprises:
when an interaction request is received, obtaining the natural language input by the user;
processing the natural language input by the user by means of language understanding and intelligent dialogue, and obtaining the natural language to be output according to the environment information of the terminal.
6. A man-machine interaction device, characterized in that the device comprises:
a monitoring module, configured to monitor the user's speech when the terminal is in sleep mode;
a mode switching module, configured to switch to working mode when the monitored user speech contains the wake command word;
an interaction module, configured to carry out man-machine interaction by means of natural language when an interaction request is received.
7. The device according to claim 6, characterized in that the mode switching module comprises:
an authentication unit, configured to perform identity authentication when the monitored user speech contains the wake command word;
the mode switching module being further configured to switch to working mode when authentication passes.
8. The device according to claim 6, characterized in that the authentication unit comprises:
a voiceprint password obtaining subunit, configured to obtain the voiceprint password of the user speech when the monitored user speech contains the wake command word;
a judging subunit, configured to judge whether the voiceprint password of the user speech is among the at least one stored voiceprint password;
the authentication unit being further configured to confirm that authentication passes when the at least one stored voiceprint password contains the voiceprint password of the user speech, and to confirm that authentication fails when the at least one stored voiceprint password does not contain the voiceprint password of the user speech;
wherein the voiceprint password comprises a fixed password text or a non-fixed password text.
9. The device according to claim 6, characterized in that the authentication unit comprises:
a prompting subunit, configured to prompt the user to input a spoken password when the monitored user speech contains the wake command word;
the authentication unit being further configured to confirm that authentication passes when the spoken password input by the user is identical to the preset password stored on the terminal, and to confirm that authentication fails when the spoken password input by the user is not identical to the preset password stored on the terminal.
10. The device according to claim 6, characterized in that the interaction module comprises:
a natural language obtaining unit, configured to obtain the natural language input by the user when an interaction request is received;
a natural language output unit, configured to process the natural language input by the user by means of language understanding and intelligent dialogue, and to obtain the natural language to be output according to the environment information of the terminal.