CN109243435A - Voice instruction execution method and system - Google Patents
- Publication number
- CN109243435A CN109243435A CN201810892711.2A CN201810892711A CN109243435A CN 109243435 A CN109243435 A CN 109243435A CN 201810892711 A CN201810892711 A CN 201810892711A CN 109243435 A CN109243435 A CN 109243435A
- Authority
- CN
- China
- Prior art keywords
- instruction
- information
- word
- task
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/08—Speech classification or search
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS OR SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/22—Procedures used during a speech recognition process, e.g. man-machine dialogue
- G10L2015/223—Execution procedure of a spoken command
Abstract
This application discloses a voice instruction execution method. The method includes: S1. receiving a first voice instruction and extracting keyword information from it, the first voice instruction containing a task; S2. determining the type of the task in the first voice instruction according to the keyword information and preset rules; S3. if the task is determined to be a reminder task, generating a reminder instruction according to the time information in the keyword information; S4. when the time in the time information arrives, sending an execution instruction according to the reminder instruction and executing the reminder task according to the execution instruction. The application addresses the technical problem in the related art that intelligent robots have a single function and poor adaptability to different scenes.
Description
Technical field
This application relates to the field of intelligent robot control, and in particular to a voice instruction execution method and system.
Background
In intelligent robot technology, voice-controlled intelligent robots have good application prospects because they are easy to operate and offer a good user experience. When operating such a robot, the user can instruct it to complete various tasks with a single sentence. Some voice robots in the related art can complete only one kind of task according to a preset speech recognition function; they are applied in a specific scene but can hardly carry out different types of tasks. Other alarm-clock robots in the related art can only reproduce the ringing function of a traditional alarm clock and cannot achieve diversified interaction with the user, so their practicability in real applications is poor.
No effective solution to these problems in the related art has yet been proposed.
Summary of the invention
The main purpose of this application is to provide a voice instruction execution method and system to solve the above problems in the related art.
To achieve this goal, according to one aspect of the application, a voice instruction execution method is provided. The method includes:
S1. receiving a first voice instruction and extracting keyword information from the first voice instruction, the first voice instruction containing a task; S2. determining the type of the task in the first voice instruction according to the keyword information and preset rules; S3. if the task is determined to be a reminder task, generating a reminder instruction according to the time information in the keyword information; S4. when the time in the time information arrives, sending an execution instruction according to the reminder instruction and executing the reminder task according to the execution instruction.
Further, in the method described above, after the task is determined to be a reminder task in S3 and before the reminder instruction is generated from the time information in the keyword information, the method includes: S31. judging whether the keyword information contains time information; S32. if it does, generating a reminder instruction according to that time information; S33. if it does not, sending a prompt instruction that prompts the user to issue a second voice instruction containing time information, and returning to S1.
Further, in the method described above, S4 includes: S41. judging whether the reminder instruction contains location information; S42. if it does, sending the execution instruction, which includes a motion execution instruction and a reminder execution instruction, when the time in the time information arrives; S43. moving to the position in the location information according to the motion execution instruction; S44. executing the reminder task according to the reminder execution instruction; S45. if the reminder instruction does not contain location information, sending the execution instruction, which includes only a reminder execution instruction, when the time in the time information arrives; S46. executing the reminder task according to the reminder execution instruction.
Further, in the method described above, S42 includes: S421. if the reminder instruction contains location information, comparing that location information with each location stored in the map library and judging whether any stored location matches it; S422. if a matching location exists, sending the execution instruction, which includes a motion execution instruction and a reminder execution instruction, and proceeding to S43; S423. if no matching location exists, proceeding to S45.
Further, in the method described above, S2 includes: S21. selecting the first keyword from the keyword information as the word to be matched; S22. comparing the word to be matched with each word in the dictionaries and judging whether the word to be matched is identical to a word in a dictionary; S23. if the word to be matched is identical to a matching word in a dictionary, judging the type of the dictionary containing the matching word and labeling the word to be matched with the corresponding type, the dictionary types including a person class, a time class, a location class, and an action class; S24. if the word to be matched has no identical word in the dictionaries, judging whether the keyword information contains a further keyword; S25. if it does, selecting the next keyword as the word to be matched and returning to S22; S26. if it does not, judging whether the types of the labeled keywords in the keyword information include both the person class and the action class; S27. if they do, determining that the type of the task in the first voice instruction is the reminder task.
To achieve the above goal, according to another aspect of the application, a voice instruction execution system is provided. The system includes:
a voice receiving unit, a task processing unit, and a task execution unit. The voice receiving unit receives the first voice instruction and sends it to the task processing unit; the first voice instruction contains a task. The task processing unit includes an extraction unit, a first judgment unit, and a first instruction unit. The extraction unit extracts keyword information from the first voice instruction; the first judgment unit determines the type of the task in the first voice instruction according to the keyword information and preset rules; the first instruction unit, when the first judgment unit determines that the task is a reminder task, generates a reminder instruction according to the time information in the keyword information and sends the reminder instruction to the task execution unit. The task execution unit, when the time in the time information arrives, sends an execution instruction according to the reminder instruction and executes the reminder task according to the execution instruction.
Further, in the system described above, the first judgment unit is also used to judge whether the keyword information contains time information; the first instruction unit is also used to generate and send a prompt instruction when the first judgment unit determines that the keyword information contains no time information, the prompt instruction prompting the user to issue a second voice instruction containing time information.
Further, in the system described above, the task execution unit further includes a second judgment unit, a second instruction unit, a motion unit, and a reminder unit. The second judgment unit is also used to judge whether the reminder instruction contains location information; the second instruction unit is also used to send the execution instruction, which includes a motion execution instruction and a reminder execution instruction, when the second judgment unit determines that the reminder instruction contains location information and the time in the time information arrives; the motion unit moves to the position in the location information according to the motion execution instruction; the reminder unit executes the reminder task according to the reminder execution instruction.
Further, in the system described above, the second instruction unit includes a third judgment unit and a map library. The third judgment unit, when the second judgment unit determines that the reminder instruction contains location information, compares that location information with each location in the map library and judges whether any stored location matches it. The second instruction unit sends the execution instruction, which includes a motion execution instruction and a reminder execution instruction, when the third judgment unit determines that a matching location exists.
Further, in the system described above, the first judgment unit includes an acquisition unit, a comparison unit, a first classification unit, a traversal unit, and a second classification unit. The acquisition unit obtains the word to be matched, which is a keyword selected from the keyword information. The comparison unit compares the word to be matched with each word in the dictionaries and judges whether the word to be matched is identical to a word in a dictionary. The first classification unit, when the comparison unit judges that the word to be matched is identical to a matching word in a dictionary, judges the type of the dictionary containing the matching word and labels the word to be matched with the corresponding type, the dictionary types including a person class, a time class, a location class, and an action class. The traversal unit, when the comparison unit judges that the word to be matched has no identical word in the dictionaries, judges whether the keyword information contains a further keyword. The second classification unit, when the traversal unit judges that no further keyword remains, judges whether the types of the labeled keywords include both the person class and the action class; if they do, it determines that the type of the task in the first voice instruction is the reminder task.
In the embodiments of the application, by combining speech recognition technology with program control technology and by extracting and matching the keywords of the recognized speech, the purpose of classifying and executing the user's voice instructions is achieved. This solves the technical problem that, because the related speech recognition technology can recognize only a single kind of sentence, the scenes to which it applies are limited.
Brief description of the drawings
The accompanying drawings, which form part of this application, are provided to give a further understanding of the application so that its other features, objects, and advantages become more apparent. The illustrative drawings of the application and their explanations serve to explain the application and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a schematic flowchart of a voice instruction execution method provided by an embodiment of the application;
Fig. 2 is a schematic flowchart of a reminder task execution method provided by an embodiment of the application;
Fig. 3 is a schematic flowchart of a task type judgment method provided by an embodiment of the application;
Fig. 4 is a schematic structural diagram of a voice instruction execution system provided by an embodiment of the application.
Detailed description of the embodiments
To help those skilled in the art better understand the scheme of this application, the technical scheme in the embodiments of the application is described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only a part of the embodiments of the application, not all of them. Based on the embodiments in this application, all other embodiments obtained by those of ordinary skill in the art without creative work shall fall within the scope of protection of this application.
It should be noted that the terms "first", "second", and the like in the description, claims, and drawings of this application are used to distinguish similar objects and not to describe a particular order or precedence. It should be understood that data so used are interchangeable where appropriate, so that the embodiments described herein can be implemented. In addition, the terms "include" and "have" and any of their variants are intended to cover non-exclusive inclusion; for example, a process, method, system, product, or device containing a series of steps or units is not necessarily limited to the steps or units explicitly listed, but may include other steps or units not explicitly listed or inherent to that process, method, product, or device.
It should also be noted that, in the absence of conflict, the embodiments of this application and the features in the embodiments can be combined with one another. The application is described in detail below with reference to the drawings and in conjunction with the embodiments.
An embodiment of the invention provides a voice instruction execution method. As shown in Fig. 1 to Fig. 3, the method includes the following steps:
S1. receiving a first voice instruction and extracting keyword information from it, the first voice instruction containing a task. Specifically, for example, on receiving the instruction "remind Mr. Wang to go to the office for a meeting at 7 o'clock" spoken by the user, the extracted keyword information is "7 o'clock", "go to the office", "Mr. Wang", and "meeting";
S2. determining the type of the task in the first voice instruction according to the keyword information and preset rules.
Further, in the method described above, S2 includes:
S21. selecting the first keyword from the keyword information as the word to be matched; specifically, for example, selecting "7 o'clock";
S22. comparing the word to be matched with each word in the dictionaries and judging whether the word to be matched is identical to a word in a dictionary; specifically, for example, the time dictionary contains all words expressing times in the range from 1:00 to 24:00, such as "7 o'clock", "half past two in the afternoon", and "18:40";
S23. if the word to be matched is identical to a matching word in a dictionary, judging the type of the dictionary containing the matching word and labeling the word to be matched with the corresponding type, the dictionary types including a person class, a time class, a location class, and an action class; specifically, for example, "7 o'clock" is matched successfully and labeled "time class";
S24. if the word to be matched has no identical word in the dictionaries, judging whether the keyword information contains a further keyword;
S25. if it does, selecting the next keyword as the word to be matched and returning to S22; specifically, for example, "go to the office" is selected, matched successfully, and labeled "location class"; "Mr. Wang" is selected, matched successfully, and labeled "person class"; "meeting" is selected, matched successfully, and labeled "action class";
S26. if it does not, judging whether the types of the labeled keywords in the keyword information include both the person class and the action class;
S27. if they do, determining that the type of the task in the first voice instruction is the reminder task. Specifically, for example, the above keyword information "7 o'clock", "go to the office", "Mr. Wang", and "meeting" includes the person class ("Mr. Wang") and the action class ("meeting"); therefore the user's first voice instruction "remind Mr. Wang to go to the office for a meeting at 7 o'clock" is determined to be a reminder task.
S3. if the task is determined to be a reminder task, generating a reminder instruction according to the time information in the keyword information.
Further, in the method described above, after the task is determined to be a reminder task in S3 and before the reminder instruction is generated from the time information in the keyword information, the method includes:
S31. judging whether the keyword information contains time information; specifically, for example, the above keyword information includes "7 o'clock";
S32. if it does, generating a reminder instruction according to that time information; specifically, for example, generating from "7 o'clock" a reminder instruction to execute "remind Mr. Wang to go to the office for a meeting" at 7 o'clock on the same day;
S33. if it does not, sending a prompt instruction that prompts the user to issue a second voice instruction containing time information, and returning to S1. Specifically, for example, if the user does not state a specific time in the instruction, e.g. "remind Mr. Wang to have a meeting", the robot broadcasts the voice prompt "please state the specific time of the task";
S4. when the time in the time information arrives, sending an execution instruction according to the reminder instruction and executing the reminder task according to the execution instruction.
Further, in the method described above, S4 includes:
S41. judging whether the reminder instruction contains location information;
S42. if it does, sending the execution instruction, which includes a motion execution instruction and a reminder execution instruction, when the time in the time information arrives;
Optionally, step S41 can instead be carried out when the time in the time information arrives, i.e. S41'. when the time in the time information arrives, judging whether the reminder instruction contains location information; S42'. if it does, sending the execution instruction, which includes a motion execution instruction and a reminder execution instruction;
Further, in the method described above, S42 includes:
S421. if the reminder instruction contains location information, comparing that location information with each location in the map library and judging whether any stored location matches it; specifically, for example, the reminder instruction "remind Mr. Wang to go to the office for a meeting" contains the location information "office", which is compared with the locations in the map library; the map library contains the common locations of the place where the robot is deployed;
S422. if a matching location exists, sending the execution instruction, which includes a motion execution instruction and a reminder execution instruction, and proceeding to S43;
S423. if no matching location exists, proceeding to S45;
S43. moving to the position in the location information according to the motion execution instruction;
S44. executing the reminder task according to the reminder execution instruction; specifically, for example, if the reminder task contains a location marked in the map library, then when the reminder time of 7 o'clock arrives, the robot automatically moves to that position and executes the reminder task, e.g. issuing the voice prompt "Mr. Wang, please prepare for the meeting";
S45. if the reminder instruction does not contain location information, sending the execution instruction, which includes only a reminder execution instruction, when the time in the time information arrives;
S46. executing the reminder task according to the reminder execution instruction. Specifically, for example, if the reminder task does not contain a location marked in the map library, then when the reminder time of 7 o'clock arrives, the robot executes the reminder task in place (the robot should at that moment be in Mr. Wang's office) and issues the voice prompt "Mr. Wang, please prepare for the meeting".
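The branch in S41–S46 can be sketched as a small dispatcher: at the reminder time, a motion execution instruction is emitted only when the location matches an entry in the map library, and a reminder execution instruction is emitted in every case. The map contents, coordinates, and names below are assumptions for illustration, not the patent's data.

```python
# Illustrative sketch of S41-S46: decide between "move then remind" and
# "remind in place" based on a map-library lookup (S421-S423).
MAP_LIBRARY = {"office": (3.5, 1.2), "No.1 exhibition room": (10.0, 4.0)}

def execute_reminder(reminder, map_library=MAP_LIBRARY):
    """reminder has .location (or None) and .message; returns the list of
    execution instructions sent when the time arrives (S42 / S45)."""
    instructions = []
    if reminder.location is not None:                        # S41
        target = next((name for name in map_library
                       if name in reminder.location), None)  # S421: match in map library
        if target is not None:                               # S422 -> S43
            instructions.append(("move", map_library[target]))
    instructions.append(("remind", reminder.message))        # S44 / S46
    return instructions

class R:  # minimal stand-in for a reminder instruction
    location = "go to the office"
    message = "Mr. Wang, please prepare for the meeting"

for kind, payload in execute_reminder(R()):
    print(kind, payload)
# → move (3.5, 1.2)
# → remind Mr. Wang, please prepare for the meeting
```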
Optionally, for example, if the robot recognizes that the user has said the instruction "take me to the No.1 exhibition room", the keyword information is "take me" and "No.1 exhibition room". The first judgment unit determines that the instruction is a guiding task; after "No.1 exhibition room" is matched in the map library, a motion instruction is sent to control the robot to move to "No.1 exhibition room" along a route stored in the map library.
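The guiding task above follows the same map-library lookup, but returns a stored route instead of a single position. A sketch under assumed data shapes (the waypoint format and function name are not from the patent):

```python
# Illustrative sketch of the guiding ("take me to ...") task: match the
# spoken destination against the map library and return its stored route.
MAP_ROUTES = {
    "No.1 exhibition room": [(0, 0), (5, 0), (10, 4)],  # assumed waypoints
}

def handle_guide_task(keywords, routes=MAP_ROUTES):
    """Return the waypoint route for the destination keyword, if stored."""
    for word in keywords:
        if word in routes:       # destination matched in the map library
            return routes[word]  # the motion instruction follows this route
    return None                  # destination unknown: no movement issued

route = handle_guide_task(["take me", "No.1 exhibition room"])
print(route)  # → [(0, 0), (5, 0), (10, 4)]
```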
It can be seen from the above description that the present invention achieves the following technical effects: by combining speech recognition technology with program-controlled robot technology, the robot can recognize multiple types of voice instructions and can execute various types of tasks at specific times and in specific places according to different voice instructions, realizing diversified interaction between the intelligent robot and the user and facilitating wide application in actual scenes.
It should be noted that the steps shown in the flowcharts of the drawings can be executed in a computer system such as a set of computer-executable instructions, and although a logical order is shown in the flowcharts, in some cases the steps shown or described can be executed in an order different from that given here.
According to an embodiment of the present invention, a voice instruction execution system for implementing the above voice instruction execution method is also provided. As shown in Fig. 4, the system includes a voice receiving unit, a task processing unit, and a task execution unit.
The voice receiving unit receives the first voice instruction and sends it to the task processing unit; the first voice instruction contains a task. Specifically, for example, the voice receiving unit receives the instruction "remind Mr. Wang to go to the office for a meeting at 7 o'clock" spoken by the user.
The task processing unit includes an extraction unit, a first judgment unit, and a first instruction unit. The extraction unit extracts keyword information from the first voice instruction; the first judgment unit determines the type of the task in the first voice instruction according to the keyword information and preset rules; the first instruction unit, when the first judgment unit determines that the task is a reminder task, generates a reminder instruction according to the time information in the keyword information and sends the reminder instruction to the task execution unit. Specifically, for example, the keyword information extracted by the extraction unit is "7 o'clock", "go to the office", "Mr. Wang", and "meeting".
Further, in the system described above, the first judgment unit is also used to judge whether the keyword information contains time information.
Further, in the system described above, the first judgment unit includes an acquisition unit, a comparison unit, a first classification unit, a traversal unit, and a second classification unit.
The acquisition unit obtains the word to be matched, which is a keyword selected from the keyword information; specifically, for example, the acquisition unit obtains the first keyword "7 o'clock".
The comparison unit compares the word to be matched with each word in the dictionaries and judges whether the word to be matched is identical to a word in a dictionary; specifically, for example, the time dictionary contains all words expressing times in the range from 1:00 to 24:00, such as "7 o'clock", "half past two in the afternoon", and "18:40".
The first classification unit, when the comparison unit judges that the word to be matched is identical to a matching word in a dictionary, judges the type of the dictionary containing the matching word and labels the word to be matched with the corresponding type, the dictionary types including a person class, a time class, a location class, and an action class; specifically, for example, "7 o'clock" is matched successfully and labeled "time class".
The traversal unit, when the comparison unit judges that the word to be matched has no identical word in the dictionaries, judges whether the keyword information contains a further keyword.
The second classification unit, when the traversal unit judges that no further keyword remains in the keyword information, judges whether the types of the labeled keywords include both the person class and the action class; if they do, it determines that the type of the task in the first voice instruction is the reminder task. Specifically, for example, while traversing "7 o'clock", "go to the office", "Mr. Wang", and "meeting": "go to the office" is selected, matched successfully, and labeled "location class"; "Mr. Wang" is selected, matched successfully, and labeled "person class"; "meeting" is selected, matched successfully, and labeled "action class". The keyword information therefore includes the person class ("Mr. Wang") and the action class ("meeting"), so the user's first voice instruction "remind Mr. Wang to go to the office for a meeting at 7 o'clock" is determined to be a reminder task.
The first instruction unit is also used to generate and send a prompt instruction when the first judgment unit determines that the keyword information contains no time information, the prompt instruction prompting the user to issue a second voice instruction containing time information. Specifically, for example, if the user does not state a specific time in the instruction, e.g. "remind Mr. Wang to have a meeting", the robot broadcasts the voice prompt "please state the specific time of the task".
The task executing units, for being instructed and being sent according to the prompting when to time in the temporal information
It executes instruction, the reminding task is executed according to described execute instruction;
Further, system as the aforementioned, the task executing units further include second judgment unit, the second instruction sheet
Member, moving cell and reminding unit;
Whether the second judgment unit is also used to judge in the prompting instruction include location information;
Second command unit is also used to determine that in the prompting instruction include location information in second judgment unit
When, sent when to time in the temporal information described in execute instruction, wherein described execute instruction instructs including Motor execution
It is executed instruction with prompting;
Further, system as the aforementioned, second command unit, including third judging unit and map office;
The third judging unit, for determining that in the prompting instruction include location information when the second judgment unit
When, the location information and each location information in the map office are compared, whether judgement wherein has and the position
The location information of information matches;Specifically, location information is contained for example, reminding in instruction " reminding Wang always to have a meeting in the room of going to office "
" office " and the location information in map office are then compared, are contained applied by robot in map office by " office "
The common site in place.
The second instruction unit is configured to, when the third judgment unit determines that an entry matching the location information exists, send the execution instruction, wherein the execution instruction includes a movement execution instruction and a reminder execution instruction.
The moving unit is configured to move to the position indicated by the location information according to the movement execution instruction.
The reminder unit is configured to execute the reminder task according to the reminder execution instruction. Specifically, if the reminder task contains the location information "office", which is marked in the map library, then when the reminder time of 7 o'clock arrives, the robot automatically moves to the office and executes the reminder task, for example by issuing the voice prompt "Mr. Wang, please prepare for the meeting". If the reminder task contains no location information marked in the map library, then when the reminder time of 7 o'clock arrives, the robot executes the reminder task in place (the robot should at that moment already be near Mr. Wang), issuing the voice prompt "Mr. Wang, please prepare for the meeting".
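The move-then-remind versus remind-in-place behavior described above can be sketched as follows; `MAP_LIBRARY`, `execute_reminder`, and the callback parameters are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch of the location-aware reminder execution described above.
# The map library holds the common sites of the premises where the robot works.
MAP_LIBRARY = {"office", "meeting room", "front desk"}

def execute_reminder(location, prompt, move, speak):
    """Move to `location` first if the map library knows it; otherwise remind in place."""
    if location is not None and location in MAP_LIBRARY:
        move(location)   # movement execution instruction
    speak(prompt)        # reminder execution instruction

actions = []
execute_reminder("office", "Mr. Wang, please prepare for the meeting",
                 move=lambda loc: actions.append(("move", loc)),
                 speak=lambda msg: actions.append(("speak", msg)))
```

With the "office" reminder the robot first records a move action, then the voice prompt; a reminder with no known location would record only the voice prompt.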
Obviously, those skilled in the art should understand that the modules or steps of the invention described above may be implemented with a general-purpose computing device: they may be concentrated on a single computing device or distributed over a network formed by multiple computing devices. Optionally, they may be implemented with program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device; alternatively, they may be fabricated as individual integrated-circuit modules, or multiple modules or steps among them may be fabricated as a single integrated-circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The foregoing is merely the preferred embodiments of the present application and is not intended to limit the present application; for those skilled in the art, various changes and variations of the present application are possible. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall fall within the scope of protection of the present application.
Claims (10)
1. A voice instruction execution method, characterized by comprising:
S1. receiving a first voice instruction and extracting keyword information from the first voice instruction, wherein the first voice instruction comprises a task;
S2. judging, according to the keyword information and preset rules, the type of the task in the first voice instruction;
S3. if the task is determined to be a reminder task, generating a reminder instruction according to time information in the keyword information;
S4. when the time in the time information arrives, sending an execution instruction according to the reminder instruction, and executing the reminder task according to the execution instruction.
2. The method according to claim 1, characterized in that, after the task is determined to be a reminder task in S3 and before the reminder instruction is generated according to the time information in the keyword information, the method comprises:
S31. judging whether the keyword information contains time information;
S32. if the keyword information is determined to contain time information, generating the reminder instruction according to the time information in the keyword information;
S33. if the keyword information is determined not to contain time information, sending a hint instruction and returning to S1, wherein the hint instruction is used to prompt the user to issue a second voice instruction containing time information.
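The branch in steps S31-S33 can be sketched as a single decision function; `build_reminder` and the crude "N o'clock" time pattern are illustrative assumptions, not the patent's extraction method:

```python
import re  # used only for the simplistic time-information check below

def build_reminder(keywords):
    """S31: look for time information among the keywords.
    S32: if present, build a reminder instruction from it.
    S33: otherwise build a hint instruction asking the user to repeat
    the command with a time (after which control returns to S1)."""
    time_info = next((k for k in keywords if re.fullmatch(r"\d{1,2} o'clock", k)), None)
    if time_info is not None:
        return {"type": "reminder", "time": time_info}
    return {"type": "hint", "prompt": "Please repeat the instruction with a time."}
```

For the keyword list `["Mr. Wang", "meeting", "7 o'clock"]` this yields a reminder instruction; without the time keyword it yields a hint instruction instead.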
3. The method according to claim 1 or 2, characterized in that S4 comprises:
S41. judging whether the reminder instruction contains location information;
S42. if the reminder instruction contains location information, sending the execution instruction when the time in the time information arrives, wherein the execution instruction comprises a movement execution instruction and a reminder execution instruction;
S43. moving, according to the movement execution instruction, to the position indicated by the location information;
S44. executing the reminder task according to the reminder execution instruction;
S45. if the reminder instruction does not contain location information, sending the execution instruction when the time in the time information arrives, wherein the execution instruction comprises a reminder execution instruction;
S46. executing the reminder task according to the reminder execution instruction.
4. The method according to claim 3, characterized in that S42 comprises:
S421. if the reminder instruction contains location information, comparing the location information with each location entry in a map library, and judging whether an entry matching the location information exists;
S422. if a matching entry is determined to exist, sending the execution instruction, wherein the execution instruction comprises a movement execution instruction and a reminder execution instruction, and proceeding to S43;
S423. if no matching entry is determined to exist, proceeding to S45.
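Steps S421-S423 amount to a lookup against the map library that decides what the execution instruction contains. A minimal sketch, in which the map-library table, its coordinates, and `make_execution_instruction` are all assumptions:

```python
# Hypothetical map library: site name -> coordinates. Names and values are
# illustrative; the patent only requires that marked sites be comparable.
MAP_LIBRARY = {"office": (3.0, 1.5), "meeting room": (6.2, 0.8)}

def make_execution_instruction(reminder):
    """S421/S422: if the reminder's location matches a map-library entry, the
    execution instruction carries both a movement part and a reminder part.
    S423 (falling through to S45): otherwise it carries only the reminder part."""
    location = reminder.get("location")
    if location is not None and location in MAP_LIBRARY:
        return {"move_to": MAP_LIBRARY[location], "remind": reminder["prompt"]}
    return {"remind": reminder["prompt"]}
```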
5. The method according to claim 1, characterized in that S2 comprises:
S21. choosing the first keyword from the keyword information as a word to be matched;
S22. obtaining the word to be matched, comparing the word to be matched with each word in a dictionary, and judging whether the word to be matched is identical to a word in the dictionary;
S23. if the word to be matched is identical to a matching word in the dictionary, judging the type of the dictionary in which the matching word is located, and labeling the word to be matched with the corresponding type, wherein the dictionary types comprise a person class, a time class, a location class, and a behavior class;
S24. if the word to be matched is not identical to any matching word in the dictionary, judging whether the keyword information contains a next keyword;
S25. if so, choosing the next keyword as the word to be matched and returning to S22;
S26. if not, judging whether the types of the type-labeled keywords in the keyword information include the person class and the behavior class;
S27. if the types of the type-labeled keywords include the person class and the behavior class, determining that the type of the task in the first voice instruction is the reminder task.
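The dictionary-based classification of steps S21-S27 can be sketched as follows; the dictionaries and their entries are illustrative assumptions, not the patent's actual vocabularies:

```python
# Hypothetical type dictionaries (person, time, location, behavior classes).
DICTIONARIES = {
    "person":   {"Mr. Wang", "Ms. Li"},
    "time":     {"7 o'clock", "tomorrow"},
    "location": {"office", "meeting room"},
    "behavior": {"meeting", "call"},
}

def classify_task(keywords):
    """Label each keyword with its dictionary type (S21-S25); the task is a
    reminder task when both a person-class and a behavior-class keyword are
    present among the labeled keywords (S26/S27)."""
    labels = set()
    for word in keywords:                              # iterate words to be matched
        for dict_type, words in DICTIONARIES.items():  # S22: compare against each dictionary
            if word in words:
                labels.add(dict_type)                  # S23: label with the dictionary type
    return "reminder" if {"person", "behavior"} <= labels else "other"
```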
6. A voice instruction execution system, characterized by comprising: a voice receiving unit, a task processing unit, and a task execution unit;
the voice receiving unit is configured to receive a first voice instruction and send the first voice instruction to the task processing unit, wherein the first voice instruction comprises a task;
the task processing unit comprises an extraction unit, a first judgment unit, and a first instruction unit; the extraction unit is configured to extract keyword information from the first voice instruction; the first judgment unit is configured to judge, according to the keyword information and preset rules, the type of the task in the first voice instruction; the first instruction unit is configured to, when the first judgment unit determines that the task is a reminder task, generate a reminder instruction according to time information in the keyword information and send the reminder instruction to the task execution unit;
the task execution unit is configured to, when the time in the time information arrives, send an execution instruction according to the reminder instruction and execute the reminder task according to the execution instruction.
7. The system according to claim 6, characterized in that the first judgment unit is further configured to judge whether the keyword information contains time information;
the first instruction unit is further configured to, when the first judgment unit determines that the keyword information does not contain time information, generate and send a hint instruction, wherein the hint instruction is used to prompt the user to issue a second voice instruction containing time information.
8. The system according to claim 6 or 7, characterized in that the task execution unit further comprises a second judgment unit, a second instruction unit, a moving unit, and a reminder unit;
the second judgment unit is configured to judge whether the reminder instruction contains location information;
the second instruction unit is configured to, when the second judgment unit determines that the reminder instruction contains location information, send the execution instruction when the time in the time information arrives, wherein the execution instruction comprises a movement execution instruction and a reminder execution instruction;
the moving unit is configured to move, according to the movement execution instruction, to the position indicated by the location information;
the reminder unit is configured to execute the reminder task according to the reminder execution instruction.
9. The system according to claim 8, characterized in that the second instruction unit comprises a third judgment unit and a map library;
the third judgment unit is configured to, when the second judgment unit determines that the reminder instruction contains location information, compare the location information with each location entry in the map library and judge whether an entry matching the location information exists;
the second instruction unit is configured to, when the third judgment unit determines that a matching entry exists, send the execution instruction, wherein the execution instruction comprises a movement execution instruction and a reminder execution instruction.
10. The system according to claim 6, characterized in that the first judgment unit comprises an acquiring unit, a comparison unit, a first classification unit, a traversal unit, and a second classification unit;
the acquiring unit is configured to obtain a word to be matched, wherein the word to be matched comprises a keyword chosen from the keyword information;
the comparison unit is configured to compare the word to be matched with each word in a dictionary and judge whether the word to be matched is identical to a word in the dictionary;
the first classification unit is configured to, when the comparison unit determines that the word to be matched is identical to a matching word in the dictionary, judge the type of the dictionary in which the matching word is located and label the word to be matched with the corresponding type, wherein the dictionary types comprise a person class, a time class, a location class, and a behavior class;
the traversal unit is configured to, when the comparison unit determines that the word to be matched is not identical to any matching word in the dictionary, judge whether the keyword information contains a next keyword;
the second classification unit is configured to, when the traversal unit determines that the keyword information contains no next keyword, judge whether the types of the type-labeled keywords in the keyword information include the person class and the behavior class, and, if the types of the type-labeled keywords include the person class and the behavior class, determine that the type of the task in the first voice instruction is the reminder task.
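The unit decomposition of claims 6-10 can be sketched as cooperating objects; the class names and the simplified keyword test below are assumptions for illustration, not the patent's dictionary procedure:

```python
# Illustrative wiring of a processing unit and an execution unit.
class TaskProcessingUnit:
    """Stands in for the extraction, judgment, and instruction units:
    extracts keywords and emits a reminder instruction if the task qualifies."""
    def process(self, voice_instruction):
        keywords = voice_instruction.split()   # extraction unit (simplified)
        if "remind" in keywords:               # first judgment unit (simplified)
            return {"type": "reminder", "keywords": keywords}
        return None

class TaskExecutionUnit:
    """Executes the reminder task when a reminder instruction arrives."""
    def __init__(self):
        self.log = []
    def execute(self, reminder):
        if reminder is not None:
            self.log.append("reminded: " + " ".join(reminder["keywords"]))

processor = TaskProcessingUnit()
executor = TaskExecutionUnit()
executor.execute(processor.process("remind Mr.Wang meeting"))
```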
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810892711.2A CN109243435B (en) | 2018-08-07 | 2018-08-07 | Voice instruction execution method and system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109243435A true CN109243435A (en) | 2019-01-18 |
CN109243435B CN109243435B (en) | 2022-01-11 |
Family
ID=65070450
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810892711.2A Active CN109243435B (en) | 2018-08-07 | 2018-08-07 | Voice instruction execution method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109243435B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110163398A (en) * | 2019-05-07 | 2019-08-23 | 厦门钛尚人工智能科技有限公司 | Venue booking method and system based on speech recognition |
CN110534108A (en) * | 2019-09-25 | 2019-12-03 | 北京猎户星空科技有限公司 | A kind of voice interactive method and device |
CN111496820A (en) * | 2020-06-18 | 2020-08-07 | 北京云迹科技有限公司 | Method, device, storage medium and equipment for voice scheduling robot |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103116840A (en) * | 2013-03-07 | 2013-05-22 | 陈璟东 | Humanoid robot based intelligent reminding method and device |
CN104038630A (en) * | 2014-05-28 | 2014-09-10 | 小米科技有限责任公司 | Speech processing method and device |
CN106373572A (en) * | 2016-09-01 | 2017-02-01 | 北京百度网讯科技有限公司 | Information prompting method based on artificial intelligence and device |
US20180233132A1 (en) * | 2017-02-14 | 2018-08-16 | Microsoft Technology Licensing, Llc | Natural language interaction for smart assistant |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106407178B (en) | Conversation summary generation method and device, server device and terminal device | |
CN106716934B (en) | Chat interaction method and device, and electronic equipment thereof | |
WO2016082692A1 (en) | Information prompting method and device, and instant messaging system | |
CN104932456B (en) | Intelligent scene implementation method and device, intelligent terminal and control equipment | |
CN105425970A (en) | Human-machine interaction method and device, and robot | |
CN106297782A (en) | Human-machine interaction method and system | |
CN109243435A (en) | Voice instruction execution method and system | |
CN111783439A (en) | Man-machine interaction dialogue processing method and device, computer equipment and storage medium | |
CN106164869A (en) | Hybrid client/server architecture for parallel processing | |
CN112001175B (en) | Process automation method and device, electronic equipment and storage medium | |
CN109753560A (en) | Information processing method and device for an intelligent question answering system | |
CN106202288B (en) | Optimization method and system for the knowledge base of a human-machine interaction system | |
AU2017276360B2 (en) | A system for the automated semantic analysis processing of query strings | |
CN112417128B (en) | Method and device for recommending conversational scripts, computer equipment and storage medium | |
CN107239450B (en) | Method for processing natural language based on interactive context | |
CN108132768A (en) | Voice input processing method, terminal and network server | |
CN109448727A (en) | Voice interaction method and device | |
JP7436077B2 (en) | Skill voice wake-up method and device | |
CN112115244A (en) | Dialogue interaction method and device, storage medium and electronic equipment | |
CN114694644A (en) | Voice intention recognition method and device, and electronic equipment | |
CN105847558A (en) | Calendar event mode switching method and device based on mobile terminal | |
CN105550218A (en) | Note management method and terminal | |
CN105677823B (en) | Method and device for organizing questions and answers | |
CN109299241A (en) | Knowledge base generation method and device for a chat robot | |
CN109062891A (en) | Media processing method, device, terminal and medium | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP01 | Change in the name or title of a patent holder |
Address after: Room 702, 7th floor, NO.67, Beisihuan West Road, Haidian District, Beijing 100089 Patentee after: Beijing Yunji Technology Co.,Ltd. Address before: Room 702, 7th floor, NO.67, Beisihuan West Road, Haidian District, Beijing 100089 Patentee before: BEIJING YUNJI TECHNOLOGY Co.,Ltd. |