CN110480656A - Companion robot, and companion robot control method and device - Google Patents
Companion robot, and companion robot control method and device
- Publication number
- CN110480656A CN110480656A CN201910848250.3A CN201910848250A CN110480656A CN 110480656 A CN110480656 A CN 110480656A CN 201910848250 A CN201910848250 A CN 201910848250A CN 110480656 A CN110480656 A CN 110480656A
- Authority
- CN
- China
- Prior art keywords
- instruction
- robot
- user
- module
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
- B25J11/0085—Cleaning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/161—Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- Artificial Intelligence (AREA)
- Evolutionary Computation (AREA)
- Fuzzy Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- Manipulator (AREA)
Abstract
The present application provides a companion robot, and a companion robot control method and device. The robot includes a biometric acquisition module and a processor. The biometric acquisition module is configured to acquire the user's biometric information at preset intervals and transmit the biometric information to the processor. The processor is configured to receive the biometric information transmitted by the biometric acquisition module, predict the user's emotion based on the biometric information, input the predicted emotion into a pre-trained first instruction prediction model to determine a control instruction matching the biometric information, and control the companion robot to perform the operation corresponding to the control instruction. This companion robot improves the accuracy of predicting instructions that reflect the user's needs.
Description
Technical field
The present application relates to the field of automatic control technology, and in particular to a companion robot, and a companion robot control method and device.
Background technique
With the accelerating aging of China's population and the growing number of only children, many children cannot constantly stay at the side of their elderly parents, so the proportion of "empty-nest" elderly among the elderly population keeps rising. The mental health of the elderly has become an important social issue. According to statistics, more than 70% of elderly people show symptoms of loneliness, especially those who are frail, disabled, living alone, of advanced age, or widowed.
To address the problems of emotional companionship and daily care for the elderly, it has been proposed in recent years to use robots instead of humans to accompany the elderly. When caring for an elderly user, existing companion robots generally can only execute operations according to the instructions the user inputs. However, the service an elderly user needs may differ depending on his or her emotional state, so providing services purely according to instructions, without considering changes in the user's emotion, cannot satisfy the elderly user's need for emotional companionship.
Summary of the invention
In view of this, the purpose of the present application is to provide a companion robot, and a companion robot control method and device, so as to provide services that reflect the user's real emotional needs and to improve the accuracy of predicting instructions that match those needs.
According to a first aspect, an embodiment of the present application provides a companion robot that includes a biometric acquisition module and a processor.

The biometric acquisition module is configured to acquire the user's biometric information at preset intervals and transmit the biometric information to the processor.

The processor is configured to receive the biometric information transmitted by the biometric acquisition module, predict the user's emotion based on the biometric information, input the predicted emotion into a pre-trained first instruction prediction model to determine a control instruction matching the biometric information, and control the companion robot to perform the operation corresponding to the control instruction.
In a possible design, the biometric acquisition module comprises at least one of a sound acquisition module and an image acquisition module.

When the biometric acquisition module includes a sound acquisition module, the biometric information includes sound information; when the biometric acquisition module includes an image acquisition module, the biometric information includes facial image information.
In a possible design, when predicting the user's emotion based on the biometric information, the processor is specifically configured to:

extract speech features from the sound information, the speech features including short-time energy, short-time zero-crossing rate, fundamental frequency, formant features, speech rate, and mel-frequency cepstral coefficients;

input the speech features into a speech recognition sub-model to obtain a first score indicating that the speech features belong to any one of the preset emotions;

input the sound information into a semantic recognition sub-model to extract the semantic keywords from the sound information, and, based on the semantic keywords, determine a second score indicating that the sound information belongs to any one of the preset emotions;

input the facial image information into a face recognition sub-model to determine a third score indicating that the facial image information belongs to any one of the preset emotions; and

compute a weighted sum of the first score, the second score, and the third score according to preset weights, and determine the user's emotion based on the summed score.
In a possible design, the companion robot further includes a receiving module and a storage module.

The receiving module is configured to receive the user's instruction and transmit the instruction to the processor.

The processor is further configured to: upon receiving the instruction transmitted by the receiving module, control the biometric acquisition module to acquire the user's biometric information; determine the user's behavior information based on the biometric information; and transmit the instruction, the time at which the instruction was received, and the behavior information to the storage module. The user's behavior information indicates the user's activity state.

The storage module is configured to store the instruction, the time at which the instruction was received, and the behavior information.
In a possible design, the processor is further configured to train the first instruction prediction model as follows:

obtain at least one historical emotion prediction result of the user, and the instruction associated with each historical emotion prediction result;

input the historical emotion prediction results into the instruction prediction model to be trained, to obtain the predicted instruction corresponding to each historical emotion prediction result;

perform a round of training on the instruction prediction model to be trained based on the predicted instructions and the instructions associated with the historical emotion prediction results; and

obtain the first instruction prediction model through multiple rounds of such training.
In a possible design, after receiving the biometric information, the processor is further configured to: determine the user's behavior information based on the biometric information; input the user's behavior information and the time at which the biometric information was received into a pre-trained second instruction prediction model to determine a control instruction matching the biometric information; and control the companion robot to perform the operation corresponding to the control instruction.
In a possible design, the processor trains the second instruction prediction model as follows:

obtain the instructions stored in the storage module, the times at which the instructions were received, and the behavior information corresponding to each instruction;

input the time at which an instruction was received and the corresponding behavior information into the prediction model to be trained, and obtain a predicted instruction as output;

perform a round of training on the prediction model based on the instruction obtained from the storage module and the predicted instruction; and

obtain the second instruction prediction model through multiple rounds of such training.
In a possible design, the companion robot further includes a speech synthesis module configured to extract the audio features of a template voice and, upon receiving a voice playback instruction from the processor, produce speech based on those audio features.
In a possible design, the companion robot further includes an alarm module configured to compare the behavior information against abnormal behavior information stored in the storage module and, when the comparison succeeds, send alarm information to a pre-bound device.
In a possible design, the companion robot further includes a distance detection module and a movement module.

The distance detection module is configured to detect the distance between the companion robot and the user and send the distance to the movement module.

The movement module is configured to control the companion robot to move toward the user's position when it detects that the distance is greater than a preset distance.
In a possible design, the companion robot further includes a cleaning module configured to, upon receiving a cleaning instruction sent by the processor, control a sweeping robot connected to the companion robot to perform cleaning.
In a possible design, infrared positioning devices are provided on the bodies of both the companion robot and the sweeping robot. After the sweeping robot finishes cleaning, it re-establishes its connection with the companion robot based on the infrared positioning devices.
According to a second aspect, an embodiment of the present application further provides a companion robot control method, comprising:

receiving the user's biometric information;

predicting the user's emotion based on the biometric information;

inputting the predicted emotion into a pre-trained first instruction prediction model to determine a control instruction matching the biometric information; and

controlling the companion robot to perform the operation corresponding to the control instruction.
According to a third aspect, an embodiment of the present application further provides a companion robot control device, comprising:

a receiving module configured to receive the user's biometric information;

a prediction module configured to predict the user's emotion based on the biometric information;

a determining module configured to input the predicted emotion into a pre-trained first instruction prediction model and determine a control instruction matching the biometric information; and

a control module configured to control the companion robot to perform the operation corresponding to the control instruction.
According to a fourth aspect, an embodiment of the present application further provides an electronic device comprising a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor. When the electronic device runs, the processor and the memory communicate over the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the second aspect.
According to a fifth aspect, an embodiment of the present application further provides a computer-readable storage medium storing a computer program that, when run by a processor, performs the steps of the second aspect.
With the companion robot, and the companion robot control method and device provided by the embodiments of the present application, the biometric acquisition module deployed in the companion robot acquires the user's biometric information; the processor deployed in the robot then predicts the user's emotion from the biometric information and inputs the predicted emotion into the pre-trained first instruction prediction model to determine a control instruction matching the biometric information. Because this way of determining the control instruction takes the user's current emotion into account, the determined instruction is more accurate, and when the robot is controlled to perform the operation corresponding to the control instruction, that operation better satisfies the user's needs.
To make the above objects, features, and advantages of the present application clearer and easier to understand, preferred embodiments are described in detail below with reference to the accompanying drawings.
Detailed description of the invention
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings show only some embodiments of the present application and therefore should not be regarded as limiting its scope. For those of ordinary skill in the art, other relevant drawings can be derived from these drawings without creative effort.
Fig. 1 shows an architecture diagram of a companion robot provided by an embodiment of the present application;

Fig. 2 shows a schematic flowchart of the processing performed by the processor after receiving the biometric information transmitted by the biometric acquisition module, provided by an embodiment of the present application;

Fig. 3 shows a schematic flowchart of a method for predicting the user's emotion provided by an embodiment of the present application;

Fig. 4 shows a schematic flowchart of the training method of the first instruction prediction model provided by an embodiment of the present application;

Fig. 5 shows an architecture diagram of another possible companion robot provided by an embodiment of the present application;

Fig. 6 shows a schematic flowchart of the training method of the second instruction prediction model provided by an embodiment of the present application;

Fig. 7 shows a schematic flowchart of a companion robot control method provided by an embodiment of the present application;

Fig. 8 shows an architecture diagram of a companion robot control device provided by an embodiment of the present application;

Fig. 9 shows a structural diagram of an electronic device 900 provided by an embodiment of the present application.
Specific embodiment
To make the purposes, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. The components of the embodiments of the present application, as generally described and illustrated in the drawings here, can be arranged and designed in a variety of different configurations. Therefore, the following detailed description of the embodiments provided in the drawings is not intended to limit the claimed scope of the present application, but merely represents selected embodiments. Based on the embodiments of the present application, all other embodiments obtained by those skilled in the art without creative effort shall fall within the protection scope of the present application.
To facilitate understanding of the present embodiment, the companion robot disclosed in the embodiments of the present application is first described in detail.
Referring to Fig. 1, which is an architecture diagram of a companion robot provided by an embodiment of the present application, the robot includes a biometric acquisition module and a processor.

Specifically, the biometric acquisition module acquires the user's biometric information at preset intervals and then transmits the acquired biometric information to the processor.

The biometric acquisition module may include at least one of a sound acquisition module and an image acquisition module. When the biometric acquisition module includes a sound acquisition module, the biometric information includes sound information; when it includes an image acquisition module, the biometric information includes facial image information.

After receiving the biometric information transmitted by the biometric acquisition module, the processor may execute the process shown in Fig. 2, which includes the following steps:
Step 201: predict the user's emotion based on the biometric information.
In a possible implementation, when predicting the user's emotion based on the biometric information, the processor may use the method shown in Fig. 3, comprising:
Step 301: extract the speech features from the sound information.

The speech features in the sound information include short-time energy, short-time zero-crossing rate, fundamental frequency, formant features, speech rate, and mel-frequency cepstral coefficients; the specific methods for extracting these features are not elaborated here.
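Two of the frame-level features named in step 301 are simple enough to sketch directly. The following is a minimal illustration of short-time energy and short-time zero-crossing rate, assuming 16 kHz audio with 25 ms frames and 10 ms hops; the frame sizes and test signals are illustrative, not values from the patent.

```python
import numpy as np

def frame_signal(x, frame_len=400, hop=160):
    """Split a 1-D signal into overlapping frames (25 ms / 10 ms at 16 kHz)."""
    n = 1 + max(0, (len(x) - frame_len) // hop)
    return np.stack([x[i * hop:i * hop + frame_len] for i in range(n)])

def short_time_energy(x, frame_len=400, hop=160):
    """Sum of squared samples in each frame."""
    return (frame_signal(x, frame_len, hop) ** 2).sum(axis=1)

def short_time_zcr(x, frame_len=400, hop=160):
    """Fraction of adjacent sample pairs in each frame whose signs differ."""
    signs = np.sign(frame_signal(x, frame_len, hop))
    return (np.abs(np.diff(signs, axis=1)) > 0).mean(axis=1)

t = np.linspace(0, 1, 16000, endpoint=False)
low = np.sin(2 * np.pi * 100 * t)    # low-pitched, voiced-like tone: few crossings
high = np.sin(2 * np.pi * 3000 * t)  # high-frequency tone: many more crossings
print(short_time_zcr(low).mean() < short_time_zcr(high).mean())  # → True
```

The remaining features (fundamental frequency, formants, MFCCs) require spectral analysis and are typically taken from an audio library rather than written by hand.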
Step 302: input the speech features into the speech recognition sub-model to obtain a first score indicating that the speech features belong to any one of the preset emotions.

In a possible implementation, the preset emotions may include happy, sad, irritated, angry, and so on. After the speech features are input into the speech recognition sub-model, a first score can be obtained for each preset emotion.

For example, after a segment of speech features is input into the speech recognition sub-model, the first score for happy may be 80, the first score for sad 30, the first score for irritated 20, and the first score for angry 10.
Step 303: input the sound information into the semantic recognition sub-model to extract the semantic keywords from the sound information.

In a possible implementation of the present application, after the sound information is input into the semantic recognition sub-model, the sub-model may first convert the sound information into the corresponding text information, then use its N-gram model to segment the text information into a set of words, compare that set of words against the keywords pre-stored in a database, and determine the words that compare successfully as the semantic keywords.
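The keyword-matching part of step 303 can be sketched as below. This is a deliberately naive illustration: segmentation here is whitespace splitting and the keyword table is invented, whereas the patent describes N-gram segmentation over text converted from speech.

```python
# Hypothetical keyword -> emotion table standing in for the pre-stored database.
KEYWORD_EMOTIONS = {
    "tired": "sad",
    "lonely": "sad",
    "great": "happy",
    "annoying": "irritated",
}

def extract_semantic_keywords(text):
    """Segment the text into words and keep those found in the keyword table."""
    words = text.lower().replace(",", " ").replace(".", " ").split()
    return [w for w in words if w in KEYWORD_EMOTIONS]

print(extract_semantic_keywords("I feel lonely and tired today."))
# → ['lonely', 'tired']
```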
Step 304: based on the semantic keywords, determine a second score indicating that the sound information belongs to any one of the preset emotions.

Step 305: input the facial image information into the face recognition sub-model to determine a third score indicating that the facial image information belongs to any one of the preset emotions.

Specifically, before the facial image information is input into the face recognition sub-model, image features of the facial image information may be extracted, such as geometric features, deformation features, and motion features of each part of the face; the image features are then input into the face recognition sub-model to determine the third score for each preset emotion.

It should be noted that steps 301-302, steps 303-304, and step 305 may be executed in any order.

Step 306: compute a weighted sum of the first score, the second score, and the third score according to preset weights, and determine the user's emotion based on the summed score.

When determining the user's emotion based on the summed score, the emotion with the highest score may be determined as the user's emotion.
In a possible application scenario, the companion robot may collect only the user's sound information or only the user's facial image information. In that case, when predicting the user's emotion based on the biometric information, the scores corresponding to the information that was not collected may be set to 0.

For example, if only the user's sound information has been collected, the third score is set to 0, and when determining the user's emotion, the weighted sum is computed over the first score and the second score according to the preset weights.
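The fusion rule of step 306, including the missing-modality handling just described, can be sketched as follows. The weights and the example scores are illustrative; the patent leaves the preset weights unspecified.

```python
EMOTIONS = ["happy", "sad", "irritated", "angry"]

def fuse_scores(first, second, third, weights=(0.4, 0.3, 0.3)):
    """Weighted sum of per-emotion scores; a missing modality may be None (scored 0)."""
    total = {}
    for emo in EMOTIONS:
        total[emo] = sum(
            w * (s.get(emo, 0) if s is not None else 0)
            for w, s in zip(weights, (first, second, third))
        )
    # The emotion with the highest summed score is taken as the user's emotion.
    return max(total, key=total.get), total

speech = {"happy": 80, "sad": 30, "irritated": 20, "angry": 10}  # first scores
semantic = {"happy": 60, "sad": 40}                              # second scores
mood, totals = fuse_scores(speech, semantic, third=None)  # no facial image collected
print(mood)  # → happy
```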
In one example of the application, face recognition submodel, semantics recognition submodel and speech recognition submodel be can be
One of support vector machines, convolutional neural networks, k nearest neighbor disaggregated model.
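Of the three candidate sub-model types, k-nearest-neighbor is the simplest to illustrate. The toy 2-D "feature" points and labels below are fabricated purely for demonstration; a real sub-model would classify feature vectors like those extracted in step 301.

```python
import numpy as np

def knn_predict(train_x, train_y, query, k=3):
    """Return the majority label among the k training points nearest to the query."""
    dists = np.linalg.norm(train_x - query, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

train_x = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
                    [1.0, 1.0], [0.9, 1.1], [1.1, 0.9]])
train_y = np.array(["calm", "calm", "calm", "angry", "angry", "angry"])
print(knn_predict(train_x, train_y, np.array([0.95, 1.0])))  # → angry
```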
Step 202: input the predicted emotion into the pre-trained first instruction prediction model to determine a control instruction matching the biometric information.

Step 203: control the robot to perform the operation corresponding to the control instruction.
In a possible implementation, when training the first instruction prediction model, the processor may use the method shown in Fig. 4, which includes the following steps:
Step 401: obtain at least one historical emotion prediction result of the user, and the instruction associated with each historical emotion prediction result.

In a possible design, the companion robot further includes a storage module that stores the historical emotion prediction results and the instruction associated with each result. When training the instruction prediction model, the processor may obtain the historical emotion prediction results and their associated instructions from the storage module.

The stored results and instructions may be produced as follows: the biometric acquisition module acquires the user's biometric information at preset intervals and transmits it to the processor, and the processor predicts the user's emotion based on the biometric information. If the processor receives an instruction input by the user before the next biometric information arrives from the biometric acquisition module, it determines the instruction input by the user as the instruction associated with the predicted emotion, and then stores the predicted emotion and the user's instruction in the storage module.
Step 402: input the historical emotion prediction results into the instruction prediction model to be trained, to obtain the predicted instruction corresponding to each historical emotion prediction result.

Step 403: perform a round of training on the instruction prediction model to be trained based on the predicted instructions and the instructions associated with the historical emotion prediction results.

In a specific implementation, the cross-entropy loss of the current training round may be determined based on the predicted instructions and the instructions associated with the historical emotion prediction results, and the model parameters of the instruction prediction model may then be adjusted based on the cross-entropy loss.

Step 404: obtain the first instruction prediction model through multiple rounds of training of the instruction prediction model.
In the above process, the sample data used to train the first instruction prediction model come from the instructions input by the user and the user emotions predicted during each instruction input. Therefore, the first instruction prediction model trained in this way fits the behavioral characteristics of the current user and can provide personalized service for the user.
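Steps 401-404 can be sketched under strong assumptions: emotions encoded one-hot, instructions as class labels, and the "instruction prediction model" reduced to a single softmax layer trained with cross-entropy by gradient descent. The emotion/instruction history pairs are fabricated for illustration; the patent does not fix a model architecture.

```python
import numpy as np

EMOTIONS = ["happy", "sad", "irritated", "angry"]
INSTRUCTIONS = ["play_music", "call_family", "tell_joke"]

# Step 401: historical (predicted emotion, associated instruction) pairs.
history = [("happy", "play_music"), ("sad", "call_family"),
           ("irritated", "tell_joke"), ("happy", "play_music"),
           ("sad", "call_family"), ("angry", "tell_joke")] * 20

X = np.eye(len(EMOTIONS))[[EMOTIONS.index(e) for e, _ in history]]
y = np.array([INSTRUCTIONS.index(i) for _, i in history])

W = np.zeros((len(EMOTIONS), len(INSTRUCTIONS)))
for _ in range(200):                      # Step 404: multiple rounds of training
    logits = X @ W                        # Step 402: scores for each instruction
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = p.copy()
    grad[np.arange(len(y)), y] -= 1       # Step 403: cross-entropy gradient
    W -= 0.1 * (X.T @ grad) / len(y)

def predict_instruction(emotion):
    """Predict the instruction matching a newly predicted emotion."""
    x = np.eye(len(EMOTIONS))[EMOTIONS.index(emotion)]
    return INSTRUCTIONS[int(np.argmax(x @ W))]

print(predict_instruction("sad"))  # → call_family
```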
In a possible application scenario, if the users include several people, the processor can identify multiple users according to the biometric information and store the emotion prediction results of different users, together with the instruction associated with each emotion prediction result, in different storage modules. When training the first instruction prediction model, the processor can use the emotion prediction results and associated instructions stored in each storage module to train a different first instruction prediction model. When the processor receives biometric information, it can identify the user based on the biometric information, determine the first instruction prediction model corresponding to that user, and predict the control instruction with that model.

In the solution provided by this embodiment, the biometric acquisition module acquires a user's biometric information at preset intervals. In another possible embodiment, the biometric acquisition module may instead acquire the user's biometric information after the companion robot receives an instruction input by the user.
Referring to Fig. 5, which is an architecture diagram of another possible companion robot provided by an embodiment of the present application, the robot may further include a receiving module and a storage module.

The receiving module is configured to receive the user's instruction and transmit the received instruction to the processor.

The processor may further be configured to: upon receiving the instruction transmitted by the receiving module, control the biometric acquisition module to acquire the user's biometric information; determine the user's behavior information based on the biometric information; and transmit the received instruction, the time at which the instruction was received, and the determined behavior information to the storage module.

The storage module stores the received instruction transmitted by the processor, the time at which the instruction was received, and the determined behavior information.

The user's behavior information indicates the user's activity state; for example, the behavior information may include sitting quietly, sleeping, getting up, walking, falling, and so on.
In a possible implementation, after receiving the biometric information, the processor may also determine the user's behavior information based on the biometric information, then input the user's behavior information and the time at which the biometric information was received into the pre-trained second instruction prediction model to determine a control instruction matching the biometric information, and control the companion robot to perform the operation corresponding to the control instruction.

In a possible application scenario, if the users include several people, the processor can identify multiple users according to the biometric information and send the behavior information of different users, and the times at which their biometric information was received, to different storage modules. Different second instruction prediction models can likewise be trained from the behavior information, reception times, and associated instructions stored in the different storage modules; when biometric information is received, the corresponding second instruction prediction model is determined according to the biometric information and is then used to predict the control instruction.
In a possible design, the companion robot may further include an alarm module that compares the behavior information with the abnormal behavior information stored in the storage module and, when the comparison succeeds, sends alarm information to a pre-bound device.
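The alarm module's comparison logic amounts to a membership check followed by a notification. A minimal sketch, in which the abnormal-behavior set and the device identifier are placeholders (a real robot would push the alert to the pre-bound device, e.g. a family member's phone):

```python
ABNORMAL_BEHAVIORS = {"falling"}   # abnormal behavior information in the storage module

def check_behavior(behavior, bound_device):
    """Return an alert message if the behavior compares as abnormal, else None."""
    if behavior in ABNORMAL_BEHAVIORS:   # comparison succeeds
        return f"ALERT to {bound_device}: user behavior '{behavior}' is abnormal"
    return None

print(check_behavior("walking", "phone-01"))  # → None
print(check_behavior("falling", "phone-01"))
```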
Wherein, the training method of the second branch prediction model is referred to method shown in fig. 6, including following step
It is rapid:
Step 601, the instruction for obtaining memory module storage receive the time instructed and specify corresponding behavioural information.
Step 602: input the times at which the instructions were received and the corresponding behavior information into the prediction model to be trained, and obtain predicted instructions as output.
Step 603: perform a round of training on the prediction model based on the instructions obtained from the storage module and the predicted instructions. Specifically, when performing a round of training, the cross-entropy for the current round may be determined from the instructions obtained from the storage module and the predicted instructions, and the model parameters of the prediction model may then be adjusted based on the cross-entropy.
Step 604: obtain the second instruction prediction model through multiple rounds of training of the prediction model.
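Steps 601–604 can be sketched as a small cross-entropy training loop. The sketch below uses a toy softmax classifier over hand-made (reception-time, behavior) features; the feature encoding, instruction IDs, and hyperparameters are all illustrative assumptions, not details from the patent.

```python
import math

def softmax(logits):
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [e / s for e in exps]

def cross_entropy(probs, true_idx):
    # Step 603 loss; its gradient w.r.t. the logits is probs - one_hot(true_idx),
    # which is what the update below applies.
    return -math.log(probs[true_idx] + 1e-12)

def train(samples, n_features, n_instructions, epochs=200, lr=0.5):
    # w[k][j]: weight of feature j for instruction k
    w = [[0.0] * n_features for _ in range(n_instructions)]
    for _ in range(epochs):                      # step 604: multiple rounds
        for x, y in samples:                     # step 601: stored records
            logits = [sum(wk[j] * x[j] for j in range(n_features)) for wk in w]
            probs = softmax(logits)              # step 602: predicted instruction
            for k in range(n_instructions):      # step 603: adjust parameters
                grad = probs[k] - (1.0 if k == y else 0.0)
                for j in range(n_features):
                    w[k][j] -= lr * grad * x[j]
    return w

def predict(w, x):
    scores = [sum(wk[j] * x[j] for j in range(len(x))) for wk in w]
    return scores.index(max(scores))

# Toy records: features = (received in morning, received in evening, user resting)
samples = [((1, 0, 0), 0),   # morning, active  -> instruction 0
           ((0, 1, 1), 1)]   # evening, resting -> instruction 1
w = train(samples, n_features=3, n_instructions=2)
print(predict(w, (1, 0, 0)), predict(w, (0, 1, 1)))  # 0 1
```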
In a possible design, the accompanying robot may further include a speech synthesis module. The speech synthesis module may extract the audio features of a template voice and, upon receiving a voice playback instruction from the processor, produce speech based on the extracted audio features of the template voice.
The template voice may be input manually, or may be imported from an external device through an external interface of the accompanying robot.
In a possible design, the accompanying robot may further include a distance detection module and a movement module. The distance detection module may detect the distance between the accompanying robot and the user and send the distance to the movement module; when the distance received from the detection module exceeds a preset distance, the movement module controls the accompanying robot to move toward the position of the user.
When detecting the distance between the accompanying robot and the user, the distance detection module may first capture an image containing the user with a camera mounted on the accompanying robot, determine the user's position in the image by image segmentation, and then measure the distance between the accompanying robot and the user by infrared ranging, using an infrared device mounted on the accompanying robot.
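The detection/movement interaction above reduces to a threshold check: follow the user only when the measured distance exceeds the preset distance. A minimal sketch, where the 1.5 m threshold, the bearing input, and the command dictionary are illustrative assumptions:

```python
# Hypothetical sketch of the distance-detection / movement-module interaction.
PRESET_DISTANCE = 1.5  # metres; illustrative threshold, not from the patent

def movement_command(measured_distance, user_bearing_deg):
    """Return a move command when the user is farther than the preset distance."""
    if measured_distance > PRESET_DISTANCE:
        return {"action": "move",
                "bearing": user_bearing_deg,                 # toward the user
                "distance": measured_distance - PRESET_DISTANCE}
    return {"action": "stay"}

print(movement_command(3.0, 40))  # too far: close the 1.5 m excess
print(movement_command(1.0, 40))  # within range: no movement
```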
In a possible design, the accompanying robot may further include a cleaning module. Upon receiving a cleaning instruction from the processor, the cleaning module controls a sweeping robot connected to the accompanying robot to perform cleaning.
The sweeping robot may be connected to the accompanying robot in an embedded manner. Infrared positioning devices may be provided on the bodies of both the accompanying robot and the sweeping robot; after the sweeping robot finishes cleaning, it re-establishes the connection with the accompanying robot based on the infrared positioning devices.
In a possible design, the accompanying robot may also be connected to an external device and detect the user's heart rate, blood pressure, and the like through the external device; when an abnormality in heart rate, blood pressure, or the like is detected, alarm information is sent to the pre-bound device through the alarm module.
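The vital-sign check above is a range comparison followed by a notification. A minimal sketch, assuming toy normal ranges (the thresholds below are illustrative, not clinically or patent-specified values):

```python
# Hypothetical sketch of the external-device vital-sign alarm. Ranges and
# reading names are assumptions for illustration only.
NORMAL = {"heart_rate": (50, 110), "systolic_bp": (90, 140)}

def check_vitals(readings, notify):
    """Collect out-of-range readings; alert the pre-bound device if any exist."""
    alerts = [name for name, value in readings.items()
              if not (NORMAL[name][0] <= value <= NORMAL[name][1])]
    if alerts:
        notify({"alert": alerts, "readings": readings})
    return alerts

sent = []
check_vitals({"heart_rate": 130, "systolic_bp": 120}, sent.append)
print(sent[0]["alert"])  # ['heart_rate']
```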
Based on the same concept, the present application further provides an accompanying robot control method. Referring to Fig. 7, which is a flow diagram of an accompanying robot control method provided by an embodiment of the present application, the method includes the following steps:
Step 701: receive the biometric information of a user.
Step 702: predict the mood of the user based on the biometric information.
Step 703: input the predicted mood into a pre-trained first instruction prediction model to determine the control instruction matching the biometric information.
Step 704: control the accompanying robot to perform the operation corresponding to the control instruction.
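Steps 701–704 can be sketched as a short pipeline. The mood predictor and the mood-to-instruction table below stand in for the trained first instruction prediction model; both are illustrative assumptions:

```python
# Hypothetical sketch of the Fig. 7 control flow (steps 701-704).
MOOD_TO_INSTRUCTION = {"sad": "play_soothing_music", "happy": "tell_joke"}

def control_robot(biometric_info, predict_mood, execute):
    mood = predict_mood(biometric_info)                  # step 702
    instruction = MOOD_TO_INSTRUCTION.get(mood, "idle")  # step 703 (stand-in model)
    execute(instruction)                                 # step 704
    return instruction

done = []
result = control_robot({"voice_pitch": "low"},           # step 701: received info
                       lambda info: "sad",               # toy mood predictor
                       done.append)
print(result)  # play_soothing_music
```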
The present application further provides an accompanying robot control device. Referring to Fig. 8, which is an architecture diagram of an accompanying robot control device provided by an embodiment of the present application, the device includes a receiving module 801, a prediction module 802, a determining module 803, and a control module 804. Specifically:
Receiving module 801, configured to receive the biometric information of a user;
Prediction module 802, configured to predict the mood of the user based on the biometric information;
Determining module 803, configured to input the predicted mood into a pre-trained first instruction prediction model to determine the control instruction matching the biometric information;
Control module 804, configured to control the accompanying robot to perform the operation corresponding to the control instruction.
In the accompanying robot, accompanying robot control method, and device provided by the embodiments of the present application, a biometric collection module deployed on the accompanying robot collects the user's biometric information; a processor deployed on the robot then performs mood prediction from the biometric information and inputs the predicted mood into a pre-trained first instruction prediction model to determine the control instruction matching the biometric information. Because this way of determining the control instruction takes the user's current mood into account, the determined instruction is more accurate, and when the robot is controlled to perform the operation corresponding to the control instruction, the operation better meets the user's needs.
Based on the same technical concept, an embodiment of the present application further provides an electronic device. Referring to Fig. 9, a structural diagram of an electronic device 900 provided by an embodiment of the present application, the device includes a processor 901, a memory 902, and a bus 903. The memory 902 stores execution instructions and includes an internal memory 9021 and an external memory 9022; the internal memory 9021 temporarily stores operational data for the processor 901 and data exchanged with external memory 9022 such as a hard disk, and the processor 901 exchanges data with the external memory 9022 through the internal memory 9021. When the electronic device 900 runs, the processor 901 and the memory 902 communicate through the bus 903, causing the processor 901 to execute the following instructions:
receive the biometric information of a user;
predict the mood of the user based on the biometric information;
input the predicted mood into a pre-trained first instruction prediction model to determine the control instruction matching the biometric information;
control the accompanying robot to perform the operation corresponding to the control instruction.
An embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when run by a processor, the computer program executes the steps of any of the accompanying robot control methods described in the above embodiments. Specifically, the storage medium may be a general-purpose storage medium, such as a removable disk or a hard disk; when the computer program on the storage medium is run, the steps of the above accompanying robot control method can be executed, thereby improving the precision with which instructions meeting user needs are predicted.
The computer program product for performing the accompanying robot control method provided by the embodiments of the present application includes a computer-readable storage medium storing processor-executable non-volatile program code; the instructions included in the program code may be used to execute the methods described in the foregoing embodiments. For specific implementation, refer to the method embodiments, which will not be repeated here.
Those skilled in the art can clearly understand that, for convenience and brevity of description, for the specific working processes of the systems, devices, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, which will not be repeated here.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other ways. The device embodiments described above are merely illustrative. For example, the division of the units is only a division by logical function, and there may be other division manners in actual implementation; for another example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some communication interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of this application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a processor-executable non-volatile computer-readable storage medium. Based on this understanding, the technical solution of this application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or some of the steps of the methods described in the embodiments of this application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that the embodiments described above are merely specific embodiments of this application, used to illustrate the technical solutions of this application rather than to limit them, and the protection scope of this application is not limited thereto. Although this application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that any person skilled in the art, within the technical scope disclosed in this application, may still modify the technical solutions described in the foregoing embodiments, readily conceive of variations, or make equivalent replacements of some of the technical features; and such modifications, variations, or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of this application, and shall all be covered within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.
Claims (10)
- 1. An accompanying robot, characterized in that the accompanying robot comprises a biometric collection module and a processor;
the biometric collection module is configured to collect the biometric information of a user every preset duration and transmit the biometric information to the processor;
the processor is configured to receive the biometric information transmitted by the biometric collection module, predict the mood of the user based on the biometric information, input the predicted mood into a pre-trained first instruction prediction model to determine a control instruction matching the biometric information, and control the accompanying robot to perform an operation corresponding to the control instruction.
- 2. The accompanying robot according to claim 1, wherein the biometric collection module comprises at least one of the following modules: a sound collection module and an image collection module;
where the biometric collection module comprises a sound collection module, the biometric information includes sound information;
where the biometric collection module comprises an image collection module, the biometric information includes facial image information.
- 3. The accompanying robot according to claim 2, wherein, when predicting the mood of the user based on the biometric information, the processor is specifically configured to:
extract speech features from the sound information, the speech features including short-time energy, short-time zero-crossing rate, fundamental frequency, formant features, speech rate, and mel-frequency cepstral coefficients;
input the speech features into a speech recognition sub-model to obtain a first score that the speech features belong to any one preset mood;
input the sound information into a semantic recognition sub-model to extract semantic keywords from the sound information;
determine, based on the semantic keywords, a second score that the sound information belongs to any one preset mood;
input the facial image information into a face recognition sub-model to determine a third score that the facial image information belongs to any one preset mood;
perform a weighted summation of the first score, the second score, and the third score according to preset weights, and determine the mood of the user based on the summed score.
- 4. The accompanying robot according to claim 1, characterized in that the accompanying robot further comprises a receiving module and a storage module;
the receiving module is configured to receive an instruction of the user and transmit the instruction to the processor;
the processor is further configured to: when receiving the instruction transmitted by the receiving module, control the biometric collection module to collect the biometric information of the user; determine the behavior information of the user based on the biometric information; and transmit the instruction, the time at which the instruction was received, and the behavior information to the storage module; the behavior information of the user indicates the active state of the user;
the storage module is configured to store the instruction, the time at which the instruction was received, and the behavior information.
- 5. The accompanying robot according to claim 1, characterized in that the processor is further configured to train the first instruction prediction model by the following method:
obtain at least one historical mood prediction result of the user and an instruction associated with each historical mood prediction result;
input the historical mood prediction results into an instruction prediction model to be trained to obtain predicted instructions corresponding to the historical mood prediction results;
perform a round of training on the instruction prediction model to be trained based on the predicted instructions and the instructions associated with the historical mood prediction results;
obtain the first instruction prediction model through multiple rounds of training of the instruction prediction model.
- 6. The accompanying robot according to claim 4, characterized in that, after receiving the biometric information, the processor is further configured to:
determine the behavior information of the user based on the biometric information;
input the behavior information of the user and the time at which the biometric information was received into a pre-trained second instruction prediction model to determine a control instruction matching the biometric information, and control the accompanying robot to perform an operation corresponding to the control instruction.
- 7. The accompanying robot according to claim 6, characterized in that the processor trains the second instruction prediction model according to the following method:
obtain the instructions stored in the storage module, the times at which the instructions were received, and the corresponding behavior information;
input the times at which the instructions were received and the corresponding behavior information into a prediction model to be trained, and obtain predicted instructions as output;
perform a round of training on the prediction model based on the instructions obtained from the storage module and the predicted instructions;
obtain the second instruction prediction model through multiple rounds of training of the prediction model.
- 8. The accompanying robot according to claim 6, characterized in that the accompanying robot further comprises an alarm module;
the alarm module is configured to compare the behavior information with abnormal-behavior information stored in the storage module and, when the comparison succeeds, send alarm information to a pre-bound device.
- 9. An accompanying robot control method, characterized by comprising:
receiving the biometric information of a user;
predicting the mood of the user based on the biometric information;
inputting the predicted mood into a pre-trained first instruction prediction model to determine a control instruction matching the biometric information;
controlling the accompanying robot to perform an operation corresponding to the control instruction.
- 10. An accompanying robot control device, characterized by comprising:
a receiving module, configured to receive the biometric information of a user;
a prediction module, configured to predict the mood of the user based on the biometric information;
a determining module, configured to input the predicted mood into a pre-trained first instruction prediction model to determine a control instruction matching the biometric information;
a control module, configured to control the accompanying robot to perform an operation corresponding to the control instruction.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910848250.3A CN110480656B (en) | 2019-09-09 | 2019-09-09 | Accompanying robot, accompanying robot control method and accompanying robot control device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110480656A true CN110480656A (en) | 2019-11-22 |
CN110480656B CN110480656B (en) | 2021-09-28 |
Family
ID=68557031
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910848250.3A Active CN110480656B (en) | 2019-09-09 | 2019-09-09 | Accompanying robot, accompanying robot control method and accompanying robot control device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110480656B (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100499770B1 (en) * | 2004-12-30 | 2005-07-07 | 주식회사 아이오. 테크 | Network based robot control system |
CN101604204A (en) * | 2009-07-09 | 2009-12-16 | 北京科技大学 | Distributed cognitive technology for intelligent emotional robot |
CN105739688A (en) * | 2016-01-21 | 2016-07-06 | 北京光年无限科技有限公司 | Man-machine interaction method and device based on emotion system, and man-machine interaction system |
CN106182032A (en) * | 2016-08-24 | 2016-12-07 | 陈中流 | One is accompanied and attended to robot |
CN107103269A (en) * | 2016-02-23 | 2017-08-29 | 芋头科技(杭州)有限公司 | One kind expression feedback method and intelligent robot |
CN108877840A (en) * | 2018-06-29 | 2018-11-23 | 重庆柚瓣家科技有限公司 | Emotion identification method and system based on nonlinear characteristic |
CN109571494A (en) * | 2018-11-23 | 2019-04-05 | 北京工业大学 | Emotion identification method, apparatus and pet robot |
CN109767791A (en) * | 2019-03-21 | 2019-05-17 | 中国—东盟信息港股份有限公司 | A kind of voice mood identification and application system conversed for call center |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111312221A (en) * | 2020-01-20 | 2020-06-19 | 宁波舜韵电子有限公司 | Intelligent range hood based on voice control |
CN111312221B (en) * | 2020-01-20 | 2022-07-22 | 宁波舜韵电子有限公司 | Intelligent range hood based on voice control |
CN112060080A (en) * | 2020-07-31 | 2020-12-11 | 深圳市优必选科技股份有限公司 | Robot control method and device, terminal equipment and storage medium |
CN113273930A (en) * | 2021-06-04 | 2021-08-20 | 李侃 | Floor sweeping robot integrating intelligent rescue function and control method thereof |
CN113246156A (en) * | 2021-07-13 | 2021-08-13 | 武汉理工大学 | Child accompanying robot based on intelligent emotion recognition and control method |
Also Published As
Publication number | Publication date |
---|---|
CN110480656B (en) | 2021-09-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110480656A (en) | Accompanying robot, accompanying robot control method and device | |
JP7199451B2 (en) | Emotional interaction system, device and method based on emotional computing user interface | |
WO2020182153A1 (en) | Method for performing speech recognition based on self-adaptive language, and related apparatus | |
KR102317958B1 (en) | Image processing apparatus and method | |
WO2019204186A1 (en) | Integrated understanding of user characteristics by multimodal processing | |
CN102149319B (en) | Alzheimer's cognitive enabler | |
US10789961B2 (en) | Apparatus and method for predicting/recognizing occurrence of personal concerned context | |
CN110534099A (en) | Voice wakes up processing method, device, storage medium and electronic equipment | |
CN109789550A (en) | Control based on the social robot that the previous role in novel or performance describes | |
CN104036776A (en) | Speech emotion identification method applied to mobile terminal | |
CN113633983A (en) | Method, device, electronic equipment and medium for controlling expression of virtual character | |
CN110442867A (en) | Image processing method, device, terminal and computer storage medium | |
CN113593595A (en) | Voice noise reduction method and device based on artificial intelligence and electronic equipment | |
CN114078472A (en) | Training method and device for keyword calculation model with low false awakening rate | |
Serbaya | [Retracted] Analyzing the Role of Emotional Intelligence on the Performance of Small and Medium Enterprises (SMEs) Using AI‐Based Convolutional Neural Networks (CNNs) | |
CN109074809A (en) | Information processing equipment, information processing method and program | |
US11759387B2 (en) | Voice-based control of sexual stimulation devices | |
US20230372190A1 (en) | Adaptive speech and biofeedback control of sexual stimulation devices | |
CN109427332A (en) | The electronic equipment and its operating method of operation are executed using voice command | |
KR20190125668A (en) | Apparatus and method for analyzing emotional status of pet | |
CN116935277A (en) | Multi-mode emotion recognition method and device | |
US20220331196A1 (en) | Biofeedback-based control of sexual stimulation devices | |
Gupta et al. | REDE-Detecting human emotions using CNN and RASA | |
CN111971670B (en) | Generating a response in a dialog | |
Park et al. | Music-aided affective interaction between human and service robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||