CN109451356A - Intelligent mobile robot, automatic on-demand method, device and chip - Google Patents
- Publication number
- CN109451356A (application CN201811561784.XA)
- Authority
- CN
- China
- Prior art keywords
- program
- pet
- label
- emotion
- type
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/4508—Management of client data or end-user data
- H04N21/4532—Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01K—ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
- A01K15/00—Devices for taming animals, e.g. nose-rings or hobbles; Devices for overturning animals in general; Training or exercising equipment; Covering boxes
- A01K15/02—Training or exercising equipment, e.g. mazes or labyrinths for animals ; Electric shock devices ; Toys specially adapted for animals
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/466—Learning process for intelligent management, e.g. learning user preferences for recommending movies
- H04N21/4662—Learning process for intelligent management, e.g. learning user preferences for recommending movies characterized by learning algorithms
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/47202—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content for requesting content on demand, e.g. video on demand
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
Abstract
The present invention discloses an intelligent mobile robot, together with an automatic on-demand method, device and chip. The intelligent mobile robot comprises a body, a pet mood acquisition device 102 and a screen 101. The pet mood acquisition device 102 comprises a freely rotating camera 1022 and a microphone array 1021. The camera 1022 protrudes from the rear half cover of the robot's base housing and captures facial expression images of the pet, while the microphone array 1021 is distributed over mounting holes on the side of the body and captures the pet's cries. The screen 101 is located on the front half cover of the base housing and is electrically connected to the signal output of the pet mood acquisition device 102; according to the facial expression images and sound information of the pet acquired by the device 102, it plays on the screen 101 videos that soothe the pet's current mood.
Description
Technical field
The invention belongs to the field of robot technology, and more particularly relates to an intelligent mobile robot, an automatic on-demand method, a device and a chip.
Background art
Dogs are by nature curious and wide-ranging: to grow up healthy they need to run about, hunt and play with other dogs every day. At present, however, most pet-keeping residents live in apartment buildings and keep small toy dogs. For a pet, an apartment offers too small a range of activity; the owner cannot watch over it around the clock, and even when the owner is at home it is rarely taken out for a daily walk, so when the owner is away the dog can only stay home alone. Dogs also have moods, with an IQ comparable to a three-year-old child. When a dog is bored at home, irritable, and unable to interact with other dogs, it will often act out: most commonly chewing things, running around wildly and barking, and in serious cases "demolishing the house", to the distress of many families. On the other hand, the pace of modern life is fast and free time is scarce, so owners have almost no time to soothe their dog's mood, and managing a pet is a headache for a great many pet keepers. There is therefore a need for a robot that soothes a pet's mood in a targeted way in place of a human.
Summary of the invention
To overcome the above technical shortcomings, the present invention proposes the following technical solution. An intelligent mobile robot, being a cleaning robot with a mobile base, comprises: a body, a pet mood acquisition device 102 and a screen 101. The pet mood acquisition device 102 comprises a freely rotating camera 1022 and a microphone array 1021. The camera 1022 protrudes from the rear half cover of the robot's base housing, its lens supports 360-degree rotation, and it captures facial expression images of the pet; meanwhile, the microphone array 1021 is distributed over mounting holes on the side of the body and captures the pet's cries. The screen 101 is located on the front half cover of the base housing and is electrically connected to the pet mood acquisition device 102, for playing, according to the facial expression images and sound information acquired by the device 102, videos that soothe the pet's current mood. Compared with the prior art, this intelligent mobile robot builds on the body of a sweeping robot, captures more accurate pet emotion information through the camera 1022 and microphone array 1021, and uses the large-area body cover to play mood-soothing videos, thereby improving the soothing effect.
An automatic on-demand method based on pet mood, applied to the above intelligent mobile robot, comprises the following steps: controlling the pet mood acquisition device 102 to obtain emotion information of the pet, and determining the pet's emotion type from that information; determining, with a pre-generated program preference decision model, the program play label corresponding to the pet's emotion type, and then determining the play control instruction corresponding to that label; and, when the screen 101 receives the play control instruction, playing the video program corresponding to the play label, so as to soothe the pet's current emotional state. Here the program preference decision model is a model trained on the correlation between the categories of sample programs played historically and the pet's emotional change states, and the emotion information comprises the pet's facial expression images and sound information. Compared with the prior art, converting the determined emotion information into a corresponding play control instruction via the trained program preference decision model lets the pet be soothed by the audio-visual effect of a targeted video program, stabilizes the pet's mood, and reduces the owner's time and money costs.
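The steps above can be sketched in code. This is a minimal illustrative sketch, not the patent's implementation: the function names, the scores standing in for raw sensor data, and the 0.5 thresholds are all assumptions, and the label table simply mirrors the example emotion-to-label mapping given later in step S121.

```python
def classify_emotion(expression_score: float, cry_volume: float) -> str:
    """Toy stand-in for the emotion classifier: fuses a camera-derived
    expression score with a microphone-derived cry volume into one of
    the four emotion types used in the patent (assumed encoding)."""
    if expression_score > 0.5:
        return "happiness" if cry_volume > 0.5 else "pleasure"
    return "anger" if cry_volume > 0.5 else "sorrow"


# Assumed label table, mirroring the step-S121 example mapping.
PLAY_LABELS = {
    "happiness": "tragedy",
    "sorrow": "comedy",
    "anger": "calm_music",
    "pleasure": "comedy",
}


def build_play_instruction(expression_score: float, cry_volume: float) -> dict:
    """Emotion info -> emotion type -> play label -> play control
    instruction, as in the method steps above."""
    emotion = classify_emotion(expression_score, cry_volume)
    label = PLAY_LABELS[emotion]
    # The screen 101 would consume this instruction and play a video
    # carrying the chosen label.
    return {"emotion": emotion, "label": label, "action": "play"}
```

In a real system the two classifier inputs would come from the camera 1022 and microphone array 1021 rather than being passed in directly.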
Further, the generation method of the program preference decision model comprises: creating program play labels; obtaining sample video programs of the specified type corresponding to each play label, playing them on the screen 101 for the pet in front of the body to watch, while calling the pet mood acquisition device 102 to acquire and record the pet's emotional feedback state in real time; and, based on the pet's emotional feedback state, training on the sample video programs played under each play label with a preset machine learning method, generating the program preference decision model. The emotional feedback state is the emotion information. Generating the program preference decision model with a preset machine learning method and training it on emotion information makes it possible to assign, for each kind of emotion information, a program with a matching soothing effect, improving the pet's experience.
Further, the emotion types include happiness, anger, sorrow and/or pleasure. When the emotion type is happiness, the program preference decision model determines the program play label corresponding to that emotion type as a first video attribute; when the emotion type is anger, as a second video attribute; when the emotion type is sorrow, as a third video attribute; and when the emotion type is pleasure, as a fourth video attribute. In each case the corresponding sample video program is stored in the pet program preference library to update the statistics. This enhances the robustness of the trained model and improves the decision performance of the program preference decision model in ordinary scenarios.
Further, beyond the first, second, third and fourth video attributes there also exist video attributes that are not yet defined. If the current emotional feedback state differs little from the previous emotional feedback state, the current program play label may be switched to one of these undefined video attributes, after which the step of training, with the preset machine learning method, on the sample video programs played under the play label is executed again. This provides the program preference decision model with more training material and helps improve its decision accuracy.
An automatic on-demand device based on pet mood is built into the above intelligent mobile robot and comprises: a mood acquisition module, which controls the pet mood acquisition device 102 to obtain the pet's emotion information and determines the pet's emotion type from it; a program determination module, which, after receiving the emotion type from the mood acquisition module, determines the program play label corresponding to the pet's emotion type with the pre-generated program preference decision model and then determines the corresponding play control instruction; and a program soothing module, which, when the screen 101 receives the play control instruction from the program determination module, plays the video program corresponding to the play label so as to soothe the pet's current emotional state. The program preference decision model is a model trained on the correlation between the categories of sample programs played historically and the pet's emotional change states; the emotion information comprises the pet's facial expression images and sound information. Compared with the prior art, this automatic on-demand device trains, with a preset machine learning method, the association between the pet's emotion information and the corresponding soothing programs, and maps the trained model's judgment of the emotion information captured in real time to the corresponding play control instruction, so that the pet is soothed by the audio-visual effect of a targeted video program, its mood is stabilized, and the owner's time and money costs are reduced.
Further, the program determination module comprises: a creation submodule for creating program play labels; a video sampling submodule for obtaining sample video programs of the specified type corresponding to each play label, playing them on the screen 101 for the pet in front of the body, and calling the pet mood acquisition device 102 to acquire and record the pet's emotional feedback state in real time; and a training submodule for training, according to the pet's emotional feedback state, on the sample video programs played under each play label with a preset machine learning method, thereby generating the program preference decision model. The emotional feedback state is the emotion information. This makes it possible to assign, for each kind of emotion information, a program with a matching soothing effect, improving the pet's experience.
Further, the emotion types include happiness, anger, sorrow and/or pleasure. When the emotion type is happiness, the program preference decision model determines the corresponding program play label as a first video attribute; when anger, as a second video attribute; when sorrow, as a third video attribute; and when pleasure, as a fourth video attribute. In each case the corresponding sample video program is stored in the pet program preference library to update the statistics. This enhances the robustness of the trained model and improves the decision performance of the program preference decision model in ordinary scenarios.
Further, the training submodule is also configured, when the current emotional feedback state differs little from the previous emotional feedback state, to switch the program play label to one of the undefined video attributes, store the corresponding sample video program in the pet program preference library to update the statistics, and then execute again the step of training on the sample video programs played under the play label with the preset machine learning method. This provides the program preference decision model with more training material and helps improve its decision accuracy.
A chip stores the program code corresponding to the automatic on-demand method and is built into the automatic on-demand device, for executing the training, according to the pet's emotional state, on the sample video programs played under each play label with the preset machine learning method, and for controlling the screen 101 to play the video program corresponding to the play label so as to soothe the pet's current emotional state. Compared with the prior art, this chip speeds up the training and decision of the automatic on-demand device and avoids the slowdown caused by control operations of a remote server, so that the device processes emotion information with better real-time performance.
Brief description of the drawings
Fig. 1 is a structural schematic diagram of an intelligent mobile robot provided by an embodiment of the present invention.
Fig. 2 is a flowchart of an automatic on-demand method based on pet mood provided by an embodiment of the present invention.
Fig. 3 is a flowchart of the generation method of the program preference decision model provided by an embodiment of the present invention.
Fig. 4 is a structural block diagram of an automatic on-demand device based on pet mood provided by an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention will be described in detail below with reference to the drawings. Provided there is no conflict, the features in the following embodiments may be combined with one another.
An embodiment of the present invention proposes an intelligent mobile robot, which is a cleaning robot with a mobile base; as shown in Fig. 1, in this embodiment the intelligent mobile robot is a mobile intelligent sweeper comprising: a body, a pet mood acquisition device 102 and a screen 101. The pet mood acquisition device 102 comprises a freely rotating camera 1022 and a microphone array 1021. The camera 1022 protrudes from the rear half cover of the robot's base housing, its lens supports 360-degree rotation, and it captures expression images of pets in the surrounding environment; meanwhile, the microphone array 1021 is distributed over mounting holes on the side of the body and captures the pet's cries. The screen 101 is located on the front half cover of the base housing; its signal input is electrically connected to the signal output of the pet mood acquisition device 102, and according to the pet's facial expression images and sound information acquired by the device 102 it plays videos that soothe the pet's current mood. Compared with the prior art, this intelligent mobile robot builds on the body of a sweeper, captures more accurate pet emotion information through the camera 1022 and microphone array 1021, and uses the large-area body cover to play mood-soothing videos, improving the soothing effect.
Based on the above intelligent mobile robot, an embodiment of the present invention provides an automatic on-demand method based on pet mood, applied to the intelligent mobile robot and, as shown in Fig. 2, comprising the following steps.
Step S11: control the pet mood acquisition device 102 to obtain the pet's emotion information, determine the pet's emotion type from that information, then proceed to step S12. The emotion information comprises the pet's facial expression images and sound information; the data acquired in real time by the camera 1022 and the microphone array 1021 are fused according to a certain strategy to obtain the pet's emotion information. For example, when the pet expression captured by the camera 1022 is a smile, the microphone array 1021 simultaneously picks up the pet's cries over the full 360-degree range; if the pet's volume is greater than a first preset value and the pitch of its cry is greater than a second preset value, the pet's emotion information is judged as joy, and combined with the pet mood image the pet's emotion type is determined as happiness. The advantage of visual acquisition here compensates for noise interference in the microphone array, enhancing the target sound signal in a noisy environment; the supported multi-stream feature fusion algorithm makes it easier to "determine the pet's emotion type from the emotion information".
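The fusion rule in the example of step S11 can be sketched as follows. This is an illustrative sketch only: the patent does not disclose the preset values, so the threshold constants, their units, and the boolean "smile" flag standing in for the image analysis are assumptions.

```python
# Assumed stand-ins for the first and second preset values of step S11.
FIRST_PRESET_VOLUME = 60.0   # cry volume threshold (unit assumed, e.g. dB)
SECOND_PRESET_PITCH = 500.0  # cry pitch threshold (unit assumed, e.g. Hz)


def fuse_emotion(image_shows_smile: bool, volume: float, pitch: float) -> str:
    """Combine camera and microphone-array evidence as step S11 describes:
    a smiling expression plus a sufficiently loud, high-pitched cry is
    judged as joy (emotion type: happiness)."""
    if image_shows_smile and volume > FIRST_PRESET_VOLUME and pitch > SECOND_PRESET_PITCH:
        return "happiness"
    # Further branches for anger/sorrow/pleasure would follow the same
    # pattern; the patent only spells out the joy case.
    return "undetermined"
```

Because both audio conditions must hold in addition to the visual one, a loud but low-pitched cry (or a smile in a quiet room) does not trigger the joy judgment, which reflects the noise-robustness argument in the text.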
Step S12: determine, with the pre-generated program preference decision model, the program play label corresponding to the pet's emotion type, then determine the play control instruction corresponding to that label, and proceed to step S13. This enhances the robustness of the trained model and improves the decision performance of the program preference decision model in ordinary scenarios. The correspondence between emotion types and program play labels is established in advance, and the program preference decision model is trained on that correspondence; that is, it is a model trained on the correlation between the categories of sample programs played historically and the pet's emotional change states.
Specifically, the emotion types include happiness, anger, sorrow and/or pleasure. When the emotion type is happiness, the program preference decision model determines the corresponding program play label as a first video attribute; when anger, as a second video attribute; when sorrow, as a third video attribute; and when pleasure, as a fourth video attribute. In each case the corresponding sample video program is stored in the pet program preference library to update the statistics. The program play label corresponding to the emotion type judged above is set, in the program preference decision model, as a video set of the basic video attributes, providing the model with preliminary training material and meeting the pet's basic mood-soothing needs.
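The bookkeeping in step S12 can be sketched as a small update routine. The attribute names and the dictionary-of-lists structure of the pet program preference library are assumptions for illustration; the patent only specifies that each emotion type maps to one video attribute and that stored samples update the statistics.

```python
from collections import defaultdict

# One video attribute per emotion type, as step S12 describes.
# The "attribute_N" names are illustrative placeholders.
ATTRIBUTE_OF = {
    "happiness": "attribute_1",
    "anger": "attribute_2",
    "sorrow": "attribute_3",
    "pleasure": "attribute_4",
}

# Pet program preference library: video attribute -> stored sample programs.
preference_library = defaultdict(list)


def record_sample(emotion_type: str, sample_program: str) -> str:
    """File a sample video program under the attribute for the judged
    emotion type, updating the library statistics; return the attribute."""
    attribute = ATTRIBUTE_OF[emotion_type]
    preference_library[attribute].append(sample_program)
    return attribute
```

Step S13 would then look up videos in `preference_library` by the attribute attached to the incoming play control instruction.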
Step S13: when the screen 101 receives the play control instruction, call from the pet program preference library the video program corresponding to the program play label, then play it on the screen 101, so as to soothe the pet's current emotional state. The program play label may be defined as the type of a video program, or as the name of a video folder holding multiple video files. Based on the play control instruction, the video program corresponding to the play label can be called from the pet program preference library; in this way, whatever state the pet's mood changes to, the program preference decision model can recognize it, and, based on the captured emotion information and its trained association with the play labels, the screen 101 plays the video files corresponding to the label. Compared with the prior art, converting the determined emotion information into a corresponding play control instruction via the trained program preference decision model lets the pet be soothed by the audio-visual effect of a targeted video program, stabilizes its mood, and reduces the owner's time and money costs.
As one embodiment, as shown in Fig. 3, the generation method of the program preference decision model comprises: step S121, create program play labels, then proceed to step S122. The program play labels respectively correspond to the four emotion types of happiness, anger, sorrow and/or pleasure: when the emotion type is happiness, the program play label is a tragedy; when the emotion type is sorrow, the program play label is a comedy; when the emotion type is anger, the program play label is calm, lyrical music; and when the emotion type is pleasure, the program play label is a comedy; so as to achieve the effect of soothing the pet's mood.
Step S122: obtain sample video programs of the specified type corresponding to each program play label, play them on the screen 101 for the pet in front of the body to watch, and at the same time call the pet mood acquisition device 102 to acquire and record the pet's emotional feedback state in real time, so as to judge whether, and to what degree, the played sample video program soothes the pet's mood. Then proceed to step S123.
In step S123, based on the pet's emotional feedback state, the sample video programs played under each program play label are trained using a preset machine-learning method, and the training generates the program preference decision model. Taking the emotional feedback state and the currently played sample video program as training variables, their features are trained by the preset machine-learning method, so that the program preference decision model can be learned; the model then judges, from the pet's emotional state information, the program play label corresponding to the pet's current emotional information. Here, the emotional feedback state is the emotional information described above. Machine learning, whose main subject of study is artificial intelligence, investigates how a computer can simulate or realize human learning behaviour so as to acquire new knowledge or skills, reorganizing its existing knowledge structures to continually improve its own performance. A machine-learning algorithm can be understood as a class of algorithms that improve automatically through experience. The machine-learning algorithm configured in this embodiment of the application is used to train the video markup model and may be a neural network model. It should be noted that this embodiment of the application does not limit the type of machine-learning algorithm. Generating the program preference decision model with a preset machine-learning method and training it on emotional information makes it possible to deliver, for different emotional information, a program with a suitable soothing effect, improving the pet's experience.
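The training and judgement loop of step S123 can be sketched as follows. This is a minimal illustrative sketch, not the patent's implementation: it assumes a simple cumulative-feedback learner in place of the neural network the embodiment mentions, and all names (`ProgramPreferenceModel`, the emotion and label strings) are hypothetical.

```python
from collections import defaultdict

# Hypothetical minimal "program preference decision model" (illustrative
# stand-in for the neural network mentioned in the text): for each pet
# emotion type it accumulates the soothing feedback observed per program
# play label, then judges the best-scoring label for that emotion.
class ProgramPreferenceModel:
    def __init__(self):
        # scores[emotion][label] -> cumulative emotional-feedback score
        self.scores = defaultdict(lambda: defaultdict(float))

    def train(self, samples):
        """samples: iterable of (emotion, play_label, feedback), where
        feedback > 0 means the pet's mood improved during playback."""
        for emotion, label, feedback in samples:
            self.scores[emotion][label] += feedback

    def predict(self, emotion):
        """Return the play label with the highest cumulative feedback
        for this emotion, or None if the emotion has not been seen."""
        labels = self.scores.get(emotion)
        if not labels:
            return None
        return max(labels, key=labels.get)

model = ProgramPreferenceModel()
model.train([
    ("anger", "calm_nature", 0.9),
    ("anger", "fast_cartoon", -0.2),
    ("sorrow", "owner_voice", 0.7),
])
print(model.predict("anger"))  # prints "calm_nature"
```

Because the learner only accumulates scores, new emotional feedback can be folded in incrementally, which matches the patent's idea of continually updating the statistical results.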
In one embodiment, besides the first video attribute, the second video attribute, the third video attribute and the fourth video attribute, there are also video attributes that have not yet been defined. If the current emotional feedback state changes little from the previous emotional feedback state, the current program play label can be switched to one of these undefined video attributes; the step of training, with the preset machine-learning method, the sample video program played under that program play label is then executed, and the corresponding sample video program is stored in the pet program preference library to update the statistical results. This provides more training material for the program preference decision model and helps improve the accuracy of its judgements. Under the preset machine-learning method, the updated statistical results supply new training and decision factors for the program preference decision model, improving the validity and comprehensiveness of model training.
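The label-switching rule of this embodiment can be illustrated as below. It is a hedged sketch under assumed details: the `threshold` parameter, the attribute names, and the take-from-pool policy are all illustrative choices not specified by the patent.

```python
# Hypothetical exploration rule for the embodiment above: when the
# pet's emotional feedback barely changes under the current play label,
# switch to a not-yet-defined video attribute to gather fresh samples.
# The threshold value and attribute names are illustrative assumptions.
def next_label(current_label, prev_feedback, curr_feedback,
               undefined_pool, threshold=0.1):
    """Keep the current label if feedback moved enough; otherwise take
    the next undefined attribute from the pool (if any) to try."""
    if abs(curr_feedback - prev_feedback) >= threshold:
        return current_label
    return undefined_pool.pop(0) if undefined_pool else current_label

pool = ["video_attr_5", "video_attr_6"]  # attributes not yet defined
print(next_label("video_attr_1", 0.50, 0.52, pool))  # small change -> "video_attr_5"
print(next_label("video_attr_1", 0.50, 0.90, pool))  # big change -> "video_attr_1"
```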
In another embodiment, Fig. 4 provides a structural block diagram of an automatic program-requesting device based on pet mood; the device is built into the aforementioned intelligent mobile robot. As shown in Fig. 4, the automatic program-requesting device comprises: a mood acquisition module, which controls the pet mood acquisition device 102 to obtain the pet's emotional information and determines the pet's emotion type from that information; a program determining module, which, after receiving the emotion type transmitted by the mood acquisition module, determines the program play label corresponding to the pet's emotion type according to a pre-generated program preference decision model and then determines the play control instruction corresponding to that label; and a program soothing module, which, when the screen 101 receives the play control instruction sent by the program determining module, plays the video program corresponding to the program play label so as to soothe the pet's current emotional state. Here, the program preference decision model is a model trained on the correlation between the categories of sample programs played in the past and the pet's emotional change states; the emotional information includes the pet's facial-expression images and sound information. Compared with the prior art, the automatic program-requesting device uses a preset machine-learning method to train the association between the pet's emotional information and the corresponding soothing programs, and maps the trained model's judgement on emotional information captured in real time to the corresponding play control instruction, so that the pet is soothed by the audiovisual effect of a targeted video program. This stabilizes the pet's mood and reduces the time and expense cost borne by the pet's owner.
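The three-module pipeline described above (mood acquisition, program determination, program soothing) might be wired together as in the following sketch. The emotion classifier is a stub for the pet mood acquisition device 102, and the preference table stands in for the trained decision model; every name and value here is an illustrative assumption, not from the patent.

```python
# Hypothetical wiring of the three modules: mood acquisition ->
# program determination -> program soothing.
PREFERENCE_TABLE = {"anger": "soft_music", "sorrow": "owner_videos"}

def acquire_emotion(expression_image, sound):
    # Stub classifier: a real system would analyse the camera image and
    # microphone signal captured by device 102.
    return "anger" if sound == "growl" else "sorrow"

def determine_program(emotion):
    # Map emotion type -> program play label -> play control instruction.
    label = PREFERENCE_TABLE.get(emotion, "default_clips")
    return {"action": "play", "label": label}

def soothe(expression_image, sound):
    # End-to-end: what the program soothing module hands to screen 101.
    return determine_program(acquire_emotion(expression_image, sound))

print(soothe("bared_teeth", "growl"))
```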
Optionally, the program determining module comprises: a creation submodule for creating program play labels; a video sampling submodule for obtaining the sample video programs of the specified type created by the creation submodule, playing them on screen 101 to the pet in front of the body, and meanwhile calling the pet mood acquisition device 102 to acquire the pet's emotional feedback state in real time; and a training submodule which, according to the emotional feedback state acquired in real time by the video sampling submodule, trains the sample video programs played under each program play label using the preset machine-learning method and generates the program preference decision model. The emotional feedback state is the emotional information. The program determining module makes it possible to deliver, for different emotional information, a program with a suitable soothing effect, improving the pet's experience.
Optionally, the emotion types include happiness, anger, sorrow and/or joy. When the emotion type is happiness, the program preference decision model determines the program play label corresponding to that emotion type as the first video attribute, and the corresponding sample video program is stored in the pet program preference library to update the statistical results; when the emotion type is anger, the model determines the corresponding program play label as the second video attribute and the corresponding sample video program is likewise stored in the pet program preference library; when the emotion type is sorrow, the model determines the corresponding program play label as the third video attribute and stores the corresponding sample video program in the library; and when the emotion type is joy, the model determines the corresponding program play label as the fourth video attribute and stores the corresponding sample video program in the library, each time updating the statistical results. This enriches the program categories in the pet program preference library, enhances the robustness of the trained model, and improves the judgements made by the program preference decision model in everyday scenarios.
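The four-way mapping from emotion type to video attribute, together with the preference-library update, can be sketched as follows; the attribute names and the list-based library representation are illustrative assumptions, not the patent's data structures.

```python
# Hypothetical sketch of the mapping above: each emotion type resolves
# to one video attribute, and every decision is archived in the "pet
# program preference library" to refresh its statistical results.
ATTRIBUTE_BY_EMOTION = {
    "happiness": "video_attr_1",
    "anger": "video_attr_2",
    "sorrow": "video_attr_3",
    "joy": "video_attr_4",
}
preference_library = []  # stored (attribute, sample program) records

def decide_and_store(emotion, sample_program):
    attr = ATTRIBUTE_BY_EMOTION[emotion]
    preference_library.append((attr, sample_program))  # update statistics
    return attr

print(decide_and_store("anger", "rainfall_loop.mp4"))  # prints "video_attr_2"
```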
Optionally, the training submodule is further configured, when the current emotional feedback state changes little from the previous emotional feedback state, to switch the current program play label to a video attribute that has not yet been defined, store the corresponding sample video program in the pet program preference library to update the statistical results, and then execute the step of training, with the preset machine-learning method, the sample video program played under that program play label. Besides the first, second, third and fourth video attributes, there exist video attributes that remain to be defined; if the current emotional feedback state changes little from the previous one, the current program play label can be switched to one of these to-be-defined video attributes before the training step is executed and the corresponding sample video program is stored in the pet program preference library, because a to-be-defined video attribute may correspond to an emotional state lying between happiness, anger, sorrow and/or joy. The training submodule thus provides more training material for the program preference decision model and helps improve the accuracy of its judgements.
A chip is used to store the program code corresponding to the aforementioned automatic order method and is built into the automatic program-requesting device; according to the pet's emotional state, it executes the training of the sample video programs played under each program play label using the preset machine-learning method, and then controls the screen 101 to play the video program matching the program play label, so as to soothe the pet's current emotional state. Compared with the prior art, this chip increases the training and decision speed of the automatic program-requesting device and eliminates the slowdown caused by control operations of a remote server, improving the real-time performance of the device's emotional-information processing.
Finally, it should be noted that the above embodiments merely illustrate the technical solution of the present invention and are not intended to limit it. Although the present invention has been described in detail with reference to preferred embodiments, those of ordinary skill in the art should understand that specific embodiments of the invention may still be modified, and some technical features may be replaced by equivalents; such modifications, provided they do not depart from the spirit of the technical solution of the invention, shall all fall within the scope of the claims of the present invention.
Claims (10)
1. An intelligent mobile robot, characterized in that the intelligent mobile robot is a cleaning robot with a mobile base, the intelligent mobile robot comprising: a body, a pet mood acquisition device (102) and a screen (101);
the pet mood acquisition device (102) comprises a freely rotating camera (1022) and a microphone array (1021); the camera (1022) protrudes from the rear half cover of the base housing of the intelligent mobile robot and is used to capture facial-expression images of a pet, while the microphone array (1021) is distributed at mounting holes on the side of the body of the intelligent mobile robot and is used to capture the pet's cries; the screen (101) is located on the front half cover of the base housing of the intelligent mobile robot and is electrically connected to the signal input of the pet mood acquisition device (102), and is used to play videos that soothe the pet's current mood, according to the facial-expression images and sound information of the pet obtained by the pet mood acquisition device (102).
2. An automatic order method based on pet mood, characterized in that the automatic order method is applied to the intelligent mobile robot of claim 1 and comprises the following steps:
controlling the pet mood acquisition device (102) to obtain the pet's emotional information, and determining the pet's emotion type according to the emotional information;
determining, using a pre-generated program preference decision model, the program play label corresponding to the pet's emotion type, and then determining the play control instruction corresponding to that program play label;
when the screen (101) receives the play control instruction, playing the video program corresponding to the program play label, so as to soothe the pet's current emotional state;
wherein the program preference decision model is a model trained on the correlation between the categories of sample programs played in the past and the pet's emotional change states; and the emotional information includes the pet's facial-expression images and sound information.
3. The automatic order method according to claim 2, characterized in that the generation method of the program preference decision model comprises:
creating program play labels;
obtaining sample video programs of the specified type corresponding to each program play label, then playing them on the screen (101) to the pet in front of the body, while calling the pet mood acquisition device (102) to acquire the pet's emotional feedback state in real time;
based on the pet's emotional feedback state, training the sample video programs played under each program play label using a preset machine-learning method, and generating the program preference decision model;
the emotional feedback state being the emotional information.
4. The automatic order method according to claim 3, characterized in that the emotion types include happiness, anger, sorrow and/or joy;
when the emotion type is happiness, the program preference decision model determines the program play label corresponding to the emotion type as a first video attribute, and the corresponding sample video program is stored in a pet program preference library to update the statistical results;
when the emotion type is anger, the program preference decision model determines the program play label corresponding to the emotion type as a second video attribute, and the corresponding sample video program is stored in the pet program preference library to update the statistical results;
when the emotion type is sorrow, the program preference decision model determines the program play label corresponding to the emotion type as a third video attribute, and the corresponding sample video program is stored in the pet program preference library to update the statistical results;
when the emotion type is joy, the program preference decision model determines the program play label corresponding to the emotion type as a fourth video attribute, and the corresponding sample video program is stored in the pet program preference library to update the statistical results.
5. The automatic order method according to claim 4, characterized in that besides the first video attribute, the second video attribute, the third video attribute and the fourth video attribute, there are also video attributes that have not been defined; if the current emotional feedback state changes little from the previous emotional feedback state, the current program play label can be switched to one of the undefined video attributes, and then the step of training, with the preset machine-learning method, the sample video program played under that program play label is executed.
6. An automatic program-requesting device based on pet mood, characterized in that the automatic program-requesting device is built into the intelligent mobile robot of claim 1 and comprises:
a mood acquisition module for controlling the pet mood acquisition device (102) to obtain the pet's emotional information and determining the pet's emotion type according to the emotional information;
a program determining module for, after receiving the emotion type transmitted by the mood acquisition module, determining the program play label corresponding to the pet's emotion type according to a pre-generated program preference decision model, and then determining the play control instruction corresponding to that program play label;
a program soothing module for, when the screen (101) receives the play control instruction sent by the program determining module, playing the video program corresponding to the program play label so as to soothe the pet's current emotional state;
wherein the program preference decision model is a model trained on the correlation between the categories of sample programs played in the past and the pet's emotional change states; and the emotional information includes the pet's facial-expression images and sound information.
7. The automatic program-requesting device according to claim 6, characterized in that the program determining module comprises:
a creation submodule for creating program play labels;
a video sampling submodule for obtaining the sample video programs of the specified type corresponding to each program play label, then playing them on the screen (101) to the pet in front of the body, while calling the pet mood acquisition device (102) to acquire the pet's emotional feedback state in real time;
a training submodule for training, according to the pet's emotional feedback state and using a preset machine-learning method, the sample video programs played under each program play label, and generating the program preference decision model;
wherein the emotional feedback state is the emotional information.
8. The automatic program-requesting device according to claim 7, characterized in that the emotion types include happiness, anger, sorrow and/or joy;
when the emotion type is happiness, the program preference decision model determines the program play label corresponding to the emotion type as a first video attribute, and the corresponding sample video program is stored in a pet program preference library to update the statistical results;
when the emotion type is anger, the program preference decision model determines the program play label corresponding to the emotion type as a second video attribute, and the corresponding sample video program is stored in the pet program preference library to update the statistical results;
when the emotion type is sorrow, the program preference decision model determines the program play label corresponding to the emotion type as a third video attribute, and the corresponding sample video program is stored in the pet program preference library to update the statistical results;
when the emotion type is joy, the program preference decision model determines the program play label corresponding to the emotion type as a fourth video attribute, and the corresponding sample video program is stored in the pet program preference library to update the statistical results.
9. The automatic program-requesting device according to claim 8, characterized in that the training submodule is further configured, when the current emotional feedback state changes little from the previous emotional feedback state, to switch the current program play label to another video attribute that has not been defined, and then execute the step of training, with the preset machine-learning method, the sample video program played under that program play label.
10. A chip, characterized in that the chip is used to store the program code corresponding to the automatic order method of any one of claims 2 to 5, and is built into the automatic program-requesting device of any one of claims 6 to 9, for executing, according to the pet's emotional state, the training of the sample video programs played under each program play label using the preset machine-learning method, and then controlling the screen (101) to play the video program corresponding to the program play label, so as to soothe the pet's current emotional state.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811561784.XA CN109451356A (en) | 2018-12-20 | 2018-12-20 | A kind of intelligent mobile robot, automatic order method, device and chip |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109451356A true CN109451356A (en) | 2019-03-08 |
Family
ID=65558550
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811561784.XA Pending CN109451356A (en) | 2018-12-20 | 2018-12-20 | A kind of intelligent mobile robot, automatic order method, device and chip |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109451356A (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111461337A (en) * | 2020-03-05 | 2020-07-28 | 深圳追一科技有限公司 | Data processing method and device, terminal equipment and storage medium |
CN111461337B (en) * | 2020-03-05 | 2023-08-18 | 深圳追一科技有限公司 | Data processing method, device, terminal equipment and storage medium |
CN112134949A (en) * | 2020-09-22 | 2020-12-25 | 珠海格力电器股份有限公司 | Pet hosting method, device and system |
CN112488219A (en) * | 2020-12-07 | 2021-03-12 | 江苏科技大学 | Mood consolation method and system based on GRU and mobile terminal |
CN112704024A (en) * | 2020-12-30 | 2021-04-27 | 安徽四创电子股份有限公司 | Pet dog emergency placating system based on Internet of things |
TWI746127B (en) * | 2020-08-26 | 2021-11-11 | 宏碁股份有限公司 | Pet care system and the method thereof |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130104812A1 (en) * | 2010-04-29 | 2013-05-02 | Ron Levi | System and method for treating pets |
GB201416680D0 (en) * | 2014-09-22 | 2014-11-05 | Plummer Michael I | A Calming System For An Animal |
CN104260096A (en) * | 2014-03-31 | 2015-01-07 | 苏州市华天雄信息科技有限公司 | Novel remote pet comforting and monitoring service robot |
CN104760056A (en) * | 2014-11-27 | 2015-07-08 | 深圳市银星智能科技股份有限公司 | Camera module for autonomously moving device and cleaning robot |
KR20160010160A (en) * | 2014-07-18 | 2016-01-27 | 배형진 | Automatic feeding apparatus for a pet |
CN105580750A (en) * | 2014-11-17 | 2016-05-18 | 波宝创新科技有限公司 | Interactive pet caring system |
CN106305463A (en) * | 2015-06-24 | 2017-01-11 | 深圳市易新科技有限公司 | Intelligent feeder and intelligent feeding system |
CN106386561A (en) * | 2016-08-30 | 2017-02-15 | 深圳市沃特沃德股份有限公司 | Method, device and system for appeasing pets |
CN206196621U (en) * | 2016-11-23 | 2017-05-31 | 成都信息工程大学 | A kind of long-range breeding apparatus of pet |
CN108064738A (en) * | 2016-11-14 | 2018-05-25 | 北京加益科技有限公司 | pet interactive device, terminal and system |
CN108200396A (en) * | 2018-01-05 | 2018-06-22 | 湖南固尔邦幕墙装饰股份有限公司 | Intelligent door system and intelligent door control method |
CN108847254A (en) * | 2018-05-31 | 2018-11-20 | 广州粤创富科技有限公司 | A kind of method, apparatus and pet wearable device identifying pet mood |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| CB02 | Change of applicant information | Address after: 519000 2706, No. 3000, Huandao East Road, Hengqin New Area, Zhuhai, Guangdong. Applicant after: Zhuhai Yiwei Semiconductor Co.,Ltd. Address before: Room 105-514, No.6 Baohua Road, Hengqin New District, Zhuhai City, Guangdong Province. Applicant before: AMICRO SEMICONDUCTOR Co.,Ltd. |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190308 |