CN108491790A - Method, apparatus, storage medium, and robot for determining an object - Google Patents
Method, apparatus, storage medium, and robot for determining an object
- Publication number
- CN108491790A CN108491790A CN201810231355.XA CN201810231355A CN108491790A CN 108491790 A CN108491790 A CN 108491790A CN 201810231355 A CN201810231355 A CN 201810231355A CN 108491790 A CN108491790 A CN 108491790A
- Authority
- CN
- China
- Prior art keywords
- intention
- user
- article
- scope
- action message
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J11/00—Manipulators not otherwise provided for
- B25J11/008—Manipulators for service tasks
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/35—Categorising the entire scene, e.g. birthday party or wedding scene
Abstract
Embodiments of the present invention disclose a method, apparatus, storage medium, and robot for determining an object. The method includes: obtaining intention action information of a user; determining an intention range of the user according to the intention action information; and obtaining images of items within the intention range and determining therefrom an intended item of the user. With this technical solution, the item a user currently needs can be determined in real time, providing better care for the user.
Description
Technical field
Embodiments of the present invention relate to the field of intelligent robotics, and in particular to a method, apparatus, storage medium, and robot for determining an object.
Background art
At present, with rapid economic development, household consumption levels have gradually risen. For an ordinary dual-income family, caring for the elderly and for children while also working has become a considerable burden. As is well known, looking after either children or the elderly requires a great investment of time, and because of their work, young adults cannot stay at the side of the elderly and children every day. Therefore, how to care for the elderly and children while young adults are busy at work has become a social problem.
Since the emergence of early companion robots, it has been possible to communicate with the elderly and children, helping children learn and grow and alleviating the loneliness of the elderly. However, when the elderly or children actually need certain items, a companion robot can hardly provide help.
Summary of the invention
Embodiments of the present invention provide a method, apparatus, storage medium, and robot for determining an object, which can determine in real time the item a user needs and thereby provide better care for the user.
In a first aspect, an embodiment of the present invention provides a method for determining an object, the method including:
obtaining intention action information of a user;
determining an intention range of the user according to the intention action information;
obtaining images of items within the intention range, and determining therefrom an intended item of the user.
Further, after obtaining the images of items within the intention range and determining therefrom the intended item of the user, the method further includes:
moving the intended item, by means of an action component, into a range reachable by the user.
Further, obtaining the intention action information of the user includes:
obtaining an image or consecutive images of the user;
determining a limb pointing direction or a movement direction of the user from the image or consecutive images;
determining the limb pointing direction or movement direction as the intention action information of the user.
Further, determining the intention range of the user according to the intention action information includes:
determining an intention direction of the user according to the intention action information;
determining an intention direction scattering angle according to auxiliary judgment information, where the auxiliary judgment information includes one or more of: voice information of the user, a gaze direction of the user, a limb motion trajectory, and a movement trajectory;
taking the range formed by the intention direction and the intention direction scattering angle as the intention range of the user.
Further, obtaining the images of items within the intention range and determining therefrom the intended item of the user includes:
obtaining an image within the intention range and, based on image recognition technology, determining the images of items within the intention range;
analyzing, from the images of items within the intention range, the probability of each item being the intended item;
determining the item with the higher probability as the intended item of the user.
Further, analyzing the probability of each item being the intended item from the images of items within the intention range includes:
analyzing, from the images of items within the intention range, the association relationship between each item and the user, where the association relationship includes a need-frequency association and a need-time association.
In a second aspect, an embodiment of the present invention further provides an apparatus for determining an object, the apparatus including:
an intention action information obtaining module, configured to obtain intention action information of a user;
an intention range determining module, configured to determine an intention range of the user according to the intention action information;
an intended item determining module, configured to obtain images of items within the intention range and determine therefrom an intended item of the user.
Further, the apparatus further includes:
an intended item moving module, configured to move the intended item, by means of an action component, into a range reachable by the user.
Further, the intention action information obtaining module includes:
an image obtaining unit, configured to obtain an image or consecutive images of the user;
a direction determining unit, configured to determine a limb pointing direction or movement direction of the user from the image or consecutive images;
an intention action information determining unit, configured to determine the limb pointing direction or movement direction as the intention action information of the user.
Further, the intention range determining module includes:
an intention direction determining unit, configured to determine an intention direction of the user according to the intention action information;
an intention direction scattering angle determining unit, configured to determine an intention direction scattering angle according to auxiliary judgment information, where the auxiliary judgment information includes one or more of: voice information of the user, a gaze direction of the user, a limb motion trajectory, and a movement trajectory;
an intention range determining unit, configured to take the range formed by the intention direction and the intention direction scattering angle as the intention range of the user.
Further, the intended item determining module includes:
an item image obtaining unit, configured to obtain an image within the intention range and, based on image recognition technology, determine the images of items within the intention range;
a probability analyzing unit, configured to analyze, from the images of items within the intention range, the probability of each item being the intended item;
an intended item determining unit, configured to determine the item with the higher probability as the intended item of the user.
Further, the probability analyzing unit is specifically configured to:
analyze, from the images of items within the intention range, the association relationship between each item and the user, where the association relationship includes a need-frequency association and a need-time association.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, where the program, when executed by a processor, implements the method for determining an object as described in the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a robot, the robot including:
at least one processor; and a memory communicatively connected to the at least one processor, where
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor, so that the at least one processor can perform the method for determining an object as described in claims 1-6.
In the technical solution provided by the embodiments of the present application, intention action information of a user is obtained; an intention range of the user is determined according to the intention action information; and images of items within the intention range are obtained, from which the intended item of the user is determined. With this technical solution, the item a user currently needs can be determined in real time, providing better care for the user.
Description of the drawings
Fig. 1 is a flowchart of a method for determining an object according to Embodiment 1 of the present invention;
Fig. 2 is a flowchart of a method for determining an object according to Embodiment 2 of the present invention;
Fig. 3 is a flowchart of a method for determining an object according to Embodiment 3 of the present invention;
Fig. 4 is a structural schematic diagram of an apparatus for determining an object according to Embodiment 4 of the present invention;
Fig. 5 is a structural schematic diagram of a robot according to an embodiment of the present application.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the accompanying drawings and embodiments. It can be understood that the specific embodiments described here serve only to explain the present invention and do not limit it. It should also be noted that, for ease of description, the drawings show only the parts related to the present invention rather than the entire structure.
It should be mentioned, before the exemplary embodiments are discussed in greater detail, that some exemplary embodiments are described as processes depicted as flowcharts or methods. Although a flowchart describes the steps as a sequential process, many of the steps can be implemented in parallel, concurrently, or simultaneously. In addition, the order of the steps can be rearranged. A process may be terminated when its operations are completed, and may also include additional steps not shown in the drawings. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram, and the like.
Embodiment one
Fig. 1 is a flowchart of a method for determining an object according to Embodiment 1 of the present invention. This embodiment is applicable to scenarios in which a robot accompanies a user. The method can be executed by the apparatus for determining an object provided by the embodiments of the present invention; the apparatus can be implemented in software and/or hardware and can be integrated into a robot.
As shown in Fig. 1, the method for determining an object includes:
S110: obtaining intention action information of a user.
The intention action information may be information indicating that the user intends to take a certain item, or information in which the user expresses the needed item through language. For example, if the user is an elderly person who needs medicine placed on a desk, then the moment the user reaches toward the desk for the medicine can be determined as the user's intention action information. As another example, if the user needs to turn off the television or lower its volume, then what the user says can be determined as the user's intention action information.
In this embodiment, optionally, obtaining the intention action information of the user includes: obtaining an image or consecutive images of the user; determining a limb pointing direction or movement direction of the user from the image or consecutive images; and determining the limb pointing direction or movement direction as the intention action information of the user.
In this embodiment, since executing corresponding actions through speech recognition is already relatively conventional, it is advantageous to use the user's limb action or movement direction as the user's intention action information. The movement direction matters for users such as a bedridden elderly person or a child who can only just crawl or walk: they often move slowly toward the object they want in order to take it. When the user moves the body, crawls, or walks in this manner, the user's intention action information can be determined.
A terminal that may serve as the executing body, such as a robot, can be configured with an image acquisition device, such as a wide-angle camera, and an audio acquisition device, such as a microphone, to acquire information about the user. It is worth noting that the elderly or children being cared for perform many actions every day. To judge whether a user's action is intended to take some item or is merely routine, in the embodiments of the present application, machine learning can be used to build a model of the user's actions. When the user's action matches specific features, it can be determined that the user needs to take some item; for example, if the user moves while stretching an arm toward a fixed direction, it can be determined that the action is intended to take some item, in which case the subsequent steps of the embodiments of the present application can be performed.
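By way of a non-limiting illustrative sketch, the feature-matching step described above could be expressed as a simple rule over pose features. The feature names and threshold values below are assumptions chosen for illustration, not values from this disclosure:

```python
# Illustrative sketch: deciding whether observed pose features indicate an
# "intention action" (reaching or moving toward an item). Feature names and
# thresholds are assumptions for illustration, not values from the patent.

from dataclasses import dataclass

@dataclass
class PoseFeatures:
    arm_extension: float   # 0.0 (arm at rest) .. 1.0 (fully outstretched)
    body_moving: bool      # user is crawling / walking / shifting position
    heading_stable: bool   # pointing or movement direction held steady

def is_intention_action(f: PoseFeatures, extension_threshold: float = 0.7) -> bool:
    """An action counts as an intention action when the arm is stretched
    toward a fixed direction, or the user steadily moves toward one."""
    reaching = f.arm_extension >= extension_threshold and f.heading_stable
    approaching = f.body_moving and f.heading_stable
    return reaching or approaching

# Example: an outstretched, steady arm would trigger the subsequent steps.
print(is_intention_action(PoseFeatures(0.9, False, True)))   # True
print(is_intention_action(PoseFeatures(0.2, False, False)))  # False
```

In practice this rule would be replaced by the learned model mentioned above; the sketch only shows where such a decision fits in the flow.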
S120: determining an intention range of the user according to the intention action information.
The intention range may be a direction or an area where items are placed. For example, if the person being cared for stretches a hand from the sickbed toward a desk, the area of that desktop can be determined as the user's intention range; if a baby crawls from one end of a bed toward the other, the other end of the bed is the baby's intention range.
S130: obtaining images of items within the intention range, and determining therefrom the intended item of the user.
After the intention range is determined, each item within the intention range is identified through image recognition of the item images within the range, and the intended item of the user is then determined according to the user's needs. The user's needs can be determined from the frequency of the action relationship between the user and each item within the intention range. For example, if the user takes medicine every 5-6 hours, then 5-6 hours after the last dose, a medicine bottle or medicine box within the intention range can be determined as the user's intended item; if the user drinks water every half hour, then around half an hour after the last drink, a water cup within the intention range can be determined as the intended item. In addition, after each item within the intention range has been determined, a preliminary screening can be performed to filter out items the user is unlikely to need to take, such as calligraphy and paintings on the wall or ornaments on the desktop, and the probability of such items being the user's intended item can be set relatively low. This helps the terminal accurately confirm the user's intended item.
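As a non-limiting sketch of the screening and need-frequency logic above, items in the intention range might be scored as follows. The item categories, intervals, and score values are assumptions for illustration only:

```python
# Illustrative sketch: screening items in the intention range and scoring
# them by the user's need frequency. Categories, intervals, and scores are
# assumptions for illustration, not values from the patent.

NEED_INTERVAL_HOURS = {"medicine": 5.5, "cup": 0.5}  # typical need intervals
UNLIKELY = {"painting", "ornament"}                   # filtered in pre-screening

def score_item(category: str, hours_since_last_use: float) -> float:
    """Higher score = more likely to be the intended item right now."""
    if category in UNLIKELY:
        return 0.05                # kept, but with very low probability
    interval = NEED_INTERVAL_HOURS.get(category)
    if interval is None:
        return 0.3                 # neutral prior for unknown items
    # Probability rises as the usual need interval elapses.
    return min(1.0, hours_since_last_use / interval)

items = [("medicine", 5.6), ("cup", 0.1), ("ornament", 100.0)]
ranked = sorted(items, key=lambda it: score_item(*it), reverse=True)
print(ranked[0][0])  # medicine
```

Here the medicine ranks first because its usual 5-6 hour interval has just elapsed, matching the medicine-bottle example in the paragraph above.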
In the technical solution provided by the embodiments of the present application, intention action information of a user is obtained; an intention range of the user is determined according to the intention action information; and images of items within the intention range are obtained, from which the intended item of the user is determined. With this technical solution, the item a user currently needs can be determined in real time, providing better care for the user.
On the basis of the above technical solution, optionally, after obtaining the images of items within the intention range and determining therefrom the intended item of the user, the method further includes: moving the intended item, by means of an action component, into a range reachable by the user. The action component may be a component enabling motion, such as tracks or an arm, configured on a terminal such as a robot. By controlling the movement of the action component, the determined intended item can be moved into a range the user can reach; for example, it can be placed at the bedside, or delivered in front of the user to be taken. The advantage of this arrangement is that, after the user's intended item is determined, it is moved within the user's reach. This effectively prevents the dangers that can arise when a user with limited mobility urgently tries to take the intended item, such as falling from the bedside or knocking over other items and getting scratched, and improves the care provided to the person being looked after.
Embodiment two
Fig. 2 is a flowchart of a method for determining an object according to Embodiment 2 of the present invention. On the basis of the above embodiment, determining the intention range of the user according to the intention action information includes: determining an intention direction of the user according to the intention action information; determining an intention direction scattering angle according to auxiliary judgment information, where the auxiliary judgment information includes one or more of: voice information of the user, a gaze direction of the user, a limb motion trajectory, and a movement trajectory; and taking the range formed by the intention direction and the intention direction scattering angle as the intention range of the user.
As shown in Fig. 2, the method for determining an object includes:
S210: obtaining intention action information of a user.
S220: determining an intention direction of the user according to the intention action information.
Specifically, the intention direction of the user can be determined according to the direction in which the user stretches a hand, the direction of the user's gaze, or the direction in which the user's body moves.
S230: determining an intention direction scattering angle according to auxiliary judgment information, where the auxiliary judgment information includes one or more of: voice information of the user, a gaze direction of the user, a limb motion trajectory, and a movement trajectory.
The auxiliary judgment information can serve as the basis for determining the scattering angle of the user's intention direction. For example, if the person being cared for is a child, the child may face one direction at the start of crawling but keep changing direction afterwards, which may be related to the child's limited mobility; however, it can be determined that each time the child raises its head, it looks at one fixed direction, such as a doll at the bedside. The intention direction scattering angle can then be determined with the aid of the child's gaze direction. This helps determine that the user's intention range is where the wanted item is located, and effectively prevents a deviation in the determined intention range from making the user's intended item impossible to determine. Similarly, the user's voice information, limb motion trajectory, and movement trajectory can also help determine the user's intention range more accurately.
S240: taking the range formed by the intention direction and the intention direction scattering angle as the intention range of the user.
The scattering angle determines only the range boundaries on both sides of the intention direction; the front and rear boundaries along the intention direction can be determined according to whether boundaries such as a wall, a desktop, or some object exist along the intention direction.
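By way of a non-limiting geometric sketch, the range formed by an intention direction and a scattering angle can be modeled as a two-dimensional sector; the angle and distance values below are illustrative assumptions:

```python
# Illustrative sketch: testing whether an item lies inside the sector formed
# by the intention direction and the scattering angle. The angle and distance
# values are illustrative assumptions, not values from the patent.

import math

def in_intention_range(item_xy, user_xy, intention_deg, scatter_deg, max_dist):
    """An item is in range if its bearing from the user deviates from the
    intention direction by at most half the scattering angle, and it lies
    before the front boundary (e.g. a wall) at distance max_dist."""
    dx, dy = item_xy[0] - user_xy[0], item_xy[1] - user_xy[1]
    dist = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed angular difference, wrapped into [-180, 180).
    diff = (bearing - intention_deg + 180.0) % 360.0 - 180.0
    return dist <= max_dist and abs(diff) <= scatter_deg / 2.0

# User at the origin reaching along +x with a 30-degree scattering angle
# and a front boundary (e.g. a wall) 3 units away.
print(in_intention_range((2.0, 0.3), (0, 0), 0.0, 30.0, 3.0))  # True
print(in_intention_range((0.0, 2.0), (0, 0), 0.0, 30.0, 3.0))  # False
```

The `max_dist` parameter stands in for the front boundary described above (a wall, desktop edge, or object); the side boundaries come from the scattering angle alone.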
S250: obtaining images of items within the intention range, and determining therefrom the intended item of the user.
On the basis of the above embodiment, this embodiment provides a method for determining the user's intention range more accurately. This method effectively improves the accuracy of intention range determination, avoids the problem of the terminal being unable to accurately obtain the item the user actually needs because of a deviation in the determined intention range, and improves the user experience.
Embodiment three
Fig. 3 is a flowchart of a method for determining an object according to Embodiment 3 of the present invention. On the basis of the above embodiments, obtaining the images of items within the intention range and determining therefrom the intended item of the user includes: obtaining an image within the intention range and, based on image recognition technology, determining the images of items within the intention range; analyzing, from the images of items within the intention range, the probability of each item being the intended item; and determining the item with the higher probability as the intended item of the user.
As shown in Fig. 3, the method for determining an object includes:
S310: obtaining intention action information of a user.
S320: determining an intention range of the user according to the intention action information.
S330: obtaining an image within the intention range and, based on image recognition technology, determining the images of items within the intention range.
After one or more images are obtained, each item within the intention range can be identified through image recognition technology.
S340: analyzing, from the images of items within the intention range, the probability of each item being the intended item.
S350: determining the item with the higher probability as the intended item of the user.
To analyze the probability of each item being the intended item, the items can be ranked according to how many times the user uses each of them daily: the more often an item is used, the higher the probability that it is the object of the user's intention action. The probability can also be determined according to current environment information: if the current time is around 18:00 and the room light is dim, the probability that items such as a lamp switch are the user's intended item can be raised.
In this embodiment, optionally, analyzing the probability of each item being the intended item from the images of items within the intention range includes: analyzing, from the images of items within the intention range, the association relationship between each item and the user, where the association relationship includes a need-frequency association and a need-time association. A need-frequency association applies, for example, to drinking water, taking medicine, and dining, which form a certain daily frequency pattern; a need-time association applies, for example, to operations such as turning on a light at night or turning on the television in the morning, so items such as a lamp switch and a TV remote control have a need-time association with the user. The benefit of this arrangement is that the user's need for the intended item can be analyzed along different dimensions, so that the user's intended item is finally determined accurately.
After the items are sorted by probability, the items with high probability are offered to the user in turn. If the user does not accept an item, or waves a hand in refusal, the item can be replaced with the item of the next lower probability, to be confirmed in turn. If the user's voice information indicates that the user no longer needs the item, the operation of helping the user take the item can be ended.
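The confirm-in-turn behavior described above can be sketched as a simple loop; the response values ("accept", "refuse", "stop") are assumptions introduced for illustration, not an interface defined by this disclosure:

```python
# Illustrative sketch: offering candidate items in descending probability
# order and falling back on refusal. The response vocabulary is an
# assumption for illustration, not an interface from the patent.

def offer_items(ranked_items, get_user_response):
    """ranked_items: [(name, probability), ...] sorted high to low.
    get_user_response(name) -> 'accept', 'refuse' (e.g. a hand wave),
    or 'stop' (voice indicates the item is no longer needed)."""
    for name, _prob in ranked_items:
        response = get_user_response(name)
        if response == "accept":
            return name        # hand this item to the user
        if response == "stop":
            break              # end the fetch operation entirely
        # 'refuse': fall through to the next-lower-probability item
    return None

responses = {"medicine": "refuse", "cup": "accept"}
chosen = offer_items([("medicine", 0.9), ("cup", 0.6)], responses.get)
print(chosen)  # cup
```

In the example, the highest-probability item (the medicine) is waved away, so the loop falls through to the cup, mirroring the replace-and-confirm behavior described above.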
On the basis of the above embodiments, this embodiment provides a specific method for determining the user's intended item. With this method, the user's intended item can be determined more accurately within the intention range, improving the user experience.
Embodiment four
Fig. 4 is a structural schematic diagram of an apparatus for determining an object according to Embodiment 4 of the present invention. As shown in Fig. 4, the apparatus for determining an object includes:
an intention action information obtaining module 410, configured to obtain intention action information of a user;
an intention range determining module 420, configured to determine an intention range of the user according to the intention action information;
an intended item determining module 430, configured to obtain images of items within the intention range and determine therefrom an intended item of the user.
On the basis of the above embodiments, optionally, the apparatus further includes:
an intended item moving module, configured to move the intended item, by means of an action component, into a range reachable by the user.
The above apparatus can perform the method provided by any embodiment of the present invention, and has the corresponding functional modules and advantageous effects for performing the method.
Embodiment five
An embodiment of the present application further provides a storage medium containing computer-executable instructions, where the computer-executable instructions, when executed by a computer processor, perform a method for determining an object, the method including:
obtaining intention action information of a user;
determining an intention range of the user according to the intention action information;
obtaining images of items within the intention range, and determining therefrom an intended item of the user.
Storage medium: any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media, such as CD-ROM, floppy disks, or tape devices; computer system memory or random access memory, such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, and the like; non-volatile memory, such as flash memory and magnetic media (e.g., hard disks or optical storage); registers or other similar types of memory elements, and the like. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in the computer system in which the program is executed, or may be located in a different, second computer system connected to the first computer system through a network, such as the Internet. The second computer system may provide program instructions to the first computer for execution. The term "storage medium" may include two or more storage media that may reside in different locations, for example in different computer systems connected through a network. The storage medium may store program instructions (e.g., implemented as computer programs) executable by one or more processors.
Of course, in the storage medium containing computer-executable instructions provided by the embodiments of the present application, the computer-executable instructions are not limited to the object determination operations described above, and may also perform relevant operations in the method for determining an object provided by any embodiment of the present application.
Embodiment six
An embodiment of the present application provides a robot in which the apparatus for determining an object provided by the embodiments of the present application can be integrated. Fig. 5 is a structural schematic diagram of a robot according to an embodiment of the present application. As shown in Fig. 5, the robot may include: at least one processor; and a memory communicatively connected to the at least one processor, where the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor, so that the at least one processor can perform the method for determining an object described in any embodiment of the present invention.
The robot provided by the embodiments of the present application can determine in real time the item a user needs, thereby providing better care for the user.
The apparatus for determining an object, the storage medium, and the robot provided in the above embodiments can perform the method for determining an object provided by any embodiment of the present application, and have the corresponding functional modules and advantageous effects for performing the method. For technical details not described in detail in the above embodiments, reference may be made to the method for determining an object provided by any embodiment of the present application.
Note that the above are only preferred embodiments of the present invention and the technical principles applied. Those skilled in the art will appreciate that the present invention is not limited to the specific embodiments described here, and that various obvious changes, readjustments, and substitutions can be made by those skilled in the art without departing from the protection scope of the present invention. Therefore, although the present invention has been described in further detail through the above embodiments, the present invention is not limited to the above embodiments; without departing from the inventive concept, it may also include other, further equivalent embodiments, and the scope of the present invention is determined by the scope of the appended claims.
Claims (10)
1. A method for determining an object, characterized by including:
obtaining intention action information of a user;
determining an intention range of the user according to the intention action information;
obtaining images of items within the intention range, and determining therefrom an intended item of the user.
2. The method according to claim 1, characterized in that, after obtaining the images of items within the intention range and determining therefrom the intended item of the user, the method further includes:
moving the intended item, by means of an action component, into a range reachable by the user.
3. The method according to claim 1, characterized in that obtaining the intention action information of the user includes:
obtaining an image or consecutive images of the user;
determining a limb pointing direction or a movement direction of the user from the image or consecutive images;
determining the limb pointing direction or movement direction as the intention action information of the user.
4. The method according to claim 1, characterized in that determining the intention range of the user according to the intention action information comprises:
determining an intention direction of the user according to the intention action information;
determining an intention-direction scattering angle according to auxiliary judgement information, wherein the auxiliary judgement information comprises one or more of: voice information of the user, a sight-line direction of the user, a limb motion track, and a motion track; and
taking the range formed by the intention direction and the intention-direction scattering angle as the intention range of the user.
5. The method according to claim 1, characterized in that obtaining the item images within the intention range and determining therefrom the intention article of the user comprises:
obtaining an image within the intention range, and determining the item images within the intention range based on an image recognition technique;
analysing, in the item images within the intention range, a probability of each article being the intention article; and
determining the article with the higher probability as the intention article of the user.
6. The method according to claim 5, characterized in that analysing, in the item images within the intention range, the probability of each article being the intention article comprises:
analysing, from the item images within the intention range, an association relationship between each article and the user, wherein the association relationship includes a demand-frequency association and a required-time association.
7. An apparatus for determining an object, which is characterized in that it comprises:
an intention action information obtaining module, configured to obtain intention action information of a user;
an intention range determining module, configured to determine an intention range of the user according to the intention action information; and
an intention article determining module, configured to obtain item images within the intention range and determine therefrom an intention article of the user.
8. The apparatus according to claim 7, characterized in that it further comprises:
an intention article moving module, configured to move the intention article into a reachable range of the user by means of an action component.
9. A computer-readable storage medium on which a computer program is stored, characterized in that the program, when executed by a processor, implements the method for determining an object according to any one of claims 1-6.
10. A robot, which is characterized in that it comprises:
at least one processor; and a memory communicatively connected to the at least one processor; wherein
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor is able to perform the method for determining an object according to any one of claims 1-6.
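The pipeline claimed above — forming a range from an intention direction and a scattering angle (claim 4), filtering detected items inside that range, and selecting the most probable intention article (claims 5-6) — can be sketched in a simplified 2-D form as follows. This is an illustrative sketch only, not the patented implementation: the `Item` structure, the coordinate frame, and the use of a single `demand_frequency` score as the association relationship are all assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class Item:
    name: str
    position: tuple        # (x, y) position in the room frame (assumed)
    demand_frequency: float  # assumed association score: how often the user requests this item

def in_intention_range(user_pos, intention_dir_deg, scatter_deg, item_pos):
    """Check whether an item lies inside the cone formed by the intention
    direction and the scattering angle (the 'intention range' of claim 4)."""
    vx, vy = item_pos[0] - user_pos[0], item_pos[1] - user_pos[1]
    angle = math.degrees(math.atan2(vy, vx))
    # Smallest signed angular difference between item bearing and intention direction
    diff = abs((angle - intention_dir_deg + 180) % 360 - 180)
    return diff <= scatter_deg / 2

def pick_intention_item(user_pos, intention_dir_deg, scatter_deg, items):
    """Keep items inside the intention range, score each by its association
    with the user, and return the highest-scoring candidate (claims 5-6)."""
    candidates = [it for it in items
                  if in_intention_range(user_pos, intention_dir_deg, scatter_deg, it.position)]
    if not candidates:
        return None
    return max(candidates, key=lambda it: it.demand_frequency)

items = [
    Item("water cup", (2.0, 0.2), 0.9),
    Item("remote control", (2.0, 2.0), 0.6),
    Item("book", (-1.0, 0.0), 0.3),
]
# User at the origin, pointing along +x, with a 30-degree scattering angle:
chosen = pick_intention_item((0.0, 0.0), 0.0, 30.0, items)
print(chosen.name)  # the water cup is the only item inside the cone
```

In a real robot the intention direction would come from pose estimation on the user images (claim 3) and the candidate set from an object detector run on the images within the range (claim 5); here both are replaced by fixed coordinates for clarity.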
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810231355.XA CN108491790A (en) | 2018-03-20 | 2018-03-20 | A kind of determination method, apparatus, storage medium and the robot of object |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108491790A true CN108491790A (en) | 2018-09-04 |
Family
ID=63318898
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810231355.XA Pending CN108491790A (en) | 2018-03-20 | 2018-03-20 | A kind of determination method, apparatus, storage medium and the robot of object |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108491790A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101233540A (en) * | 2005-08-04 | 2008-07-30 | 皇家飞利浦电子股份有限公司 | Apparatus for monitoring a person having an interest to an object, and method thereof |
US20150206188A1 (en) * | 2014-01-17 | 2015-07-23 | Panasonic Intellectual Property Corporation Of America | Item presentation method, and information display method |
CN204819531U (en) * | 2015-06-05 | 2015-12-02 | 王一舒 | Novel multi-functional intelligent companion-care robot |
CN105291113A (en) * | 2015-11-27 | 2016-02-03 | 深圳市神州云海智能科技有限公司 | Robot system for home care |
CN106873773A (en) * | 2017-01-09 | 2017-06-20 | 北京奇虎科技有限公司 | Robot interactive control method, server and robot |
CN206536480U (en) * | 2017-02-17 | 2017-10-03 | 昆山库尔卡人工智能科技有限公司 | An intelligent child-nursing robot |
CN107471217A (en) * | 2017-08-23 | 2017-12-15 | 北京石油化工学院 | Control method for an elderly-assistance robot |
CN207044184U (en) * | 2017-08-23 | 2018-02-27 | 北京石油化工学院 | An elderly-assistance robot |
CN107756373A (en) * | 2017-10-13 | 2018-03-06 | 中国科学院深圳先进技术研究院 | Small-scale robot, object-fetching control method therefor, and small-scale robot fetching system |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111515946A (en) * | 2018-10-31 | 2020-08-11 | 杭州程天科技发展有限公司 | Control method and device for human body auxiliary robot |
CN111515946B (en) * | 2018-10-31 | 2021-07-20 | 杭州程天科技发展有限公司 | Control method and device for human body auxiliary robot |
CN110744544A (en) * | 2019-10-31 | 2020-02-04 | 昆山市工研院智能制造技术有限公司 | Service robot vision grabbing method and service robot |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6803351B2 (en) | Managing agent assignments in man-machine dialogs | |
CN107030691B (en) | Data processing method and device for nursing robot | |
Bretan et al. | Emotionally expressive dynamic physical behaviors in robots | |
Cassell et al. | More than just a pretty face: conversational protocols and the affordances of embodiment | |
Cassell | Embodied conversational interface agents | |
Jokinen et al. | Multimodal open-domain conversations with the Nao robot | |
Chao et al. | Timed Petri nets for fluent turn-taking over multimodal interaction resources in human-robot collaboration | |
US20120290111A1 (en) | Robot | |
WO2018006374A1 (en) | Function recommending method, system, and robot based on automatic wake-up | |
KR20200024675A (en) | Apparatus and method for recognizing behavior of human | |
CN108491790A (en) | A kind of determination method, apparatus, storage medium and the robot of object | |
KR20230044165A (en) | Electronic apparatus and control method thereof | |
Draghici et al. | Development of a human service robot application using pepper robot as a museum guide | |
CN108388399B (en) | Virtual idol state management method and system | |
CN110154048A (en) | Control method, control device and the robot of robot | |
Salem et al. | 23 Social Signal Processing in Social Robotics | |
Lee et al. | Robotic companions for smart space interactions | |
Yan | Paired speech and gesture generation in embodied conversational agents | |
CN111673764A (en) | Intelligent voice interaction robot with ultraviolet disinfection function | |
Qu et al. | Context-based word acquisition for situated dialogue in a virtual world | |
Manandhar et al. | Multivariate output-associative RVM for multi-dimensional affect predictions | |
Herath et al. | Thinking head: Towards human centred robotics | |
CN113011311A (en) | Monitoring method, system, device and storage medium | |
Hassemer et al. | Cognitive linguistics and gesture | |
Wrede et al. | Towards an integrated robotic system for interactive learning in a social context |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20180904 |