CN108780361A - Human-computer interaction method and device, robot and computer readable storage medium - Google Patents

Human-computer interaction method and device, robot and computer readable storage medium

Info

Publication number
CN108780361A
CN108780361A
Authority
CN
China
Prior art keywords
robot
interactive object
information
target interactive
man
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880001295.0A
Other languages
Chinese (zh)
Inventor
张含波
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cloudminds Robotics Co Ltd
Original Assignee
Cloudminds Shenzhen Robotics Systems Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cloudminds Shenzhen Robotics Systems Co Ltd filed Critical Cloudminds Shenzhen Robotics Systems Co Ltd
Publication of CN108780361A publication Critical patent/CN108780361A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/0005Manipulators having means for high-level communication with users, e.g. speech generator, face recognition means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/80Recognising image objects characterised by unique random patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168Feature extraction; Face representation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Fuzzy Systems (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Manipulator (AREA)

Abstract

The application relates to the technical field of robots and discloses a human-computer interaction method, a human-computer interaction device, a robot, and a computer-readable storage medium. In the present application, the human-computer interaction method is applied to a robot and includes: extracting biometric information of at least one recognized object, where the biometric information includes physiological feature information and/or behavioral feature information; determining, from the at least one object according to the biometric information, a target interactive object that needs to interact; and controlling the robot to respond in a manner matched to the target interactive object. This human-computer interaction method enables the robot to respond only to objects that need to interact, effectively avoiding erroneous responses and greatly improving the user experience.

Description

Human-computer interaction method and device, robot and computer readable storage medium
Technical field
This application relates to the field of robot technology, and more particularly to a human-computer interaction method and device, a robot, and a computer-readable storage medium.
Background art
Human-computer interaction (Human-Computer Interaction or Human-Machine Interaction, abbreviated HCI or HMI) is the study of the interactive relationship between a system and its users. The system may be any of various machines, or a computerized system and its software. Take, for example, an interactive humanoid robot placed in a public venue such as a bank business hall, a large shopping mall, or an airport: driven by its computer system, the robot can respond to users and provide services, for example proactively greeting them, answering their questions, and guiding them through business transactions.
However, the inventor has found at least the following problems in the prior art: public venues have heavy foot traffic and many sources of acoustic interference, such as public-address announcements and background music. Existing robots cannot filter out these interfering factors at all and instead respond to them continuously. This not only heavily occupies the robot's processing resources, but also prevents the robot from providing effective service to the users who actually need help, seriously degrading the user experience.
Summary of the invention
The technical problem to be solved by some embodiments of the present application is to provide a human-computer interaction method and device, a robot, and a computer-readable storage medium, so as to address the problems described above.
One embodiment of the present application provides a human-computer interaction method applied to a robot, including: extracting biometric information of at least one recognized object, where the biometric information includes physiological feature information and/or behavioral feature information; determining, from the at least one object according to the biometric information, a target interactive object that needs to interact; and controlling the robot to make a response matched to the target interactive object.
One embodiment of the present application provides a human-computer interaction device applied to a robot, including an extraction module, a determination module, and a control module. The extraction module is used to extract biometric information of at least one recognized object, where the biometric information includes physiological feature information and/or behavioral feature information. The determination module is used to determine, from the at least one object according to the biometric information, a target interactive object that needs to interact. The control module is used to control the robot to make a response matched to the target interactive object.
One embodiment of the present application provides a robot that includes at least one processor and a memory communicatively connected to the at least one processor. The memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor so that the at least one processor can perform the human-computer interaction method involved in any method embodiment of the present application.
One embodiment of the present application provides a computer-readable storage medium storing computer instructions that cause a computer to execute the human-computer interaction method involved in any method embodiment of the present application.
Compared with the prior art, in the embodiments of the present application the robot, upon recognizing objects, extracts the biometric information of the recognized objects and uses it to determine which object actually needs to interact, making a response matched to that object only after the object that actually needs to interact has been determined. With this interaction mode, the robot responds only to objects that need to interact, which effectively avoids erroneous responses and greatly improves the user experience.
Description of the drawings
One or more embodiments are illustrated by the figures in the corresponding drawings. These illustrations do not limit the embodiments. Elements with the same reference numerals in the drawings denote similar elements, and unless otherwise stated, the figures are not drawn to scale.
Fig. 1 is a flowchart of the human-computer interaction method in the first embodiment of the present application;
Fig. 2 is a schematic diagram of the robot determining the target interactive object in the first embodiment of the present application;
Fig. 3 is a flowchart of the human-computer interaction method in the second embodiment of the present application;
Fig. 4 is a schematic diagram of the robot determining the target interactive object in the second embodiment of the present application;
Fig. 5 is a flowchart of the human-computer interaction method in the third embodiment of the present application;
Fig. 6 is a block diagram of the human-computer interaction device in the fourth embodiment of the present application;
Fig. 7 is a block diagram of the robot in the fifth embodiment of the present application.
Detailed description of embodiments
To make the objectives, technical solutions, and advantages of the present application clearer, some embodiments of the present application are further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present application, not to limit it.
The first embodiment of the present application relates to a human-computer interaction method applied to a robot. The specific flow is shown in Fig. 1.
It should be noted that the robot described in this embodiment refers to what is commonly called an automatically controlled machine, including all machines that simulate human behavior or thought and machines that simulate other living creatures (such as robot dogs, Doraemon-style robot cats, and so on).
In step 101, biometric information of at least one recognized object is extracted.
Specifically, in this embodiment, the operation of extracting the biometric information of at least one recognized object may be triggered when at least one object is detected approaching within a preset range (for example, 5 meters) centered on the robot's current position. This detection mode allows the robot to perceive objects within a 360-degree range around its current position.
It is worth noting that in this embodiment, the robot's determination that an object has been recognized may specifically be implemented by a proximity sensor installed on the robot. For example, after the robot is placed in a public venue and started, the proximity sensor senses whether an object approaches within a 5-meter radius centered on the robot. Once movement or presence information of an object is sensed, the sensed information is converted into an electrical signal, and the robot's processor controls the robot's biometric acquisition device to extract the biometric information of the at least one recognized object.
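Purely as an illustration (not part of the patent text), the proximity-triggered extraction described above might be sketched in Python as follows; the sensor and capture interfaces (proximity_sensor.read, robot.capture_biometrics) are hypothetical stand-ins:

    import math
    import time

    DETECTION_RADIUS_M = 5.0  # preset range around the robot (example value from the text)

    def proximity_trigger_loop(proximity_sensor, robot):
        """Poll the proximity sensor; when an object enters the preset
        range, hand off to the biometric acquisition device (step 101)."""
        while True:
            detections = proximity_sensor.read()  # assumed: list of (x, y) offsets in meters
            nearby = [d for d in detections
                      if math.hypot(d[0], d[1]) <= DETECTION_RADIUS_M]
            if nearby:
                # The sensed signal is converted and passed to the processor,
                # which starts biometric extraction for the nearby objects.
                robot.capture_biometrics(nearby)
            time.sleep(0.1)  # modest polling interval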
To facilitate understanding of how the biometric information may be extracted, several specific extraction modes are listed below:
Mode one: control the robot to perform image acquisition, and extract the biological features of the at least one object from the acquired images to obtain the biometric information of the at least one object.
Mode two: control the robot to perform voice acquisition, and extract the biological features of the at least one object from the acquired voice to obtain the biometric information of the at least one object.
Mode three: control the robot to perform both image acquisition and voice acquisition; extract the biological features of the at least one object from the acquired images to obtain its biometric information, and at the same time extract the biological features of the at least one object from the acquired voice to obtain its biometric information.
In addition, when mode three is used to extract the biometric information, the object biometric information obtained from the images and the object biometric information obtained from the voice may be further analyzed to determine which pieces of information belong to the same object. In the subsequent operation of determining the target interactive object, the image-derived and voice-derived biometric information of the same object can then be analyzed jointly, improving the accuracy of the determined target interactive object.
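As a minimal sketch of mode three (illustrative only, not the patent's implementation), image-derived and voice-derived features could be associated per object as follows; the matcher same_object and the feature dictionaries are assumptions:

    from dataclasses import dataclass, field

    @dataclass
    class ObjectProfile:
        """Biometric information gathered for one recognized object."""
        object_id: int
        face: dict | None = None          # physiological: facial/eye features from images
        voiceprint: list | None = None    # physiological: voiceprint from audio
        displacement: list = field(default_factory=list)  # behavioral: movement track
        speech_text: str | None = None    # behavioral: recognized speech content

    def fuse(image_features, voice_features, same_object):
        """Merge per-object features from the image and voice channels.
        same_object(img, voi) is an assumed matcher deciding that an
        image reading and a voice reading belong to the same object."""
        profiles = []
        for oid, img in enumerate(image_features):
            profile = ObjectProfile(object_id=oid,
                                    face=img["face"],
                                    displacement=img["track"])
            for voi in voice_features:
                if same_object(img, voi):
                    profile.voiceprint = voi["voiceprint"]
                    profile.speech_text = voi["text"]
            profiles.append(profile)
        return profiles

Joint analysis of the fused profiles then feeds the target-determination step described below.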
It should be noted that in this embodiment, the extracted biometric information specifically includes physiological feature information and/or behavioral feature information.
The physiological feature information may specifically be any one or any combination of related information such as the recognized object's facial information, eye information, and voiceprint information (specifically, information from which it can be analyzed whose voice a sound comes from). The behavioral feature information may specifically be any one or any combination of related information such as the recognized object's displacement information and the speech content information of what it says (specifically, information identifying the content of the speech).
For example, when extracting the biological features of at least one object from acquired images, physiological feature information such as the object's facial information and/or eye information, and behavioral feature information such as displacement information, can usually be extracted.
As another example, when extracting the biological features of at least one object from acquired voice, physiological feature information such as the object's voiceprint and behavioral feature information such as speech content information can usually be extracted.
In addition, the image acquisition described above may be performed by controlling the robot's own image acquisition device, such as a camera; the images may also be obtained from an external image acquisition device communicatively connected to the robot, such as surveillance equipment installed in a shopping mall; or the two approaches may be used in combination.
Similarly, the voice acquisition described above may use the robot's own voice acquisition device and/or an external voice acquisition device communicatively connected to the robot.
It is also worth noting that after an object is determined to have been recognized and before the robot is controlled to perform image and/or voice acquisition, the robot may be controlled, based on the direction information of the perceived object, to turn toward the direction of the recognized object and then perform the image and/or voice acquisition. This ensures that the recognized object appears in the acquired images and voice, making the subsequently extracted biometric information of the object more complete, and in turn making the finally determined target interactive object more accurate.
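A minimal sketch of this turn-then-capture behavior, assuming hypothetical robot.position, robot.heading_deg, and robot.rotate_by interfaces (none of these names come from the patent):

    import math

    def turn_toward(robot, obj_position):
        """Rotate the robot to face a perceived object before image/voice
        acquisition, so the object appears in the captured data."""
        rx, ry = robot.position        # robot's current (x, y) position
        ox, oy = obj_position          # perceived object's (x, y) position
        bearing = math.degrees(math.atan2(oy - ry, ox - rx))
        # Smallest signed rotation from the current heading to the bearing.
        delta = (bearing - robot.heading_deg + 180.0) % 360.0 - 180.0
        robot.rotate_by(delta)         # then start image/voice acquisition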
In addition, the acquired images described in this embodiment are not limited to image information such as photographs; they may also be image information from video. No limitation is imposed here.
It should be noted that these are merely examples. In practical applications, those skilled in the art can make reasonable arrangements according to the technical means they have at hand, as long as the target interactive object can be determined from the at least one recognized object according to the extracted biometric information.
In step 102, a target interactive object that needs to interact is determined from the at least one object according to the biometric information.
In this embodiment, the operation of determining, from the at least one object according to the biometric information, the target interactive object that needs to interact may specifically be accomplished in the following manner:
First, according to the biometric information, determine which of the at least one object are candidate interactive objects (objects awaiting interaction). For ease of description, this embodiment is described with the candidate interactive objects being people.
Specifically, in practical applications, the objects approaching the robot are not necessarily objects that need to interact; what approaches the robot may be, for example, a toy or another terminal device rather than a person. Therefore, the extracted biometric information can be compared with pre-stored sample information of humans to exclude non-human objects and ensure the accuracy of subsequent operations.
In addition, when multiple people are determined to be among the recognized objects, each person's biological features can be further analyzed, such as displacement direction (whether the person is moving toward the robot) and eye information (whether the person is gazing at the robot), to determine whether the person is seeking help; the people who genuinely seek help are determined to be candidate interactive objects.
Then, from the determined candidate interactive objects, one candidate that satisfies the conditions is selected as the target interactive object, i.e., the object with which the robot ultimately chooses to conduct human-computer interaction.
Specifically, if the number of candidate interactive objects equals 1, that candidate is directly determined to be the target interactive object. If the number of candidate interactive objects is greater than 1, a priority is set for each candidate according to a preset priority-setting condition, and the candidate with the highest priority is determined to be the target interactive object.
For ease of understanding, a specific description is given below with reference to Fig. 2.
As shown in Fig. 2, three objects, A, B, and C, appear within the range the robot can recognize, and after evaluation against the biometric information, all three satisfy the interaction condition, i.e., all are candidate interactive objects. In this case, the target interactive object can be determined by priority, for example by setting each candidate's priority according to its location information.
Specifically, as shown in Fig. 2, the acquired location information of candidate A is (x0, y0), that of candidate B is (x1, y1), and that of candidate C is (x2, y2). Taking the robot's position as the origin, the distance formula d = √(x² + y²) gives the distances of candidates A, B, and C from the robot as d0, d1, and d2 respectively. If d2 < d0 < d1, then according to the preset priority-setting condition (the closer to the robot, the higher the priority; the farther from the robot, the lower the priority), priorities are set for candidates A, B, and C as follows: candidate C (highest priority), candidate B (lowest priority), and candidate A (priority between those of candidates C and B). It can thus be determined that candidate C is the target interactive object.
It is also worth noting that in practical applications there may be multiple candidate interactive objects at the same distance from the robot. In that case, the priority judgment can be made on the principle of which candidate the robot can move to with the smallest turning angle.
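Purely as an illustrative sketch (not part of the patent), the distance-based priority rule of Fig. 2, with the smallest-turning-angle tie-breaker just mentioned, might look like this:

    import math

    def pick_target(candidates, robot_xy=(0.0, 0.0), robot_heading_deg=0.0):
        """Select the target interactive object: the nearest candidate wins;
        ties are broken by the smallest turning angle from the robot's
        current heading. candidates maps a name to an (x, y) position."""
        def distance(pos):
            return math.hypot(pos[0] - robot_xy[0], pos[1] - robot_xy[1])

        def turn_angle(pos):
            bearing = math.degrees(math.atan2(pos[1] - robot_xy[1],
                                              pos[0] - robot_xy[0]))
            return abs((bearing - robot_heading_deg + 180.0) % 360.0 - 180.0)

        # Sort by (distance, turning angle); the first entry has top priority.
        ranked = sorted(candidates.items(),
                        key=lambda kv: (distance(kv[1]), turn_angle(kv[1])))
        return ranked[0][0]

    # Example in the spirit of Fig. 2: C is nearest (d2 < d0 < d1), so C is chosen.
    print(pick_target({"A": (3.0, 1.0), "B": (4.0, 2.0), "C": (1.0, 1.5)}))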
It should be noted that these are merely examples and do not limit the technical solution or scope of protection of the present application. In practical applications, those skilled in the art can make reasonable arrangements according to actual needs; no limitation is imposed here.
In step 103, the location information of the target interactive object is obtained.
In step 104, the robot is controlled to move toward the target interactive object according to the location information.
Specifically, after the target interactive object is determined, the robot can be controlled, according to the acquired location information of the target interactive object, to move toward the target interactive object, so that the robot can proactively initiate the interaction, improving the user experience.
Compared with the prior art, the human-computer interaction method provided in this embodiment enables the robot to respond only to objects that need to interact, effectively avoiding erroneous responses and greatly improving the user experience.
The second embodiment of the present application relates to a human-computer interaction method. This embodiment is a further improvement on the first embodiment, the specific improvement being that in the course of controlling the robot to make a response matched to the target interactive object, the identity information of the target interactive object is also obtained, and after the robot has moved to the region where the target interactive object is located, a response matched to the target interactive object is made according to the identity information. For ease of description, a specific explanation is given below with reference to Fig. 3 and Fig. 4.
Specifically, this embodiment includes steps 301 to 305, where steps 301, 302, and 304 are substantially the same as steps 101, 102, and 104 in the first embodiment respectively and are not described again here. The differences are mainly introduced below; for technical details not elaborated in this embodiment, refer to the human-computer interaction method provided in the first embodiment.
In step 303, the location information and identity information of the target interactive object are obtained.
Taking a human target interactive object as an example, the identity information obtained in this embodiment may include any one or any combination of related information such as name, gender, age, and whether the person is a VIP client.
It should be noted that the various kinds of identity information above can specifically be obtained by using face recognition technology to match the target interactive object's facial information against the face data stored in a face database of business-handling users kept at the venue where the robot is located (such as a bank business hall). On a successful match, the recorded identity information associated with that business-handling user can be obtained directly. If the match fails, the gender and approximate age range are first determined by face recognition technology, and the identity information of the target interactive object is then further completed by searching the Internet.
It is also worth noting that in practical applications, the target interactive object can be determined with the candidates' identity information taken into account as well, for example by setting candidate priorities according to a VIP parameter carried in the identity information while also considering factors such as distance. For ease of understanding, a specific description is given below with reference to Fig. 4.
Specifically, there are three candidate interactive objects, A, B, and C, within the range the robot can recognize, with the location information and identity information of each candidate as marked in Fig. 4. The distances of candidates A, B, and C from the robot are d0, d1, and d2 respectively, with d2 < d0 < d1.
In this case, the target interactive object may be determined by prioritizing the distance factor, selecting candidate C as the target interactive object; by prioritizing the VIP factor, selecting candidate A as the target interactive object; or by prioritizing the age factor, preferentially determining the oldest candidate to be the target interactive object.
It should be noted that these are merely illustrative examples and do not limit the technical solution or scope of protection of the present application. In practical applications, those skilled in the art can make reasonable arrangements according to actual needs; no limitation is imposed here.
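As an illustration only (the weights and fields below are hypothetical choices, not specified by the patent), one way to fold distance, VIP status, and age into a single priority score:

    def priority_score(candidate, w_dist=1.0, w_vip=1.0, w_age=0.01):
        """Higher score = higher priority. candidate is a dict with
        'distance' (meters), 'vip' (bool), and 'age' (years); the weights
        are arbitrary illustrative values."""
        score = -w_dist * candidate["distance"]       # nearer is better
        score += w_vip if candidate["vip"] else 0.0   # VIP bonus
        score += w_age * candidate["age"]             # slight edge for older users
        return score

    candidates = {
        "A": {"distance": 3.2, "vip": True,  "age": 45},
        "B": {"distance": 4.5, "vip": False, "age": 30},
        "C": {"distance": 1.8, "vip": False, "age": 62},
    }
    target = max(candidates, key=lambda k: priority_score(candidates[k]))
    print(target)  # with these weights C (nearest, oldest) wins; raising w_vip favors A

Which factor dominates is purely a matter of how the weights are configured, mirroring the three alternative prioritizations described above.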
In step 305, after the robot has moved to the region where the target interactive object is located, a response matched to the target interactive object is made according to the identity information.
For example, if the target interactive object is C in Fig. 4, then after moving to the region where target interactive object C is located (for example, a position one meter away from the target interactive object), the robot can proactively conduct a service inquiry or business guidance, such as: "Hello, Mr. Zhang Yi, may I ask what business you would like to handle?"
Further, to improve the user experience, after target interactive object C has been asked and while it is answering, a voice prompt such as "There are many guests at the moment; please wait patiently!" may also be given to candidate interactive objects A and B.
It should be noted that these are merely examples and do not limit the technical solution or scope of protection of the present application. In practical applications, those skilled in the art can make reasonable arrangements according to actual needs; no limitation is imposed here.
Compared with the prior art, the human-computer interaction method provided in this embodiment additionally obtains the identity information of the target interactive object when obtaining its location information, so that after the robot has moved, according to the location information, to the region where the target interactive object is located, it can make a response matched to the target interactive object according to the identity information, further improving the user experience.
The third embodiment of the present application relates to a human-computer interaction method. This embodiment is a further improvement on the first or second embodiment, the specific improvement being that after the robot has been controlled to make a response matched to the target interactive object, and when a target interactive object needing to interact is to be re-determined, it is first judged whether a new object is approaching the robot. The detailed flow is shown in Fig. 5.
Specifically, this embodiment includes steps 501 to 508, where steps 501 to 504 are substantially the same as steps 101 to 104 in the first embodiment respectively and are not described again here. The differences are mainly introduced below; for technical details not elaborated in this embodiment, refer to the human-computer interaction method provided in the first or second embodiment.
In step 505, it is judged whether a new object is approaching the robot. If it is determined that a new object is approaching the robot, the flow proceeds to step 506; otherwise, it proceeds directly to step 507, and a candidate interactive object is again chosen as the target interactive object from the candidates remaining from the previous interaction.
Specifically, in this embodiment, whether a new object is approaching the robot may be judged in the manner described in the first embodiment: if a new object is detected approaching within the preset range (for example, 5 meters) centered on the robot's current position, it is determined that a new object is approaching the robot. The specific judgment operation is not described again here.
It should also be noted that in this embodiment, there may be one new object approaching the robot, or more than one; no limitation is imposed here.
In step 506, the biometric information of the new object is extracted.
In step 507, a target interactive object that needs to interact is re-determined.
Specifically, the re-determined target interactive object in this embodiment is chosen from the new objects and from the previously recognized objects other than the target interactive object of the previous interaction.
For ease of understanding, a specific description follows:
In practical applications, especially in public venues with heavy foot traffic, there may at any given time be multiple objects that need to interact with the robot (i.e., according to the biometric information of the recognized objects, more than one object is determined to be a candidate interactive object needing to interact). However, at any one moment the robot can respond to only one candidate interactive object (the selected target interactive object); only after completing one interaction can it interact with the other candidates. Moreover, after one interaction completes, besides the previously determined candidates still waiting around the robot for a response, new objects needing to interact may also have appeared. In this case, therefore, the operation of re-determining the target interactive object must again choose one candidate interactive object as the target interactive object from among the newly confirmed candidates and the candidates remaining from the previous human-computer interaction.
It should also be noted that the manner of re-determining the target interactive object in this embodiment is substantially the same as the determination method in the first embodiment: in both cases, the recognized objects are determined to be candidate interactive objects according to the biometric information, and the target interactive object that ultimately needs to interact is then chosen from the candidates. The specific implementation details are not described again here.
In addition, regarding the selection of the target interactive object, in this embodiment the selection can still be made according to the priority of each candidate interactive object; of course, the new target interactive object may also be determined by other selection methods, and no limitation is imposed here.
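A small sketch of this re-determination step (illustrative only; pick_target is the kind of priority-based selection rule sketched earlier, and all names are assumptions):

    def redetermine_target(remaining, new_objects, last_target, pick_target):
        """Re-determine the target interactive object after one interaction
        completes (steps 505 to 507). remaining and new_objects map an
        object id to its position."""
        pool = dict(remaining)
        pool.update(new_objects)     # step 506: newly confirmed candidates join
        pool.pop(last_target, None)  # exclude the object that was just served
        if not pool:
            return None              # nobody left to serve
        return pick_target(pool)     # step 507: choose again by priority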
In step 508, the robot is controlled to make a response matched to the re-determined target interactive object.
Specifically, in controlling the robot to make a response matched to the re-determined target interactive object, the response process may be: moving toward the target interactive object, and after reaching the region where the target interactive object is located, proactively conducting a service consultation or business guidance. The specific response manner can be configured according to the related information of the re-determined target interactive object; no limitation is imposed here.
It should be noted that these are merely examples and do not limit the technical solution or scope of protection of the present application. In practical applications, those skilled in the art can make reasonable arrangements according to actual needs; no limitation is imposed here.
Compared with the prior art, in the human-computer interaction method provided in this embodiment, after one human-computer interaction operation completes, the robot monitors whether a new object is approaching. When a new object is approaching the robot, the biometric information of the newly appeared object is extracted and it is determined whether the newly appeared object is a candidate interactive object. If it is, a candidate interactive object is again chosen as the target interactive object from among the newly confirmed candidates and the candidates remaining from the previous interaction, and the human-computer interaction then proceeds. If the newly appeared object is not a candidate interactive object, a candidate is again chosen as the target interactive object directly from the candidates remaining from the previous interaction, and the human-computer interaction then proceeds.
From the description above it is easy to see that the human-computer interaction method provided in this embodiment enables the robot to dynamically update the perceived state of objects while it works, so that it can accurately make responses that fit the current scene, reducing erroneous operations and further improving the user experience.
The fourth embodiment of the present application relates to a human-computer interaction device applied to a robot. The specific structure is shown in Fig. 6.
As shown in Fig. 6, the human-computer interaction device includes an extraction module 601, a determination module 602, and a control module 603.
The extraction module 601 is configured to extract the biometric information of at least one recognized object.
The determination module 602 is configured to determine, from the at least one object according to the biometric information, a target interactive object that needs to interact.
The control module 603 is configured to control the robot to make a response matched to the target interactive object.
Specifically, in this embodiment, the biometric information of the at least one recognized object extracted by the extraction module 601 may be either or both of physiological feature information and behavioral feature information.
It is also worth noting that the physiological feature information extracted by the extraction module 601 in this embodiment may specifically be any one or any combination of the object's facial information, eye information, voiceprint information, and the like; the behavioral feature information extracted by the extraction module 601 may specifically be any one or any combination of the object's displacement information, speech content information, and the like.
When the determination module 602 determines the target interactive object needing to interact from the at least one object according to the various kinds of biometric information above, the process may specifically be: first, determine from the biometric information whether a recognized object is a candidate interactive object (an object needing to interact), for example by analyzing the object's gaze from its eye information together with its displacement information to determine whether it is currently seeking help; then, after the candidate interactive objects are determined, choose from them one object that satisfies the conditions as the target interactive object (the object that ultimately needs to interact).
In addition, in this embodiment, the control module 603 controlling the robot to make a response matched to the target interactive object may specifically be controlling the robot to move toward the target interactive object.
Further, after the robot has moved to the region where the target interactive object is located, the robot can be controlled to make a matched response according to the object's identity information, such as proactively conducting a service inquiry or business guidance, specifically for example: "Hello, may I ask what business you would like to handle?"
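Purely as an illustrative sketch of how the three modules might be wired together (class and method names are hypothetical, not from the patent):

    class HumanComputerInteractionDevice:
        """Illustrative wiring of the device in Fig. 6: extraction
        module -> determination module -> control module."""

        def __init__(self, extractor, determiner, controller):
            self.extractor = extractor    # extraction module 601
            self.determiner = determiner  # determination module 602
            self.controller = controller  # control module 603

        def run_once(self, recognized_objects):
            # 601: extract physiological/behavioral features per object.
            features = self.extractor.extract(recognized_objects)
            # 602: filter candidates and pick the target interactive object.
            target = self.determiner.determine(features)
            # 603: move toward the target and respond according to identity.
            if target is not None:
                self.controller.respond(target)
            return target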
It should be noted that these are merely examples and do not limit the technical solution or scope of protection of the present application. In practical applications, those skilled in the art can make reasonable arrangements according to actual needs; no limitation is imposed here.
For technical details not elaborated in this embodiment, refer to the human-computer interaction method provided in any embodiment of the present application; they are not described again here.
From the description above it is easy to see that in the human-computer interaction device provided in this embodiment, the extraction module extracts the biometric information of at least one recognized object, the determination module determines from the at least one object, according to the biometric information, the target interactive object needing to interact, and the control module then controls the robot to make a response matched to the target interactive object. Through the cooperation of these modules, a robot equipped with this human-computer interaction device can respond only to objects that need to interact, effectively avoiding erroneous responses and greatly improving the user experience.
The device embodiment described above is merely exemplary and does not limit the scope of protection of the present application. In practical applications, those skilled in the art may select some or all of its modules according to actual needs to achieve the purpose of this embodiment's solution; no limitation is imposed here.
The fifth embodiment of the present application relates to a robot, the specific structure of which is shown in Fig. 7.
The robot may be an intelligent machine device located in a public venue such as a bank business hall, a large shopping mall, or an airport. It specifically includes one or more processors 701 and a memory 702; Fig. 7 takes one processor 701 as the example.
In this embodiment, the function modules of the human-computer interaction device involved in the embodiments above are deployed on the processor 701. The processor 701 and the memory 702 may be connected by a bus or in other ways; Fig. 7 takes a bus connection as the example.
The memory 702, as a computer-readable storage medium, can be used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the human-computer interaction method involved in any method embodiment of the present application. By running the software programs, instructions, and modules stored in the memory 702, the processor 701 executes the various functional applications and data processing of the server, i.e., implements the human-computer interaction method involved in any method embodiment of the present application.
The memory 702 may include a program storage area and a data storage area, where the program storage area can store an operating system and the application programs required by at least one function, and the data storage area can hold a history database for storing the priority-setting condition and the like. In addition, the memory 702 may include high-speed random access memory (RAM) and may also include other readable and writable memory. In some embodiments, the memory 702 optionally includes memory located remotely from the processor 701, and such remote memory may be connected to the terminal device through a network. Examples of such networks include, but are not limited to, the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
In practical applications, instructions executable by the at least one processor 701 can be stored in the memory 702. When executed by the at least one processor 701, the instructions enable the at least one processor 701 to perform the human-computer interaction method involved in any method embodiment of the present application, controlling the function modules in the human-computer interaction device to complete the operations of the human-computer interaction method. For technical details not elaborated in this embodiment, refer to the human-computer interaction method provided in any embodiment of the present application.
The sixth embodiment of the present application relates to a computer-readable storage medium storing computer instructions that enable a computer to execute the human-computer interaction method involved in any method embodiment of the present application.
Those skilled in the art will understand that the embodiments described above are specific embodiments for implementing the present application, and that in practical applications various changes may be made to them in form and detail without departing from the spirit and scope of the present application.

Claims (10)

1. A human-computer interaction method, applied to a robot, the human-computer interaction method comprising:
extracting biometric information of at least one recognized object, wherein the biometric information comprises physiological feature information and/or behavioral feature information;
determining, from the at least one object according to the biometric information, a target interactive object that needs to interact; and
controlling the robot to make a response matched to the target interactive object.
2. The human-computer interaction method according to claim 1, wherein the extracting of the biometric information of at least one recognized object specifically comprises:
upon detecting at least one object approaching the robot within a preset range centered on the robot's current position, extracting the biometric information of the at least one object.
3. The human-computer interaction method according to claim 1 or 2, wherein the extracting of the biometric information of at least one recognized object specifically comprises:
controlling the robot to perform image acquisition, and extracting biological features of the at least one object from the acquired images to obtain the physiological feature information and/or behavioral feature information of the at least one object, wherein the physiological feature information comprises facial information and/or eye information, and the behavioral feature information comprises displacement information;
and/or controlling the robot to perform voice acquisition, and extracting biological features of the at least one object from the acquired voice to obtain the physiological feature information and/or behavioral feature information of the at least one object, wherein the physiological feature information comprises voiceprint information, and the behavioral feature information comprises speech content information.
4. The human-computer interaction method according to any one of claims 1 to 3, wherein the determining, from the at least one object according to the biometric information, of the target interactive object that needs to interact specifically comprises:
determining, according to the biometric information, that the at least one object is a candidate interactive object;
if the number of candidate interactive objects equals 1, determining that said candidate interactive object is the target interactive object; and
if the number of candidate interactive objects is greater than 1, setting a priority for each candidate interactive object according to a preset priority-setting condition, and determining that the candidate interactive object with the highest priority is the target interactive object.
5. The human-computer interaction method according to any one of claims 1 to 4, wherein the controlling of the robot to make a response matched to the target interactive object specifically comprises:
obtaining location information of the target interactive object; and
controlling, according to the location information, the robot to move toward the target interactive object.
6. The human-computer interaction method according to claim 5, wherein the controlling of the robot to make a response matched to the target interactive object specifically comprises:
obtaining identity information of the target interactive object; and
after the robot has moved to the region where the target interactive object is located, making a response matched to the target interactive object according to the identity information.
7. The human-computer interaction method according to claim 6, wherein after the controlling of the robot to make a response matched to the target interactive object, the human-computer interaction method further comprises:
determining that a new object is approaching the robot;
extracting biometric information of the new object, and re-determining, from the new object and the objects among the at least one object other than the target interactive object, a target interactive object that needs to interact; and
controlling the robot to make a response matched to the re-determined target interactive object.
8. A human-computer interaction device, applied to a robot, the human-computer interaction device comprising an extraction module, a determination module, and a control module, wherein:
the extraction module is configured to extract biometric information of at least one recognized object, wherein the biometric information comprises physiological feature information and/or behavioral feature information;
the determination module is configured to determine, from the at least one object according to the biometric information, a target interactive object that needs to interact; and
the control module is configured to control the robot to make a response matched to the target interactive object.
9. A robot, comprising:
at least one processor; and
a memory communicatively connected to the at least one processor, wherein
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to perform the human-computer interaction method according to any one of claims 1 to 7.
10. A computer-readable storage medium storing computer instructions, wherein the computer instructions are configured to cause a computer to perform the human-computer interaction method according to any one of claims 1 to 7.
CN201880001295.0A 2018-02-05 2018-02-05 Human-computer interaction method and device, robot and computer readable storage medium Pending CN108780361A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/075263 WO2019148491A1 (en) 2018-02-05 2018-02-05 Human-computer interaction method and device, robot, and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN108780361A true CN108780361A (en) 2018-11-09

Family

ID=64029123

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880001295.0A Pending CN108780361A (en) 2018-02-05 2018-02-05 Human-computer interaction method and device, robot and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN108780361A (en)
WO (1) WO2019148491A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110716634A (en) * 2019-08-28 2020-01-21 北京市商汤科技开发有限公司 Interaction method, device, equipment and display equipment
CN114633267A (en) * 2022-03-17 2022-06-17 上海擎朗智能科技有限公司 Interactive content determination method, mobile equipment, device and storage medium

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011143523A2 (en) * 2010-05-13 2011-11-17 Alexander Poltorak Electronic personal interactive device
CN104936091A (en) * 2015-05-14 2015-09-23 科大讯飞股份有限公司 Intelligent interaction method and system based on circle microphone array
CN105701447A (en) * 2015-12-30 2016-06-22 上海智臻智能网络科技股份有限公司 Guest-greeting robot
CN105843118A (en) * 2016-03-25 2016-08-10 北京光年无限科技有限公司 Robot interacting method and robot system
CN106113038A (en) * 2016-07-08 2016-11-16 纳恩博(北京)科技有限公司 Mode switching method based on robot and device
CN106203050A (en) * 2016-07-22 2016-12-07 北京百度网讯科技有限公司 The exchange method of intelligent robot and device
CN106873773A (en) * 2017-01-09 2017-06-20 北京奇虎科技有限公司 Robot interactive control method, server and robot
CN107450729A (en) * 2017-08-10 2017-12-08 上海木爷机器人技术有限公司 Robot interactive method and device

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109062482A (en) * 2018-07-26 2018-12-21 百度在线网络技术(北京)有限公司 Man-machine interaction control method, device, service equipment and storage medium
CN110085225B (en) * 2019-04-24 2024-01-02 北京百度网讯科技有限公司 Voice interaction method and device, intelligent robot and computer readable storage medium
CN110085225A (en) * 2019-04-24 2019-08-02 北京百度网讯科技有限公司 Voice interactive method, device, intelligent robot and computer readable storage medium
CN110228073A (en) * 2019-06-26 2019-09-13 郑州中业科技股份有限公司 Active response formula intelligent robot
CN110465947A (en) * 2019-08-20 2019-11-19 苏州博众机器人有限公司 Multi-modal fusion man-machine interaction method, device, storage medium, terminal and system
CN110465947B (en) * 2019-08-20 2021-07-02 苏州博众机器人有限公司 Multi-mode fusion man-machine interaction method, device, storage medium, terminal and system
CN110689889A (en) * 2019-10-11 2020-01-14 深圳追一科技有限公司 Man-machine interaction method and device, electronic equipment and storage medium
CN110689889B (en) * 2019-10-11 2021-08-17 深圳追一科技有限公司 Man-machine interaction method and device, electronic equipment and storage medium
CN112764950A (en) * 2021-01-27 2021-05-07 上海淇玥信息技术有限公司 Event interaction method and device based on combined behaviors and electronic equipment
CN112764950B (en) * 2021-01-27 2023-05-26 上海淇玥信息技术有限公司 Event interaction method and device based on combined behaviors and electronic equipment
CN115476366B (en) * 2021-06-15 2024-01-09 北京小米移动软件有限公司 Control method, device, control equipment and storage medium for foot robot
CN115476366A (en) * 2021-06-15 2022-12-16 北京小米移动软件有限公司 Control method, device, control equipment and storage medium for foot type robot
CN113486765A (en) * 2021-06-30 2021-10-08 上海商汤临港智能科技有限公司 Gesture interaction method and device, electronic equipment and storage medium
CN113486765B (en) * 2021-06-30 2023-06-16 上海商汤临港智能科技有限公司 Gesture interaction method and device, electronic equipment and storage medium
CN113724454A (en) * 2021-08-25 2021-11-30 上海擎朗智能科技有限公司 Interaction method of mobile equipment, device and storage medium
CN117251048A (en) * 2022-12-06 2023-12-19 北京小米移动软件有限公司 Control method and device of terminal equipment, terminal equipment and storage medium

Also Published As

Publication number Publication date
WO2019148491A1 (en) 2019-08-08

Similar Documents

Publication Publication Date Title
CN108780361A (en) Human-computer interaction method and device, robot and computer readable storage medium
EP4044146A1 (en) Method and apparatus for detecting parking space and direction and angle thereof, device and medium
CN106378780A (en) Robot system and method and server for controlling robot
CN110069994B (en) Face attribute recognition system and method based on face multiple regions
CN108235697B (en) Robot dynamic learning method and system, robot and cloud server
CN108182412A (en) For the method and device of detection image type
CN107545241A (en) Neural network model is trained and biopsy method, device and storage medium
CN112232293A (en) Image processing model training method, image processing method and related equipment
CN109034069A (en) Method and apparatus for generating information
CN109141453A (en) A kind of route guiding method and system
CN114093052A (en) Intelligent inspection method and system suitable for machine room management
CN107609463A (en) Biopsy method, device, equipment and storage medium
CN108090486A (en) Image processing method and device in a kind of game of billiards
WO2023132555A1 (en) Augmented reality-based construction site management method and server
CN109948511A (en) Gesture identification method and device
CN111881740B (en) Face recognition method, device, electronic equipment and medium
CN115640074A (en) Service data processing method and device and intelligent counter terminal
CN113936340A (en) AI model training method and device based on training data acquisition
CN109086725A (en) Hand tracking and machine readable storage medium
CN113696849B (en) Gesture-based vehicle control method, device and storage medium
CN114299546A (en) Method and device for identifying pet identity, storage medium and electronic equipment
CN112802252B (en) Intelligent building safety management method, system and storage medium based on Internet of things
CN107301696A (en) Dynamic access control system and dynamic access control method
CN111872928A (en) Obstacle attribute distinguishing method and system and intelligent robot
CN113076533A (en) Service processing method and device

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210207

Address after: 200245 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant after: Dalu Robot Co.,Ltd.

Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Applicant before: Shenzhen Qianhaida Yunyun Intelligent Technology Co.,Ltd.

TA01 Transfer of patent application right
CB02 Change of applicant information

Address after: 201111 Building 8, No. 207, Zhongqing Road, Minhang District, Shanghai

Applicant after: Dayu robot Co.,Ltd.

Address before: 200245 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant before: Dalu Robot Co.,Ltd.

CB02 Change of applicant information