US11325255B2 - Method for controlling robot and robot device - Google Patents

Method for controlling robot and robot device

Info

Publication number
US11325255B2
US11325255B2 (application US16/668,647)
Authority
US
United States
Prior art keywords
robot
user
target
gaze
acquiring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US16/668,647
Other languages
English (en)
Other versions
US20200061822A1 (en)
Inventor
Lei Luo
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cloudminds Robotics Co Ltd
Original Assignee
Cloudminds Robotics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cloudminds Robotics Co Ltd filed Critical Cloudminds Robotics Co Ltd
Assigned to CLOUDMINDS (SHENZHEN) ROBOTICS SYSTEMS CO., LTD. reassignment CLOUDMINDS (SHENZHEN) ROBOTICS SYSTEMS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: LUO, LEI
Publication of US20200061822A1 publication Critical patent/US20200061822A1/en
Assigned to CLOUDMINDS ROBOTICS CO., LTD. reassignment CLOUDMINDS ROBOTICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CLOUDMINDS (SHENZHEN) ROBOTICS SYSTEMS CO., LTD.
Application granted granted Critical
Publication of US11325255B2 publication Critical patent/US11325255B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1602Programme controls characterised by the control system, structure, architecture
    • B25J9/161Hardware, e.g. neural networks, fuzzy logic, interfaces, processor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/003Controls for manipulators by means of an audio-responsive input
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators
    • B25J13/08Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
    • B25J13/089Determining the position of the robot with reference to its environment
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/16Sound input; Sound output
    • G06F3/167Audio in a user interface, e.g. using voice commands for navigating, audio feedback
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/08Speech classification or search
    • G10L15/18Speech classification or search using natural language modelling
    • G10L15/1815Semantic context, e.g. disambiguation of the recognition hypotheses based on word meaning
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J11/00Manipulators not otherwise provided for
    • B25J11/008Manipulators for service tasks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/0003Home robots, i.e. small robots for domestic use
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40002Camera, robot follows direction movement of operator head, helmet, headstick
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40411Robot assists human in non-industrial environment like home or office
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/08Speech classification or search
    • G10L15/18Speech classification or search using natural language modelling
    • G10L15/1822Parsing for meaning understanding
    • GPHYSICS
    • G10MUSICAL INSTRUMENTS; ACOUSTICS
    • G10LSPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00Speech recognition
    • G10L15/22Procedures used during a speech recognition process, e.g. man-machine dialogue
    • G10L2015/223Execution procedure of a spoken command

Definitions

  • the application of robots is still at a relatively early stage, and robots are mainly used for holding dialogs and chatting with users. Even when some robots are capable of performing other tasks, the experience is not good.
  • FIG. 2 a , FIG. 2 b and FIG. 2 c are top views of a scenario of a method for controlling a robot according to an embodiment of the present application;
  • FIG. 7 is a schematic structural diagram of a robot device according to an embodiment of the present application.
  • a search range may be significantly narrowed, and the robot may acquire the target more easily.
  • an included angle between the user's facing direction (the direction the user faces forward) and the optical axis of the robot is determined by a specific face recognition and analysis algorithm (other algorithms may also be used, as long as this data is obtained). In this way, an absolute direction of the user's facing direction in the current coordinate system may be calculated.
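The conversion described above amounts to adding the face-analysis angle to the robot's own optical-axis heading in the reference coordinate system. A minimal sketch in Python (the counter-clockwise sign convention and degree units are assumptions, not specified in the description):

```python
def absolute_facing_angle(robot_axis_angle_deg: float,
                          face_to_axis_angle_deg: float) -> float:
    """Combine the robot's optical-axis heading (measured from the X axis of
    the reference coordinate system) with the face-analysis estimate of the
    included angle between the user's facing direction and that optical axis.
    Counter-clockwise-positive angles in degrees are assumed here.
    """
    return (robot_axis_angle_deg + face_to_axis_angle_deg) % 360.0
```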
  • the method for controlling a robot includes the following steps:
  • Step 101 A reference coordinate system is established.
  • The robot may be positioned in an indoor environment by the above positioning method. A point in the real space is taken as the coordinate origin (0, 0). Based on indoor positioning, the robot may acquire its position at any time and calculate its coordinates.
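For illustration, the reference coordinate system and the robot's pose in it can be represented as a small record; the `Pose` type and the positioning call mentioned in the comment are hypothetical, shown only to make the later geometry concrete:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Robot pose in the reference coordinate system whose origin (0, 0)
    is a fixed point of the indoor space."""
    x: float        # coordinate along the X axis
    y: float        # coordinate along the Y axis
    heading: float  # optical-axis angle relative to the X axis, in degrees

# The indoor positioning subsystem (hypothetical API) would refresh this pose
# whenever the robot moves, e.g. pose = positioning.read_pose()
```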
  • Step 102 A user's gaze direction indicative of a target article is acquired, a sight line angle of the robot is acquired, a position of the robot is acquired, a linear distance between the robot and the user is acquired, and a gaze plane in the user's gaze direction relative to the reference coordinate system is calculated in real time based on the sight line angle of the robot, the position of the robot and the linear distance between the robot and the user.
  • the gaze plane is a plane perpendicular to the ground, and a straight line of the user's gaze direction is obtained by viewing the gaze plane from the top. That is, the straight line is straight line CB as illustrated in FIG. 2 a and FIG. 2 b .
  • FIG. 2 a is different from FIG. 2 b in that in the reference coordinate system, the robot in FIG. 2 a is located on the right side of the user, and the robot in FIG. 2 b is on the left side of the user.
  • the sight line angle of the robot is the optical axis of the sight line of the robot.
  • the calculating a gaze plane in a user's gaze direction relative to the reference coordinate system in real time based on the sight line angle of the robot, the position of the robot and the linear distance between the robot and the user specifically includes:
  • Coordinates (m, n) of the robot in the reference coordinate system are calculated based on the position of the robot, and an included angle α between the optical axis of the sight line of the robot and the X axis of the reference coordinate system is calculated; an included angle β between the straight line connecting the position of the user and the position of the robot, and the optical axis of the sight line of the robot is calculated based on the sight line angle of the robot, the position of the robot and the position of the user; an included angle between the straight line connecting the position of the user and the position of the robot, and the user's gaze direction is calculated; and the gaze plane of the user's gaze direction relative to the reference coordinate system is calculated in real time based on the included angle α between the optical axis of the sight line of the robot and the X axis of the reference coordinate system, the included angle between the straight line connecting the position of the user and the position of the robot and the user's gaze direction, and the linear distance between the robot and the user.
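The geometry of this step can be sketched as follows; this is an illustrative reconstruction rather than the patent's prescribed formula, assuming counter-clockwise-positive angles, with the left/right cases of FIG. 2a and FIG. 2b handled by the signs the caller passes in:

```python
import math

def gaze_line(m: float, n: float, alpha_deg: float, beta_deg: float,
              gamma_deg: float, d: float):
    """Top view of the gaze plane: return a point on the user's gaze line and
    the line's absolute heading in the reference coordinate system.

    m, n       -- robot coordinates
    alpha_deg  -- angle between the robot's optical axis and the X axis
    beta_deg   -- angle between the robot-to-user line and the optical axis
    gamma_deg  -- angle between the user-to-robot line and the gaze direction
    d          -- linear distance between the robot and the user
    """
    to_user = math.radians(alpha_deg + beta_deg)      # heading of robot -> user
    cx = m + d * math.cos(to_user)                    # user position C
    cy = n + d * math.sin(to_user)
    # Gaze heading: rotate the user -> robot direction (to_user + 180 deg) by gamma.
    gaze_heading = math.radians(alpha_deg + beta_deg + 180.0 + gamma_deg)
    return (cx, cy), gaze_heading

# Top view of the gaze plane (the vertical plane through this line):
#   y - cy = tan(gaze_heading) * (x - cx)
```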
  • the optical axis of the sight line of at least one robot is controlled to deflect from the position of the user to the target indicated by the user, and the optical axis of the sight line upon the deflection is acquired; an included angle α′ between the optical axis of the sight line upon the deflection and the X axis of the reference coordinate system is calculated based on the coordinates of the robot in the reference coordinate system; the coordinates of the robot in the reference coordinate system are calculated based on the position of the robot; a focal distance of the camera upon the deflection is calculated based on the coordinates of the robot in the reference coordinate system, the included angle α′ between the optical axis of the sight line upon the deflection and the X axis of the reference coordinate system, and the gaze plane; and based on the optical axis of the sight line upon the deflection and the focal distance of the camera upon the deflection, the focus of the camera is controlled to fall in a portion of the plane between the position of the user and the target indicated by the user.
  • the embodiment of the present application achieves the following beneficial effects:
  • the method for controlling a robot according to the embodiment of the present application includes: establishing a reference coordinate system; capturing a user's gaze direction towards an indicated target, acquiring a sight line angle of the robot, acquiring a position of the robot, acquiring a linear distance between the robot and the user, and calculating in real time a gaze plane in the user's gaze direction relative to the reference coordinate system based on the sight line angle of the robot, the position of the robot and the linear distance between the robot and the user; and smoothly scanning the gaze plane by at least one robot to search for the indicated target in the user's gaze direction.
  • Step 105 A real-time position and a real-time sight line angle of the robot are acquired by the robot during movement of the robot, an intersection line between the real-time sight line angle of the robot and the gaze plane and a focal distance are determined, and a proximity region of the intersection line is image-scanned based on the focal distance until the indicated target gazed by the user in the gaze plane is identified.
  • Solved coordinates (x, y) are coordinates of intersection H between the real-time optical axis of the sight line of the camera of the robot and the straight line of the user's gaze direction.
  • the calculation is carried out in real time as the robot moves or the camera rotates, for example, 30 times per second (an actual count of times is dependent on a movement or rotation speed of the robot, and the higher the calculation frequency, the more accurate the calculation; however, the calculation load also increases, and the specific calculation frequency is not defined herein).
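One possible shape of this real-time computation is sketched below: the camera's real-time optical axis through A′ = (p, q) is intersected with the top-view gaze line to obtain the point H and the focal distance A′H. The parametric-line formulation and the function names are assumptions for illustration:

```python
import math

def intersect_with_gaze_line(p, q, axis_deg, cx, cy, gaze_heading_rad):
    """Intersect the camera's real-time optical axis (through A' = (p, q) at
    angle axis_deg to the X axis) with the gaze line through (cx, cy).
    Returns (H, focal distance A'H), or None if the lines are parallel.
    """
    ax, ay = math.cos(math.radians(axis_deg)), math.sin(math.radians(axis_deg))
    gx, gy = math.cos(gaze_heading_rad), math.sin(gaze_heading_rad)
    denom = ax * gy - ay * gx
    if abs(denom) < 1e-9:                 # optical axis parallel to the gaze line
        return None
    # Solve A' + t*(ax, ay) == C + s*(gx, gy) for t.
    t = ((cx - p) * gy - (cy - q) * gx) / denom
    x, y = p + t * ax, q + t * ay
    return (x, y), math.hypot(x - p, y - q)   # H and A'H

# Called repeatedly (e.g. about 30 times per second) while the robot moves or the
# camera rotates; the returned focal distance is used to focus the camera on the
# neighbourhood of H and image-scan it for the indicated target.
```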
  • the process of determining the intersection line between the real-time sight line angle of the robot and the gaze plane, and the focal distance, and the process of identifying the indicated target gazed by the user in the gaze plane and moving towards the indicated target to acquire the indicated target are separately controlled.
  • the robot is controlled to move towards the indicated target, such that the indicated target is acquired.
  • the position of the robot is acquired, the coordinates of the robot are calculated, and the straight line between the robot and the user is acquired.
  • a walking route of the robot is planned, and the indicated target is taken to the user.
  • the process of determining the intersection line between the real-time sight line angle of the robot and the gaze plane, and the focal distance, and the process of identifying the indicated target gazed by the user in the gaze plane and moving towards the indicated target to acquire the indicated target may also be controlled in a centralized fashion.
  • the robot image-scans the proximity region of the intersection line based on the focal distance until the indicated target gazed by the user in the gaze plane is identified, and then the robot is controlled to move towards the position where the focal distance (a focal distance for focusing the indicated target) is 0. In this way, the robot reaches the indicated target.
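A minimal sketch of that approach behaviour, with `robot.distance_to` and `robot.step_towards` as hypothetical motion primitives (obstacle avoidance and grasping are out of scope):

```python
def approach_target(robot, target_xy, stop_distance=0.05):
    """Drive toward the identified target until the focusing distance to it is
    effectively zero (below a small threshold, in the same length unit as the
    reference coordinate system).
    """
    while robot.distance_to(target_xy) > stop_distance:
        robot.step_towards(target_xy)
```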
  • the embodiment of the present application achieves the following beneficial effects:
  • the real-time position and the real-time sight line angle of the robot are acquired, and the intersection line of the real-time sight line angle of the robot and the gaze plane, and the focal distance are acquired.
  • the focus of the camera of the robot is controlled to fall in the gaze plane, such that a clear image that may include the indicated target is captured, wherein the image is specifically an image captured for the proximity region of the intersection line.
  • the robot may image-scan the proximity region of the intersection line based on the focal distance, until the indicated target gazed by the user in the gaze plane is identified.
  • the desired parameters may all be acquired in real time, the indicated target may be quickly and accurately identified, and the user experience is better.
  • an angle β between the user and the optical axis of the sight line of the robot is acquired.
  • an included angle between the direction in which the user faces and the optical axis may not be determined by face recognition and analysis.
  • the robot first needs to move to the front or the side of the user, to a position from which it is capable of seeing the face of the user (it is also possible that the user needs to re-issue an instruction, and the robot needs to re-determine the included angle between the direction in which the user faces and the optical axis of the sight line).
  • alternatively, the included angle between the direction which the user's face points towards and the optical axis may also be determined from behind the user by image recognition based on other algorithms; or, if other robots around the user are capable of acquiring the linear equation of the straight line connecting the position of the user and the position of the robot, the included angle and/or the linear equation may likewise be synchronized to the robot.
  • the method for controlling a robot further includes the following step:
  • Step 104 The robot includes a networking module configured to network with other robots under the same coordinate system, wherein the networking module is configured to share the data of the gaze plane.
  • the embodiment of the present application achieves the following beneficial effects:
  • the robot is networked with the other robots under the same coordinate system, and the robots share the data of the gaze plane; based on the states of the current processing tasks of the plurality of robots, and considering the positions of the robots, their facing angles and the like, a robot control server may, by calculation or analysis, control one of the robots to search for the indicated target in the user's gaze direction, such that the plurality of robots operate collaboratively.
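One way such a robot control server could pick a robot is sketched below; the idle-first, nearest-to-the-gaze-plane policy and the `is_idle`/`distance_to_plane` helpers are illustrative assumptions rather than the patent's rule:

```python
def choose_robot(robots, gaze_plane):
    """Pick one of the networked robots (all positioned in the same reference
    coordinate system and sharing the gaze-plane data) to search for the
    indicated target: prefer idle robots, then the one closest to the plane.
    """
    candidates = [r for r in robots if r.is_idle()] or list(robots)
    return min(candidates, key=lambda r: r.distance_to_plane(gaze_plane))
```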
  • the same coordinate system is the reference coordinate system.
  • a target directory for storing target characteristic information may be set on the robot control server or a memory of the robot.
  • the target characteristic information may be various attributes of the target, for example, name, position, belonging, shape, size, color, price, preference degree of the user, purchase way, purchase reason and the like.
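A target directory entry could be modelled as a simple record keyed by the target name; the field names below mirror the attributes listed above, and the structure itself is only an illustrative assumption:

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

@dataclass
class TargetInfo:
    """One entry of the target directory."""
    name: str
    position: Optional[Tuple[float, float]] = None  # reference-frame coordinates
    owner: Optional[str] = None                     # "belonging"
    shape: Optional[str] = None
    size: Optional[str] = None
    color: Optional[str] = None
    price: Optional[float] = None
    preference: Optional[str] = None                # how much the user likes it
    purchase_way: Optional[str] = None
    purchase_reason: Optional[str] = None

# The directory may live on the robot control server or in the robot's memory.
target_directory: Dict[str, TargetInfo] = {}
```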
  • the robot may have a smarter dialog with the user. For example, when two persons are talking about a ceramic vase used for decoration in a house, the owner may say to a guest, “How about this? I really like it,” while gazing at the vase, and the guest may answer, “This vase is really beautiful, pretty good; where did you buy it?” In a conventional approach, the robot may hear the dialog between the user and the guest, but may not know for certain which article they are talking about. As such, the collected voice information is not useful.
  • the robot knows, by vision, the article that the user is talking about, and associates the voice information with the visually identified article, so as to know that the user likes this vase very much, where the user purchased the vase, how much it cost, and the like. In this way, the robot knows more about the user, and thus may provide better and smarter services for the user.
  • dialogs between the robot and the user, or discussions between the robot and multiple persons, are contemplated in the embodiments.
  • the method for controlling a robot further includes the following step:
  • Step 108 After the robot judges that the user is chatting, voice data and video data of the user's chatting are collected, subject information of the voice data and the video data of the user's chatting is identified, matching is performed between the updated target characteristic information and the subject information, and voice and video communications with the user are completed based on a matching result.
  • the robot judges that the user is chatting with the robot. For example, when the robot captures that the sight line of the user falls on the head of the robot, or the user gazes at the robot and calls the robot's name or the like, it is determined that the user is chatting with the robot.
  • the robot first acquires voice data and video data of the user's chatting, identifies subject information of the voice data and the video data of the user's chatting, calls the target characteristic information in the updated target directory, and matches the updated target characteristic information with the subject information to output content communicated between the robot and the user, wherein voices and actions may be output.
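In the simplest case, the matching between the chat's subject information and the stored target characteristic information could be a keyword-overlap score; the toy matcher below only illustrates that step (a real system would use proper speech and language understanding):

```python
def match_subject(subject_keywords, target_directory):
    """Score each target-directory entry by how many extracted subject keywords
    appear in its stored attributes, and return the best match or None.
    """
    def score(info):
        attrs = " ".join(str(v) for v in vars(info).values() if v is not None).lower()
        return sum(1 for kw in subject_keywords if kw.lower() in attrs)

    best = max(target_directory.values(), key=score, default=None)
    return best if best is not None and score(best) > 0 else None
```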
  • the robot is smarter, and may provide more ideal and smarter services for the user, and thus the user experience may be enhanced.
  • An embodiment of the present application further provides a robot apparatus 400 .
  • the robot apparatus 400 includes a coordinate system establishing module 401 , a capturing and calculating module 402 and a scanning module 403 .
  • the coordinate system establishing module 401 is configured to establish a reference coordinate system.
  • the scanning module 403 is configured to smoothly scan the gaze plane by the robot to search for the indicated target in the user's gaze direction.
  • the robot smoothly scans the gaze plane to search for the indicated target, such that the search range is small, the search failure rate is low, and the user experience is better.
  • the robot fails to acquire the line of sight of the user.
  • the robot may capture the user's gaze direction, and calculate in real time the gaze plane of the gaze direction relative to the reference coordinate system. By scanning the gaze plane, the robot may acquire what target the user gazes, and the success rate of searching for the target is improved, and the user experience is enhanced.
  • a determining and identifying module 405 configured to acquire a real-time position and a real-time sight line angle of the robot by the robot during movement of the robot, determine an intersection line between the real-time sight line angle of the robot and the gaze plane, and a focal distance, and image-scan a proximity region of the intersection line based on the focal distance until the indicated target gazed by the user in the gaze plane is identified.
  • the sight line angle of the robot is an optical axis of the sight line of the robot
  • the capturing and calculating module 402 is further configured to:
  • the gaze plane is a plane perpendicular to the ground, and a straight line of the user's gaze direction is obtained by viewing the gaze plane from the top.
  • d is a linear distance between the robot and the user.
  • the real-time sight line angle of the robot is a real-time optical axis of the sight line of the robot
  • the determining and identifying module 405 is further configured to:
  • A′H is the focal distance.
  • the linear equation of the straight line of A′H and the equation of the straight line of the user's gaze direction are solved as simultaneous equations, and the solved coordinates (x, y) are the coordinates of the intersection H between the real-time optical axis of the sight line of the camera of the robot and the straight line of the user's gaze direction.
  • A′H, which is the focal distance, is calculated based on the coordinates of point H and the coordinates of point A′.
  • A′H = √((x − p)² + (y − q)²)
  • the embodiment of the present application achieves the following beneficial effects:
  • the robot is networked with the other robots under the same coordinate system, and the robots share the data of the gaze plane; based on the states of the current processing tasks of the plurality of robots, and considering the positions of the robots, their facing angles and the like, a robot control server may, by calculation or analysis, control one of the robots to search for the indicated target in the user's gaze direction, such that the plurality of robots operate collaboratively. In this way, the operating efficiency is improved, the user requirements are quickly satisfied, and the user experience is enhanced.
  • the same coordinate system is the reference coordinate system.
  • the robot may perform further operations.
  • the robot apparatus 400 further includes:
  • an extracting and storing module 406 configured to extract target characteristic information of the indicated target, and store the target characteristic information to a target directory.
  • the target characteristic information of the indicated target may be specifically extracted by the graph search method, or extracted by video characteristics, or extracted by voice data of the user.
  • the robot apparatus 400 further includes:
  • the robot apparatus 400 may better help the robot to collect information when the robot has dialogs with two or more people.
  • the identifying and matching module 407 is further configured to acquire and identify the key information of the voice data and the video data of the user, and match the target characteristic information with the key information to find the associated key information and store and update the same.
  • the robot knows, by audition or vision, the article that the user is talking about, and associates the voice information with the article to find the associated key information. That is, the robot knows more about the user, such that the robot is smarter.
  • the robot apparatus 400 further includes:
  • a matching and communicating module 408 configured to: after the robot judges that the user is chatting, collect voice data and video data of the user's chatting, identify subject information of the voice data and the video data of the user's chatting, match the updated target characteristic information with the subject information, and carry out voice and video communications with the user based on a matching result.
  • When the robot has a voice or video communication with the user, the robot first acquires voice data and video data of the user's chatting, identifies subject information of the voice data and the video data of the user's chatting, calls the target characteristic information in the updated target directory, and matches the updated target characteristic information with the subject information to output content communicated between the robot and the user, wherein voices and actions may be output.
  • the robot is smarter, and thus the user experience may be enhanced.
  • FIG. 7 is a schematic structural diagram of hardware of a robot device according to an embodiment of the present application.
  • the robot device may be any robot device 800 suitable for performing the method for controlling a robot.
  • the robot device 800 may be further provided with one or a plurality of power assemblies configured to drive the robot to move along a specific trajectory.
  • the device includes at least one processor 810 and a memory 820 , and FIG. 7 uses one processor 810 as an example.
  • the at least one processor 810 and the memory 820 may be connected via a bus or in another fashion, and FIG. 7 uses the bus as an example.
  • the memory 820 may be configured to store non-volatile software programs, non-volatile computer-executable programs and modules, for example, the program instructions/modules (for example, the coordinate system establishing module 401 , the capturing and calculating module 402 and the scanning module 403 as illustrated in FIG. 5 , the coordinate system establishing module 401 , the capturing and calculating module 402 , the scanning module 403 , the determining and identifying module 405 , the extracting and storing module 406 , the identifying and matching module 407 , and the matching and communicating module 408 as illustrated in FIG. 6 ) corresponding to the methods for controlling a robot in the embodiments of the present application.
  • the non-volatile software programs, instructions and modules stored in the memory 820, when being executed, cause the processor 810 to perform various function applications and data processing of a server, that is, performing the method for controlling a robot in the above method embodiments.
  • the one or more modules are stored in the memory 820 , and when being executed by the at least one processor 810 , perform the method for controlling a robot in any of the above method embodiments.
  • the computer software program may be stored in a computer readable storage medium, wherein the computer software program, when being executed, may perform the steps and processes according to the above method embodiments.
  • the storage medium may be any medium capable of storing program codes, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or a compact disc read-only memory (CD-ROM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Acoustics & Sound (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Automation & Control Theory (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Ophthalmology & Optometry (AREA)
  • Manipulator (AREA)
  • User Interface Of Digital Computer (AREA)
US16/668,647 2017-04-21 2019-10-30 Method for controlling robot and robot device Active 2038-07-21 US11325255B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/081484 WO2018191970A1 (fr) 2017-04-21 2017-04-21 Procédé de commande de robot, appareil robot et dispositif robot

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/081484 Continuation WO2018191970A1 (fr) 2017-04-21 2017-04-21 Procédé de commande de robot, appareil robot et dispositif robot

Publications (2)

Publication Number Publication Date
US20200061822A1 US20200061822A1 (en) 2020-02-27
US11325255B2 true US11325255B2 (en) 2022-05-10

Family

ID=59953867

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/668,647 Active 2038-07-21 US11325255B2 (en) 2017-04-21 2019-10-30 Method for controlling robot and robot device

Country Status (4)

Country Link
US (1) US11325255B2 (fr)
JP (1) JP6893607B2 (fr)
CN (1) CN107223082B (fr)
WO (1) WO2018191970A1 (fr)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6882147B2 (ja) * 2017-11-28 2021-06-02 シュナイダーエレクトリックホールディングス株式会社 操作案内システム
CN109199240B (zh) * 2018-07-24 2023-10-20 深圳市云洁科技有限公司 一种基于手势控制的扫地机器人控制方法及系统
WO2020152778A1 (fr) * 2019-01-22 2020-07-30 本田技研工業株式会社 Corps mobile d'accompagnement
CN109934867B (zh) * 2019-03-11 2021-11-09 达闼机器人有限公司 一种图像讲解的方法、终端和计算机可读存储介质
CN110990594B (zh) * 2019-11-29 2023-07-04 华中科技大学 一种基于自然语言交互的机器人空间认知方法及系统
CN111652103B (zh) * 2020-05-27 2023-09-19 北京百度网讯科技有限公司 室内定位方法、装置、设备以及存储介质
CN111803213B (zh) * 2020-07-07 2022-02-01 武汉联影智融医疗科技有限公司 一种协作式机器人引导定位方法及装置
KR20220021581A (ko) * 2020-08-14 2022-02-22 삼성전자주식회사 로봇 및 이의 제어 방법
CN112507531B (zh) * 2020-11-24 2024-05-07 北京电子工程总体研究所 一种平面空间二对一场景下防守区域扩大方法
CN114566171A (zh) * 2020-11-27 2022-05-31 华为技术有限公司 一种语音唤醒方法及电子设备
CN113359996A (zh) * 2021-08-09 2021-09-07 季华实验室 生活辅助机器人控制系统、方法、装置及电子设备

Patent Citations (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6351573B1 (en) * 1994-01-28 2002-02-26 Schneider Medical Technologies, Inc. Imaging device and method
US5481622A (en) * 1994-03-01 1996-01-02 Rensselaer Polytechnic Institute Eye tracking apparatus and method employing grayscale threshold values
US5912721A (en) * 1996-03-13 1999-06-15 Kabushiki Kaisha Toshiba Gaze detection apparatus and its method as well as information display apparatus
US6118888A (en) * 1997-02-28 2000-09-12 Kabushiki Kaisha Toshiba Multi-modal interface apparatus and method
US7043056B2 (en) * 2000-07-24 2006-05-09 Seeing Machines Pty Ltd Facial image processing system
JP2005028468A (ja) 2003-07-08 2005-02-03 National Institute Of Advanced Industrial & Technology ロボットの視覚座標系位置姿勢同定方法、座標変換方法および装置
US7809160B2 (en) * 2003-11-14 2010-10-05 Queen's University At Kingston Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections
US20050175218A1 (en) * 2003-11-14 2005-08-11 Roel Vertegaal Method and apparatus for calibration-free eye tracking using multiple glints or surface reflections
JP2006003263A (ja) 2004-06-18 2006-01-05 Hitachi Ltd 視覚情報処理装置および適用システム
WO2008007781A1 (fr) 2006-07-14 2008-01-17 Panasonic Corporation Dispositif de détection de la direction d'axe visuel et procédé de détection de la direction de ligne visuelle
JP2008254122A (ja) 2007-04-05 2008-10-23 Honda Motor Co Ltd ロボット
JP2009223172A (ja) 2008-03-18 2009-10-01 Advanced Telecommunication Research Institute International 物品推定システム
US8041456B1 (en) * 2008-10-22 2011-10-18 Anybots, Inc. Self-balancing robot including an ultracapacitor power source
JP2010112979A (ja) 2008-11-04 2010-05-20 Advanced Telecommunication Research Institute International インタラクティブ看板システム
US20120038739A1 (en) * 2009-03-06 2012-02-16 Gregory Francis Welch Methods, systems, and computer readable media for shader-lamps based physical avatars of real and virtual people
CN101576384A (zh) 2009-06-18 2009-11-11 北京航空航天大学 一种基于视觉信息校正的室内移动机器人实时导航方法
US9237844B2 (en) * 2010-03-22 2016-01-19 Koninklijke Philips N.V. System and method for tracking the point of gaze of an observer
JP5891553B2 (ja) 2011-03-01 2016-03-23 株式会社国際電気通信基礎技術研究所 ルートパースペクティブモデル構築方法およびロボット
CN102323817A (zh) 2011-06-07 2012-01-18 上海大学 一种服务机器人控制平台系统及其多模式智能交互与智能行为的实现方法
CN102830793A (zh) 2011-06-16 2012-12-19 北京三星通信技术研究有限公司 视线跟踪方法和设备
US8885882B1 (en) * 2011-07-14 2014-11-11 The Research Foundation For The State University Of New York Real time eye tracking for human computer interaction
US20130083976A1 (en) 2011-10-03 2013-04-04 Qualcomm Incorporated Image-based head position tracking method and system
CN104685541A (zh) 2012-09-17 2015-06-03 感官运动仪器创新传感器有限公司 用于确定三维对象上注视点的方法和装置
CN102915039A (zh) 2012-11-09 2013-02-06 河海大学常州校区 一种仿动物空间认知的多机器人联合目标搜寻方法
CN103170980A (zh) 2013-03-11 2013-06-26 常州铭赛机器人科技有限公司 一种家用服务机器人的定位系统及定位方法
CN103264393A (zh) 2013-05-22 2013-08-28 常州铭赛机器人科技有限公司 家用服务机器人的使用方法
CN103761519A (zh) 2013-12-20 2014-04-30 哈尔滨工业大学深圳研究生院 一种基于自适应校准的非接触式视线追踪方法
JP2015163415A (ja) 2014-02-28 2015-09-10 三井不動産株式会社 ロボット制御システム、ロボット制御サーバ及びロボット制御プログラム
US20170102768A1 (en) 2014-03-26 2017-04-13 Microsoft Technology Licensing, Llc Eye gaze tracking using binocular fixation constraints
JP2015197329A (ja) 2014-03-31 2015-11-09 三菱重工業株式会社 データ伝送システム、データ伝送装置、データ伝送方法、及びデータ伝送プログラム
US10157313B1 (en) * 2014-09-19 2018-12-18 Colorado School Of Mines 3D gaze control of robot for navigation and object manipulation
US20160114488A1 (en) 2014-10-24 2016-04-28 Fellow Robots, Inc. Customer service robot and related systems and methods
JP2016166952A (ja) 2015-03-09 2016-09-15 株式会社国際電気通信基礎技術研究所 コミュニケーションシステム、確認行動決定装置、確認行動決定プログラムおよび確認行動決定方法
CN104951808A (zh) 2015-07-10 2015-09-30 电子科技大学 一种用于机器人交互对象检测的3d视线方向估计方法
CN106294678A (zh) 2016-08-05 2017-01-04 北京光年无限科技有限公司 一种智能机器人的话题发起装置及方法
US11199898B2 (en) * 2018-06-27 2021-12-14 SentiAR, Inc. Gaze based interface for augmented reality environment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
1st Office Action dated Oct. 27, 2020 by the JP Office; Appln. No. 2019-554521.
International Search Report dated Jan. 25, 2018; PCT/CN2017/081484.

Also Published As

Publication number Publication date
JP6893607B2 (ja) 2021-06-23
JP2020520308A (ja) 2020-07-09
WO2018191970A1 (fr) 2018-10-25
CN107223082A (zh) 2017-09-29
US20200061822A1 (en) 2020-02-27
CN107223082B (zh) 2020-05-12

Similar Documents

Publication Publication Date Title
US11325255B2 (en) Method for controlling robot and robot device
US10791409B2 (en) Improving a user experience localizing binaural sound to an AR or VR image
JP4976903B2 (ja) ロボット
JP4460528B2 (ja) 識別対象識別装置およびそれを備えたロボット
US10466777B2 (en) Private real-time communication between meeting attendees during a meeting using one or more augmented reality headsets
US8761933B2 (en) Finding a called party
US20130342652A1 (en) Tracking and following people with a mobile robotic device
CN110032982B (zh) 机器人引路方法、装置、机器人和存储介质
US20090241039A1 (en) System and method for avatar viewing
CN108681399A (zh) 一种设备控制方法、装置、控制设备及存储介质
CN109819400B (zh) 用户位置的查找方法、装置、设备及介质
EP3287745B1 (fr) Procédé et dispositif d'interaction d'informations
JP2004078316A (ja) 姿勢認識装置及び自律ロボット
US20160014180A1 (en) Method and apparatus for processing multi-terminal conference communication
US20150234541A1 (en) Projection method and electronic device
CN106231189A (zh) 拍照处理方法和装置
US20180077356A1 (en) System and method for remotely assisted camera orientation
JP7279646B2 (ja) 情報処理装置、情報処理方法及びプログラム
JP7187768B2 (ja) カメラ装置、カメラ装置制御システム、及びプログラム
JP2018041231A (ja) 接客支援プログラム、接客支援方法、接客支援システムおよび情報処理装置
CN112507829B (zh) 一种多人视频手语翻译方法及系统
Tee et al. Gesture-based attention direction for a telepresence robot: Design and experimental study
EP3994613A1 (fr) Appareil de traitement d'informations, procédé de traitement d'informations et programme
US11875080B2 (en) Object sharing method and apparatus
CN110730378A (zh) 一种信息处理方法及系统

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

AS Assignment

Owner name: CLOUDMINDS (SHENZHEN) ROBOTICS SYSTEMS CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:LUO, LEI;REEL/FRAME:050876/0651

Effective date: 20191015

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

AS Assignment

Owner name: CLOUDMINDS ROBOTICS CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CLOUDMINDS (SHENZHEN) ROBOTICS SYSTEMS CO., LTD.;REEL/FRAME:055620/0516

Effective date: 20210302

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE