CN107315414A - Method, device, and robot for controlling robot walking - Google Patents

Method, device, and robot for controlling robot walking

Info

Publication number
CN107315414A
Authority
CN
China
Prior art keywords
target
robot
information
image
follow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710576609.7A
Other languages
Chinese (zh)
Other versions
CN107315414B (en)
Inventor
齐欧
陈召强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Smart Technology (beijing) Co Ltd
Lingdong Technology Beijing Co Ltd
Original Assignee
Smart Technology (beijing) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smart Technology (beijing) Co Ltd
Priority to CN201710576609.7A
Publication of CN107315414A
Application granted
Publication of CN107315414B
Legal status: Active
Anticipated expiration

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D 1/02 — Control of position or course in two dimensions
    • G05D 1/021 — specially adapted to land vehicles
    • G05D 1/0231 — using optical position detecting means
    • G05D 1/0238 — using obstacle or wall sensors
    • G05D 1/0246 — using a video camera in combination with image processing means

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a method, a device, and a robot for controlling robot walking, the robot including an image collector. The method includes: after receiving an image of the followed target collected by the image collector, determining, according to the target image and the predetermined width of the real environment captured by the image collector, first distance information describing the followed target's horizontal offset relative to the robot; receiving second distance information, sent by the followed target, giving the distance between the robot and the followed target; determining first orientation information between the robot and the followed target according to the first distance information and the second distance information; generating a travel-control instruction according to the second distance information and the first orientation information; and controlling the robot to follow the followed target according to the travel-control instruction. In the present invention, the distance and orientation information so determined is more accurate, so that the robot can follow the followed target accurately.

Description

Method, device, and robot for controlling robot walking
Technical field
The present invention relates to the field of robotics, and in particular to a method, a device, and a robot for controlling robot walking.
Background technology
With the continuous development of science and technology, robots are widely used in daily life, medical treatment, agriculture, industrial production, and other fields. In some of these applications, a robot must track a target.
To track a target, a robot needs to determine, in real time, information such as the distance and orientation between itself and the target. In the prior art, a robot tracking a target is typically fitted with an image collector: the image collector photographs the target, the distance between the robot and the target is estimated from the target's size in the image, and the orientation between them is then calculated from that estimated distance.
However, the distance and orientation information determined in this way has a large error, so the robot cannot follow the target accurately.
Summary of the invention
In view of this, the purpose of the embodiments of the present invention is to provide a method, a device, and a robot for controlling robot walking, so as to solve the prior-art problem that the distance and orientation information determined between the robot and the followed target has a large error, preventing the robot from following the target accurately.
In a first aspect, an embodiment of the present invention provides a method for controlling robot walking, wherein an image collector is provided on the robot, and the method includes:
after receiving an image of the followed target collected by the image collector, determining, according to the target image and the predetermined width of the real environment captured by the image collector, first distance information of the followed target's horizontal offset relative to the robot;
receiving second distance information, sent by the followed target, between the robot and the followed target;
determining first orientation information between the robot and the followed target according to the first distance information and the second distance information;
generating a travel-control instruction according to the second distance information and the first orientation information;
controlling the robot to follow the followed target according to the travel-control instruction.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation of the first aspect, wherein determining, according to the target image and the predetermined width of the real environment captured by the image collector, the first distance information of the followed target's horizontal offset relative to the robot includes:
determining the horizontal offset distance of the followed target relative to the center of the target image, and determining the width of the target image;
calculating the first distance information according to the horizontal offset distance, the width of the target image, and the width of the real environment.
With reference to the first aspect, an embodiment of the present invention provides a second possible implementation of the first aspect, wherein, before the image of the followed target is received, the method further includes:
receiving an environment image, collected by the image collector, of the image collector's coverage area;
determining the followed target according to the environment image.
With reference to the second possible implementation of the first aspect, an embodiment of the present invention provides a third possible implementation of the first aspect, wherein determining the followed target according to the environment image includes:
detecting the number of targets in the environment image;
when only one target is detected in the environment image, determining that target to be the followed target;
when at least two targets are detected in the environment image, determining the followed target according to the distance between each target and the center of the environment image.
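The selection rule above can be sketched as follows. This is a minimal illustration, not the patented implementation: the helper name and the representation of detections as horizontal pixel coordinates are assumptions.

```python
def pick_follow_target(detections, image_center_x):
    """Select the followed target from detected candidates.

    detections: horizontal pixel coordinates of detected targets (an assumed
    representation). A single detection is taken as-is; with several, the one
    closest to the image centre is chosen, as the rule above describes.
    """
    if not detections:
        return None                                   # nothing detected
    if len(detections) == 1:
        return detections[0]                          # only one candidate
    return min(detections, key=lambda x: abs(x - image_center_x))
```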
With reference to the third possible implementation of the first aspect, an embodiment of the present invention provides a fourth possible implementation of the first aspect, wherein, when the followed target cannot be determined from the environment image:
receiving positioning information sent by the followed target, the positioning information including third distance information and second orientation information between the followed target and the robot;
determining the followed target according to the third distance information and the second orientation information between the followed target and the robot.
With reference to the first aspect, an embodiment of the present invention provides a fifth possible implementation of the first aspect, wherein the method further includes:
acquiring status data of the robot, the status data including acceleration data, angular velocity data, and magnetic induction data of the robot;
judging, according to the status data, whether the state of the robot is abnormal;
when the state of the robot is judged abnormal, sending alarm information to the followed target and controlling the robot to stop walking.
With reference to the first aspect, an embodiment of the present invention provides a sixth possible implementation of the first aspect, wherein an obstacle detection component is provided on the robot;
the method further includes:
during walking, acquiring obstacle information, collected by the obstacle detection component, about obstacles in front of the obstacle detection component;
judging, according to the obstacle information, whether the robot can be controlled to avoid the obstacle;
if so, controlling the robot to avoid the obstacle according to the obstacle information;
otherwise, controlling the robot to stop walking and sending a prompt signal to the followed target.
With reference to the sixth possible implementation of the first aspect, an embodiment of the present invention provides a seventh possible implementation of the first aspect, wherein the obstacle detection component includes at least a first obstacle detection sensor, arranged on the left side of the robot's front end, and a second obstacle detection sensor, arranged on the right side of the robot's front end;
controlling the robot to avoid the obstacle according to the obstacle information includes:
when first obstacle information sent by the first obstacle detection sensor is received, controlling the robot to turn right by a first angle;
when second obstacle information sent by the second obstacle detection sensor is received, controlling the robot to turn left by a second angle;
when both the first obstacle information sent by the first obstacle detection sensor and the second obstacle information sent by the second obstacle detection sensor are received, controlling the robot to rotate by a third angle toward the direction of the followed target.
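The three steering rules can be sketched as below. The angle values and the sign convention (positive meaning a right turn) are illustrative assumptions; the patent does not fix concrete values for the first, second, or third angle.

```python
def avoidance_turn(left_obstacle, right_obstacle, target_heading,
                   first_angle=30.0, second_angle=30.0):
    """Turn command (degrees, positive = right) for the claimed rules:
    left sensor fires -> turn right; right sensor fires -> turn left;
    both fire -> rotate toward the followed target's heading."""
    if left_obstacle and right_obstacle:
        return target_heading      # third angle: toward the followed target
    if left_obstacle:
        return first_angle         # obstacle on the left, turn right
    if right_obstacle:
        return -second_angle       # obstacle on the right, turn left
    return 0.0                     # path clear, no turn
```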
With reference to the first aspect, an embodiment of the present invention provides an eighth possible implementation of the first aspect, wherein the followed target is a user;
the method further includes:
during travel, acquiring a gesture image of the user collected by the image collector;
determining gesture information of the user according to the gesture image of the user;
controlling the walking state of the robot according to the gesture information of the user.
With reference to the first aspect, an embodiment of the present invention provides a ninth possible implementation of the first aspect, wherein the followed target is a user who carries a terminal device matched with the robot;
the method further includes:
during travel, receiving a control instruction sent by the user through the terminal device;
controlling the walking state of the robot according to the control instruction.
In a second aspect, an embodiment of the present invention provides a device for controlling robot walking, wherein an image collector is provided on the robot, and the device includes:
a first determining module, configured to determine, after an image of the followed target collected by the image collector is received, the first distance information of the followed target's horizontal offset relative to the robot, according to the target image and the predetermined width of the real environment captured by the image collector;
a receiving module, configured to receive the second distance information, sent by the followed target, between the robot and the followed target;
a second determining module, configured to determine the first orientation information between the robot and the followed target according to the first distance information and the second distance information;
a generating module, configured to generate a travel-control instruction according to the second distance information and the first orientation information;
a control module, configured to control the robot to follow the followed target according to the travel-control instruction.
With reference to the second aspect, an embodiment of the present invention provides a first possible implementation of the second aspect, wherein the first determining module includes:
a first determining unit, configured to determine the horizontal offset distance of the followed target relative to the center of the target image, and to determine the width of the target image;
a calculating unit, configured to calculate the first distance information according to the horizontal offset distance, the width of the target image, and the width of the real environment.
With reference to the second aspect, an embodiment of the present invention provides a second possible implementation of the second aspect, wherein the receiving module is further configured to receive an environment image, collected by the image collector, of the image collector's coverage area;
the device further includes:
a third determining module, configured to determine the followed target according to the environment image.
With reference to the second possible implementation of the second aspect, an embodiment of the present invention provides a third possible implementation of the second aspect, wherein the third determining module includes:
a detecting unit, configured to detect the number of targets in the environment image;
a second determining unit, configured to determine a target to be the followed target when it is the only target detected in the environment image, and further configured, when at least two targets are detected in the environment image, to determine the followed target according to the distance between each target and the center of the environment image.
With reference to the third possible implementation of the second aspect, an embodiment of the present invention provides a fourth possible implementation of the second aspect, wherein the device further includes a fourth determining module;
the receiving module is further configured to receive positioning information sent by the followed target, the positioning information including third distance information and second orientation information between the followed target and the robot;
the fourth determining module is configured to determine the followed target according to the third distance information and the second orientation information between the followed target and the robot.
With reference to the second aspect, an embodiment of the present invention provides a fifth possible implementation of the second aspect, wherein the device further includes:
a first acquiring module, configured to acquire status data of the robot, the status data including acceleration data, angular velocity data, and magnetic induction data of the robot;
a first judging module, configured to judge, according to the status data, whether the state of the robot is abnormal;
the control module being further configured to send alarm information to the followed target and control the robot to stop walking when the state of the robot is judged abnormal.
With reference to the second aspect, an embodiment of the present invention provides a sixth possible implementation of the second aspect, wherein an obstacle detection component is provided on the robot;
the device further includes:
a second acquiring module, configured to acquire, during walking, obstacle information collected by the obstacle detection component about obstacles in front of the obstacle detection component;
a second judging module, configured to judge, according to the obstacle information, whether the robot can be controlled to avoid the obstacle;
the control module being further configured to control the robot to avoid the obstacle according to the obstacle information when the robot can avoid it, and to control the robot to stop walking and send a prompt signal to the followed target when it cannot.
With reference to the sixth possible implementation of the second aspect, an embodiment of the present invention provides a seventh possible implementation of the second aspect, wherein the obstacle detection component includes at least a first obstacle detection sensor, arranged on the left side of the robot's front end, and a second obstacle detection sensor, arranged on the right side of the robot's front end;
the control module includes:
a first control unit, configured to control the robot to turn right by a first angle when first obstacle information sent by the first obstacle detection sensor is received;
a second control unit, configured to control the robot to turn left by a second angle when second obstacle information sent by the second obstacle detection sensor is received;
a third control unit, configured to control the robot to rotate by a third angle toward the direction of the followed target when both the first obstacle information sent by the first obstacle detection sensor and the second obstacle information sent by the second obstacle detection sensor are received.
With reference to the second aspect, an embodiment of the present invention provides an eighth possible implementation of the second aspect, wherein the followed target is a user;
the device further includes:
a third acquiring module, configured to acquire, during travel, a gesture image of the user collected by the image collector;
a fifth determining module, configured to determine gesture information of the user according to the gesture image of the user;
the control module being further configured to control the robot to perform the action corresponding to the user's gesture information.
With reference to the second aspect, an embodiment of the present invention provides a ninth possible implementation of the second aspect, wherein the followed target is a user who carries a terminal device matched with the robot;
the receiving module is further configured to receive, during travel, a control instruction sent by the user through the terminal device;
the control module is further configured to control the robot to perform the action corresponding to the control instruction.
In a third aspect, an embodiment of the present invention provides a robot, including a robot body, an image collector and a transceiver arranged on the robot body, and the device for controlling robot walking according to any one of the implementations of the second aspect.
With reference to the third aspect, an embodiment of the present invention provides a first possible implementation of the third aspect, wherein the robot further includes an obstacle detection component and an attitude sensor.
In the method, device, and robot for controlling robot walking provided by the embodiments of the present invention, the image collector collects an image of the followed target, and the positioner on the followed target sends distance information between the robot and the followed target; together these jointly determine the distance information and first orientation information between the robot and the followed target. The distance and orientation information so determined is more accurate, so that the robot can follow the followed target accurately.
To make the above objects, features, and advantages of the present invention clearer, preferred embodiments are set forth below and described in detail with reference to the accompanying drawings.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention more clearly, the accompanying drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only certain embodiments of the present invention and therefore should not be construed as limiting its scope; those of ordinary skill in the art may derive other related drawings from these drawings without creative effort.
Fig. 1 shows a flow chart of the method for controlling robot walking provided by an embodiment of the present invention;
Fig. 2 shows a flow chart of determining the followed target in the method for controlling robot walking provided by an embodiment of the present invention;
Fig. 3 shows a flow chart of controlling the travel state of the robot in the method for controlling robot walking provided by an embodiment of the present invention;
Fig. 4 shows a flow chart of the method for controlling robot walking provided by a further embodiment of the present invention;
Fig. 5 shows a flow chart of the device for controlling robot walking provided by another embodiment of the present invention;
Fig. 6 shows a schematic structural diagram of the robot provided by another embodiment of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. The components of the embodiments of the present invention, as generally described and illustrated herein and in the drawings, may be arranged and designed in a variety of configurations. Therefore, the following detailed description of the embodiments provided in the drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
An embodiment of the present invention provides a method for controlling robot walking. Referring to Fig. 1, the method includes steps S110-S150, as follows.
S110: after receiving the image of the followed target collected by the image collector, determine, according to the target image and the predetermined width of the real environment captured by the image collector, the first distance information of the followed target's horizontal offset relative to the robot.
The followed target may be a person, or any other target such as another robot or a vehicle.
The method for controlling robot walking provided by the embodiments of the present invention is executed by a controller arranged in the robot.
In the embodiments of the present invention, an image collector is provided on the robot; further, to make it easier for the image collector to capture images of the followed target, the image collector is arranged at the front end of the robot. In this way, while the robot follows the target, the collector can capture images of the target.
Further, in step S110, the first distance information of the followed target's horizontal offset relative to the robot is determined as follows:
determine the horizontal offset distance of the followed target relative to the center of the target image, and determine the width of the target image; then calculate the first distance information from the horizontal offset distance, the width of the target image, and the width of the real environment.
The horizontal offset distance may be a horizontal offset in pixels, and the width of the target image may be its width in pixels.
Specifically, the first distance information is calculated from the horizontal offset distance, the width of the target image, and the width of the real environment by the following formula:

S₂ = x · w_real / w_image

where S₂ is the first distance information, x is the horizontal offset distance of the followed target relative to the center of the target image, w_real is the width of the real environment, and w_image is the width of the target image.
In step S110, after the image of the followed target is received, the features of the followed target are extracted from the image, the position of the target in the image is determined, the position of the image center is determined, and the horizontal offset of the target relative to the image center is calculated.
Further, when calculating the first distance information, if the followed target and the image center are not at the same height in the target image, the target is projected down to the height of the center; the distance between the image center and the point of the target at that height is then calculated and taken as the first distance information.
Further, the width of the real environment captured by the image collector can be calculated as follows.
Before the method provided by the embodiments of the present invention is executed, the vertical distance between the robot and the followed target is determined by the positioner carried on the followed target; this distance is denoted L. According to the horizontal view angle θ of the image collector arranged on the robot (θ taken here as half the full horizontal field of view), the width of the real scene that the image collector can capture under these conditions is calculated by the following formula:

w_real = 2 · L · tan θ
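Combining the two formulas, the first distance can be computed as sketched below. Units are assumed (pixels for image quantities, metres for L), and `half_fov_rad` stands for θ as used above; the function name is illustrative.

```python
import math

def first_distance(x_offset_px, image_width_px, distance_l, half_fov_rad):
    """First distance information: the target's real-world horizontal offset.

    w_real = 2 * L * tan(theta) gives the real scene width, and
    S2 = x * w_real / w_image scales the pixel offset into that width.
    """
    w_real = 2.0 * distance_l * math.tan(half_fov_rad)
    return x_offset_px * w_real / image_width_px
```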
Specifically, in the embodiments of the present invention, the image collector may capture the image of the followed target in real time, or once every preset interval, where the preset interval may be any duration such as 0.1 s or 0.2 s.
S120: receive the second distance information, sent by the followed target, between the robot and the followed target.
The second distance information refers to the second distance value between the robot and the followed target.
In the embodiments of the present invention, a terminal device is carried on the followed target, and a positioner is provided in the terminal device; the positioner may be an ultra-wideband (UWB) positioner.
In the embodiments of the present invention, a transceiver and a communication component are provided in the robot.
The transceiver receives first timestamp data sent by the positioner in the terminal device and, after receiving it, sends second timestamp data back to the positioner. Specifically, the transceiver and the positioner are connected through the communication component, and the positioner determines the distance between the followed target and the robot according to the first timestamp data, the second timestamp data, and the set transmission speed.
The positioner then sends this distance to the transceiver, which passes it on to the device for controlling robot walking.
In addition, a wireless communication module is provided on the robot; through it the controller establishes a communication connection with the terminal device on the followed target and receives the second distance information sent by the terminal device.
The terminal device carried on the followed target can measure the second distance information between the followed target and the robot in real time or periodically, and send it to the robot.
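The timestamp exchange can be sketched as a standard two-way ranging computation. The exact message sequence, the variable names, and the reply-delay handling are assumptions; the patent only states that the two timestamps and the transmission speed determine the distance.

```python
def uwb_distance(t_request_tx, t_reply_rx, reply_delay, speed=299_792_458.0):
    """Two-way ranging sketch: the positioner timestamps its outgoing request
    (t_request_tx) and the robot's reply (t_reply_rx); subtracting the robot's
    known reply-processing delay and halving the remainder gives the one-way
    time of flight, which times the propagation speed is the distance."""
    time_of_flight = (t_reply_rx - t_request_tx - reply_delay) / 2.0
    return time_of_flight * speed
```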
The terminal device may be a wristband, a mobile phone, or the like.
Alternatively, the second distance information between the robot and the followed target can be roughly estimated from the correspondence between the followed target's size in the image and the distance between the robot and the followed target.
S130: determine the first orientation information between the robot and the followed target according to the first distance information and the second distance information.
Specifically, the first orientation information refers to the azimuth between the robot and the followed target, i.e., the angle by which the line between the current robot position and the followed target deviates from the robot's straight-ahead direction.
In a preferred embodiment, the azimuth between the robot and the followed target can be calculated by the following formula:
where α refers to the azimuth between the robot and the followed target, s₁ refers to the second distance between the robot and the followed target, and s₂ refers to the first distance information.
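The formula itself appears only as an image in the source and is not reproduced here. Given the definitions above — s₂ is the horizontal offset of the target relative to the robot and s₁ is the straight-line distance to the target — the azimuth plausibly follows as α = arcsin(s₂/s₁). A minimal sketch under that stated assumption:

```python
import math


def azimuth_deg(s1, s2):
    """Azimuth between the robot's heading and the line to the target.

    ASSUMPTION: s2 (the horizontal offset recovered from the image) is
    treated as the side opposite the azimuth angle and s1 (the UWB
    distance) as the hypotenuse, so sin(alpha) = s2 / s1. The patent's
    original formula is an image and is not reproduced in the text.
    """
    return math.degrees(math.asin(s2 / s1))


# Example: target offset 1 m sideways at 2 m range -> 30 degrees
# off the robot's straight-ahead direction.
alpha = azimuth_deg(2.0, 1.0)
```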
S140: generating a walking control instruction according to the second distance information and the first orientation information.
While walking, the robot keeps the distance to the followed target and the first orientation information unchanged. When the followed target travels straight ahead, the distance between the robot and the target changes with every step the target takes. To keep that distance constant, the robot must walk forward by the same distance. Specifically, a fixed distance between the robot and the followed target can be pre-stored in the controller, and the second distance between the robot and the followed target is collected in real time or periodically; the difference between the second distance information and the fixed distance information is determined as the distance the robot needs to walk. The controller then generates a walking control instruction directing the robot forward, carrying this distance, which equals the distance the followed target has traveled forward.
When the followed target does not walk straight ahead, both the distance between the robot and the target and the first orientation information change with every step the target takes. To keep the distance information and the first orientation information constant, the robot must match the target's walking direction and distance. Specifically, the distance information and the first orientation information between the robot and the followed target are pre-stored in the robot's controller, and the second distance information and the first orientation information are collected in real time or periodically. From the collected second distance information and first orientation information, together with the pre-stored distance and orientation information, the distance and direction the robot needs to walk are determined. The controller then generates a walking control instruction carrying the distance the robot needs to walk and the walking direction.
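The distance- and bearing-keeping behaviour described in the two paragraphs above can be sketched as a simple error-closing step. The fixed follow distance and the function interface are illustrative assumptions, not values from the patent:

```python
FIXED_DISTANCE = 1.5  # pre-stored follow distance in metres (assumed value)


def walking_command(second_distance, azimuth_deg, fixed_azimuth_deg=0.0):
    """Compute (forward_distance, turn_angle) that restores the
    pre-stored distance and orientation to the followed target.

    A positive forward_distance means the robot must advance by that
    many metres; a positive turn_angle (degrees) means it must rotate
    toward the stored bearing.
    """
    forward = second_distance - FIXED_DISTANCE  # close the range error
    turn = azimuth_deg - fixed_azimuth_deg      # close the bearing error
    return forward, turn


# Target stepped 0.4 m ahead and drifted 5 degrees to the side:
cmd = walking_command(1.9, 5.0)
```

Run periodically, each command shrinks both errors toward zero, which is exactly the "keep distance and first orientation unchanged" behaviour the text describes.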
S150: controlling the robot to follow the followed target according to the above walking control instruction.
Specifically, in embodiments of the present invention, before the followed-target image collected by the image collector is received, the followed target also needs to be determined. With reference to Fig. 2, the specific implementation of determining the followed target includes steps S210–S220, as follows:
S210: receiving an environment image, within the coverage range of the image collector, collected by the image collector;
S220: determining the followed target according to the environment image.
Specifically, in embodiments of the present invention, the initial state of the robot is a shutdown state. The controller judges in real time whether a signal input is received; the signal may be input through an external device (such as a bracelet worn by the user) or through a button provided on the robot. When the controller determines that there is a signal input, it controls the robot to enter a standby state from the shutdown state; if no signal is ever input, the robot remains in the shutdown state.
After the robot enters the standby state, the controller controls the image collector to start collecting environment images within the image collector's coverage range and determines the followed target from those images. The detailed process is as follows:
Detect the number of targets in the environment image. When only one target is detected in the environment image, that target is determined as the followed target. When at least two targets are detected in the environment image, the followed target is determined according to the distance between each target in the environment image and the center of the environment image.
The image collector coverage range refers to the pickup area of the image collector.
Further, in embodiments of the present invention, the features of the targets in the environment image are first extracted, and the number of targets in the environment image is judged from the extracted features. If only one target exists in the environment image, that target is determined as the followed target. When at least two targets exist in the environment image, the distance from each target to the center of the environment image is calculated, and the target with the smallest distance is determined as the followed target.
For example, when the controller detects that three targets exist in the environment image, denoted target A, target B, and target C, and the center of the environment image is denoted O, the distances between O and each of targets A, B, and C are calculated, the three distances are compared, and the target closest to point O is determined as the followed target.
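The nearest-to-center selection above can be sketched as follows; the pixel coordinates, image resolution, and target names are hypothetical:

```python
import math


def pick_followed_target(targets, image_size):
    """Pick the detected target closest to the image center.

    targets: dict mapping a target name to its (x, y) pixel position
    image_size: (width, height) of the environment image
    """
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    return min(targets,
               key=lambda n: math.hypot(targets[n][0] - cx,
                                        targets[n][1] - cy))


# Three detected targets in a 640x480 image; B sits nearest the center.
detections = {"A": (100, 100), "B": (330, 250), "C": (600, 400)}
chosen = pick_followed_target(detections, (640, 480))
```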
However, if multiple targets in the environment image happen to be equally distant from the center of the environment image, the followed target cannot be determined from the distances between the targets and the center of the environment image alone.
Therefore, for the above situation in which the followed target cannot be determined, the method provided by embodiments of the present invention further includes:
receiving location information sent by the followed target, the location information including third distance information and second orientation information between the followed target and the robot; and determining the followed target according to the third distance information and the second orientation information between the followed target and the robot.
Specifically, the followed target carries a terminal device matched with the robot, and a UWB positioning device is provided in the terminal device. The positioning device can determine the third distance information and the second orientation information between the terminal device and the robot, and this information can assist in determining the followed target.
For example, suppose that, judging from the distances between each target in the environment image and the center of the environment image, two targets are equally distant from the center, one located to the left of the center and one to the right. In this case, according to the third distance information and the second orientation information between the followed target and the robot, it can be judged with this assistance whether the followed target is the target to the left or the target to the right of the center.
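A minimal sketch of this UWB tie-break. The sign conventions (negative values mean "left") and the candidate representation are illustrative assumptions:

```python
def break_tie(candidates, uwb_bearing_deg):
    """Resolve equally-centered candidates using the UWB bearing.

    candidates: dict mapping a name to the target's signed horizontal
                offset from the image center (negative = left of center,
                positive = right) -- an assumed representation.
    uwb_bearing_deg: second orientation information from the terminal
                device (assumed convention: negative = target is to
                the robot's left).
    Picks the candidate whose side matches the sign of the UWB bearing.
    """
    want_left = uwb_bearing_deg < 0
    for name, offset in candidates.items():
        if (offset < 0) == want_left:
            return name
    return None


# Two targets equidistant from center; UWB says the wearer is to the left.
chosen = break_tie({"left_target": -120, "right_target": 120}, -15.0)
```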
The target may be a person, a robot, or another object.
Further, in embodiments of the present invention, in order to prevent the robot from being carried away while operating, or from sliding backward on an upward slope, the controller may also detect the state of the robot in real time or periodically, specifically including:
obtaining status data of the robot, the status data including acceleration data, angular velocity data, and magnetic induction data of the robot; judging, according to the status data, whether the state of the robot is abnormal; and, when the state of the robot is judged abnormal, sending warning information to the followed target and controlling the robot to stop walking.
Specifically, an abnormal robot state refers to situations such as the robot being carried away or sliding backward on an upward slope. When an abnormal state is found, warning information is sent, through the wireless communication module provided in the robot, to the terminal device carried by the followed target, so that the followed target learns the current state of the robot. In addition, when the robot's state is abnormal, the robot can also be controlled to stop walking, specifically by locking the robot's motor.
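A minimal sketch of such a state check. The thresholds and the specific anomaly signatures (upward acceleration for being lifted; backward drift while the motors are idle for sliding on a slope) are illustrative assumptions, not values from the patent:

```python
LIFT_ACCEL_THRESHOLD = 2.0   # m/s^2 net vertical acceleration (assumed)
SLIDE_SPEED_THRESHOLD = 0.1  # m/s backward drift with motors idle (assumed)


def state_abnormal(vertical_accel, backward_speed, motors_commanded):
    """Flag the robot as abnormal if it appears lifted or sliding.

    vertical_accel: net vertical acceleration after removing gravity,
                    derived from the accelerometer data
    backward_speed: estimated backward ground speed
    motors_commanded: True if the controller is currently driving the motors
    """
    lifted = vertical_accel > LIFT_ACCEL_THRESHOLD
    sliding = (not motors_commanded) and backward_speed > SLIDE_SPEED_THRESHOLD
    return lifted or sliding


assert state_abnormal(3.0, 0.0, False)     # sudden upward jerk: carried away
assert state_abnormal(0.0, 0.5, False)     # rolling back with motors idle
assert not state_abnormal(0.0, 0.0, True)  # normal commanded driving
```

In a real system the fused accelerometer/gyroscope/magnetometer data mentioned in the text would feed these two estimates; here they are taken as given.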
Further, in embodiments of the present invention, an obstacle detection component is also provided in the robot. The obstacle detection component may be arranged at the front end of the robot, to collect obstacle information ahead of the robot while the robot walks. Alternatively, the obstacle detection component may be arranged at the rear of the robot, to detect obstacle information on the retreat route when the robot backs up. Therefore, the method provided by embodiments of the present invention further includes:
during robot walking, obtaining the obstacle information, ahead of the obstacle detection component, collected by the obstacle detection component; judging, according to the obstacle information, whether the robot can be controlled to avoid the obstacle; if so, controlling the robot to avoid the obstacle according to the obstacle information; otherwise, controlling the robot to stop walking and sending a prompt signal to the followed target.
Whether the obstacle detection component is arranged at the front end or the rear end of the robot, what it detects is the obstacle information ahead of the detection component itself.
If the controller judges, from the obstacle information, that the obstacle cannot be avoided, the controller can control the robot to stop walking and send a prompt signal to the followed target, to prompt the followed target that the robot has currently stopped walking because it cannot avoid the obstacle.
Further, when the controller controls the robot to stop walking, this can be achieved by locking the robot's motor.
In the embodiment in which the obstacle detection component is arranged at the front end of the robot, the obstacle detection component includes at least a first obstacle detection sensor and a second obstacle detection sensor; the first obstacle detection sensor is arranged on the left side of the robot's front end, and the second obstacle detection sensor is arranged on the right side of the robot's front end.
The first obstacle detection sensor is used to detect obstacle information ahead of its installed position, and the second obstacle detection sensor is likewise used to detect obstacle information ahead of its installed position.
The obstacle information may include information such as the distance from the obstacle to the robot and the height of the obstacle.
Specifically, controlling the robot to avoid the obstacle according to the obstacle information includes:
when first obstacle information sent by the first obstacle detection sensor is received, controlling the robot to turn right by a first angle;
when second obstacle information sent by the second obstacle detection sensor is received, controlling the robot to turn left by a second angle;
when both the first obstacle information sent by the first obstacle detection sensor and the second obstacle information sent by the second obstacle detection sensor are received, controlling the robot to rotate by a third angle toward the direction of the followed target.
Specifically, if the followed target is to the robot's front left, then when the first and second obstacle detection sensors detect obstacle information simultaneously, the robot is controlled to turn left by the third angle until one of the obstacle detection sensors no longer detects obstacle information. If the followed target is to the robot's front right, then when the first and second obstacle detection sensors detect obstacle information simultaneously, the robot is controlled to turn right by the third angle until one of the obstacle detection sensors no longer detects obstacle information.
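The three avoidance rules above can be sketched as a single decision function. The angle magnitudes and the return convention (positive = turn left) are illustrative assumptions; the branching follows the text:

```python
FIRST_ANGLE = 30   # degrees right when only the left sensor fires (assumed)
SECOND_ANGLE = 30  # degrees left when only the right sensor fires (assumed)
THIRD_ANGLE = 20   # degrees toward the target when both fire (assumed)


def avoidance_turn(left_blocked, right_blocked, target_on_left):
    """Return a signed turn in degrees (positive = left, negative = right).

    left_blocked / right_blocked: whether the first (front-left) and
    second (front-right) obstacle detection sensors report an obstacle.
    target_on_left: whether the followed target is to the robot's front left.
    """
    if left_blocked and right_blocked:
        # Both blocked: rotate toward the followed target's side.
        return THIRD_ANGLE if target_on_left else -THIRD_ANGLE
    if left_blocked:
        return -FIRST_ANGLE   # obstacle at front-left: turn right
    if right_blocked:
        return SECOND_ANGLE   # obstacle at front-right: turn left
    return 0                  # path clear


assert avoidance_turn(True, False, True) == -30
assert avoidance_turn(False, True, True) == 30
assert avoidance_turn(True, True, False) == -20
```

Per the text, the both-blocked case would be applied repeatedly until one sensor clears, rather than as a single fixed rotation.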
When the obstacle detection component is arranged at the rear of the robot, the obstacle detection component is installed in the same manner as above, and obstacle avoidance is performed in the same manner as above.
In one embodiment, the first obstacle detection sensor may be an infrared sensor or an ultrasonic sensor, and the second obstacle detection sensor may be an infrared sensor or an ultrasonic sensor. Specifically, the first and second obstacle detection sensors may be sensors of the same type or of different types.
In addition, the first and second obstacle detection sensors may also be image collectors, which determine whether an obstacle exists in front of the robot by collecting images of the area ahead of the robot.
In embodiments of the present invention, if the first obstacle detection sensor detects obstacle information, this indicates that an obstacle exists at the front-left of the robot; the obstacle can then be avoided by controlling the robot to turn right by the first angle. Similarly, if the second obstacle detection sensor detects obstacle information, this indicates that an obstacle exists at the front-right of the robot; the obstacle can then be avoided by controlling the robot to turn left by the second angle.
The first angle and the second angle may be the same angle or different angles; embodiments of the present invention do not limit the specific sizes of the first angle and the second angle.
Alternatively, the first obstacle detection sensor may be arranged on the right side of the robot's front end and the second obstacle detection sensor on the left side of the robot's front end.
In one embodiment, when the followed target is a user, the method provided by embodiments of the present invention further includes controlling the walking state of the robot. Here, the walking state of the robot refers to states such as the robot pausing, starting to walk, or entering the standby state.
With reference to Fig. 3, in the method provided by embodiments of the present invention, controlling the walking state of the robot is achieved through steps S310–S330, specifically including:
S310: during robot walking, obtaining a gesture image of the user collected by the image collector;
S320: determining gesture information of the user according to the user's gesture image;
S330: controlling the robot, according to the user's gesture information, to perform an action corresponding to the user's gesture information.
Specifically, in embodiments of the present invention, the user's gesture information can be determined by recognizing the user's gesture image.
The gesture information may be the user's gesture type, such as a clenched fist, or may be the position of the user's gesture relative to the user's body, and so on.
When the user's gesture information is the position of the gesture relative to the user's body, the specific implementation of step S330 is as follows:
determining, according to the position of the user's gesture relative to the user's body and a preset correspondence between positions and control instructions, the control instruction corresponding to the position of the user's gesture relative to the user's body; and controlling the robot, according to the control instruction, to perform the action corresponding to the control instruction.
Specifically, the correspondence between different positions and control instructions is pre-stored in the robot's controller. For example, when the user's gesture is located at the user's chest, the corresponding control instruction is for the robot to pause walking.
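A minimal sketch of the position-to-instruction lookup described above. Only the chest → pause pair comes from the text; the other position labels and instruction names are hypothetical placeholders:

```python
# Pre-stored correspondence between gesture positions and control
# instructions. "chest" -> "pause" is from the text; the remaining
# entries are hypothetical examples.
POSITION_TO_INSTRUCTION = {
    "chest": "pause",
    "head": "resume",    # assumed
    "waist": "standby",  # assumed
}


def instruction_for_gesture(position):
    """Look up the control instruction for a gesture position, if any."""
    return POSITION_TO_INSTRUCTION.get(position)


assert instruction_for_gesture("chest") == "pause"
assert instruction_for_gesture("knee") is None  # unmapped position: no action
```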
In another embodiment, if the followed target is a user and the user carries a terminal device matched with the robot, the terminal device may be a bracelet, a mobile phone, or similar equipment. In this case, the walking state of the robot is controlled through the following steps:
during robot walking, receiving a control instruction sent by the user through the terminal device; and controlling the robot, according to the control instruction, to perform the action corresponding to the control instruction.
Further, when the terminal device is a bracelet, at least one button is provided on the bracelet, so that control instructions can be sent to the robot by pressing different buttons on the bracelet, or by pressing the same button a different number of times.
For example, when the user presses a button on the bracelet twice in succession, the bracelet sends a pause instruction to the robot; when the robot is in the paused state, if the user double-clicks the button on the bracelet again, the bracelet sends an instruction to enter the standby state, so that the robot enters the standby state from the paused state.
A further embodiment of the present invention provides a flowchart of a specific method of controlling robot walking; the initial state of the robot is the shutdown state. With reference to Fig. 4, the method specifically includes the following steps:
S401: determining whether there is a signal input; if so, performing step S402;
Specifically, the signal is sent by the terminal device carried on the followed target or through the button provided on the robot, and is used to instruct the robot to enter the standby state from the shutdown state.
S402: controlling the robot to enter the standby state from the shutdown state;
S403: controlling the image collector to collect the environment image within its coverage range;
S404: determining the followed target according to the image;
If no followed target is detected in step S404, the image collector continues to collect images within its coverage range.
S405: collecting the followed-target image through the image collector, and receiving the UWB data sent by the user's terminal;
S406: controlling the robot to follow the followed target according to the followed-target image and the UWB data;
The UWB data includes the second distance information between the robot and the followed target.
S407: while the robot follows the followed target, collecting obstacle information ahead of the robot through the obstacle detection component;
S408: judging whether the obstacle can be bypassed; if so, performing step S409; otherwise, performing step S410;
S409: controlling the robot to bypass the obstacle;
S410: controlling the robot to stop walking;
S411: during robot walking, receiving indication information sent by the followed target (the indication information may be the user's gesture or a control instruction sent by the user through the bracelet);
S412: controlling the walking state of the robot according to the indication information.
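Steps S401–S412 above can be sketched as a small control loop. The sensor interfaces are stubbed with hypothetical callables; only the state transitions follow the flowchart:

```python
def follow_loop(has_signal, find_target, sense_obstacle, can_bypass, steps=3):
    """Trace the S401-S412 flow for a few control cycles.

    has_signal:     S401 - was a start signal received?
    find_target:    S404 - callable returning a target name or None
    sense_obstacle: S407 - callable, True if an obstacle is ahead
    can_bypass:     S408 - callable judging whether the obstacle is avoidable
    Returns the list of actions taken, for inspection.
    """
    log = []
    if not has_signal:
        return log                   # remain shut down (S401)
    log.append("standby")            # S402
    target = None
    while target is None:            # S403-S404: retry until a target is found
        target = find_target()
    log.append(f"follow:{target}")   # S405-S406
    for _ in range(steps):
        if sense_obstacle():         # S407
            if can_bypass():         # S408
                log.append("bypass")  # S409
            else:
                log.append("stop")    # S410
                break
        else:
            log.append("walk")
    return log


trace = follow_loop(True, lambda: "user", lambda: False, lambda: True)
```

S411–S412 (gesture or bracelet overrides) would be handled inside the loop body in a fuller implementation; they are omitted here to keep the sketch minimal.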
In the method of controlling robot walking provided by embodiments of the present invention, the followed-target image collected by the image collector and the robot-to-target distance sent by the positioning device on the followed target jointly determine the distance information and the first orientation information between the robot and the followed target. The distance information and first orientation information so determined are more accurate, so that the robot can accurately follow the followed target.
Based on the same principle as the above method of controlling robot walking, another embodiment of the present invention further provides an apparatus for controlling robot walking, which is used to perform the method of controlling robot walking provided by embodiments of the present invention. An image collector is provided on the robot. With reference to Fig. 5, the apparatus includes a first determining module 510, a receiving module 520, a second determining module 530, a generating module 540, and a control module 550, wherein:
the first determining module 510 is configured, after a followed-target image collected by the image collector is received, to determine first distance information of the followed target's horizontal offset relative to the robot, according to the followed-target image and a predetermined width of the real environment collected by the image collector;
the receiving module 520 is configured to receive the second distance information, sent by the followed target, between the robot and the followed target;
the second determining module 530 is configured to determine the first orientation information between the robot and the followed target according to the first distance information and the second distance information;
the generating module 540 is configured to generate a walking control instruction according to the second distance information and the first orientation information;
the control module 550 is configured to control the robot to follow the followed target according to the walking control instruction.
Further, the first determining module 510 determines the first distance information of the followed target's horizontal offset relative to the robot through a first determining unit and a calculating unit, specifically including:
the first determining unit, configured to determine the horizontal offset distance of the followed target in the followed-target image relative to the center of the followed-target image, and to determine the width of the followed-target image; and the calculating unit, configured to calculate the first distance information according to the horizontal offset distance, the width of the followed-target image, and the width of the real environment.
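Given the calculating unit's inputs, the scaling step is plausibly a simple proportion: the pixel offset relates to the real-world offset as the image width relates to the real-environment width. A sketch under that stated assumption:

```python
def first_distance(pixel_offset, image_width_px, real_env_width_m):
    """Scale a pixel offset from image center into a real-world offset.

    ASSUMPTION: the real horizontal offset is proportional to the
    pixel offset, with the predetermined real-environment width
    mapping onto the full image width. This proportionality is
    inferred from the unit's inputs, not an explicit formula
    given in the patent.
    """
    return pixel_offset / image_width_px * real_env_width_m


# Target 160 px from center in a 640 px image spanning a 4 m scene:
offset_m = first_distance(160, 640, 4.0)
```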
In one embodiment, the receiving module 520 is further configured to receive the environment image, within the coverage range of the image collector, collected by the image collector.
In this embodiment, the apparatus for controlling robot walking further includes a third determining module, specifically configured to determine the followed target according to the environment image received by the receiving module 520.
In one embodiment, the third determining module determines the followed target according to the environment image through a detection unit and a second determining unit, specifically including:
the detection unit, configured to detect the number of targets in the environment image; and the second determining unit, configured, when only one target is detected in the environment image, to determine that target as the followed target, and further configured, when at least two targets are detected in the environment image, to determine the followed target according to the distance between each target in the environment image and the center of the environment image.
In some cases, the followed target cannot be determined only from the distances between the targets in the environment image and the center of the environment image; therefore, the apparatus provided by embodiments of the present invention further includes a fourth determining module;
the receiving module is further configured to receive the location information sent by the followed target, the location information including the third distance information and the second orientation information between the followed target and the robot; and the fourth determining module is configured to determine the followed target according to the third distance information and the second orientation information between the followed target and the robot.
In one embodiment, the apparatus for controlling robot walking further includes a first acquiring module and a first judging module, wherein:
the first acquiring module is configured to obtain status data of the robot, the status data including acceleration data, angular velocity data, and magnetic induction data of the robot; and the first judging module is configured to judge, according to the status data, whether the state of the robot is abnormal;
when the state of the robot is judged abnormal, the control module 550 sends warning information to the followed target and controls the robot to stop walking.
In one embodiment, an obstacle detection component is also provided in the robot; accordingly, the apparatus for controlling robot walking further includes a second acquiring module and a second judging module;
the second acquiring module is configured, during robot walking, to obtain obstacle information, ahead of the obstacle detection component, collected by the obstacle detection component; and the second judging module is configured to judge, according to the obstacle information, whether the robot can be controlled to avoid the obstacle. When it is judged that the robot can avoid the obstacle, the control module 550 controls the robot to avoid the obstacle according to the obstacle information; when it is judged that the robot cannot avoid the obstacle, the control module 550 controls the robot to stop walking and sends a prompt signal to the followed target.
Specifically, the obstacle detection component includes at least a first obstacle detection sensor and a second obstacle detection sensor; the first obstacle detection sensor is arranged on the left side of the robot's front end, and the second obstacle detection sensor is arranged on the right side of the robot's front end. The control module 550 controls the robot to avoid obstacles through a first control unit, a second control unit, and a third control unit, specifically including:
the first control unit, configured, when the first obstacle information sent by the first obstacle detection sensor is received, to control the robot to turn right by a first angle; the second control unit, configured, when the second obstacle information sent by the second obstacle detection sensor is received, to control the robot to turn left by a second angle; and the third control unit, configured, when both the first obstacle information sent by the first obstacle detection sensor and the second obstacle information sent by the second obstacle detection sensor are received, to control the robot to rotate by a third angle toward the direction of the followed target.
In one embodiment, when the followed target is a user, the apparatus for controlling robot walking further includes a third acquiring module and a fifth determining module;
the third acquiring module is configured, during robot walking, to obtain a gesture image of the user collected by the image collector; and the fifth determining module is configured to determine the user's gesture information according to the user's gesture image. The control module 550 then controls the robot, according to the user's gesture information, to perform the action corresponding to the user's gesture information.
In another embodiment, the followed target is a user who carries a terminal device matched with the robot. In this embodiment, the receiving module 520 is further configured to receive, during robot walking, the control instruction sent by the user through the terminal device; and the control module 550 is further configured to control the robot, according to the control instruction, to perform the action corresponding to the control instruction.
In the apparatus for controlling robot walking provided by embodiments of the present invention, the followed-target image collected by the image collector and the robot-to-target distance sent by the positioning device on the followed target jointly determine the distance information and the first orientation information between the robot and the followed target. The distance information and first orientation information so determined are more accurate, so that the robot can accurately follow the followed target.
Another embodiment of the present invention further provides a robot. With reference to Fig. 6, the robot includes a robot body 610, and an image collector 620, an apparatus 630 for controlling robot walking, and a transceiver 640 arranged on the robot body 610;
the image collector 620 and the transceiver 640 are connected with the apparatus 630 for controlling robot walking;
the transceiver 640 is connected with the terminal device carried on the followed target.
Specifically, an obstacle detection component and an attitude sensor are also provided in the robot.
The obstacle detection component may be an ultrasonic obstacle detection device or an infrared obstacle detection device.
The attitude sensor includes an acceleration sensor, an angular velocity sensor, a geomagnetic sensor, and the like.
In the robot provided by embodiments of the present invention, the followed-target image collected by the image collector and the robot-to-target distance sent by the positioning device on the followed target jointly determine the distance information and the first orientation information between the robot and the followed target. The distance information and first orientation information so determined are more accurate, so that the robot can accurately follow the followed target.
The device for the control machine people walking that the embodiment of the present invention is provided can be the specific hardware or peace in equipment Loaded on software or firmware in equipment etc..The technique effect of the device that the embodiment of the present invention is provided, its realization principle and generation Identical with preceding method embodiment, to briefly describe, device embodiment part does not refer to part, refers to preceding method embodiment Middle corresponding contents.It is apparent to those skilled in the art that, for convenience and simplicity of description, described above is The specific work process of system, device and unit, may be referred to the corresponding process in above method embodiment, no longer go to live in the household of one's in-laws on getting married herein State.
In the embodiments provided by the present invention, it should be understood that the disclosed devices and methods may be implemented in other ways. The device embodiments described above are merely illustrative. For example, the division of the units is only a logical functional division, and there may be other division manners in actual implementation. For another example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections of devices or units through some communication interfaces, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network elements. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments provided by the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
If the functions are implemented in the form of software functional units and sold or used as an independent product, they may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part that contributes to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
It should be noted that similar reference numerals and letters denote similar items in the following figures; therefore, once an item is defined in one figure, it need not be further defined or explained in subsequent figures. In addition, the terms "first", "second", "third", and the like are used only to distinguish descriptions and shall not be understood as indicating or implying relative importance.
Finally, it should be noted that the embodiments described above are merely specific embodiments of the present invention, used to illustrate the technical solutions of the present invention rather than to limit them, and the protection scope of the present invention is not limited thereto. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that any person skilled in the art may, within the technical scope disclosed by the present invention, still modify the technical solutions described in the foregoing embodiments, readily conceive of changes, or make equivalent replacements of some technical features. Such modifications, changes, or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the scope of the claims.

Claims (10)

1. A method for controlling robot walking, wherein an image acquisition device is provided on the robot, the method comprising:
after receiving a follow-target image collected by the image acquisition device, determining, according to the follow-target image and a predetermined width of the real environment captured by the image acquisition device, first distance information of a horizontal offset of the follow target relative to the robot;
receiving second distance information between the robot and the follow target sent by the follow target;
determining first orientation information between the robot and the follow target according to the first distance information and the second distance information;
generating a travel control instruction according to the second distance information and the first orientation information;
controlling the robot to follow the follow target according to the travel control instruction.
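Read as an algorithm, claim 1 fuses an image-derived lateral offset (first distance information) with a separately reported range (second distance information) to obtain a bearing. A minimal Python sketch of one such fusion follows; the arcsine relation, the stop range, and all function names are illustrative assumptions, not taken from the patent:

```python
import math

def first_orientation(lateral_offset_m, distance_m):
    """Bearing of the follow target relative to the robot's heading, in degrees.

    lateral_offset_m: first distance information (horizontal offset, metres;
                      negative = target left of centre).
    distance_m:       second distance information (straight-line range, e.g.
                      reported by a positioning tag carried by the target).
    """
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    # Clamp in case sensor noise makes |offset| slightly exceed the range.
    ratio = max(-1.0, min(1.0, lateral_offset_m / distance_m))
    return math.degrees(math.asin(ratio))

def travel_command(lateral_offset_m, distance_m, stop_range_m=0.8):
    """Toy travel-control instruction: (turn_degrees, forward_metres)."""
    turn = first_orientation(lateral_offset_m, distance_m)
    forward = max(0.0, distance_m - stop_range_m)  # keep a standoff distance
    return turn, forward
```

For example, a 1 m lateral offset at a 2 m reported range yields a 30° bearing under this sketch.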
2. The method according to claim 1, wherein the determining, according to the follow-target image and the predetermined width of the real environment captured by the image acquisition device, the first distance information of the horizontal offset of the follow target relative to the robot comprises:
determining a horizontal offset distance of the follow target in the follow-target image relative to the center of the follow-target image, and determining a width of the follow-target image;
calculating the first distance information according to the horizontal offset distance, the width of the follow-target image, and the width of the real environment.
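Claim 2 amounts to a similar-triangles scaling from pixels to metres. A sketch, under the assumption that the "predetermined real-environment width" is the metric width the camera covers at the target's range (names are illustrative):

```python
def horizontal_offset_metres(target_px, image_width_px, real_width_m):
    """First distance information per claim 2.

    target_px:      horizontal pixel position of the follow target in the image.
    image_width_px: width of the follow-target image in pixels.
    real_width_m:   predetermined real-environment width covered by the camera.
    Returns a signed metric offset (negative = target left of centre).
    """
    centre_px = image_width_px / 2.0
    offset_px = target_px - centre_px
    # Scale the signed pixel offset by metres-per-pixel.
    return offset_px * (real_width_m / image_width_px)
```

With a 640-pixel-wide image covering 4 m of real width, a target at pixel 480 is 1 m right of the robot's centreline.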
3. The method according to claim 1, wherein before receiving the follow-target image collected by the image acquisition device, the method further comprises:
receiving an environment image, collected by the image acquisition device, within the coverage of the image acquisition device;
determining the follow target according to the environment image.
4. The method according to claim 3, wherein the determining the follow target according to the environment image comprises:
detecting the number of targets in the environment image;
when it is detected that only one target exists in the environment image, determining that target as the follow target;
when it is detected that at least two targets exist in the environment image, determining the follow target according to the distance between each target in the environment image and the center of the environment image.
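The selection rule of claims 3 and 4 can be sketched as a small function. Taking "nearest to the image center" as the tie-break for multiple targets is a natural reading of claim 4, but it is an assumption; all names are illustrative:

```python
def pick_follow_target(targets, image_centre):
    """Select the follow target from detections in the environment image.

    targets:      list of (x, y) detected target centres, in pixels.
    image_centre: (x, y) centre of the environment image.
    Returns the chosen target, or None when no target is present (in which
    case a positioning fallback such as claim 5's would apply).
    """
    if not targets:
        return None
    if len(targets) == 1:           # exactly one target: it is the follow target
        return targets[0]

    def dist_sq(t):                 # squared distance to the image centre
        return (t[0] - image_centre[0]) ** 2 + (t[1] - image_centre[1]) ** 2

    return min(targets, key=dist_sq)
```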
5. The method according to claim 4, wherein when the follow target cannot be determined according to the distance between each target in the environment image and the center of the environment image, the method further comprises:
receiving positioning information sent by the follow target, the positioning information comprising third distance information and second orientation information between the follow target and the robot;
determining the follow target according to the third distance information and the second orientation information between the follow target and the robot.
6. The method according to claim 1, further comprising:
acquiring state data of the robot, the state data comprising acceleration data, angular velocity data, and magnetic induction data of the robot;
judging, according to the state data, whether the state of the robot is abnormal;
when it is judged that the state of the robot is abnormal, sending alarm information to the follow target and controlling the robot to stop walking.
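Claim 6 leaves the abnormality test itself unspecified. One plausible reading is a per-channel threshold check over the attitude-sensor data it names; the limits below are illustrative placeholders, not values from the patent:

```python
def state_abnormal(accel, gyro, mag,
                   accel_limit=3.0, gyro_limit=2.0, mag_range=(20.0, 70.0)):
    """Threshold check on the state data named in claim 6.

    accel: acceleration magnitude (m/s^2, gravity removed)
    gyro:  angular-rate magnitude (rad/s)
    mag:   geomagnetic field magnitude (microtesla)
    """
    if accel > accel_limit:        # e.g. collision or fall
        return True
    if gyro > gyro_limit:          # e.g. tipping over
        return True
    lo, hi = mag_range
    return not (lo <= mag <= hi)   # e.g. strong magnetic interference

def on_state(accel, gyro, mag, send_alarm, stop_walking):
    """Claim 6 reaction: alarm the follow target, then stop walking."""
    if state_abnormal(accel, gyro, mag):
        send_alarm()
        stop_walking()
```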
7. The method according to claim 1, wherein an obstacle detection component is provided on the robot;
the method further comprises:
during walking of the robot, acquiring obstacle information, collected by the obstacle detection component, of the area in front of the obstacle detection component;
judging, according to the obstacle information, whether the robot can be controlled to avoid the obstacle;
if so, controlling the robot to avoid the obstacle according to the obstacle information;
otherwise, controlling the robot to stop walking and sending a prompt signal to the follow target.
8. The method according to claim 7, wherein the obstacle detection component comprises at least a first obstacle detection sensor and a second obstacle detection sensor, the first obstacle detection sensor being arranged on the left side of the front end of the robot and the second obstacle detection sensor being arranged on the right side of the front end of the robot;
the controlling the robot to avoid the obstacle according to the obstacle information comprises:
when first obstacle information sent by the first obstacle detection sensor is received, controlling the robot to rotate to the right by a first angle;
when second obstacle information sent by the second obstacle detection sensor is received, controlling the robot to rotate to the left by a second angle;
when both the first obstacle information sent by the first obstacle detection sensor and the second obstacle information sent by the second obstacle detection sensor are received, controlling the robot to rotate by a third angle toward the direction of the follow target.
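The three turn rules of claim 8 reduce to a small decision function. The specific angle values and the sign convention (positive = rotate right) below are illustrative assumptions:

```python
import math

def avoidance_turn(left_blocked, right_blocked, target_bearing_deg,
                   first_angle=30.0, second_angle=30.0, third_angle=15.0):
    """Turn decision corresponding to claim 8's three cases.

    left_blocked:  the first obstacle detection sensor (front-left) fired.
    right_blocked: the second obstacle detection sensor (front-right) fired.
    Returns a signed turn in degrees (positive = right, negative = left).
    """
    if left_blocked and right_blocked:
        # Both sides blocked: rotate a third angle toward the follow target.
        return math.copysign(third_angle, target_bearing_deg)
    if left_blocked:
        return first_angle     # obstacle on the left -> rotate right
    if right_blocked:
        return -second_angle   # obstacle on the right -> rotate left
    return 0.0                 # no obstacle reported
```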
9. A device for controlling robot walking, wherein an image acquisition device is provided on the robot, the device comprising:
a first determining module, configured to, after a follow-target image collected by the image acquisition device is received, determine, according to the follow-target image and a predetermined width of the real environment captured by the image acquisition device, first distance information of a horizontal offset of the follow target relative to the robot;
a receiving module, configured to receive second distance information between the robot and the follow target sent by the follow target;
a second determining module, configured to determine first orientation information between the robot and the follow target according to the first distance information and the second distance information;
a generating module, configured to generate a travel control instruction according to the second distance information and the first orientation information;
a control module, configured to control the robot to follow the follow target according to the travel control instruction.
10. A robot, comprising a robot body, an image acquisition device arranged on the robot body, a transceiver, and the device for controlling robot walking according to claim 9.
CN201710576609.7A 2017-07-14 2017-07-14 Method and device for controlling robot to walk and robot Active CN107315414B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710576609.7A CN107315414B (en) 2017-07-14 2017-07-14 Method and device for controlling robot to walk and robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710576609.7A CN107315414B (en) 2017-07-14 2017-07-14 Method and device for controlling robot to walk and robot

Publications (2)

Publication Number Publication Date
CN107315414A true CN107315414A (en) 2017-11-03
CN107315414B CN107315414B (en) 2021-04-27

Family

ID=60178694

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710576609.7A Active CN107315414B (en) 2017-07-14 2017-07-14 Method and device for controlling robot to walk and robot

Country Status (1)

Country Link
CN (1) CN107315414B (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107807652A (en) * 2017-12-08 2018-03-16 灵动科技(北京)有限公司 Merchandising machine people, the method for it and controller and computer-readable medium
CN109798901A (en) * 2019-03-18 2019-05-24 国网江苏省电力有限公司电力科学研究院 A kind of archives robot and its navigation positioning system and navigation locating method
CN110362091A (en) * 2019-08-05 2019-10-22 广东交通职业技术学院 A kind of robot follows kinescope method, device and robot
CN110794692A (en) * 2018-08-03 2020-02-14 珠海格力电器股份有限公司 Mobile control method and device of household appliance and household appliance
CN111352411A (en) * 2018-12-20 2020-06-30 北京新联铁集团股份有限公司 Hollow axle positioning method and device and intelligent hollow axle flaw detector
CN112130564A (en) * 2020-09-11 2020-12-25 珠海市一微半导体有限公司 Robot rotation angle acquisition method, chip and robot
CN112890680A (en) * 2019-11-19 2021-06-04 科沃斯机器人股份有限公司 Follow-up cleaning operation method, control method, device, robot and storage medium
CN113126604A (en) * 2019-12-30 2021-07-16 北京猎户星空科技有限公司 Robot obstacle avoidance method and device, electronic equipment and storage medium
CN113878577A (en) * 2021-09-28 2022-01-04 深圳市海柔创新科技有限公司 Robot control method, robot, control terminal and control system
US20220382282A1 (en) * 2021-05-25 2022-12-01 Ubtech North America Research And Development Center Corp Mobility aid robot navigating method and mobility aid robot using the same
US12032377B2 (en) * 2021-05-25 2024-07-09 Ubkang (Qingdao) Technology Co., Ltd. Mobility aid robot navigating method and mobility aid robot using the same

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11160000A (en) * 1997-12-01 1999-06-15 Mitsubishi Electric Corp Image guiding system
JP2002255037A (en) * 2001-02-28 2002-09-11 Nippon Soken Inc Mobile case
CN1940591A (en) * 2005-09-26 2007-04-04 通用汽车环球科技运作公司 System and method of target tracking using sensor fusion
CN105717927A (en) * 2016-04-13 2016-06-29 京东方科技集团股份有限公司 Bearing device and control method used for bearing device
CN105956513A (en) * 2016-04-19 2016-09-21 北京小米移动软件有限公司 Method and device for executing reaction action
CN106292657A (en) * 2016-07-22 2017-01-04 北京地平线机器人技术研发有限公司 Mobile robot and patrol path setting method thereof
CN106444763A (en) * 2016-10-20 2017-02-22 泉州市范特西智能科技有限公司 Intelligent automatic following method based on visual sensor, system and suitcase

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11160000A (en) * 1997-12-01 1999-06-15 Mitsubishi Electric Corp Image guiding system
JP2002255037A (en) * 2001-02-28 2002-09-11 Nippon Soken Inc Mobile case
CN1940591A (en) * 2005-09-26 2007-04-04 通用汽车环球科技运作公司 System and method of target tracking using sensor fusion
CN105717927A (en) * 2016-04-13 2016-06-29 京东方科技集团股份有限公司 Bearing device and control method used for bearing device
CN105956513A (en) * 2016-04-19 2016-09-21 北京小米移动软件有限公司 Method and device for executing reaction action
CN106292657A (en) * 2016-07-22 2017-01-04 北京地平线机器人技术研发有限公司 Mobile robot and patrol path setting method thereof
CN106444763A (en) * 2016-10-20 2017-02-22 泉州市范特西智能科技有限公司 Intelligent automatic following method based on visual sensor, system and suitcase

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107807652A (en) * 2017-12-08 2018-03-16 灵动科技(北京)有限公司 Merchandising machine people, the method for it and controller and computer-readable medium
CN110794692A (en) * 2018-08-03 2020-02-14 珠海格力电器股份有限公司 Mobile control method and device of household appliance and household appliance
CN110794692B (en) * 2018-08-03 2021-07-23 珠海格力电器股份有限公司 Mobile control method and device of household appliance and household appliance
CN111352411A (en) * 2018-12-20 2020-06-30 北京新联铁集团股份有限公司 Hollow axle positioning method and device and intelligent hollow axle flaw detector
CN111352411B (en) * 2018-12-20 2024-05-24 北京新联铁集团股份有限公司 Hollow axle positioning method and device and intelligent hollow axle flaw detector
CN109798901A (en) * 2019-03-18 2019-05-24 国网江苏省电力有限公司电力科学研究院 A kind of archives robot and its navigation positioning system and navigation locating method
CN109798901B (en) * 2019-03-18 2022-08-12 国网江苏省电力有限公司电力科学研究院 Robot for files and navigation positioning system and navigation positioning method thereof
CN110362091A (en) * 2019-08-05 2019-10-22 广东交通职业技术学院 A kind of robot follows kinescope method, device and robot
CN112890680B (en) * 2019-11-19 2023-12-12 科沃斯机器人股份有限公司 Follow-up cleaning operation method, control device, robot and storage medium
CN112890680A (en) * 2019-11-19 2021-06-04 科沃斯机器人股份有限公司 Follow-up cleaning operation method, control method, device, robot and storage medium
CN113126604A (en) * 2019-12-30 2021-07-16 北京猎户星空科技有限公司 Robot obstacle avoidance method and device, electronic equipment and storage medium
CN113126604B (en) * 2019-12-30 2024-04-30 北京猎户星空科技有限公司 Robot obstacle avoidance method and device, electronic equipment and storage medium
CN112130564B (en) * 2020-09-11 2022-12-13 珠海一微半导体股份有限公司 Method for acquiring rotation angle of robot
CN112130564A (en) * 2020-09-11 2020-12-25 珠海市一微半导体有限公司 Robot rotation angle acquisition method, chip and robot
US20220382282A1 (en) * 2021-05-25 2022-12-01 Ubtech North America Research And Development Center Corp Mobility aid robot navigating method and mobility aid robot using the same
US12032377B2 (en) * 2021-05-25 2024-07-09 Ubkang (Qingdao) Technology Co., Ltd. Mobility aid robot navigating method and mobility aid robot using the same
CN113878577A (en) * 2021-09-28 2022-01-04 深圳市海柔创新科技有限公司 Robot control method, robot, control terminal and control system

Also Published As

Publication number Publication date
CN107315414B (en) 2021-04-27

Similar Documents

Publication Publication Date Title
CN107315414A (en) Method and device for controlling robot walking, and robot
US10970859B2 (en) Monitoring method and device for mobile target, monitoring system and mobile robot
CN109947119B (en) Mobile robot autonomous following method based on multi-sensor fusion
US10551854B2 (en) Method for detecting target object, detection apparatus and robot
CN106682572B (en) Target tracking method and system and first electronic device
EP3280976B1 (en) Object position measurement with automotive camera using vehicle motion data
CN108829137A Obstacle avoidance method and device for robot target tracking
CN106104203B Distance detection method and device for a moving object, and aircraft
CN111932588A (en) Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning
EP1978432B1 (en) Routing apparatus for autonomous mobile unit
TWI481980B (en) Electronic apparatus and navigation method thereof
CN110926476B (en) Accompanying service method and device for intelligent robot
CN107797107A Camera-based trailer detection and tracking
KR20110047505A Apparatus and Method for Detecting Slip of a Mobile Robot
CN106575437A (en) Information-processing device, information processing method, and program
CN107168343A Control method for a luggage case, and luggage case
CN111077890A (en) Implementation method of agricultural robot based on GPS positioning and automatic obstacle avoidance
US20170004631A1 (en) Method and system for visual pedometry
CN110942474A (en) Robot target tracking method, device and storage medium
CN112541416A (en) Cross-radar obstacle tracking method and device, electronic equipment and storage medium
JPWO2020137315A1 (en) Positioning device and mobile
CN108169743A Farm environment perception method for unmanned agricultural machinery
US11080562B1 (en) Key point recognition with uncertainty measurement
Rovira-Mas et al. Obstacle detection using stereo vision to enhance safety of autonomous machines
KR20190081334A (en) Method for tracking moving trajectory based on complex positioning and apparatus thereof

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A method, device and robot for controlling the walking of a robot

Effective date of registration: 20220921

Granted publication date: 20210427

Pledgee: Bank of Hangzhou Limited by Share Ltd. Beijing Zhongguancun branch

Pledgor: LINGDONG TECHNOLOGY (BEIJING) Co.,Ltd.

Registration number: Y2022990000654

PE01 Entry into force of the registration of the contract for pledge of patent right