CN102411371A - Multi-sensor service-based robot following system and method - Google Patents
- Publication number
- CN102411371A CN102411371A CN2011103705778A CN201110370577A CN102411371A CN 102411371 A CN102411371 A CN 102411371A CN 2011103705778 A CN2011103705778 A CN 2011103705778A CN 201110370577 A CN201110370577 A CN 201110370577A CN 102411371 A CN102411371 A CN 102411371A
- Authority
- CN
- China
- Prior art keywords
- robot
- tracking
- information
- people
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Links
Images
Landscapes
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention discloses a system and method that allow a robot to follow a target person according to fused measurement information from multiple sensors. The system comprises a sensor detection part, a detection-fusion part, a motion execution part, and a host control computer. The sensor detection part acquires the positions of all person-like objects in the field of view and transmits all measurement information to the host control computer; the detection-fusion part updates the target person's position on the host control computer with a sample-based joint probabilistic data association filter; and the motion execution part generates a motion target from the current target-person information and obstacle information and executes tracking and following. By fusing the measurement information of multiple sensors and combining the sample-based filter with an obstacle-avoidance algorithm, the target person is followed without collision, so the robot can follow autonomously even when interfering people are present indoors, while the robustness and accuracy of following are improved.
Description
Technical field
The invention belongs to the field of robotics and relates to a robot following system and method; specifically, a system and method that enable a robot to autonomously follow a target person according to fused multi-sensor measurement information.
Technical background
Target-person following is a hot topic in human-robot interaction research. It has wide application demand in robotics, for example robotic wheelchairs that assist disabled people and patients, or robots that carry luggage while following behind their owner, and it also has value for reconnaissance military robots. The following process often takes place in natural multi-person environments that include both the target person and non-target people, so the robot must accurately follow the intended target while avoiding obstacles.
Existing robot following systems mainly adopt a single laser sensor; the method of acquiring the target person is relatively simple and is easily disturbed by similar objects in the environment. The main following method is the sample-based joint probabilistic data association filter (SJPDAF), which is suitable for tracking multiple people, but previous implementations of this algorithm also used only a single laser sensor.
The robot's environment during following is relatively complex; static or dynamic obstacles may appear, so the robot must retain obstacle-avoidance capability while following. Current obstacle-avoidance methods mainly include the dynamic window approach (DWA), potential field methods, and the velocity obstacle (VO) method. These algorithms have high complexity and are prone to becoming trapped in local minima.
Chinese patent No. 200910101604.4 discloses a multi-robot algorithm for tracking a moving target. It builds a motion model of the target from the target's current and previous position information, constructs a robot motion control model from the motion prediction model, and estimates the moving target's position with a covariance interpolation method, guaranteeing that the moving target always stays within the multi-robot field of view. However, that invention cannot track multiple targets simultaneously and does not consider obstacle avoidance during tracking.
Summary of the invention
The objective of the invention is to address the deficiencies of the prior art when a robot follows a target person, and to propose a multi-sensor-based robot following system and method that give a service robot good obstacle-avoidance capability while it follows a target person in a complex indoor environment.
The concrete technical scheme by which the present invention solves the technical problem is as follows:
The present invention is a robot following system suitable for fusing multiple sensors. It comprises a sensor part, a detection-fusion part, a motion execution part, and a host control computer. The sensor part acquires the positions of all person-like objects in the field of view and sends all measurement information to the host control computer. In the detection-fusion part, the host control computer updates the target person's position with a sample-based joint probabilistic data association filter. The motion execution part generates a motion target from the current target-person information and obstacle information and performs tracking and following.
The sensor part of the present invention comprises multiple sensors, including a laser range finder and a depth camera.
The detection-fusion part of the present invention is the sample-based joint probabilistic data association filter, run on the host control computer, that fuses the multiple sensors.
The motion execution part of the present invention comprises a chassis and a pan-tilt head.
The present invention is also an application method of the robot following system suitable for fusing multiple sensors, comprising the following steps:
(1) using the robot's laser sensor to locate all target people in the field of view, while simultaneously using the depth camera to perform head-shoulder detection and localization of all target people;
(2) using the sample-based joint probabilistic data association filter to update the positions of all target people, i.e., generating each target person's current position from the measurement information;
(3) using the updated information of the followed target person, combined with the obstacle information ahead, to follow while avoiding obstacles.
In the application method of the present invention, the laser sensor detects person-like objects with a circular-arc detection method, and the depth camera performs head-shoulder feature detection of the target person with an AdaBoost-based fast face detection method.
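The patent gives no pseudocode for the circular-arc detection; a minimal sketch of the usual approach — segmenting the scan at range discontinuities and keeping clusters whose chord width is plausibly human — might look as follows (the jump and width thresholds are illustrative assumptions, not values from the patent):

```python
import math

def detect_person_arcs(ranges, angle_min, angle_inc,
                       jump=0.15, min_width=0.05, max_width=0.60):
    """Cluster a laser scan at range discontinuities and keep clusters whose
    chord width matches a human leg/torso (hypothetical thresholds)."""
    clusters, current = [], [0]
    for i in range(1, len(ranges)):
        if abs(ranges[i] - ranges[i - 1]) > jump:  # range jump: new cluster
            clusters.append(current)
            current = [i]
        else:
            current.append(i)
    clusters.append(current)

    people = []
    for c in clusters:
        i0, i1 = c[0], c[-1]
        a0, a1 = angle_min + i0 * angle_inc, angle_min + i1 * angle_inc
        p0 = (ranges[i0] * math.cos(a0), ranges[i0] * math.sin(a0))
        p1 = (ranges[i1] * math.cos(a1), ranges[i1] * math.sin(a1))
        width = math.hypot(p1[0] - p0[0], p1[1] - p0[1])
        if min_width <= width <= max_width:
            # centroid of the cluster as the candidate person position
            xs = [ranges[i] * math.cos(angle_min + i * angle_inc) for i in c]
            ys = [ranges[i] * math.sin(angle_min + i * angle_inc) for i in c]
            people.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return people
```

Running it on a scan with a narrow near cluster against a distant background returns one candidate at roughly the cluster's range and bearing; wide background segments are rejected by the width gate.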
The application method of the present invention fuses the measurement information of the multiple sensors and applies it in the sample-based joint probabilistic data association filter (SJPDAF).
In the application method of the present invention, after the target person's position has been updated, a shrinking obstacle-search-radius method quickly generates the motion-planning target point according to the obstacle information ahead.
The beneficial effect of the present invention is that, by fusing the measurement information of multiple sensors and combining the sample-based joint probabilistic data association filter with an obstacle-avoidance algorithm, the target person is followed without collision, so the robot can follow autonomously indoors even when interfering people are present, while the robustness and accuracy of following are improved.
Description of drawings
Fig. 1 is a structural schematic diagram of the robot following system;
Fig. 2 is a schematic diagram of the robot dynamically planning a feasible target point.
Embodiment
The technical scheme of the invention is further explained below in conjunction with the accompanying drawings and a specific embodiment:
Fig. 1 is the structural schematic diagram of the robot following system provided by the invention. As shown in Fig. 1, the system comprises a sensor part, a detection-fusion part, and a motion execution part. A laser and a depth camera acquire the target person; the host control computer updates the target person's position with the sample-based joint probabilistic data association filter (SJPDAF); and the pan-tilt head and chassis track and follow the updated target position.
The sensor part is responsible for acquiring the positions of all person-like objects in the field of view together with the chassis obstacle information, and sends all measurement information to the host control computer. It comprises multiple sensors, including laser range finders and a depth camera; the laser range finders comprise a chest laser range finder and a chassis laser range finder. The chest laser range finder is a HOKUYO UTM-30LX, the depth camera on the robot's head is a MESA SR4000, and the chassis laser range finder is a SICK-100 type laser range finder.
The detection-fusion part is responsible for generating each target person's current position from the measurement information; that is, the host control computer generates the position of everyone in the current field of view from the measurements of the chest laser range finder and the depth camera, while planning the robot's next motion target point from the obstacle information collected by the chassis laser range finder.
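The text does not specify how the chest-laser and depth-camera detections are combined into one position per person; a simple nearest-neighbour gating scheme (the 0.5 m gate and the averaging rule are assumptions for illustration) would be:

```python
import math

def fuse_detections(laser_pts, camera_pts, gate=0.5):
    """Pair each camera detection with the nearest unused laser detection
    within `gate` metres and average the pair; unmatched detections from
    either sensor pass through unchanged."""
    fused, used = [], set()
    for cx, cy in camera_pts:
        best, best_d = None, gate
        for i, (lx, ly) in enumerate(laser_pts):
            d = math.hypot(cx - lx, cy - ly)
            if i not in used and d < best_d:
                best, best_d = i, d
        if best is None:
            fused.append((cx, cy))                    # camera-only detection
        else:
            used.add(best)
            lx, ly = laser_pts[best]
            fused.append(((cx + lx) / 2, (cy + ly) / 2))
    # laser detections the camera did not confirm are kept as well
    fused.extend(p for i, p in enumerate(laser_pts) if i not in used)
    return fused
```

A weighted average (or a proper measurement model inside the filter, as the patent's fusion actually does) could replace the plain mean; the gate simply prevents pairing detections of different people.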
The motion execution part is responsible for generating and executing the motion target according to the current target-person information and obstacle information.
The steps of the multi-sensor robot following method provided by the invention are as follows:
First, the laser sensor on the robot's chest detects and locates the person-like objects in the field of view with a circular-arc detection method, while the depth camera performs head-shoulder detection and localization of all target people with an AdaBoost-based fast face detection method; all sensor measurement information is sent to the host control computer.
Second, the sample-based joint probabilistic data association filter (SJPDAF) is used to update the positions of all target people. Suppose that at time t there are n tracked targets and m_t observations. Since the robot can only follow one moving target, only the state at time t of the followed target is of interest, denoted x(t); z(t) denotes the combined sequence of observations from all sensors at time t, z_j(t) denotes the j-th observation in this set, and β_j denotes the association probability of observation j with the target. Suppose the particle filter of the followed target has K particles.
The concrete steps of the multi-sensor-fusion SJPDAF algorithm combining the measurement information are as follows:
(1) in the prediction stage, update the particle population of the previous moment according to the prediction equation;
(2) compute the association probability β_j of each observation with respect to the tracked target, where θ denotes an association event between observation j and the target, Θ denotes all possible combinations of association events between observation j and the target, x_k(t) denotes the state of the k-th particle at time t, and α denotes a normalization factor;
(3) estimate the state of the followed target from the current measurement information of the sensors, the weight w_k(t) of the k-th particle of the target at time t being computed from the association probabilities.
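The patent's formulas for β_j and w_k(t) appear only as figures; in the standard sample-based JPDAF, each particle's weight is the association-probability-weighted sum of the observation likelihoods, followed by resampling. A sketch under that assumption, with a hypothetical isotropic Gaussian position likelihood:

```python
import math
import random

def sjpdaf_update(particles, observations, betas, sigma=0.3):
    """One correction step of a sample-based JPDAF track (textbook form;
    the patent's exact formulas are given as figures). Each particle (x, y)
    is weighted by sum_j beta_j * p(z_j | x_k), then multinomially
    resampled in proportion to the normalized weights."""
    def likelihood(x, z):
        d2 = (x[0] - z[0]) ** 2 + (x[1] - z[1]) ** 2
        return math.exp(-d2 / (2 * sigma ** 2))

    weights = [sum(b * likelihood(x, z) for z, b in zip(observations, betas))
               for x in particles]
    total = sum(weights) or 1.0          # guard against all-zero weights
    weights = [w / total for w in weights]

    # multinomial resampling proportional to the normalized weights
    return random.choices(particles, weights=weights, k=len(particles))
```

With one observation carrying all association probability, particles near it dominate the resampled set, which is exactly how the filter locks onto the followed person while ignoring unassociated detections.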
Finally, the updated information of the followed target person is combined with the obstacle information ahead to follow while avoiding obstacles. The idea of dynamically planning a feasible target point is shown in Fig. 2: first, the obstacle intervals are inflated according to the chassis laser detections and the robot radius; second, with the target person's direction as the center, feasible target points are searched to the left and right, and if a feasible planning point P is found the algorithm exits; finally, if no feasible path exists within the current obstacle search radius R, the search radius R is reduced appropriately and the previous two steps are repeated until a feasible target point is found. If the search radius shrinks to roughly the robot's chassis radius without a feasible target point being found, there is no feasible path ahead of the robot and the robot remains stationary.
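The shrinking-radius search just described can be sketched as follows; the angular step, the shrink factor, and the point-obstacle clearance test are illustrative assumptions, since the patent gives no numeric parameters:

```python
import math

def plan_goal(target_angle, obstacles, r_max, r_min,
              robot_radius=0.3, ang_step=math.radians(5), shrink=0.8):
    """Search left/right of the target direction for a goal point at radius R
    that clears all inflated obstacle points; shrink R when no angle works.
    Returns (x, y), or None when even R near the chassis radius fails,
    in which case the robot stays still."""
    def clear(x, y):
        return all(math.hypot(x - ox, y - oy) > robot_radius
                   for ox, oy in obstacles)

    R = r_max
    while R >= r_min:
        for k in range(int(math.pi / ang_step) + 1):
            # alternate symmetrically left and right of the target direction
            for sign in ((1,) if k == 0 else (1, -1)):
                a = target_angle + sign * k * ang_step
                x, y = R * math.cos(a), R * math.sin(a)
                if clear(x, y):
                    return (x, y)      # feasible planning point P
        R *= shrink                    # shrink the obstacle search radius
    return None                        # no feasible path ahead
```

With a clear path the goal lands directly toward the target; with an obstacle dead ahead the search steps sideways until the clearance test passes, matching the Fig. 2 behaviour.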
In physical experiments the robot followed the target person without collision in a complex environment, in which dynamic interfering people appeared during following and static obstacles were present on the ground. The robot tracks multiple people with the SJPDAF algorithm, so it is not affected by dynamic interferers, and it searches for a reasonable motion target point according to the obstacle information collected by the chassis laser range finder, so it moves without collision in narrow spaces, demonstrating the effectiveness of the obstacle-avoidance method.
The above is only one specific embodiment of the present invention. Obviously, the invention is not limited to the above embodiment and may have many variations; all variations that a person of ordinary skill in the art can directly derive or associate from the content disclosed by the invention shall be considered within the scope of protection of the invention.
Claims (9)
1. A robot following system suitable for fusing multiple sensors, comprising a sensor part, a detection-fusion part, a motion execution part, and a host control computer, characterized in that: the sensor part acquires the positions of all person-like objects in the field of view and sends all measurement information to the host control computer; in the detection-fusion part the host control computer updates the target person's position with a sample-based joint probabilistic data association filter; and the motion execution part generates the robot's motion-planning point from the current target-person information and obstacle information and performs tracking and following.
2. The robot following system of claim 1, characterized in that the sensor part comprises multiple sensors.
3. The robot following system of claim 2, characterized in that the multiple sensors comprise a laser range finder and a depth camera.
4. The robot following system of claim 1, characterized in that the detection-fusion part is the sample-based joint probabilistic data association filter, run on the host control computer, that fuses the multiple sensors.
5. The robot following system of claim 1, characterized in that the motion execution part comprises a chassis and a pan-tilt head.
6. An application method of a robot following system suitable for fusing multiple sensors, characterized by comprising the following steps:
(1) using the robot's laser sensor to locate all target people in the field of view, while simultaneously using the depth camera to perform head-shoulder detection and localization of all target people;
(2) using the sample-based joint probabilistic data association filter to update the positions of all target people, i.e., generating each target person's current position from the measurement information;
(3) using the updated information of the followed target person, combined with the obstacle information ahead, to follow while avoiding obstacles.
7. the method for application of robot as claimed in claim 6 system for tracking; It is characterized in that; Described laser sensor adopts the circular arc detection method that similar people is detected, and degree of depth camera adopts the fast face detecting method based on AdaBoost, carries out the detection of target people's head shoulder characteristic.
8. the method for application of robot as claimed in claim 6 system for tracking is characterized in that, has merged the measurement information of multiple sensors, is applied in the joint probability filtering algorithm (SJPDAF) based on sampling.
9. the method for application of robot as claimed in claim 6 system for tracking is characterized in that, under the condition of fresh target people position more, according to the place ahead obstacle information, the method that has adopted a kind of barrier search radius to shrink generates the impact point of motion planning fast.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011103705778A CN102411371A (en) | 2011-11-18 | 2011-11-18 | Multi-sensor service-based robot following system and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2011103705778A CN102411371A (en) | 2011-11-18 | 2011-11-18 | Multi-sensor service-based robot following system and method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN102411371A true CN102411371A (en) | 2012-04-11 |
Family
ID=45913486
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2011103705778A Pending CN102411371A (en) | 2011-11-18 | 2011-11-18 | Multi-sensor service-based robot following system and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN102411371A (en) |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103017771A (en) * | 2012-12-27 | 2013-04-03 | 杭州电子科技大学 | Multi-target joint distribution and tracking method of static sensor platform |
CN103537099A (en) * | 2012-07-09 | 2014-01-29 | 深圳泰山在线科技有限公司 | Tracking toy |
CN104850120A (en) * | 2015-03-19 | 2015-08-19 | 武汉科技大学 | Wheel type mobile robot navigation method based on IHDR self-learning frame |
CN104898659A (en) * | 2015-03-11 | 2015-09-09 | 北京理工大学 | Man-robot cooperative control method based on model prediction |
CN105425795A (en) * | 2015-11-26 | 2016-03-23 | 纳恩博(北京)科技有限公司 | Method for planning optimal following path and apparatus |
CN105487558A (en) * | 2015-12-24 | 2016-04-13 | 青岛海通机器人系统有限公司 | Object following system based on mobile robot and method |
CN105563493A (en) * | 2016-02-01 | 2016-05-11 | 昆山市工业技术研究院有限责任公司 | Height and direction adaptive service robot and adaptive method |
CN105652895A (en) * | 2014-11-12 | 2016-06-08 | 沈阳新松机器人自动化股份有限公司 | Mobile robot human body tracking system and tracking method based on laser sensor |
CN105955251A (en) * | 2016-03-11 | 2016-09-21 | 北京克路德人工智能科技有限公司 | Vision following control method of robot and robot |
CN106125087A (en) * | 2016-06-15 | 2016-11-16 | 清研华宇智能机器人(天津)有限责任公司 | Dancing Robot indoor based on laser radar pedestrian tracting method |
CN106991688A (en) * | 2017-03-09 | 2017-07-28 | 广东欧珀移动通信有限公司 | Human body tracing method, human body tracking device and electronic installation |
CN107016348A (en) * | 2017-03-09 | 2017-08-04 | 广东欧珀移动通信有限公司 | With reference to the method for detecting human face of depth information, detection means and electronic installation |
WO2017177552A1 (en) * | 2016-04-13 | 2017-10-19 | 京东方科技集团股份有限公司 | Carrier device and control method used for the carrier device |
CN107272744A (en) * | 2017-05-27 | 2017-10-20 | 芜湖星途机器人科技有限公司 | The robot active system for tracking being engaged with the number of taking machine |
CN107390721A (en) * | 2017-07-26 | 2017-11-24 | 歌尔科技有限公司 | Robot retinue control method, device and robot |
WO2017219751A1 (en) * | 2016-06-21 | 2017-12-28 | 安徽酷哇机器人有限公司 | Mobile suitcase having automatic following and obstacle avoidance functions, and using method therefor |
CN107608345A (en) * | 2017-08-26 | 2018-01-19 | 深圳力子机器人有限公司 | A kind of robot and its follower method and system |
CN108062095A (en) * | 2016-11-08 | 2018-05-22 | 福特全球技术公司 | The object tracking merged in probabilistic framework using sensor |
CN108121359A (en) * | 2016-11-29 | 2018-06-05 | 沈阳新松机器人自动化股份有限公司 | A kind of shopping robot |
CN108334098A (en) * | 2018-02-28 | 2018-07-27 | 弗徕威智能机器人科技(上海)有限公司 | A kind of human body follower method based on multisensor |
CN108710295A (en) * | 2018-04-20 | 2018-10-26 | 浙江工业大学 | A kind of robot follower method based on the filtering of progressive volume information |
CN109885080A (en) * | 2013-11-27 | 2019-06-14 | 宾夕法尼亚大学理事会 | Self-control system and autonomous control method |
CN111752279A (en) * | 2020-07-09 | 2020-10-09 | 上海有个机器人有限公司 | Robot multi-sensor fusion self-checking method and system |
CN111932588A (en) * | 2020-08-07 | 2020-11-13 | 浙江大学 | Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning |
CN113934219A (en) * | 2021-12-16 | 2022-01-14 | 宏景科技股份有限公司 | Robot automatic obstacle avoidance method, system, equipment and medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1389710A (en) * | 2002-07-18 | 2003-01-08 | 上海交通大学 | Multiple-sensor and multiple-object information fusing method |
CN101210817A (en) * | 2007-12-24 | 2008-07-02 | 河北工业大学 | Method for robot independently searching odor source in indoor environment |
CN101504546A (en) * | 2008-12-12 | 2009-08-12 | 北京科技大学 | Children robot posture tracking apparatus |
CN101630413A (en) * | 2009-08-14 | 2010-01-20 | 浙江大学 | Multi-robot tracked mobile target algorithm |
- 2011-11-18 CN CN2011103705778A patent/CN102411371A/en active Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1389710A (en) * | 2002-07-18 | 2003-01-08 | 上海交通大学 | Multiple-sensor and multiple-object information fusing method |
CN101210817A (en) * | 2007-12-24 | 2008-07-02 | 河北工业大学 | Method for robot independently searching odor source in indoor environment |
CN101504546A (en) * | 2008-12-12 | 2009-08-12 | 北京科技大学 | Children robot posture tracking apparatus |
CN101630413A (en) * | 2009-08-14 | 2010-01-20 | 浙江大学 | Multi-robot tracked mobile target algorithm |
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103537099A (en) * | 2012-07-09 | 2014-01-29 | 深圳泰山在线科技有限公司 | Tracking toy |
CN103537099B (en) * | 2012-07-09 | 2016-02-10 | 深圳泰山在线科技有限公司 | Tracing toy |
CN103017771B (en) * | 2012-12-27 | 2015-06-17 | 杭州电子科技大学 | Multi-target joint distribution and tracking method of static sensor platform |
CN103017771A (en) * | 2012-12-27 | 2013-04-03 | 杭州电子科技大学 | Multi-target joint distribution and tracking method of static sensor platform |
CN109885080B (en) * | 2013-11-27 | 2022-09-20 | 宾夕法尼亚大学理事会 | Autonomous control system and autonomous control method |
CN109885080A (en) * | 2013-11-27 | 2019-06-14 | 宾夕法尼亚大学理事会 | Self-control system and autonomous control method |
CN105652895A (en) * | 2014-11-12 | 2016-06-08 | 沈阳新松机器人自动化股份有限公司 | Mobile robot human body tracking system and tracking method based on laser sensor |
CN104898659A (en) * | 2015-03-11 | 2015-09-09 | 北京理工大学 | Man-robot cooperative control method based on model prediction |
CN104850120B (en) * | 2015-03-19 | 2017-11-10 | 武汉科技大学 | Wheeled mobile robot air navigation aid based on IHDR autonomous learning frameworks |
CN104850120A (en) * | 2015-03-19 | 2015-08-19 | 武汉科技大学 | Wheel type mobile robot navigation method based on IHDR self-learning frame |
CN105425795B (en) * | 2015-11-26 | 2020-04-14 | 纳恩博(北京)科技有限公司 | Method and device for planning optimal following path |
WO2017088720A1 (en) * | 2015-11-26 | 2017-06-01 | 纳恩博(北京)科技有限公司 | Method and device for planning optimal following path and computer storage medium |
CN105425795A (en) * | 2015-11-26 | 2016-03-23 | 纳恩博(北京)科技有限公司 | Method for planning optimal following path and apparatus |
CN105487558A (en) * | 2015-12-24 | 2016-04-13 | 青岛海通机器人系统有限公司 | Object following system based on mobile robot and method |
CN105563493A (en) * | 2016-02-01 | 2016-05-11 | 昆山市工业技术研究院有限责任公司 | Height and direction adaptive service robot and adaptive method |
CN105955251A (en) * | 2016-03-11 | 2016-09-21 | 北京克路德人工智能科技有限公司 | Vision following control method of robot and robot |
US10638820B2 (en) | 2016-04-13 | 2020-05-05 | Boe Technology Group Co., Ltd. | Carrying device and method of controlling the same |
WO2017177552A1 (en) * | 2016-04-13 | 2017-10-19 | 京东方科技集团股份有限公司 | Carrier device and control method used for the carrier device |
CN106125087A (en) * | 2016-06-15 | 2016-11-16 | 清研华宇智能机器人(天津)有限责任公司 | Dancing Robot indoor based on laser radar pedestrian tracting method |
WO2017219751A1 (en) * | 2016-06-21 | 2017-12-28 | 安徽酷哇机器人有限公司 | Mobile suitcase having automatic following and obstacle avoidance functions, and using method therefor |
CN108062095A (en) * | 2016-11-08 | 2018-05-22 | 福特全球技术公司 | The object tracking merged in probabilistic framework using sensor |
CN108121359A (en) * | 2016-11-29 | 2018-06-05 | 沈阳新松机器人自动化股份有限公司 | A kind of shopping robot |
CN107016348A (en) * | 2017-03-09 | 2017-08-04 | 广东欧珀移动通信有限公司 | With reference to the method for detecting human face of depth information, detection means and electronic installation |
CN106991688A (en) * | 2017-03-09 | 2017-07-28 | 广东欧珀移动通信有限公司 | Human body tracing method, human body tracking device and electronic installation |
CN107272744A (en) * | 2017-05-27 | 2017-10-20 | 芜湖星途机器人科技有限公司 | The robot active system for tracking being engaged with the number of taking machine |
CN107390721A (en) * | 2017-07-26 | 2017-11-24 | 歌尔科技有限公司 | Robot retinue control method, device and robot |
CN107390721B (en) * | 2017-07-26 | 2021-05-18 | 歌尔科技有限公司 | Robot following control method and device and robot |
CN107608345A (en) * | 2017-08-26 | 2018-01-19 | 深圳力子机器人有限公司 | A kind of robot and its follower method and system |
CN108334098A (en) * | 2018-02-28 | 2018-07-27 | 弗徕威智能机器人科技(上海)有限公司 | A kind of human body follower method based on multisensor |
CN108334098B (en) * | 2018-02-28 | 2018-09-25 | 弗徕威智能机器人科技(上海)有限公司 | A kind of human body follower method based on multisensor |
CN108710295A (en) * | 2018-04-20 | 2018-10-26 | 浙江工业大学 | A kind of robot follower method based on the filtering of progressive volume information |
CN108710295B (en) * | 2018-04-20 | 2021-06-18 | 浙江工业大学 | Robot following method based on progressive volume information filtering |
CN111752279A (en) * | 2020-07-09 | 2020-10-09 | 上海有个机器人有限公司 | Robot multi-sensor fusion self-checking method and system |
CN111752279B (en) * | 2020-07-09 | 2023-09-08 | 上海有个机器人有限公司 | Multi-sensor fusion self-checking method and system for robot |
CN111932588A (en) * | 2020-08-07 | 2020-11-13 | 浙江大学 | Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning |
CN111932588B (en) * | 2020-08-07 | 2024-01-30 | 浙江大学 | Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning |
CN113934219A (en) * | 2021-12-16 | 2022-01-14 | 宏景科技股份有限公司 | Robot automatic obstacle avoidance method, system, equipment and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN102411371A (en) | Multi-sensor service-based robot following system and method | |
CN109947119B (en) | Mobile robot autonomous following method based on multi-sensor fusion | |
US10198008B2 (en) | Mobile robot system | |
CN103064416B (en) | Crusing robot indoor and outdoor autonomous navigation system | |
CN103926925B (en) | Improved VFH algorithm-based positioning and obstacle avoidance method and robot | |
Chung et al. | The detection and following of human legs through inductive approaches for a mobile robot with a single laser range finder | |
US20180211103A1 (en) | Method of creating map by identifying moving object, and robot implementing the method | |
CN106997688A (en) | Parking position detecting method based on multi-sensor information fusion | |
EP3388915A1 (en) | Information processing method, mobile device, and computer storage medium | |
CN103680291A (en) | Method for realizing simultaneous locating and mapping based on ceiling vision | |
US11300663B2 (en) | Method for predicting a motion of an object | |
WO2016013095A1 (en) | Autonomous moving device | |
Li et al. | Autonomous last-mile delivery vehicles in complex traffic environments | |
Lidoris et al. | The autonomous city explorer (ACE) project—mobile robot navigation in highly populated urban environments | |
CN105277593A (en) | Mobile robot based indoor smell source positioning method | |
Brookshire | Person following using histograms of oriented gradients | |
Liu et al. | The design of a fully autonomous robot system for urban search and rescue | |
KR20160048530A (en) | Method and apparatus for generating pathe of autonomous vehicle | |
CN113034579A (en) | Dynamic obstacle track prediction method of mobile robot based on laser data | |
CN108268036A (en) | A kind of novel robot intelligent barrier avoiding system | |
Xue et al. | Fuzzy controller for autonomous vehicle based on rough sets | |
CN103139812B (en) | Mobile node formation obstacle avoidance method based on wireless sensor network | |
Hao et al. | Asynchronous information fusion in intelligent driving systems for target tracking using cameras and radars | |
CN113741550B (en) | Mobile robot following method and system | |
CN101813780A (en) | GPS and multi-transducer integrated method for tracing mobile target in real time |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C12 | Rejection of a patent application after its publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20120411 |