CN105892461A - Method and system for matching and recognizing a robot's environment against a map - Google Patents

Method and system for matching and recognizing a robot's environment against a map

Info

Publication number
CN105892461A
CN105892461A (application CN201610244332.3A)
Authority
CN
China
Prior art keywords
robot
map
laser
local environment
total number
Prior art date
Legal status
Granted
Application number
CN201610244332.3A
Other languages
Chinese (zh)
Other versions
CN105892461B (en)
Inventor
徐清霞
张小*
章征贵
Current Assignee
Pingyi Economic Development Zone Investment Development Co ltd
Original Assignee
Shanghai View Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai View Technologies Co Ltd filed Critical Shanghai View Technologies Co Ltd
Priority to CN201610244332.3A priority Critical patent/CN105892461B/en
Publication of CN105892461A publication Critical patent/CN105892461A/en
Application granted granted Critical
Publication of CN105892461B publication Critical patent/CN105892461B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02: Control of position or course in two dimensions
    • G05D 1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D 1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D 1/0234: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • G05D 1/0236: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons in combination with a laser
    • G05D 1/0238: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors
    • G05D 1/024: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using obstacle or wall sensors in combination with a laser
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/22: Matching criteria, e.g. proximity measures

Abstract

The invention provides a method for matching and recognizing a robot's environment against a map. The method comprises the following steps: S10: using a particle filter to localize the robot in its environment and obtain the robot's position information; S20: calculating the matching degree between the robot's environment and the map that has been read; S30: determining whether the matching degree between the robot's environment and the map satisfies the matching requirement; S40: if the matching degree does not satisfy the matching requirement, the robot's environment does not match the map; otherwise, the robot's environment matches the map. By localizing the robot with a particle filter, calculating the matching degree between the environment and the map, and judging whether that degree satisfies the matching requirement, the method and system can identify whether the environment matches the map, thereby improving the robot's adaptability to complex environments and its own safety.

Description

Method and system for matching and recognizing a robot's environment against a map
Technical field
The present invention relates to the field of robotics, and in particular to a method and system for matching and recognizing a robot's environment against a map.
Background technology
As science and technology advance, mobile robots of ever more kinds are being widely used in industry, agriculture, fire fighting, services and other fields, and are well liked. Before a mobile robot can complete various tasks autonomously, a map and a planned path on that map must be provided, and the mobile robot then walks along the planned path.
However, while walking, a mobile robot is likely to encounter unexpected situations: a large change in the robot's environment affects the accuracy of its localization, and if localization fails the robot may go out of control. The demands on a mobile robot's environmental adaptability are therefore ever higher: it must be judged in time whether the mobile robot's environment matches the map, so that unnecessary collisions are avoided.
Summary of the invention
To solve the problem of determining whether a mobile robot's environment matches the map, the present invention provides a method and system for matching and recognizing a robot's environment against a map. The method and system identify whether the robot's environment matches the map, thereby improving the robot's adaptability to complex environments and its own safety.
To achieve the above objects of the invention, the present invention is realized by the following technical solutions:
The present invention provides a method for matching and recognizing a robot's environment against a map, comprising the steps of: S10, using a particle filter to localize the robot in its environment and obtain the robot's position information; S20, calculating the matching degree between the robot's environment and the map that has been read; S30, judging whether the matching degree between the robot's environment and the map satisfies the matching requirement; S40, if the matching degree does not satisfy the matching requirement, the robot's environment does not match the map; otherwise, the robot's environment matches the map.
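Steps S10-S40 above reduce to a simple decision routine. The following Python sketch is illustrative only: the callables `localize` and `compute_matching_degree` are hypothetical stand-ins for the particle-filter localizer and the map-matching score described later in the specification, and only the decision logic of the claim is shown.

```python
def match_environment_to_map(localize, compute_matching_degree, matching_threshold):
    """Sketch of steps S10-S40: localize, score the match, decide.

    localize() -> robot pose; compute_matching_degree(pose) -> float.
    Both callables are hypothetical stand-ins, not the patent's own API.
    """
    pose = localize()                        # S10: particle-filter localization
    degree = compute_matching_degree(pose)   # S20: environment-vs-map matching degree
    meets = degree >= matching_threshold     # S30: does it satisfy the requirement?
    return meets                             # S40: True -> matched, False -> mismatch
```

In use, a caller would pass the real localizer and scorer; the threshold is the "preset matching threshold" of step S33.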
Further, the matching and recognition method further comprises the step of: S01, using a laser sensor to acquire laser data of the robot's environment, the laser data including the total number of received laser beams, the total number of emitted laser beams, the laser beam length and the laser angle.
Further, step S30 further comprises the steps of: S31, according to the laser sensor's received-beam count, calculating the ratio of received laser beams to emitted laser beams; S32, judging whether that ratio exceeds the laser validity threshold; S33, if the ratio exceeds the laser validity threshold, judging whether the calculated matching degree is less than the preset matching threshold; S34, if the calculated matching degree is less than the preset matching threshold, timing how long the matching degree stays below the preset matching threshold and recording the robot's initial pose and current pose; S35, judging whether the measured duration exceeds the preset time threshold; S36, if the measured duration exceeds the preset time threshold, calculating the robot's displacement difference from its initial pose and current pose, and otherwise jumping back to step S31; S37, judging whether the calculated displacement difference exceeds the preset displacement threshold; if the calculated displacement difference exceeds the preset displacement threshold, the robot's environment does not match the map; otherwise, the robot's environment matches the map.
Further, step S10 further comprises the step of: S11, using a particle filter to localize the robot in its environment and obtain the robot's pose; the localization formulas are as follows:
x(t) = f(x(t-1), u(t));
y(t) = g(x(t), z(t), m);
where x(t) is the robot's pose at time t, x(t-1) is the robot's pose at time t-1, and u(t) is the robot's control input at time t; y(t) is the laser data of the robot's environment, z(t) is the simulated measurement-noise value, and m is the map information.
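The specification leaves the state-transition function f in x(t) = f(x(t-1), u(t)) abstract. As an illustration only, the sketch below propagates a single particle with a generic unicycle motion model plus optional Gaussian noise; the model form and its parameters are assumptions for illustration, not the patent's own definition of f.

```python
import math
import random

def motion_model(pose, control, noise_std=0.0, rng=random):
    """x(t) = f(x(t-1), u(t)): propagate one particle for a unit timestep.

    pose = (x, y, theta); control = (v, omega).  A generic unicycle
    model is assumed here; with noise_std=0 the update is deterministic.
    """
    x, y, theta = pose
    v, omega = control
    x += (v + rng.gauss(0.0, noise_std)) * math.cos(theta)
    y += (v + rng.gauss(0.0, noise_std)) * math.sin(theta)
    theta += omega + rng.gauss(0.0, noise_std)
    return (x, y, theta)
```

A particle filter would apply this update to every particle before weighting each one against the laser observation y(t).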
Further, step S20 further comprises the step of: S21, using a Gaussian density function to compute the weight p of a single particle at time t, and accumulating the weights of all particles as the matching degree between the robot's environment and the map; the single-particle weight formula is as follows:
p = α·e^(−l²/β) + c, with c = 1/(2d);
where α is the Gaussian distribution coefficient, d is the laser beam length in the laser data, β is the variance of the Gaussian distribution, and l is the distance between an obstacle in the robot's environment and the corresponding obstacle in the map.
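The single-particle weight p = α·e^(−l²/β) + c with c = 1/(2d) can be computed directly. In the sketch below the values of α and β are illustrative assumptions (the patent does not fix them), and the accumulation of weights into the matching degree follows step S21.

```python
import math

def particle_weight(l, d, alpha=1.0, beta=0.5):
    """p = alpha * exp(-l^2 / beta) + c, with c = 1 / (2 d).

    l: distance between an observed obstacle and the nearest map obstacle;
    d: laser beam length (range reading).  alpha, beta: illustrative
    Gaussian coefficient and variance.
    """
    c = 1.0 / (2.0 * d)
    return alpha * math.exp(-l ** 2 / beta) + c

def matching_degree(obstacle_distances, d, alpha=1.0, beta=0.5):
    """S21: accumulate all particle weights into the matching degree."""
    return sum(particle_weight(l, d, alpha, beta) for l in obstacle_distances)
```

Note that the weight falls off as l grows, matching the specification's remark that a smaller l means a higher matching degree.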
Further, the matching and recognition method further comprises the steps of: S50, when the robot's environment matches the map, the robot walks along the planned path; S60, when the robot's environment does not match the map, the robot raises an alarm and stops walking along the planned path.
Further, the matching and recognition method further comprises the steps of: S51, while the robot walks along the planned path, judging whether the robot has reached the end position; S52, if the robot has reached the end position, the robot stops walking; otherwise, jump to step S01.
Further, a system implementing the matching and recognition method comprises: a localization module, for localizing the robot in its environment with a particle filter and obtaining the robot's pose; a computing module, electrically connected to the localization module, for calculating the matching degree between the robot's environment and the map that has been read; a judging module, electrically connected to the computing module, for judging whether the matching degree between the robot's environment and the map satisfies the matching requirement; and a match-recognition module, electrically connected to the judging module, for recognizing that the robot's environment does not match the map if the matching degree does not satisfy the matching requirement, and that the environment matches the map otherwise.
Further, the match-recognition system comprises a data-acquisition module, for acquiring laser data of the robot's environment with a laser sensor, the laser data including the total number of received laser beams, the total number of emitted laser beams, the laser beam length and the laser angle.
Further, the computing module is also used for calculating, from the laser sensor's received-beam count, the ratio of received laser beams to emitted laser beams; the judging module is also used for judging whether that ratio exceeds the laser validity threshold; if it does, the judging module further judges whether the calculated matching degree is less than the preset matching threshold; if it is, the computing module times how long the matching degree stays below the preset matching threshold, and a recording module records the robot's initial pose and current pose; the judging module further judges whether the measured duration exceeds the preset time threshold; if it does, the computing module calculates the robot's displacement difference from its initial pose and current pose; the judging module then judges whether that displacement difference exceeds the preset displacement threshold; if it does, the match-recognition module recognizes that the robot's environment does not match the map; otherwise, the match-recognition module recognizes that the robot's environment matches the map.
The present invention has at least one of the following beneficial effects:
1. The invention localizes the robot in its environment with a particle filter, calculates the matching degree between the environment and the map, and judges whether that degree satisfies the matching requirement, thereby identifying whether the environment matches the map and improving the robot's adaptability to complex environments and its own safety.
2. After identifying whether the robot's environment matches the map, the invention controls whether the robot walks along the planned path, providing a handling strategy for an environment that changes in real time: for example, on encountering an obstacle, the robot may detour if the matching requirement is met, or re-plan its path if the matching degree fails the requirement.
3. The invention applies not only to a robot walking along a planned path, recognizing in a complex environment that its environment does not match the map; it also applies to judging whether the robot's environment is consistent with a given map, so that the user can be prompted when the wrong map has been selected.
Brief description of the drawings
The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments:
Fig. 1 is a schematic diagram of a method for matching and recognizing a robot's environment against a map;
Fig. 2 is a partial schematic diagram of a method for matching and recognizing a robot's environment against a map;
Fig. 3 is a schematic diagram of another method for matching and recognizing a robot's environment against a map;
Fig. 4 is a schematic diagram of yet another method for matching and recognizing a robot's environment against a map;
Fig. 5 is another partial schematic diagram of a method for matching and recognizing a robot's environment against a map;
Fig. 6 is a structural diagram of a match-recognition system for a robot's environment and a map;
Fig. 7 is a structural diagram of another match-recognition system for a robot's environment and a map;
In the figures:
10: localization module; 11: data-acquisition module; 20: computing module; 21: recording module; 30: judging module; 40: match-recognition module; 50: control module; 60: alarm module.
Detailed description of the invention
To illustrate the embodiments of the present invention and the technical solutions of the prior art more clearly, the drawings needed in the embodiments and in the description of the prior art are briefly described below. The following description and drawings are exemplary with respect to the present invention and should not be understood as limiting it. Numerous specific details are described to facilitate understanding of the present invention; in some instances, however, well-known or routine details are not described, to keep the specification concise.
As shown in Fig. 1, according to one embodiment of the present invention, a method for matching and recognizing a robot's environment against a map comprises the following steps:
S10: use a particle filter to localize the robot in its environment and obtain the robot's position information;
S20: calculate the matching degree between the robot's environment and the map that has been read;
S30: judge whether the matching degree between the robot's environment and the map satisfies the matching requirement;
S40: if the matching degree does not satisfy the matching requirement, the robot's environment does not match the map;
S41: if the matching degree satisfies the matching requirement, the robot's environment matches the map.
As shown in Figs. 1 and 2, according to another embodiment of the present invention, a method for matching and recognizing a robot's environment against a map comprises the following steps:
S01: use a laser sensor to acquire laser data of the robot's environment, the laser data including the total number of received laser beams, the total number of emitted laser beams, the laser beam length and the laser angle; preferably, the laser sensor may be a model RPLIDAR A1M1;
S10: use a particle filter to localize the robot in its environment and obtain the robot's position information;
S20: calculate the matching degree between the robot's environment and the map that has been read;
S31: from the laser sensor's received-beam count, calculate the ratio of received laser beams to emitted laser beams;
S32: judge whether the ratio of received beams to emitted beams exceeds the laser validity threshold, the laser validity threshold being the ratio of received to emitted beams when an obstacle is encountered;
S33: if the ratio of received beams to emitted beams exceeds the laser validity threshold, judge whether the calculated matching degree is less than the preset matching threshold; otherwise, jump to step S41;
S34: if the calculated matching degree is less than the preset matching threshold, time how long the matching degree stays below the preset matching threshold, and record the robot's initial pose and current pose; otherwise, jump to step S41;
S35: judge whether the measured duration exceeds the preset time threshold;
S36: if the measured duration exceeds the preset time threshold, calculate the robot's displacement difference from its initial pose and current pose; otherwise, jump to step S31;
S37: judge whether the calculated displacement difference exceeds the preset displacement threshold;
S40: if the calculated displacement difference exceeds the preset displacement threshold, the robot's environment does not match the map;
S41: if the calculated displacement difference does not exceed the preset displacement threshold, the robot's environment matches the map.
Specifically, the received-beam count is the number of laser beams reflected back. The laser sensor probes the robot's environment: if the environment is open and largely beyond the laser's measurement range, the fraction of emitted beams that are reflected back will be below the laser validity threshold; if the environment is rich in information, for example with a complex distribution of obstacles, most beams are reflected back to the sensor and the fraction of emitted beams that return exceeds the laser validity threshold.
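The received-to-emitted ratio test of steps S31-S32 reduces to a single division and comparison. In the sketch below the laser validity threshold value is an assumption for illustration; the patent leaves the concrete number open.

```python
def laser_ratio_valid(received, emitted, legal_threshold=0.6):
    """S31-S32: fraction of emitted laser beams that return.

    In open space many beams exceed the sensor range and never return,
    so the ratio falls below the validity ('legal') threshold; in a
    cluttered environment most beams reflect back.  The threshold value
    0.6 is an illustrative assumption.
    """
    ratio = received / emitted
    return ratio > legal_threshold, ratio
```

Only when this test passes does the method go on to compare the matching degree against the preset matching threshold (step S33).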
The matching degree between the robot's environment and the map is then judged against the preset matching threshold: a matching degree below the threshold indicates that an observed obstacle is not present in the map; a matching degree at or above the threshold indicates that the obstacle is present in the map and the robot's environment matches the map.
Next it is judged whether the matching degree has stayed below the preset matching threshold for longer than the preset time threshold. A duration above the time threshold indicates an obstacle that will not disappear within a short time and can block the robot's walking; a duration within the time threshold indicates an obstacle that may disappear shortly and will not block the robot, in which case the robot's environment still matches the map.
Finally it is judged whether the robot's displacement difference exceeds the preset displacement threshold. A displacement above the threshold means the robot still observes a large environmental change after moving some distance; this rules out the case of a robot sitting at a fixed spot with a low matching degree for some time while the environment has in fact changed little. In that case the robot's environment does not match the map and the robot must be controlled accordingly. A displacement at or below the threshold means the robot's environment still matches the map.
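The combined duration-and-displacement test of steps S35-S37 can be sketched as follows; the pose tuples and threshold values in the usage are illustrative assumptions.

```python
import math

def mismatch_decision(low_match_duration, start_pose, current_pose,
                      time_threshold, displacement_threshold):
    """S35-S37: confirm a mismatch only if the matching degree has been
    low for longer than the time threshold AND the robot has moved more
    than the displacement threshold in the meantime.

    The first condition rules out a transient obstacle; the second rules
    out a stationary robot stuck at one low-scoring spot.
    """
    if low_match_duration <= time_threshold:
        return False  # obstacle may disappear shortly; still matched
    dx = current_pose[0] - start_pose[0]
    dy = current_pose[1] - start_pose[1]
    displacement = math.hypot(dx, dy)
    return displacement > displacement_threshold
```

Requiring both conditions makes the mismatch verdict robust to short-lived obstacles and to low matching scores observed from a single fixed position.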
According to still another embodiment, a method for matching and recognizing a robot's environment against a map comprises the following steps:
S01: use a laser sensor to acquire laser data of the robot's environment, the laser data including the total number of received laser beams, the total number of emitted laser beams, the laser beam length and the laser angle;
S11: use a particle filter to localize the robot in its environment and obtain the robot's pose; the localization formulas are as follows:
x(t) = f(x(t-1), u(t));
y(t) = g(x(t), z(t), m);
where x(t) is the robot's pose at time t, x(t-1) is the robot's pose at time t-1, and u(t) is the robot's control input at time t; y(t) is the laser data of the robot's environment, z(t) is a random observation-noise value between 0 and 1, and m is the map information; in the present invention, z(t) is precisely the term c in the weight formula;
S21: use a Gaussian density function to compute the weight p of a single particle at time t, and accumulate the weights of all particles as the matching degree between the robot's environment and the map; the single-particle weight formula is as follows:
p = α·e^(−l²/β) + c, with c = 1/(2d);
where α is the Gaussian distribution coefficient, d is the laser beam length in the laser data, β is the variance of the Gaussian distribution, and l is the distance between an obstacle in the robot's environment and the corresponding obstacle in the map; the smaller l is, the higher the matching degree and the smaller the environmental change;
S31: from the laser sensor's received-beam count, calculate the ratio of received laser beams to emitted laser beams;
S32: judge whether the ratio of received beams to emitted beams exceeds the laser validity threshold, the laser validity threshold being the ratio of received to emitted beams when an obstacle is encountered;
S33: if the ratio of received beams to emitted beams exceeds the laser validity threshold, judge whether the calculated matching degree is less than the preset matching threshold;
S34: if the calculated matching degree is less than the preset matching threshold, time how long the matching degree stays below the preset matching threshold, and record the robot's initial pose and current pose;
S35: judge whether the measured duration exceeds the preset time threshold;
S36: if the measured duration exceeds the preset time threshold, calculate the robot's displacement difference from its initial pose and current pose; otherwise, jump to step S31;
S37: judge whether the calculated displacement difference exceeds the preset displacement threshold;
S40: if the calculated displacement difference exceeds the preset displacement threshold, the robot's environment does not match the map;
S41: if the calculated displacement difference does not exceed the preset displacement threshold, the robot's environment matches the map.
Specifically, the present invention only briefly describes the particle-filtering principle; for its detailed principle, reference may be made to the distributed-edge unscented-particle-filter simultaneous localization and mapping method of Application No. 201310424318.8.
As shown in Fig. 3, according to yet another embodiment of the present invention, a method for matching and recognizing a robot's environment against a map comprises the following steps:
S01: use a laser sensor to acquire laser data of the robot's environment, the laser data including the total number of received laser beams, the total number of emitted laser beams, the laser beam length and the laser angle;
S11: use a particle filter to localize the robot in its environment and obtain the robot's pose; the localization formulas are as follows:
x(t) = f(x(t-1), u(t));
y(t) = g(x(t), z(t), m);
where x(t) is the robot's pose at time t, x(t-1) is the robot's pose at time t-1, and u(t) is the robot's control input at time t; y(t) is the laser data of the robot's environment, z(t) is the simulated measurement-noise value, and m is the map information;
S21: use a Gaussian density function to compute the weight p of a single particle at time t, and accumulate the weights of all particles as the matching degree between the robot's environment and the map; the single-particle weight formula is as follows:
p = α·e^(−l²/β) + c, with c = 1/(2d);
where α is the Gaussian distribution coefficient, d is the laser beam length in the laser data, β is the variance of the Gaussian distribution, and l is the distance between an obstacle in the robot's environment and the corresponding obstacle in the map;
S31: from the laser sensor's received-beam count, calculate the ratio of received laser beams to emitted laser beams;
S32: judge whether the ratio of received beams to emitted beams exceeds the laser validity threshold, the laser validity threshold being the ratio of received to emitted beams when an obstacle is encountered;
S33: if the ratio of received beams to emitted beams exceeds the laser validity threshold, judge whether the calculated matching degree is less than the preset matching threshold;
S34: if the calculated matching degree is less than the preset matching threshold, time how long the matching degree stays below the preset matching threshold, and record the robot's initial pose and current pose;
S35: judge whether the measured duration exceeds the preset time threshold;
S36: if the measured duration exceeds the preset time threshold, calculate the robot's displacement difference from its initial pose and current pose; otherwise, jump to step S31;
S37: judge whether the calculated displacement difference exceeds the preset displacement threshold;
S40: if the calculated displacement difference exceeds the preset displacement threshold, the robot's environment does not match the map;
S41: if the calculated displacement difference does not exceed the preset displacement threshold, the robot's environment matches the map;
S50: when the robot's environment matches the map, the robot walks along the planned path;
preferably, the method also includes step S51: while the robot walks along the planned path, judge whether the robot has reached the end position;
preferably, the method also includes step S52: if the robot has reached the end position, the robot stops walking; otherwise, jump to step S01;
S60: when the robot's environment does not match the map, the robot raises an alarm and stops walking.
As shown in Fig. 6, according to one embodiment of the present invention, a match-recognition system for a robot's environment and a map comprises:
a localization module 10, for localizing the robot in its environment with a particle filter and obtaining the robot's pose;
a computing module 20, electrically connected to the localization module 10, for calculating the matching degree between the robot's environment and the map that has been read;
a judging module 30, electrically connected to the computing module 20, for judging whether the matching degree between the robot's environment and the map satisfies the matching requirement;
a match-recognition module 40, electrically connected to the judging module 30, for recognizing that the robot's environment does not match the map if the matching degree does not satisfy the matching requirement, and that the environment matches the map otherwise.
As shown in Fig. 7, according to another embodiment of the present invention, a system for matching and recognizing a robot's environment against a map includes:
a data acquisition module 11, configured to obtain laser data of the robot's environment with a laser sensor, the laser data including the total number of received laser beams, the total number of emitted laser beams, the laser lengths and the laser angles;
a locating module 10, configured to locate the robot within its environment using a particle filter and obtain the pose of the robot, with the localization equations:
x(t) = f(x(t-1), u(t));
y(t) = g(x(t), z(t), m);
where x(t) is the pose of the robot at time t, x(t-1) is the pose of the robot at time t-1, u(t) is the control input of the robot at time t, y(t) is the laser data of the robot's environment, z(t) is the simulated measurement noise, and m is the map information;
a computing module 20, configured to calculate the weight p of each particle at time t with a Gaussian distribution density function, and to sum the weights of all particles into the matching degree between the robot's environment and the map; the weight of a single particle is calculated as:
p = αe^(−l²/β) + c, c = 1/(2d);
where α is the Gaussian distribution coefficient, d is the laser length in the laser data, β is the variance of the Gaussian distribution, and l is the distance between an obstacle in the robot's environment and the corresponding obstacle in the map;
the computing module 20 is further configured to calculate, from the laser sensor's beam counts, the ratio of the total number of received beams to the total number of emitted beams;
a judging module 30, configured to judge whether the ratio of received beams to emitted beams exceeds the laser validity threshold;
if the ratio exceeds the laser validity threshold, the judging module 30 is further configured to judge whether the calculated matching degree is below the preset matching threshold;
if the calculated matching degree is below the preset matching threshold, the computing module 20 is further configured to calculate the duration for which the matching degree stays below the preset matching threshold, and a recording module 21 records the initial pose and the current pose of the robot;
the judging module 30 is further configured to judge whether the calculated duration exceeds the preset time threshold;
if the calculated duration exceeds the preset time threshold, the computing module 20 is further configured to calculate the displacement difference of the robot from its initial pose and current pose;
the judging module 30 is further configured to judge whether the calculated displacement difference exceeds the preset displacement threshold;
if the calculated displacement difference exceeds the preset displacement threshold, the match recognition module 40 recognizes that the robot's environment does not match the map; otherwise, the match recognition module 40 recognizes that the robot's environment matches the map;
a control module 50, configured to make the robot walk according to the planned path when the robot's environment matches the map;
preferably, the judging module 30 is further configured to judge, while the robot walks according to the planned path, whether the robot has reached the end position;
preferably, if the robot has reached the end position, the control module 50 is further configured to make the robot stop walking;
an alarm module 60, configured to raise an alarm when the robot's environment does not match the map, the control module 50 then making the robot stop walking.
As shown in Figs. 4 and 5, according to one embodiment of the present invention, a method for matching and recognizing a robot's environment against a map comprises the following steps:
S100: read the global map data; map building is assumed to be complete at this point;
S200: read the laser data acquired by the laser sensor; the robot perceives the outside environment through the laser sensor, which is mounted at the front of the robot; the laser data includes the laser angles and the corresponding laser lengths; the laser sensor model is RPLIDAR A1M1;
S300: obtain the robot's walking path and control the robot to walk, so that the robot operates normally under the global path planning and control algorithms;
S400: localize with the particle filter algorithm. The principle of particle filter localization is:
x(t) = f(x(t-1), u(t)); (1)
y(t) = g(x(t), z(t), m); (2)
where x(t) is the pose of the robot at time t, x(t-1) is the pose of the robot at time t-1, u(t) is the control input of the robot at time t, y(t) is the laser data of the robot's environment, z(t) is the simulated measurement noise, and m is the map information;
formula (1) is the state equation and formula (2) is the observation equation. The sample set from the previous moment is substituted into the state equation to predict a new sample set, also called the particle set (candidate robot poses). The new particle set is then substituted into the observation equation, and the resulting observations (laser data) are compared against the map information to obtain the weights of the particles (also called the matching degree);
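As a rough illustration of equations (1) and (2), the predict/weight/resample cycle of a particle filter can be sketched as below. The one-dimensional motion model, the noise levels, and the `expected_range` map lookup are all hypothetical stand-ins for the f, g, and m of the patent, which does not specify their concrete forms.

```python
import math
import random

def predict(particles, u, motion_noise=0.05):
    """State equation x(t) = f(x(t-1), u(t)): propagate each particle
    through a (hypothetical) 1-D motion model plus Gaussian noise."""
    return [x + u + random.gauss(0.0, motion_noise) for x in particles]

def update_weights(particles, observation, expected_range, sigma=0.2):
    """Observation equation y(t) = g(x(t), z(t), m): weight each particle
    by how well the map-predicted range agrees with the measured one."""
    weights = []
    for x in particles:
        err = observation - expected_range(x)   # expected_range is a stand-in map lookup
        weights.append(math.exp(-err * err / (2 * sigma * sigma)))
    total = sum(weights)
    return [w / total for w in weights]         # normalised weights

def resample(particles, weights):
    """Draw a new particle set with probability proportional to the weights."""
    return random.choices(particles, weights=weights, k=len(particles))
```

With a wall at range 5 and a measured range of 4, the particle at x = 1 (whose predicted range is exactly 4) absorbs almost all of the normalised weight, which is the comparison against map information that the text describes.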
S410: initialize the particles by creating uniformly distributed particles (samples) of equal weight over the map; the particle distribution uses a Gaussian probability density function;
S420: sample with the Gaussian distribution function to obtain the particle set;
S430: obtain the weight of each particle from the sampled particles and the laser observation data, and calculate the total weight (matching degree) of all particles; the weight is computed as:
p = αe^(−l²/β) + c, c = 1/(2d); (3)
where α is the Gaussian distribution coefficient, e.g. α = 0.5; d is the maximum laser length in the laser data, e.g. d = 6 m, so that c = 1/12; β is the variance of the Gaussian distribution, e.g. β = 0.08; and l is the distance between an obstacle in the robot's environment and the corresponding obstacle in the map — the smaller the distance, the higher the matching degree and the smaller the environmental change;
S440: normalize the weights, and finally resample to obtain the optimal pose of the robot;
the total weight of all particles can be calculated from formula (3), from which the actual position of the robot is located; the initial state x(0) is a uniformly distributed sample set. Note that particle filter localization also involves further steps, such as sampling and resampling, that are not repeated here;
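The single-particle weight of formula (3), with the example values given above (α = 0.5, β = 0.08, d = 6 m), can be sketched as follows; the defaults merely echo the examples in the text.

```python
import math

def particle_weight(l, alpha=0.5, beta=0.08, d=6.0):
    """Formula (3): p = alpha * exp(-l**2 / beta) + c, with c = 1/(2*d).

    l is the distance between an obstacle seen by the laser and the
    corresponding obstacle in the map; a small l means a close match,
    so the weight decays from alpha + c toward the floor c = 1/12.
    """
    c = 1.0 / (2.0 * d)
    return alpha * math.exp(-l * l / beta) + c
```

A perfect hit (l = 0) scores 0.5 + 1/12, while a particle whose predicted obstacle lies far from any map obstacle keeps only the floor c = 1/12; the floor keeps weights strictly positive so resampling never divides by zero.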
S500: calculate the matching degree between the robot's environment and the map. The factors influencing the matching degree are: whether the laser data is valid, the total particle weight, how long the total weight stays low, and the robot's displacement. Only if the laser data is valid (i.e. most of the emitted beams are reflected back; otherwise the laser data is unreliable and continuing the matching-degree calculation is meaningless), the particle weight stays low for a certain time, and the robot has moved, can it be concluded that the robot's environment does not match the map information. Six thresholds are therefore set: the laser validity threshold (laser_count_thred), the total weight threshold (total_weight_thred), the time threshold (last_time_thred), and the robot displacement thresholds (x_thred, y_thred, theta_thred), where the robot's displacement only needs to exceed any one of the displacement thresholds;
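The six thresholds of step S500 can be gathered into one configuration object; the numeric defaults below are illustrative assumptions only, as the patent names the parameters without fixing their values.

```python
from dataclasses import dataclass

@dataclass
class MatchThresholds:
    """The six thresholds of step S500 (illustrative values)."""
    laser_count_thred: float = 0.6   # min fraction of emitted beams received
    total_weight_thred: float = 0.5  # min total particle weight (matching degree)
    last_time_thred: float = 5.0     # seconds the total weight may stay low
    x_thred: float = 0.3             # metres
    y_thred: float = 0.3             # metres
    theta_thred: float = 0.35        # radians

def displacement_exceeds(t: MatchThresholds, dx: float, dy: float, dtheta: float) -> bool:
    """Per the text, the displacement test is met if ANY one of the
    three components exceeds its threshold."""
    return abs(dx) > t.x_thred or abs(dy) > t.y_thred or abs(dtheta) > t.theta_thred
```

Keeping the thresholds in one dataclass makes it easy to tune them per deployment site without touching the matching logic.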
S501: calculate the ratio of the total number of received beams to the total number of emitted beams; this mainly guards against the false alarm that arises in open areas, where few beams return and the total weight stays low;
S502: judge whether the ratio of received beams meets the requirement, i.e. exceeds the laser validity threshold (laser_count_thred); if so, go to S503; otherwise go to S504;
S503: obtain the total particle weight from formula (3);
S504: exit the matching-degree identification;
S505: judge whether the total weight meets the requirement; if the total weight is below the set weight threshold (total_weight_thred), go to S506; otherwise go to S504;
S506: obtain the pose of the robot, then go to S507;
S507: judge whether this is the first time the total weight has fallen below the threshold; if so, go to S508; otherwise go to S509;
S508: record the current moment as the initial time; the robot's pose at this moment is the initial pose for matching-degree identification;
S509: record the current time and the robot's pose;
S510: calculate the time difference between the records of S508 and S509, and judge whether the duration has reached the set time threshold (last_time_thred); if so, go to S511; otherwise go to S501;
S511: calculate the displacement difference between the records of S508 and S509, and judge whether the displacement exceeds any of the set displacement thresholds (x_thred, y_thred, theta_thred); if so, go to S512; otherwise go to S504;
S512: the environment does not match the map; output the result;
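Steps S501 through S512 can be sketched as a single pass over incoming scans. The dictionary field names, the threshold values, and the assumption of one scan per second are illustrative, not from the patent.

```python
def matching_degree_check(scans, thresholds):
    """One pass over steps S501-S512.

    `scans` is a chronological iterable of dicts with keys 'received',
    'emitted', 'total_weight', and 'pose' (an (x, y, theta) tuple);
    `thresholds` is a dict carrying the six limits of S500. Returns
    'mismatch' (S512), 'match', or 'skip' (laser data not valid, S504).
    Scan index doubles as a timestamp, assuming one scan per second.
    """
    start = None                                              # S508 record
    for t, scan in enumerate(scans):
        ratio = scan['received'] / scan['emitted']            # S501
        if ratio <= thresholds['laser_count_thred']:          # S502
            return 'skip'                                     # S504
        if scan['total_weight'] >= thresholds['total_weight_thred']:  # S505
            return 'match'
        if start is None:                                     # S507: first low weight
            start = (t, scan['pose'])                         # S508
        last = (t, scan['pose'])                              # S509
        if last[0] - start[0] >= thresholds['last_time_thred']:       # S510
            dx = abs(last[1][0] - start[1][0])
            dy = abs(last[1][1] - start[1][1])
            dth = abs(last[1][2] - start[1][2])
            if (dx > thresholds['x_thred'] or dy > thresholds['y_thred']
                    or dth > thresholds['theta_thred']):      # S511
                return 'mismatch'                             # S512
            return 'match'                                    # low weight but stationary
    return 'match'
```

Note how the open-area guard of S501/S502 short-circuits the whole check: a low total weight caused by beams that never returned is never counted as evidence of a changed environment.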
S600: judge whether the matching degree meets the requirement; if so, continue walking along the initially planned path, i.e. go to S800; otherwise go to S700;
S700: raise an alarm and stop walking;
S800: walk on along the original path, then go to S900;
S900: judge whether the robot has finished walking; if not, go to S200.
It should be noted that the above embodiments can be freely combined as required. The above is only a preferred embodiment of the present invention; it should be pointed out that those skilled in the art can make several improvements and modifications without departing from the principle of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (10)

1. A method for matching and recognizing a robot's environment against a map, characterized by comprising the following steps:
S10: locating the robot within its environment using a particle filter and obtaining the pose of the robot;
S20: calculating the matching degree between the robot's environment and the loaded map;
S30: judging whether the matching degree between the robot's environment and the map meets the matching requirement;
S40: if the matching degree between the robot's environment and the map fails to meet the matching requirement, determining that the robot's environment does not match the map; otherwise, determining that the robot's environment matches the map.
2. The method for matching and recognizing a robot's environment against a map according to claim 1, characterized by further comprising the step of:
S01: obtaining laser data of the robot's environment with a laser sensor, the laser data including the total number of received laser beams, the total number of emitted laser beams, the laser lengths and the laser angles.
3. The method for matching and recognizing a robot's environment against a map according to claim 2, characterized in that step S30 further comprises the steps of:
S31: calculating, from the laser sensor's beam counts, the ratio of the total number of received beams to the total number of emitted beams;
S32: judging whether the ratio of received beams to emitted beams exceeds the laser validity threshold;
S33: if the ratio exceeds the laser validity threshold, judging whether the calculated matching degree is below the preset matching threshold;
S34: if the calculated matching degree is below the preset matching threshold, calculating the duration for which the matching degree stays below the preset matching threshold, and recording the initial pose and the current pose of the robot;
S35: judging whether the calculated duration exceeds the preset time threshold;
S36: if the calculated duration exceeds the preset time threshold, calculating the displacement difference of the robot from its initial pose and current pose; otherwise, jumping to step S31;
S37: judging whether the calculated displacement difference exceeds the preset displacement threshold; if the calculated displacement difference exceeds the preset displacement threshold, the robot's environment does not match the map; otherwise, the robot's environment matches the map.
4. The method for matching and recognizing a robot's environment against a map according to claim 2, characterized in that step S10 further comprises the step of:
S11: locating the robot within its environment using a particle filter and obtaining the pose of the robot, with the localization equations:
x(t) = f(x(t-1), u(t));
y(t) = g(x(t), z(t), m);
where x(t) is the pose of the robot at time t, x(t-1) is the pose of the robot at time t-1, u(t) is the control input of the robot at time t, y(t) is the laser data of the robot's environment, z(t) is the simulated measurement noise, and m is the map information.
5. The method for matching and recognizing a robot's environment against a map according to claim 4, characterized in that step S20 further comprises the step of:
S21: calculating the weight p of each particle at time t with a Gaussian distribution density function, and summing the weights of all particles into the matching degree between the robot's environment and the map; the weight of a single particle is calculated as:
p = αe^(−l²/β) + c, c = 1/(2d);
where α is the Gaussian distribution coefficient, d is the laser length in the laser data, β is the variance of the Gaussian distribution, and l is the distance between an obstacle in the robot's environment and the corresponding obstacle in the map.
6. The method for matching and recognizing a robot's environment against a map according to any one of claims 1 to 5, characterized by further comprising the steps of:
S50: when the robot's environment matches the map, the robot walks according to the planned path;
S60: when the robot's environment does not match the map, the robot raises an alarm and stops walking according to the planned path.
7. The method for matching and recognizing a robot's environment against a map according to claim 6, characterized by further comprising the steps of:
S51: while the robot walks according to the planned path, judging whether the robot has reached the end position;
S52: if the robot has reached the end position, the robot stops walking; otherwise, jumping to step S01.
8. A system applying the method for matching and recognizing a robot's environment against a map according to any one of claims 1 to 7, characterized by comprising:
a locating module, configured to locate the robot within its environment using a particle filter and obtain the pose of the robot;
a computing module, electrically connected to the locating module and configured to calculate the matching degree between the robot's environment and the loaded map;
a judging module, electrically connected to the computing module and configured to judge whether the matching degree between the robot's environment and the map meets the matching requirement;
a match recognition module, electrically connected to the judging module and configured to determine that the robot's environment does not match the map if the matching degree fails to meet the matching requirement, and that the robot's environment matches the map otherwise.
9. The system for matching and recognizing a robot's environment against a map according to claim 8, characterized by comprising:
a data acquisition module, configured to obtain laser data of the robot's environment with a laser sensor, the laser data including the total number of received laser beams, the total number of emitted laser beams, the laser lengths and the laser angles.
10. The system for matching and recognizing a robot's environment against a map according to claim 9, characterized in that:
the computing module is further configured to calculate, from the laser sensor's beam counts, the ratio of the total number of received beams to the total number of emitted beams;
the judging module is further configured to judge whether the ratio of received beams to emitted beams exceeds the laser validity threshold;
if the ratio exceeds the laser validity threshold, the judging module is further configured to judge whether the calculated matching degree is below the preset matching threshold;
if the calculated matching degree is below the preset matching threshold, the computing module is further configured to calculate the duration for which the matching degree stays below the preset matching threshold, and a recording module records the initial pose and the current pose of the robot;
the judging module is further configured to judge whether the calculated duration exceeds the preset time threshold;
if the calculated duration exceeds the preset time threshold, the computing module is further configured to calculate the displacement difference of the robot from its initial pose and current pose;
the judging module is further configured to judge whether the calculated displacement difference exceeds the preset displacement threshold;
if the calculated displacement difference exceeds the preset displacement threshold, the match recognition module recognizes that the robot's environment does not match the map; otherwise, the match recognition module recognizes that the robot's environment matches the map.
CN201610244332.3A 2016-04-13 2016-04-13 Method and system for matching and recognizing the environment where robot is and map Active CN105892461B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610244332.3A CN105892461B (en) 2016-04-13 2016-04-13 A kind of matching and recognition method and system of robot local environment and map


Publications (2)

Publication Number Publication Date
CN105892461A true CN105892461A (en) 2016-08-24
CN105892461B CN105892461B (en) 2018-12-04

Family

ID=56705071

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610244332.3A Active CN105892461B (en) 2016-04-13 2016-04-13 A kind of matching and recognition method and system of robot local environment and map

Country Status (1)

Country Link
CN (1) CN105892461B (en)


Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102576228A * 2009-08-31 2012-07-11 Neato Robotics, Inc. Method and apparatus for simultaneous localization and mapping of mobile robot environment
CN103186797A * 2011-12-29 2013-07-03 Industrial Technology Research Institute Visual positioning method and device
CN103576686A * 2013-11-21 2014-02-12 University of Science and Technology of China Automatic guide and obstacle avoidance method for robot
CN103631264A * 2013-12-04 2014-03-12 Zhangjiagang Institute of Industrial Technologies, Soochow University Method and device for simultaneous localization and mapping
CN103926925A * 2014-04-22 2014-07-16 Jiangsu Jiuxiang Automobile Appliance Group Co., Ltd. Improved VFH algorithm-based positioning and obstacle avoidance method and robot
JP2015146091A * 2014-02-03 2015-08-13 Toyota Motor Corp. Position estimation method of mobile robot


Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106323273A (en) * 2016-08-26 2017-01-11 深圳微服机器人科技有限公司 Robot relocation method and device
CN106323273B (en) * 2016-08-26 2019-05-21 深圳微服机器人科技有限公司 A kind of robot method for relocating and device
CN106568432B (en) * 2016-10-20 2019-07-09 上海物景智能科技有限公司 A kind of initial pose acquisition methods of mobile robot and system
CN106568432A (en) * 2016-10-20 2017-04-19 上海物景智能科技有限公司 Moving robot primary pose obtaining method and system
CN107820615A (en) * 2017-08-23 2018-03-20 深圳前海达闼云端智能科技有限公司 Send the method, apparatus and server of prompt message
CN108458706A (en) * 2017-12-25 2018-08-28 达闼科技(北京)有限公司 A kind of air navigation aid, device, cloud server and computer program product
CN108550134A (en) * 2018-03-05 2018-09-18 北京三快在线科技有限公司 It builds the determination method of figure effectiveness indicator and builds the determining device of figure effectiveness indicator
CN109035291B (en) * 2018-08-03 2020-11-20 重庆电子工程职业学院 Robot positioning method and device
CN109035291A (en) * 2018-08-03 2018-12-18 重庆电子工程职业学院 Robot localization method and device
CN110189366A (en) * 2019-04-17 2019-08-30 北京迈格威科技有限公司 A kind of laser rough registration method, apparatus, mobile terminal and storage medium
WO2021109166A1 (en) * 2019-12-06 2021-06-10 苏州艾吉威机器人有限公司 Three-dimensional laser positioning method and system
CN112082554A (en) * 2020-08-05 2020-12-15 深圳市优必选科技股份有限公司 Robot navigation method, device, terminal equipment and storage medium
CN111812613A (en) * 2020-08-06 2020-10-23 常州市贝叶斯智能科技有限公司 Mobile robot positioning monitoring method, device, equipment and medium
CN112097772A (en) * 2020-08-20 2020-12-18 深圳市优必选科技股份有限公司 Robot and map construction method and device thereof
WO2022036981A1 (en) * 2020-08-20 2022-02-24 深圳市优必选科技股份有限公司 Robot, and map construction method and device thereof
CN112097772B (en) * 2020-08-20 2022-06-28 深圳市优必选科技股份有限公司 Robot and map construction method and device thereof
CN112700495A (en) * 2020-11-25 2021-04-23 北京旷视机器人技术有限公司 Pose determination method and device, robot, electronic device and storage medium
CN112581535A (en) * 2020-12-25 2021-03-30 达闼机器人有限公司 Robot positioning method, device, storage medium and electronic equipment
WO2022134680A1 (en) * 2020-12-25 2022-06-30 达闼机器人股份有限公司 Method and device for robot positioning, storage medium, and electronic device
CN114407005A (en) * 2021-12-02 2022-04-29 国能铁路装备有限责任公司 Robot and walking control method and device thereof

Also Published As

Publication number Publication date
CN105892461B (en) 2018-12-04

Similar Documents

Publication Publication Date Title
CN105892461A (en) Method and system for matching and recognizing the environment where robot is and map
Alatise et al. A review on challenges of autonomous mobile robot and sensor fusion methods
EP2927769B1 (en) Localization within an environment using sensor fusion
Kleiner et al. Real‐time localization and elevation mapping within urban search and rescue scenarios
Das et al. Real-time vision-based control of a nonholonomic mobile robot
CN111220153B (en) Positioning method based on visual topological node and inertial navigation
Apostolopoulos et al. Integrated online localization and navigation for people with visual impairments using smart phones
CN112518739A (en) Intelligent self-navigation method for reconnaissance of tracked chassis robot
CN106643739A (en) Indoor environment personnel location method and system
CN113325837A (en) Control system and method for multi-information fusion acquisition robot
CN110293965A (en) Method of parking and control device, mobile unit and computer-readable medium
CN114683290B (en) Method and device for optimizing pose of foot robot and storage medium
Ruotsalainen et al. Improving computer vision-based perception for collaborative indoor navigation
KR102163462B1 (en) Path-finding Robot and Mapping Method Using It
CN110487281A (en) A kind of intelligent guidance system detecting barrier
Tardioli et al. A robotized dumper for debris removal in tunnels under construction
CN107820562A (en) A kind of air navigation aid, device and electronic equipment
Tur et al. A closed-form expression for the uncertainty in odometry position estimate of an autonomous vehicle
Siddiqui UWB RTLS for construction equipment localization: experimental performance analysis and fusion with video data
WO2022004333A1 (en) Information processing device, information processing system, information processing method, and program
CN110216675B (en) Control method and device of intelligent robot, intelligent robot and computer equipment
Lee et al. Fail-safe multi-modal localization framework using heterogeneous map-matching sources
Spassov et al. Map-matching for pedestrians via bayesian inference
KR101829348B1 (en) System for constructing and managing variable line information for constructing line information
KR101829342B1 (en) System for constructing and managing precise information of road using variable equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201218

Address after: 233399, 4 floor, building 3, industrial Acceleration Center, Wuhe County Economic Development Zone, Bengbu, Anhui.

Patentee after: WUHE ZHIKAI ENVIRONMENTAL PROTECTION TECHNOLOGY Co.,Ltd.

Patentee after: Zeng Yuee

Address before: 201702 212-214, block B, No. 599, Gaojing Road, Qingpu District, Shanghai

Patentee before: SHANGHAI VIEW TECHNOLOGIES Co.,Ltd.

TR01 Transfer of patent right

Effective date of registration: 20210913

Address after: 273300 No.207, East Second floor, Administration Committee of Shandong Pingyi Economic Development Zone, west section of Jinhua Road, Pingyi Economic Development Zone, Linyi City, Shandong Province

Patentee after: Pingyi Economic Development Zone Investment Development Co.,Ltd.

Address before: 233399, 4 floor, building 3, industrial Acceleration Center, Wuhe County Economic Development Zone, Bengbu, Anhui.

Patentee before: WUHE ZHIKAI ENVIRONMENTAL PROTECTION TECHNOLOGY Co.,Ltd.

Patentee before: Zeng Yuee