CN110196044A - A mapping method for a substation inspection robot based on GPS loop closure detection - Google Patents
A mapping method for a substation inspection robot based on GPS loop closure detection Download PDF Info
- Publication number
- CN110196044A CN110196044A CN201910453204.3A CN201910453204A CN110196044A CN 110196044 A CN110196044 A CN 110196044A CN 201910453204 A CN201910453204 A CN 201910453204A CN 110196044 A CN110196044 A CN 110196044A
- Authority
- CN
- China
- Prior art keywords
- laser
- pose
- data
- closed loop
- submap
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
Abstract
The invention belongs to the field of intelligent robotics and discloses a mapping method for a substation inspection robot based on GPS loop closure detection, comprising: the robot travels along a fixed route and acquires laser point cloud data while moving; in the global coordinate system, inter-frame laser point cloud data are stitched using the pose transformation between two adjacent laser frames and the previous frame's point cloud data; a tiled submap-building step; designated frames of each submap are chosen as key frames; the robot's current position is located in real time from GPS data, and a loop closure candidate region is defined as a circle centered on that position with a specified radius; each candidate pose in each submap within the candidate region is screened, and when a pose meets the screening conditions, the laser frame corresponding to that pose is connected with the key frame as a loop closure. With the mapping method of the invention for a substation inspection robot based on GPS loop closure detection, the amount of computation involved in localization is greatly reduced, and localization efficiency and accuracy are significantly improved.
Description
Technical field
The invention belongs to the field of intelligent robotics, and in particular relates to a mapping method for a substation inspection robot based on GPS loop closure detection.
Background technique
Building a map of the substation is a necessary step before a laser-navigated substation robot can operate; its goal is to record the substation in map data as completely and accurately as possible. A laser-navigated substation inspection robot compares this map with the surrounding information acquired in real time by its laser sensor and determines its position through a localization algorithm, so the accuracy of the substation map is critical to the robot's navigation and localization.
Two-dimensional laser mapping for a substation robot is the process of simultaneously computing the robot's own position from sensor information and constructing a map of the environment. A common approach is to collect data jointly with a lidar and wheel encoders and fuse the data afterwards to generate the substation map. This approach has the following shortcomings:
(1) relying on a single localization sensor makes it difficult to meet the robot's requirements for localization accuracy and reliability;
(2) large cumulative errors prevent some loops in the route from being closed;
(3) the mapping process produces a huge volume of data, and mapping efficiency is low.
Summary of the invention
The object of the present invention is to address the deficiencies of the prior art by providing a mapping method for a substation inspection robot based on GPS loop closure detection. The method can effectively improve mapping efficiency, precision and accuracy.
Specifically, the present invention is realized by the following technical solution, comprising:
The step of acquiring laser point cloud data:
the robot travels along a fixed route, recording two-dimensional laser data, odometry data, inertial measurement unit (IMU) data and GPS data while moving;
the initial value of the robot's pose is estimated;
the pose transformation between two adjacent laser frames is solved;
the two-dimensional laser data are processed into laser point cloud data;
the pose of the laser point cloud in the laser sensor coordinate system is computed.
The step of stitching laser point cloud data:
in the global coordinate system, the inter-frame laser point cloud data are stitched using the pose transformation between the two adjacent laser frames and the previous frame's point cloud data.
The tiled submap-building step:
submaps are generated from consecutive laser point cloud data, the laser data of a specified number of consecutive frames generating one submap; the first specified frames of each submap contain the laser point cloud data of the last frames of the previous submap.
The loop closure detection step:
designated frames of each submap are chosen as key frames;
the robot's current position is located in real time from GPS data, and the loop closure candidate region is defined as a circle centered on that position with a specified radius;
each candidate pose in each submap within the loop closure candidate region is screened; when a pose meets the screening conditions, the laser frame corresponding to that pose is connected with the key frame as a loop closure.
Further, the method for estimating the initial value of the robot's pose comprises:
establishing the robot's pose prediction equation from odometry and IMU data, used to update the particle states;
establishing an observation model from odometry, two-dimensional laser sensor data and GPS data, used to compute the particle weights;
obtaining the initial estimate of the robot pose as the weighted average over the particle set.
Further, in the tiled submap-building step, each submap is represented as an occupancy grid map, with distinct values marking each grid cell as free or occupied.
Further, in the tiled submap-building step, a loop closure detection is performed each time a submap is completed.
Further, the screening is specifically as follows:
The confidence of each submap in the loop closure candidate region is computed, where m is the total number of map grid cells occupied by the laser point cloud of a given laser frame in the submap; if the map coordinates of the n-th occupied grid cell are (xn, yn), a confidence is computed for that grid cell from the map value at (xn, yn).
According to the difference between each of the robot's poses in each submap and the pose corresponding to the current frame, the confidence weight of each candidate pose in each submap is computed, where x_offset is the displacement along the x-axis between a candidate pose in a submap and the current pose, y_offset is the displacement along the y-axis between them, trans_weight is the displacement weight, candidate.rotation is the rotation angle between the candidate pose and the current pose, and rot_weight is the rotation-angle weight.
The product of the submap confidence candidate_probability and the confidence weight candidate_weight of a candidate pose in that submap is taken as the confidence score of the current loop closure candidate pose:
Score = candidate_probability * candidate_weight
The pose with the highest confidence score is selected; when its score exceeds a given threshold, that pose meets the screening conditions.
Further, processing the two-dimensional laser data into laser point cloud data also comprises: after each laser frame is obtained, first checking whether the frame rate of the laser data meets a threshold requirement; if it is below the threshold, the requirement is not met, in which case the laser frame is not used, an alarm is reported, and the system waits to receive the next laser frame.
Further, after each laser frame is obtained, if its frame rate meets the threshold requirement, the laser reflection points in the frame are first filtered to remove points that are too close or too far away; the remaining reflection points are then used in computing each point's pose in the laser sensor coordinate system.
The beneficial effects of the present invention are as follows. The mapping method of the invention for a substation inspection robot based on GPS loop closure detection improves the robot's localization accuracy and reliability by combining multi-sensor information from the odometer, IMU and GPS. By extracting key frames and screening loop closure candidates with GPS positioning, the size of the matching candidate set is effectively reduced, loop closure detection is faster, and substation mapping efficiency improves. Using GPS positioning data in loop closure detection also effectively reduces the rate of false loop closures caused by similar-looking scenes, improving the precision and accuracy of loop closure detection and finally yielding a perfectly closed substation map.
Brief description of the drawings
Fig. 1 is the mapping flow chart of the embodiment of the present invention.
Fig. 2 is the robot localization flow chart of the embodiment of the present invention.
Fig. 3 is the loop closure detection flow of the embodiment of the present invention.
Fig. 4 is a schematic diagram of the loop closure candidate region of the embodiment of the present invention.
Detailed description
The present invention is described in further detail below through an embodiment and with reference to the drawings.
Embodiment:
One embodiment of the present invention describes a mapping method for a substation inspection robot based on GPS loop closure detection.
The robot used in this embodiment carries, as its main hardware sensors, an odometer, an inertial measurement unit and a three-dimensional laser sensor. The odometer is used to calibrate the predicted position, the inertial measurement unit to calibrate linear and angular velocity, and the three-dimensional laser sensor to acquire laser data. The robot's software runs on the Robot Operating System (ROS), a common robot software platform that provides operating-system-like services across heterogeneous computer clusters; the ROS system contains the node that implements the localization function. It should of course be understood that the method of the present invention can also be realized with other robot software systems; the use of ROS in this embodiment is only one possible implementation.
The map and the robot's navigation motion in this embodiment are assumed to lie within a two-dimensional plane. The coordinate systems used are the map coordinate system, the robot coordinate system and the laser sensor coordinate system.
The map coordinate system is the global coordinate system; it is used when computing the robot's pose.
The robot coordinate system has its origin at the robot, usually at the robot's center point in two-dimensional navigation.
The laser sensor coordinate system has its origin at the center of the laser sensor; the poses of laser data are expressed in the laser sensor coordinate system.
Data in different coordinate systems must be transformed into the same coordinate system before poses can be compared and computed. In the ROS system, coordinate conversion between different frames is handled by the TF module (the coordinate transformation module).
Referring to Fig. 1, the mapping method of this embodiment for a substation inspection robot based on GPS loop closure detection splits its main steps between a front end and a back end. The front-end processor is the industrial PC built into the robot; it acquires and processes the laser point cloud data, stitches the point clouds, builds the tiled submaps, and sends the laser frames of each submap to the back-end processor. The back-end processor, a computer with high computing performance, performs loop closure detection on the received submap laser frames and GPS data, and returns the generated occupancy grid map to the front-end processor. The specific steps are as follows:
1. Acquisition and processing of laser point cloud data
1-1) The substation inspection robot receives an inspection task and travels along a fixed route, recording two-dimensional laser data, odometry data, inertial measurement unit (IMU) data and GPS data at a fixed sampling frequency while moving.
1-2) A particle filter is used, combining the multi-sensor data from the odometer, IMU and GPS, to estimate the initial value of the robot's pose.
During the robot's real-time localization, some particles may drift too far from the target. The probability that such a particle represents the true system state is very small, so the weight it obtains is very small. If particle states were updated only through particle propagation, a drifted particle would stay drifted and would hardly return to the neighborhood of the target; drifted particles would then accumulate and bias the localization. In this embodiment, the initial pose is therefore estimated with a particle filter fusing multiple sensors. Specifically, the robot's pose prediction equation is established from odometry and IMU data, and an observation model is established from odometry, two-dimensional laser sensor data and GPS data. The pose prediction equation updates the particle states, the observation model computes the particle weights, and the initial estimate of the robot pose is finally obtained as the weighted average over the particle set, as shown in Fig. 2.
1-2-1) From the poses in two laser frames from the robot's two-dimensional laser sensor, the robot's initial pose P0(x0, y0, θ0) and initial velocity V0 are determined.
1-2-2) From the latest received odometry and IMU data, the robot's acceleration A_t = (a_{x,t}, a_{y,t}, a_{θ,t}) is obtained, where a_{x,t} is the robot's acceleration along the x-axis at time t, a_{y,t} its acceleration along the y-axis, and a_{θ,t} its angular acceleration. Combined with the initial velocity, the robot's velocity is updated and its pose prediction equation established. Let the robot's velocity at time t be V_t = (v_{x,t}, v_{y,t}, ω_t), where v_{x,t} is the robot's velocity along the x-axis at time t, v_{y,t} its velocity along the y-axis, and ω_t its angular velocity; then the pose prediction equation of the robot can be expressed as:
P_t = P_{t-1} + V_{t-1}·Δt + ½·A_{t-1}·Δt² + u_{t-1}
where P_t and P_{t-1} are the robot poses corresponding to the two consecutive laser frames received at times t and t-1, V_{t-1} is the robot's velocity at time t-1 (with components v_{x,t-1} along the x-axis, v_{y,t-1} along the y-axis and angular velocity ω_{t-1}), Δt is the time difference between times t and t-1, A_{t-1} is the robot's acceleration at time t-1, and u_{t-1} is the system noise at time t-1.
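The prediction step above can be sketched in a few lines of Python. This is not part of the patent's disclosure: the constant-acceleration form and the Gaussian noise stand-in for u_{t-1} follow the symbols defined in the text, and all names are chosen here for illustration.

```python
import random

def predict_pose(pose, vel, acc, dt, noise_std=0.0):
    """Pose prediction P_t = P_{t-1} + V_{t-1}*dt + 0.5*A_{t-1}*dt^2 + u_{t-1}.

    pose, vel, acc are (x, y, theta) tuples; noise_std parameterizes the
    (assumed Gaussian) system noise u_{t-1}."""
    return tuple(
        p + v * dt + 0.5 * a * dt * dt + random.gauss(0.0, noise_std)
        for p, v, a in zip(pose, vel, acc)
    )
```

With the noise set to zero, a robot at the origin moving at 1 m/s along x while accelerating at 2 m/s² advances to x = 2 m after one second.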
From the latest received odometry, two-dimensional laser sensor data and GPS data, the following observation model is established:
z_{t,i} = (x_{t,i}, y_{t,i}, θ_{t,i})
z_{t,l} = (x_{t,l}, y_{t,l}, θ_{t,l})
z_{t,g} = (x_{t,g}, y_{t,g}, θ_{t,g})
z_t = λ1·z_{t,i} + λ2·z_{t,l} + λ3·z_{t,g} = (x_t, y_t, θ_t)
where z_{t,i} is the observation model of the odometer, with x_{t,i} and y_{t,i} the x- and y-axis data and θ_{t,i} the angular data in the odometry received at time t; z_{t,l} is the observation model of the two-dimensional laser sensor, with x_{t,l} and y_{t,l} the x- and y-axis data and θ_{t,l} the angular data in the laser sensor data received at time t; z_{t,g} is the observation model of the GPS, with x_{t,g} and y_{t,g} the x- and y-axis data and θ_{t,g} the angular data in the GPS data received at time t; z_t is the final observation model after fusing odometry, two-dimensional laser and GPS, with x_t and y_t the robot's final x- and y-axis data and θ_t its angular data at time t after combining the three sensors; and λ1, λ2, λ3 are the weights of the odometry, two-dimensional laser and GPS observations respectively, satisfying in practice λ1 + λ2 + λ3 = 1 and λ2 > λ1 > λ3.
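The weighted fusion of the three observations can be illustrated as follows. The λ values used here are illustrative only; the patent states the constraints (λ1 + λ2 + λ3 = 1 and λ2 > λ1 > λ3) but not concrete values.

```python
def fuse_observations(z_odo, z_laser, z_gps, lam=(0.3, 0.6, 0.1)):
    """Fused observation z_t = λ1*z_{t,i} + λ2*z_{t,l} + λ3*z_{t,g}.

    Each observation is an (x, y, theta) tuple. The default weights
    satisfy the stated ordering λ2 > λ1 > λ3 and sum to 1."""
    l1, l2, l3 = lam
    assert abs(l1 + l2 + l3 - 1.0) < 1e-9 and l2 > l1 > l3
    return tuple(l1 * a + l2 * b + l3 * c
                 for a, b, c in zip(z_odo, z_laser, z_gps))
```

When all three sensors agree, the fused observation reproduces their common value, since the weights sum to one.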
Using the robot's pose prediction equation as input, two-dimensional laser localization is performed, which effectively avoids the localization drift caused by the sample degeneracy problem.
1-2-3) Generate particles. From the predetermined initial pose P0 = (x0, y0, θ0) of the robot, N uniformly distributed particles are generated at random within the robot's motion range at the initial time, each particle carrying the three features x, y and θ.
1-2-4) Predict states with the robot motion model. The state of each particle from 1-2-3) is updated according to the pose prediction equation derived in 1-2-2). Because the control system contains noise, reasonable noise must be added. Reasonable noise usually means Gaussian noise with normal-distribution statistics: first, Gaussian white noise can be stated with an explicit mathematical expression, which is convenient for derivation, analysis and computation; second, Gaussian noise reflects the additive noise actually present in a control system and thus represents the noise characteristics of the control system fairly accurately.
1-2-5) Update the particle weights. The observations obtained from the odometer, two-dimensional laser sensor and GPS are substituted into the observation model of 1-2-2) to update the particle weights: as each observation arrives, a weight is computed for every particle. The weight value represents the probability of obtaining the observation model when the prediction takes the value of that particle; this amounts to evaluating all particles, and particles closer to the observation obtain higher weights. The weight is computed as:
ω_i = 1 / d_i
where ω_i is the weight of the i-th particle and d_i is the Euclidean distance of the i-th particle from the observation model value.
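The weight update ω_i = 1/d_i can be sketched as below (not from the patent; the normalization step and the small epsilon guarding against a zero distance are assumptions added for numerical practicality):

```python
import math

def update_weights(particles, observation):
    """Weight each particle by ω_i = 1/d_i, where d_i is the Euclidean
    distance from particle i (an (x, y, theta) tuple) to the fused
    observation z_t, then normalize so the weights sum to 1."""
    eps = 1e-12  # avoid division by zero when a particle matches exactly
    raw = [1.0 / (math.dist(p, observation) + eps) for p in particles]
    total = sum(raw)
    return [w / total for w in raw]
```

A particle coinciding with the observation dominates the normalized weights, while distant particles receive correspondingly small weights.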
1-2-6) Compute the state estimate. The estimate of the state variable is computed as the weighted average over the particle set, i.e. the pose estimate pose_estimated of the robot at time t, when the robot receives the k-th laser frame.
1-2-7) Resampling. To address the weight degeneracy that arises during the computation, the effective particle number N_eff is used to measure the degree of particle degeneracy:
N_eff = 1 / Σ_i ω_i²
where ω_i is the weight of the i-th particle. The smaller N_eff is, the more severe the weight degeneracy. When N_eff falls below a threshold N_th, resampling is performed; otherwise the data of the next time step are processed, the particle states are updated with the robot's pose prediction equation for the next time step, and the pose estimate of the robot at the next time step is sought. Resampling proceeds by screening particles according to their weights: particles with large weights are retained in quantity, a small fraction of low-weight particles is discarded, and the discarded particles are replaced by copies of high-weight particles.
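Step 1-2-7) can be sketched as follows. This is illustrative only: the patent does not give a threshold value, and weight-proportional resampling with replacement is one common realization of "replacing low-weight particles with copies of high-weight ones".

```python
import random

def effective_particle_number(weights):
    """N_eff = 1 / Σ ω_i² for normalized weights; a small N_eff
    signals severe weight degeneracy."""
    return 1.0 / sum(w * w for w in weights)

def maybe_resample(particles, weights, n_th, seed=0):
    """Resample with replacement, proportionally to weight, only when
    N_eff drops below the threshold n_th; afterwards all weights are
    reset to the uniform value 1/N."""
    if effective_particle_number(weights) >= n_th:
        return particles, weights
    rng = random.Random(seed)
    n = len(particles)
    new = rng.choices(particles, weights=weights, k=n)
    return new, [1.0 / n] * n
```

With four uniform weights N_eff equals 4, so no resampling occurs; a heavily degenerate weight vector such as (0.97, 0.01, 0.01, 0.01) drops N_eff near 1 and triggers the resampling branch.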
1-3) Solve the pose transformation between two adjacent laser frames. From the poses P_t and P_{t-1} corresponding to the two consecutive laser frames received by the robot at times t and t-1, solve P_t = P_{t-1}·r_{t-1} + t_{t-1} to obtain the transformation (r_{t-1}, t_{t-1}) between the poses of the two consecutive laser frames, where r_{t-1} is the rotation matrix, representing the angular displacement between P_t and P_{t-1}, and t_{t-1} is the translation, representing the linear displacement between P_t and P_{t-1}.
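The adjacent-frame transformation can be illustrated for the 2D case as below. Reading a pose as position (x, y) plus heading θ, so that the rotation is Δθ = θ_t − θ_{t-1} and the translation absorbs the remainder, is one consistent interpretation of the matrix notation above and is an assumption of this sketch.

```python
import math

def frame_transform(pose_prev, pose_curr):
    """Solve P_t = R(Δθ)·P_{t-1} + t for the inter-frame rotation Δθ
    and translation t between two consecutive poses (x, y, theta)."""
    x0, y0, th0 = pose_prev
    x1, y1, th1 = pose_curr
    dth = th1 - th0
    c, s = math.cos(dth), math.sin(dth)
    tx = x1 - (c * x0 - s * y0)
    ty = y1 - (s * x0 + c * y0)
    return dth, (tx, ty)

def apply_transform(pose, dth, t):
    """Re-apply (Δθ, t): returns R(Δθ)·p + t with the heading advanced."""
    x, y, th = pose
    c, s = math.cos(dth), math.sin(dth)
    return (c * x - s * y + t[0], s * x + c * y + t[1], th + dth)
```

By construction, applying the solved transformation to the previous pose recovers the current pose.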
1-4) The laser data are processed to obtain laser point cloud data, and the point cloud is filtered: the noise among the laser reflection points (points that are too close or too far away) is removed, and the remaining points are kept as valid points. The pose of the valid laser point cloud in the laser sensor coordinate system is then computed. Specifically:
Each time a laser frame laserscan is obtained from the robot's two-dimensional laser sensor, the ROS system processes the laser data into laser point cloud data pointcloud, the collective name for the laserscan.size() laser data points contained in the frame, i.e. the coordinates of each laser reflection point in the laser sensor coordinate system. Different laser sensors differ in frame rate, scanning angle and number of data points per frame; during localization the ROS system processes and computes with the laser point cloud data.
Given the characteristics of a sparse environment, the point cloud pointcloud can be filtered: the noise among the laser reflection points (points that are too close or too far away) is removed, and the remaining points are kept as valid points, which improves the confidence of the pose estimate. Because the filtering result differs for the point cloud of each laser frame, the number of valid points differs per frame; for example, on a sensor whose laserscan.size() is 1141, there are typically several hundred valid points. A frame-rate check must also be performed on laserscan: after each laser frame is obtained, it is checked whether the frame rate of laserscan is below a threshold; if so, the corresponding processing is redone after an alarm is reported and the next laser frame is received. For example, if the nominal frame rate is 25 Hz and the time between two adjacent laser frames exceeds 40 ms, the laser data do not meet the requirement: the laser sensor may have a fault such as overheating, so an alarm must be reported and the next laser frame awaited.
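The frame-rate check and the range filtering of step 1-4) can be sketched as follows. The 25 Hz / 40 ms figures come from the example above; the near/far cut-off distances are illustrative, since the patent does not specify them.

```python
def check_frame_rate(t_prev, t_curr, nominal_hz=25.0):
    """A gap between consecutive frames longer than 1/nominal_hz
    (40 ms at 25 Hz) fails the check; the caller should then report
    an alarm and wait for the next frame."""
    return (t_curr - t_prev) <= 1.0 / nominal_hz + 1e-9

def filter_points(ranges, r_min=0.2, r_max=30.0):
    """Keep only reflections within [r_min, r_max]; points that are
    too close or too far are treated as noise and discarded."""
    return [r for r in ranges if r_min <= r <= r_max]
```

A 40 ms gap passes at 25 Hz, a 50 ms gap fails; a 5 cm reflection and a 50 m reflection are both dropped as noise.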
2. Stitching laser point cloud data
In the global coordinate system, the inter-frame laser point clouds are stitched using the pose transformation (r_{t-1}, t_{t-1}) between adjacent laser frames obtained in step 1-3) and the previous frame's point cloud data. The stitching formula is:
Q_k = Q_{k-1}·r_{k-1} + t_{k-1}
where Q_{k-1} and Q_k are the laser point cloud data of the (k-1)-th and k-th frames respectively.
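Applied point by point in 2D, the stitching formula Q_k = Q_{k-1}·r_{k-1} + t_{k-1} can be illustrated as below (a sketch, assuming the rotation is given as an angle Δθ and the point cloud as a list of (x, y) points):

```python
import math

def stitch_frame(points_prev, dth, t):
    """Apply Q_k = R(Δθ)·Q_{k-1} + t to every point of the previous
    frame's cloud, placing it relative to the current frame in the
    global coordinate system."""
    c, s = math.cos(dth), math.sin(dth)
    return [(c * x - s * y + t[0], s * x + c * y + t[1])
            for x, y in points_prev]
```

Rotating the point (1, 0) by 90 degrees with zero translation yields (0, 1), as expected.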
3. Tiled submap building
The inter-frame laser point cloud data in the global coordinate system are mapped into the map coordinate system according to the transformation. The map is represented as an occupancy grid: the size of a grid cell determines the resolution of the map, and each cell carries s = 0 or s = 1 indicating whether the cell is free or occupied. For each observable cell, grey marks a cell in which a laser endpoint falls, i.e. an obstacle has been detected and the cell is in the occupied state; white marks a free cell. One submap is generated from every 40 consecutive laser frames, and each submap shares its first 20 frames with the last 20 frames of the previous submap. For example, laser frames t to t+40 are registered to form submap A1, frames t+20 to t+60 are registered to form submap A2, frames t+40 to t+80 are registered to form submap A3, and so on.
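The overlapping frame ranges of the tiled submaps can be computed as below (a sketch using the 40-frame size and 20-frame overlap given in the example above):

```python
def submap_frame_ranges(total_frames, size=40, overlap=20):
    """Frame index ranges [start, start + size) of the tiled submaps:
    every `size` consecutive frames form one submap, each sharing its
    first `overlap` frames with the previous submap
    (A1 = frames 0..40, A2 = 20..60, A3 = 40..80, ...)."""
    step = size - overlap
    return [(s, s + size) for s in range(0, total_frames - size + 1, step)]
```

For 80 recorded frames this yields the three submaps A1, A2, A3 of the example.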
4. Loop closure detection
Consecutively numbered submaps are connected by matching inter-frame laser poses; loop closure detection establishes connections between associated but non-consecutively numbered submaps, in order to reduce the cumulative error of the laser frames between those two submaps. A designated laser frame of each submap (for example the last frame), or several designated frames, are selected as key frames. Performing loop closure detection between the pose information of key frames and submaps effectively reduces the size of the matching candidate set and speeds up detection. Each time a submap is completed in step 3, one loop closure detection is performed in combination with the GPS data. The specific method is as follows:
4-1) Assume the map is S-shaped. The GPS data locate the robot's current position in real time; with this position as the center and a specified distance (for example 5 meters) as the radius, the loop closure candidate region is screened out, i.e. the shaded circular area in Fig. 4.
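The GPS screening of step 4-1) can be sketched as follows. Representing each submap by a single position (e.g. its center) is an assumption of this sketch; the 5 m radius follows the example above.

```python
import math

def loop_candidates(current_xy, submap_positions, radius=5.0):
    """Screen the loop closure candidate region: keep the indices of
    submaps whose representative position lies within `radius` meters
    of the current GPS fix (the circle of Fig. 4)."""
    cx, cy = current_xy
    return [i for i, (x, y) in enumerate(submap_positions)
            if math.hypot(x - cx, y - cy) <= radius]
```

Only nearby submaps survive the screening, which is what keeps the matching candidate set small.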
4-2) The confidence candidate_probability of each submap of the robot within the loop closure candidate region is computed, where m is the total number of map grid cells occupied by the laser point cloud of a given laser frame in a submap; if the map coordinates of the n-th occupied grid cell are (xn, yn), a confidence is computed for that grid cell from the map value at (xn, yn).
According to the difference between each of the robot's poses in each submap and the pose corresponding to the current frame, the confidence weight candidate_weight of each candidate pose in each submap is computed, where x_offset is the displacement along the x-axis between a candidate pose in a submap and the current pose, y_offset is the displacement along the y-axis between them, trans_weight is the displacement weight, candidate.rotation is the rotation angle between the candidate pose and the current pose, and rot_weight is the rotation-angle weight.
The product of the submap confidence candidate_probability and the confidence weight candidate_weight of a candidate pose in that submap is taken as the confidence score of the current loop closure candidate pose:
Score = candidate_probability * candidate_weight
4-3) The pose with the highest confidence score is selected. When its score exceeds a given threshold, the laser frame corresponding to that pose is connected with the key frame as a loop closure, indicating that the submap corresponding to that laser frame and the submap containing the key frame are adjacent in position on the map.
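The selection rule of steps 4-2) and 4-3) can be sketched as follows. The threshold value is illustrative; the patent only states that a threshold exists.

```python
def best_loop_candidate(candidates, score_threshold=0.5):
    """Pick the candidate pose with the highest
    Score = candidate_probability * candidate_weight, accepting it only
    if its score exceeds the threshold. `candidates` is a list of
    (pose_id, candidate_probability, candidate_weight) tuples."""
    scored = [(pid, p * w) for pid, p, w in candidates]
    pid, score = max(scored, key=lambda s: s[1])
    return pid if score > score_threshold else None
```

If even the best score falls below the threshold, no loop closure is declared, which is how GPS screening plus thresholding suppresses false closures in similar-looking scenes.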
Although the present invention has been illustrated by way of a preferred embodiment, the embodiment is not intended to limit the invention. Any equivalent change or modification made without departing from the spirit and scope of the present invention also belongs to the protection scope of the present invention, which should therefore be defined by the claims of this application.
Claims (7)
1. A mapping method for a substation inspection robot based on GPS closed-loop detection, characterized by comprising:
a step of acquiring laser point cloud data:
the robot travels along a fixed route and, while moving, records two-dimensional laser data, odometry data, inertial measurement unit (IMU) data and GPS data;
an initial value of the robot pose is estimated;
the pose transformation relation between two adjacent frames of laser data is solved;
the two-dimensional laser data are processed into laser point cloud data;
the pose of the laser point cloud in the laser sensor coordinate system is calculated;
a step of splicing the laser point cloud data:
in the global coordinate system, the inter-frame laser point cloud data are spliced using the pose transformation relation between the two adjacent frames of laser data and the laser point cloud data of the previous frame;
a step of tile-type map building:
subgraphs are generated from continuous laser point cloud data, each subgraph being generated from the laser data of a specified number of consecutive frames, with the first specified frames of each subsequent subgraph containing the laser point cloud data of the previous subgraph;
a step of closed-loop detection:
a designated frame of a subgraph is chosen as the key frame;
the current position of the robot is located in real time from the GPS data, and a closed-loop candidate region is defined as a circle centered on this position with a specified distance as its radius;
each candidate pose in each subgraph within the closed-loop candidate region is screened, and when a pose meets the screening conditions, the laser frame corresponding to this pose is connected in closed loop with the key frame.
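The circular closed-loop candidate region defined in the claim can be sketched as a simple membership test. The pose representation as (x, y) tuples and the example radius are illustrative assumptions, not values from the patent.

```python
import math

def in_closed_loop_candidate_region(gps_position, subgraph_pose, radius):
    """True if a subgraph pose lies inside the circular candidate region
    centered on the robot's current GPS-derived position."""
    dx = subgraph_pose[0] - gps_position[0]
    dy = subgraph_pose[1] - gps_position[1]
    return math.hypot(dx, dy) <= radius

# Only subgraphs inside the circle are screened for closed-loop candidates.
poses = [(1.0, 2.0), (10.0, 10.0), (3.0, -1.0)]
current = (0.0, 0.0)
print([p for p in poses if in_closed_loop_candidate_region(current, p, 5.0)])
# [(1.0, 2.0), (3.0, -1.0)]
```

Restricting screening to this region is what lets the GPS fix prune the search: only poses within the specified distance of the current position are ever scored.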
2. The mapping method for a substation inspection robot based on GPS closed-loop detection according to claim 1, characterized in that the method for estimating the initial value of the robot pose comprises:
establishing a pose prediction equation of the robot from the odometry data and the IMU data, for updating the states of the particles;
establishing an observation model from the odometry data, the two-dimensional laser sensor data and the GPS data, for calculating the weights of the particles;
obtaining the initial estimate of the robot pose as the weighted average of the particle swarm.
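A minimal sketch of the weighted-average step in the claim, assuming particles are (x, y, heading, weight) tuples with normalized weights; the prediction equation and observation model that produce these weights are omitted here.

```python
import math

def estimate_initial_pose(particles):
    """particles: list of (x, y, theta, weight), weights summing to 1.

    Returns the weighted-average pose used as the initial value.
    The heading is averaged via sin/cos to handle angle wrap-around."""
    x = sum(p[0] * p[3] for p in particles)
    y = sum(p[1] * p[3] for p in particles)
    s = sum(math.sin(p[2]) * p[3] for p in particles)
    c = sum(math.cos(p[2]) * p[3] for p in particles)
    return x, y, math.atan2(s, c)

particles = [(1.0, 0.0, 0.0, 0.5), (3.0, 2.0, 0.0, 0.5)]
print(estimate_initial_pose(particles))  # (2.0, 1.0, 0.0)
```

Averaging the heading through its sine and cosine rather than directly avoids the discontinuity at ±π, which a naive weighted mean of angles would mishandle.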
3. The mapping method for a substation inspection robot based on GPS closed-loop detection according to claim 1, characterized in that, in the step of tile-type map building, each subgraph is represented as a grid map in which each grid is marked as either free or occupied.
4. The mapping method for a substation inspection robot based on GPS closed-loop detection according to claim 1, characterized in that, in the step of tile-type map building, one closed-loop detection is performed each time a subgraph is established.
5. The mapping method for a substation inspection robot based on GPS closed-loop detection according to claim 1, characterized in that the screening specifically comprises:
calculating the confidence of each subgraph in the closed-loop candidate region according to the following formula:
wherein m is the total number of map grids occupied by the laser point cloud corresponding to a certain frame of laser data in the subgraph; if the map coordinates of the n-th map grid are (xn, yn), then the confidence of that map grid is
calculating the confidence weight of each candidate pose from the pose difference between each candidate pose of the robot in each subgraph and the pose corresponding to the current frame, according to the following formula:
wherein x_offset is the displacement along the x-axis between a candidate pose in a subgraph and the current pose, y_offset is the displacement along the y-axis between that candidate pose and the current pose, trans_weight is the displacement weight, candidate.rotation is the rotation angle between the candidate pose and the current pose, and rot_weight is the rotation angle weight;
calculating the product of the confidence candidate_probability of each subgraph and the confidence weight candidate_weight of a candidate pose in that subgraph as the confidence score of the current closed-loop candidate pose:
Score = candidate_probability * candidate_weight
selecting the pose with the highest confidence score; when the confidence score of this pose is greater than a given threshold, this pose meets the screening conditions.
6. The mapping method for a substation inspection robot based on GPS closed-loop detection according to any one of claims 1 to 5, characterized in that processing the two-dimensional laser data into laser point cloud data further comprises: after each frame of laser data is acquired, first checking whether the frame rate of the laser data meets a threshold requirement; if it is below the threshold, the threshold requirement is not met, in which case the laser data is not used, an alarm is reported, and the next frame of laser data is awaited.
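The frame-rate gate in claim 6 can be sketched as follows. Timestamps in seconds, the 10 Hz default threshold, and the alarm callback are illustrative assumptions; the patent does not publish specific values.

```python
def frame_rate_ok(prev_timestamp, timestamp, min_rate_hz=10.0):
    """Check whether the inter-frame interval of the laser data
    still meets the minimum frame-rate requirement."""
    interval = timestamp - prev_timestamp
    return interval > 0 and (1.0 / interval) >= min_rate_hz

def handle_frame(prev_timestamp, timestamp, alarm=print):
    """Gate a laser frame: drop it and raise an alarm if the rate is too low."""
    if not frame_rate_ok(prev_timestamp, timestamp):
        alarm("laser frame rate below threshold; frame dropped")  # report and alarm
        return False  # frame not used; wait for the next one
    return True

print(handle_frame(0.00, 0.05))  # 20 Hz -> frame accepted, True
print(handle_frame(0.00, 0.50))  # 2 Hz  -> alarm raised, False
```

Dropping slow frames before point-cloud conversion keeps degraded laser data from corrupting the pose transformation between adjacent frames.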
7. The mapping method for a substation inspection robot based on GPS closed-loop detection according to claim 6, characterized in that, after each frame of laser data is acquired, if the frame rate of the laser data meets the threshold requirement, the laser reflection points in the laser data are first filtered to remove points that are too close and points that are too far; the remaining laser reflection points are then used in the subsequent calculation of the pose of each laser reflection point in the laser sensor coordinate system.
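The range filtering in claim 7 can be sketched as follows; the minimum and maximum range values are illustrative assumptions (the actual limits depend on the laser sensor used).

```python
def filter_laser_points(ranges, min_range=0.1, max_range=25.0):
    """Remove laser reflection points that are too close (e.g. reflections off
    the robot body) or too far (unreliable returns) before pose calculation."""
    return [r for r in ranges if min_range <= r <= max_range]

scan = [0.02, 1.5, 7.3, 40.0, 12.8]
print(filter_laser_points(scan))  # [1.5, 7.3, 12.8]
```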
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910453204.3A CN110196044A (en) | 2019-05-28 | 2019-05-28 | A mapping method for a substation inspection robot based on GPS closed-loop detection |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110196044A (en) | 2019-09-03 |
Family
ID=67753295
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910453204.3A Pending CN110196044A (en) | A mapping method for a substation inspection robot based on GPS closed-loop detection | 2019-05-28 | 2019-05-28 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110196044A (en) |
Citations (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130135138A1 (en) * | 2011-11-28 | 2013-05-30 | Raytheon Company | Method for phase unwrapping using confidence-based rework |
CN104778688A (en) * | 2015-03-27 | 2015-07-15 | 华为技术有限公司 | Method and device for registering point cloud data |
CN104932535A (en) * | 2015-06-04 | 2015-09-23 | 西安应用光学研究所 | Method for carrying out closed-loop test on airborne forward-looking infrared search equipment by using simulation system |
CN105856230A (en) * | 2016-05-06 | 2016-08-17 | 简燕梅 | ORB key frame closed-loop detection SLAM method capable of improving consistency of position and pose of robot |
CN106272423A (en) * | 2016-08-31 | 2017-01-04 | 哈尔滨工业大学深圳研究生院 | A kind of multirobot for large scale environment works in coordination with the method for drawing and location |
CN106971403A (en) * | 2017-04-27 | 2017-07-21 | 武汉数文科技有限公司 | Point cloud chart is as processing method and processing device |
CN107025668A (en) * | 2017-03-30 | 2017-08-08 | 华南理工大学 | A kind of design method of the visual odometry based on depth camera |
CN107160395A (en) * | 2017-06-07 | 2017-09-15 | 中国人民解放军装甲兵工程学院 | Map constructing method and robot control system |
CN107356252A (en) * | 2017-06-02 | 2017-11-17 | 青岛克路德机器人有限公司 | A kind of Position Method for Indoor Robot for merging visual odometry and physics odometer |
CN107657640A (en) * | 2017-09-30 | 2018-02-02 | 南京大典科技有限公司 | Intelligent patrol inspection management method based on ORB SLAM |
CN107796397A (en) * | 2017-09-14 | 2018-03-13 | 杭州迦智科技有限公司 | A kind of Robot Binocular Vision localization method, device and storage medium |
WO2018071416A1 (en) * | 2016-10-11 | 2018-04-19 | Kaarta, Inc. | Laser scanner with real-time, online ego-motion estimation |
CN108253958A (en) * | 2018-01-18 | 2018-07-06 | 亿嘉和科技股份有限公司 | A kind of robot real-time location method under sparse environment |
CN108287550A (en) * | 2018-02-01 | 2018-07-17 | 速感科技(北京)有限公司 | The method of SLAM systems and construction data correlation based on data correlation and error detection |
CN108337915A (en) * | 2017-12-29 | 2018-07-27 | 深圳前海达闼云端智能科技有限公司 | Three-dimensional builds drawing method, device, system, high in the clouds platform, electronic equipment and computer program product |
CN108458715A (en) * | 2018-01-18 | 2018-08-28 | 亿嘉和科技股份有限公司 | A kind of robot localization initial method based on laser map |
CN108520543A (en) * | 2018-04-09 | 2018-09-11 | 网易(杭州)网络有限公司 | A kind of method that relative accuracy map is optimized, equipment and storage medium |
CN109064506A (en) * | 2018-07-04 | 2018-12-21 | 百度在线网络技术(北京)有限公司 | Accurately drawing generating method, device and storage medium |
CN109087393A (en) * | 2018-07-23 | 2018-12-25 | 汕头大学 | A method of building three-dimensional map |
CN109084732A (en) * | 2018-06-29 | 2018-12-25 | 北京旷视科技有限公司 | Positioning and air navigation aid, device and processing equipment |
CN109214248A (en) * | 2017-07-04 | 2019-01-15 | 百度在线网络技术(北京)有限公司 | The method and apparatus of the laser point cloud data of automatic driving vehicle for identification |
CN109410735A (en) * | 2017-08-15 | 2019-03-01 | 百度在线网络技术(北京)有限公司 | Reflected value map constructing method and device |
CN109443351A (en) * | 2019-01-02 | 2019-03-08 | 亿嘉和科技股份有限公司 | A kind of robot three-dimensional laser positioning method under sparse environment |
CN109460032A (en) * | 2018-11-29 | 2019-03-12 | 亿嘉和科技股份有限公司 | A kind of localization method and autonomous charging of robots method based on laser-correlation |
CN109509226A (en) * | 2018-11-27 | 2019-03-22 | 广东工业大学 | Three dimensional point cloud method for registering, device, equipment and readable storage medium storing program for executing |
CN109633665A (en) * | 2018-12-17 | 2019-04-16 | 北京主线科技有限公司 | The sparse laser point cloud joining method of traffic scene |
CN109682373A (en) * | 2018-12-28 | 2019-04-26 | 中国兵器工业计算机应用技术研究所 | A kind of sensory perceptual system of unmanned platform |
CN109764869A (en) * | 2019-01-16 | 2019-05-17 | 中国矿业大学 | A kind of positioning of autonomous crusing robot and the three-dimensional map construction method of binocular camera and inertial navigation fusion |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112747735A (en) * | 2019-10-29 | 2021-05-04 | 株式会社日立大厦系统 | Route data collection device |
CN112378411B (en) * | 2019-12-06 | 2023-02-28 | 黑芝麻智能科技(上海)有限公司 | Method for mapping and positioning by using 3D point cloud and 6D camera posture |
CN112378411A (en) * | 2019-12-06 | 2021-02-19 | 黑芝麻智能科技(上海)有限公司 | Method for mapping and positioning by using 3D point cloud and 6D camera posture |
CN111076733A (en) * | 2019-12-10 | 2020-04-28 | 亿嘉和科技股份有限公司 | Robot indoor map building method and system based on vision and laser slam |
CN111337011A (en) * | 2019-12-10 | 2020-06-26 | 亿嘉和科技股份有限公司 | Indoor positioning method based on laser and two-dimensional code fusion |
CN111076733B (en) * | 2019-12-10 | 2022-06-14 | 亿嘉和科技股份有限公司 | Robot indoor map building method and system based on vision and laser slam |
CN111223145A (en) * | 2020-01-03 | 2020-06-02 | 上海有个机器人有限公司 | Data processing method, system, service device and storage medium thereof |
CN111427979B (en) * | 2020-01-15 | 2021-12-21 | 深圳市镭神智能系统有限公司 | Dynamic map construction method, system and medium based on laser radar |
CN111427979A (en) * | 2020-01-15 | 2020-07-17 | 深圳市镭神智能系统有限公司 | Dynamic map construction method, system and medium based on laser radar |
CN113739785A (en) * | 2020-05-29 | 2021-12-03 | 杭州海康机器人技术有限公司 | Robot positioning method and device and storage medium |
CN112162294B (en) * | 2020-10-10 | 2023-12-15 | 北京布科思科技有限公司 | Robot structure detection method based on laser sensor |
CN112162294A (en) * | 2020-10-10 | 2021-01-01 | 北京布科思科技有限公司 | Robot structure detection method based on laser sensor |
CN114518108A (en) * | 2020-11-18 | 2022-05-20 | 郑州宇通客车股份有限公司 | Positioning map construction method and device |
CN114518108B (en) * | 2020-11-18 | 2023-09-08 | 宇通客车股份有限公司 | Positioning map construction method and device |
CN113781569A (en) * | 2021-01-04 | 2021-12-10 | 北京京东乾石科技有限公司 | Loop detection method and device |
CN113781569B (en) * | 2021-01-04 | 2024-04-05 | 北京京东乾石科技有限公司 | Loop detection method and device |
CN112986912A (en) * | 2021-03-19 | 2021-06-18 | 北京小狗吸尘器集团股份有限公司 | Floor sweeper repositioning method and device based on structured light sensor and electronic equipment |
CN112986912B (en) * | 2021-03-19 | 2023-06-16 | 北京小狗吸尘器集团股份有限公司 | Floor sweeper repositioning method and device based on structured light sensor and electronic equipment |
CN113838051A (en) * | 2021-11-25 | 2021-12-24 | 之江实验室 | Robot closed-loop detection method based on three-dimensional point cloud |
CN117406259A (en) * | 2023-12-14 | 2024-01-16 | 江西北斗云智慧科技有限公司 | Beidou-based intelligent construction site vehicle positioning method and system |
CN117406259B (en) * | 2023-12-14 | 2024-03-22 | 江西北斗云智慧科技有限公司 | Beidou-based intelligent construction site vehicle positioning method and system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110196044A (en) | A mapping method for a substation inspection robot based on GPS closed-loop detection | |
CN109521756B (en) | Obstacle motion information generation method and apparatus for unmanned vehicle | |
CN109443351B (en) | Robot three-dimensional laser positioning method in sparse environment | |
CN108253958B (en) | Robot real-time positioning method in sparse environment | |
CN109974712A (en) | A mapping method for a substation inspection robot based on graph optimization | |
AU2019233779B2 (en) | Vehicle tracking | |
CN110726409B (en) | Map fusion method based on laser SLAM and visual SLAM | |
CN109282808B (en) | Unmanned aerial vehicle and multi-sensor fusion positioning method for bridge three-dimensional cruise detection | |
CN113781582A (en) | Synchronous positioning and map creating method based on laser radar and inertial navigation combined calibration | |
CN106168485B (en) | Walking track data projectional technique and device | |
CN108700421A (en) | Use the method and system of the portable navigation of offline cartographic information auxiliary enhancing | |
CN114526745B (en) | Drawing construction method and system for tightly coupled laser radar and inertial odometer | |
Jaspers et al. | Multi-modal local terrain maps from vision and lidar | |
CN113933818A (en) | Method, device, storage medium and program product for calibrating laser radar external parameter | |
Liu et al. | Real-time 6d lidar slam in large scale natural terrains for ugv | |
CN110515382A (en) | A kind of smart machine and its localization method | |
CN102981160B (en) | Method and device for ascertaining aerial target track | |
CN114119886A (en) | High-precision map point cloud reconstruction method and device, vehicle, equipment and storage medium | |
CN115639823A (en) | Terrain sensing and movement control method and system for robot under rugged and undulating terrain | |
CN114323033A (en) | Positioning method and device based on lane lines and feature points and automatic driving vehicle | |
Nguyen et al. | Improving the accuracy of the autonomous mobile robot localization systems based on the multiple sensor fusion methods | |
Jung et al. | HeLiPR: heterogeneous LiDAR dataset for inter-LiDAR place recognition under spatial and temporal variations | |
CN116380039A (en) | Mobile robot navigation system based on solid-state laser radar and point cloud map | |
CN116429121A (en) | Positioning method and device based on multiple sensors, self-mobile device and storage medium | |
CN112947481B (en) | Autonomous positioning control method for home service robot |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20190903 |