CN101944240B - Fusion method of multi-robot three-dimensional geometrical map - Google Patents
Fusion method of multi-robot three-dimensional geometrical map
- Publication number
- CN101944240B (application numbers CN2010102620759A, CN201010262075A)
- Authority
- CN
- China
- Prior art keywords
- map
- dimensional
- fusion
- robot
- width
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Abstract
The invention discloses a fusion method for multi-robot three-dimensional geometric maps, comprising the steps of: projecting the three-dimensional geometric maps to be fused onto two-dimensional grid maps; fusing the grid maps with an image-matching algorithm to obtain the rotation parameters between them; rotating one three-dimensional map accordingly; and completing the fusion of the three-dimensional geometric maps with a three-dimensional point-set registration algorithm. By taking the geometric feature information in the three-dimensional maps into account and combining two-dimensional map fusion with three-dimensional point-set registration, the method completes the fusion of multi-robot three-dimensional geometric maps and markedly improves the fusion accuracy. It is mainly applied in multi-robot simultaneous localization and mapping, where the partial maps created by individual robots are fused into a global map.
Description
Technical field
The present invention relates to the field of robot simultaneous localization and mapping, and specifically to a method for fusing the geometric maps created by multiple robots.
Background technology
With the continuous expansion of applications, simultaneous localization and mapping algorithms based on a single-robot system often cannot meet demand, while multi-robot systems have attracted growing attention thanks to advantages such as concurrent operation, complementary capabilities, and information sharing. In the mobile-robot simultaneous localization and mapping problem, the map is created from the robot's current pose information; therefore, if the initial pose of each robot and its corresponding equations of motion are known, concurrent map building can be realized through joint multi-robot filtering. In many cases, however, this condition cannot be met, and each robot must independently build its own local map, which raises the following question: once each robot maintains a map of its local environment, how can these local maps be used to obtain global information about the environment? This is the multi-robot map fusion problem (as shown in Figure 1).
For map fusion when the relative positions of the robots are unknown, one approach estimates the transformation between maps from mutual measurements between the robots, then associates the features in the maps, and finally fuses them; alternatively, the robots exchange measurement information, estimate their relative poses with an adaptive particle filter, and decide their cooperative exploration strategy accordingly. The fusion of multi-robot grid maps can be treated as solving for an optimal transformation, or addressed by image registration. Most existing methods, however, target the fusion of two-dimensional maps. The fusion of three-dimensional geometric maps is usually handled directly with a three-dimensional point-set registration algorithm, but such algorithms do not exploit the fact that a map reflects the geometric features of the environment.
Summary of the invention
The invention provides a fusion method for multi-robot three-dimensional geometric maps. By taking the geometric feature information in the three-dimensional maps into account and combining two-dimensional map fusion with a three-dimensional point-set registration algorithm, it completes the fusion of multi-robot three-dimensional geometric maps and markedly improves the fusion accuracy.
A fusion method of a multi-robot three-dimensional geometric map comprises the following steps:
(1) Project the two three-dimensional geometric maps to be fused, M1(x, y, z) and M2(x, y, z), onto the x-y plane to obtain two-dimensional maps, then rasterize the two-dimensional maps with a fixed cell size to obtain two grid maps to be fused. A cell containing a feature is an occupied cell, a cell containing no feature is a free cell, and the two kinds of cell are assigned different values.
(2) Apply the Fourier-Mellin transform to the two grid maps to be fused and compute the relative rotation angle θ between them.
(3) Use the relative rotation angle θ obtained in step (2) to rotate the three-dimensional geometric map M2(x, y, z), obtaining the rotated three-dimensional geometric map M′2(x, y, z).
(4) Fuse M1(x, y, z) and M′2(x, y, z) with the iterative closest point algorithm.
Further, step (4) comprises:
(4.1) For every point in M′2(x, y, z), compute the closest point in M1(x, y, z); then use the quaternion algorithm to solve for the transformation matrix K that minimizes the sum of squared Euclidean distances between each point in M′2(x, y, z) and its corresponding closest point in M1(x, y, z).
(4.2) Apply the transformation matrix K to M′2(x, y, z) to obtain an updated M′2(x, y, z), and compute the distance between M1(x, y, z) and the updated M′2(x, y, z) as the result of this iteration.
(4.3) Repeat steps (4.1) and (4.2) iteratively and compute the error between two successive iterations: if the error is greater than ε, continue the iterative process; if the error is less than ε, the iteration ends. ε is chosen according to the required fusion accuracy.
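The iteration and termination logic of steps (4.1)-(4.3) can be sketched as follows. `icp_step_fn` and `dist_fn` are hypothetical placeholders standing in for the quaternion-based alignment step and the map-distance measure described in the embodiment; this is an illustrative sketch, not the patent's implementation:

```python
def fuse(M1, M2p, icp_step_fn, dist_fn, eps):
    """Repeat the two-phase iteration of steps (4.1)-(4.2) until the
    change in the M1-to-M2' distance between two successive
    iterations drops below the accuracy threshold eps (step 4.3)."""
    prev = dist_fn(M1, M2p)
    while True:
        M2p = icp_step_fn(M1, M2p)   # (4.1)+(4.2): match and transform
        cur = dist_fn(M1, M2p)       # distance after this iteration
        if abs(prev - cur) <= eps:   # error between two iterations
            return M2p
        prev = cur
```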
The fusion method of the present invention, by taking the geometric feature information in the three-dimensional maps into account and combining two-dimensional map fusion with a three-dimensional point-set registration algorithm, completes the fusion of multi-robot three-dimensional geometric maps and markedly improves the fusion accuracy. It is mainly used in multi-robot simultaneous localization and mapping applications to fuse the local maps created by individual robots into a global map.
Description of drawings
Fig. 1 is a schematic diagram of the multi-robot three-dimensional geometric map fusion problem;
Fig. 2 is a flowchart of the fusion method of multi-robot three-dimensional geometric maps of the present invention;
Fig. 3 shows the two three-dimensional geometric maps to be fused, used as test input;
Fig. 4 is the projection of the test input of Fig. 3 onto the two-dimensional plane before fusion;
Fig. 5 is the projection onto the two-dimensional plane of the result of fusing the test input of Fig. 3 with the method of the invention;
Fig. 6 is the projection onto the two-dimensional plane of the result of fusing the test input of Fig. 3 with a prior-art method.
Embodiment
The present invention is described in detail below with reference to an embodiment and the accompanying drawings, but the invention is not limited thereto.
As shown in Figure 2, in a fusion method of a multi-robot three-dimensional geometric map, several robots perceive the environment with vision sensors, each creates a partial three-dimensional geometric map of the environment, and the maps are then fused. The procedure comprises the following steps:
Considering that the robots move on the x-y plane, define the dimensionality-reduction map (i.e., the projection model)

f: (x, y, z) → (x, y)

where (x, y, z) are the spatial coordinates of a geometric feature and (x, y) are its planar coordinates.
Project the two three-dimensional geometric maps to be fused, M1(x, y, z) and M2(x, y, z), onto the x-y plane, then rasterize the projected maps with a fixed cell size to obtain two grid maps to be fused, m1(x, y) and m2(x, y). A grid map consists of several cells of identical size; each cell corresponds to a subregion of the x-y plane, and the size of the subregion can be adjusted to suit the actual conditions. In a grid map, a cell containing a feature is an occupied cell, a cell containing no feature is a free cell, and the two kinds of cell are assigned different values.
Suppose that, of the two grid maps to be fused, m2(x, y) is obtained from m1(x, y) by translation, rotation, and scaling; then m1(x, y) and m2(x, y) satisfy the transformation relation of formula (I):

m2(x, y) = m1(s(x cos α + y sin α) − x0, s(−x sin α + y cos α) − y0)    (I)

In formula (I), s is the zoom factor, α is the rotation angle, and x0 and y0 are the translations.
Then the Fourier-transform amplitudes F1(ξ, ζ) and F2(ξ, ζ) corresponding to m1(x, y) and m2(x, y) satisfy formula (II):

|F2(ξ, ζ)| = s^(-2) |F1(s^(-1)(ξ cos α + ζ sin α), s^(-1)(−ξ sin α + ζ cos α))|    (II)

As formula (II) shows, the relation between the Fourier-transform amplitudes F1(ξ, ζ) and F2(ξ, ζ) of the two grid maps to be fused depends only on the zoom factor s and the rotation angle α, and is independent of the translations x0 and y0.
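A quick numerical check of the property that formula (II) rests on: the Fourier amplitude spectrum is unaffected by translation (here the special case s = 1, α = 0, with a circular shift standing in for translation; the image content and shift amounts are arbitrary choices for the demonstration):

```python
import numpy as np

# A small binary "grid map" and a circularly shifted copy of it.
img = np.zeros((32, 32))
img[8:12, 10:20] = 1.0
shifted = np.roll(img, (5, 7), axis=(0, 1))

# Their FFT amplitude spectra are identical: |F2| == |F1|.
mag1 = np.abs(np.fft.fft2(img))
mag2 = np.abs(np.fft.fft2(shifted))
assert np.allclose(mag1, mag2)
```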
Therefore, the fusion of grid maps m1(x, y) and m2(x, y) proceeds in two steps: first obtain the zoom factor s and rotation angle α from the amplitude spectra, then solve for the translations x0 and y0.
Introducing polar coordinates (ρ, θ) for (ξ, ζ) in formula (II) yields formula (III):

|F2(ρ, θ)| = s^(-2) |F1(ρ/s, θ − α)|    (III)

In formula (III), θ is the polar angle and ρ is the polar radius. Further adopting a logarithmic scale for the radius and ignoring the coefficient s^(-2) yields formula (IV):

|F2(log ρ, θ)| = |F1(log ρ − log s, θ − α)|    (IV)
Formula (IV) can then be solved with the Fourier shift theorem:
Suppose two images I1(x, y) and I2(x, y) are such that I2(x, y) is obtained from I1(x, y) by translations x0 and y0 in the x and y directions respectively; then I1(x, y) and I2(x, y) satisfy the transformation relation of formula (V):

I2(x, y) = I1(x − x0, y − y0)    (V)
Let F1(ξ, ζ) and F2(ξ, ζ) be the Fourier transforms of I1(x, y) and I2(x, y) respectively; then F1(ξ, ζ) and F2(ξ, ζ) satisfy the relation of formula (VI):

F2(ξ, ζ) = e^(−j2π(ξx0 + ζy0)) F1(ξ, ζ)    (VI)

Define the cross-power spectrum of images I1(x, y) and I2(x, y) as

F1(ξ, ζ) F2*(ξ, ζ) / |F1(ξ, ζ) F2*(ξ, ζ)| = e^(j2π(ξx0 + ζy0))    (VII)

where F2* is the complex conjugate of F2. Taking the inverse Fourier transform of formula (VII) gives an impulse function at (x0, y0) in image space; the position of the pulse is the relative translation (x0, y0) between the two images I1(x, y) and I2(x, y).
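The phase-correlation computation of formulas (V)-(VII) can be sketched as follows. FFT sign conventions vary between texts, so the conjugation below is chosen so that the inverse-FFT peak lands at the circular shift applied by `np.roll`; the image size and shift are arbitrary test values:

```python
import numpy as np

def phase_correlation(i1, i2):
    """Estimate the circular shift (x0, y0) taking i1 to i2.
    The normalised cross-power spectrum of formula (VII) has an
    inverse FFT that is (ideally) an impulse at the translation."""
    F1, F2 = np.fft.fft2(i1), np.fft.fft2(i2)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12      # keep only the phase
    corr = np.fft.ifft2(cross).real
    return np.unravel_index(np.argmax(corr), corr.shape)

# Recover a known shift of (3, 5) applied to a random image.
rng = np.random.default_rng(0)
i1 = rng.random((16, 16))
i2 = np.roll(i1, (3, 5), axis=(0, 1))
assert phase_correlation(i1, i2) == (3, 5)
```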
After the zoom factor s and rotation angle α are obtained, scale and rotate the map m2(x, y) to generate the transformed map m′2(x, y). Then m′2(x, y) and m1(x, y) differ only by translations x0 and y0 in the x and y directions:

m′2(x, y) = m1(x − x0, y − y0)    (VIII)

Applying the Fourier shift theorem once more to formula (VIII) yields the relative translation (x0, y0) between the two maps m′2(x, y) and m1(x, y).
Because the relative rotation angle θ between the planar projections m1(x, y) and m2(x, y) of the three-dimensional geometric maps M1(x, y, z) and M2(x, y, z) can be solved through the fusion of the grid maps, the angle θ obtained this way is also the relative rotation angle between M1(x, y, z) and M2(x, y, z). Rotating M2(x, y, z) by this angle gives M′2(x, y, z), after which the maps are fused with the iterative closest point algorithm.
Each iteration of the iterative closest point algorithm proceeds in two steps. First, find for every point in M′2(x, y, z) its closest point in M1(x, y, z), then use the quaternion algorithm to solve for the transformation matrix K that minimizes the sum of squared Euclidean distances between each point in M′2(x, y, z) and its corresponding closest point in M1(x, y, z). Second, apply the transformation matrix K to M′2(x, y, z) to obtain an updated M′2(x, y, z), then compute the distance between M1(x, y, z) and M′2(x, y, z). Repeat this iterative process and compute the error between two successive iterations: if the error is greater than ε, continue the iterative process; if the error is less than ε, the iteration ends. ε is chosen according to the required fusion accuracy.
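One iteration of the procedure above can be sketched as follows. The patent solves for the optimal transform with the quaternion algorithm; this sketch substitutes the SVD (Kabsch) solution, which minimizes the same sum of squared Euclidean distances, and uses a brute-force nearest-neighbour search — an illustrative assumption, not the patented implementation:

```python
import numpy as np

def icp_step(M1, M2):
    """One iteration: match each point of M2 to its closest point in
    M1, then solve for the rigid transform (R, t) minimising the sum
    of squared distances, and return the transformed (updated) M2."""
    # Closest point in M1 for every point of M2 (brute force).
    d = np.linalg.norm(M2[:, None, :] - M1[None, :, :], axis=2)
    matches = M1[np.argmin(d, axis=1)]
    # Optimal rigid transform via SVD (Kabsch algorithm).
    c2, c1 = M2.mean(axis=0), matches.mean(axis=0)
    H = (M2 - c2).T @ (matches - c1)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c1 - R @ c2
    return M2 @ R.T + t                # updated M2'
```

On a small translated point set with unambiguous nearest neighbours, a single step recovers the alignment exactly.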
The Hausdorff distance is used to measure the distance between M1(x, y, z) and M′2(x, y, z). The Hausdorff distance D(M1, M2) between three-dimensional geometric maps M1(x, y, z) and M2(x, y, z) is defined as

D(M1, M2) = max(d(M1, M2), d(M2, M1))

where the directed distances are

d(M1, M2) = max over p ∈ M1 of min over q ∈ M2 of ||p − q||
d(M2, M1) = max over p ∈ M2 of min over q ∈ M1 of ||p − q||
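The symmetric Hausdorff distance defined above, sketched with a brute-force pairwise distance matrix:

```python
import numpy as np

def hausdorff(M1, M2):
    """Symmetric Hausdorff distance D(M1, M2) = max(d(M1,M2), d(M2,M1)),
    where d(A, B) is the largest distance from any point of A to its
    closest point of B."""
    d = np.linalg.norm(M1[:, None, :] - M2[None, :, :], axis=2)
    d12 = d.min(axis=1).max()   # directed d(M1, M2)
    d21 = d.min(axis=0).max()   # directed d(M2, M1)
    return max(d12, d21)
```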
The two three-dimensional geometric maps shown in Fig. 3 are used as test input. Their projection onto the two-dimensional plane before fusion is shown in Fig. 4; the projection onto the two-dimensional plane after fusion with the method of the invention is shown in Fig. 5, and the projection after fusion with a prior-art method is shown in Fig. 6. Comparing Figs. 4-6 shows that the method of the invention achieves a more satisfactory fusion result.
Claims (2)
1. A fusion method of a multi-robot three-dimensional geometric map, characterized by comprising the following steps:
(1) projecting the two three-dimensional geometric maps to be fused, M1(x, y, z) and M2(x, y, z), onto the x-y plane to obtain two-dimensional maps, then rasterizing the two-dimensional maps with a fixed cell size to obtain two grid maps to be fused, wherein a cell containing a feature is an occupied cell, a cell containing no feature is a free cell, and the two kinds of cell are assigned different values;
(2) applying the Fourier-Mellin transform to the two grid maps to be fused and computing the relative rotation angle θ between the two grid maps;
(3) using the relative rotation angle θ obtained in step (2) to rotate the three-dimensional geometric map M2(x, y, z), obtaining the transformed three-dimensional geometric map M′2(x, y, z);
(4) fusing M1(x, y, z) and M′2(x, y, z) with the iterative closest point algorithm.
2. The fusion method of claim 1, characterized in that in step (4) the process of fusing M1(x, y, z) and M′2(x, y, z) is as follows:
(4.1) for every point in M′2(x, y, z), computing the closest point in M1(x, y, z), then using the quaternion algorithm to solve for the transformation matrix K that minimizes the sum of squared Euclidean distances between each point in M′2(x, y, z) and its corresponding closest point in M1(x, y, z);
(4.2) applying the transformation matrix K to M′2(x, y, z) to obtain an updated M′2(x, y, z), and computing the distance between M1(x, y, z) and the updated M′2(x, y, z) as the result of this iteration;
(4.3) computing the error between two successive iterations: if the error is greater than ε, repeating steps (4.1) and (4.2); if the error is less than ε, ending the iteration; ε being chosen according to the required fusion accuracy.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010102620759A CN101944240B (en) | 2010-08-20 | 2010-08-20 | Fusion method of multi-robot three-dimensional geometrical map |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101944240A CN101944240A (en) | 2011-01-12 |
CN101944240B (en) | 2012-02-15
Family
ID=43436220
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2010102620759A Active CN101944240B (en) | 2010-08-20 | 2010-08-20 | Fusion method of multi-robot three-dimensional geometrical map |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101944240B (en) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105447911B (en) * | 2014-09-26 | 2020-01-31 | 联想(北京)有限公司 | 3D map fusion method and device and electronic equipment |
CN105184243B (en) * | 2015-08-24 | 2018-10-23 | 王红军 | A kind of environmental characteristic expression based on 3 d grid map and knowledge method for distinguishing |
CN105094135A (en) * | 2015-09-03 | 2015-11-25 | 上海电机学院 | Distributed multi-robot map fusion system and fusion method |
CN105469445B (en) * | 2015-12-08 | 2018-06-29 | 电子科技大学 | A kind of step-length changeably drawing generating method |
CN105701771A (en) * | 2016-03-17 | 2016-06-22 | 江苏科技大学 | Digital map stitching method based on radio frequency identification anchors |
CN106227218A (en) * | 2016-09-27 | 2016-12-14 | 深圳乐行天下科技有限公司 | The navigation barrier-avoiding method of a kind of Intelligent mobile equipment and device |
CN106447717B (en) * | 2016-09-30 | 2019-05-03 | 中国科学院自动化研究所 | A kind of method for reconstructing of the light selective film illumination micro-imaging based on multi-angle |
CN109426222A (en) * | 2017-08-24 | 2019-03-05 | 中华映管股份有限公司 | Unmanned handling system and its operating method |
CN108227717B (en) * | 2018-01-30 | 2021-12-03 | 中国人民解放军陆军装甲兵学院 | Multi-mobile-robot map fusion method and fusion platform based on ORB (object-oriented bounding Box) features |
CN108537263B (en) * | 2018-03-29 | 2020-10-30 | 苏州大学张家港工业技术研究院 | Grid map fusion method based on maximum public subgraph |
CN108775901B (en) * | 2018-07-20 | 2021-05-07 | 山东大学 | Real-time SLAM scene map construction system, navigation system and method |
CN111462198B (en) * | 2020-03-10 | 2023-03-24 | 西南交通大学 | Multi-mode image registration method with scale, rotation and radiation invariance |
CN112051921B (en) * | 2020-07-02 | 2023-06-27 | 杭州易现先进科技有限公司 | AR navigation map generation method, device, computer equipment and readable storage medium |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101033971B (en) * | 2007-02-09 | 2011-02-16 | 中国科学院合肥物质科学研究院 | Mobile robot map building system and map building method thereof |
CN101266659B (en) * | 2008-05-08 | 2010-06-09 | 山东大学 | Robot grid sub-map amalgamation method based on immune self-adapted genetic algorithm |
CN101413806B (en) * | 2008-11-07 | 2011-05-25 | 湖南大学 | Mobile robot grating map creating method of real-time data fusion |
Also Published As
Publication number | Publication date |
---|---|
CN101944240A (en) | 2011-01-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN101944240B (en) | Fusion method of multi-robot three-dimensional geometrical map | |
Rabbani et al. | Efficient hough transform for automatic detection of cylinders in point clouds | |
CN107862719B (en) | Method and device for calibrating external parameters of camera, computer equipment and storage medium | |
CN105241461A (en) | Map creating and positioning method of robot and robot system | |
CN105509733A (en) | Measuring method for relative pose of non-cooperative spatial circular object | |
CN105913489A (en) | Indoor three-dimensional scene reconstruction method employing plane characteristics | |
CN104240297A (en) | Rescue robot three-dimensional environment map real-time construction method | |
CN105091744A (en) | Pose detection apparatus and method based on visual sensor and laser range finder | |
CN105258702A (en) | Global positioning method based on SLAM navigation mobile robot | |
Shijie et al. | Monocular vision-based two-stage iterative algorithm for relative position and attitude estimation of docking spacecraft | |
Das et al. | Scan registration with multi-scale k-means normal distributions transform | |
CN103411589B (en) | A kind of 3-D view matching navigation method based on four-dimensional real number matrix | |
Rao et al. | CurveSLAM: An approach for vision-based navigation without point features | |
CN106097431A (en) | A kind of object global recognition method based on 3 d grid map | |
CN205247208U (en) | Robotic system | |
CN106204416A (en) | Panoramic parking assist system and wide angle picture adjustment method thereof and device | |
CN103729510B (en) | Based on the interior 3 D complex model exact mirror image symmetry computational methods accumulateing conversion | |
CN110163902A (en) | A kind of inverse depth estimation method based on factor graph | |
Lee et al. | 3D visual perception system for bin picking in automotive sub-assembly automation | |
CN107449416A (en) | Fixed star hangover asterism extracting method based on vector accumulation | |
KR101093793B1 (en) | Method for measuring 3d pose information using virtual plane information | |
Fleischmann et al. | Image analysis by conformal embedding | |
CN105205859A (en) | Similarity measurement method of environmental characteristics based on three-dimensional raster map | |
CN101344376A (en) | Measuring method for spacing circle geometric parameter based on monocular vision technology | |
CN100590658C (en) | Method for matching two dimensional object point and image point with bilateral constraints |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |