CN101944240A - Fusion method of multi-robot three-dimensional geometrical map - Google Patents
Abstract
The invention discloses a fusion method for multi-robot three-dimensional geometric maps, comprising the steps of: projecting the three-dimensional geometric maps to be fused into two-dimensional grid maps; fusing the grid maps with an image matching algorithm to obtain the rotation parameters between them; rotating the three-dimensional maps accordingly; and fusing the three-dimensional geometric maps with a three-dimensional point-set registration algorithm. By taking the geometric feature information in the three-dimensional maps into account and combining two-dimensional map fusion with three-dimensional point-set registration, the method accomplishes the fusion of multi-robot three-dimensional geometric maps and markedly improves its accuracy. It is mainly applied in multi-robot simultaneous localization and mapping, where the local maps created by individual robots are fused to form a global map.
Description
Technical field
The present invention relates to the field of robot simultaneous localization and mapping, and specifically to a method for fusing the geometric maps created by multiple robots.
Background technology
With the continual expansion of applications, simultaneous localization and mapping algorithms based on single-robot systems can no longer meet the demands in many cases, while multi-robot systems have attracted growing attention for their advantages of concurrent operation, complementary capabilities, and information sharing. In the mobile-robot simultaneous localization and mapping problem, the map is created from the robot's current pose information; therefore, if the initial pose of each robot and its corresponding motion equations are known, simultaneous map building can be achieved through joint multi-robot filtering. In many cases, however, this condition cannot be met, and each robot must build its local map independently. This raises the following question: when each robot maintains a map of its local environment, how can these local maps be used to obtain global information about the environment? This is the multi-robot map fusion problem (as shown in Figure 1).
For the map fusion problem when the relative positions of the robots are unknown, one approach is to estimate the transformation between the maps from mutual measurements between the robots, associate the features in the maps, and finally fuse them; alternatively, the robots exchange measurement information, an adaptive particle filter is used to estimate their relative poses, and the strategy for cooperative multi-robot exploration of the environment is decided accordingly. The fusion of multi-robot occupancy grid maps can be treated as solving for an optimal transformation, or solved by image registration. Most existing methods, however, address only the fusion of two-dimensional maps. The fusion of three-dimensional geometric maps is usually handled directly with a three-dimensional point-set registration algorithm, but such algorithms do not exploit the fact that the map reflects the geometric features of the environment.
Summary of the invention
The invention provides a fusion method for multi-robot three-dimensional geometric maps that, by taking the geometric feature information in the three-dimensional maps into account and combining two-dimensional map fusion with a three-dimensional point-set registration algorithm, accomplishes the fusion of multi-robot three-dimensional geometric maps and markedly improves its accuracy.
A fusion method for multi-robot three-dimensional geometric maps comprises the following steps:
(1) projecting the two three-dimensional geometric maps to be fused, M1(x, y, z) and M2(x, y, z), onto the x-y plane to obtain two-dimensional maps, then rasterizing the two-dimensional maps with a fixed cell size to obtain the two grid maps to be fused, wherein a cell containing a feature is an occupied cell, a cell containing no feature is a free cell, and the two kinds of cell are assigned different values;
(2) applying the Fourier-Mellin transform to the two grid maps to be fused and computing the relative rotation angle θ between them;
(3) using the relative rotation angle θ obtained in step (2) to rotate the three-dimensional geometric map M2(x, y, z), obtaining the rotated three-dimensional geometric map M'2(x, y, z);
(4) fusing M1(x, y, z) and M'2(x, y, z) with the iterative closest point algorithm.
Further, step (4) is:
(4.1) computing, for every point in M'2(x, y, z), the closest point in M1(x, y, z), and using the quaternion-based algorithm to solve for the transformation matrix K that minimizes the sum of squared Euclidean distances between every point in M'2(x, y, z) and its corresponding closest point in M1(x, y, z);
(4.2) transforming M'2(x, y, z) by the transformation matrix K to obtain the updated M'2(x, y, z), and computing the distance between M1(x, y, z) and the updated M'2(x, y, z) as the result of one iteration;
(4.3) repeating steps (4.1) and (4.2) iteratively and computing the error between two successive iterations: if the error is greater than ε, the iteration continues; if the error is less than ε, the iteration ends. ε is chosen according to the required fusion accuracy.
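The least-squares rigid transform of step (4.1) is commonly solved in closed form with Horn's quaternion method. The sketch below assumes the point correspondences have already been fixed, and returns a rotation matrix R and translation t rather than the single matrix K of the text; all function and variable names are illustrative.

```python
import numpy as np

def quaternion_align(P, Q):
    """Horn's quaternion-based absolute orientation: find the rotation R
    and translation t minimizing sum ||R p + t - q||^2 over paired points
    p in P, q in Q (rows are points)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    X, Y = P - cp, Q - cq                     # centered point sets
    S = X.T @ Y                               # cross-covariance matrix
    A = S - S.T
    delta = np.array([A[1, 2], A[2, 0], A[0, 1]])
    # 4x4 symmetric matrix whose top eigenvector is the optimal quaternion
    N = np.zeros((4, 4))
    N[0, 0] = np.trace(S)
    N[0, 1:] = N[1:, 0] = delta
    N[1:, 1:] = S + S.T - np.trace(S) * np.eye(3)
    w, V = np.linalg.eigh(N)
    q = V[:, -1]                              # eigenvector of largest eigenvalue
    w0, x, y, z = q
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w0*z),     2*(x*z + w0*y)],
        [2*(x*y + w0*z),     1 - 2*(x*x + z*z), 2*(y*z - w0*x)],
        [2*(x*z - w0*y),     2*(y*z + w0*x),     1 - 2*(x*x + y*y)],
    ])
    t = cq - R @ cp
    return R, t

# Usage: recover a known rotation (90 degrees about z) and translation
rng = np.random.default_rng(2)
P = rng.random((10, 3))
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
Q = P @ R_true.T + t_true
R, t = quaternion_align(P, Q)
```

The quaternion formulation avoids the reflection ambiguity that a naive least-squares solution can produce, which is presumably why the text names it for step (4.1).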
By taking the geometric feature information in the three-dimensional maps into account and combining two-dimensional map fusion with a three-dimensional point-set registration algorithm, the fusion method of the present invention accomplishes the fusion of multi-robot three-dimensional geometric maps and markedly improves its accuracy. The method is mainly applied in multi-robot simultaneous localization and mapping, where the local maps created by individual robots are fused to form a global map.
Description of drawings
Fig. 1 is a schematic diagram of the multi-robot three-dimensional geometric map fusion problem;
Fig. 2 is a flow chart of the fusion method of the present invention;
Fig. 3 shows the two three-dimensional geometric maps to be fused, used as the test input;
Fig. 4 is the projection of the test input of Fig. 3 onto the two-dimensional plane before fusion;
Fig. 5 is the projection onto the two-dimensional plane of the result of fusing the test input of Fig. 3 with the method of the present invention;
Fig. 6 is the projection onto the two-dimensional plane of the result of fusing the test input of Fig. 3 with a prior-art method.
Embodiment
The present invention is described in detail below with reference to embodiments and the accompanying drawings, but the invention is not limited thereto.
As shown in Figure 2, in a fusion method for multi-robot three-dimensional geometric maps, several robots perceive the environment with vision sensors and the local three-dimensional geometric maps each has created of the environment are fused; the procedure comprises the following steps:
Considering that the robots move on the x-y plane, define the dimensionality-reduction mapping (i.e. the projection model)

f: (x, y, z) → (x, y)

where (x, y, z) are the spatial coordinates of a geometric feature and (x, y) are its planar coordinates.
The two three-dimensional geometric maps to be fused, M1(x, y, z) and M2(x, y, z), are projected onto the x-y plane, and the projected maps are then rasterized with a fixed cell size to obtain the two grid maps to be fused, m1(x, y) and m2(x, y). A grid map consists of a number of cells of identical size, each corresponding to a subregion of the x-y plane; the size of the subregion can be adjusted to the actual situation. In the grid map, a cell containing a feature is an occupied cell, a cell containing no feature is a free cell, and the two kinds of cell are assigned different values.
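The projection and rasterization just described can be sketched as follows; the function name, the cell size, and the particular occupied/free values are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def rasterize(points_3d, cell_size=0.1, occupied=1.0, free=0.0):
    """Project 3D feature points onto the x-y plane (the mapping
    f:(x,y,z)->(x,y)) and build an occupancy grid: cells containing at
    least one feature are marked occupied, all others free."""
    xy = points_3d[:, :2]                              # drop the z coordinate
    origin = xy.min(axis=0)                            # grid anchored at the min corner
    idx = np.floor((xy - origin) / cell_size).astype(int)
    shape = idx.max(axis=0) + 1
    grid = np.full(shape, free)
    grid[idx[:, 0], idx[:, 1]] = occupied
    return grid, origin

# Example: three features; the first two fall in the same 0.5 m cell
pts = np.array([[0.1, 0.1, 2.0],
                [0.2, 0.3, 1.0],
                [0.9, 0.9, 0.5]])
grid, origin = rasterize(pts, cell_size=0.5)
```

Here the grid comes out 2x2, with the diagonal cells occupied and the off-diagonal cells free.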
Suppose that, of the two grid maps to be fused m1(x, y) and m2(x, y), m2(x, y) is obtained from m1(x, y) by translation, rotation, and scaling; then m1(x, y) and m2(x, y) satisfy the transformation relation of formula (I):

m2(x, y) = m1(s(x cos α + y sin α) − x0, s(−x sin α + y cos α) − y0)  (I)

where s is the zoom factor, α is the rotation angle, and x0 and y0 are the translations.
The Fourier transform magnitudes F1(ξ, ζ) and F2(ξ, ζ) corresponding to m1(x, y) and m2(x, y) then satisfy formula (II):

|F2(ξ, ζ)| = s^(-2) |F1(s^(-1)(ξ cos α + ζ sin α), s^(-1)(−ξ sin α + ζ cos α))|  (II)

As formula (II) shows, the relation between the Fourier magnitudes F1(ξ, ζ) and F2(ξ, ζ) of the two grid maps to be fused depends only on the zoom factor s and the rotation angle α, and is independent of the translations x0 and y0.
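The translation invariance of the magnitude spectrum that formula (II) rests on can be checked numerically. A circular shift stands in for translation on the periodic FFT grid, and the 32x32 random map is an arbitrary stand-in for a grid map:

```python
import numpy as np

# The magnitude spectrum is unaffected by translation, which is exactly
# why the relation between |F1| and |F2| involves only the zoom factor s
# and the rotation angle alpha.
rng = np.random.default_rng(0)
m1 = rng.random((32, 32))
m2 = np.roll(m1, shift=(5, 9), axis=(0, 1))    # m1 translated by (5, 9)

mag1 = np.abs(np.fft.fft2(m1))
mag2 = np.abs(np.fft.fft2(m2))                 # identical magnitude spectra
```

Only the phase of the spectrum changes under the shift; the magnitudes agree to machine precision.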
Therefore, the fusion of the grid maps m1(x, y) and m2(x, y) is carried out in two steps: first find the zoom factor s and the rotation angle α from the magnitude spectra, then solve for the translation parameters x0 and y0.
Introducing the polar-coordinate representation of (ξ, ζ) into formula (II) yields formula (III):

|F2(ρ, θ)| = s^(-2) |F1(ρ/s, θ − α)|  (III)

where θ is the relative rotation angle and ρ is the polar radius. Further adopting a logarithmic coordinate for the radius and ignoring the coefficient s^(-2) yields formula (IV):

|F2(log ρ, θ)| = |F1(log ρ − log s, θ − α)|  (IV)
Formula (IV) can be solved with the Fourier shift theorem:
Suppose that, for two images I1(x, y) and I2(x, y), the image I2(x, y) is obtained from I1(x, y) by translations x0 and y0 in the x and y directions respectively; then I1(x, y) and I2(x, y) satisfy the transformation relation of formula (V):

I2(x, y) = I1(x − x0, y − y0)  (V)
Let F1(ξ, ζ) and F2(ξ, ζ) be the Fourier transforms of I1(x, y) and I2(x, y) respectively; then F1(ξ, ζ) and F2(ξ, ζ) satisfy the relation of formula (VI):

F2(ξ, ζ) = e^(−j2π(ξx0 + ζy0)) F1(ξ, ζ)  (VI)

Define the cross-power spectrum of the images I1(x, y) and I2(x, y) as

C(ξ, ζ) = F1(ξ, ζ) F2*(ξ, ζ) / |F1(ξ, ζ) F2*(ξ, ζ)| = e^(j2π(ξx0 + ζy0))  (VII)

where F2* is the complex conjugate of F2. Applying the inverse Fourier transform to formula (VII) produces an impulse function at (x0, y0) in image space; the position of the pulse gives the relative translation x0 and y0 between the two images I1(x, y) and I2(x, y).
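The cross-power-spectrum solution of formulas (V)-(VII) has a direct NumPy sketch. The placement of the conjugate is chosen so that the peak index comes out at positive (x0, y0) for `np.roll` shifts, and the small epsilon guarding the normalization is an implementation detail, not part of the patent:

```python
import numpy as np

def phase_correlation(i1, i2):
    """Estimate the translation of i2 relative to i1 from the normalized
    cross-power spectrum; its inverse transform is an impulse whose
    position is the shift."""
    F1 = np.fft.fft2(i1)
    F2 = np.fft.fft2(i2)
    cross = np.conj(F1) * F2
    cross /= np.abs(cross) + 1e-12          # normalize; eps guards zero bins
    corr = np.fft.ifft2(cross).real
    return np.unravel_index(np.argmax(corr), corr.shape)

# Usage: recover a known circular shift of (7, 3)
rng = np.random.default_rng(1)
img = rng.random((64, 64))
shifted = np.roll(img, shift=(7, 3), axis=(0, 1))
x0, y0 = phase_correlation(img, shifted)
```

Because only the phase of the spectrum is kept, the correlation peak stays sharp even when the two maps differ in brightness or contrast.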
After the zoom factor s and the rotation angle α are obtained, the map m2(x, y) is scaled and rotated to produce the transformed map m'2(x, y); m'2(x, y) and m1(x, y) then differ only by translations in the x and y directions:

m'2(x, y) = m1(x − x0, y − y0)  (VIII)

Applying the Fourier shift theorem once more to formula (VIII) yields the relative translation x0 and y0 between the two maps m'2(x, y) and m1(x, y).
Because the relative rotation angle θ between the planar projections m1(x, y) and m2(x, y) of the three-dimensional geometric maps M1(x, y, z) and M2(x, y, z) can be solved through the fusion of the grid maps, the relative rotation angle θ between M1(x, y, z) and M2(x, y, z) is obtained from that fusion; M2(x, y, z) is rotated by θ to obtain M'2(x, y, z), and the iterative closest point algorithm is then used to fuse the maps.
Each pass of the iterative closest point algorithm is carried out in two steps. First step: find, for every point in M'2(x, y, z), the closest point in M1(x, y, z), then use the quaternion-based algorithm to solve for the transformation matrix K that minimizes the sum of squared Euclidean distances between every point in M'2(x, y, z) and its corresponding closest point in M1(x, y, z). Second step: transform M'2(x, y, z) by K to obtain the updated M'2(x, y, z), then compute the distance between M1(x, y, z) and M'2(x, y, z). The above iteration is repeated and the error between two successive iterations is computed: if the error is greater than ε, the iteration continues; if the error is less than ε, the iteration ends. ε is chosen according to the required fusion accuracy.
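The two-step iteration above can be sketched as follows. This version solves the rigid alignment by SVD, which yields the same least-squares optimum as the quaternion algorithm the text names; the brute-force nearest-neighbor search and the grid-of-points example are for clarity only.

```python
import numpy as np

def icp(M1, M2p, eps=1e-6, max_iter=50):
    """Iterative closest point: fuse the rotated map M2' into M1's frame.
    Each pass pairs every point of M2' with its closest point in M1,
    solves the least-squares rigid transform for the pairs, applies it,
    and stops when the error change between iterations drops below eps."""
    prev_err = np.inf
    err = np.inf
    for _ in range(max_iter):
        d = np.linalg.norm(M2p[:, None, :] - M1[None, :, :], axis=2)
        nn = M1[np.argmin(d, axis=1)]          # closest point in M1 for each point of M2'
        err = np.mean(np.min(d, axis=1) ** 2)  # mean squared closest-point distance
        if prev_err - err < eps:               # error change below threshold: stop
            break
        prev_err = err
        c2, c1 = M2p.mean(axis=0), nn.mean(axis=0)
        H = (M2p - c2).T @ (nn - c1)           # cross-covariance of the paired sets
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:               # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        M2p = (M2p - c2) @ R.T + c1            # apply the transform (the matrix K of step 4.2)
    return M2p, err

# Usage: a grid of points displaced by a small translation snaps back onto M1
M1 = np.array([[float(i), float(j), float(k)]
               for i in range(3) for j in range(3) for k in range(2)])
M2p = M1 + np.array([0.05, -0.03, 0.02])
fused, final_err = icp(M1, M2p)
```

As with any iterative closest point variant, convergence to the correct fusion depends on the initial alignment being roughly right, which is exactly what the Fourier-Mellin rotation step provides.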
The Hausdorff distance is used to measure the distance between M1(x, y, z) and M'2(x, y, z). The Hausdorff distance D(M1, M2) between the three-dimensional geometric maps M1(x, y, z) and M2(x, y, z) is defined as:

D(M1, M2) = max(d(M1, M2), d(M2, M1))

where d(M1, M2) is the directed distance from M1 to M2, i.e. the largest of the distances from each point of M1 to its nearest point in M2:

d(M1, M2) = max over p in M1 of (min over q in M2 of ||p − q||)
d(M2, M1) = max over p in M2 of (min over q in M1 of ||p − q||)
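The Hausdorff distance defined above has a direct NumPy transcription; the two tiny point sets in the example are illustrative only.

```python
import numpy as np

def hausdorff(M1, M2):
    """Symmetric Hausdorff distance between two point sets:
    D(M1, M2) = max(d(M1, M2), d(M2, M1)), where d(A, B) is the largest
    distance from a point of A to its nearest point in B."""
    d = np.linalg.norm(M1[:, None, :] - M2[None, :, :], axis=2)
    return max(d.min(axis=1).max(),   # d(M1, M2)
               d.min(axis=0).max())   # d(M2, M1)

# Usage: the point of B at x = 3 is 2 away from its nearest point of A
A = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
B = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0]])
```

Unlike the mean squared distance driving the iteration itself, the Hausdorff distance is sensitive to the single worst-matched point, which makes it a strict measure of how completely the two maps overlap.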
The two three-dimensional geometric maps shown in Fig. 3 were used as the test input. Their projection onto the two-dimensional plane before fusion is shown in Fig. 4; the projection of the result after fusion with the method of the present invention is shown in Fig. 5; and the projection of the result after fusion with a prior-art method is shown in Fig. 6. Comparing Figs. 4 to 6, it can be concluded that the method of the present invention achieves a markedly better fusion result.
Claims (2)
1. A fusion method for multi-robot three-dimensional geometric maps, characterized by comprising the following steps:
(1) projecting the two three-dimensional geometric maps to be fused, M1(x, y, z) and M2(x, y, z), onto the x-y plane to obtain two-dimensional maps, then rasterizing the two-dimensional maps with a fixed cell size to obtain the two grid maps to be fused, wherein a cell containing a feature is an occupied cell, a cell containing no feature is a free cell, and the two kinds of cell are assigned different values;
(2) applying the Fourier-Mellin transform to the two grid maps to be fused and computing the relative rotation angle θ between the two grid maps;
(3) using the relative rotation angle θ obtained in step (2) to rotate the three-dimensional geometric map M2(x, y, z), obtaining the transformed three-dimensional geometric map M'2(x, y, z);
(4) fusing M1(x, y, z) and M'2(x, y, z) with the iterative closest point algorithm.
2. The fusion method as claimed in claim 1, characterized in that, in step (4), the process of fusing M1(x, y, z) and M'2(x, y, z) is as follows:
(4.1) computing, for every point in M'2(x, y, z), the closest point in M1(x, y, z), and using the quaternion-based algorithm to solve for the transformation matrix K that minimizes the sum of squared Euclidean distances between every point in M'2(x, y, z) and its corresponding closest point in M1(x, y, z);
(4.2) transforming M'2(x, y, z) by the transformation matrix K to obtain the updated M'2(x, y, z), and computing the distance between M1(x, y, z) and the updated M'2(x, y, z) as the result of one iteration;
(4.3) repeating steps (4.1) and (4.2) and computing the error between two successive iterations: if the error is greater than ε, the above steps continue to be repeated; if the error is less than ε, the iteration ends.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010102620759A CN101944240B (en) | 2010-08-20 | 2010-08-20 | Fusion method of multi-robot three-dimensional geometrical map |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101944240A true CN101944240A (en) | 2011-01-12 |
CN101944240B CN101944240B (en) | 2012-02-15 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105094135A (en) * | 2015-09-03 | 2015-11-25 | 上海电机学院 | Distributed multi-robot map fusion system and fusion method |
CN105184243A (en) * | 2015-08-24 | 2015-12-23 | 王红军 | Environment characteristic expression and identification method based on three dimensional grid map |
CN105447911A (en) * | 2014-09-26 | 2016-03-30 | 联想(北京)有限公司 | 3D map merging method, 3D map merging device and electronic device |
CN105469445A (en) * | 2015-12-08 | 2016-04-06 | 电子科技大学 | Step size changeable map generation method |
CN105701771A (en) * | 2016-03-17 | 2016-06-22 | 江苏科技大学 | Digital map stitching method based on radio frequency identification anchors |
CN106227218A (en) * | 2016-09-27 | 2016-12-14 | 深圳乐行天下科技有限公司 | The navigation barrier-avoiding method of a kind of Intelligent mobile equipment and device |
CN106447717A (en) * | 2016-09-30 | 2017-02-22 | 中国科学院自动化研究所 | Multi-angle based selective light-sheet illumination microscopy imaging reconstruction method |
CN108227717A (en) * | 2018-01-30 | 2018-06-29 | 中国人民解放军陆军装甲兵学院 | Multiple mobile robot's map amalgamation method and convergence platform based on ORB features |
CN108537263A (en) * | 2018-03-29 | 2018-09-14 | 苏州大学张家港工业技术研究院 | Grid map fusion method based on maximum public subgraph |
CN108775901A (en) * | 2018-07-20 | 2018-11-09 | 山东大学 | A kind of real-time SLAM scenes map structuring system, navigation system and method |
CN109426222A (en) * | 2017-08-24 | 2019-03-05 | 中华映管股份有限公司 | Unmanned handling system and its operating method |
CN111462198A (en) * | 2020-03-10 | 2020-07-28 | 西南交通大学 | Multi-mode image registration method with scale, rotation and radiation invariance |
CN112051921A (en) * | 2020-07-02 | 2020-12-08 | 杭州易现先进科技有限公司 | AR navigation map generation method and device, computer equipment and readable storage medium |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101033971A (en) * | 2007-02-09 | 2007-09-12 | 中国科学院合肥物质科学研究院 | Mobile robot map building system and map building method thereof |
CN101266659A (en) * | 2008-05-08 | 2008-09-17 | 山东大学 | Robot grid sub-map amalgamation method based on immune self-adapted genetic algorithm |
CN101413806A (en) * | 2008-11-07 | 2009-04-22 | 湖南大学 | Mobile robot grating map creating method of real-time data fusion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant |