LU501005B1 - Global drift-free simultaneous localization and mapping method for autonomous robot - Google Patents

Global drift-free simultaneous localization and mapping method for autonomous robot

Info

Publication number
LU501005B1
LU501005B1 LU501005A LU501005A LU501005B1 LU 501005 B1 LU501005 B1 LU 501005B1 LU 501005 A LU501005 A LU 501005A LU 501005 A LU501005 A LU 501005A LU 501005 B1 LU501005 B1 LU 501005B1
Authority
LU
Luxembourg
Prior art keywords
global
frame
line
autonomous robot
mapping method
Prior art date
Application number
LU501005A
Other languages
German (de)
Inventor
Liangyu Zhao
Original Assignee
Beijing Institute Tech
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute Tech filed Critical Beijing Institute Tech
Priority to LU501005A priority Critical patent/LU501005B1/en
Application granted granted Critical
Publication of LU501005B1 publication Critical patent/LU501005B1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/579Depth or shape recovery from multiple images from motion
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • G06T2207/10044Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present invention discloses a global drift-free simultaneous localization and mapping method for an autonomous robot. Line features are added to local state estimation to represent geometric structure information of an environment more intuitively. Meanwhile, a line representation method based on linear errors and a local state and global sensor information fusion algorithm are used to effectively solve the problem of accurate state estimation in large-scale weak texture scenarios, realize locally accurate and global drift-free pose estimation, and improve the robustness in repeated line feature texture scenarios.

Description

GLOBAL DRIFT-FREE SIMULTANEOUS LOCALIZATION AND MAPPING METHOD FOR AUTONOMOUS ROBOT
TECHNICAL FIELD
[01] The present invention relates to a simultaneous localization and mapping method for an autonomous robot, in particular to a global drift-free simultaneous localization and mapping method for an autonomous robot, belonging to the field of intelligent control of robots.
BACKGROUND ART
[02] The Simultaneous Localization and Mapping (SLAM) system is a key technology of autonomous navigation. A visual-inertial fused SLAM system can mitigate the problems a camera suffers under strong illumination changes by using the acceleration and angular-speed information sensed by its inertial measurement unit (IMU), and it achieves high positioning accuracy in a small area.
[03] However, the state estimation technology for autonomous robots still faces huge challenges in large-scale weak texture scenarios. On the one hand, it is difficult to extract a large number of effective feature points, so that the localization accuracy is greatly reduced. Videos with poor texture or motion blur may even cause the system to completely fail.
[04] On the other hand, the multi-source information fusion methods used in large-scale scenarios do not account for the errors of global localization; the global measurements are noisy and have a low acquisition frequency, so they cannot be used alone for high-precision localization.
SUMMARY
[05] In order to overcome the above problems, the present invention designs a global drift-free simultaneous localization and mapping method for an autonomous robot, which fuses local positioning and global sensing localization to achieve accurate global sensing localization.
[06] The local positioning includes state estimation using a camera, an inertial measurement unit, and a laser radar;
[07] The global sensing localization includes state estimation using a global navigation satellite system (GNSS), a magnetometer, and a barometer.
[08] Further, multi-source fusion is performed on the visual-inertial data and the global navigation satellite system, the visual-inertial data coming from the inertial measurement unit and the camera.
[09] According to the present invention, the method includes the following steps:
[10] S1, tracking of front-end point and line features;
[11] S2, joint optimization of back-end visual inertial SLAM poses; and
[12] S3, fusion with a GNSS pose to obtain a global pose.
[13] In step S1, linear errors are used to express linear features as linear constraints on endpoints of a line, so that the line features are naturally integrated into a linear expression based on a feature point algorithm.
[14] Specifically, step S1 includes the following sub-steps:
[15] S11, establishing a point-line feature representation observation model; and
[16] S12, establishing an IMU pre-integration observation model.
[17] The present invention has the following beneficial effects.
[18] (1) The autonomous robot can achieve locally accurate and global drift-free pose estimation in large-scale weak texture scenarios.
[19] (2) Line features are added to local state estimation, which effectively improves the accuracy of relative pose estimation between key frames.
[20] (3) Linear features are expressed as linear constraints on endpoints of a line, so that the line features are naturally integrated into a linear expression based on a feature point algorithm, which effectively improves the robustness of the algorithm in the scenarios of repeated line features.
[21] (4) Locally accurate and global drift-free pose estimation is achieved.
BRIEF DESCRIPTION OF THE DRAWINGS
[22] FIG 1 shows a schematic flow diagram of a simultaneous localization and mapping method for an autonomous robot according to a preferred embodiment of the present invention;
[23] FIG 2 shows a schematic diagram of projection errors of a line feature in the simultaneous localization and mapping method for an autonomous robot;
[24] FIG 3 shows a schematic diagram of projection errors of a line feature in the simultaneous localization and mapping method for an autonomous robot; and
[25] FIG 4 shows a schematic diagram of a pre-integration process in the simultaneous localization and mapping method for an autonomous robot according to a preferred embodiment.
DETAILED DESCRIPTION OF THE EMBODIMENTS
[26] A global drift-free simultaneous localization and mapping method for an autonomous robot according to the present invention includes the following steps to fuse local positioning and global sensing localization:
[27] S1. Tracking of front-end point and line features;
[28] S2. Joint optimization of back-end visual inertial SLAM poses; and
[29] S3. Fusion with a GNSS pose to obtain a global pose.
[30] In step S1, an optical flow method is used to perform front-end data association between adjacent image frames captured by a camera, so as to track features.
[31] In the present invention, a linear error is used to express a linear feature as linear constraints on the endpoints of a line, so that the line feature is naturally integrated into a linear expression based on a feature point algorithm, which effectively improves the robustness of the algorithm in scenarios with repeated line features.
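As a concrete illustration of the optical-flow front-end data association described in the two preceding paragraphs, the following Python sketch (OpenCV and NumPy assumed; `prev_img`, `curr_img` and all function and parameter values are hypothetical choices, not the patented implementation) tracks corner features from one grayscale frame to the next with pyramidal Lucas-Kanade optical flow.

```python
import cv2
import numpy as np

def track_features(prev_img, curr_img, prev_pts=None):
    """Track point features from prev_img to curr_img with pyramidal LK optical flow."""
    if prev_pts is None:
        # Detect corners in the previous frame (parameter values are illustrative only).
        prev_pts = cv2.goodFeaturesToTrack(prev_img, maxCorners=200,
                                           qualityLevel=0.01, minDistance=20)
    # Forward optical flow: where did each feature move in the current frame?
    curr_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_img, curr_img, prev_pts, None,
                                                   winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1          # keep only successfully tracked features
    return prev_pts[ok], curr_pts[ok]
```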
[32] Specifically, step S1 includes the following sub-steps:
[33] S11. A point-line feature representation observation model is established;
[34] S12. An IMU pre-integration observation model is established.
[35] In step S11, projection errors of a line feature are shown in FIG 2 and FIG 3.
Here, $i$ indexes the different image frames of the camera, and the pose of the $i$-th camera frame is $T_{cw,i} \in SE(3)$, where $SE(3)$ denotes the group of Euclidean transformations; $w$ denotes the world coordinate system. A frame of image usually contains a plurality of point features; the 3D coordinates of the observed $j$-th feature point are expressed as $X_{w,j} \in \mathbb{R}^3$, the endpoints of the $k$-th line segment are $P_{w,k}$ and $Q_{w,k} \in \mathbb{R}^3$, their projected coordinates on the image plane are $p_{ik}, q_{ik} \in \mathbb{R}^2$, and the corresponding homogeneous coordinates are $p^h_{ik}, q^h_{ik} \in \mathbb{R}^3$; the 2D coordinates of $P_{w,k}$ and $Q_{w,k}$ actually detected on the image plane are $\tilde{p}_{ik}$ and $\tilde{q}_{ik} \in \mathbb{R}^2$, with corresponding homogeneous coordinates $\tilde{p}^h_{ik}$ and $\tilde{q}^h_{ik} \in \mathbb{R}^3$. $\mathbb{R}^3$ represents three-dimensional coordinates and $\mathbb{R}^2$ represents two-dimensional coordinates.
[36] Further, in the present invention, a plane can be determined from the optical center of the camera and the line constituted by $\tilde{p}^h_{ik}$ and $\tilde{q}^h_{ik}$, and its unit normal vector $l$ is expressed as:

[37]
$$
l = \begin{bmatrix} l_1 & l_2 & l_3 \end{bmatrix}^{T} = \frac{\tilde{p}^h_{ik} \times \tilde{q}^h_{ik}}{\bigl\| \tilde{p}^h_{ik} \times \tilde{q}^h_{ik} \bigr\|}
\tag{1}
$$

[38] where $l_1$, $l_2$ and $l_3$ are parameter coefficients and are the values to be solved, and $\tilde{p}^h_{ik}$ and $\tilde{q}^h_{ik}$ can be obtained from the image.
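A minimal NumPy sketch of equation (1), computing the unit normal of the plane through the optical center and the two detected homogeneous endpoints, is given below; the variable names are illustrative only.

```python
import numpy as np

def line_normal(p_h, q_h):
    """Unit normal vector l of equation (1) from two detected homogeneous endpoints (3,)."""
    n = np.cross(p_h, q_h)
    return n / np.linalg.norm(n)

# example with two detected endpoints at pixels (100, 50) and (180, 75):
l = line_normal(np.array([100.0, 50.0, 1.0]), np.array([180.0, 75.0, 1.0]))
```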
[39] In the present invention, the projection model of the camera is expressed as $\pi$, and its homogeneous form is expressed as $\pi^h$:

[40]
$$
O = \begin{bmatrix} o_x & o_y & o_z \end{bmatrix}^{T} = R_{cw} P_w + t_{cw}, \qquad
\pi\bigl(T_{cw}, P_w\bigr) = \begin{bmatrix} f_x \dfrac{o_x}{o_z} + c_x \\[2mm] f_y \dfrac{o_y}{o_z} + c_y \end{bmatrix}, \qquad
\pi^{h}\bigl(T_{cw}, P_w\bigr) = \begin{bmatrix} \pi\bigl(T_{cw}, P_w\bigr) \\ 1 \end{bmatrix}
\tag{2}
$$

[41] where $R_{cw}$ and $t_{cw}$ represent the rotation and translation of $T_{cw}$ respectively; $P_w \in \mathbb{R}^3$ represents point coordinates in the world coordinate system, $O$ is the point coordinates in the corresponding camera coordinate system, and $f_x$, $f_y$ and $c_x$, $c_y$ are the focal lengths and the principal point of the camera respectively.
[42] Further, in the present invention, the point-line feature representation observation model is represented by the re-projection error $e^{p}_{ij}$ of the point features and the re-projection error $e^{l}_{ik}$ of the line features, respectively:

[43]
$$
e^{p}_{ij} = \begin{bmatrix} \tilde{u}_{ij} \\ \tilde{v}_{ij} \end{bmatrix} - \pi\bigl(T_{cw,i}, X_{w,j}\bigr), \qquad
e^{l}_{ik} = \begin{bmatrix} d_{p_{ik}} \\ d_{q_{ik}} \end{bmatrix} =
\begin{bmatrix} l^{T}\, \pi^{h}\bigl(T_{cw,i}, P_{w,k}\bigr) \\ l^{T}\, \pi^{h}\bigl(T_{cw,i}, Q_{w,k}\bigr) \end{bmatrix}
\tag{3}
$$

[44] where the superscript $p$ represents a point, the superscript $l$ represents a line, $\tilde{u}_{ij}$ and $\tilde{v}_{ij}$ represent the pixel coordinates of the $j$-th feature point obtained by matching in the $i$-th frame, and $d_{p_{ik}}$ and $d_{q_{ik}}$ are respectively the distances from $p_{ik}$ and $q_{ik}$ to the projection line.
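The projection of equation (2) and the point/line re-projection errors of equation (3) can be sketched in NumPy as follows; this is an illustrative reading of the formulas with assumed names, not the patented implementation.

```python
import numpy as np

def project(R_cw, t_cw, P_w, fx, fy, cx, cy):
    """Pinhole projection pi(T_cw, P_w) of equation (2)."""
    o = R_cw @ P_w + t_cw                       # point in the camera coordinate system
    return np.array([fx * o[0] / o[2] + cx,
                     fy * o[1] / o[2] + cy])

def point_error(uv_obs, R_cw, t_cw, X_w, fx, fy, cx, cy):
    """Point re-projection error e^p of equation (3)."""
    return np.asarray(uv_obs) - project(R_cw, t_cw, X_w, fx, fy, cx, cy)

def line_error(l, R_cw, t_cw, P_w, Q_w, fx, fy, cx, cy):
    """Line re-projection error e^l of equation (3): l^T pi_h(T, .) for both endpoints."""
    ph = np.append(project(R_cw, t_cw, P_w, fx, fy, cx, cy), 1.0)
    qh = np.append(project(R_cw, t_cw, Q_w, fx, fy, cx, cy), 1.0)
    return np.array([l @ ph, l @ qh])
```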
[45] Further, for equation (3), the Jacobian matrix of the line feature re-projection error with respect to a disturbance of the camera pose is obtained by the chain rule:

[46]
$$
\frac{\partial e^{l}_{ik}}{\partial \delta\xi} =
\begin{bmatrix}
\dfrac{\partial d_{p_{ik}}}{\partial O_{P}} \dfrac{\partial O_{P}}{\partial \delta\xi} \\[3mm]
\dfrac{\partial d_{q_{ik}}}{\partial O_{Q}} \dfrac{\partial O_{Q}}{\partial \delta\xi}
\end{bmatrix}, \qquad
\frac{\partial d}{\partial O} =
\begin{bmatrix}
l_1 \dfrac{f_x}{o_z} & l_2 \dfrac{f_y}{o_z} & -\,l_1 \dfrac{f_x o_x}{o_z^{2}} - l_2 \dfrac{f_y o_y}{o_z^{2}}
\end{bmatrix}, \qquad
\frac{\partial O}{\partial \delta\xi} =
\begin{bmatrix} I_{3\times 3} & -\,[O]_{\times} \end{bmatrix}
\tag{4}
$$

[47] where $\xi$ is the Lie algebra representation form of $T_{cw,i}$, $O$ stands for the transformed endpoint $O_P$ or $O_Q$ defined in equation (7) with components $o_x$, $o_y$, $o_z$, and $[\,\cdot\,]_{\times}$ denotes the antisymmetric matrix, which can be expressed as:

[48]
$$
\begin{bmatrix} a \\ b \\ c \end{bmatrix}_{\times} =
\begin{bmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{bmatrix}
\tag{5}
$$
[49] Further, the Jacobian matrixes of the line feature re-projection error with respect to the endpoints $P_{w,k}$ and $Q_{w,k}$ of the line segment are:

[50]
$$
\frac{\partial e^{l}_{ik}}{\partial P_{w,k}} =
\begin{bmatrix}
l_1 \dfrac{f_x}{o_{P_z}} & l_2 \dfrac{f_y}{o_{P_z}} & -\,l_1 \dfrac{f_x o_{P_x}}{o_{P_z}^{2}} - l_2 \dfrac{f_y o_{P_y}}{o_{P_z}^{2}} \\
0 & 0 & 0
\end{bmatrix} R_{cw}, \qquad
\frac{\partial e^{l}_{ik}}{\partial Q_{w,k}} =
\begin{bmatrix}
0 & 0 & 0 \\
l_1 \dfrac{f_x}{o_{Q_z}} & l_2 \dfrac{f_y}{o_{Q_z}} & -\,l_1 \dfrac{f_x o_{Q_x}}{o_{Q_z}^{2}} - l_2 \dfrac{f_y o_{Q_y}}{o_{Q_z}^{2}}
\end{bmatrix} R_{cw}
\tag{6}
$$

[51] Further, in equation (6), $o_{P_x}$, $o_{P_y}$ and $o_{P_z}$ are the coordinate components of the point $O_P$, and $O_P$ represents the camera coordinates corresponding to the point $P_{w,k}$; $o_{Q_x}$, $o_{Q_y}$ and $o_{Q_z}$ are the coordinate components of the point $O_Q$, and $O_Q$ represents the camera coordinates corresponding to the point $Q_{w,k}$; $O_P$ and $O_Q$ can be expressed by the following equation:

[52]
$$
O_P = \begin{bmatrix} o_{P_x} \\ o_{P_y} \\ o_{P_z} \end{bmatrix} = R_{cw} P_{w,k} + t_{cw}, \qquad
O_Q = \begin{bmatrix} o_{Q_x} \\ o_{Q_y} \\ o_{Q_z} \end{bmatrix} = R_{cw} Q_{w,k} + t_{cw}
\tag{7}
$$

[53] where $R_{cw}$ represents the rotation from the world coordinate system to the camera coordinate system, and $t_{cw}$ represents the translation from the world coordinate system to the camera coordinate system.
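For reference, the building blocks of equations (4) to (7) can be sketched as below: the antisymmetric operator of equation (5), the derivative of the line error with respect to a camera-frame point, and the derivative of that point with respect to a left se(3) perturbation. The perturbation convention and all names are assumptions; the full 2x6 Jacobian is assembled from these pieces by the chain rule of equation (4).

```python
import numpy as np

def skew(v):
    """Antisymmetric (skew-symmetric) matrix of equation (5) for a vector (a, b, c)."""
    a, b, c = v
    return np.array([[0.0, -c,   b],
                     [c,   0.0, -a],
                     [-b,  a,   0.0]])

def d_error_d_O(l, O, fx, fy):
    """Row of the Jacobian: derivative of l^T pi_h w.r.t. the camera-frame endpoint O."""
    ox, oy, oz = O
    return np.array([l[0] * fx / oz,
                     l[1] * fy / oz,
                     -l[0] * fx * ox / oz**2 - l[1] * fy * oy / oz**2])

def d_O_d_pose(O):
    """Derivative of O w.r.t. a left se(3) perturbation [dt, dtheta]: [I, -[O]x]."""
    return np.hstack([np.eye(3), -skew(O)])
```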
[54] In step S12, a scale factor of visual measurement is optimized by using inertial measurement data, and pose estimation results of two sensors are aligned to realize visual-inertial joint pose calibration.
[55] Further, the state transfer equation of the IMU can be expressed as:

[56]
$$
\begin{aligned}
p^{w}_{b_{i+1}} &= p^{w}_{b_i} + v^{w}_{i}\,\delta t + \iint_{t\in[i,\,i+1]} \bigl(R^{w}_{b_t}\, a^{b_t} - g^{w}\bigr)\,\delta t^{2} \\
v^{w}_{i+1} &= v^{w}_{i} + \int_{t\in[i,\,i+1]} \bigl(R^{w}_{b_t}\, a^{b_t} - g^{w}\bigr)\,\delta t \\
q^{w}_{b_{i+1}} &= q^{w}_{b_i} \otimes \int_{t\in[i,\,i+1]} \tfrac{1}{2}\,\Omega\bigl(\omega^{b_t}\bigr)\, q^{b_i}_{b_t}\,\delta t
\end{aligned}
\tag{8}
$$
[57] All IMU measurements between the $i$-th frame and the $(i{+}1)$-th frame are integrated to obtain the position $p^{w}_{b_{i+1}}$, the speed $v^{w}_{i+1}$ and the rotation $q^{w}_{b_{i+1}}$ of the $(i{+}1)$-th frame;
[58] $v^{w}_{i}$ represents the speed of the $i$-th frame, $R^{w}_{b_t}$ represents the rotation from the body coordinate system to the world coordinate system at time $t$, $a^{b_t}$ represents the acceleration measured by the IMU in the body coordinate system, and $\delta t$ represents the time from the $i$-th frame to the $(i{+}1)$-th frame;
[59] $g^{w}$ represents the acceleration of gravity in the world coordinate system, and $\omega^{b_t}$ represents the angular speed measured by the IMU.
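Discretised with a simple Euler step, equation (8) looks roughly like the sketch below; it is shown only to make the contrast with the pre-integration form introduced next concrete. The quaternion update is replaced here by a first-order rotation-matrix update, and all names are illustrative.

```python
import numpy as np

def propagate_state(p, v, R_wb, a_b, w_b, g_w, dt):
    """One Euler step of equation (8) in the world frame.
    p, v: position/velocity (3,); R_wb: body-to-world rotation (3, 3);
    a_b, w_b: IMU acceleration / angular rate in the body frame; g_w: gravity (3,)."""
    a_w = R_wb @ a_b - g_w                      # specific force in world frame, minus gravity
    p_next = p + v * dt + 0.5 * a_w * dt**2
    v_next = v + a_w * dt
    # first-order rotation update R_wb * (I + [w_b]x * dt), i.e. a small-angle approximation
    dR = np.eye(3) + np.array([[0, -w_b[2],  w_b[1]],
                               [w_b[2], 0,  -w_b[0]],
                               [-w_b[1], w_b[0], 0]]) * dt
    return p_next, v_next, R_wb @ dR
```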
[60] The state transfer of the IMU depends on the rotation, position and speed of the $i$-th frame. In the traditional algorithm, each time the robot adjusts its pose, the speed and rotation of the $i$-th frame have to be updated iteratively, and the IMU measurements have to be re-integrated after each iteration, so this transfer strategy is time-consuming.
[61] In the present invention, the optimization variables are separated from the IMU pre-integration terms from the $i$-th frame to the $(i{+}1)$-th frame:

[62]
$$
\begin{aligned}
R^{b_i}_{w}\, p^{w}_{b_{i+1}} &= R^{b_i}_{w}\Bigl(p^{w}_{b_i} + v^{w}_{i}\,\Delta t - \tfrac{1}{2}\, g^{w}\,\Delta t^{2}\Bigr) + \alpha^{b_i}_{b_{i+1}} \\
R^{b_i}_{w}\, v^{w}_{i+1} &= R^{b_i}_{w}\bigl(v^{w}_{i} - g^{w}\,\Delta t\bigr) + \beta^{b_i}_{b_{i+1}} \\
q^{b_i}_{w} \otimes q^{w}_{b_{i+1}} &= \gamma^{b_i}_{b_{i+1}}
\end{aligned}
\tag{9}
$$

[63] where $R^{b_i}_{w}$ represents the rotation from the world coordinate system to the body coordinate system of the $i$-th frame, and $q^{b_i}_{w}$ represents the quaternion corresponding to that rotation; $\bigl[\alpha^{b_i}_{b_{i+1}},\ \beta^{b_i}_{b_{i+1}},\ \gamma^{b_i}_{b_{i+1}}\bigr]$ is the pre-integrated measurement value, which further satisfies:
[64]
$$
\begin{aligned}
\alpha^{b_i}_{b_{i+1}} &= \iint_{t\in[i,\,i+1]} R^{b_i}_{b_t}\, a^{b_t}\,\delta t^{2} \\
\beta^{b_i}_{b_{i+1}} &= \int_{t\in[i,\,i+1]} R^{b_i}_{b_t}\, a^{b_t}\,\delta t \\
\gamma^{b_i}_{b_{i+1}} &= \int_{t\in[i,\,i+1]} \tfrac{1}{2}\,\Omega\bigl(\omega^{b_t}\bigr)\, \gamma^{b_i}_{b_t}\,\delta t
\end{aligned}
\tag{10}
$$

[65] where $R^{b_i}_{b_t}$ represents the rotation from the body coordinate system at time $t$ to the body coordinate system at the $i$-th frame, and $\gamma^{b_i}_{b_t}$ represents the corresponding rotation increment from time $t$ to time $i$ in the body coordinate system, expressed as a quaternion.
[66] Since IMU measurement values usually contain noise and are collected in discrete time, considering the noise and the discrete-time conditions, the IMU pre-integration observation model in discrete time between the $m$-th and the $(m{+}1)$-th IMU measurement can be obtained according to equation (10):

[67]
$$
\begin{aligned}
\hat{\alpha}^{b_i}_{b_{m+1}} &= \hat{\alpha}^{b_i}_{b_m} + \hat{\beta}^{b_i}_{b_m}\,\delta t + \tfrac{1}{2}\,\hat{a}\,\delta t^{2} \\
\hat{\beta}^{b_i}_{b_{m+1}} &= \hat{\beta}^{b_i}_{b_m} + \hat{a}\,\delta t \\
\hat{\gamma}^{b_i}_{b_{m+1}} &= \hat{\gamma}^{b_i}_{b_m} \otimes \begin{bmatrix} 1 \\ \tfrac{1}{2}\,\hat{\omega}\,\delta t \end{bmatrix}
\end{aligned}
\tag{11}
$$

[68] where $b_i$ represents the current frame and $b_{i+1}$ represents the next frame;
[69] $\hat{\alpha}^{b_i}_{b_{m+1}}$, $\hat{\beta}^{b_i}_{b_{m+1}}$ and $\hat{\gamma}^{b_i}_{b_{m+1}}$ represent the position, speed and rotation increments obtained by integrating from the $m$-th IMU time to the $(m{+}1)$-th IMU time in a discrete form; $\hat{\alpha}^{b_i}_{b_m}$, $\hat{\beta}^{b_i}_{b_m}$ and $\hat{\gamma}^{b_i}_{b_m}$ represent the IMU increment information obtained by median integration between two successive frames at the previous time; $\hat{a}$ represents the acceleration and $\hat{\omega}$ represents the angular speed.
[70] Further,

[71]
$$
\begin{aligned}
\hat{\omega} &= \tfrac{1}{2}\bigl(\omega^{b_m} + \omega^{b_{m+1}}\bigr) - b^{\omega}_{b_i} - n^{\omega} \\
\hat{a} &= \tfrac{1}{2}\Bigl[\hat{R}^{b_i}_{b_m}\bigl(a^{b_m} - b^{a}_{b_i} - n^{a}\bigr) + \hat{R}^{b_i}_{b_{m+1}}\bigl(a^{b_{m+1}} - b^{a}_{b_i} - n^{a}\bigr)\Bigr]
\end{aligned}
\tag{12}
$$

[72] where the superscripts $b_m$ and $b_{m+1}$ represent the $m$-th and the $(m{+}1)$-th IMU measurement times, $\hat{R}^{b_i}_{b_m}$ is the rotation matrix corresponding to $\hat{\gamma}^{b_i}_{b_m}$, $b^{\omega}_{b_i}$ and $b^{a}_{b_i}$ represent the IMU offsets, $n^{\omega}$ represents the angular-speed white noise in the $b_i$ frame, and $n^{a}$ represents the acceleration white noise in the $b_i$ frame.
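One discrete mid-point update combining equations (11) and (12) can be sketched in NumPy as follows; the quaternion convention (w, x, y, z), the bias handling and all names are assumptions, and the measurement noise terms are omitted as they would be in an implementation.

```python
import numpy as np

def quat_mult(q1, q2):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def preintegrate_step(alpha, beta, gamma, a_m, a_m1, w_m, w_m1, ba, bw, dt):
    """One mid-point update of (alpha, beta, gamma) following equations (11)-(12)."""
    w_hat = 0.5 * (w_m + w_m1) - bw                              # mid-point angular rate
    gamma_next = quat_mult(gamma, np.concatenate(([1.0], 0.5 * w_hat * dt)))
    gamma_next /= np.linalg.norm(gamma_next)                     # keep the quaternion unit-norm
    # mid-point acceleration rotated into the b_i frame with the old and new rotation
    a_hat = 0.5 * (quat_to_rot(gamma) @ (a_m - ba) + quat_to_rot(gamma_next) @ (a_m1 - ba))
    alpha_next = alpha + beta * dt + 0.5 * a_hat * dt**2
    beta_next = beta + a_hat * dt
    return alpha_next, beta_next, gamma_next
```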
[73] The process of pre-integration is shown in FIG. 4. In the present invention, since the offset update is small, the pre-integrals can be approximated as:

[74]
$$
\begin{aligned}
\alpha^{b_i}_{b_{i+1}} &\approx \hat{\alpha}^{b_i}_{b_{i+1}} + J^{\alpha}_{b^{a}}\,\delta b^{a} + J^{\alpha}_{b^{\omega}}\,\delta b^{\omega} \\
\beta^{b_i}_{b_{i+1}} &\approx \hat{\beta}^{b_i}_{b_{i+1}} + J^{\beta}_{b^{a}}\,\delta b^{a} + J^{\beta}_{b^{\omega}}\,\delta b^{\omega} \\
\gamma^{b_i}_{b_{i+1}} &\approx \hat{\gamma}^{b_i}_{b_{i+1}} \otimes \begin{bmatrix} 1 \\ \tfrac{1}{2}\, J^{\gamma}_{b^{\omega}}\,\delta b^{\omega} \end{bmatrix}
\end{aligned}
\tag{13}
$$

[75] where $\hat{\alpha}^{b_i}_{b_{i+1}}$, $\hat{\beta}^{b_i}_{b_{i+1}}$ and $\hat{\gamma}^{b_i}_{b_{i+1}}$ are the pre-integrals, $J^{\alpha}_{b^{a}}$, $J^{\alpha}_{b^{\omega}}$, $J^{\beta}_{b^{a}}$, $J^{\beta}_{b^{\omega}}$ and $J^{\gamma}_{b^{\omega}}$ are the Jacobian matrixes of the pre-integrals with respect to each offset at the $i$-th frame time, and $\delta b^{a}$ and $\delta b^{\omega}$ are the offset updates;
[76] $b^{a}$ represents the IMU acceleration offset and $b^{\omega}$ represents the IMU angular-speed offset.
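The first-order bias correction of equation (13) for the position and speed pre-integrals can be sketched as below; the Jacobian matrices are plain arguments here because their propagation is outside this sketch, the rotation pre-integral would be updated analogously with a small-angle quaternion product, and all names are assumptions.

```python
import numpy as np

def correct_bias_alpha_beta(alpha_hat, beta_hat,
                            J_alpha_ba, J_alpha_bw, J_beta_ba, J_beta_bw,
                            d_ba, d_bw):
    """First-order bias correction of the position/speed pre-integrals, equation (13).
    The rotation pre-integral gamma would be corrected by right-multiplying the small-angle
    quaternion [1, 0.5 * J_gamma_bw @ d_bw] (see the quaternion helpers in the sketch above)."""
    alpha = alpha_hat + J_alpha_ba @ d_ba + J_alpha_bw @ d_bw
    beta = beta_hat + J_beta_ba @ d_ba + J_beta_bw @ d_bw
    return alpha, beta
```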
[77] The measurement residual of the IMU is:

[78]
$$
e^{b}_{i,i+1}\bigl(\hat{z}^{b_i}_{b_{i+1}},\, \mathcal{X}\bigr) =
\begin{bmatrix}
\delta\alpha^{b_i}_{b_{i+1}} \\ \delta\beta^{b_i}_{b_{i+1}} \\ \delta\theta^{b_i}_{b_{i+1}} \\ \delta b^{a} \\ \delta b^{\omega}
\end{bmatrix}
=
\begin{bmatrix}
R^{b_i}_{w}\Bigl(p^{w}_{b_{i+1}} - p^{w}_{b_i} - v^{w}_{i}\,\Delta t + \tfrac{1}{2}\, g^{w}\,\Delta t^{2}\Bigr) - \hat{\alpha}^{b_i}_{b_{i+1}} \\
R^{b_i}_{w}\bigl(v^{w}_{i+1} - v^{w}_{i} + g^{w}\,\Delta t\bigr) - \hat{\beta}^{b_i}_{b_{i+1}} \\
2\Bigl[\bigl(\hat{\gamma}^{b_i}_{b_{i+1}}\bigr)^{-1} \otimes \bigl(q^{w}_{b_i}\bigr)^{-1} \otimes q^{w}_{b_{i+1}}\Bigr]_{xyz} \\
b^{a}_{b_{i+1}} - b^{a}_{b_i} \\
b^{\omega}_{b_{i+1}} - b^{\omega}_{b_i}
\end{bmatrix}
\tag{14}
$$

[79] where $[\,\cdot\,]_{xyz}$ takes the vector part of the rotation error expressed in quaternion form;
[80] $b^{\omega}_{b_i}$ is the angular-speed offset in the $b_i$ frame.
[81] In S2, all state variables in a sliding window are optimized by minimizing the sum of the cost terms of all measurement residuals, so as to obtain an optimized pose, which can be expressed as:

[82]
$$
\min_{\mathcal{X}} \Bigl\{
\bigl\| e_{\mathrm{prior}} \bigr\|^{2}
+ \sum_{(i,j)\in\chi_{p}} \rho\bigl(\bigl\| e^{p}_{ij} \bigr\|^{2}\bigr)
+ \sum_{(i,k)\in\chi_{l}} \rho\bigl(\bigl\| e^{l}_{ik} \bigr\|^{2}\bigr)
+ \sum_{(i,i+1)\in\mathcal{B}} \rho\bigl(\bigl\| e^{b}_{i,i+1} \bigr\|^{2}\bigr)
\Bigr\}
\tag{15}
$$

[83] where $e^{b}_{i,i+1}$ is the IMU measurement residual from the $i$-th frame to the $(i{+}1)$-th frame in the body coordinate system and can be obtained according to equation (14); $\mathcal{B}$ is the set of all IMU pre-integrals in the sliding window and can be obtained according to equation (11); $e^{p}_{ij}$ and $e^{l}_{ik}$ are respectively the re-projection errors of the point feature and the line feature and can be obtained according to equation (3); $\chi_{p}$ and $\chi_{l}$ are respectively the set of point features and the set of line features observed in the $i$-th frame; $\rho$ is a Cauchy robust function used to suppress abnormal values; $e_{\mathrm{prior}}$ is the prior information calculated after the sliding window is marginalized by one frame and obtained during the initialization of the camera, and the specific obtaining method can be set by those skilled in the art based on experience.
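A plain evaluation of the sliding-window objective of equation (15) might look like the sketch below, with the Cauchy kernel applied to each squared residual; in practice the minimisation would be handed to a nonlinear least-squares solver, and the residual lists and the kernel scale are placeholders.

```python
import numpy as np

def cauchy(s, c=1.0):
    """Cauchy robust kernel rho(s) applied to a squared residual s (scale c is a placeholder)."""
    return c**2 * np.log1p(s / c**2)

def sliding_window_cost(e_prior, point_residuals, line_residuals, imu_residuals):
    """Total cost of equation (15): prior term plus robustified point, line and IMU terms."""
    cost = float(e_prior @ e_prior)
    for e in point_residuals:          # e^p_ij terms, each a (2,) array
        cost += cauchy(float(e @ e))
    for e in line_residuals:           # e^l_ik terms, each a (2,) array
        cost += cauchy(float(e @ e))
    for e in imu_residuals:            # e^b_{i,i+1} terms, each a (15,) array
        cost += cauchy(float(e @ e))
    return cost
```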
[84] In S3, the optimized pose obtained in step S2 is fused with the GNSS pose to obtain a global pose.
[85] Specifically, in the fusion process, the variable to be estimated is the global pose $\mathcal{X} = \{x_0, x_1, \ldots, x_n\}$ of all poses, where $x_i = \{p^{w}_i, q^{w}_i\}$; $p^{w}_i$ is the position under the global coordinate system, and $q^{w}_i$ is the orientation under the global coordinate system.
[86] Further, the fusion process can be expressed by the following equation:

[87]
$$
\mathcal{X}^{*} = \arg\max_{\mathcal{X}} \prod_{t=0}^{n} \prod_{k\in S} p\bigl(z^{k}_{t} \mid \mathcal{X}\bigr)
\tag{16}
$$

[88] where $\mathcal{X}^{*}$ represents the global pose after fusion, $S$ is the set of measurement values, including the visual-inertial local measurement and the global measurement GNSS, $p(\cdot)$ denotes the measurement likelihood used in the maximum-likelihood formulation, and $z^{k}_{t}$ represents the measurement values at time $t$.
[89] Further, the uncertainty of the measurement values conforms to the Gaussian distribution, and equation (16) can be expressed as:

[90]
$$
\mathcal{X}^{*} = \arg\max_{\mathcal{X}} \prod_{t=0}^{n} \prod_{k\in S}
\exp\Bigl(-\tfrac{1}{2}\,\bigl\| z^{k}_{t} - h^{k}_{t}(\mathcal{X}) \bigr\|^{2}_{\Omega^{k}_{t}}\Bigr)
= \arg\min_{\mathcal{X}} \sum_{t=0}^{n} \sum_{k\in S}
\bigl\| z^{k}_{t} - h^{k}_{t}(\mathcal{X}) \bigr\|^{2}_{\Omega^{k}_{t}}
\tag{17}
$$

[91] where $h^{k}_{t}$ represents the measurement value obtained by the measurement model at time $t$, and $\Omega^{k}_{t}$ represents the covariance.
[92] Since the local estimation of visual inertia is accurate in a small range, the relative pose between two frames can be used. Considering two consecutive frames $t{-}1$ and $t$, a visual-inertia factor can be obtained:

[93]
$$
z^{l}_{t} - h^{l}_{t}(\mathcal{X}) = z^{l}_{t} - h^{l}_{t}\bigl(x_{t-1}, x_{t}\bigr) =
\begin{bmatrix}
\bigl(q^{l}_{t-1}\bigr)^{-1}\bigl(p^{l}_{t} - p^{l}_{t-1}\bigr) - \bigl(q^{w}_{t-1}\bigr)^{-1}\bigl(p^{w}_{t} - p^{w}_{t-1}\bigr) \\[1mm]
\Bigl(\bigl(q^{l}_{t-1}\bigr)^{-1} \otimes q^{l}_{t}\Bigr) \ominus \Bigl(\bigl(q^{w}_{t-1}\bigr)^{-1} \otimes q^{w}_{t}\Bigr)
\end{bmatrix}
\tag{18}
$$

[94] where $\bigl(q^{l}_{t-1},\, p^{l}_{t-1}\bigr)$ and $\bigl(q^{l}_{t},\, p^{l}_{t}\bigr)$ respectively represent the poses of the local estimation at the consecutive frames $t{-}1$ and $t$, and $\ominus$ is the difference operation between the quaternion rotation states. In equation (18), the first line on the right is the relative position error of the two poses, and the second line represents the relative rotation error of the two poses. $z^{l}_{t}$ represents the measurement value of visual inertia at time $t$, $h^{l}_{t}$ represents the measurement value obtained by the visual-inertia measurement model at time $t$, $x_{t-1}$ is the global pose at time $t{-}1$, $q^{l}_{t-1}$ is the rotation in the local frame of visual inertia at time $t{-}1$, $p^{l}_{t}$ is the position at time $t$ in the local estimation of visual inertia, $\bigl(q^{w}_{t-1}\bigr)^{-1}$ is the inverse rotation of the global frame at time $t{-}1$, $p^{w}_{t}$ is the position of the global frame at time $t$, and $q^{w}_{t}$ is the rotation of the global frame at time $t$.
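A NumPy/SciPy sketch of the relative-pose residual of equation (18) is given below; the quaternion difference is realised here as a rotation vector, which is one possible reading of the ⊖ operation, and all names are assumptions rather than the patented implementation.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def vio_factor(p_l_prev, q_l_prev, p_l_curr, q_l_curr,
               p_w_prev, q_w_prev, p_w_curr, q_w_curr):
    """Relative-pose residual of equation (18).
    p_* are (3,) positions; q_* are scipy Rotation objects (local 'l' and global 'w')."""
    # relative translation between the two frames, seen locally vs. globally
    dp_local = q_l_prev.inv().apply(p_l_curr - p_l_prev)
    dp_global = q_w_prev.inv().apply(p_w_curr - p_w_prev)
    e_pos = dp_local - dp_global
    # relative rotation between the two frames, compared as a rotation vector
    dq_local = q_l_prev.inv() * q_l_curr
    dq_global = q_w_prev.inv() * q_w_curr
    e_rot = (dq_global.inv() * dq_local).as_rotvec()
    return np.concatenate([e_pos, e_rot])

# example: identical local and global motion gives a zero residual
# r = vio_factor(np.zeros(3), R.identity(), np.array([1.0, 0, 0]), R.identity(),
#                np.zeros(3), R.identity(), np.array([1.0, 0, 0]), R.identity())
```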
[95] The original measurement value of the position obtained by GNSS can be expressed as $P^{GNSS}_{t} = \bigl[x^{w}_{t},\ y^{w}_{t},\ z^{w}_{t}\bigr]^{T}$; the GNSS factor is then expressed as:

[96]
$$
z^{GNSS}_{t} - h^{GNSS}_{t}(\mathcal{X}) = z^{GNSS}_{t} - h^{GNSS}_{t}(x_{t}) = P^{GNSS}_{t} - p^{w}_{t}
\tag{19}
$$

[97] where $z^{GNSS}_{t}$ represents the measurement value of GNSS at time $t$, $h^{GNSS}_{t}$ represents the measurement value obtained by the GNSS measurement model at time $t$, $P^{GNSS}_{t}$ is the original measurement value of GNSS, and $p^{w}_{t}$ is the measurement value obtained by the measurement model.
[98] According to equations (18) and (19), equation (17) can be expressed as:

[99]
$$
\mathcal{X}^{*} = \arg\min_{\mathcal{X}} \Bigl(
\sum_{t=0}^{n} \sum_{l\in S} \bigl\| z^{l}_{t} - h^{l}_{t}(\mathcal{X}) \bigr\|^{2}_{\Omega^{l}_{t}}
+ \sum_{t=0}^{n} \sum_{GNSS\in S} \bigl\| z^{GNSS}_{t} - h^{GNSS}_{t}(\mathcal{X}) \bigr\|^{2}_{\Omega^{GNSS}_{t}}
\Bigr)
\tag{20}
$$

[100] where $\Omega^{l}_{t}$ represents the visual-inertia covariance and $\Omega^{GNSS}_{t}$ represents the GNSS covariance, and then the fused global pose is obtained.
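Finally, the GNSS residual of equation (19) and the weighted objective of equation (20) can be sketched as follows; the covariances enter as Mahalanobis weights, all names are assumptions, and a real system would minimise this cost with a nonlinear solver rather than merely evaluate it.

```python
import numpy as np

def gnss_factor(p_gnss, p_w):
    """GNSS residual of equation (19): raw GNSS position minus estimated global position."""
    return np.asarray(p_gnss) - np.asarray(p_w)

def fusion_cost(vio_residuals, vio_cov, gnss_residuals, gnss_cov):
    """Weighted sum of squared residuals of equation (20).
    vio_cov (6x6) and gnss_cov (3x3) act as Mahalanobis weights on the residuals."""
    cost = 0.0
    for e in vio_residuals:            # outputs of the visual-inertia factor, (6,)
        cost += float(e @ np.linalg.solve(vio_cov, e))
    for e in gnss_residuals:           # outputs of the GNSS factor, (3,)
        cost += float(e @ np.linalg.solve(gnss_cov, e))
    return cost
```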

Claims (4)

WHAT IS CLAIMED IS:
1. A global drift-free simultaneous localization and mapping method for an autonomous robot, characterized in that, local positioning and global sensing localization are fused to achieve accurate global sensing localization, the local positioning comprises state estimation using a camera, an inertial measurement unit, and a laser radar; the global sensing localization comprises state estimation using a global navigation satellite system, a magnetometer, and a barometer.
2. The global drift-free simultaneous localization and mapping method for the autonomous robot according to claim 1, characterized in that, the method comprises the following steps: S1, tracking of front-end point and line features; S2, joint optimization of back-end visual inertial SLAM poses; and S3, fusion with a GNSS pose to obtain a global pose.
3. The global drift-free simultaneous localization and mapping method for the autonomous robot according to claim 1, characterized in that, in step S1, linear errors are used to express linear features as linear constraints on endpoints of a line, so that the line features are naturally integrated into a linear expression based on a feature point algorithm.
4. The global drift-free simultaneous localization and mapping method for the autonomous robot according to claim 2, characterized in that, in S2, all state variables in a sliding window are optimized by minimizing the sum of cost terms of all measurement residuals, so as to obtain an optimized pose, which can be expressed as:

$$
\min_{\mathcal{X}} \Bigl\{
\bigl\| e_{\mathrm{prior}} \bigr\|^{2}
+ \sum_{(i,j)\in\chi_{p}} \rho\bigl(\bigl\| e^{p}_{ij} \bigr\|^{2}\bigr)
+ \sum_{(i,k)\in\chi_{l}} \rho\bigl(\bigl\| e^{l}_{ik} \bigr\|^{2}\bigr)
+ \sum_{(i,i+1)\in\mathcal{B}} \rho\bigl(\bigl\| e^{b}_{i,i+1} \bigr\|^{2}\bigr)
\Bigr\}
\tag{15}
$$

where $e^{b}_{i,i+1}$ is an IMU measurement residual from the $i$-th frame to the $(i{+}1)$-th frame time in a body coordinate system, and $\mathcal{B}$ is a set of all IMU pre-integrals in the sliding window; $e^{p}_{ij}$ and $e^{l}_{ik}$ are respectively re-projection errors of a point feature and a line feature and can be obtained according to equation (3); $\chi_{p}$ and $\chi_{l}$ are respectively a set of point features and a set of line features observed in the $i$-th frame; $\rho$ is a Cauchy robust function used to suppress abnormal values; $e_{\mathrm{prior}}$ is prior information calculated after the sliding window is marginalized by one frame and obtained during the initialization of the camera.
LU501005A 2021-12-14 2021-12-14 Global drift-free simultaneous localization and mapping method for autonomous robot LU501005B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
LU501005A LU501005B1 (en) 2021-12-14 2021-12-14 Global drift-free simultaneous localization and mapping method for autonomous robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
LU501005A LU501005B1 (en) 2021-12-14 2021-12-14 Global drift-free simultaneous localization and mapping method for autonomous robot

Publications (1)

Publication Number Publication Date
LU501005B1 true LU501005B1 (en) 2023-06-15

Family

ID=86731543

Family Applications (1)

Application Number Title Priority Date Filing Date
LU501005A LU501005B1 (en) 2021-12-14 2021-12-14 Global drift-free simultaneous localization and mapping method for autonomous robot

Country Status (1)

Country Link
LU (1) LU501005B1 (en)

Similar Documents

Publication Publication Date Title
CN111024066B (en) Unmanned aerial vehicle vision-inertia fusion indoor positioning method
CN109887057B (en) Method and device for generating high-precision map
CN109579843B (en) Multi-robot cooperative positioning and fusion image building method under air-ground multi-view angles
CN110068335B (en) Unmanned aerial vehicle cluster real-time positioning method and system under GPS rejection environment
CN110084832B (en) Method, device, system, equipment and storage medium for correcting camera pose
CN109509230A (en) A kind of SLAM method applied to more camera lens combined type panorama cameras
CN111833333A (en) Binocular vision-based boom type tunneling equipment pose measurement method and system
CN110319772B (en) Visual large-span distance measurement method based on unmanned aerial vehicle
Yang et al. Vision‐based localization and robot‐centric mapping in riverine environments
CN114623817B (en) Self-calibration-contained visual inertial odometer method based on key frame sliding window filtering
CN110726406A (en) Improved nonlinear optimization monocular inertial navigation SLAM method
CN110887486B (en) Unmanned aerial vehicle visual navigation positioning method based on laser line assistance
CN111986261B (en) Vehicle positioning method and device, electronic equipment and storage medium
CN110207693B (en) Robust stereoscopic vision inertial pre-integration SLAM method
CN114019552A (en) Bayesian multi-sensor error constraint-based location reliability optimization method
CN114638897B (en) Multi-camera system initialization method, system and device based on non-overlapping views
Chen et al. Stereo visual inertial pose estimation based on feedforward-feedback loops
Bazin et al. UAV attitude estimation by vanishing points in catadioptric images
CN112731503B (en) Pose estimation method and system based on front end tight coupling
CN112991400B (en) Multi-sensor auxiliary positioning method for unmanned ship
CN116681733B (en) Near-distance real-time pose tracking method for space non-cooperative target
LU501005B1 (en) Global drift-free simultaneous localization and mapping method for autonomous robot
CN116804553A (en) Odometer system and method based on event camera/IMU/natural road sign
CN113077515B (en) Tight coupling initialization method for underwater vision inertial navigation pressure positioning
CN115344033A (en) Monocular camera/IMU/DVL tight coupling-based unmanned ship navigation and positioning method

Legal Events

Date Code Title Description
FG Patent granted

Effective date: 20230615