CN109991636A - Map construction method and system based on GPS, IMU and binocular vision - Google Patents

Map construction method and system based on GPS, IMU and binocular vision

Info

Publication number
CN109991636A
CN109991636A (application CN201910225839.8A)
Authority
CN
China
Prior art keywords
imu
coordinate system
acceleration
vision
moment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910225839.8A
Other languages
Chinese (zh)
Inventor
孙静
拱印生
朱庆林
张延坤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
QIMING INFORMATION TECHNOLOGY Co Ltd
Original Assignee
QIMING INFORMATION TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by QIMING INFORMATION TECHNOLOGY Co Ltd filed Critical QIMING INFORMATION TECHNOLOGY Co Ltd
Priority to CN201910225839.8A priority Critical patent/CN109991636A/en
Publication of CN109991636A publication Critical patent/CN109991636A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30Map- or contour-matching
    • G01C21/32Structuring or formatting of map data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/45Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)

Abstract

The present invention discloses a map construction method and system based on GPS, IMU and binocular vision. In places where the GPS signal is absent or weak, such as tunnels, the IMU and binocular vision correct the position estimate and visual navigation takes over. The method effectively addresses common problems in visual SLAM and improves the robustness of the SLAM system. By fusing visual information with IMU information, data from multiple sensors are combined in a new algorithmic structure, building a stable and effective framework that gives a driverless car or robot stable, effective autonomous localization and mapping. The GPS position information is thereby corrected and supplemented, and the method can be applied to driverless cars and robot systems on a large scale.

Description

Map constructing method and system based on GPS, IMU and binocular vision
Technical field:
The present invention discloses a map construction method and system based on GPS, IMU and binocular vision, and relates to an autonomous localization and mapping system based on visual features and IMU information. It belongs to the field of robotics and control technology.
Background technique:
In recent years, the rapid development of Internet technology has brought revolutionary change to the automotive industry. At the same time, vehicle intelligence technology is gradually being widely adopted; it simplifies driving operations and improves driving safety. The most typical, and the most anticipated, future application is the driverless car. Supported by artificial intelligence technology, driverless driving is developing at high speed, changing how people travel, and in turn will reshape the related industries on a large scale.
For a driverless car travelling in an unknown area, occlusion by buildings, trees and other vehicles often leaves it with no signal or only a weak signal, so that no effective position can be provided. In harsh environments, weather and other factors weaken the GPS or BeiDou navigation signal, and the driverless car cannot be positioned effectively. A driverless car must therefore be capable of autonomous localization and mapping. Through real-time autonomous localization and mapping, it acquires information about the surrounding environment and determines its own location, providing an important basis for path planning.
Simultaneous localization and mapping (SLAM) is an important problem in robotics. The SLAM problem can be described as follows: a robot equipped with sensors moves freely in space and uses the collected sensor information to localize itself, while incrementally building a map on the basis of that self-localization, thereby localizing and mapping at the same time. Two key factors affecting the solution of the SLAM problem are the characteristics of the sensor data and the correlation of the observed data; if sensor data can be used efficiently and the accuracy and robustness of data association improved, both the localization of the robot and the construction of the map benefit.
However, today's driverless technology is still mostly at the driver-assistance stage and lacks the ability to perform autonomous localization and mapping. Few driverless cars perform simultaneous autonomous localization and mapping; with a single sensor, localization and mapping accuracy is low, and multiple sensors cannot be fused effectively. Even where multi-sensor data can be fused, the driverless car cannot be given stable and effective autonomous localization and mapping, and the approach cannot be popularized on a large scale.
Summary of the invention:
The present invention provides a map construction method and system based on GPS, IMU and binocular vision. In places where the GPS signal is absent or weak, such as tunnels, the IMU and binocular vision correct the position estimate and visual navigation takes over. The method effectively addresses common problems in visual SLAM and improves the robustness of the SLAM system.
The technical scheme adopted by the map construction method and system based on GPS, IMU and binocular vision of the present invention is as follows:
The map construction method based on GPS, IMU and binocular vision of the present invention is characterized by comprising the following steps:
Step 1: initialize the pose of the driverless car or robot, determine the trajectory of the car's travel or the robot's motion, and calibrate the IMU (inertial measurement unit) and the binocular camera so that their data remain time-consistent;
Step 2: collect and extract information from the IMU inertial measurement unit, the binocular camera and the RTK-based GNSS receiver, in order to perform visual feature tracking;
Step 3: perform manifold-based IMU pre-integration, comprising: computing the IMU pre-integration, computing the pre-integration Jacobians and covariance matrix, and computing the residual Jacobians;
Step 4: perform IMU initialization (joint visual-inertial initialization); this is the first step of fusing visual and inertial data and is a loosely coupled process that combines the visual data with the high-rate IMU data, comprising: gyroscope bias calibration, scale recovery and preliminary gravity estimation, accelerometer bias calibration, and joint optimization of scale and gravity;
Step 5: optimize using a tightly coupled optimization model, comprising: tightly coupled visual-inertial optimization over consecutive frames and tightly coupled visual-inertial optimization over the local map;
Step 6: estimate the degree of similarity between the trajectory of the driverless car or robot and the three-dimensional map of the surrounding environment, obtaining the pose in real time;
Step 7: build the map and perform loop-closure detection using a bag-of-words model, correcting the existing map and further improving the accuracy of the three-dimensional map.
1. Information collection and extraction in step 2:
After each binocular camera captures an image, ORB features are extracted from the image and the extraction results are stored;
The IMU data inside the driverless car or robot are collected in real time and fed into the manifold-based IMU pre-integration for processing.
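ORB assigns each keypoint an orientation computed from the intensity centroid of its surrounding patch, which is what makes the stored descriptors rotation-aware. A minimal numpy sketch of that orientation measure (the function name is illustrative, not from the patent):

```python
import numpy as np

def intensity_centroid_angle(patch):
    """Orientation of an image patch via the intensity centroid, as used by
    ORB to orient its BRIEF descriptor: m10 = sum(x*I), m01 = sum(y*I),
    angle = atan2(m01, m10), with moments taken about the patch centre."""
    h, w = patch.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    xs -= (w - 1) / 2.0
    ys -= (h - 1) / 2.0
    m10 = np.sum(xs * patch)
    m01 = np.sum(ys * patch)
    return np.arctan2(m01, m10)

# bright mass on the right edge of the patch -> orientation ~ 0 rad
patch = np.zeros((15, 15))
patch[:, 10:] = 1.0
print(intensity_centroid_angle(patch))  # ~0.0
```

In a full pipeline this angle is computed per keypoint before sampling the descriptor's intensity-comparison pattern, so that revisiting a place under rotation still yields matching descriptors.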
2. Manifold-based IMU pre-integration in step 3:
In the IMU pre-integration calculation, the world coordinate system, the IMU coordinate system and the camera coordinate system are usually denoted W, B and C. Sampling at a time interval Δt, the IMU measures the acceleration aB and the angular velocity ωB. The IMU measurement model:

    ω̃B(t) = ωB(t) + bg(t) + ηg(t)
    ãB(t) = RWB(t)ᵀ (aW(t) − gW) + ba(t) + ηa(t)

where ω̃B(t) and ãB(t) are the angular velocity and acceleration of the IMU measurement model in coordinate system B, ωB(t) ∈ R³ is the instantaneous angular velocity of the IMU center B relative to the world coordinate system W expressed in B, RWB is the rotation matrix of the IMU center B relative to W and RWBᵀ its transpose, aW(t) ∈ R³ is the instantaneous acceleration of the IMU in W, gW is the gravity vector in W, b denotes a bias and η Gaussian random noise; ba and bg are the accelerometer and gyroscope biases, and ηa, ηg the accelerometer and gyroscope Gaussian random noises;
The derivatives of the rotation R, the translation p and the velocity v can be expressed as:

    ṘWB = RWB (ωB)^    v̇W = aW    ṗW = vW

where vW is the instantaneous velocity of the IMU in W, ωB the instantaneous angular velocity of B relative to W expressed in B, (·)^ the skew-symmetric matrix of a vector, RWB the rotation matrix of the IMU center B relative to W, pW the instantaneous translation in W, and ṘWB, v̇W, ṗW the corresponding derivatives;
The rotation R, translation p and velocity v in the world coordinate system follow from the usual integral formulas; applying Euler integration in discrete time, the continuous-time integrals above can be rewritten and combined to give:

    R(t+Δt) = R(t) Exp( (ω̃(t) − bg(t) − ηgd(t)) Δt )
    v(t+Δt) = v(t) + g Δt + R(t) (ã(t) − ba(t) − ηad(t)) Δt
    p(t+Δt) = p(t) + v(t) Δt + ½ g Δt² + ½ R(t) (ã(t) − ba(t) − ηad(t)) Δt²

where Δt is the sampling interval, ηgd(t) and ηad(t) are the discrete-time gyroscope and accelerometer Gaussian random noises, ba and bg the accelerometer and gyroscope biases, and ω̃, ã the angular velocity and acceleration of the IMU measurement model; the states here are expressed in the world coordinate system;
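The three Euler-integration updates above can be sketched directly in numpy. `so3_exp` is a small helper implementing the Rodrigues exponential map, and the noise terms are omitted, as in a noiseless simulation:

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues formula: rotation vector phi -> rotation matrix Exp(phi)."""
    theta = np.linalg.norm(phi)
    if theta < 1e-10:
        return np.eye(3)
    k = phi / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def euler_step(R, v, p, omega_m, a_m, b_g, b_a, g, dt):
    """One discrete Euler step of the world-frame IMU kinematics:
    R <- R Exp((w - b_g) dt), v <- v + g dt + R (a - b_a) dt,
    p <- p + v dt + 0.5 g dt^2 + 0.5 R (a - b_a) dt^2."""
    a_c = a_m - b_a
    p_new = p + v * dt + 0.5 * g * dt ** 2 + 0.5 * (R @ a_c) * dt ** 2
    v_new = v + g * dt + (R @ a_c) * dt
    R_new = R @ so3_exp((omega_m - b_g) * dt)
    return R_new, v_new, p_new

# free fall from rest with no rotation: after 1 s, p_z = -0.5 * 9.81
R, v, p = np.eye(3), np.zeros(3), np.zeros(3)
g = np.array([0.0, 0.0, -9.81])
for _ in range(1000):
    # an IMU in free fall measures zero specific force
    R, v, p = euler_step(R, v, p, np.zeros(3), np.zeros(3),
                         np.zeros(3), np.zeros(3), g, 1e-3)
print(p[2])  # ≈ -4.905
```

The same step, applied between every pair of IMU samples, is what the pre-integration below re-expresses relative to the keyframe at time i.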
With two adjacent keyframes i and j, given the IMU state at time i, the IMU value at keyframe j in discrete time is obtained by integrating over all samples in between:

    Rj = Ri ∏k=i..j−1 Exp( (ω̃k − bgk − ηgdk) Δt )
    vj = vi + g Δtij + Σk=i..j−1 Rk (ãk − bak − ηadk) Δt
    pj = pi + Σk=i..j−1 [ vk Δt + ½ g Δt² + ½ Rk (ãk − bak − ηadk) Δt² ]

where Δt is the sampling interval, Δtij the time difference between times i and j, g the gravity vector, bgk and bak the time-varying gyroscope and accelerometer biases, ω̃k the measured angular velocity and ãk the measured acceleration of the IMU model, Rj, Ri, vj, vi, pj, pi the rotation, velocity and translation at times j and i, Rk the rotation at the intermediate time k, and ηgdk, ηadk the discrete gyroscope and accelerometer Gaussian random noises;
To avoid repeated integration, the IMU pre-integration method computes the relative change of the state between times i and j; that is, the reference frame is transformed from the world coordinate system to the IMU coordinate system at time i, and the relative quantities of the IMU state between times i and j are calculated:

    ΔRij = Riᵀ Rj = ∏k=i..j−1 Exp( (ω̃k − bgk − ηgdk) Δt )
    Δvij = Riᵀ (vj − vi − g Δtij) = Σk=i..j−1 ΔRik (ãk − bak − ηadk) Δt
    Δpij = Riᵀ (pj − pi − vi Δtij − ½ g Δtij²) = Σk=i..j−1 [ Δvik Δt + ½ ΔRik (ãk − bak − ηadk) Δt² ]

where Rj, Ri, vj, vi, pj, pi are the rotation, velocity and translation at times j and i, bak and bgk the time-varying accelerometer and gyroscope biases, ω̃k and ãk the measured angular velocity and acceleration of the IMU model, ηadk and ηgdk the discrete accelerometer and gyroscope Gaussian random noises, ΔRij, Δvij, Δpij the rotation, velocity and translation differences between i and j, Riᵀ the transpose of Ri, Δtij the time difference between times i and j, Δt the sampling interval, and ΔRik, Δvik the rotation and velocity differences between times i and k;
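The relative quantities ΔRij, Δvij, Δpij can be accumulated straight from the raw samples, with no reference to the world-frame state at time i. A sketch under the same noiseless assumptions, reusing a Rodrigues-formula helper:

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues formula: rotation vector -> rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-10:
        return np.eye(3)
    k = phi / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def preintegrate(omegas, accs, b_g, b_a, dt):
    """Accumulate Delta R_ij, Delta v_ij, Delta p_ij between keyframes i and j
    from the IMU samples in between (noise terms omitted)."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a in zip(omegas, accs):
        a_c = a - b_a
        dp = dp + dv * dt + 0.5 * (dR @ a_c) * dt ** 2
        dv = dv + (dR @ a_c) * dt
        dR = dR @ so3_exp((w - b_g) * dt)
    return dR, dv, dp

# 1 s of constant body-frame acceleration along x, no rotation
n, dt = 100, 0.01
dR, dv, dp = preintegrate([np.zeros(3)] * n,
                          [np.array([1.0, 0.0, 0.0])] * n,
                          np.zeros(3), np.zeros(3), dt)
print(dv)  # ≈ [1, 0, 0]
print(dp)  # ≈ [0.5, 0, 0]
```

Because only the relative deltas are stored, a change in the estimate of Ri, vi or pi at optimization time does not force re-integration of the raw samples, which is exactly the point of the pre-integration.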
From the equations above it can be seen that the whole pre-integration depends on the biases and on the noise, so the influence of bias and noise on the system must be analysed separately; the noise obeys a Gaussian distribution, and the biases obey a random-walk model;
The pre-integration used in initialization keeps the first-order terms in the biases; the Jacobian matrices are computed before the optimizations that follow initialization, so that the accurate pre-integration value (true value) and the estimated value of the IMU are obtained and the IMU residual can be formed;
The system noise is assumed to follow a Gaussian distribution, and the covariance matrix is computed according to that distribution; it is a 9×9 matrix (three dimensions each for rotation, velocity and translation);
Computing the residual Jacobians:
Residuals:

    eR = Log( ( ΔR̃ij Exp(JΔR_g δbg) )ᵀ Riᵀ Rj )
    ev = Riᵀ (vj − vi − g Δtij) − ( Δṽij + JΔv_g δbg + JΔv_a δba )
    ep = Riᵀ (pj − pi − vi Δtij − ½ g Δtij²) − ( Δp̃ij + JΔp_g δbg + JΔp_a δba )

Parameters to be estimated: Ri, pi, vi, Rj, pj, vj, δba, δbg.
Here ΔRij, Δvij, Δpij are the rotation, velocity and translation differences between times i and j, and ΔR̃ij, Δṽij, Δp̃ij their measured (pre-integrated) mean values; bai and bgi are the accelerometer and gyroscope biases at time i with their mean values, δba and δbg their updates, JΔ·_g and JΔ·_a the Jacobians of the pre-integrated quantities with respect to the gyroscope and accelerometer biases, φ a rotation increment, Δtij the inter-frame time difference, Ri, Rj, vi, vj, pi, pj the rotations, velocities and translations at times i and j, and Riᵀ the transpose;
The Jacobian corresponding to the translation residual is derived as above, the Jacobian at time j is derived in the same way, and the Jacobians of the velocity and rotation residuals are similar to those of the translation. This completes the IMU pre-integration.
3. IMU initialization (joint visual-inertial initialization) in step 4:
The gyroscope bias is estimated from the relative rotations linking pairs of consecutive keyframes: a small change δbg is added to a constant bg, and the gyroscope bias update is computed by minimizing the rotation error:

    δbg* = argmin Σi=1..N−1 ‖ Log( ( ΔRi,i+1 Exp(JΔR_g δbg) )ᵀ R_BW^i R_WB^{i+1} ) ‖²

where N is the number of keyframes, ΔRi,i+1 the pre-integrated rotation measurement between two consecutive keyframes, JΔR_g the Jacobian of the pre-integrated rotation with respect to the gyroscope bias, and R_WB^{i+1}, R_BW^i the rotations of the B coordinate system at time i+1 relative to W and of the W coordinate system at time i relative to B. Starting from a zero initial value, the function is optimized with g2o, and the δbg that minimizes the objective is taken as the initial gyroscope bias;
After the bias is found, it is substituted back into the pre-integration formulas and the pre-integration values are recomputed, making them more accurate;
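As a toy illustration of this bias calibration, assume all rotation happens about a single axis, so the SO(3) least-squares problem collapses to a scalar one (the patent solves the full problem with g2o); all values below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
true_bias = 0.02        # rad/s, constant gyro bias about the z axis
dt_kf = 0.5             # time between consecutive keyframes
n = 20

# "vision" supplies the true relative rotation angles between keyframes;
# the gyro pre-integration sees them corrupted by the bias (noise omitted)
dphi_vis = rng.uniform(-0.1, 0.1, n)
dphi_imu = dphi_vis + true_bias * dt_kf

# minimize sum (dphi_imu - b*dt - dphi_vis)^2 over the scalar bias b:
# closed-form least squares, b = sum(dt * r) / sum(dt^2)
resid = dphi_imu - dphi_vis
b_hat = np.sum(dt_kf * resid) / np.sum(np.full(n, dt_kf ** 2))
print(b_hat)  # ≈ 0.02
```

In the full problem, each residual lives on SO(3) and the Jacobian JΔR_g replaces the scalar factor dt, but the structure of the minimization is the same.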
For scale recovery and preliminary gravity estimation, a pre-estimation state vector X = [s, gW] is first established, where s is the scale and gW the gravity acceleration in the world coordinate system;
Three consecutive keyframes at a time (denoted 1, 2, 3) are used; combining the visual data and the IMU pre-integration data builds a least-squares overdetermined system A X = B, which requires at least four keyframes overall. The least-squares problem is solved by singular value decomposition, which is relatively slow but accurate;
where I is the identity matrix, pC_W the position of the camera center C in the W coordinate system, R_WB and R_WC the rotation matrices of the B and C coordinate systems relative to W, and Δtnm the time difference between two frames (n, m = 1, 2, 3);
Since the computation of scale recovery and preliminary gravity estimation does not account for the accelerometer bias, gravity and accelerometer bias are hard to distinguish, which may make the system ill-conditioned; accelerometer bias calibration and joint scale-gravity optimization are therefore carried out;
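The overdetermined system A X = B in X = [s, gW] can be solved with an SVD-based least-squares routine. A synthetic sketch (the entries of A here are random placeholders, not the real visual/IMU coefficients):

```python
import numpy as np

rng = np.random.default_rng(1)
x_true = np.array([2.5, 0.0, 0.0, -9.8])  # [scale, gW_x, gW_y, gW_z]

# each keyframe triple contributes a 3-row block; with 4 unknowns,
# several blocks (here 6, i.e. 18 rows) make the system overdetermined
A = rng.normal(size=(18, 4))
B = A @ x_true

# numpy's lstsq solves min ||A x - B|| via singular value decomposition
x_hat, *_ = np.linalg.lstsq(A, B, rcond=None)
print(x_hat)  # ≈ [2.5, 0, 0, -9.8]
```

With noisy real data the residual is nonzero and the recovered ‖gW‖ is then refined by the gravity-direction optimization described next.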
Let ĝI be the normalized unit vector of the gravity direction gI of the inertial coordinate system I, ĝW that of the current estimate gW, and θ the angle between them. The rotation matrix R_WI between the world coordinate system W and the inertial coordinate system I is then computed:

    v = ĝI × ĝW    θ = atan2(‖v‖, ĝI · ĝW)    R_WI = Exp( θ v / ‖v‖ )

Gravity is then expressed as:

    gW = R_WI gI    gI = (0, 0, −G)ᵀ

Since a rotation about the z axis leaves gI unchanged, R_WI is affected only by perturbations about the x and y axes, so it is updated with a small-angle variable δθ = (δθx, δθy, 0)ᵀ:

    gW = R_WI Exp(δθ) gI

Including the accelerometer bias, and using three consecutive keyframes to eliminate the velocity terms, yields a linear system in s, δθ and δba,
where [·](:,1:2) denotes the first two columns of a matrix, δba the accelerometer bias update, δθ the small-angle variable, pC_W the position of the camera center C in the W coordinate system, pB_C the translation of the IMU center B in the C coordinate system, s the scale, Δtnm, Δpnm, Δvnm the time, displacement and velocity differences between two frames (n, m = 1, 2, 3), R_WB, R_WC, R_WI the rotation matrices of the B, C and I coordinate systems relative to W, ĝI the normalized unit vector of gI in the I coordinate system, and J the corresponding pre-integration Jacobian. Only the direction of gravity is optimized; after each optimization the resulting values are substituted back to compute the latest gW, and the optimization is repeated until the variables converge.
Optimization using the tightly coupled optimization model in step 5:
Tightly coupled visual-inertial optimization over consecutive frames has two modes. (1) With a map update:
First the global optimization state vector is constructed, containing the rotation R_WB^j, the velocity vB_W^j, the position pB_W^j, the accelerometer bias ba^j and the gyroscope bias bg^j at the current time j. The global error function contains the visual reprojection error Eproj and the IMU measurement error EIMU;
The visual reprojection term is simply the ordinary pinhole-camera reprojection error; its residual, Jacobian and information matrix are no different from the general visual case;
The IMU measurement error comprises two parts: the P, V, R terms and the ba, bg bias terms. Their residuals, residual Jacobian matrices and information matrices were already computed in the IMU pre-integration process and are used directly, the bias residual being:

    eb = bj − bi

where eR, ep, ev, eb are the rotation, translation, velocity and bias errors, R_WB, R_WC, R_WI the rotation matrices of the B, C and I coordinate systems relative to W, and b a bias;
(2) Without a map update:
As above, the global optimization state vector is constructed first, containing the rotation, the velocity vB_W, the position pB_W, the accelerometer bias ba and the gyroscope bias bg at the current time j+1 and the previous time j. The global error function contains the visual reprojection error Eproj, the IMU measurement error EIMU and a prior error Eprior;
The visual and IMU error terms are as before; the prior error constrains the state at time j;
where eR, ep, ev, eb are the rotation, translation, velocity and bias errors and eRᵀ, epᵀ, evᵀ, ebᵀ their transposes, R_WB the rotation matrix of the B coordinate system relative to W and R_BW its inverse, b a bias, ρ a robust-cost coefficient, and pB_W, vB_W the translation and velocity of the IMU center B in the W coordinate system;
When optimizing the tightly coupled visual-inertial local map, the keyframes to be optimized include both the visual reprojection error and the IMU measurement error.
Loop-closure detection in step 7: the map is built and loop closures are detected using the bag-of-words model; the existing map is corrected, further improving the accuracy of the three-dimensional map.
The map construction system based on GPS, IMU and binocular vision of the present invention comprises an on-board computing center, a binocular camera, an IMU inertial measurement unit and an RTK-based GNSS receiver, electrically connected to one another;
The on-board computing center stores and processes the data collected by the binocular camera, the IMU inertial measurement unit and the RTK-based GNSS receiver through the ROS framework;
The RTK-based GNSS receiver receives the GPS positioning information;
The binocular camera collects images of the surroundings for point-cloud modelling;
The IMU inertial measurement unit measures the three-axis attitude angles and acceleration of the vehicle;
First the pose of the driverless car or robot is initialized, the trajectory of the car's travel or the robot's motion is determined, and the IMU and binocular camera are calibrated so that their data remain time-consistent. Then the information from the IMU, the binocular camera and the RTK-based GNSS receiver is collected, extracted and fed into the on-board computing center. Next, the computing center performs the manifold-based IMU pre-integration, the IMU initialization (joint visual-inertial initialization) and the tightly coupled optimization. The similarity between the trajectory of the driverless car or robot and the three-dimensional map of the surroundings is estimated to obtain the pose in real time. Finally the map is built and loop closures are detected with the bag-of-words model, correcting the existing map and further improving the accuracy of the three-dimensional map.
The positive effects of the present invention are:
By fusing visual information and IMU information, data from multiple sensors are combined in a new algorithmic structure, building a stable and effective framework that gives the driverless car or robot stable, effective autonomous localization and mapping. The GPS position information is corrected and supplemented, and the method can be applied to driverless cars and robot systems on a large scale.
Description of the drawings:
Fig. 1 is a schematic diagram of the system structure of the invention;
Fig. 2 and Fig. 3 are pictures captured by the binocular camera during the experiment of embodiment 2 on the campus of the University of Electronic Science and Technology of China in Chengdu.
Specific implementation method:
The preferred embodiments of the invention are described in detail below with reference to the drawings. It should be understood that the preferred embodiments only illustrate the invention and do not limit its scope of protection.
Embodiment 1
The autonomous localization and map construction method and system based on GPS, IMU and binocular vision of the present invention comprise the following steps:
1. Initialize the pose of the driverless car or robot, determine the trajectory of the car's travel or the robot's motion, and calibrate the IMU inertial measurement unit and the binocular camera so that their data remain time-consistent, ensuring that they are on the same local network;
2. Collect and extract information from the IMU inertial measurement unit, the binocular camera and the RTK-based GNSS receiver, in order to perform visual feature tracking;
3. Perform manifold-based IMU pre-integration, comprising: computing the IMU pre-integration, computing the pre-integration Jacobians and covariance matrix, and computing the residual Jacobians;
4. Perform IMU initialization (joint visual-inertial initialization); this is the first step of fusing visual and inertial data and is a loosely coupled process that combines the visual data with the high-rate IMU data, comprising: gyroscope bias calibration, scale recovery and preliminary gravity estimation, accelerometer bias calibration, and joint optimization of scale and gravity;
5. Optimize using the tightly coupled optimization model, comprising: tightly coupled visual-inertial optimization over consecutive frames and over the local map;
In the tight coupling, the image features are added to the state vector to obtain the final pose information. Finally, the images captured by the binocular camera and records of the IMU data, time and GPS coordinates are output as documents.
6. Estimate the degree of similarity between the trajectory of the driverless car or robot and the three-dimensional map of the surrounding environment to obtain the pose in real time;
7. Build the map and perform loop-closure detection using the bag-of-words model, correcting the existing map and further improving the accuracy of the three-dimensional map.
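Bag-of-words loop detection reduces each image to a histogram over a visual vocabulary and compares histograms between the current frame and past keyframes. A toy sketch using cosine similarity (vocabulary size and word ids are illustrative):

```python
import numpy as np

def bow_histogram(word_ids, vocab_size):
    """Image -> L2-normalized histogram of visual-word occurrences."""
    h = np.bincount(word_ids, minlength=vocab_size).astype(float)
    n = np.linalg.norm(h)
    return h / n if n > 0 else h

def similarity(h1, h2):
    """Cosine similarity of two normalized histograms."""
    return float(h1 @ h2)

vocab = 8
frame_a = bow_histogram(np.array([0, 1, 1, 3, 5]), vocab)
frame_b = bow_histogram(np.array([0, 1, 1, 3, 5]), vocab)  # revisited place
frame_c = bow_histogram(np.array([2, 4, 6, 7, 7]), vocab)  # different place

print(similarity(frame_a, frame_b))  # ≈ 1.0 -> loop-closure candidate
print(similarity(frame_a, frame_c))  # 0.0  -> disjoint visual words
```

A real system (e.g. DBoW-style vocabularies over ORB descriptors) scores candidates this way, then geometrically verifies a candidate before correcting the map with the detected loop.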
Aligning the pose sequence estimated by the IMU with the pose sequence estimated by the camera first yields the true scale of the camera trajectory. The IMU can also predict well the pose of the next image frame and the positions in that frame of the feature points from the previous moment, which speeds up feature-tracking matching and makes the algorithm robust to rapid rotation. Finally, the gravity vector provided by the accelerometer allows the estimated positions to be converted into the world coordinate system required by practical navigation. Meanwhile, the huge demand from smartphones and other mobile terminals for cameras has greatly reduced the purchase cost of the sensors. In summary, this is a low-cost, high-performance navigation scheme.
Embodiment 2
As shown in Fig. 1, the map construction system based on GPS, IMU and binocular vision includes an on-board computing center, a binocular camera, an IMU inertial measurement unit and an RTK-based GNSS receiver.
The binocular camera and the IMU can be fused at the hardware level, integrated on a single circuit board with all signal transmission completed on the board; data are exchanged directly between the microprocessor controlling the vision system and the microprocessor controlling the IMU, and the relevant algorithms promote the visual processing to richer environment and target information, improving the performance of the whole system. Alternatively, interaction and fusion of raw data can be used, exchanging data between the sensors over a data bus. The data output ports of the binocular camera, the IMU inertial measurement unit and the RTK-based GNSS receiver feed the collected data over the data bus into the on-board computing center, where the data are stored; algorithms fuse the binocular camera and IMU data and correct the GPS signal obtained from the RTK-based GNSS receiver.
The on-board computing center is the core component of the whole system; using the ROS framework, it stores and processes the data collected by the binocular camera, the IMU inertial measurement unit and the RTK-based GNSS receiver. The RTK-based GNSS receiver receives the GPS positioning information; the binocular camera collects images of the surroundings for point-cloud modelling; the IMU inertial measurement unit measures the three-axis attitude angles and acceleration of the vehicle.
First the pose of the driverless car or robot is initialized, the trajectory of the car's travel or the robot's motion is determined, and the IMU and binocular camera are calibrated so that their data remain time-consistent. Then the information from the IMU, the binocular camera and the RTK-based GNSS receiver is collected, extracted and fed into the on-board computing center. Next, the computing center performs the manifold-based IMU pre-integration, the IMU initialization (joint visual-inertial initialization) and the tightly coupled optimization. The similarity between the trajectory of the driverless car or robot and the three-dimensional map of the surroundings is estimated to obtain the pose in real time. Finally the map is built and loop closures are detected with the bag-of-words model, correcting the existing map and further improving the accuracy of the three-dimensional map.
Fig. 2 and Fig. 3 are pictures captured by the binocular camera during the experiment on the campus of the University of Electronic Science and Technology of China. Because a binocular camera is used, and its focal length and the distance and angle between the two lenses are known, a fixed landmark object in the image can be ranged: by calculation, the distance between the landmark and the observation point can be judged more accurately, improving detection accuracy.
The above are only preferred embodiments of the present invention and are not intended to limit it. Obviously, those skilled in the art can make various changes and variations to the invention without departing from its spirit and scope. If such changes and variations fall within the scope of the claims of the invention and their technical equivalents, the invention is intended to encompass them as well.

Claims (7)

1. A map construction method based on GPS, IMU and binocular vision, characterized by comprising the following steps:
Step 1: initialize the pose of the driverless vehicle or robot, determine its driving or motion trajectory, and calibrate the IMU (inertial measurement unit), binocular camera, and other sensors so that their data remain time-consistent;
Step 2: collect and extract information from the IMU, the binocular camera, and the GNSS receiver with RTK solution, for visual feature tracking;
Step 3: perform manifold-based IMU pre-integration, comprising: computing the IMU pre-integration values, computing the pre-integration Jacobians and covariance matrix, and computing the residual Jacobians;
Step 4: perform IMU initialization (joint visual-inertial initialization), the first step of fusing visual and inertial data; it is a loosely coupled process that combines the visual data with the fast IMU data, comprising: gyroscope bias calibration, scale recovery and gravitational-acceleration pre-estimation, and accelerometer bias calibration with scale and gravitational-acceleration refinement;
Step 5: optimize using a tightly coupled optimization model, comprising: tightly coupled visual-inertial consecutive-frame optimization and tightly coupled visual-inertial local-map optimization;
Step 6: estimate the similarity between the trajectory of the driverless vehicle or robot and the three-dimensional map of the surrounding environment, obtaining the pose in real time;
Step 7: construct the map and perform loop-closure detection using a bag-of-words model, correcting the existing map and further improving the accuracy of the three-dimensional map.
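The seven steps above can be sketched as a minimal pipeline skeleton. This is purely illustrative: all class and method names are ours, each numeric "pose" stands in for a full 6-DoF pose, and the fusion terms are deliberately simplified placeholders rather than the patent's actual optimization.

```python
# Hypothetical end-to-end skeleton of the seven steps of claim 1.
class MappingPipeline:
    def __init__(self):
        self.trajectory = []   # real-time poses (step 6)
        self.local_map = []    # accumulated map data (step 7)

    def run(self, initial_pose, sensor_packets):
        self.trajectory = [initial_pose]                     # step 1: initialize pose
        for packet in sensor_packets:                        # step 2: acquire sensor data
            delta = self.preintegrate(packet["imu"])         # step 3: IMU pre-integration
            pose = self.optimize(self.trajectory[-1], delta,
                                 packet["vision"])           # steps 4-5: init + tight coupling
            self.trajectory.append(pose)                     # step 6: real-time pose estimate
            self.local_map.append(pose)                      # step 7: extend the map
        return self.trajectory

    def preintegrate(self, imu_samples):
        # Placeholder: sum the velocity increments a * dt between keyframes.
        return sum(a * dt for a, dt in imu_samples)

    def optimize(self, prev_pose, imu_delta, vision_pose, w=0.1):
        # Placeholder for tightly coupled optimization: nudge the IMU
        # prediction toward the visual estimate by a blending weight w.
        pred = prev_pose + imu_delta
        return pred + w * (vision_pose - pred)
```

In a real system each placeholder would be replaced by the corresponding step of the claim (manifold pre-integration, joint initialization, graph optimization, loop closure).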
2. The map construction method based on GPS, IMU and binocular vision according to claim 1, characterized in that, in the information collection and extraction of step 2:
after each binocular camera captures an image, ORB features are extracted from the image and the extraction results are stored;
the IMU data inside the driverless vehicle or robot are acquired in real time and fed into the manifold-based IMU pre-integration for processing.
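The stored ORB features are later matched across frames for tracking; ORB descriptors are binary strings compared by Hamming distance. A minimal sketch of that matching step, representing each descriptor as a Python int (function names and the distance cutoff are ours, not from the patent):

```python
def hamming(d1, d2):
    """Number of differing bits between two binary descriptors."""
    return bin(d1 ^ d2).count("1")

def match_descriptors(query, train, max_dist=64):
    """Brute-force nearest-neighbour matching with a distance cutoff,
    as typically done for ORB's binary descriptors."""
    matches = []
    for qi, qd in enumerate(query):
        best = min(range(len(train)), key=lambda ti: hamming(qd, train[ti]))
        if hamming(qd, train[best]) <= max_dist:
            matches.append((qi, best))  # (query index, train index)
    return matches
```

Real implementations use 256-bit descriptors and add a ratio or cross-check test, but the distance metric and nearest-neighbour search are the same idea.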
3. The map construction method based on GPS, IMU and binocular vision according to claim 1, characterized in that, in the manifold-based IMU pre-integration of step 3:
During the IMU pre-integration computation, the world coordinate system, IMU coordinate system, and camera coordinate system are denoted W, B, and C. Samples are taken at time interval Δt, measuring the IMU acceleration a_B and angular velocity ω_B. The IMU measurement model is:

$${}_{B}\tilde{\omega}_{WB}(t) = {}_{B}\omega_{WB}(t) + b^{g}(t) + \eta^{g}(t)$$
$${}_{B}\tilde{a}(t) = R_{WB}^{T}(t)\big({}_{W}a(t) - {}_{W}g\big) + b^{a}(t) + \eta^{a}(t)$$

where $\tilde{\omega}$, $\tilde{a}$ are the angular velocity and acceleration of the IMU measurement model in coordinate system B; ${}_{B}\omega_{WB}(t)\in R^{3}$ is the instantaneous angular velocity of the IMU center B relative to the world frame W, expressed in B; $R_{WB}$ is the rotation matrix of the IMU center B relative to W and $R_{WB}^{T}$ its transpose; ${}_{W}a(t)\in R^{3}$ is the instantaneous acceleration of the IMU in W; ${}_{W}g$ is the gravity vector in W; b denotes bias and η Gaussian random noise, with $b^{a}$, $b^{g}$ the accelerometer and gyroscope biases and $\eta^{a}$, $\eta^{g}$ the corresponding Gaussian random noises;
The derivatives of the rotation R, translation p, and velocity v can be written as:

$$\dot{R}_{WB} = R_{WB}\,({}_{B}\omega_{WB})^{\wedge}, \qquad {}_{W}\dot{v} = {}_{W}a, \qquad {}_{W}\dot{p} = {}_{W}v$$

where ${}_{W}v$ is the instantaneous velocity of the IMU in the W frame, $({}_{B}\omega_{WB})^{\wedge}$ is the skew-symmetric matrix of the instantaneous angular velocity of B relative to W, $R_{WB}$ is the rotation of the IMU center B relative to W, and ${}_{W}p$ is the instantaneous translation in the W frame, with $\dot{R}_{WB}$, ${}_{W}\dot{v}$, ${}_{W}\dot{p}$ their derivatives;
The rotation R, translation p, and velocity v in the world frame are obtained by integration; applying Euler integration in discrete time and substituting the measurement model gives:

$$R(t+\Delta t) = R(t)\,\mathrm{Exp}\big((\tilde{\omega}(t) - b^{g}(t) - \eta^{gd}(t))\Delta t\big)$$
$$v(t+\Delta t) = v(t) + g\,\Delta t + R(t)\big(\tilde{a}(t) - b^{a}(t) - \eta^{ad}(t)\big)\Delta t$$
$$p(t+\Delta t) = p(t) + v(t)\Delta t + \tfrac{1}{2}g\,\Delta t^{2} + \tfrac{1}{2}R(t)\big(\tilde{a}(t) - b^{a}(t) - \eta^{ad}(t)\big)\Delta t^{2}$$

where Δt is the sampling interval; $\eta^{gd}(t)$, $\eta^{ad}(t)$ are the discrete-time gyroscope and accelerometer Gaussian random noises; $b^{a}$, $b^{g}$ are the accelerometer and gyroscope biases; $\tilde{\omega}$, $\tilde{a}$ are the angular velocity and acceleration of the IMU measurement model; and the states here are expressed in the world frame;
Given two adjacent keyframes i and j with the IMU state at time i known, the IMU pre-integration value of keyframe j in discrete time is:

$$R_{j} = R_{i}\prod_{k=i}^{j-1}\mathrm{Exp}\big((\tilde{\omega}_{k} - b^{g}_{k} - \eta^{gd}_{k})\Delta t\big)$$
$$v_{j} = v_{i} + g\,\Delta t_{ij} + \sum_{k=i}^{j-1} R_{k}\big(\tilde{a}_{k} - b^{a}_{k} - \eta^{ad}_{k}\big)\Delta t$$
$$p_{j} = p_{i} + \sum_{k=i}^{j-1}\Big[v_{k}\Delta t + \tfrac{1}{2}g\,\Delta t^{2} + \tfrac{1}{2}R_{k}\big(\tilde{a}_{k} - b^{a}_{k} - \eta^{ad}_{k}\big)\Delta t^{2}\Big]$$

where Δt is the sampling interval, $\Delta t_{ij}$ the time difference between times i and j, g the gravity vector, $b^{g}_{k}$, $b^{a}_{k}$ the time-varying gyroscope and accelerometer biases, $\tilde{\omega}_{k}$, $\tilde{a}_{k}$ the time-varying measured angular velocity and acceleration, $R_{j}$, $R_{i}$, $v_{j}$, $v_{i}$, $p_{j}$, $p_{i}$ the rotations, velocities, and translations at times j and i, $R_{k}$ the time-varying rotation matrix, and $\eta^{gd}_{k}$, $\eta^{ad}_{k}$ the time-varying discrete gyroscope and accelerometer Gaussian random noises;
To avoid repeated integration, the IMU pre-integration method computes the relative change of the state between times i and j; that is, the reference frame is changed from the world frame to the IMU frame at time i, and the relative IMU state between times i and j is computed:

$$\Delta R_{ij} = R_{i}^{T}R_{j} = \prod_{k=i}^{j-1}\mathrm{Exp}\big((\tilde{\omega}_{k} - b^{g}_{k} - \eta^{gd}_{k})\Delta t\big)$$
$$\Delta v_{ij} = R_{i}^{T}\big(v_{j} - v_{i} - g\,\Delta t_{ij}\big) = \sum_{k=i}^{j-1}\Delta R_{ik}\big(\tilde{a}_{k} - b^{a}_{k} - \eta^{ad}_{k}\big)\Delta t$$
$$\Delta p_{ij} = R_{i}^{T}\big(p_{j} - p_{i} - v_{i}\Delta t_{ij} - \tfrac{1}{2}g\,\Delta t_{ij}^{2}\big) = \sum_{k=i}^{j-1}\Big[\Delta v_{ik}\Delta t + \tfrac{1}{2}\Delta R_{ik}\big(\tilde{a}_{k} - b^{a}_{k} - \eta^{ad}_{k}\big)\Delta t^{2}\Big]$$

where $R_{j}$, $R_{i}$, $v_{j}$, $v_{i}$, $p_{j}$, $p_{i}$ are the rotations, velocities, and translations at times j and i; $b^{a}_{k}$, $b^{g}_{k}$ the time-varying accelerometer and gyroscope biases; $\tilde{\omega}_{k}$, $\tilde{a}_{k}$ the time-varying measured angular velocity and acceleration; $\eta^{ad}_{k}$, $\eta^{gd}_{k}$ the time-varying discrete Gaussian random noises; $\Delta R_{ij}$, $\Delta v_{ij}$, $\Delta p_{ij}$ the rotation, velocity, and translation differences between i and j; $R_{i}^{T}$ the transpose of $R_{i}$; $\Delta t_{ij}$ the time difference between times i and j; Δt the sampling interval; and $\Delta R_{ik}$, $\Delta v_{ik}$ the rotation and velocity differences between times i and k;
From the above equations it can be seen that the whole pre-integration formula depends on the biases and the noise, whose influences on the system must be analyzed separately; the noise follows a Gaussian normal distribution, while the biases follow a random-walk model;
The pre-integration used during initialization omits the first-order bias terms; the Jacobian matrices are computed after initialization completes and before optimization begins, yielding the accurate IMU pre-integration value (true value) and the estimated value, from which the IMU residual is obtained;
The system noise is assumed Gaussian, and the covariance matrix computed according to the Gaussian distribution is a 9×9 matrix;
Computing the residual Jacobians, the residuals are:

$$\mathbf{r}_{\Delta R_{ij}} = \mathrm{Log}\Big(\big(\Delta\bar{R}_{ij}\,\mathrm{Exp}(J^{g}_{\Delta R}\,\delta b^{g})\big)^{T} R_{i}^{T}R_{j}\Big)$$
$$\mathbf{r}_{\Delta v_{ij}} = R_{i}^{T}\big(v_{j}-v_{i}-g\,\Delta t_{ij}\big) - \big(\Delta\bar{v}_{ij} + J^{g}_{\Delta v}\,\delta b^{g} + J^{a}_{\Delta v}\,\delta b^{a}\big)$$
$$\mathbf{r}_{\Delta p_{ij}} = R_{i}^{T}\big(p_{j}-p_{i}-v_{i}\Delta t_{ij}-\tfrac{1}{2}g\,\Delta t_{ij}^{2}\big) - \big(\Delta\bar{p}_{ij} + J^{g}_{\Delta p}\,\delta b^{g} + J^{a}_{\Delta p}\,\delta b^{a}\big)$$

The parameters to be estimated are the rotations, velocities, and translations $R_{i}$, $R_{j}$, $v_{i}$, $v_{j}$, $p_{i}$, $p_{j}$ and the bias updates $\delta b^{a}$, $\delta b^{g}$. Here $\Delta\bar{R}_{ij}$, $\Delta\bar{v}_{ij}$, $\Delta\bar{p}_{ij}$ are the mean (measured) rotation, velocity, and translation differences between times i and j; $\bar{b}^{a}_{i}$, $\bar{b}^{g}_{i}$ are the accelerometer and gyroscope biases at time i, with mean values $\bar{b}^{a}$, $\bar{b}^{g}$ and updates $\delta b^{a}$, $\delta b^{g}$; φ is a rotation; $\Delta t_{ij}$ is the inter-frame time difference; and $R_{i}^{T}$, $R_{j}^{T}$ are the transposes of the keyframe rotations;
The Jacobian corresponding to the displacement residual is as above; the Jacobian at time j can be derived in the same way, and the Jacobian formulas for the velocity and rotation residuals are similar to those for translation. This completes the IMU pre-integration.
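A one-dimensional sketch of the pre-integration relations above, with rotation dropped (R = identity) so that Δv_ij and Δp_ij reduce to simple sums over the bias-corrected accelerometer samples. Names are ours, not the patent's, and noise terms are omitted:

```python
def preintegrate_1d(accel_samples, dt, b_a=0.0):
    """Return (dv_ij, dp_ij) accumulated over the samples between
    keyframes i and j, in the style of IMU pre-integration."""
    dv = dp = 0.0
    for a in accel_samples:
        a_corr = a - b_a                       # subtract the accelerometer bias
        dp += dv * dt + 0.5 * a_corr * dt * dt  # Delta p accumulates v*dt + 0.5*a*dt^2
        dv += a_corr * dt                       # Delta v accumulates a*dt
    return dv, dp
```

For constant acceleration a over total time t this yields dv = a·t and dp = a·t²/2; because both quantities are relative to keyframe i, they never need re-integrating when the state estimate at i changes, which is the point of pre-integration.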
4. The map construction method based on GPS, IMU and binocular vision according to claim 1, characterized in that, in the IMU initialization (joint visual-inertial initialization) of step 4:
The gyroscope bias is estimated from the rotations relating pairs of consecutive keyframes: a small change is added to a constant $b^{g}$, and the gyroscope-bias update is computed by minimizing the rotation error:

$$b^{g*} = \arg\min_{b^{g}} \sum_{i=1}^{N-1}\Big\| \mathrm{Log}\Big(\big(\Delta R_{i,i+1}\,\mathrm{Exp}(J^{g}_{\Delta R}\,b^{g})\big)^{T}\,\big(R^{i}_{WB}\big)^{T} R^{i+1}_{WB}\Big)\Big\|^{2}$$

where N is the number of keyframes, $\Delta R_{i,i+1}$ is the pre-integrated rotation measurement between two consecutive keyframes, J is the covariance matrix, and $R^{i+1}_{WB}$, $\big(R^{i}_{WB}\big)^{T}$ are the rotation of the B frame at time i+1 relative to W and of the W frame at time i relative to B, respectively. Starting from a zero initial value, the function is optimized with G2O, and the $b^{g}$ that minimizes the objective is taken as the initial gyroscope bias;
After the bias is found, it is substituted back into the pre-integration formula and the pre-integration value is recomputed, making it more accurate;
For scale recovery and gravitational-acceleration pre-estimation, a pre-estimation state vector X = [s, g_W] is first established, where s is the scale and g_W the gravitational acceleration in the world frame;
Three keyframes (denoted 1, 2, 3) are used; combining the visual data with the IMU pre-integration data through the relation

$${}_{W}p_{B} = s\,{}_{W}p_{C} + R_{WC}\,{}_{C}p_{B}$$

yields a least-squares overdetermined system AX = B, which requires at least four keyframes; the least-squares problem is solved by singular value decomposition, which is relatively slow but accurate.
Here I is the identity matrix, ${}_{W}p_{C}$ is the displacement of the camera center C in the world frame, $R_{WB}$ and $R_{WC}$ are the rotation matrices of the W frame relative to the B and C frames, and $\Delta t_{nm}$ (n, m = 1, 2, 3) is the time difference between two frames;
Since the computations for scale recovery and gravitational-acceleration pre-estimation do not account for the accelerometer bias, gravitational acceleration and accelerometer bias are difficult to distinguish, which may make the system ill-conditioned; therefore accelerometer bias calibration and scale and gravitational-acceleration refinement are carried out;
Let $\hat{g}_{I} = g_{I}/\lVert g_{I}\rVert$ be the normalized gravity direction of the inertial coordinate system I and $\hat{g}_{W} = g_{W}/\lVert g_{W}\rVert$ that of the world frame; with rotation axis $v = (\hat{g}_{I}\times\hat{g}_{W})/\lVert\hat{g}_{I}\times\hat{g}_{W}\rVert$ and angle $\theta = \mathrm{atan2}\big(\lVert\hat{g}_{I}\times\hat{g}_{W}\rVert,\ \hat{g}_{I}\cdot\hat{g}_{W}\big)$, the rotation matrix between the world frame W and the inertial frame I is computed as:

$$R_{WI} = \mathrm{Exp}(v\,\theta)$$

The gravitational acceleration is then expressed as:

$$g_{W} = R_{WI}\,\hat{g}_{I}\,G$$

where G is the known gravity magnitude. Since a rotation of $R_{WI}$ about the z axis has no effect, only the X and Y components of a small perturbation are optimized:

$$g_{W} = R_{WI}\,\mathrm{Exp}(\delta\theta)\,\hat{g}_{I}\,G, \qquad \delta\theta = \big[\delta\theta_{xy}^{T}\;\;0\big]^{T}$$
Including the accelerometer bias, and using three consecutive keyframes to eliminate the velocity terms, yields a further linear system in the gravity perturbation and the accelerometer-bias update:
where $[\,\cdot\,]_{(:,1:2)}$ denotes the first two columns of a matrix; $\delta b^{a}$ is the accelerometer-bias update and δθ a small-angle variable; ${}_{W}p_{C}$ is the displacement of the camera center C in the world frame; ${}_{C}p_{B}$ is the translation of the IMU center B in the camera frame; s is the scale; $\Delta t_{nm}$, $\Delta p_{nm}$, $\Delta v_{nm}$ (n, m = 1, 2, 3) are the time, displacement, and velocity differences between two frames; $R_{WB}$, $R_{WC}$, $R_{WI}$ are the rotation matrices of the W frame relative to the B, C, and I frames; $\hat{g}_{I}$ is the normalized gravity direction in the I frame; and J is the covariance matrix. Only the direction of the gravitational acceleration is optimized: after each optimization, the result is substituted back to compute the latest $g_{W}$, and the optimization is repeated until the variables converge.
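The scale part of the AX = B system above has a simple closed form when gravity is held fixed: minimize Σ (s·v_n − m_n)² over the scale s, where v_n are unscaled visual displacements and m_n the metric (IMU-derived) ones. A sketch of that one-unknown least-squares solve (the patent solves the full [s, g_W] system by SVD; variable names here are ours):

```python
def estimate_scale(visual_disp, metric_disp):
    """Closed-form least-squares scale s minimizing sum (s*v - m)^2,
    i.e. the normal-equation solution of a one-column AX = B system."""
    num = sum(v * m for v, m in zip(visual_disp, metric_disp))
    den = sum(v * v for v in visual_disp)
    return num / den
```

With more unknowns (the gravity components), the same normal-equation idea generalizes to the matrix system, which is why at least four keyframes are needed to make it overdetermined.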
5. The map construction method based on GPS, IMU and binocular vision according to claim 1, characterized in that, in the optimization using the tightly coupled optimization model of step 5:
There are two modes of tightly coupled visual-inertial consecutive-frame optimization: (1) when the map has been updated:
First a global optimization state vector is constructed, containing the rotation $R^{j}_{WB}$, velocity ${}_{W}v^{j}_{B}$, translation ${}_{W}p^{j}_{B}$, accelerometer bias $b^{j}_{a}$, and gyroscope bias $b^{j}_{g}$ at the current time j. The overall error equation contains the visual reprojection error $E_{\mathrm{proj}}$ and the IMU measurement error $E_{\mathrm{IMU}}$:

$$\theta^{*} = \arg\min_{\theta}\Big(\sum_{k} E_{\mathrm{proj}}(k, j) + E_{\mathrm{IMU}}(i, j)\Big)$$
The visual reprojection equation is simply the standard pinhole-camera reprojection equation; its residual, Jacobian, and information matrix are no different from the general visual-only case;
The IMU measurement error equation contains two parts, for P, V, R and for the biases $b_{a}$, $b_{g}$ respectively; their residuals, residual Jacobian matrices, and information matrices have already been computed during the IMU pre-integration process and are used directly:

$$e_{b} = b_{j} - b_{i} \qquad (40)$$
where $e_{R}$, $e_{p}$, $e_{v}$, $e_{b}$ are the rotation, translation, velocity, and bias errors; $R_{WB}$, $R_{WC}$, $R_{WI}$ are the rotation matrices of the W frame relative to the B, C, and I frames; and b is the bias;
(2) when the map has not been updated:
As above, a global optimization state vector is first constructed, containing the rotations, velocities, translations, accelerometer biases, and gyroscope biases of the current time j+1 and the previous time j, i.e. $R^{j+1}_{WB}$, $R^{j}_{WB}$, ${}_{W}v^{j+1}_{B}$, ${}_{W}v^{j}_{B}$, ${}_{W}p^{j+1}_{B}$, ${}_{W}p^{j}_{B}$, $b^{j+1}_{a}$, $b^{j}_{a}$, $b^{j+1}_{g}$, $b^{j}_{g}$. The overall error equation contains the visual reprojection error $E_{\mathrm{proj}}$, the IMU measurement error $E_{\mathrm{IMU}}$, and the prior error $E_{\mathrm{prior}}$:
The error equations for vision and the IMU are as before; the prior error involves the states at time j;
where $e_{R}$, $e_{p}$, $e_{v}$, $e_{b}$ are the rotation, translation, velocity, and bias errors and $e_{R}^{T}$, $e_{p}^{T}$, $e_{v}^{T}$, $e_{b}^{T}$ their transposes; $R_{WB}$ is the rotation of the W frame relative to B and $R_{BW}$ that of B relative to W; b is the bias; ρ is a coefficient; and ${}_{W}p_{B}$, ${}_{W}v_{B}$ are the translation and velocity of the IMU center B in the world frame;
In the tightly coupled visual-inertial local-map optimization, the keyframes to be optimized include both the visual reprojection error and the IMU measurement error.
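A scalar analogue of the tightly coupled cost E = E_proj + E_IMU in this claim: both error terms constrain the same pose variable and are reduced jointly. The patent uses graph optimization (G2O); plain gradient descent on a 1-D quadratic stands in for it here, and the names, weights, and step size are ours:

```python
def tightly_coupled_update(pose, visual_obs, imu_pred,
                           w_proj=1.0, w_imu=1.0, lr=0.1, iters=200):
    """Minimize E(pose) = w_proj*(pose - visual_obs)^2 + w_imu*(pose - imu_pred)^2
    by gradient descent; the joint minimum trades off both error terms."""
    for _ in range(iters):
        grad = 2 * w_proj * (pose - visual_obs) + 2 * w_imu * (pose - imu_pred)
        pose -= lr * grad
    return pose
```

With equal weights the minimizer is the average of the visual and inertial estimates; unequal weights (information matrices, in the real problem) shift it toward the more trusted sensor.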
6. The map construction method based on GPS, IMU and binocular vision according to claim 1, characterized in that, in the loop-closure detection of step 7, the map is constructed and loop-closure detection is performed using a bag-of-words model, correcting the existing map and further improving the accuracy of the three-dimensional map.
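A minimal sketch of bag-of-words loop detection as used in this claim: each keyframe is summarized as a histogram of visual-word ids and compared by cosine similarity, and a past keyframe scoring above a threshold triggers loop closure. Function names and the threshold value are ours:

```python
from collections import Counter
from math import sqrt

def bow_similarity(words_a, words_b):
    """Cosine similarity between two bag-of-words histograms of word ids."""
    ha, hb = Counter(words_a), Counter(words_b)
    dot = sum(ha[w] * hb[w] for w in ha)
    na = sqrt(sum(c * c for c in ha.values()))
    nb = sqrt(sum(c * c for c in hb.values()))
    return dot / (na * nb) if na and nb else 0.0

def detect_loop(current_words, keyframe_db, threshold=0.8):
    """Index of the best-matching past keyframe, or None if below threshold."""
    if not keyframe_db:
        return None
    scores = [bow_similarity(current_words, kw) for kw in keyframe_db]
    best = max(range(len(scores)), key=scores.__getitem__)
    return best if scores[best] >= threshold else None
```

Production systems (e.g. DBoW-style vocabularies) add tf-idf weighting and temporal consistency checks, but the histogram-and-similarity core is the same.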
7. A map construction system based on GPS, IMU and binocular vision, characterized in that:
the system comprises an on-board computing center, a binocular camera, an IMU inertial measurement unit, and a GNSS receiver with RTK solution;
the on-board computing center stores and processes, through the ROS framework, the data collected by the binocular camera, the IMU inertial measurement unit, and the RTK-based GNSS receiver;
the RTK-based GNSS receiver receives GPS positioning information;
the binocular camera collects images of the surroundings for point-cloud modeling;
the IMU inertial measurement unit measures the three-axis attitude angles and the acceleration of the vehicle;
The binocular camera and the IMU can be fused at the hardware level by integrating them on a single circuit board, with all signal transfer completed on the board; data are exchanged directly between the microprocessor controlling the vision system and the microprocessor controlling the IMU, so that after visual processing, richer environment and target details are passed to the relevant algorithms, improving the performance of the whole system. Alternatively, data-level interaction and fusion can be used, exchanging data between sensors over a data bus. The data output ports of the binocular camera, the IMU inertial measurement unit, and the RTK-based GNSS receiver feed the collected data over the data bus into the on-board computing center, where the data are stored, the binocular-camera and IMU data are fused algorithmically, and the GPS signal obtained from the RTK-based GNSS receiver is corrected.
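The GPS correction described for the system can be sketched as a simple drift-compensation loop: positions from the visual-inertial pipeline drift over time, and each available RTK-GNSS fix pulls the estimate back toward the absolute position. The blending gain and all names below are hypothetical illustrations, not taken from the patent:

```python
def fuse_gnss(vio_positions, gnss_fixes, gain=0.2):
    """Correct drifting VIO positions with intermittent GNSS fixes.

    vio_positions: per-step positions from the visual-inertial estimator.
    gnss_fixes: per-step absolute fixes, or None when no RTK fix is available.
    """
    fused, offset = [], 0.0
    for pos, fix in zip(vio_positions, gnss_fixes):
        if fix is not None:                         # RTK fix available this step
            offset += gain * (fix - (pos + offset))  # pull estimate toward the fix
        fused.append(pos + offset)
    return fused
```

A real fusion would run inside the tightly coupled optimizer (or a Kalman filter) with proper covariances; this first-order correction only illustrates why the absolute GNSS measurements bound the VIO drift.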
CN201910225839.8A 2019-03-25 2019-03-25 Map constructing method and system based on GPS, IMU and binocular vision Pending CN109991636A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910225839.8A CN109991636A (en) 2019-03-25 2019-03-25 Map constructing method and system based on GPS, IMU and binocular vision


Publications (1)

Publication Number Publication Date
CN109991636A true CN109991636A (en) 2019-07-09




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination