CN112987065A - Handheld SLAM device integrating multiple sensors and control method thereof - Google Patents
- Publication number: CN112987065A
- Application number: CN202110166885.2A
- Authority: CN (China)
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/48—Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system
Abstract
A handheld SLAM device fusing multiple sensors, and a control method thereof, comprise a GNSS/INS combined positioning module, a GNSS columnar antenna, a 16-line laser radar, a color binocular camera, a processor and a flat panel display. The GNSS columnar antenna is connected with the GNSS/INS combined positioning module; the GNSS/INS combined positioning module, the 16-line laser radar, the color binocular camera and the flat panel display are all connected with the processor, so that miniaturization of the equipment can be realized. The invention provides a mapping scheme based on square voxels under limited hardware resources. It addresses two complementary problems: vision cannot perform feature-point matching in a low-texture environment, so the depth of a three-dimensional point in the camera coordinate system cannot be obtained; laser can measure depth directly but cannot capture texture information in the space. Dense reconstruction is therefore performed on spatial planes, with the point cloud colored using the color image.
Description
Technical Field
The invention relates to the field of synchronous positioning and mapping methods of handheld terminals fusing multiple sensors, in particular to a handheld SLAM device fusing multiple sensors and a control method thereof.
Background
At present, land resource survey is mainly based on satellite remote sensing data, assisted by field survey. With the development of remote sensing technology, high-resolution remote sensing data plays an increasingly important role and has proved to be the most direct and effective method. The work is mainly based on remote sensing technology: land-use data of the same space in different time phases are subjected to superposition and contrast analysis to find changes of the earth surface, and the dynamic change characteristics and future development trend of land use are analyzed in terms of time, space, quantity and quality. However, satellite remote sensing has the following defects in village and town land survey: 1) the larger the coverage area, the less obvious the local information of the image; 2) current remote sensing satellite resources are limited, the satellite overpass period and overpass time are usually fixed, the frequency of earth observation is relatively low, and a timely remote sensing image cannot be provided for a sudden problem; 3) the acquired data are mainly two-dimensional plane data, which cannot meet the requirements of land resource survey management and village and town planning and construction. The handheld SLAM (Simultaneous Localization and Mapping) device is a powerful supplement to satellite remote sensing measurement; it is flexible, efficient, rapid, fine, accurate, low in operating cost, wide in application range and short in production period, and has obvious advantages in rapidly acquiring three-dimensional data in small areas and in areas with complex environments.
For the handheld SLAM device, the key is to solve the problems of positioning and environment perception at the mobile terminal. Currently, the Global Navigation Satellite System (GNSS), the infrastructure of PNT (Positioning, Navigation and Timing) services, is the most widely used in the positioning field. However, the satellite navigation signal is weak and easily shielded, its anti-interference capability is poor, it is easily spoofed, and the data output frequency of a GNSS receiver is low. Particularly in the complex environments of villages and small towns, owing to the influence of dense buildings, trees and the like, GNSS positioning output is often interrupted by signal shielding. To make up for the shortcomings of relying on GNSS positioning alone, a GNSS/INS combined positioning scheme is usually formed in combination with an Inertial Navigation System (INS). The advantages of the INS are that it has no optical, electrical or magnetic connection with the outside, its operation is not limited by meteorological conditions, the navigation task is completed entirely by the equipment on the moving carrier, complete navigation parameters can be provided, and continuous positioning output can be kept when GNSS signals are shielded. Its biggest disadvantage, however, is the accumulation of errors: positioning precision continuously decreases as the positioning process proceeds. A high-precision INS can keep high positioning precision for a long time, but its price is high and it can hardly meet the low-cost requirement of civil use. LiDAR (Light Detection and Ranging) has good relative positioning performance, diverges more slowly than the INS, can effectively suppress the accumulation of navigation errors, and complements GNSS well in its environmental requirements.
However, the environment information laser acquires is slightly lacking in detail, and color information of the environment cannot be acquired. Compared with other sensors, a visual sensor can acquire sufficient environmental detail, can capture the appearance and shape of an object, read marks and the like, and help the carrier achieve environmental cognition, functions that the other sensors cannot realize; it has also benefited from the rapid development of digital image processing technology and the improvement of computer hardware performance in recent years. However, it is greatly influenced by environmental and external factors, such as insufficient light and reduced visibility caused by weather. Particularly in the complex environments of villages and towns, the image computation load is large and the algorithms are difficult to realize. Relevant manufacturers at home and abroad have produced related integrated products. For example, the BLK2GO handheld scanner of the Leica measuring system under Hexagon integrates a laser radar and a panoramic camera, but the device is mainly applied indoors; the Leica backpack system integrates GNSS, an IMU, two 16-line Velodyne laser radars and 4 panoramic cameras, and the domestic company GreenValley ("digital green soil") has a product of the same type, the LiBackpack DGC50. But such equipment is very expensive, its popularization rate is low, and its volume is large, which is not convenient for carrying; moreover, the sensors are only physically integrated, and fusion at the data level is not realized.
Disclosure of Invention
In order to solve the above problems, the invention provides a multi-sensor-fused handheld SLAM device and a control method thereof, and proposes a multi-sensor-fused SLAM method based on Error-State Kalman Filtering (ESKF), which exploits the advantages of each single sensor while addressing its limitations.
The invention provides a multi-sensor-fused handheld SLAM device which comprises a device body, a GNSS/INS combined positioning module, a GNSS columnar antenna, a 16-line laser radar, a color binocular camera, a processor and a flat panel display, wherein the GNSS columnar antenna is connected with the GNSS/INS combined positioning module; the GNSS/INS combined positioning module, the 16-line laser radar, the color binocular camera and the flat panel display are connected with the processor; the flat panel display is arranged at the front end of the device body; the GNSS/INS combined positioning module and the processor are arranged in the device body; the GNSS columnar antenna and the 16-line laser radar are arranged on the device body; and the color binocular camera is arranged at the rear side of the upper portion of the device body.
As a further improvement of the structure of the invention, the bottom of the device main body is provided with a handle.
The invention provides a control method of a handheld SLAM device fusing multiple sensors, which comprises the following specific steps:
estimating the pose based on the 16-line laser radar;
1) point cloud segmentation;
2) extracting characteristics;
3) a laser odometer;
(II) estimating the pose of the GNSS/INS combined positioning module;
1) GNSS double-difference relative positioning GNSS-RTK;
2) solving the pose based on the acceleration and angular velocity information;
3) solving the GNSS/INS loose combined pose;
(III) estimating a GNSS-laser/INS semi-tight fusion pose;
1) estimating a laser/INS tight fusion pose;
2) solving the GNSS-laser/INS loose combined pose;
(IV) a voxel-based visualization scheme;
because vision cannot perform feature-point matching in a low-texture environment, the depth of a three-dimensional point in the camera coordinate system cannot be obtained; laser, conversely, can measure depth directly but cannot capture texture information in the space. In view of these complementary advantages and disadvantages of vision and laser, a map is built from square voxels under limited hardware resources, and the voxel point cloud is colored with the color information of the color image; the minimum-unit voxel in space replaces the surface patch of traditional three-dimensional reconstruction. The points of a dense plane are mainly the laser points obtained by three rounds of region growing and the pixel points corresponding to the plane in the image. First, the laser points are projected into the image to obtain gray values; then, the pixel points of the image are matched with planes in the laser coordinate system; finally, the depth of the pixel coordinates successfully matched to a plane is calculated.
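The projection-and-coloring step can be illustrated with a minimal sketch (not the patent's implementation; the extrinsic matrix `T_cam_lidar`, the intrinsic matrix `K` and all function names are assumptions for illustration):

```python
import numpy as np

def color_lidar_points(points_l, T_cam_lidar, K, image):
    """Project lidar points into the camera image and attach RGB colors.

    points_l    : (N, 3) points in the lidar frame
    T_cam_lidar : (4, 4) extrinsic transform, lidar -> camera (assumed known)
    K           : (3, 3) camera intrinsic matrix
    image       : (H, W, 3) color image
    Returns an (M, 6) array of [x, y, z, r, g, b] for points that project in-image.
    """
    N = points_l.shape[0]
    pts_h = np.hstack([points_l, np.ones((N, 1))])   # homogeneous coordinates
    pts_c = (T_cam_lidar @ pts_h.T).T[:, :3]         # points in camera frame
    in_front = pts_c[:, 2] > 0.1                     # keep points ahead of the camera
    pts_c = pts_c[in_front]
    uv = (K @ pts_c.T).T
    uv = uv[:, :2] / uv[:, 2:3]                      # perspective division
    H, W = image.shape[:2]
    u, v = uv[:, 0].astype(int), uv[:, 1].astype(int)
    ok = (u >= 0) & (u < W) & (v >= 0) & (v < H)     # inside the image
    return np.hstack([points_l[in_front][ok], image[v[ok], u[ok]]])
```

The same projection, applied to a gray image instead of a color one, yields the gray values mentioned in the first step of the matching procedure.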
As a further improvement of the control method, the first step is based on the pose estimation of the 16-line laser radar, and the specific steps are as follows:
1) point cloud segmentation, namely projecting the laser point cloud onto a range image, and judging by vertical angle and a threshold whether each point is a ground point, so as to segment ground points from non-ground points; the non-ground points are clustered with BFS (breadth-first search), and corresponding thresholds are set to remove dynamic noise points;
2) feature extraction, namely horizontally dividing the range image into several sub-images; for a point k in the point cloud set at time t, the 5 points to its left and the 5 points to its right in the same scan row are found to construct a set S, the curvature C of each point being

C = (1 / (|S| · ||R_k||)) · || Σ_{j∈S, j≠k} (R_j - R_k) ||

wherein R_k is the range of point k;

a threshold C_th is set for the calculated curvature sequence: a point whose curvature is larger than the threshold is an edge point, otherwise it is a planar point. From each row, the nF_me edge points with the largest C values that do not belong to the ground points are selected, forming a set F_me; the nF_mp planar points with the smallest C values, belonging to the ground points or segmented points of all rows, are selected, forming a set F_mp. A second screening is then performed: from the set F_me, the nF_e edge points with the largest C values not belonging to the ground points are selected, forming a set F_e; from the set F_mp, the nF_p planar points with the smallest C values belonging to the ground points are selected, forming a set F_p;

3) the laser odometer sets the current time as t and the previous time as t-1, and selects the feature points F_e^t and F_p^t together with F_me^{t-1} and F_mp^{t-1} at time t-1 to construct point-to-line correspondences between F_e^t and F_me^{t-1} and point-to-plane correspondences between F_p^t and F_mp^{t-1}. LM optimization of the point-to-plane constraints yields [t_z, θ_roll, θ_pitch]; LM optimization of the point-to-line constraints, combined with [t_z, θ_roll, θ_pitch], then yields [t_x, t_y, θ_yaw]. Thus the pose parameters [t_x, t_y, t_z, θ_roll, θ_pitch, θ_yaw] in the carrier local coordinate system are obtained.
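The curvature computation and edge/planar split of step 2) can be sketched as follows (a reading of the LOAM-style smoothness measure assumed above; the function names, window size and threshold are illustrative, not the patent's exact constants):

```python
import numpy as np

def point_curvature(ranges, i, half_window=5):
    """Smoothness of point i in one scan row: sum of range differences over
    the 5 neighbors on each side, normalized by window size and own range."""
    S = list(range(i - half_window, i)) + list(range(i + 1, i + half_window + 1))
    diff = sum(ranges[j] - ranges[i] for j in S)
    return abs(diff) / (len(S) * abs(ranges[i]))

def split_features(ranges, c_th=0.1, half_window=5):
    """Label each interior point of a scan row as edge (C > C_th) or planar.
    Returns two lists of (index, curvature) pairs."""
    edges, planes = [], []
    for i in range(half_window, len(ranges) - half_window):
        c = point_curvature(ranges, i, half_window)
        (edges if c > c_th else planes).append((i, c))
    return edges, planes
```

A flat wall produces near-zero curvature everywhere (planar points), while a range discontinuity produces a large curvature (edge points); the per-row top/bottom selection into F_me, F_mp, F_e and F_p would then sort these lists by curvature.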
As a further improvement of the control method, step (II) is based on the position and attitude estimation of a GNSS/INS combined positioning module, and the specific steps are as follows:
1) GNSS double-difference relative positioning (GNSS-RTK): a pseudo-range and carrier-phase double-difference observation equation is established from the GNSS observation data and GNSS differential data. Let the instantaneous coordinates of satellite s be (X_s, Y_s, Z_s), and the approximate coordinates of station r be (X_r0, Y_r0, Z_r0) with corrections (dx, dy, dz); linearization gives, in brief:

∇Δ P_{r,j}^{s} = ∇Δ ρ_r^s + ∇Δ δ_ion + ∇Δ δ_trop + ε_P
λ_j ∇Δ φ_{r,j}^{s} = ∇Δ ρ_r^s - ∇Δ δ_ion + ∇Δ δ_trop + λ_j ∇Δ N_{r,j}^{s} + ε_φ

wherein ∇Δ denotes the between-station, between-satellite double-difference operator; Δ denotes the between-satellite single-difference operator; s and r are the serial numbers of the satellite and the station; j (= 1, 2, 3) is the pseudo-range or carrier observation type; P is the pseudo-range observation; ρ_r^s is the geometric distance between satellite and receiver; c is the speed of light in vacuum; δ_ion and δ_trop are the ionospheric and tropospheric refraction errors; λ_j is the carrier wavelength; φ is the carrier-phase observation and N the integer ambiguity; ε_P and ε_φ are the pseudo-range and carrier-phase observation noise;

the integer ambiguities are first resolved with the LAMBDA algorithm, and the corrections (dx, dy, dz) are then solved, thereby determining the coordinates (X, Y, Z) of the carrier in the GNSS coordinate system:

X = X_r0 + dx,  Y = Y_r0 + dy,  Z = Z_r0 + dz
2) pose solution based on acceleration and angular velocity information: first, let the INS coordinate system be B and the world coordinate system be W; the acceleration â_B and angular velocity ω̂_B collected by the GNSS/INS combined positioning module are

â_B = a_B + b_a + η_a - q_WB^{-1} ⊗ g_W
ω̂_B = ω_B + b_g + η_g

wherein a_B and ω_B are the true values of acceleration and angular velocity; b_a and b_g are the drift values of acceleration and angular velocity; η_a and η_g are the measurement noise; g_W is the gravity value; and q_WB is the rotation quaternion from INS coordinate system B to world coordinate system W;

integrating the acceleration and angular velocity data, the corresponding pose is calculated:

p_{B_{k+1}}^W = p_{B_k}^W + v_{B_k}^W Δt + ½ (q_{WB_k}(â_B - b_a) - g_W) Δt²
v_{B_{k+1}}^W = v_{B_k}^W + (q_{WB_k}(â_B - b_a) - g_W) Δt
q_{WB_{k+1}} = q_{WB_k} ⊗ q_{B_k B_{k+1}}

wherein p_{B_k}^W, v_{B_k}^W and q_{WB_k} respectively represent the position, velocity and rotation quaternion of B_k in the world coordinate system; Δt is the time difference between times B_{k+1} and B_k; q_{B_k B_t} represents the attitude at time B_t relative to time B_k;
3) GNSS/INS loose-combination pose solution: position and velocity serve as the basis of the loose combination. When the earth coordinate system is used as the navigation coordinate system, the integrated navigation system has 15 state parameters; the attitude, velocity, position, gyro drift and accelerometer zero offset are respectively taken as the state variables X_GI:

X_GI = [Θ, V, P, ε_b, ∇_b]^T

wherein P and V are the position and velocity vectors; Θ comprises the roll, pitch and yaw angles; ∇_b and ε_b respectively represent the accelerometer zero offset and the gyro drift along the carrier's direction of motion;

the system state and measurement equations are established as follows:

Ẋ_GI(t) = A_GI(t) X_GI(t) + W_GI(t)
Z_GI(t) = H_GI(t) X_GI(t) + w_GI(t)

wherein A_GI(t) is the state-equation coefficient matrix of the GNSS/INS loose combination; W_GI(t) is the state-equation system noise of the GNSS/INS loose combination; Z_GI(t) is the external measurement; H_GI(t) is the measurement-equation coefficient matrix of the GNSS/INS loose combination; w_GI(t) is the measurement noise of the GNSS/INS loose combination;
and finally, solving by utilizing Kalman filtering to obtain the pose of the carrier.
As a further improvement of the control method, step (III) performs GNSS-laser/INS semi-tight fusion pose estimation, with the following specific steps:

1) laser/INS tight-fusion pose estimation: the relative transformation of the carrier is calculated from the acceleration and angular velocity data acquired by the GNSS/INS combined positioning module and the feature information extracted from two consecutive point-cloud frames; b_k denotes the INS state at laser radar time k, and the local coordinate system is b_{k-1}. The error state is defined as

δX := [δP, δV, δθ, δb_a, δb_g, δg]

wherein the position, velocity and rotation quaternion of b_{k+1} relative to b_k are the nominal states; g^{b_k} is the gravity value in the local coordinate system at time b_k; δP, δV and δθ are the error values of position, velocity and the roll, pitch and yaw rotation angles; δb_a, δb_g and δg are the accelerometer zero-offset, gyro-drift and gravity errors;

when the current frame of INS measurement data is obtained, the error state propagates as

δX_{t+1} = F_t δX_t + G_t η_w

wherein η_w is Gaussian noise, F_t is the error-state transition matrix, and G_t is the noise Jacobian;
the state update can be regarded as an optimization problem, namely an optimization over the pose-state prior together with a residual function introduced from the observation model. First, a residual is calculated from the pose transformation and the observation equation:

min_{δX} ( ||δX||²_Σ + Σ_i ||r_i(δX)||²_M )

wherein Σ is the covariance of the error state δX; J is the Jacobian matrix; M is the covariance matrix of the measurement noise; the superscript l_k denotes the point-cloud data at time k, and l_{p_i} is a laser point;

next, the Jacobian J_k of the observation equation is calculated; the posterior of the error state is obtained according to the ESKF update process; the pose transformation is updated using the posterior of the error state; finally, it is judged whether the posterior of the error state approaches 0: if so, the iteration stops; otherwise the residual based on the pose transformation and the observation equation is recalculated and the iteration continues;

adding the solved error state to the priors, the final frame-to-frame transformation is obtained, wherein the priors are the position, velocity and rotation quaternion of b_{k+1} relative to b_k, the gravity value in the local coordinate system at time b_k, and the prior accelerometer zero offset -b_a and gyro drift -b_g;

finally, the coordinates in the global coordinate system are obtained, wherein the rotation and translation from frame b_k to frame b_{k+1} give the pose change, and the coordinate information of the global-coordinate origin in the frame b_{k+1} coordinate system is used;
2) GNSS-laser/INS loose-combination pose solution: the loose combination is a GNSS-laser/INS combination based on position and velocity. When the earth coordinate system is taken as the navigation coordinate system, the attitude, velocity, position, gyro drift and accelerometer zero offset are respectively taken as the state variables X_GLI. The system state and measurement equations are established as follows:

Ẋ_GLI(t) = A_GLI(t) X_GLI(t) + W_GLI(t)
Z_GLI(t) = H_GLI(t) X_GLI(t) + w_GLI(t)

wherein A_GLI(t) is the state-equation coefficient matrix of the GNSS-laser/INS loose combination; W_GLI(t) is the state-equation system noise of the GNSS-laser/INS loose combination; Z_GLI(t) is the external measurement of the GNSS-laser/INS loose combination; H_GLI(t) is the measurement-equation coefficient matrix of the GNSS-laser/INS loose combination; w_GLI(t) is the measurement noise of the GNSS-laser/INS loose combination;
and finally, solving by utilizing Kalman filtering to obtain the pose of the carrier.
As a further improvement of the control method, the 16-line laser radar adopts a Velodyne VLP-16 type laser radar.
As a further improvement of the control method, the GNSS/INS combined positioning module collects GNSS observation data, GNSS differential data, acceleration and angular velocity data.
Compared with the prior art, the beneficial effects are:
in an unknown environment, a carrier acquires information of a surrounding environment by using a handheld SLAM device to perform synchronous positioning and mapping, the handheld SLAM device considered by the inventor comprises a GNSS/INS combined positioning module, a GNSS columnar antenna, a 16-line laser radar, a color binocular camera, a processor and a flat panel display, and the miniaturization of equipment can be realized. Aiming at the condition that the existing EKF model laser radar observation constraint is easy to have low performance and even diverge, the EKF is replaced by the ESKF, the acceleration and angular velocity data in the GNSS/INS combined positioning module are fused, the pose estimation is optimized, and the GNSS positioning result in the GNSS/INS combined positioning module is loosely coupled with the GNSS positioning result to obtain the global pose under the GNSS coordinate system. The laser can measure the depth very directly, but cannot capture texture information in space, aiming at the problem that the depth of a three-dimensional point under a camera coordinate system cannot be obtained because feature point matching cannot be carried out in an environment with few textures in vision. And providing a mapping scheme based on square voxels under limited hardware resources, and performing dense reconstruction on a spatial plane subjected to point cloud coloring by using a color image.
Drawings
FIG. 1 is a schematic diagram of an embodiment of the handheld SLAM device of the present invention;
FIG. 2 is a design drawing of the hand-held SLAM device of the present invention;
fig. 3 is a flow chart of a SLAM method of fusing multiple sensors.
Description of the labeling:
1. a GNSS column antenna; 2. a color binocular camera; 3. a processor; 4. a handle; 5. a 16-line laser radar; 6. a flat panel display; 7. and the GNSS/INS combined positioning module.
Detailed Description
The invention is described in further detail below with reference to the following detailed description and accompanying drawings:
the invention provides a multi-sensor-fused handheld SLAM device and a control method thereof, and aims to solve the technical problems that the method is based on an Error State Kalman Filtering (ESKF) SLAM method, the method is provided by taking the advantages of a single sensor into consideration and aiming at the limitation of the single sensor, the pose transformation of a handheld terminal is obtained through calculation by fusing data of the multiple sensors, errors caused by incapability of positioning of a GNSS and rare characteristic points are reduced, the obtained motion track of the handheld terminal is smoother and more accurate, and the accuracy of map construction is improved.
As shown in fig. 1, a handheld SLAM device with integrated multiple sensors comprises a device body, a GNSS/INS combined positioning module 7, a GNSS column antenna 1, a 16-line laser radar 5, a color binocular camera 2, a processor 3 and a flat panel display 6. The GNSS column antenna 1 is connected with the GNSS/INS combined positioning module 7; the GNSS/INS combined positioning module 7, the 16-line laser radar 5, the color binocular camera 2 and the flat panel display 6 are all connected with the processor 3. The flat panel display 6 is arranged at the front end of the device body; the GNSS/INS combined positioning module 7 and the processor 3 are arranged inside the device body; the GNSS column antenna 1 and the 16-line laser radar 5 are arranged on the device body; the color binocular camera 2 is arranged at the rear side of the upper portion of the device body; and a handle 4 is provided at the bottom of the device body. Fig. 2 is a design diagram of the handheld SLAM device.
In the invention, under an unknown environment, a carrier obtains a laser point cloud and a color image of a surrounding environment through a 16-line laser radar and a color binocular camera in the moving process, the self motion transformation is calculated through registration of the laser point cloud, meanwhile, the motion data of the carrier is obtained through a GNSS columnar antenna and a GNSS/INS combined positioning module, and the data are subjected to fusion processing to obtain the position of the carrier. And then constructing a three-dimensional map of the surrounding environment by using the laser point cloud and the color image, and simultaneously performing visualization by using a flat panel display to realize synchronous positioning and map building of the carrier.
Pose estimation based on 16-line laser radar
The 16-line laser radar adopts a Velodyne VLP-16 type laser radar.
1. Point cloud segmentation. The laser point cloud is projected onto a range image, and whether each point is a ground point is judged by vertical angle and a threshold, so that ground points and non-ground points are segmented; the non-ground points are clustered with BFS (breadth-first search), and corresponding thresholds are set to remove dynamic noise points.
2. Feature extraction. The range image is divided horizontally into a number of sub-images. For a point k in the point cloud set at time t, the 5 points to its left and the 5 points to its right in the same scan row are found to construct a set S. The curvature C of each point is then

C = (1 / (|S| · ||R_k||)) · || Σ_{j∈S, j≠k} (R_j - R_k) ||

wherein R_k is the range of point k.

A threshold C_th is set for the calculated curvature sequence: a point whose curvature is larger than the threshold is an edge point; otherwise it is a planar point. From each row, the nF_me edge points with the largest C values that do not belong to the ground points are selected, forming a set F_me; the nF_mp planar points with the smallest C values, belonging to the ground points or segmented points of all rows, form a set F_mp. A second screening is then performed: from the set F_me, the nF_e edge points with the largest C values not belonging to the ground points are selected, forming a set F_e; from the set F_mp, the nF_p planar points with the smallest C values belonging to the ground points are selected, forming a set F_p.
3. Laser odometer. The current time is set as t and the previous time as t-1. The feature points F_e^t and F_p^t, together with F_me^{t-1} and F_mp^{t-1} at time t-1, are selected to construct point-to-line correspondences between F_e^t and F_me^{t-1} and point-to-plane correspondences between F_p^t and F_mp^{t-1}. LM optimization of the point-to-plane constraints yields [t_z, θ_roll, θ_pitch]; LM optimization of the point-to-line constraints, combined with [t_z, θ_roll, θ_pitch], then yields [t_x, t_y, θ_yaw]. Thus the pose parameters [t_x, t_y, t_z, θ_roll, θ_pitch, θ_yaw] in the carrier local coordinate system are obtained.
(II) position and orientation estimation based on GNSS/INS combined positioning module
The GNSS/INS combined positioning module collects GNSS observation data, GNSS differential data, acceleration data and angular speed data.
GNSS double-difference relative positioning (GNSS-RTK). And establishing a pseudo-range and carrier double-difference observation equation by utilizing the GNSS observation data and the GNSS differential data. Let the instantaneous coordinate of the satellite s be (X)s,Ys,Zs) The approximate coordinates of the station r areThe corrected number is (dx, dy, dz) and is linearized, which is abbreviated as:
where ∇Δ denotes the between-station, between-satellite double-difference operator; Δ denotes the between-satellite single-difference operator; s and r are the indices of the satellite and the station; j is the pseudo-range or carrier observation type (j = 1, 2, 3); P is the pseudo-range observation; ρ is the geometric distance between satellite and receiver; c is the speed of light in vacuum; δ_ion and δ_trop are the ionospheric and tropospheric refraction errors; λ_i is the carrier wavelength; φ is the carrier-phase observation and N the integer ambiguity; ε_P and ε_φ are the pseudo-range and carrier-phase observation noise.
The integer ambiguities are first resolved with the LAMBDA algorithm, after which the corrections (dx, dy, dz) are solved, determining the coordinates (X, Y, Z) of the carrier in the GNSS coordinate system:
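The double-difference construction that cancels the satellite and receiver clock errors can be sketched as follows (synthetic observation dictionaries keyed by satellite ID; the LAMBDA ambiguity search itself is not reproduced here):

```python
def single_diff(obs_rover, obs_base):
    """Between-station single difference: cancels the satellite clock error."""
    return {s: obs_rover[s] - obs_base[s]
            for s in obs_rover if s in obs_base}

def double_diff(obs_rover, obs_base, ref_sat):
    """Between-station, between-satellite double difference relative to a
    reference satellite: additionally cancels the receiver clock error,
    leaving the geometric term plus residual atmospheric errors and noise."""
    sd = single_diff(obs_rover, obs_base)
    return {s: sd[s] - sd[ref_sat] for s in sd if s != ref_sat}
```

With both differences applied, the remaining unknowns in the carrier-phase case are the baseline corrections (dx, dy, dz) and the integer ambiguities, which is exactly what the linearized equation above is solved for.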
2. Pose solution based on acceleration and angular velocity information. Let the INS coordinate system be B and the world coordinate system be W. The acceleration and angular velocity collected by the GNSS/INS combined positioning module are modeled as:
where a^B and ω^B are the true values of acceleration and angular velocity; b_a and b_g are the drift (bias) values of acceleration and angular velocity; η_a and η_g are measurement noise; g^W is the gravity vector; and q_WB is the rotation quaternion from the INS coordinate system B to the world coordinate system W.
Integrating the acceleration and angular velocity data yields the corresponding pose:
where p^W_{B_k}, v^W_{B_k} and q^W_{B_k} denote the position, velocity and rotation quaternion of B_k in the world coordinate system; Δt is the time difference between B_{k+1} and B_k; and q_{B_k B_t} represents the attitude at time B_t relative to time B_k.
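A discrete single-step version of the integration above might look like this (quaternions as [w, x, y, z]; the sign convention that the accelerometer measures specific force and g_w points along negative z is an assumption for illustration):

```python
import numpy as np

def quat_mul(q, r):
    """Hamilton product of quaternions [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def imu_step(p, v, q, acc_m, gyr_m, ba, bg, g_w, dt):
    """One strapdown step: remove the biases b_a, b_g, rotate the specific
    force into the world frame with q, add gravity g_w, then integrate
    position, velocity and attitude over dt."""
    a_w = quat_to_rot(q) @ (acc_m - ba) + g_w
    p_new = p + v * dt + 0.5 * a_w * dt * dt
    v_new = v + a_w * dt
    phi = (gyr_m - bg) * dt                  # rotation vector over dt
    angle = np.linalg.norm(phi)
    if angle > 1e-12:
        dq = np.concatenate([[np.cos(angle / 2)],
                             np.sin(angle / 2) * phi / angle])
    else:
        dq = np.array([1.0, 0.0, 0.0, 0.0])
    q_new = quat_mul(q, dq)
    return p_new, v_new, q_new / np.linalg.norm(q_new)
```

A stationary sensor (measuring only the reaction to gravity) leaves position, velocity and attitude unchanged, which is a useful sanity check on the sign convention.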
3. GNSS/INS loosely coupled pose solution. The loose combination is a position- and velocity-based GNSS/INS combination. With the Earth coordinate system as the navigation coordinate system, the integrated navigation system has 15 state parameters; the attitude, velocity and position, gyro drift, and accelerometer zero offset together form the state variable X_GI:
Wherein: p, V is a position and velocity matrix; theta is a roll angle, a pitch angle and a yaw angle;respectively, the accelerometer zero offset and the gyro drift along the direction of carrier motion.
The system state and measurement equations are established as follows:
Ẋ_GI(t) = A_GI(t)X_GI(t) + W_GI(t)

Z_GI(t) = H_GI(t)X_GI(t) + w_GI(t)  (12)
where A_GI(t) is the state-equation coefficient matrix of the loose GNSS/INS combination; W_GI(t) is the state-equation system noise of the loose GNSS/INS combination; Z_GI(t) is the external measurement; H_GI(t) is the measurement-equation coefficient matrix of the loose GNSS/INS combination; and w_GI(t) is the measurement noise of the loose GNSS/INS combination.
Finally, the pose of the carrier is obtained by solving with Kalman filtering.
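The Kalman measurement update used by the loose combination reduces to the standard equations; a minimal sketch, with the GNSS-derived position/velocity as the external measurement z:

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update: the external GNSS position and
    velocity measurement z corrects the INS-propagated state x."""
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x + K @ (z - H @ x)             # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P    # corrected covariance
    return x_new, P_new
```

In the 15-state formulation above, H selects the position and velocity components of X_GI; the gain K distributes the GNSS innovation over all coupled states, including the biases.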
(III) GNSS-laser/INS semi-tight fusion pose estimation
1. Laser/INS tightly coupled pose estimation. The relative transformation of the carrier is calculated from the acceleration and angular velocity data collected by the GNSS/INS combined positioning module and the feature information extracted from two consecutive point-cloud frames. Let b_k denote the INS state at lidar time k, the local coordinate system being b_{k-1}. The error state is defined as:
δX := [δP, δV, δθ, δb_a, δb_g, δg]  (14)
where p^{b_k}_{b_{k+1}}, v^{b_k}_{b_{k+1}} and q^{b_k}_{b_{k+1}} are the position, velocity and rotation quaternion of time b_{k+1} relative to time b_k; g^{b_k} is the gravity value in the local coordinate system at time b_k; δP, δV and δθ are the error values of position, velocity and rotation angle (roll, pitch, yaw); and δb_a, δb_g and δg are the accelerometer zero offset, gyro drift and gravity errors.
When the current frame of INS measurement data is obtained, the error state is:
where η_w is Gaussian noise, F_t is the error-state transition matrix, and G_t is the noise Jacobian.
The state update can be regarded as an optimization problem that jointly considers the pose-state prior and a residual function introduced by the observation model. First, the residual is calculated from the pose transformation and the observation equation:
where Σ is the covariance of the error state δX; J is a Jacobian matrix; M is the covariance matrix of the measurement noise; the superscript l_k denotes the point cloud at time k; and l_{pi} is a laser point-cloud point.
Next, the Jacobian J_k of the observation equation is calculated; the posterior of the error state is obtained through the ESKF update process; the pose transformation is updated with this posterior; finally, whether the error-state posterior approaches 0 is checked: if so, the iteration stops; otherwise, the residual based on the pose transformation and the observation equation is recalculated and the iteration continues.
Adding the solved error state to the prior yields the final frame-to-frame transformation:
where p̄^{b_k}_{b_{k+1}}, v̄^{b_k}_{b_{k+1}} and q̄^{b_k}_{b_{k+1}} are the prior position, velocity and rotation quaternion of time b_{k+1} relative to time b_k; ḡ^{b_k} is the prior gravity value in the local coordinate system at time b_k; and b̄_a, b̄_g are the prior accelerometer zero offset and gyro drift.
Finally, the coordinates in the global coordinate system are obtained:
where R and t are the attitude transformation from frame b_k to frame b_{k+1}, and the final term denotes the coordinates of the global origin in the b_{k+1} coordinate system.
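The iterate-until-the-error-state-vanishes loop described above is an iterated error-state update; a generic sketch (not the patent's exact Jacobians or covariances), which re-linearizes the observation h at each estimate and stops once the injected correction approaches zero:

```python
import numpy as np

def iterated_error_state_update(x0, P, z, h, H_fn, M, tol=1e-10, max_iter=20):
    """Iterated update: linearize the observation h at the current estimate,
    solve for the correction, inject it, and repeat until the correction
    (the error state) approaches zero."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        H = H_fn(x)                                   # observation Jacobian
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + M)  # gain
        x_next = x0 + K @ (z - h(x) - H @ (x0 - x))   # relinearized step
        done = np.linalg.norm(x_next - x) < tol
        x = x_next
        if done:
            break
    P_post = (np.eye(len(x)) - K @ H) @ P             # posterior covariance
    return x, P_post
```

For a linear observation the loop converges immediately to the ordinary Kalman update; the iteration only matters when h is nonlinear, as with the point-to-plane lidar residuals in the text.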
2. GNSS-laser/INS loosely coupled pose solution. The loose combination is a position- and velocity-based GNSS-laser/INS combination. With the Earth coordinate system as the navigation coordinate system, the attitude, velocity and position, gyro drift, and accelerometer zero offset form the state variable X_GLI:
The system state and measurement equations are established as follows:
Ẋ_GLI(t) = A_GLI(t)X_GLI(t) + W_GLI(t)

Z_GLI(t) = H_GLI(t)X_GLI(t) + w_GLI(t)  (22)
where A_GLI(t) is the state-equation coefficient matrix of the loose GNSS-laser/INS combination; W_GLI(t) is its state-equation system noise; Z_GLI(t) is the external measurement of the loose GNSS-laser/INS combination; H_GLI(t) is its measurement-equation coefficient matrix; and w_GLI(t) is its measurement noise.
Finally, the pose of the carrier is obtained by solving with Kalman filtering.
(IV) voxel-based visualization scheme
Since vision cannot perform feature-point matching in low-texture environments, the depth of a three-dimensional point in the camera coordinate system cannot be obtained there; laser, while measuring depth directly, cannot capture the texture information of the space. To exploit the complementary strengths of vision and laser, a map is built from cubic voxels under limited hardware resources: the voxel point cloud is colored using the color information of the camera image, and the minimum-unit voxel in space replaces the surface patch of traditional three-dimensional reconstruction. The points of the densified plane are mainly the laser points obtained by three rounds of region growing and the pixels corresponding to the plane in the image. First, each laser point is projected into the image to obtain a gray value; next, the image pixels are matched to a plane in the laser coordinate system; finally, the depth of each pixel coordinate successfully matched to the plane is calculated.
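The first step, projecting laser points into the image to fetch a gray value per voxel, can be sketched as follows (T_cam_lidar, K and the 0.1 m voxel size are assumed extrinsics, intrinsics and resolution for illustration; the plane-matching and depth steps are omitted):

```python
import numpy as np

def color_voxels(points_lidar, T_cam_lidar, K, image, voxel_size=0.1):
    """Project lidar points into a grayscale image and attach the mean pixel
    intensity to the voxel each point falls in. T_cam_lidar is a 4x4
    extrinsic transform, K a 3x3 pinhole intrinsic matrix."""
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]
    h, w = image.shape
    voxels = {}
    for p_l, p_c in zip(points_lidar, pts_cam):
        if p_c[2] <= 0:                      # behind the camera
            continue
        uv = K @ (p_c / p_c[2])              # pinhole projection
        u, v = int(round(uv[0])), int(round(uv[1]))
        if 0 <= u < w and 0 <= v < h:
            key = tuple(np.floor(p_l / voxel_size).astype(int))
            voxels.setdefault(key, []).append(float(image[v, u]))
    return {k: float(np.mean(vals)) for k, vals in voxels.items()}
```

Averaging all intensities that land in the same voxel is one simple way to color the minimum-unit voxel described in the text; a color image would carry three channels instead of one.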
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention in any way, but any modifications or equivalent variations made according to the technical spirit of the present invention are within the scope of the present invention as claimed.
Claims (8)
1. A handheld SLAM device integrating multiple sensors, comprising a device main body, a GNSS/INS combined positioning module (7), a GNSS columnar antenna (1), a 16-line laser radar (5), a color binocular camera (2), a processor (3) and a flat-panel display (6), characterized in that: the GNSS columnar antenna (1) is connected with the GNSS/INS combined positioning module (7); the GNSS/INS combined positioning module (7), the 16-line laser radar (5), the color binocular camera (2) and the flat-panel display (6) are all connected with the processor (3); the flat-panel display (6) is at the front end of the device main body; the GNSS/INS combined positioning module (7) and the processor (3) are inside the device main body; the GNSS columnar antenna (1) and the 16-line laser radar (5) are on the device main body; and the color binocular camera (2) is at the rear side of the upper portion of the device main body.
2. The multi-sensor-integrated handheld SLAM device of claim 1, wherein: a handle (4) is provided at the bottom of the device main body.
3. A control method of a handheld SLAM device fusing multiple sensors comprises the following specific steps:
estimating the pose based on the 16-line laser radar;
1) point cloud segmentation;
2) extracting characteristics;
3) a laser odometer;
(II) estimating the pose of the GNSS/INS combined positioning module;
1) GNSS double-difference relative positioning GNSS-RTK;
2) solving the pose based on the acceleration and angular velocity information;
3) solving the GNSS/INS loose combined pose;
(III) estimating a GNSS-laser/INS semi-tight fusion pose;
1) estimating a laser/INS tight fusion pose;
2) solving the GNSS-laser/INS loose combined pose;
(IV) a voxel-based visualization scheme;
because vision cannot perform feature-point matching in low-texture environments, the depth of a three-dimensional point in the camera coordinate system cannot be obtained; laser, while measuring depth directly, cannot capture the texture information of the space; to exploit the complementary strengths of vision and laser, a map is built from cubic voxels under limited hardware resources, the voxel point cloud is colored using the color information of the camera image, and the minimum-unit voxel in space replaces the surface patch of traditional three-dimensional reconstruction; the points of the densified plane are mainly the laser points obtained by three rounds of region growing and the pixels corresponding to the plane in the image; first, each laser point is projected into the image to obtain a gray value; then, the image pixels are matched to a plane in the laser coordinate system; and finally, the depth of each pixel coordinate successfully matched to the plane is calculated.
4. The method of claim 3, wherein step (I), pose estimation based on the 16-line laser radar, comprises the following specific steps:
1) point cloud segmentation: the laser point cloud is projected onto a range image, and ground points are separated from non-ground points by comparing the vertical angle with a threshold; the non-ground points are clustered using BFS (breadth-first search), and corresponding thresholds are set to remove dynamic noise points;
2) feature extraction: the range image is divided horizontally into a number of sub-images; for a point in the point cloud set at time t, the 5 points to its left and the 5 points to its right along the same scan line are collected to construct a set S; the curvature C of each point is then:

C = |Σ_{j∈S, j≠k} (R_j − R_k)| / (|S| · R_k)
where R_k is the range (distance) of point k;
a threshold C_th is set for the computed curvature sequence: a point whose curvature exceeds the threshold is an edge point, otherwise it is a plane point; from each row, the nF_me edge points with the largest C values that do not belong to ground points are selected to form the set F_me; the nF_mp plane points with the smallest C values, belonging to ground points or segmented points, are selected from each row to form the set F_mp; a second screening is then performed: from the set F_me, the nF_e edge points with the largest C values that do not belong to ground points are selected to form the set F_e; from the set F_mp, the nF_p plane points with the smallest C values that belong to ground points are selected to form the set F_p;
3) laser odometry: let the current time be t and the previous time be t-1; feature points from F_e^t and F_p^t are matched against F_me^{t-1} and F_mp^{t-1} at time t-1 to construct point-to-line correspondences for the edge points and point-to-plane correspondences for the plane points; LM optimization is first applied to the point-to-plane constraints to obtain [t_z, θ_roll, θ_pitch]; LM optimization is then applied to the point-to-line constraints, combined with [t_z, θ_roll, θ_pitch], to obtain [t_x, t_y, θ_yaw]; the pose parameters in the carrier local coordinate system are thus obtained: [t_x, t_y, t_z, θ_roll, θ_pitch, θ_yaw].
5. The method of claim 3, wherein step (II), pose estimation based on the GNSS/INS combined positioning module, comprises the following specific steps:
1) GNSS double-difference relative positioning (GNSS-RTK): a pseudo-range and carrier-phase double-difference observation equation is established from the GNSS observation data and the GNSS differential data; let the instantaneous coordinates of satellite s be (X^s, Y^s, Z^s), the approximate coordinates of station r be (X_r^0, Y_r^0, Z_r^0), and the corrections be (dx, dy, dz); after linearization, the equation is abbreviated as:
where ∇Δ denotes the between-station, between-satellite double-difference operator; Δ denotes the between-satellite single-difference operator; s and r are the indices of the satellite and the station; j is the pseudo-range or carrier observation type (j = 1, 2, 3); P is the pseudo-range observation; ρ is the geometric distance between satellite and receiver; c is the speed of light in vacuum; δ_ion and δ_trop are the ionospheric and tropospheric refraction errors; λ_i is the carrier wavelength; φ is the carrier-phase observation and N the integer ambiguity; ε_P and ε_φ are the pseudo-range and carrier-phase observation noise;
the integer ambiguities are first resolved with the LAMBDA algorithm, after which the corrections (dx, dy, dz) are solved, thereby determining the coordinates (X, Y, Z) of the carrier in the GNSS coordinate system:
2) pose solution based on acceleration and angular velocity information: let the INS coordinate system be B and the world coordinate system be W; the acceleration and angular velocity collected by the GNSS/INS combined positioning module are modeled as:
where a^B and ω^B are the true values of acceleration and angular velocity; b_a and b_g are the drift (bias) values of acceleration and angular velocity; η_a and η_g are measurement noise; g^W is the gravity vector; and q_WB is the rotation quaternion from the INS coordinate system B to the world coordinate system W;
integrating the acceleration and angular velocity data, and calculating a corresponding pose:
where p^W_{B_k}, v^W_{B_k} and q^W_{B_k} denote the position, velocity and rotation quaternion of B_k in the world coordinate system; Δt is the time difference between B_{k+1} and B_k; and q_{B_k B_t} represents the attitude at time B_t relative to time B_k;
3) GNSS/INS loosely coupled pose solution: the loose combination is based on position and velocity; with the Earth coordinate system as the navigation coordinate system, the integrated navigation system has 15 state parameters, with the attitude, velocity and position, gyro drift, and accelerometer zero offset as the state variable X_GI:
Wherein: p, V is a position and velocity matrix; theta is a roll angle, a pitch angle and a yaw angle;respectively representing the zero offset of the accelerometer and the gyro drift along the motion direction of the carrier;
the system state and measurement equations are established as follows:
Ẋ_GI(t) = A_GI(t)X_GI(t) + W_GI(t)

Z_GI(t) = H_GI(t)X_GI(t) + w_GI(t)
where A_GI(t) is the state-equation coefficient matrix of the loose GNSS/INS combination; W_GI(t) is the state-equation system noise of the loose GNSS/INS combination; Z_GI(t) is the external measurement; H_GI(t) is the measurement-equation coefficient matrix of the loose GNSS/INS combination; and w_GI(t) is the measurement noise of the loose GNSS/INS combination;
and finally, solving by utilizing Kalman filtering to obtain the pose of the carrier.
6. The method of claim 3, wherein step (III), GNSS-laser/INS semi-tight fusion pose estimation, comprises the following specific steps:
1) laser/INS tight fusion pose estimation: the relative transformation of the carrier is calculated from the acceleration and angular velocity data collected by the GNSS/INS combined positioning module and the feature information extracted from two consecutive point-cloud frames; let b_k denote the INS state at lidar time k, the local coordinate system being b_{k-1}; the error state is defined as:
δX := [δP, δV, δθ, δb_a, δb_g, δg]
where p^{b_k}_{b_{k+1}}, v^{b_k}_{b_{k+1}} and q^{b_k}_{b_{k+1}} are the position, velocity and rotation quaternion of time b_{k+1} relative to time b_k; g^{b_k} is the gravity value in the local coordinate system at time b_k; δP, δV and δθ are the error values of position, velocity and rotation angle (roll, pitch, yaw); and δb_a, δb_g and δg are the accelerometer zero offset, gyro drift and gravity errors;
when the current frame of INS measurement data is obtained, the error state is:
where η_w is Gaussian noise, F_t is the error-state transition matrix, and G_t is the noise Jacobian;
the state update can be regarded as an optimization problem that jointly considers the pose-state prior and a residual function introduced by the observation model; first, the residual is calculated from the pose transformation and the observation equation:
where Σ is the covariance of the error state δX; J is a Jacobian matrix; M is the covariance matrix of the measurement noise; the superscript l_k denotes the point cloud at time k; and l_{pi} is a laser point-cloud point;
next, the Jacobian J_k of the observation equation is calculated; the posterior of the error state is obtained through the ESKF update process; the pose transformation is updated with this posterior; finally, whether the error-state posterior approaches 0 is checked: if so, the iteration stops; otherwise, the residual based on the pose transformation and the observation equation is recalculated and the iteration continues;
adding the solved error state to the prior yields the final frame-to-frame transformation:
Wherein:respectively b arek+1Relative moment of time and bkA priori position, velocity and rotational quaternion of the moment;is b iskA priori gravity value under a moment local coordinate system;-ba、-bgzero-offset and gyro drift of a priori accelerometer;
finally, the coordinates in the global coordinate system are obtained:
where R and t are the attitude transformation from frame b_k to frame b_{k+1}, and the final term denotes the coordinates of the global origin in the b_{k+1} coordinate system;
2) GNSS-laser/INS loosely coupled pose solution: the loose combination is a position- and velocity-based GNSS-laser/INS combination; with the Earth coordinate system as the navigation coordinate system, the attitude, velocity and position, gyro drift, and accelerometer zero offset form the state variable X_GLI:
The system state and measurement equations are established as follows:
Ẋ_GLI(t) = A_GLI(t)X_GLI(t) + W_GLI(t)

Z_GLI(t) = H_GLI(t)X_GLI(t) + w_GLI(t)
where A_GLI(t) is the state-equation coefficient matrix of the loose GNSS-laser/INS combination; W_GLI(t) is its state-equation system noise; Z_GLI(t) is the external measurement of the loose GNSS-laser/INS combination; H_GLI(t) is its measurement-equation coefficient matrix; and w_GLI(t) is its measurement noise;
and finally, solving by utilizing Kalman filtering to obtain the pose of the carrier.
7. The method of claim 3, wherein: the 16-line laser radar is a Velodyne VLP-16 laser radar.
8. The method of claim 3, wherein: the GNSS/INS combined positioning module collects GNSS observation data, GNSS differential data, acceleration data and angular velocity data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110166885.2A CN112987065B (en) | 2021-02-04 | 2021-02-04 | Multi-sensor-integrated handheld SLAM device and control method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110166885.2A CN112987065B (en) | 2021-02-04 | 2021-02-04 | Multi-sensor-integrated handheld SLAM device and control method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112987065A true CN112987065A (en) | 2021-06-18 |
CN112987065B CN112987065B (en) | 2024-01-12 |
Family
ID=76348702
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110166885.2A Active CN112987065B (en) | 2021-02-04 | 2021-02-04 | Multi-sensor-integrated handheld SLAM device and control method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112987065B (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113534227A (en) * | 2021-07-26 | 2021-10-22 | 中国电子科技集团公司第五十四研究所 | Multi-sensor fusion absolute positioning method suitable for complex non-cooperative scene |
CN113820735A (en) * | 2021-08-31 | 2021-12-21 | 上海华测导航技术股份有限公司 | Method for determining position information, position measuring device, terminal, and storage medium |
CN114396943A (en) * | 2022-01-12 | 2022-04-26 | 国家电网有限公司 | Fusion positioning method and terminal |
CN114463433A (en) * | 2021-12-21 | 2022-05-10 | 上海交通大学深圳研究院 | In-vivo SLAM method and system based on fusion of optical fiber strain sensor and monocular camera |
CN115267725A (en) * | 2022-09-27 | 2022-11-01 | 上海仙工智能科技有限公司 | Drawing establishing method and device based on single-line laser radar and storage medium |
CN116540286A (en) * | 2023-07-06 | 2023-08-04 | 中国科学院空天信息创新研究院 | Multi-source robust fusion positioning method based on non-integrity constraint |
CN117168441A (en) * | 2023-11-02 | 2023-12-05 | 西安因诺航空科技有限公司 | Multi-sensor fusion SLAM positioning and reconstructing method and system |
CN117351140A (en) * | 2023-09-15 | 2024-01-05 | 中国科学院自动化研究所 | Three-dimensional reconstruction method, device and equipment integrating panoramic camera and laser radar |
CN117871943A (en) * | 2024-03-13 | 2024-04-12 | 保定朗信电子科技有限公司 | Split type high-voltage electric energy metering method and system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109597095A (en) * | 2018-11-12 | 2019-04-09 | 北京大学 | Backpack type 3 D laser scanning and three-dimensional imaging combined system and data capture method |
CN111045017A (en) * | 2019-12-20 | 2020-04-21 | 成都理工大学 | Method for constructing transformer substation map of inspection robot by fusing laser and vision |
CN111929699A (en) * | 2020-07-21 | 2020-11-13 | 北京建筑大学 | Laser radar inertial navigation odometer considering dynamic obstacles and mapping method and system |
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109597095A (en) * | 2018-11-12 | 2019-04-09 | 北京大学 | Backpack type 3 D laser scanning and three-dimensional imaging combined system and data capture method |
CN111045017A (en) * | 2019-12-20 | 2020-04-21 | 成都理工大学 | Method for constructing transformer substation map of inspection robot by fusing laser and vision |
CN111929699A (en) * | 2020-07-21 | 2020-11-13 | 北京建筑大学 | Laser radar inertial navigation odometer considering dynamic obstacles and mapping method and system |
Non-Patent Citations (5)
Title |
---|
BO ZHOU等: "A LiDAR Odometry for Outdoor Mobile Robots Using NDT Based Scan Matching in GPS-denied environments", 《2017 IEEE 7TH ANNUAL INTERNATIONAL CONFERENCE ON CYBER TECHNOLOGY IN AUTOMATION, CONTROL, AND INTELLIGENT SYSTEMS》 * |
HAOYU ZHOU 等: "Lidar/UWB Fusion Based SLAM With Anti-Degeneration Capability", 《 IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY》 * |
WEIKUN ZHEN等: "Robust localization and localizability estimation with a rotating laser scanner", 《2017 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION》 * |
张浩峰; 赵春霞; 王晓丽; 李白云; 成伟明; 刘华军: "Visual simulation system for mobile robots oriented to outdoor natural environments", Journal of System Simulation, no. 03 *
郝刚涛; 杜小平; 赵继广; 宋建军: "Relative pose estimation of non-cooperative targets based on fusion of a monocular camera and scannerless 3D lidar", Journal of Astronautics, no. 10 *
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113534227B (en) * | 2021-07-26 | 2022-07-01 | 中国电子科技集团公司第五十四研究所 | Multi-sensor fusion absolute positioning method suitable for complex non-cooperative scene |
CN113534227A (en) * | 2021-07-26 | 2021-10-22 | 中国电子科技集团公司第五十四研究所 | Multi-sensor fusion absolute positioning method suitable for complex non-cooperative scene |
CN113820735A (en) * | 2021-08-31 | 2021-12-21 | 上海华测导航技术股份有限公司 | Method for determining position information, position measuring device, terminal, and storage medium |
CN113820735B (en) * | 2021-08-31 | 2023-12-01 | 上海华测导航技术股份有限公司 | Determination method of position information, position measurement device, terminal and storage medium |
CN114463433A (en) * | 2021-12-21 | 2022-05-10 | 上海交通大学深圳研究院 | In-vivo SLAM method and system based on fusion of optical fiber strain sensor and monocular camera |
CN114463433B (en) * | 2021-12-21 | 2024-09-20 | 上海交通大学深圳研究院 | In-vivo SLAM method and system based on fusion of optical fiber strain sensor and monocular camera |
CN114396943B (en) * | 2022-01-12 | 2024-07-23 | 国家电网有限公司 | Fusion positioning method and terminal |
CN114396943A (en) * | 2022-01-12 | 2022-04-26 | 国家电网有限公司 | Fusion positioning method and terminal |
CN115267725A (en) * | 2022-09-27 | 2022-11-01 | 上海仙工智能科技有限公司 | Drawing establishing method and device based on single-line laser radar and storage medium |
CN115267725B (en) * | 2022-09-27 | 2023-01-31 | 上海仙工智能科技有限公司 | Drawing establishing method and device based on single-line laser radar and storage medium |
CN116540286B (en) * | 2023-07-06 | 2023-08-29 | 中国科学院空天信息创新研究院 | Multi-source robust fusion positioning method based on non-integrity constraint |
CN116540286A (en) * | 2023-07-06 | 2023-08-04 | 中国科学院空天信息创新研究院 | Multi-source robust fusion positioning method based on non-integrity constraint |
CN117351140A (en) * | 2023-09-15 | 2024-01-05 | 中国科学院自动化研究所 | Three-dimensional reconstruction method, device and equipment integrating panoramic camera and laser radar |
CN117351140B (en) * | 2023-09-15 | 2024-04-05 | 中国科学院自动化研究所 | Three-dimensional reconstruction method, device and equipment integrating panoramic camera and laser radar |
CN117168441B (en) * | 2023-11-02 | 2024-02-20 | 西安因诺航空科技有限公司 | Multi-sensor fusion SLAM positioning and reconstructing method and system |
CN117168441A (en) * | 2023-11-02 | 2023-12-05 | 西安因诺航空科技有限公司 | Multi-sensor fusion SLAM positioning and reconstructing method and system |
CN117871943A (en) * | 2024-03-13 | 2024-04-12 | 保定朗信电子科技有限公司 | Split type high-voltage electric energy metering method and system |
CN117871943B (en) * | 2024-03-13 | 2024-05-24 | 保定朗信电子科技有限公司 | Split type high-voltage electric energy metering method and system |
Also Published As
Publication number | Publication date |
---|---|
CN112987065B (en) | 2024-01-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112987065B (en) | Multi-sensor-integrated handheld SLAM device and control method thereof | |
CN113781582B (en) | Synchronous positioning and map creation method based on laser radar and inertial navigation combined calibration | |
Maaref et al. | Lane-level localization and mapping in GNSS-challenged environments by fusing lidar data and cellular pseudoranges | |
CN111457902B (en) | Water area measuring method and system based on laser SLAM positioning | |
CN113899375B (en) | Vehicle positioning method and device, storage medium and electronic equipment | |
CN113820735B (en) | Determination method of position information, position measurement device, terminal and storage medium | |
CN115407357B (en) | Low-harness laser radar-IMU-RTK positioning mapping algorithm based on large scene | |
CN107289910B (en) | Optical flow positioning system based on TOF | |
CN114444158B (en) | Underground roadway deformation early warning method and system based on three-dimensional reconstruction | |
WO2020041668A1 (en) | Signals of opportunity aided inertial navigation | |
CN103033836B (en) | navigation pointing method of vehicle navigation pointing device | |
CN112967392A (en) | Large-scale park mapping and positioning method based on multi-sensor contact | |
CN114119886A (en) | High-precision map point cloud reconstruction method and device, vehicle, equipment and storage medium | |
CN115272596A (en) | Multi-sensor fusion SLAM method oriented to monotonous texture-free large scene | |
CN115183762A (en) | Airport warehouse inside and outside mapping method, system, electronic equipment and medium | |
CN116608873A (en) | Multi-sensor fusion positioning mapping method for automatic driving vehicle | |
CN114488094A (en) | Vehicle-mounted multi-line laser radar and IMU external parameter automatic calibration method and device | |
CN113960614A (en) | Elevation map construction method based on frame-map matching | |
CN113959437A (en) | Measuring method and system for mobile measuring equipment | |
Khoshelham et al. | Vehicle positioning in the absence of GNSS signals: Potential of visual-inertial odometry | |
CN118168545A (en) | Positioning navigation system and method for weeding robot based on multi-source sensor fusion | |
CN115435779A (en) | Intelligent body pose estimation method based on GNSS/IMU/optical flow information fusion | |
Wu et al. | AFLI-Calib: Robust LiDAR-IMU extrinsic self-calibration based on adaptive frame length LiDAR odometry | |
Grejner-Brzezinska et al. | From Mobile Mapping to Telegeoinformatics | |
CN111197986B (en) | Real-time early warning and obstacle avoidance method for three-dimensional path of unmanned aerial vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||