CN112987065A - Handheld SLAM device integrating multiple sensors and control method thereof


Info

Publication number
CN112987065A
CN112987065A (application CN202110166885.2A)
Authority
CN
China
Prior art keywords
gnss, ins, laser, coordinate system, pose
Prior art date
Legal status
Granted
Application number
CN202110166885.2A
Other languages
Chinese (zh)
Other versions
CN112987065B (en)
Inventor
严超 (Yan Chao)
王庆 (Wang Qing)
刘玉 (Liu Yu)
张昊 (Zhang Hao)
李明 (Li Ming)
许九婧 (Xu Jiujing)
张波 (Zhang Bo)
Current Assignee
Southeast University
Beijing Research Center for Information Technology in Agriculture
Original Assignee
Southeast University
Beijing Research Center for Information Technology in Agriculture
Priority date
Filing date
Publication date
Application filed by Southeast University, Beijing Research Center for Information Technology in Agriculture filed Critical Southeast University
Priority to CN202110166885.2A priority Critical patent/CN112987065B/en
Publication of CN112987065A publication Critical patent/CN112987065A/en
Application granted granted Critical
Publication of CN112987065B publication Critical patent/CN112987065B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42Determining position
    • G01S19/48Determining position by combining or switching between position solutions derived from the satellite radio beacon positioning system and position solutions derived from a further system

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A handheld SLAM device fusing multiple sensors and a control method thereof. The device comprises a GNSS/INS integrated positioning module, a GNSS columnar antenna, a 16-line laser radar, a color binocular camera, a processor and a flat-panel display. The GNSS columnar antenna is connected to the GNSS/INS integrated positioning module; the GNSS/INS integrated positioning module, the 16-line laser radar, the color binocular camera and the flat-panel display are all connected to the processor, so that a miniaturized device can be realized. Vision cannot match feature points in low-texture environments, so the depth of a three-dimensional point in the camera coordinate system cannot be obtained; laser can measure depth directly but cannot capture texture information in the scene. Addressing both limitations, the invention provides a mapping scheme based on cubic voxels under limited hardware resources and performs dense reconstruction of spatial planes, coloring the point cloud with the color images.

Description

Handheld SLAM device integrating multiple sensors and control method thereof
Technical Field
The invention relates to the field of simultaneous localization and mapping for handheld terminals fusing multiple sensors, and in particular to a handheld SLAM device fusing multiple sensors and a control method thereof.
Background
At present, land resource surveys rely mainly on satellite remote-sensing data, assisted by field investigation. With the development of remote-sensing technology, high-resolution remote-sensing data plays an increasingly important role and has proved to be the most direct and effective approach. The work is based chiefly on remote sensing: land-use data of the same area at different epochs are overlaid and compared to detect surface change, and the dynamic characteristics and future trends of land use are analyzed in terms of time, space, quantity and quality. Satellite remote sensing, however, has the following shortcomings for village and town land surveys: 1) the larger the coverage area, the less distinct the local detail in the imagery; 2) remote-sensing satellite resources are currently limited, the pass-over period and time are usually fixed, the frequency of Earth observation is relatively low, and timely imagery cannot be provided for sudden events; 3) the acquired data are mainly two-dimensional plane data and cannot meet the needs of land-resource survey management or village and town planning and construction. A handheld SLAM (Simultaneous Localization and Mapping) device is a powerful complement to satellite remote-sensing measurement: it is flexible, efficient, fast, fine-grained and accurate, with low operating cost, wide applicability and short production cycles, and it offers clear advantages for rapidly acquiring three-dimensional data over small areas and in environments of complex structure.
For a handheld SLAM device, the key problems are positioning and environment perception at the mobile terminal. The Global Navigation Satellite System (GNSS), the infrastructure of PNT (Positioning, Navigation and Timing) services, is currently the most widely used positioning technology. However, satellite navigation signals are weak, easily blocked and easily spoofed, their anti-interference capability is poor, and GNSS receivers output data at a low rate. In the complex environments of villages and small towns in particular, dense buildings and trees often interrupt GNSS positioning output through signal occlusion. To compensate for the shortcomings of GNSS-only positioning, GNSS is usually combined with an Inertial Navigation System (INS) into a GNSS/INS integrated positioning system. The INS needs no optical, electrical or magnetic link with the outside world, is unaffected by meteorological conditions, completes the navigation task entirely with equipment on the moving carrier, can provide complete navigation parameters, and maintains continuous positioning output when GNSS signals are blocked. Its biggest disadvantage is error accumulation: positioning accuracy degrades continuously as the positioning process runs. A high-precision INS can hold its accuracy for a long time, but its price and cost are high and hard to reconcile with low-cost civilian use. LiDAR (Light Detection and Ranging) has good relative-positioning performance, diverges more slowly than the INS, effectively suppresses the accumulation of navigation errors, and complements GNSS well in its environmental requirements; but the environment information it acquires is somewhat lacking in detail, and it cannot capture the color of the environment. Compared with other sensors, a vision sensor can acquire rich environmental detail, depict the appearance and shape of objects, read signs and so on, helping the carrier achieve environmental cognition in ways the other sensors cannot, and it has benefited from the rapid progress of digital image processing and computer hardware in recent years. Vision, however, is strongly affected by environmental and external factors, such as insufficient light and weather-reduced visibility; in the complex environments of villages and towns in particular, image computation is heavy and the algorithms are difficult to realize. Manufacturers at home and abroad have produced integrated products: the BLK2GO handheld scanner of the Leica Geosystems measuring division under Hexagon integrates a laser radar and a panoramic camera, but is mainly applied indoors; a backpack-type system integrates GNSS, an IMU, two 16-line Velodyne laser radars and four panoramic cameras, and the domestic company GreenValley offers a product of the same type, the LiBackpack DGC50. Such equipment, however, is very expensive, poorly popularized and bulky to carry, and its sensors are only physically integrated, without fusion at the data level.
Disclosure of Invention
In order to solve these problems, the invention provides a multi-sensor-fused handheld SLAM device and a control method thereof, and proposes a multi-sensor SLAM method based on Error-State Kalman Filtering (ESKF) that exploits the advantages of each individual sensor while addressing its limitations.
The invention provides a multi-sensor-fused handheld SLAM device comprising a device body, a GNSS/INS integrated positioning module, a GNSS columnar antenna, a 16-line laser radar, a color binocular camera, a processor and a flat-panel display. The GNSS columnar antenna is connected to the GNSS/INS integrated positioning module; the GNSS/INS integrated positioning module, the 16-line laser radar, the color binocular camera and the flat-panel display are all connected to the processor. The flat-panel display is mounted at the front end of the device body, the GNSS/INS integrated positioning module and the processor are housed inside the device body, the GNSS columnar antenna and the 16-line laser radar are mounted on the device body, and the color binocular camera is mounted at the rear of the upper part of the device body.
As a further improvement of the structure of the invention, the bottom of the device main body is provided with a handle.
The invention provides a control method of a handheld SLAM device fusing multiple sensors, which comprises the following specific steps:
(I) estimating the pose based on the 16-line laser radar;
1) point cloud segmentation;
2) extracting characteristics;
3) a laser odometer;
(II) estimating the pose based on the GNSS/INS combined positioning module;
1) GNSS double-difference relative positioning GNSS-RTK;
2) solving the pose based on the acceleration and angular velocity information;
3) solving the GNSS/INS loose combined pose;
(III) estimating a GNSS-laser/INS semi-tight fusion pose;
1) estimating a laser/INS tight fusion pose;
2) solving the GNSS-laser/INS loosely-coupled pose;
(IV) a voxel-based visualization scheme;
Vision cannot match feature points in low-texture environments, so the depth of a three-dimensional point in the camera coordinate system cannot be obtained; laser can measure depth directly but cannot capture texture information in the scene. Addressing these complementary strengths and weaknesses, the map is built from cubic voxels under limited hardware resources, and the voxel point cloud is colored with the color information of the color image; the voxel, as the minimum spatial unit, replaces the surface patch of traditional three-dimensional reconstruction. The points of the densified plane are mainly the laser points obtained by three rounds of region growing and the pixels corresponding to that plane in the image. First, the laser points are projected into the image to obtain gray values; then, the image pixels are matched to planes in the laser coordinate system; finally, the depth of the pixel coordinates successfully matched to a plane is calculated.
As a further improvement of the control method, step (I), pose estimation based on the 16-line laser radar, proceeds as follows:
1) point cloud segmentation: the laser point cloud is projected onto a range image, and vertical-angle tests against a threshold determine whether each point is a ground point, segmenting ground from non-ground points; the non-ground points are clustered with breadth-first search (BFS), and appropriate thresholds are set to remove dynamic noise points;
2) feature extraction: the range image is divided horizontally into several sub-images. For a point in the point cloud at time t, the 5 points on each side of it in the same row are collected into a set S. The curvature C of each point is:

$$C = \frac{1}{|S|\cdot\left\|R_k\right\|}\left\|\sum_{j\in S,\,j\neq k}\left(R_j - R_k\right)\right\|$$

where $R_k$ is the range (distance) of point k.

A threshold $C_{th}$ is set over the sequence of computed curvature values: points whose curvature exceeds the threshold are edge points; otherwise they are planar points. From each row, the $nF_{me}$ edge points with the largest C values not belonging to the ground are selected, forming the set $F_{me}$; the $nF_{mp}$ planar points with the smallest C values, belonging to ground points or segmented points of all rows, are selected, forming the set $F_{mp}$. A second screening is then performed: from $F_{me}$, the $nF_e$ edge points with the largest C values not belonging to the ground form the set $F_e$; from $F_{mp}$, the $nF_p$ planar points with the smallest C values belonging to the ground form the set $F_p$.
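As an illustration of this feature-extraction step, the following is a minimal sketch (not part of the patent) that computes the curvature of each point in one row of a range image and splits the points into edge and planar candidates by a threshold; the function name, the window of 5 neighbors per side and the threshold value are assumptions chosen for illustration.

```python
import numpy as np

def classify_features(ranges, c_th=0.1, window=5):
    """Split one row of range-image points into edge/planar candidates.

    ranges : 1-D array of point ranges R_k along one scan row.
    Curvature follows C = ||sum_{j in S, j != k}(R_j - R_k)|| / (|S| * ||R_k||),
    where S holds the `window` points on each side of point k.
    """
    edge_idx, planar_idx = [], []
    for k in range(window, len(ranges) - window):
        s = np.r_[ranges[k - window:k], ranges[k + 1:k + 1 + window]]
        c = abs(np.sum(s - ranges[k])) / (len(s) * abs(ranges[k]))
        (edge_idx if c > c_th else planar_idx).append(k)
    return edge_idx, planar_idx
```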
3) laser odometry: let the current time be t and the previous time be t−1. The feature points $F^t_e$ and $F^t_p$ are selected and matched with $F^{t-1}_{me}$ and $F^{t-1}_{mp}$ from time t−1, constructing point-to-line correspondences for $F^t_e$ and point-to-plane correspondences for $F^t_p$. LM optimization over the point-to-plane constraints of $F^t_p$ yields $[t_z, \theta_{roll}, \theta_{pitch}]$; LM optimization over the point-to-line constraints of $F^t_e$, combined with $[t_z, \theta_{roll}, \theta_{pitch}]$, then yields $[t_x, t_y, \theta_{yaw}]$. The pose parameters $[t_x, t_y, t_z, \theta_{roll}, \theta_{pitch}, \theta_{yaw}]$ in the carrier's local coordinate system are thus obtained.
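The two-stage LM estimation can be prototyped with scipy's Levenberg-Marquardt solver. The sketch below is a simplified illustration under stated assumptions, not the patent's implementation: correspondences are assumed to be given as (feature point, matched plane point and unit normal) and (feature point, matched line point and unit direction) pairs, and the correspondence search against $F^{t-1}_{me}$ / $F^{t-1}_{mp}$ is omitted.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R

def _move(pts, tx, ty, tz, roll, pitch, yaw):
    # Apply a rigid transform parameterised as [t_x, t_y, t_z, roll, pitch, yaw].
    return R.from_euler("xyz", [roll, pitch, yaw]).apply(pts) + [tx, ty, tz]

def _plane_res(p, planar_pts, plane_pts, plane_normals):
    # Point-to-plane distances; only [t_z, roll, pitch] vary in stage 1.
    tz, roll, pitch = p
    moved = _move(planar_pts, 0.0, 0.0, tz, roll, pitch, 0.0)
    return np.einsum("ij,ij->i", moved - plane_pts, plane_normals)

def _edge_res(p, fixed, edge_pts, line_pts, line_dirs):
    # Point-to-line distances; [t_x, t_y, yaw] vary, stage-1 result held fixed.
    tx, ty, yaw = p
    tz, roll, pitch = fixed
    moved = _move(edge_pts, tx, ty, tz, roll, pitch, yaw)
    return np.linalg.norm(np.cross(moved - line_pts, line_dirs), axis=1)

def two_stage_lm(planar_pts, plane_pts, plane_normals, edge_pts, line_pts, line_dirs):
    s1 = least_squares(_plane_res, np.zeros(3), method="lm",
                       args=(planar_pts, plane_pts, plane_normals))
    s2 = least_squares(_edge_res, np.zeros(3), method="lm",
                       args=(s1.x, edge_pts, line_pts, line_dirs))
    tx, ty, yaw = s2.x
    tz, roll, pitch = s1.x
    return np.array([tx, ty, tz, roll, pitch, yaw])
```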
As a further improvement of the control method, step (II), pose estimation based on the GNSS/INS combined positioning module, proceeds as follows:
1) GNSS double-difference relative positioning (GNSS-RTK): pseudo-range and carrier double-difference observation equations are established from the GNSS observation data and the GNSS differential data. Let the instantaneous coordinates of satellite s be $(X^s, Y^s, Z^s)$ and the approximate coordinates of station r be $(X^0_r, Y^0_r, Z^0_r)$ with corrections $(dx, dy, dz)$. After linearization, the equations are abbreviated as:

$$\nabla\Delta P_j = \nabla\Delta\rho + \nabla\Delta\delta_{ion} + \nabla\Delta\delta_{trop} + \varepsilon_P$$

$$\lambda_j\,\nabla\Delta\varphi_j = \nabla\Delta\rho + \lambda_j\,\nabla\Delta N_j - \nabla\Delta\delta_{ion} + \nabla\Delta\delta_{trop} + \varepsilon_\varphi$$

where $\nabla\Delta$ is the between-station, between-satellite double-difference operator; $\Delta$ is the between-satellite single-difference operator; s and r are the serial numbers of the satellite and the station; j is the pseudo-range or carrier observation type (j = 1, 2, 3); P is the pseudo-range observation; $\rho$ is the geometric distance between satellite and receiver, $\rho = \sqrt{(X^s - X_r)^2 + (Y^s - Y_r)^2 + (Z^s - Z_r)^2}$; c is the speed of light in vacuum; $\delta_{ion}$ and $\delta_{trop}$ are the ionospheric and tropospheric refraction errors; $\lambda_j$ is the carrier wavelength; $\varphi$ and N are the carrier-phase observation and the integer ambiguity; $\varepsilon_P$ and $\varepsilon_\varphi$ are the pseudo-range and carrier-phase observation noise.

The integer ambiguities are first resolved with the LAMBDA algorithm, and the corrections $(dx, dy, dz)$ are then solved, determining the coordinates $(X, Y, Z)$ of the carrier in the GNSS coordinate system:

$$X = X^0_r + dx,\qquad Y = Y^0_r + dy,\qquad Z = Z^0_r + dz$$
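To make the double-difference construction concrete, here is a minimal sketch (an illustration, not the patent's code) that forms between-station, between-satellite double differences from raw pseudo-ranges; the dict-based interface and the choice of a reference satellite are assumptions.

```python
import numpy as np

def double_difference(obs_rover, obs_base, ref_sat, other_sats):
    """Form between-station, between-satellite pseudo-range double differences.

    obs_rover, obs_base : dicts mapping satellite id -> pseudo-range (m).
    Differencing between stations cancels the satellite clock error;
    differencing again against the reference satellite cancels the
    receiver clock error, leaving geometry, atmosphere and noise.
    """
    sd = {s: obs_rover[s] - obs_base[s] for s in [ref_sat, *other_sats]}
    return np.array([sd[s] - sd[ref_sat] for s in other_sats])
```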
2) pose solution based on acceleration and angular-velocity information: first, let B be the INS coordinate system and W the world coordinate system. The acceleration $\hat{a}_B$ and angular velocity $\hat{\omega}_B$ collected by the GNSS/INS combined positioning module are modeled as:

$$\hat{a}_B = a_B + b_a + q_{WB}^{-1}\,g_W + \eta_a$$

$$\hat{\omega}_B = \omega_B + b_g + \eta_g$$

where $a_B$ and $\omega_B$ are the true values of acceleration and angular velocity; $b_a$ and $b_g$ are the drift (bias) values of acceleration and angular velocity; $\eta_a$ and $\eta_g$ are measurement noise; $g_W$ is the gravity value; and $q_{WB}$ is the rotation quaternion from INS frame B to world frame W.

Integrating the acceleration and angular-velocity data yields the corresponding pose:

$$P^W_{B_{k+1}} = P^W_{B_k} + V^W_{B_k}\,\Delta t + \iint_{t\in[k,k+1]}\left(q_{WB_t}\left(\hat{a}_{B_t}-b_a\right) - g_W\right)\,dt^2$$

$$V^W_{B_{k+1}} = V^W_{B_k} + \int_{t\in[k,k+1]}\left(q_{WB_t}\left(\hat{a}_{B_t}-b_a\right) - g_W\right)\,dt$$

$$q_{WB_{k+1}} = q_{WB_k}\otimes\int_{t\in[k,k+1]}\frac{1}{2}\,q_{B_kB_t}\otimes\begin{bmatrix}0\\\hat{\omega}_{B_t}-b_g\end{bmatrix}\,dt$$

where $P^W_{B_k}$, $V^W_{B_k}$ and $q_{WB_k}$ are the position, velocity and rotation quaternion of $B_k$ in the world coordinate system; $\Delta t$ is the time difference between $B_{k+1}$ and $B_k$; and $q_{B_kB_t}$ represents the attitude of time $B_t$ relative to time $B_k$;
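A discrete, first-order implementation of the above integration might look as follows. This is a sketch under the stated sign convention for $g_W$, with biases assumed known and measurement noise ignored; it is not taken from the patent.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def ins_propagate(p, v, q, a_hat, w_hat, b_a, b_g, g_w, dt):
    """One discrete propagation step of the INS equations above.

    p, v : position / velocity in the world frame W.
    q    : scipy Rotation holding q_WB (body B -> world W).
    a_hat, w_hat : measured acceleration / angular rate in frame B.
    g_w  : gravity vector in W, e.g. np.array([0.0, 0.0, 9.81]).
    """
    a_w = q.apply(a_hat - b_a) - g_w                # world-frame acceleration
    p_next = p + v * dt + 0.5 * a_w * dt ** 2       # position update
    v_next = v + a_w * dt                           # velocity update
    q_next = q * R.from_rotvec((w_hat - b_g) * dt)  # attitude update
    return p_next, v_next, q_next
```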
3) GNSS/INS loosely-coupled pose solution: the loose combination is a GNSS/INS combination based on position and velocity. With the Earth-fixed coordinate system as the navigation frame, the integrated navigation system has 15 state parameters; the attitude, velocity, position, gyro drift and accelerometer zero bias form the state vector $X_{GI}$:

$$X_{GI} = \left[P,\ V,\ \theta,\ b_a,\ b_g\right]^T$$

where P and V are the position and velocity; $\theta$ comprises the roll, pitch and yaw angles; $b_a$ and $b_g$ are the accelerometer zero bias and the gyro drift along the carrier's direction of motion.

The system state and measurement equations are established as:

$$\dot{X}_{GI}(t) = A_{GI}(t)\,X_{GI}(t) + W_{GI}(t)$$

$$Z_{GI}(t) = H_{GI}(t)\,X_{GI}(t) + w_{GI}(t)$$

where $A_{GI}(t)$ is the state-equation coefficient matrix of the GNSS/INS loose combination; $W_{GI}(t)$ is the state-equation system noise; $Z_{GI}(t)$ is the external measurement; $H_{GI}(t)$ is the measurement-equation coefficient matrix; and $w_{GI}(t)$ is the measurement noise of the loose combination.

Finally, the carrier pose is obtained by solving with Kalman filtering.
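The loose combination reduces to a standard linear Kalman filter once $A_{GI}$, $H_{GI}$ and the noise covariances are supplied. The following is a generic predict/update sketch (an illustration, not the patent's filter); the first-order discretisation $F = I + A\,\Delta t$ and the function and matrix names are assumptions.

```python
import numpy as np

def kf_predict(x, P, A, Q, dt):
    """Propagate the 15-dimensional loose-combination state.

    A is the continuous-time coefficient matrix A_GI(t), discretised
    to first order as F = I + A*dt; Q is the process-noise covariance.
    """
    F = np.eye(len(x)) + A * dt
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R_meas):
    """Fuse an external measurement z (e.g. GNSS position and velocity)."""
    S = H @ P @ H.T + R_meas                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ (z - H @ x)              # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P     # corrected covariance
    return x_new, P_new
```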
As a further improvement of the control method, step (III), GNSS-laser/INS semi-tight fusion pose estimation, proceeds as follows:
1) laser/INS tightly-coupled pose estimation: the relative transformation of the carrier is computed from the acceleration and angular-velocity data collected by the GNSS/INS combined positioning module and the feature information extracted from two consecutive point-cloud frames. Let $b_k$ be the INS data at lidar time k, with the local coordinate system $b_{k-1}$.

The pose relation $\check{X}^{b_k}_{b_{k+1}}$ between two consecutive frames and the error state $\delta X$ are defined as:

$$\check{X}^{b_k}_{b_{k+1}} := \left[\check{P}^{b_k}_{b_{k+1}},\ \check{V}^{b_k}_{b_{k+1}},\ \check{q}^{b_k}_{b_{k+1}},\ \check{b}_a,\ \check{b}_g,\ \check{g}^{b_k}\right]$$

$$\delta X := \left[\delta P,\ \delta V,\ \delta\theta,\ \delta b_a,\ \delta b_g,\ \delta g\right]$$

where $\check{P}^{b_k}_{b_{k+1}}$, $\check{V}^{b_k}_{b_{k+1}}$ and $\check{q}^{b_k}_{b_{k+1}}$ are the position, velocity and rotation quaternion of time $b_{k+1}$ relative to time $b_k$; $\check{g}^{b_k}$ is the gravity value in the local coordinate system at time $b_k$; $\delta P$, $\delta V$ and $\delta\theta$ are the error values of position, velocity and the rotation angles (roll, pitch, yaw); $\delta b_a$, $\delta b_g$ and $\delta g$ are the accelerometer zero-bias, gyro-drift and gravity errors.

When the current frame of INS measurement data is obtained, the error state propagates as:

$$\delta X_{t+1} = F_t\,\delta X_t + G_t\,\eta_w$$

where $\eta_w$ is Gaussian noise, $F_t$ is the error-state transition matrix and $G_t$ is the noise Jacobian.

The state update can be regarded as an optimization problem over the pose-state prior $\check{X}^{b_k}_{b_{k+1}}$ and a residual function $r\!\left(\check{X}^{b_k}_{b_{k+1}},\,{}^{l_k}p_i\right)$ introduced from the observation model. First, the residual is computed from the pose transformation $\check{X}^{b_k}_{b_{k+1}}$ and the observation equation:

$$r_k = r\!\left(\check{X}^{b_k}_{b_{k+1}},\ {}^{l_k}p_i\right)$$

$$\min_{\delta X}\left(\left\|\delta X\right\|^2_{\Sigma} + \left\|r_k + J_k\,\delta X\right\|^2_{M}\right)$$

where $\Sigma$ is the covariance of the error state $\delta X$; J is the Jacobian matrix; M is the covariance matrix of the measurement noise; the superscript $l_k$ denotes the point-cloud data at time k and ${}^{l_k}p_i$ is a laser point.

Next, the Jacobian $J_k$ of the observation equation is computed; the posterior of the error state is obtained through the ESKF update process; the pose transformation is updated with the posterior of the error state; finally, it is judged whether the posterior of the error state approaches 0: if so, the iteration stops; otherwise the residual is recomputed from the pose transformation $\check{X}^{b_k}_{b_{k+1}}$ and the observation equation, and the computation iterates.

Adding the solved error state to the prior $\check{X}^{b_k}_{b_{k+1}}$ yields the final frame-to-frame transformation $\hat{X}^{b_k}_{b_{k+1}}$:

$$\hat{X}^{b_k}_{b_{k+1}} = \left[\check{P}+\delta P,\ \check{V}+\delta V,\ \check{q}\otimes\delta q,\ \check{b}_a+\delta b_a,\ \check{b}_g+\delta b_g,\ \check{g}+\delta g\right]$$

where $\check{P}$, $\check{V}$ and $\check{q}$ are the prior position, velocity and rotation quaternion of time $b_{k+1}$ relative to time $b_k$; $\check{g}$ is the prior gravity value in the local coordinate system at time $b_k$; $\check{b}_a$ and $\check{b}_g$ are the prior accelerometer zero bias and gyro drift.

Finally, the coordinates in the global coordinate system are obtained:

$$T^{b_{k+1}}_{G} = \hat{T}^{b_{k+1}}_{b_k}\,T^{b_k}_{G}$$

where $\hat{T}^{b_{k+1}}_{b_k}$, composed of $\hat{q}^{b_k}_{b_{k+1}}$ and $\hat{P}^{b_k}_{b_{k+1}}$, is the pose transformation from frame $b_k$ to frame $b_{k+1}$, and $T^{b_{k+1}}_{G}$ expresses the origin of the global coordinate system in the $b_{k+1}$ coordinate system;
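The iterated error-state update described above can be sketched as a regularised Gauss-Newton loop. This is a simplified illustration, not the patent's implementation: the state is treated as a flat vector (the quaternion manifold is ignored), and `residual_fn` / `jacobian_fn` are hypothetical callbacks that evaluate the lidar residuals and their Jacobian $J_k$ at the current estimate.

```python
import numpy as np

def ieskf_update(x_prior, Sigma, residual_fn, jacobian_fn, M_inv,
                 max_iter=10, tol=1e-6):
    """Iterated ESKF-style update: minimise ||dX||_Sigma^2 + ||r + J dX||_M^2."""
    x = x_prior.copy()
    Sigma_inv = np.linalg.inv(Sigma)
    for _ in range(max_iter):
        r = residual_fn(x)                   # residuals from the observation model
        J = jacobian_fn(x)                   # measurement Jacobian J_k
        H = Sigma_inv + J.T @ M_inv @ J      # Gauss-Newton normal matrix
        g = J.T @ M_inv @ r + Sigma_inv @ (x - x_prior)
        dx = np.linalg.solve(H, -g)          # error-state posterior
        x = x + dx                           # add the error state to the prior
        if np.linalg.norm(dx) < tol:         # stop when the posterior nears 0
            break
    return x, np.linalg.inv(H)               # posterior state and covariance
```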
2) GNSS-laser/INS loosely-coupled pose solution: the loose combination is a GNSS-laser/INS combination based on position and velocity. With the Earth-fixed coordinate system as the navigation frame, the attitude, velocity, position, gyro drift and accelerometer zero bias form the state vector $X_{GLI}$:

$$X_{GLI} = \left[P,\ V,\ \theta,\ b_a,\ b_g\right]^T$$

The system state and measurement equations are established as:

$$\dot{X}_{GLI}(t) = A_{GLI}(t)\,X_{GLI}(t) + W_{GLI}(t)$$

$$Z_{GLI}(t) = H_{GLI}(t)\,X_{GLI}(t) + w_{GLI}(t)$$

where $A_{GLI}(t)$ is the state-equation coefficient matrix of the GNSS-laser/INS loose combination; $W_{GLI}(t)$ is the state-equation system noise; $Z_{GLI}(t)$ is the external measurement of the loose combination; $H_{GLI}(t)$ is the measurement-equation coefficient matrix; and $w_{GLI}(t)$ is the measurement noise of the loose combination.

Finally, the carrier pose is obtained by solving with Kalman filtering.
As a further improvement of the control method, the 16-line laser radar adopts a Velodyne VLP-16 type laser radar.
As a further improvement of the control method, the GNSS/INS combined positioning module collects GNSS observation data, GNSS differential data, acceleration and angular velocity data.
Compared with the prior art, the beneficial effect is:
In an unknown environment, a carrier uses the handheld SLAM device to acquire information about its surroundings for simultaneous localization and mapping. The handheld SLAM device of the invention comprises a GNSS/INS combined positioning module, a GNSS columnar antenna, a 16-line laser radar, a color binocular camera, a processor and a flat-panel display, enabling a miniaturized device. Since lidar observation constraints in the conventional EKF model are prone to poor performance and even divergence, the EKF is replaced by the ESKF: acceleration and angular-velocity data from the GNSS/INS combined positioning module are fused to optimize the pose estimate, which is then loosely coupled with the GNSS positioning result from the same module to obtain the global pose in the GNSS coordinate system. Vision cannot match feature points in low-texture environments, so the depth of a three-dimensional point in the camera coordinate system cannot be obtained; laser measures depth very directly but cannot capture texture information in the scene. The invention therefore provides a mapping scheme based on cubic voxels under limited hardware resources and performs dense reconstruction of spatial planes, coloring the point cloud with the color image.
Drawings
FIG. 1 is a schematic diagram of an embodiment of the handheld SLAM device of the present invention;
FIG. 2 is a design drawing of the hand-held SLAM device of the present invention;
fig. 3 is a flow chart of a SLAM method of fusing multiple sensors.
Description of the labeling:
1. a GNSS column antenna; 2. a color binocular camera; 3. a processor; 4. a handle; 5. a 16-line laser radar; 6. a flat panel display; 7. and the GNSS/INS combined positioning module.
Detailed Description
The invention is described in further detail below with reference to the following detailed description and accompanying drawings:
the invention provides a multi-sensor-fused handheld SLAM device and a control method thereof, and aims to solve the technical problems that the method is based on an Error State Kalman Filtering (ESKF) SLAM method, the method is provided by taking the advantages of a single sensor into consideration and aiming at the limitation of the single sensor, the pose transformation of a handheld terminal is obtained through calculation by fusing data of the multiple sensors, errors caused by incapability of positioning of a GNSS and rare characteristic points are reduced, the obtained motion track of the handheld terminal is smoother and more accurate, and the accuracy of map construction is improved.
As shown in fig. 1, a handheld SLAM device with integrated multi-sensors comprises a device body, a GNSS/INS combined positioning module 7, a GNSS columnar antenna 1, a 16-line laser radar 5, a color binocular camera 2, a processor 3 and a flat-panel display 6. The GNSS columnar antenna 1 is connected to the GNSS/INS combined positioning module 7; the GNSS/INS combined positioning module 7, the 16-line laser radar 5, the color binocular camera 2 and the flat-panel display 6 are all connected to the processor 3. The flat-panel display 6 is mounted at the front end of the device body, the GNSS/INS combined positioning module 7 and the processor 3 are housed inside the device body, the GNSS columnar antenna 1 and the 16-line laser radar 5 are mounted on the device body, the color binocular camera 2 is mounted at the rear of the upper part of the device body, and a handle 4 is provided at the bottom of the device body. Fig. 2 is a design drawing of the handheld SLAM device.
In the invention, while moving through an unknown environment the carrier obtains laser point clouds and color images of its surroundings through the 16-line laser radar and the color binocular camera, computes its own motion transformation by registering the laser point clouds, and simultaneously obtains carrier motion data through the GNSS columnar antenna and the GNSS/INS combined positioning module; these data are fused to obtain the carrier's position. A three-dimensional map of the surroundings is then constructed from the laser point cloud and the color images and visualized on the flat-panel display, realizing simultaneous localization and mapping of the carrier.
(I) Pose estimation based on 16-line laser radar
The 16-line laser radar adopts a Velodyne VLP-16 type laser radar.
1. Point cloud segmentation. The laser point cloud is projected onto a range image, and vertical-angle tests against a threshold determine whether each point is a ground point, segmenting ground from non-ground points; the non-ground points are clustered with breadth-first search (BFS), and appropriate thresholds are set to remove dynamic noise points.
2. Feature extraction. The range image is divided horizontally into several sub-images. For a point in the point cloud at time t, the 5 points on each side of it in the same row are collected into a set S. The curvature C of each point is then:

$$C = \frac{1}{|S|\cdot\left\|R_k\right\|}\left\|\sum_{j\in S,\,j\neq k}\left(R_j - R_k\right)\right\|$$

where $R_k$ is the range (distance) of point k.

A threshold $C_{th}$ is set over the sequence of computed curvature values: points whose curvature exceeds the threshold are edge points; otherwise they are planar points. From each row, the $nF_{me}$ edge points with the largest C values not belonging to the ground are selected, forming the set $F_{me}$; the $nF_{mp}$ planar points with the smallest C values, belonging to ground points or segmented points of all rows, form the set $F_{mp}$. A second screening is performed: from $F_{me}$, the $nF_e$ edge points with the largest C values not belonging to the ground form the set $F_e$; from $F_{mp}$, the $nF_p$ planar points with the smallest C values belonging to the ground form the set $F_p$.
3. Laser odometry. Let the current time be t and the previous time be t−1. The feature points $F^t_e$ and $F^t_p$ are selected and matched with $F^{t-1}_{me}$ and $F^{t-1}_{mp}$ from time t−1, constructing point-to-line correspondences for $F^t_e$ and point-to-plane correspondences for $F^t_p$. LM optimization over the point-to-plane constraints of $F^t_p$ yields $[t_z, \theta_{roll}, \theta_{pitch}]$; LM optimization over the point-to-line constraints of $F^t_e$, combined with $[t_z, \theta_{roll}, \theta_{pitch}]$, then yields $[t_x, t_y, \theta_{yaw}]$. The pose parameters $[t_x, t_y, t_z, \theta_{roll}, \theta_{pitch}, \theta_{yaw}]$ in the carrier's local coordinate system are thus obtained.
(II) position and orientation estimation based on GNSS/INS combined positioning module
The GNSS/INS combined positioning module collects GNSS observation data, GNSS differential data, acceleration data and angular speed data.
1. GNSS double-difference relative positioning (GNSS-RTK). Pseudo-range and carrier double-difference observation equations are established from the GNSS observation data and the GNSS differential data. Let the instantaneous coordinates of satellite s be $(X^s, Y^s, Z^s)$ and the approximate coordinates of station r be $(X^0_r, Y^0_r, Z^0_r)$ with corrections $(dx, dy, dz)$. After linearization, the equations are abbreviated as:

$$\nabla\Delta P_j = \nabla\Delta\rho + \nabla\Delta\delta_{ion} + \nabla\Delta\delta_{trop} + \varepsilon_P$$

$$\lambda_j\,\nabla\Delta\varphi_j = \nabla\Delta\rho + \lambda_j\,\nabla\Delta N_j - \nabla\Delta\delta_{ion} + \nabla\Delta\delta_{trop} + \varepsilon_\varphi$$

where $\nabla\Delta$ is the between-station, between-satellite double-difference operator; $\Delta$ is the between-satellite single-difference operator; s and r are the serial numbers of the satellite and the station; j is the pseudo-range or carrier observation type (j = 1, 2, 3); P is the pseudo-range observation; $\rho$ is the geometric distance between satellite and receiver, $\rho = \sqrt{(X^s - X_r)^2 + (Y^s - Y_r)^2 + (Z^s - Z_r)^2}$; c is the speed of light in vacuum; $\delta_{ion}$ and $\delta_{trop}$ are the ionospheric and tropospheric refraction errors; $\lambda_j$ is the carrier wavelength; $\varphi$ and N are the carrier-phase observation and the integer ambiguity; $\varepsilon_P$ and $\varepsilon_\varphi$ are the pseudo-range and carrier-phase observation noise.

The integer ambiguities are first resolved with the LAMBDA algorithm, and the corrections $(dx, dy, dz)$ are then solved, determining the coordinates $(X, Y, Z)$ of the carrier in the GNSS coordinate system:

$$X = X^0_r + dx,\qquad Y = Y^0_r + dy,\qquad Z = Z^0_r + dz$$
2. Pose solution based on acceleration and angular-velocity information. First, let B be the INS coordinate system and W the world coordinate system. The acceleration $\hat{a}_B$ and angular velocity $\hat{\omega}_B$ collected by the GNSS/INS combined positioning module are modeled as:

$$\hat{a}_B = a_B + b_a + q_{WB}^{-1}\,g_W + \eta_a$$

$$\hat{\omega}_B = \omega_B + b_g + \eta_g$$

where $a_B$ and $\omega_B$ are the true values of acceleration and angular velocity; $b_a$ and $b_g$ are the drift (bias) values of acceleration and angular velocity; $\eta_a$ and $\eta_g$ are measurement noise; $g_W$ is the gravity value; and $q_{WB}$ is the rotation quaternion from INS frame B to world frame W.

Integrating the acceleration and angular-velocity data yields the corresponding pose:

$$P^W_{B_{k+1}} = P^W_{B_k} + V^W_{B_k}\,\Delta t + \iint_{t\in[k,k+1]}\left(q_{WB_t}\left(\hat{a}_{B_t}-b_a\right) - g_W\right)\,dt^2$$

$$V^W_{B_{k+1}} = V^W_{B_k} + \int_{t\in[k,k+1]}\left(q_{WB_t}\left(\hat{a}_{B_t}-b_a\right) - g_W\right)\,dt$$

$$q_{WB_{k+1}} = q_{WB_k}\otimes\int_{t\in[k,k+1]}\frac{1}{2}\,q_{B_kB_t}\otimes\begin{bmatrix}0\\\hat{\omega}_{B_t}-b_g\end{bmatrix}\,dt$$

where $P^W_{B_k}$, $V^W_{B_k}$ and $q_{WB_k}$ are the position, velocity and rotation quaternion of $B_k$ in the world coordinate system; $\Delta t$ is the time difference between $B_{k+1}$ and $B_k$; and $q_{B_kB_t}$ represents the attitude of time $B_t$ relative to time $B_k$.
3. GNSS/INS loosely-coupled pose solution. The loose combination is a GNSS/INS combination based on position and velocity. With the Earth-fixed coordinate system as the navigation frame, the integrated navigation system has 15 state parameters; the attitude, velocity, position, gyro drift and accelerometer zero bias form the state vector $X_{GI}$:

$$X_{GI} = \left[P,\ V,\ \theta,\ b_a,\ b_g\right]^T$$

where P and V are the position and velocity; $\theta$ comprises the roll, pitch and yaw angles; $b_a$ and $b_g$ are the accelerometer zero bias and the gyro drift along the carrier's direction of motion.

The system state and measurement equations are established as:

$$\dot{X}_{GI}(t) = A_{GI}(t)\,X_{GI}(t) + W_{GI}(t) \tag{11}$$

$$Z_{GI}(t) = H_{GI}(t)\,X_{GI}(t) + w_{GI}(t) \tag{12}$$

where $A_{GI}(t)$ is the state-equation coefficient matrix of the GNSS/INS loose combination; $W_{GI}(t)$ is the state-equation system noise; $Z_{GI}(t)$ is the external measurement; $H_{GI}(t)$ is the measurement-equation coefficient matrix; and $w_{GI}(t)$ is the measurement noise of the loose combination.

Finally, the carrier pose is obtained by solving with Kalman filtering.
(III) GNSS-laser/INS semi-tight fusion pose estimation
1. Laser/INS tightly-coupled pose estimation. The relative transformation of the carrier is computed from the acceleration and angular-velocity data collected by the GNSS/INS combined positioning module and the feature information extracted from two consecutive point-cloud frames. Let $b_k$ be the INS data at lidar time k, with the local coordinate system $b_{k-1}$.

The pose relation $\check{X}^{b_k}_{b_{k+1}}$ between two consecutive frames and the error state $\delta X$ are defined as:

$$\check{X}^{b_k}_{b_{k+1}} := \left[\check{P}^{b_k}_{b_{k+1}},\ \check{V}^{b_k}_{b_{k+1}},\ \check{q}^{b_k}_{b_{k+1}},\ \check{b}_a,\ \check{b}_g,\ \check{g}^{b_k}\right] \tag{13}$$

$$\delta X := \left[\delta P,\ \delta V,\ \delta\theta,\ \delta b_a,\ \delta b_g,\ \delta g\right] \tag{14}$$

where $\check{P}^{b_k}_{b_{k+1}}$, $\check{V}^{b_k}_{b_{k+1}}$ and $\check{q}^{b_k}_{b_{k+1}}$ are the position, velocity and rotation quaternion of time $b_{k+1}$ relative to time $b_k$; $\check{g}^{b_k}$ is the gravity value in the local coordinate system at time $b_k$; $\delta P$, $\delta V$ and $\delta\theta$ are the error values of position, velocity and the rotation angles (roll, pitch, yaw); $\delta b_a$, $\delta b_g$ and $\delta g$ are the accelerometer zero-bias, gyro-drift and gravity errors.

When the current frame of INS measurement data is obtained, the error state propagates as:

$$\delta X_{t+1} = F_t\,\delta X_t + G_t\,\eta_w \tag{15}$$

where $\eta_w$ is Gaussian noise, $F_t$ is the error-state transition matrix and $G_t$ is the noise Jacobian.

The state update can be regarded as an optimization problem over the pose-state prior $\check{X}^{b_k}_{b_{k+1}}$ and a residual function $r\!\left(\check{X}^{b_k}_{b_{k+1}},\,{}^{l_k}p_i\right)$ introduced from the observation model. First, the residual is computed from the pose transformation $\check{X}^{b_k}_{b_{k+1}}$ and the observation equation:

$$r_k = r\!\left(\check{X}^{b_k}_{b_{k+1}},\ {}^{l_k}p_i\right)$$

$$\min_{\delta X}\left(\left\|\delta X\right\|^2_{\Sigma} + \left\|r_k + J_k\,\delta X\right\|^2_{M}\right)$$

where $\Sigma$ is the covariance of the error state $\delta X$; J is the Jacobian matrix; M is the covariance matrix of the measurement noise; the superscript $l_k$ denotes the point-cloud data at time k and ${}^{l_k}p_i$ is a laser point.

Next, the Jacobian $J_k$ of the observation equation is computed; the posterior of the error state is obtained through the ESKF update process; the pose transformation is updated with the posterior of the error state; finally, it is judged whether the posterior of the error state approaches 0: if so, the iteration stops; otherwise the residual is recomputed from the pose transformation $\check{X}^{b_k}_{b_{k+1}}$ and the observation equation, and the computation iterates.

Adding the solved error state to the prior $\check{X}^{b_k}_{b_{k+1}}$ yields the final frame-to-frame transformation $\hat{X}^{b_k}_{b_{k+1}}$:

$$\hat{X}^{b_k}_{b_{k+1}} = \left[\check{P}+\delta P,\ \check{V}+\delta V,\ \check{q}\otimes\delta q,\ \check{b}_a+\delta b_a,\ \check{b}_g+\delta b_g,\ \check{g}+\delta g\right]$$

where $\check{P}$, $\check{V}$ and $\check{q}$ are the prior position, velocity and rotation quaternion of time $b_{k+1}$ relative to time $b_k$; $\check{g}$ is the prior gravity value in the local coordinate system at time $b_k$; $\check{b}_a$ and $\check{b}_g$ are the prior accelerometer zero bias and gyro drift.

Finally, the coordinates in the global coordinate system are obtained:

$$T^{b_{k+1}}_{G} = \hat{T}^{b_{k+1}}_{b_k}\,T^{b_k}_{G}$$

where $\hat{T}^{b_{k+1}}_{b_k}$, composed of $\hat{q}^{b_k}_{b_{k+1}}$ and $\hat{P}^{b_k}_{b_{k+1}}$, is the pose transformation from frame $b_k$ to frame $b_{k+1}$, and $T^{b_{k+1}}_{G}$ expresses the origin of the global coordinate system in the $b_{k+1}$ coordinate system.
2. GNSS-laser/INS loosely-coupled pose solution. The loose combination is a GNSS-laser/INS combination based on position and velocity. With the Earth-fixed coordinate system as the navigation frame, the attitude, velocity, position, gyro drift and accelerometer zero bias form the state vector $X_{GLI}$:

$$X_{GLI} = \left[P,\ V,\ \theta,\ b_a,\ b_g\right]^T$$

The system state and measurement equations are established as:

$$\dot{X}_{GLI}(t) = A_{GLI}(t)\,X_{GLI}(t) + W_{GLI}(t) \tag{21}$$

$$Z_{GLI}(t) = H_{GLI}(t)\,X_{GLI}(t) + w_{GLI}(t) \tag{22}$$

where $A_{GLI}(t)$ is the state-equation coefficient matrix of the GNSS-laser/INS loose combination; $W_{GLI}(t)$ is the state-equation system noise; $Z_{GLI}(t)$ is the external measurement of the loose combination; $H_{GLI}(t)$ is the measurement-equation coefficient matrix; and $w_{GLI}(t)$ is the measurement noise of the loose combination.

Finally, the carrier pose is obtained by solving with Kalman filtering.
(IV) voxel-based visualization scheme
Vision cannot perform feature-point matching in low-texture environments, so the depth of a three-dimensional point in the camera coordinate system cannot be obtained. Laser, while able to measure depth very directly, cannot capture texture information in the scene. Addressing these complementary strengths and weaknesses, the map is built from cubic voxels under limited hardware resources and the voxel point cloud is colored with the color information of the color image; the voxel, as the minimum spatial unit, replaces the surface patch of traditional three-dimensional reconstruction. The points of the densified plane are mainly the laser points obtained by three rounds of region growing and the pixels corresponding to that plane in the image. First, the laser points are projected into the image to obtain gray values; then, the image pixels are matched to planes in the laser coordinate system; finally, the depth of the pixel coordinates successfully matched to a plane is calculated.
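The projection-and-coloring pipeline can be illustrated with a pinhole camera model. The sketch below is an assumption-laden illustration, not the patent's code: `K` is a calibrated camera intrinsic matrix, `R_cl`/`t_cl` is the lidar-to-camera extrinsic transform, and the voxel edge length is arbitrary.

```python
import numpy as np

def colorize_voxels(points_lidar, image, K, R_cl, t_cl, voxel=0.05):
    """Project lidar points into the color image and color the occupied voxels.

    points_lidar : (N, 3) points in the lidar frame.
    image        : (H, W, 3) color image.
    Returns a dict mapping cubic-voxel index -> mean RGB color.
    """
    pts_cam = points_lidar @ R_cl.T + t_cl        # lidar frame -> camera frame
    front = pts_cam[:, 2] > 0                     # keep points in front of the camera
    pts_cam, pts_l = pts_cam[front], points_lidar[front]
    uvw = pts_cam @ K.T                           # pinhole projection
    uv = (uvw[:, :2] / uvw[:, 2:3]).astype(int)
    h, w = image.shape[:2]
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    colors = {}
    for p, (u, v) in zip(pts_l[ok], uv[ok]):
        idx = tuple(np.floor(p / voxel).astype(int))  # cubic-voxel index
        colors.setdefault(idx, []).append(image[v, u].astype(float))
    return {k: np.mean(c, axis=0) for k, c in colors.items()}
```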
The above description is only a preferred embodiment of the present invention, and is not intended to limit the present invention in any way, but any modifications or equivalent variations made according to the technical spirit of the present invention are within the scope of the present invention as claimed.

Claims (8)

1. A handheld SLAM device fusing multiple sensors, comprising a device body, a GNSS/INS combined positioning module (7), a GNSS columnar antenna (1), a 16-line laser radar (5), a color binocular camera (2), a processor (3) and a flat-panel display (6), characterized in that: the GNSS columnar antenna (1) is connected to the GNSS/INS combined positioning module (7); the GNSS/INS combined positioning module (7), the 16-line laser radar (5), the color binocular camera (2) and the flat-panel display (6) are all connected to the processor (3); the flat-panel display (6) is mounted at the front end of the device body, the GNSS/INS combined positioning module (7) and the processor (3) are housed inside the device body, the GNSS columnar antenna (1) and the 16-line laser radar (5) are mounted on the device body, and the color binocular camera (2) is mounted at the rear of the upper part of the device body.
2. The multi-sensor-integrated handheld SLAM apparatus of claim 1, wherein: the bottom of the device main body is provided with a handle (4).
3. A control method of a handheld SLAM device fusing multiple sensors comprises the following specific steps:
(I) estimating the pose based on the 16-line laser radar;
1) point cloud segmentation;
2) extracting characteristics;
3) a laser odometer;
(II) estimating the pose based on the GNSS/INS combined positioning module;
1) GNSS double-difference relative positioning GNSS-RTK;
2) solving the pose based on the acceleration and angular velocity information;
3) solving the GNSS/INS loose combined pose;
(III) estimating a GNSS-laser/INS semi-tight fusion pose;
1) estimating a laser/INS tight fusion pose;
2) solving the GNSS-laser/INS loosely-coupled pose;
(IV) a voxel-based visualization scheme;
Vision cannot match feature points in low-texture environments, so the depth of a three-dimensional point in the camera coordinate system cannot be obtained; laser can measure depth directly but cannot capture texture information in the scene. Addressing these complementary strengths and weaknesses, the map is built from cubic voxels under limited hardware resources, and the voxel point cloud is colored with the color information of the color image; the voxel, as the minimum spatial unit, replaces the surface patch of traditional three-dimensional reconstruction. The points of the densified plane are mainly the laser points obtained by three rounds of region growing and the pixels corresponding to that plane in the image. First, the laser points are projected into the image to obtain gray values; then, the image pixels are matched to planes in the laser coordinate system; finally, the depth of the pixel coordinates successfully matched to a plane is calculated.
4. The method of claim 3, characterized in that step (I), pose estimation based on the 16-line laser radar, comprises the following steps:
1) point cloud segmentation: the laser point cloud is projected onto a range image, and vertical-angle tests against a threshold determine whether each point is a ground point, segmenting ground from non-ground points; the non-ground points are clustered with breadth-first search (BFS), and appropriate thresholds are set to remove dynamic noise points;
2) feature extraction: the range image is divided horizontally into several sub-images. For a point in the point cloud at time t, the 5 points on each side of it in the same row are collected into a set S. The curvature C of each point is:

$$C = \frac{1}{|S|\cdot\left\|R_k\right\|}\left\|\sum_{j\in S,\,j\neq k}\left(R_j - R_k\right)\right\|$$

where $R_k$ is the range (distance) of point k.

A threshold $C_{th}$ is set over the sequence of computed curvature values: points whose curvature exceeds the threshold are edge points; otherwise they are planar points. From each row, the $nF_{me}$ edge points with the largest C values not belonging to the ground are selected, forming the set $F_{me}$; the $nF_{mp}$ planar points with the smallest C values, belonging to ground points or segmented points of all rows, are selected, forming the set $F_{mp}$. A second screening is then performed: from $F_{me}$, the $nF_e$ edge points with the largest C values not belonging to the ground form the set $F_e$; from $F_{mp}$, the $nF_p$ planar points with the smallest C values belonging to the ground form the set $F_p$;
3) laser odometry: let the current time be t and the previous time be t−1. The feature points $F^t_e$ and $F^t_p$ are selected and matched with $F^{t-1}_{me}$ and $F^{t-1}_{mp}$ from time t−1, constructing point-to-line correspondences for $F^t_e$ and point-to-plane correspondences for $F^t_p$. LM optimization over the point-to-plane constraints of $F^t_p$ yields $[t_z, \theta_{roll}, \theta_{pitch}]$; LM optimization over the point-to-line constraints of $F^t_e$, combined with $[t_z, \theta_{roll}, \theta_{pitch}]$, then yields $[t_x, t_y, \theta_{yaw}]$. The pose parameters $[t_x, t_y, t_z, \theta_{roll}, \theta_{pitch}, \theta_{yaw}]$ in the carrier's local coordinate system are thus obtained.
5. The method of claim 3, characterized in that step (II), pose estimation based on the GNSS/INS combined positioning module, comprises the following steps:
1) GNSS double-difference relative positioning (GNSS-RTK): pseudo-range and carrier double-difference observation equations are established from the GNSS observation data and the GNSS differential data. Let the instantaneous coordinates of satellite s be $(X^s, Y^s, Z^s)$ and the approximate coordinates of station r be $(X^0_r, Y^0_r, Z^0_r)$ with corrections $(dx, dy, dz)$. After linearization, the equations are abbreviated as:

$$\nabla\Delta P_j = \nabla\Delta\rho + \nabla\Delta\delta_{ion} + \nabla\Delta\delta_{trop} + \varepsilon_P$$

$$\lambda_j\,\nabla\Delta\varphi_j = \nabla\Delta\rho + \lambda_j\,\nabla\Delta N_j - \nabla\Delta\delta_{ion} + \nabla\Delta\delta_{trop} + \varepsilon_\varphi$$

where $\nabla\Delta$ is the between-station, between-satellite double-difference operator; $\Delta$ is the between-satellite single-difference operator; s and r are the serial numbers of the satellite and the station; j is the pseudo-range or carrier observation type (j = 1, 2, 3); P is the pseudo-range observation; $\rho$ is the geometric distance between satellite and receiver, $\rho = \sqrt{(X^s - X_r)^2 + (Y^s - Y_r)^2 + (Z^s - Z_r)^2}$; c is the speed of light in vacuum; $\delta_{ion}$ and $\delta_{trop}$ are the ionospheric and tropospheric refraction errors; $\lambda_j$ is the carrier wavelength; $\varphi$ and N are the carrier-phase observation and the integer ambiguity; $\varepsilon_P$ and $\varepsilon_\varphi$ are the pseudo-range and carrier-phase observation noise;

the integer ambiguities are first resolved with the LAMBDA algorithm, and the corrections $(dx, dy, dz)$ are then solved, determining the coordinates $(X, Y, Z)$ of the carrier in the GNSS coordinate system:

$$X = X^0_r + dx,\qquad Y = Y^0_r + dy,\qquad Z = Z^0_r + dz$$
2) pose solution based on acceleration and angular-velocity information: first, let B be the INS coordinate system and W the world coordinate system. The acceleration $\hat{a}_B$ and angular velocity $\hat{\omega}_B$ collected by the GNSS/INS combined positioning module are modeled as:

$$\hat{a}_B = a_B + b_a + q_{WB}^{-1}\,g_W + \eta_a$$

$$\hat{\omega}_B = \omega_B + b_g + \eta_g$$

where $a_B$ and $\omega_B$ are the true values of acceleration and angular velocity; $b_a$ and $b_g$ are the drift (bias) values of acceleration and angular velocity; $\eta_a$ and $\eta_g$ are measurement noise; $g_W$ is the gravity value; and $q_{WB}$ is the rotation quaternion from INS frame B to world frame W;

integrating the acceleration and angular-velocity data yields the corresponding pose:

$$P^W_{B_{k+1}} = P^W_{B_k} + V^W_{B_k}\,\Delta t + \iint_{t\in[k,k+1]}\left(q_{WB_t}\left(\hat{a}_{B_t}-b_a\right) - g_W\right)\,dt^2$$

$$V^W_{B_{k+1}} = V^W_{B_k} + \int_{t\in[k,k+1]}\left(q_{WB_t}\left(\hat{a}_{B_t}-b_a\right) - g_W\right)\,dt$$

$$q_{WB_{k+1}} = q_{WB_k}\otimes\int_{t\in[k,k+1]}\frac{1}{2}\,q_{B_kB_t}\otimes\begin{bmatrix}0\\\hat{\omega}_{B_t}-b_g\end{bmatrix}\,dt$$

where $P^W_{B_k}$, $V^W_{B_k}$ and $q_{WB_k}$ are the position, velocity and rotation quaternion of $B_k$ in the world coordinate system; $\Delta t$ is the time difference between $B_{k+1}$ and $B_k$; and $q_{B_kB_t}$ represents the attitude of time $B_t$ relative to time $B_k$;
3) GNSS/INS loosely-coupled pose solution: the loose combination is a GNSS/INS combination based on position and velocity. With the Earth-fixed coordinate system as the navigation frame, the integrated navigation system has 15 state parameters; the attitude, velocity, position, gyro drift and accelerometer zero bias form the state vector $X_{GI}$:

$$X_{GI} = \left[P,\ V,\ \theta,\ b_a,\ b_g\right]^T$$

where P and V are the position and velocity; $\theta$ comprises the roll, pitch and yaw angles; $b_a$ and $b_g$ are the accelerometer zero bias and the gyro drift along the carrier's direction of motion;

the system state and measurement equations are established as:

$$\dot{X}_{GI}(t) = A_{GI}(t)\,X_{GI}(t) + W_{GI}(t)$$

$$Z_{GI}(t) = H_{GI}(t)\,X_{GI}(t) + w_{GI}(t)$$

where $A_{GI}(t)$ is the state-equation coefficient matrix of the GNSS/INS loose combination; $W_{GI}(t)$ is the state-equation system noise; $Z_{GI}(t)$ is the external measurement; $H_{GI}(t)$ is the measurement-equation coefficient matrix; and $w_{GI}(t)$ is the measurement noise of the loose combination;

finally, the carrier pose is obtained by solving with Kalman filtering.
6. The method of claim 3, characterized in that step (III), GNSS-laser/INS semi-tight fusion pose estimation, comprises the following steps:
1) laser/INS tightly-coupled pose estimation: the relative transformation of the carrier is computed from the acceleration and angular-velocity data collected by the GNSS/INS combined positioning module and the feature information extracted from two consecutive point-cloud frames. Let $b_k$ be the INS data at lidar time k, with the local coordinate system $b_{k-1}$;

the pose relation $\check{X}^{b_k}_{b_{k+1}}$ between two consecutive frames and the error state $\delta X$ are defined as:

$$\check{X}^{b_k}_{b_{k+1}} := \left[\check{P}^{b_k}_{b_{k+1}},\ \check{V}^{b_k}_{b_{k+1}},\ \check{q}^{b_k}_{b_{k+1}},\ \check{b}_a,\ \check{b}_g,\ \check{g}^{b_k}\right]$$

$$\delta X := \left[\delta P,\ \delta V,\ \delta\theta,\ \delta b_a,\ \delta b_g,\ \delta g\right]$$

where $\check{P}^{b_k}_{b_{k+1}}$, $\check{V}^{b_k}_{b_{k+1}}$ and $\check{q}^{b_k}_{b_{k+1}}$ are the position, velocity and rotation quaternion of time $b_{k+1}$ relative to time $b_k$; $\check{g}^{b_k}$ is the gravity value in the local coordinate system at time $b_k$; $\delta P$, $\delta V$ and $\delta\theta$ are the error values of position, velocity and the rotation angles (roll, pitch, yaw); $\delta b_a$, $\delta b_g$ and $\delta g$ are the accelerometer zero-bias, gyro-drift and gravity errors;

when the current frame of INS measurement data is obtained, the error state propagates as:

$$\delta X_{t+1} = F_t\,\delta X_t + G_t\,\eta_w$$

where $\eta_w$ is Gaussian noise, $F_t$ is the error-state transition matrix and $G_t$ is the noise Jacobian;

the state update can be regarded as an optimization problem over the pose-state prior $\check{X}^{b_k}_{b_{k+1}}$ and a residual function $r\!\left(\check{X}^{b_k}_{b_{k+1}},\,{}^{l_k}p_i\right)$ introduced from the observation model. First, the residual is computed from the pose transformation $\check{X}^{b_k}_{b_{k+1}}$ and the observation equation:

$$r_k = r\!\left(\check{X}^{b_k}_{b_{k+1}},\ {}^{l_k}p_i\right)$$

$$\min_{\delta X}\left(\left\|\delta X\right\|^2_{\Sigma} + \left\|r_k + J_k\,\delta X\right\|^2_{M}\right)$$

where $\Sigma$ is the covariance of the error state $\delta X$; J is the Jacobian matrix; M is the covariance matrix of the measurement noise; the superscript $l_k$ denotes the point-cloud data at time k and ${}^{l_k}p_i$ is a laser point;

next, the Jacobian $J_k$ of the observation equation is computed; the posterior of the error state is obtained through the ESKF update process; the pose transformation is updated with the posterior of the error state; finally, it is judged whether the posterior of the error state approaches 0: if so, the iteration stops; otherwise the residual is recomputed from the pose transformation $\check{X}^{b_k}_{b_{k+1}}$ and the observation equation, and the computation iterates;

adding the solved error state to the prior $\check{X}^{b_k}_{b_{k+1}}$ yields the final frame-to-frame transformation $\hat{X}^{b_k}_{b_{k+1}}$:

$$\hat{X}^{b_k}_{b_{k+1}} = \left[\check{P}+\delta P,\ \check{V}+\delta V,\ \check{q}\otimes\delta q,\ \check{b}_a+\delta b_a,\ \check{b}_g+\delta b_g,\ \check{g}+\delta g\right]$$

where $\check{P}$, $\check{V}$ and $\check{q}$ are the prior position, velocity and rotation quaternion of time $b_{k+1}$ relative to time $b_k$; $\check{g}$ is the prior gravity value in the local coordinate system at time $b_k$; $\check{b}_a$ and $\check{b}_g$ are the prior accelerometer zero bias and gyro drift;

finally, the coordinates in the global coordinate system are obtained:

$$T^{b_{k+1}}_{G} = \hat{T}^{b_{k+1}}_{b_k}\,T^{b_k}_{G}$$

where $\hat{T}^{b_{k+1}}_{b_k}$, composed of $\hat{q}^{b_k}_{b_{k+1}}$ and $\hat{P}^{b_k}_{b_{k+1}}$, is the pose transformation from frame $b_k$ to frame $b_{k+1}$, and $T^{b_{k+1}}_{G}$ expresses the origin of the global coordinate system in the $b_{k+1}$ coordinate system;
2) GNSS-laser/INS loosely-coupled pose solution: the loose combination is a GNSS-laser/INS combination based on position and velocity. With the Earth-fixed coordinate system as the navigation frame, the attitude, velocity, position, gyro drift and accelerometer zero bias form the state vector $X_{GLI}$:

$$X_{GLI} = \left[P,\ V,\ \theta,\ b_a,\ b_g\right]^T$$

the system state and measurement equations are established as:

$$\dot{X}_{GLI}(t) = A_{GLI}(t)\,X_{GLI}(t) + W_{GLI}(t)$$

$$Z_{GLI}(t) = H_{GLI}(t)\,X_{GLI}(t) + w_{GLI}(t)$$

where $A_{GLI}(t)$ is the state-equation coefficient matrix of the GNSS-laser/INS loose combination; $W_{GLI}(t)$ is the state-equation system noise; $Z_{GLI}(t)$ is the external measurement of the loose combination; $H_{GLI}(t)$ is the measurement-equation coefficient matrix; and $w_{GLI}(t)$ is the measurement noise of the loose combination;

finally, the carrier pose is obtained by solving with Kalman filtering.
7. The method of claim 3, wherein the method comprises: the 16-line laser radar adopts a Velodyne VLP-16 type laser radar.
8. The method of claim 3, wherein the method comprises: the GNSS/INS combined positioning module collects GNSS observation data, GNSS differential data, acceleration data and angular speed data.
CN202110166885.2A 2021-02-04 2021-02-04 Multi-sensor-integrated handheld SLAM device and control method thereof Active CN112987065B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110166885.2A CN112987065B (en) 2021-02-04 2021-02-04 Multi-sensor-integrated handheld SLAM device and control method thereof

Publications (2)

Publication Number Publication Date
CN112987065A 2021-06-18
CN112987065B 2024-01-12

Family

ID=76348702

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110166885.2A Active CN112987065B (en) 2021-02-04 2021-02-04 Multi-sensor-integrated handheld SLAM device and control method thereof

Country Status (1)

Country Link
CN (1) CN112987065B (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109597095A (en) * 2018-11-12 2019-04-09 北京大学 Backpack type 3 D laser scanning and three-dimensional imaging combined system and data capture method
CN111045017A (en) * 2019-12-20 2020-04-21 成都理工大学 Method for constructing transformer substation map of inspection robot by fusing laser and vision
CN111929699A (en) * 2020-07-21 2020-11-13 北京建筑大学 Laser radar inertial navigation odometer considering dynamic obstacles and mapping method and system

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
BO ZHOU et al.: "A LiDAR Odometry for Outdoor Mobile Robots Using NDT Based Scan Matching in GPS-denied Environments", 2017 IEEE 7th Annual International Conference on Cyber Technology in Automation, Control, and Intelligent Systems *
HAOYU ZHOU et al.: "Lidar/UWB Fusion Based SLAM With Anti-Degeneration Capability", IEEE Transactions on Vehicular Technology *
WEIKUN ZHEN et al.: "Robust Localization and Localizability Estimation with a Rotating Laser Scanner", 2017 IEEE International Conference on Robotics and Automation *
ZHANG Haofeng; ZHAO Chunxia; WANG Xiaoli; LI Baiyun; CHENG Weiming; LIU Huajun: "A Vision Simulation System for Mobile Robots Oriented to Outdoor Natural Environments", Journal of System Simulation, no. 03 *
HAO Gangtao; DU Xiaoping; ZHAO Jiguang; SONG Jianjun: "Relative Pose Estimation of Non-cooperative Targets by Fusing a Monocular Camera and a Scannerless 3D Lidar", Journal of Astronautics, no. 10 *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113534227B (en) * 2021-07-26 2022-07-01 中国电子科技集团公司第五十四研究所 Multi-sensor fusion absolute positioning method suitable for complex non-cooperative scene
CN113534227A (en) * 2021-07-26 2021-10-22 中国电子科技集团公司第五十四研究所 Multi-sensor fusion absolute positioning method suitable for complex non-cooperative scene
CN113820735B (en) * 2021-08-31 2023-12-01 上海华测导航技术股份有限公司 Determination method of position information, position measurement device, terminal and storage medium
CN113820735A (en) * 2021-08-31 2021-12-21 上海华测导航技术股份有限公司 Method for determining position information, position measuring device, terminal, and storage medium
CN114396943A (en) * 2022-01-12 2022-04-26 国家电网有限公司 Fusion positioning method and terminal
CN114396943B (en) * 2022-01-12 2024-07-23 国家电网有限公司 Fusion positioning method and terminal
CN115267725A (en) * 2022-09-27 2022-11-01 上海仙工智能科技有限公司 Drawing establishing method and device based on single-line laser radar and storage medium
CN115267725B (en) * 2022-09-27 2023-01-31 上海仙工智能科技有限公司 Drawing establishing method and device based on single-line laser radar and storage medium
CN116540286B (en) * 2023-07-06 2023-08-29 中国科学院空天信息创新研究院 Multi-source robust fusion positioning method based on non-integrity constraint
CN116540286A (en) * 2023-07-06 2023-08-04 中国科学院空天信息创新研究院 Multi-source robust fusion positioning method based on non-integrity constraint
CN117351140A (en) * 2023-09-15 2024-01-05 中国科学院自动化研究所 Three-dimensional reconstruction method, device and equipment integrating panoramic camera and laser radar
CN117351140B (en) * 2023-09-15 2024-04-05 中国科学院自动化研究所 Three-dimensional reconstruction method, device and equipment integrating panoramic camera and laser radar
CN117168441A (en) * 2023-11-02 2023-12-05 西安因诺航空科技有限公司 Multi-sensor fusion SLAM positioning and reconstructing method and system
CN117168441B (en) * 2023-11-02 2024-02-20 西安因诺航空科技有限公司 Multi-sensor fusion SLAM positioning and reconstructing method and system
CN117871943A (en) * 2024-03-13 2024-04-12 保定朗信电子科技有限公司 Split type high-voltage electric energy metering method and system
CN117871943B (en) * 2024-03-13 2024-05-24 保定朗信电子科技有限公司 Split type high-voltage electric energy metering method and system

Also Published As

Publication number Publication date
CN112987065B (en) 2024-01-12

Similar Documents

Publication Publication Date Title
CN112987065B (en) Multi-sensor-integrated handheld SLAM device and control method thereof
CN113781582B (en) Synchronous positioning and map creation method based on laser radar and inertial navigation combined calibration
Maaref et al. Lane-level localization and mapping in GNSS-challenged environments by fusing lidar data and cellular pseudoranges
CN111457902B (en) Water area measuring method and system based on laser SLAM positioning
CN110221603A (en) A kind of long-distance barrier object detecting method based on the fusion of laser radar multiframe point cloud
CN113899375B (en) Vehicle positioning method and device, storage medium and electronic equipment
CN115407357B (en) Low-harness laser radar-IMU-RTK positioning mapping algorithm based on large scene
CN107289910B (en) Optical flow positioning system based on TOF
CN114444158B (en) Underground roadway deformation early warning method and system based on three-dimensional reconstruction
WO2020041668A1 (en) Signals of opportunity aided inertial navigation
CN110187375A (en) A kind of method and device improving positioning accuracy based on SLAM positioning result
CN103033836B (en) navigation pointing method of vehicle navigation pointing device
Abuhadrous et al. Digitizing and 3D modeling of urban environments and roads using vehicle-borne laser scanner system
CN113820735B (en) Determination method of position information, position measurement device, terminal and storage medium
CN112967392A (en) Large-scale park mapping and positioning method based on multi-sensor contact
CN114119886A (en) High-precision map point cloud reconstruction method and device, vehicle, equipment and storage medium
CN115272596A (en) Multi-sensor fusion SLAM method oriented to monotonous texture-free large scene
CN115183762A (en) Airport warehouse inside and outside mapping method, system, electronic equipment and medium
CN114488094A (en) Vehicle-mounted multi-line laser radar and IMU external parameter automatic calibration method and device
Khoshelham et al. Vehicle positioning in the absence of GNSS signals: Potential of visual-inertial odometry
CN116608873A (en) Multi-sensor fusion positioning mapping method for automatic driving vehicle
CN115435779A (en) Intelligent body pose estimation method based on GNSS/IMU/optical flow information fusion
Li et al. Aerial-triangulation aided boresight calibration for a low-cost UAV-LiDAR system
Wu et al. AFLI-Calib: Robust LiDAR-IMU extrinsic self-calibration based on adaptive frame length LiDAR odometry
Grejner-Brzezinska et al. From Mobile Mapping to Telegeoinformatics

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant