CN112950781B - Multi-sensor dynamic weighting fusion point cloud map construction method for special scene - Google Patents
Multi-sensor dynamic weighting fusion point cloud map construction method for special scene
- Publication number: CN112950781B (application CN202110296526.9A)
- Authority: CN (China)
- Prior art keywords: carrier, key frame, pose, satellite positioning, constraint
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T 17/05 — Three dimensional [3D] modelling: geographic models
- G06F 17/16 — Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
- G06T 17/20 — Finite element generation, e.g. wire-frame surface description, tessellation
Abstract
The invention provides a method that dynamically weights each sensor constraint term in a maximum a posteriori objective function to improve trajectory accuracy, used to construct point cloud maps of special scenes comprising outdoor environments, indoor environments and signal-shielded areas. Under GNSS coverage, a satellite positioning constraint is constructed with the goal of minimizing the Euclidean distance between the carrier position and the satellite fix, and its weight is adjusted according to the laser odometry and the GNSS dilution-of-precision factor so as to reduce the influence of satellite positioning errors on the point cloud map. In GNSS-denied areas, an attitude constraint is formed by minimizing the angle between the expected gravity direction, derived from the carrier attitude, and the actual gravity direction, derived from the inertial measurement data, and its weight is adjusted according to the carrier acceleration, which solves the problem of large accumulated elevation error in the point cloud map caused by the lack of global position observations. On the basis of laser odometry, these constraints are integrated and the carrier poses are solved with a pose graph optimization method, and the key frame point clouds are stitched to accurately generate the special scene point cloud map.
Description
Technical Field
The invention relates to the technical field of point cloud maps, in particular to a multi-sensor dynamic weighting fusion point cloud map construction method for special scenes.
Background
An accurate point cloud map is a prerequisite for an unmanned system to achieve autonomous navigation and execute high-level tasks. Current methods that construct point cloud maps by fusing multi-sensor information fall mainly into outdoor and indoor mapping. Outdoor methods use GNSS (Global Navigation Satellite System) positioning information as the carrier position constraint and fuse the GNSS fixes with laser scan matching results by ordinary least squares; because GNSS errors vary over time and space, ordinary least squares introduces these errors into the system indiscriminately, the solved carrier poses are inaccurate, and the local detail of the point cloud map stitched from those poses is not fine. Indoor methods fuse UWB (Ultra Wide Band) and lidar by ordinary least squares, but UWB equipment fails in signal-shielded special scenes; lacking global position observations and any other constraint, the accumulation of laser scan inter-frame matching errors makes it impossible to build an accurate point cloud map. In addition, UWB requires base stations to be deployed in advance, and for large-scale scenes the deployment cost is very high. Furthermore, UWB signals are susceptible to interference and have low practicality.
For special scenes with signal shielding or indoor environments, existing mapping methods that fuse an IMU (Inertial Measurement Unit) and lidar generally only use the carrier pose computed from the IMU as the initial value for laser scan matching, or use IMU pre-integration to constrain carrier poses at adjacent moments; they do not fully exploit the relation between the IMU measurements and the gravity direction to construct constraints on the carrier state in the global coordinate system, so the accumulated errors of the point cloud map stitched from the carrier poses cannot be effectively eliminated.
The Chinese invention patent CN111402414A (published 2020.07.10), "A point cloud map construction method, device, equipment and storage medium", attempts to build a map containing only immovable objects from the point cloud and to avoid the trailing phenomenon caused by dynamic objects. But this solution does not solve the technical problems described above.
Disclosure of Invention
Aiming at the limitations of the prior art, the invention provides a multi-sensor dynamic weighted fusion point cloud map construction method for special scenes, which adopts the following technical scheme:
A method for constructing a point cloud map by dynamically weighted multi-sensor fusion in special scenes comprises performing data acquisition and laser scanning with a carrier equipped with a lidar, satellite positioning equipment and an inertial measurement unit in a preset indoor and outdoor environment, and constructing a point cloud map within the range of the carrier's motion trajectory from the point clouds obtained by the laser scanning; the method comprises the following steps:
S1, start the carrier outdoors; add the carrier pose at the starting point to the pose graph as a fixed vertex; taking the starting point as the coordinate origin and the east-north-up directions as the reference directions, construct a Cartesian rectangular coordinate system as the global coordinate system, and convert subsequent satellite positioning data into this global coordinate system;
S2, when the carrier movement distance exceeds a set threshold, mark the lidar scan result as a key frame and add the carrier pose corresponding to the key frame to the pose graph as a vertex to be solved. Construct a laser odometry constraint from the inter-frame scan registration result of the lidar, and adjust its dynamic weight according to the point cloud registration overlap. When the carrier is in an outdoor environment, construct a satellite positioning constraint from the Euclidean distance between the satellite fix and the carrier position, and adjust its dynamic weight according to the horizontal dilution of precision and the laser odometry information. When the carrier is in an indoor environment, compute the angle between the expected gravity direction and the actual gravity direction from the inertial measurement data combined with the carrier attitude to construct an attitude constraint, and adjust its dynamic weight according to the confidence that the carrier is in a stationary or uniform-motion state. When the carrier returns from the indoor environment to the outdoor environment, search the historical key frames using the satellite fix to obtain a closed-loop candidate key frame, register the current key frame against a local sub-map centered on the closed-loop candidate key frame, construct a laser closed-loop constraint from the registration result, and adjust its dynamic weight according to the point cloud registration overlap;
S3, according to the constraint terms, comprising the laser odometry constraint, satellite positioning constraint, attitude constraint and laser closed-loop constraint, together with their corresponding dynamic weights, connect the variable vertexes involved in each constraint to construct the pose graph; sum all constraint terms in the pose graph to obtain a maximum a posteriori objective function of the carrier trajectory, and solve the pose graph with a nonlinear optimization method;
S4, stitch the point clouds of the historical key frames according to the pose graph solution to generate the point cloud map.
Compared with the prior art, under GNSS coverage the method constructs the satellite positioning constraint with the goal of minimizing the Euclidean distance between the carrier position and the satellite fix, and adjusts the weight of the satellite positioning constraint according to the laser odometry and the GNSS horizontal dilution of precision so as to reduce the influence of satellite positioning errors on the point cloud map. In GNSS-denied areas, it constructs the attitude constraint with the goal of minimizing the angle between the expected gravity direction, formed from the carrier attitude, and the actual gravity direction, formed from the inertial measurement data, and adjusts the weight of this constraint according to the confidence that the carrier is stationary or in uniform motion, thereby solving the problem of large accumulated elevation error in the point cloud map caused by the lack of global position observations. On the basis of laser odometry, the satellite positioning constraint, attitude constraint, laser odometry constraint and closed-loop constraint are integrated in a pose graph to obtain a maximum a posteriori objective function of the carrier trajectory. The carrier poses obtained by solving the objective function are used to stitch the lidar key frame point clouds, yielding an indoor-outdoor point cloud map with small accumulated error, consistency with the geographic coordinate system, and clear local detail.
The method can be used to construct point cloud maps of special scenes comprising outdoor environments, indoor environments and signal-shielded areas, and, under the satellite positioning constraint and attitude constraint built from satellite positioning data and inertial measurement data with dynamic weights, it aligns the indoor and outdoor point cloud maps with the geographic coordinate system, which benefits subsequent fusion localization methods.
As a preferable mode, the step S1 includes the following steps:
S11, select a position with a good positioning signal in the outdoor environment as the carrier starting point S_0; acquire the starting point longitude λ_0, latitude φ_0, height h_0 and heading angle ψ_0 with the satellite positioning equipment; add the carrier pose T_0 at starting point S_0 to the pose graph as a fixed vertex, and record the lidar scan at the starting point as key frame V_0;
S12, from the longitude, latitude, altitude and heading angle, establish a Cartesian rectangular coordinate system as the global coordinate system, with the starting point as origin and east-north-up as the reference directions: first convert the starting point longitude λ_0, latitude φ_0 and height h_0 into the ECEF coordinate system to obtain the ECEF coordinates (ecef_x0, ecef_y0, ecef_z0) of starting point S_0, and compute the rotation matrix corresponding to the longitude and latitude;
S13, for the longitude, latitude and altitude information (λ_i, φ_i, h_i) output by the satellite positioning equipment, first convert the coordinates into the ECEF coordinate system to obtain (ecef_xi, ecef_yi, ecef_zi), and then compute the corresponding satellite positioning coordinate values in the Cartesian rectangular coordinate system:
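The conversion chain of steps S12–S13 (geodetic coordinates to ECEF to the local east-north-up frame) can be sketched with the standard WGS-84 formulas. The patent's own formula images did not survive extraction, so the constants and function names below are illustrative assumptions, not the patent's notation:

```python
import math

# WGS-84 ellipsoid constants (standard values, assumed here)
A = 6378137.0            # semi-major axis (m)
E2 = 6.69437999014e-3    # first eccentricity squared

def lla_to_ecef(lam_deg, phi_deg, h):
    """Convert longitude/latitude (degrees) and height (m) to ECEF coordinates."""
    lam, phi = math.radians(lam_deg), math.radians(phi_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(phi) ** 2)   # prime vertical radius
    x = (n + h) * math.cos(phi) * math.cos(lam)
    y = (n + h) * math.cos(phi) * math.sin(lam)
    z = (n * (1.0 - E2) + h) * math.sin(phi)
    return x, y, z

def ecef_to_enu(ecef, ecef0, lam0_deg, phi0_deg):
    """Rotate an ECEF offset from the origin into the local east-north-up frame."""
    lam, phi = math.radians(lam0_deg), math.radians(phi0_deg)
    dx = [ecef[i] - ecef0[i] for i in range(3)]
    e = -math.sin(lam) * dx[0] + math.cos(lam) * dx[1]
    n = (-math.sin(phi) * math.cos(lam) * dx[0]
         - math.sin(phi) * math.sin(lam) * dx[1]
         + math.cos(phi) * dx[2])
    u = (math.cos(phi) * math.cos(lam) * dx[0]
         + math.cos(phi) * math.sin(lam) * dx[1]
         + math.sin(phi) * dx[2])
    return e, n, u
```

Every subsequent satellite fix (λ_i, φ_i, h_i) is passed through `lla_to_ecef` and then `ecef_to_enu` with the starting point S_0 as origin, which yields its position in the global coordinate system of step S1.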
Further, from the starting point until the carrier enters the indoor environment, step S2 loops through the following steps:
S21, according to the preset carrier relative movement distance threshold Dist, extract key frame V_i from the data acquired by the lidar, where i denotes the key frame number;
S22, register the point cloud of key frame V_i against that of key frame V_{i-1} to obtain the relative carrier pose transformation ΔT_{i,i-1} between the two key frames, compute the average Euclidean distance of the valid matching point pairs of the two registered point clouds, and from these construct the laser odometry constraint; using the relative pose transformation ΔT_{i,i-1} as the pose increment, compute the pose T_i of key frame V_i as the initial value to be subsequently added to the pose graph;
S23, taking the timestamp t_i corresponding to key frame V_i as the target synchronization timestamp, interpolate the satellite positioning data to obtain the satellite fix P_i and horizontal dilution of precision HDOP_i time-synchronized with key frame V_i; evaluate the confidence of the interpolation-synchronized satellite fix P_i against the laser odometry to obtain the confidence evaluation bel(P_i); and construct the satellite positioning constraint from the satellite fix P_i, the horizontal dilution of precision HDOP_i and the confidence evaluation bel(P_i);
Wherein, the step S22 includes the following steps:
S221, register the point cloud of key frame V_i against that of key frame V_{i-1} to obtain the relative carrier pose transformation ΔT_{i,i-1} between the two key frames;
S222, compute the average Euclidean distance of the valid matching point pairs between the two registered point clouds, and construct the laser odometry constraint from the relative carrier pose transformation ΔT_{i,i-1};
where the L(·) function maps pose data T = {R, t}, consisting of rotation and translation, into the tangent space of the SE(3) manifold to obtain the corresponding Lie algebra element, and the weight of the laser odometry constraint is determined by the average Euclidean distance of the valid matching point pairs of the two registered point clouds:
S223, using the relative pose transformation ΔT_{i,i-1} as the pose increment, compute the initial value (T_{i-1} ΔT_{i,i-1}) of the pose T_i of key frame V_i, and subsequently add pose T_i to the pose graph as a vertex connected to the vertex corresponding to key frame V_{i-1};
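Steps S221–S223 hinge on registering two key frame point clouds and weighting the resulting constraint by the mean residual distance. A minimal sketch, assuming known point correspondences and an illustrative weight form (the patent's exact weight formula is not reproduced here), using the standard SVD-based rigid alignment:

```python
import numpy as np

def register_point_pairs(src, dst):
    """Estimate the rigid transform (R, t) aligning matched point pairs src -> dst
    by the SVD (Kabsch) method, and return the mean residual distance."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # guard against reflection
    R = Vt.T @ D @ U.T
    t = cd - R @ cs
    residuals = np.linalg.norm((src @ R.T + t) - dst, axis=1)
    return R, t, float(residuals.mean())

def odometry_weight(mean_dist, scale=0.05):
    """Dynamic weight of the laser odometry constraint: the larger the mean
    matching distance, the smaller the weight (illustrative functional form)."""
    return 1.0 / (1.0 + mean_dist / scale)
```

Here (R, t) plays the role of ΔT_{i,i-1} and the mean residual plays the role of the average Euclidean distance of the valid matching point pairs that drives the constraint weight.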
wherein, the step S23 includes the following steps:
S231, interpolation-synchronize the satellite positioning data with key frame V_i according to the timestamps, as follows:
Take the timestamp t_i corresponding to key frame V_i as the target synchronization timestamp. Check whether the timestamp of the first datum in the satellite positioning data queue equals t_i; if so, that satellite positioning datum is considered time-synchronized with lidar key frame V_i and is taken as the timestamp-synchronized satellite fix P_i of key frame V_i. Otherwise, check the satellite positioning data queue:
if the timestamp of the first datum in the queue does not meet the interpolation requirement, pop that datum from the queue and check the satellite positioning data queue again;
if the timestamps of both the 0th and the 1st data in the satellite positioning data queue are less than t_i, pop the 0th datum from the queue and check the satellite positioning data queue again;
if the timestamp of the earliest datum in the queue is already later than the timestamp t_i of key frame V_i, the satellite positioning data lag the key frame in time and the interpolation synchronization fails.
After the satellite positioning data queue has been checked, if it contains at least 2 data, the interpolation synchronization condition is met: take the timestamps of the 0th and 1st data in the satellite positioning data queue, compute the interpolation scale factors s_f and s_b, and interpolate the satellite positioning information in the Cartesian coordinate system of step S13 according to these scale factors to obtain the satellite fix P_i and horizontal dilution of precision HDOP_i synchronized with the timestamp of key frame V_i:
where the interpolation operates on the frame data in the satellite positioning data queue and their corresponding horizontal dilution of precision values;
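The queue-based interpolation synchronization of S231 can be sketched as follows; the tuple layout, tolerance `eps` and failure handling are assumptions for illustration, not the patent's exact procedure:

```python
def interpolate_gnss(queue, t_i, eps=1e-9):
    """Interpolation synchronization of S231 (illustrative): `queue` holds
    time-ordered entries (timestamp, (x, y, z), hdop) in the global Cartesian
    frame of step S13. Returns (position, hdop) at target timestamp t_i, or
    None when the synchronization fails."""
    # pop stale data: both the 0th and 1st timestamps lie before t_i
    while len(queue) >= 2 and queue[1][0] < t_i:
        queue.pop(0)
    if queue and abs(queue[0][0] - t_i) < eps:
        _, pos, hdop = queue[0]          # already time-synchronized
        return pos, hdop
    if len(queue) < 2 or queue[0][0] > t_i:
        return None                      # data lag the key frame: failure
    (t_f, p_f, h_f), (t_b, p_b, h_b) = queue[0], queue[1]
    s_b = (t_i - t_f) / (t_b - t_f)      # interpolation scale factors s_f, s_b
    s_f = 1.0 - s_b
    pos = tuple(s_f * a + s_b * b for a, b in zip(p_f, p_b))
    return pos, s_f * h_f + s_b * h_b
```

Both the position and the HDOP are interpolated with the same scale factors, matching the text's requirement that HDOP_i be synchronized to the key frame timestamp as well.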
S232, evaluate the confidence of the interpolation-synchronized satellite fix P_i against the laser odometry of step S22, as follows:
From the registration of the point cloud of lidar key frame V_i against that of lidar key frame V_{i-1} in step S22, take the relative carrier pose transformation (ΔT_{i,i-1})^{-1} as the estimate of the carrier pose increment from t_{i-1} to t_i;
taking the carrier pose T_{i-1} at time t_{i-1} as reference, use the above pose increment estimate ΔT_{i,i-1} to obtain the satellite positioning prediction corresponding to time t_i,
where (·)_[x,y,z] denotes the position components of the pose data;
evaluate the confidence bel(P_i) of P_i from the Euclidean distance between the satellite positioning prediction corresponding to time t_i and the time-synchronized satellite fix P_i actually acquired from the GNSS/RTK receiver:
S233, from the interpolation-synchronized satellite fix P_i, construct the satellite positioning constraint on the carrier pose T_i corresponding to key frame V_i,
where the weight of the satellite positioning constraint is composed of the interpolation-synchronized horizontal dilution of precision HDOP_i and the laser-odometry-based confidence evaluation bel(P_i):
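A hedged sketch of S232–S233: the confidence bel(P_i) falls off with the distance between the odometry-predicted and the measured position, and the satellite constraint weight combines it with HDOP_i. Both functional forms below (Gaussian confidence, bel/HDOP² weight) are illustrative assumptions, since the patent's formula images are not reproduced:

```python
import math

def gnss_confidence(pred_pos, meas_pos, sigma=1.0):
    """Confidence bel(P_i) of the interpolated satellite fix, from the Euclidean
    distance between the odometry-predicted and the measured position
    (illustrative Gaussian form; sigma is an assumed tolerance in metres)."""
    d = math.dist(pred_pos, meas_pos)
    return math.exp(-0.5 * (d / sigma) ** 2)

def gnss_weight(hdop, bel):
    """Dynamic weight of the satellite positioning constraint: decreases with the
    horizontal dilution of precision, increases with the confidence (assumed form)."""
    return bel / (hdop ** 2)
```

With this form, a satellite fix that contradicts the laser odometry or arrives with a poor HDOP contributes little to the pose graph, which is exactly the intended dynamic down-weighting of degraded GNSS.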
Further, while the carrier operates in the indoor environment, step S2 loops through the following steps:
S21, according to the preset carrier relative movement distance threshold Dist, extract key frame V_i from the data acquired by the lidar, where i denotes the key frame number;
S22, register the point cloud of key frame V_i against that of key frame V_{i-1} to obtain the relative carrier pose transformation ΔT_{i,i-1} between the two key frames, compute the average Euclidean distance of the valid matching point pairs of the two registered point clouds, and from these construct the laser odometry constraint; using the relative pose transformation ΔT_{i,i-1} as the pose increment, compute the pose T_i of key frame V_i as the initial value to be subsequently added to the pose graph;
S24, taking the timestamp t_i corresponding to key frame V_i as the target synchronization timestamp, interpolation-synchronize the inertial measurement data acquired by the inertial measurement unit to obtain the inertial measurement data I_i and I'_i time-synchronized with key frame V_i; from the inertial measurement data I'_i, estimate the confidence bel(I_i) that the carrier is stationary or in uniform motion and set the flag bit; and construct the attitude constraint from the inertial measurement data I_i, the stationary/uniform-motion confidence bel(I_i) and the flag bit;
The step S24 includes the steps of:
S241, apply Gaussian-weighted interpolation on the inertial measurement data queue to obtain the inertial measurement data synchronized with the timestamp of key frame V_i, processed as follows:
Take the timestamp t_i corresponding to key frame V_i as the target synchronization timestamp;
using binary search, find in the time-ordered inertial measurement data queue the inertial measurement datum whose timestamp is the least upper time bound of the target synchronization timestamp t_i, denoted I_k, and take n frames of inertial measurement data before and after it to form the candidate interpolation inertial measurement data sequence;
interpolate the candidate interpolation inertial measurement data sequence with a Gaussian probability density function to obtain the inertial measurement data I_i synchronized with the timestamp of key frame V_i:
where w_k denotes the weight of an inertial measurement datum, computed from its timestamp according to the Gaussian probability density function:
in which the target synchronization timestamp t_i serves as the Gaussian mean, a single inertial measurement period σ serves as the Gaussian variance, and t_k denotes the timestamp of the k-th frame datum in the inertial measurement data queue;
with t_i as the target interpolation time, linearly interpolate between the two inertial measurement data adjacent to t_i to obtain another inertial measurement datum synchronized with the timestamp of key frame V_i, denoted I'_i;
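The Gaussian-weighted interpolation of S241 can be sketched for a single measurement channel as below; the default window `n` and period `sigma` are assumed values:

```python
import bisect
import math

def gaussian_interpolate_imu(times, values, t_i, n=3, sigma=0.005):
    """Gaussian-weighted interpolation of one inertial measurement channel.
    `times` is sorted ascending; take n frames on each side of the least upper
    time bound of t_i, weight them with a Gaussian pdf centered at t_i (one
    measurement period sigma as variance), and return the normalized average."""
    k = bisect.bisect_left(times, t_i)            # index of the upper time bound
    lo, hi = max(0, k - n), min(len(times), k + n)
    idx = range(lo, hi)
    # w_k from the Gaussian pdf of each timestamp around the target time t_i
    w = [math.exp(-0.5 * ((times[j] - t_i) / sigma) ** 2) for j in idx]
    s = sum(w)
    return sum(wi * values[j] for wi, j in zip(w, idx)) / s
```

In practice the same weights would be applied per axis of the accelerometer and gyroscope vectors; the second, linearly interpolated datum I'_i of the text is the ordinary two-point interpolation between the samples bracketing t_i.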
S242, from the inertial measurement data I'_i, estimate the confidence bel(I_i) that the carrier is stationary or in uniform motion, and set the flag bit,
where the (·)_[acc] function extracts the accelerometer data from the inertial measurement data and G_0 denotes the local gravity vector;
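A minimal sketch of the stationary/uniform-motion test of S242: under no net acceleration the accelerometer norm equals the local gravity magnitude, so the deviation from ||G_0|| drives the confidence. The Gaussian form and the thresholds are illustrative assumptions:

```python
import math

G0 = 9.81  # local gravity magnitude in m/s^2 (assumed value)

def motion_confidence(acc, sigma=0.2):
    """Confidence bel(I_i) that the carrier is stationary or in uniform motion:
    when only gravity acts, the accelerometer norm equals ||G_0||; the
    confidence decays with the deviation (illustrative Gaussian form)."""
    dev = abs(math.sqrt(sum(a * a for a in acc)) - G0)
    return math.exp(-0.5 * (dev / sigma) ** 2)

def motion_flag(acc, threshold=0.5):
    """Flag bit: 1 when the confidence is high enough, else 0."""
    return 1 if motion_confidence(acc) >= threshold else 0
```

When the flag is 0 the accelerometer is measuring more than gravity, so the gravity-direction attitude constraint of S243 is unreliable and its weight should collapse.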
S243, from the interpolation-synchronized inertial measurement data I_i, use the gravity direction to construct the attitude constraint on the carrier pose T_i corresponding to key frame V_i,
where the (·)_[r,p] function extracts the pitch and roll angles from the estimated carrier pose data, the (·)_[acc] function extracts the accelerometer data from the inertial measurement data, the H(·) function computes the expected gravity direction from the carrier pitch angle, roll angle and inertial measurement acceleration, the local gravity vector is normalized to obtain the actual gravity direction vector, and the weight of this attitude constraint is composed of the carrier motion state flag bit and the confidence bel(I_i):
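The residual of the attitude constraint in S243 is the angle between the expected gravity direction obtained from the carrier attitude and the measured accelerometer direction. A sketch assuming a ZYX Euler convention (the patent's H(·) function is not reproduced):

```python
import math

def expected_gravity_dir(roll, pitch):
    """Expected unit gravity direction in the body frame from roll/pitch,
    ZYX Euler convention; this plays the role of the H(.) function."""
    return (-math.sin(pitch),
            math.sin(roll) * math.cos(pitch),
            math.cos(roll) * math.cos(pitch))

def gravity_angle(roll, pitch, acc):
    """Angle (rad) between the expected gravity direction and the normalized
    accelerometer direction: the residual of the attitude constraint."""
    g_exp = expected_gravity_dir(roll, pitch)
    n = math.sqrt(sum(a * a for a in acc))
    dot = sum(e * a / n for e, a in zip(g_exp, acc))
    return math.acos(max(-1.0, min(1.0, dot)))
```

Minimizing this angle over the carrier attitude anchors roll and pitch in the global frame even without GNSS, which is what suppresses the elevation drift described in the abstract.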
Further, after the carrier returns from the indoor environment to the outdoor environment, step S2 performs the following steps:
S21, according to the preset carrier relative movement distance threshold Dist, extract key frame V_i from the data acquired by the lidar, where i denotes the key frame number;
S25, build a local sub-map centered on historical key frame V_j; perform closed-loop scan matching between the point cloud of key frame V_i and the local sub-map to obtain the relative carrier pose transformation ΔT_{ij} between key frame V_i and historical key frame V_j, compute the average Euclidean distance of the valid matching point pairs of the two registered point clouds, and construct the laser closed-loop scan constraint;
Further, the step S25 includes the steps of:
S251, using the satellite positioning data corresponding to key frame V_i as the search target, search the KD-Tree formed by all key frames V to find the historical key frame V_j whose carrier position is at the shortest Euclidean distance from that of key frame V_i;
S252, centered on historical key frame V_j, index N key frames on each side from the historical key frame sequence to obtain the candidate closed-loop key frame sequence [V_{j-N}, …, V_j, …, V_{j+N}], and, according to the corresponding carrier pose sequence [T_{j-N}, …, T_j, …, T_{j+N}], apply the following coordinate transformation to the point clouds in the candidate closed-loop key frame sequence to construct the local sub-map:
where the transformed points are map points in the local sub-map, the input points belong to the key frames of the candidate closed-loop sequence, and {R_j, t_j} denote the rotation matrix and translation that together form the carrier pose T_j corresponding to historical key frame V_j;
S253, register the point cloud of key frame V_i against the local sub-map, compute the average Euclidean distance of the valid matching point pairs of the two registered point clouds, and set the laser closed-loop flag bit according to this Euclidean distance;
S254, from the relative carrier pose observation ΔT_{ji} between key frame V_i and historical key frame V_j, obtained by registering the point cloud of key frame V_i against the local sub-map, construct the laser closed-loop scan constraint on the carrier poses T_i and T_j,
where the L(·) function maps pose data T = {R, t}, consisting of rotation and translation, into the tangent space of the SE(3) manifold, and the weight of the laser closed-loop scan constraint is determined by the average Euclidean distance of the valid matching point pairs of the two registered point clouds:
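Steps S251–S252 can be sketched as below; a brute-force nearest-neighbor search stands in for the KD-Tree of S251, and the recent-frame guard `min_index_gap` is an assumed detail to avoid matching against the immediately preceding frames:

```python
import numpy as np

def find_loop_candidate(positions, query, min_index_gap=10):
    """Find the historical key frame whose carrier position is nearest to
    `query` (brute force in place of the KD-Tree), skipping recent frames."""
    positions = np.asarray(positions, float)
    if len(positions) <= min_index_gap:
        return None
    cand = positions[: len(positions) - min_index_gap]
    d = np.linalg.norm(cand - np.asarray(query, float), axis=1)
    return int(np.argmin(d))

def build_local_submap(clouds, poses, j, N=2):
    """Transform the point clouds of key frames V_{j-N}..V_{j+N} into the
    global frame with their poses (R_k, t_k) and concatenate them."""
    parts = []
    for k in range(max(0, j - N), min(len(clouds), j + N + 1)):
        R, t = poses[k]
        parts.append(np.asarray(clouds[k], float) @ np.asarray(R, float).T
                     + np.asarray(t, float))
    return np.vstack(parts)
```

The returned sub-map is then registered against the current key frame's point cloud (S253) with the same registration routine used for the odometry constraint.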
further, the step S3 includes the following steps:
S31, sequentially connect the key frame poses, as vertexes, to form the backbone of the pose graph;
S32, connect the vertexes in the pose graph according to the constraint relations of the laser odometry constraint and the laser closed-loop constraint, distinguishing the constraint weights by different edge thicknesses;
S33, connect the corresponding vertexes in the pose graph according to the satellite positioning constraint and the attitude constraint, distinguishing the constraint weights by different edge thicknesses.
Further, the step S3 further includes the following steps:
S34, sum all constraints in the pose graph to obtain the maximum a posteriori objective function of the carrier trajectory, whose variables to be solved are the carrier poses corresponding to all historical key frames;
S35, solve the objective function to obtain the carrier pose T* corresponding to each vertex in the pose graph.
Further, the objective function is expressed as follows:
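The formula image did not survive extraction. Under the constraint terms named in step S3, a hedged sketch of the weighted maximum a posteriori objective (all symbols here are assumed notation, not the patent's) is:

```latex
T^{*} = \arg\min_{T}\;
        \sum_{i}   \omega^{o}_{i}  \left\| e^{o}_{i}(T_{i-1},T_{i}) \right\|^{2}
      + \sum_{i}   \omega^{g}_{i}  \left\| e^{g}_{i}(T_{i})         \right\|^{2}
      + \sum_{i}   \omega^{a}_{i}  \left\| e^{a}_{i}(T_{i})         \right\|^{2}
      + \sum_{i,j} \omega^{l}_{ij} \left\| e^{l}_{ij}(T_{i},T_{j})  \right\|^{2}
```

where e^o, e^g, e^a and e^l denote the laser odometry, satellite positioning, attitude and laser closed-loop residuals respectively, and the ω terms are their dynamic weights from step S2.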
further, the step S4 includes the following steps:
S41, obtain the carrier pose T_i corresponding to each key frame from the pose graph solution;
S42, for each key frame V_i (i = 0, 1, …, m), convert every point p_n ∈ C_i (n = 0, 1, …, k) of its point cloud C_i into the global coordinate system to stitch the point clouds and generate the point cloud map.
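The stitching of S41–S42 reduces to transforming each key frame's point cloud by its solved pose and concatenating; a minimal numpy sketch:

```python
import numpy as np

def stitch_point_clouds(clouds, poses):
    """S42 sketch: transform each key frame point cloud into the global
    coordinate system with its solved pose T_i = {R_i, t_i} and concatenate
    the results into one map array of shape (total_points, 3)."""
    out = []
    for cloud, (R, t) in zip(clouds, poses):
        pts = np.asarray(cloud, float)
        out.append(pts @ np.asarray(R, float).T + np.asarray(t, float))
    return np.vstack(out)
```

A real implementation would typically follow this with voxel downsampling to bound the map size, but that step is not part of the claims here.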
Drawings
Fig. 1 is a general logic schematic diagram of a point cloud map construction method for dynamic weighted fusion of multiple sensors of a special scene provided by an embodiment of the present invention;
fig. 2 is a step flowchart of a method for constructing a point cloud map by dynamically weighting and fusing multiple sensors of a special scene provided by an embodiment of the invention;
FIG. 3 is a flow chart showing the steps of step S1 according to the embodiment of the present invention;
FIG. 4 is a flowchart showing steps of step S2 of an embodiment of the present invention, wherein the steps are cycled from the start point to the time when the carrier enters the indoor environment;
FIG. 5 is a flowchart showing the steps of step S22 according to an embodiment of the present invention;
FIG. 6 is a step flow chart of step S2 of an embodiment of the present invention, wherein the step is cycled during operation of the carrier in an indoor environment;
fig. 7 is a flowchart showing steps performed in step S2 after the carrier is returned from the indoor environment to the outdoor environment according to the embodiment of the present invention;
FIG. 8 is a flowchart illustrating the steps of step S25 according to an embodiment of the present invention;
FIG. 9 is a step flow chart of step S3 according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a motion trajectory of a carrier according to an embodiment of the present invention;
FIG. 11 is a pose chart generated by a multi-sensor dynamic weighted fusion point cloud map construction method of a special scene provided by an embodiment of the invention;
FIG. 12 is a flowchart showing the steps of step S4 according to the embodiment of the present invention;
fig. 13 is an example of a point cloud map generated by a multi-sensor dynamic weighted fusion point cloud map construction method of a special scene according to an embodiment of the present invention.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the present patent;
It should be understood that the described embodiments are merely some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the embodiments of the present application, are within the scope of the embodiments of the present application.
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the embodiments of the application. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present application as detailed in the accompanying claims. In the description of this application, it should be understood that the terms "first," "second," "third," and the like are used merely to distinguish between similar objects and are not necessarily used to describe a particular order or sequence, nor should they be construed to indicate or imply relative importance. The specific meaning of the terms in this application will be understood by those of ordinary skill in the art as the case may be.
Furthermore, in the description of the present application, unless otherwise indicated, "a plurality" means two or more. "and/or", describes an association relationship of an association object, and indicates that there may be three relationships, for example, a and/or B, and may indicate: a exists alone, A and B exist together, and B exists alone. The character "/" generally indicates that the context-dependent object is an "or" relationship. The invention is further illustrated in the following figures and examples.
In order to solve the limitations of the prior art, the present embodiment provides a technical solution, and the technical solution of the present invention is further described below with reference to the drawings and the embodiments.
A method for constructing a point cloud map by multi-sensor dynamic weighted fusion for a special scene performs data acquisition and laser scanning in a preset indoor and outdoor environment through a carrier equipped with a laser radar, satellite positioning equipment and an inertial measurement unit, and constructs a point cloud map within the range of the carrier's motion trajectory from the point clouds obtained by the laser scanning;
specifically, the carrier is started outdoors, the carrier enters the indoor environment of the signal shielding area after collecting outdoor environment data, and finally, the carrier is driven out of the indoor environment and returns to the outdoor environment again to form a closed loop.
Referring to fig. 1 and 2, the method for constructing the multi-sensor dynamic weighted fusion point cloud map of the special scene comprises the following steps:
s1, starting the carrier from the outside, adding the carrier pose at the starting point into a pose graph in a fixed vertex mode, taking the starting point as a coordinate origin, taking the east-north-sky direction as a reference direction, constructing a Cartesian rectangular coordinate system as a global coordinate system, and converting the subsequent satellite positioning data into the global coordinate system;
s2, when the carrier movement distance is greater than a set threshold value, marking a scanning result of the laser radar as a key frame, and adding a carrier pose corresponding to the key frame as a vertex to be solved into a pose graph; constructing laser mileage constraint according to an interframe scanning registration result of the laser radar, and adjusting dynamic weight of the laser mileage constraint according to point cloud registration coincidence degree; when the carrier is positioned in an outdoor environment, constructing satellite positioning constraint according to Euclidean distance between a satellite positioning result and the carrier position, and adjusting dynamic weight of the satellite positioning constraint according to a horizontal precision factor and laser mileage information; when the carrier is positioned in an indoor environment, calculating an included angle between an expected gravity direction and an actual gravity direction according to data of an inertial measurement unit and combining the carrier posture to construct posture constraint, and adjusting dynamic weight of the posture constraint according to the confidence coefficient of the carrier in a static or uniform state; when the carrier returns to the outdoor environment from the indoor environment, searching the historical key frame by utilizing a satellite positioning result to obtain a closed-loop candidate key frame, carrying out point cloud registration by utilizing the current key frame and a local sub-map taking the closed-loop candidate key frame as the center, constructing a laser closed-loop constraint according to the registration result, and adjusting the dynamic weight of the laser closed-loop constraint according to the point cloud registration coincidence degree;
S3, connecting variable vertexes related to each constraint according to constraint items comprising the laser mileage constraint, satellite positioning constraint, attitude constraint and laser closed-loop constraint and dynamic weights corresponding to the constraint items, constructing a pose graph, summing all constraint items in the pose graph to obtain a maximum posterior probability objective function related to a carrier track, and solving the pose graph by using a nonlinear optimization method;
and S4, splicing the point clouds of the historical key frames according to the solving result of the pose graph, and generating a point cloud map.
Compared with the prior art, under GNSS coverage the method constructs the satellite positioning constraint with the goal of minimizing the Euclidean distance between the carrier position and the satellite fix, and adjusts the weight of the satellite positioning constraint according to the laser odometry and the GNSS horizontal precision factor, so as to reduce the influence of satellite positioning errors on the point cloud map. In the GNSS dead zone, the attitude constraint is constructed with the goal of minimizing the angle between the expected gravity direction formed from the carrier attitude and the actual gravity direction given by the inertial measurement data, and the weight of the attitude constraint is adjusted according to the confidence that the carrier is in a stationary or uniform-motion state, thereby solving the problem of large accumulated elevation error in the point cloud map caused by the lack of global position observations. On the basis of the laser odometry, the satellite positioning constraint, attitude constraint, laser odometry constraint and closed-loop constraint are integrated in a pose graph to obtain a maximum a posteriori objective function of the carrier trajectory. The carrier poses solved from this objective function are used to stitch the laser radar key frame point clouds, yielding an indoor and outdoor point cloud map with small accumulated error, consistency with the geographic coordinate system, and clear local detail.
The method can be used for constructing the point cloud map of the special scene comprising the outdoor environment, the indoor environment and the signal shielding area, and realizes the alignment of the indoor and outdoor point cloud maps and a geographic coordinate system under the satellite positioning constraint and attitude constraint conditions constructed by the satellite positioning data and the inertial measurement data based on dynamic weights, thereby being beneficial to the implementation of the subsequent fusion positioning method.
Specifically, the constraint items comprise satellite positioning constraint constructed by positioning data acquired by the satellite positioning equipment, attitude constraint constructed by inertial measurement data acquired by the inertial measurement unit, laser mileage constraint constructed by registering point clouds obtained by laser radar laser scanning and laser closed-loop constraint.
The historical key frames are the key frames obtained in the outdoor environment before the carrier enters the indoor environment; they are matched against the key frames obtained after the carrier returns from the indoor environment to the outdoor environment.
The satellite positioning device refers to a measuring device for obtaining absolute positioning coordinates in a coordinate system by observing GNSS satellites. GNSS (Global Navigation Satellite System) is a generic term for all navigation positioning satellites, and any system that can achieve positioning by capturing and tracking its satellite signals can be included in the scope of GNSS systems, including GPS, beidou satellite navigation system (BDS), GLONASS satellite navigation system, GALILEO satellite navigation system, and so on.
The satellite positioning device can also employ RTK (Real-Time Kinematic) carrier-phase differential technology: the carrier-phase observations collected at a reference station are sent to the user receiver in real time, and the differences between the carrier-phase observations of the two stations are processed to solve for the coordinates.
Laser mileage (laser odometry) means calculating the relative movement distance and rotation angle of the carrier from two consecutive frames of laser radar data.
In the SLAM (simultaneous localization and mapping) field, a pose represents the transformation from the world coordinate system to the camera coordinate system, including rotation and translation; a pose is essentially a transformation matrix, i.e. the transformation relationship between the world coordinate system and the camera coordinate system.
The pose graph (Pose Graph) considers only the poses, constructing a graph optimization over the trajectory alone: each edge between pose nodes is given an initial value by the motion estimate obtained through feature matching between two key frames. Once this initialization is done, the positions of landmark points are no longer optimized; only the relations between the poses of the acquisition equipment are of concern.
Regarding the technical problem that GNSS/RTK errors in the outdoor environment introduce noisy constraint terms into the objective function, making the solved carrier poses inaccurate so that the point cloud map stitched according to the carrier poses cannot be strictly aligned, this embodiment can in effect be regarded as providing a dynamic-weight method: the influence of observation noise on the point cloud map is reduced by weighting, in real time, the constraint terms constructed from sensor observations. Taking GNSS as an example, the embodiment weights the satellite positioning constraint according to the confidence of the satellite positioning information, such as the number of visible satellites at a given time and place; since that number changes dynamically, the weight of the satellite positioning constraint changes dynamically as well.
To solve the problem that, in the signal-shielded area of the special scene, the lack of global position observations means the error accumulated by matching between laser scan frames cannot be effectively eliminated, leaving a large accumulated error in the height direction of the final point cloud map, the invention provides a method for constructing a carrier attitude constraint from inertial measurement unit (IMU) data.
As a preferred embodiment, referring to fig. 3, the step S1 includes the following steps:
S11, a position with a good positioning signal in the outdoor environment is selected as the starting point S_0 of the carrier; the satellite positioning equipment is used to acquire the longitude λ_0, latitude φ_0, height h_0 and heading angle ψ_0 of the starting point; the carrier pose T_0 at the starting point S_0 is added to the pose graph as a fixed vertex, and the laser radar scan at the starting point is recorded as key frame V_0;
S12, according to the longitude, latitude, altitude and heading angle, a Cartesian rectangular coordinate system with the starting point as origin and east-north-up as the reference directions is established as the global coordinate system: the longitude λ_0, latitude φ_0 and height h_0 of the starting point are first converted into the ECEF coordinate system to obtain the starting point S_0's ECEF coordinates (ecef_x0, ecef_y0, ecef_z0), and the rotation matrix corresponding to the longitude and latitude (the standard ECEF-to-ENU rotation) is computed:
R_0 = [ −sin λ_0, cos λ_0, 0;
        −sin φ_0 cos λ_0, −sin φ_0 sin λ_0, cos φ_0;
         cos φ_0 cos λ_0,  cos φ_0 sin λ_0, sin φ_0 ]
S13, for the longitude, latitude and altitude information (λ_i, φ_i, h_i) output by the satellite positioning device, the coordinates are first converted into the ECEF coordinate system to obtain (ecef_xi, ecef_yi, ecef_zi), and the corresponding satellite positioning coordinate value in the Cartesian rectangular coordinate system is then calculated as
(x_i, y_i, z_i)^T = R_0 · (ecef_xi − ecef_x0, ecef_yi − ecef_y0, ecef_zi − ecef_z0)^T
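As an illustration of steps S12-S13, the geodetic-to-ECEF-to-ENU conversion can be sketched in Python (WGS-84 ellipsoid constants; function names are illustrative, not from the patent):

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0            # semi-major axis [m]
E2 = 6.69437999014e-3    # first eccentricity squared

def geodetic_to_ecef(lam_deg, phi_deg, h):
    """Convert longitude/latitude (degrees) and height (m) to ECEF coordinates."""
    lam, phi = math.radians(lam_deg), math.radians(phi_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(phi) ** 2)  # prime-vertical radius
    x = (n + h) * math.cos(phi) * math.cos(lam)
    y = (n + h) * math.cos(phi) * math.sin(lam)
    z = (n * (1.0 - E2) + h) * math.sin(phi)
    return (x, y, z)

def ecef_to_enu(ecef, ecef0, lam0_deg, phi0_deg):
    """Rotate an ECEF offset from the origin into the local ENU (east-north-up) frame."""
    lam, phi = math.radians(lam0_deg), math.radians(phi0_deg)
    dx, dy, dz = (ecef[k] - ecef0[k] for k in range(3))
    e = -math.sin(lam) * dx + math.cos(lam) * dy
    n = (-math.sin(phi) * math.cos(lam) * dx
         - math.sin(phi) * math.sin(lam) * dy
         + math.cos(phi) * dz)
    u = (math.cos(phi) * math.cos(lam) * dx
         + math.cos(phi) * math.sin(lam) * dy
         + math.sin(phi) * dz)
    return (e, n, u)
```

The starting point maps to the ENU origin, and later GNSS fixes become local Cartesian coordinates relative to it.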
further, referring to fig. 4, in the step S2, the following steps are circulated during the process from the starting point to the time when the carrier enters the indoor environment:
S21, a key frame V_i is extracted from the data acquired by the laser radar according to a preset carrier relative movement distance threshold Dist, where i denotes the key frame number;
S22, point cloud registration is performed between the point cloud of key frame V_i and that of key frame V_{i-1} to obtain the corresponding relative carrier pose transformation ΔT_{i,i-1} between the two key frames; the average Euclidean distance of the valid matching point pairs of the two registered point clouds is calculated and the laser mileage constraint is constructed; using the relative pose transformation ΔT_{i,i-1} as the pose increment, the pose T_i of key frame V_i is calculated as the initial value subsequently added to the pose graph;
S23, taking the timestamp t_i corresponding to key frame V_i as the target synchronization timestamp, the satellite positioning data is interpolated to obtain the satellite positioning P_i and horizontal precision factor HDOP_i time-synchronized with key frame V_i; a confidence evaluation bel(P_i) is performed on the interpolation-synchronized satellite positioning result P_i according to the laser odometry; and the satellite positioning constraint is constructed from the satellite positioning P_i, the horizontal precision factor HDOP_i and the confidence evaluation bel(P_i);
Referring to fig. 5, the step S22 includes the following steps:
S221, point cloud registration is performed between the point cloud of key frame V_i and that of key frame V_{i-1} to obtain the corresponding relative carrier pose transformation ΔT_{i,i-1} between the two key frames;
S222, the average Euclidean distance of the valid matching point pairs between the two registered point clouds is calculated, and the laser mileage constraint is constructed from the relative pose transformation ΔT_{i,i-1};
here the L(·) function maps pose data T = {R, t}, consisting of rotation and translation, into the tangent space of the SE(3) manifold to obtain the corresponding Lie algebra representation, and the weight of the laser mileage constraint is determined by the average Euclidean distance of the valid matching point pairs of the two registered point clouds;
S223, using the relative pose transformation ΔT_{i,i-1} as the pose increment, the initial value T_{i-1}·ΔT_{i,i-1} of the pose T_i of key frame V_i is calculated, and the pose T_i is subsequently added to the pose graph as a vertex connected with the vertex corresponding to key frame V_{i-1};
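The pose chaining of S223 — accumulating each registration increment onto the previous key-frame pose — can be sketched on planar SE(2) poses for illustration (the patent works on SE(3); the function names are illustrative):

```python
import math

def se2_compose(T1, T2):
    """Compose two planar poses T = (x, y, theta): returns T1 * T2."""
    x1, y1, th1 = T1
    x2, y2, th2 = T2
    c, s = math.cos(th1), math.sin(th1)
    return (x1 + c * x2 - s * y2,
            y1 + s * x2 + c * y2,
            th1 + th2)

def chain_keyframe_poses(T0, deltas):
    """Accumulate relative transforms dT_{i,i-1} from scan registration
    into initial key-frame poses T_i = T_{i-1} * dT_{i,i-1}."""
    poses = [T0]
    for dT in deltas:
        poses.append(se2_compose(poses[-1], dT))
    return poses
```

Four unit steps each turning 90 degrees bring the carrier back to its start, mimicking a closed-loop trajectory.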
wherein, the step S23 includes the following steps:
S231, the satellite positioning data is interpolation-synchronized with key frame V_i according to the timestamps, as follows:
the timestamp t_i corresponding to key frame V_i is taken as the target synchronization timestamp; if the timestamp of the first datum of the satellite positioning data queue equals t_i, that satellite positioning datum is considered time-synchronized with laser radar key frame V_i and is taken as the timestamp-synchronized satellite positioning P_i; otherwise the satellite positioning data queue is checked:
if the timestamps of both the 0th and 1st frames in the satellite positioning data queue are less than t_i, the satellite positioning data are too old to meet the interpolation requirement, so the 0th datum is popped from the queue and the queue is checked again;
if the earliest timestamp in the queue is already later than t_i, the satellite positioning data are temporally later than the timestamp t_i of key frame V_i, and the interpolation synchronization fails;
after the satellite positioning data queue has been checked, if the queue holds at least 2 data, the interpolation synchronization condition is met: taking the timestamps t_0^g and t_1^g of the 0th and 1st frames in the queue, the interpolation scale factors are calculated as
s_b = (t_i − t_0^g) / (t_1^g − t_0^g), s_f = (t_1^g − t_0^g − (t_i − t_0^g)) / (t_1^g − t_0^g)
and the satellite positioning information in the Cartesian coordinate system of step S13 is interpolated accordingly to obtain the satellite positioning P_i and horizontal precision factor HDOP_i synchronized with the timestamp of key frame V_i,
where the superscript g marks frames of the satellite positioning data queue and HDOP_i denotes the corresponding horizontal precision factor;
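The queue checking and linear interpolation of S231 can be sketched as follows (the dictionary field names and the None-on-failure convention are illustrative assumptions):

```python
def interpolate_gnss(queue, t_i):
    """Linearly interpolate a GNSS fix at key-frame timestamp t_i.

    queue: time-ordered list of dicts {'t', 'pos': (x, y, z), 'hdop'} in the
    global Cartesian frame. Fixes older than both bracketing samples are
    popped; returns None when the queue starts later than t_i (sync failure).
    """
    while len(queue) >= 2 and queue[1]['t'] < t_i:
        queue.pop(0)                      # discard fixes that are too old
    if not queue or queue[0]['t'] > t_i:
        return None                       # data begins after t_i: cannot sync
    if queue[0]['t'] == t_i:
        return queue[0]                   # exact timestamp match
    if len(queue) < 2:
        return None
    g0, g1 = queue[0], queue[1]
    s_b = (t_i - g0['t']) / (g1['t'] - g0['t'])   # backward scale factor
    s_f = 1.0 - s_b                               # forward scale factor
    pos = tuple(s_f * a + s_b * b for a, b in zip(g0['pos'], g1['pos']))
    return {'t': t_i, 'pos': pos, 'hdop': s_f * g0['hdop'] + s_b * g1['hdop']}
```

Both the position and the horizontal precision factor are interpolated with the same scale factors.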
S232, a confidence evaluation of the interpolation-synchronized satellite positioning P_i is performed according to the laser odometry of S22, as follows:
the inverse (ΔT_{i,i-1})^{-1} of the relative carrier pose transformation obtained in step S22 by registering the point cloud of laser radar key frame V_i with that of key frame V_{i-1} serves as the estimate of the carrier's pose increment from t_{i-1} to t_i;
with the carrier pose T_{i-1} at time t_{i-1} as reference, this pose increment estimate yields the satellite positioning prediction P̂_i corresponding to time t_i:
P̂_i = (T_{i-1} · (ΔT_{i,i-1})^{-1})_[x,y,z]
where (·)_[x,y,z] extracts the position information from the pose data;
the confidence bel(P_i) of P_i is then evaluated from the Euclidean distance between the prediction P̂_i corresponding to time t_i and the time-synchronized satellite positioning P_i actually acquired from the GNSS/RTK receiver: the larger the distance, the lower the confidence.
S233, from the interpolation-synchronized satellite positioning P_i, the satellite positioning constraint on the carrier pose T_i corresponding to key frame V_i can be constructed,
where the weight of the satellite positioning constraint is composed of the interpolation-synchronized horizontal precision factor HDOP_i and the laser-odometry-based confidence evaluation bel(P_i).
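A minimal sketch of how the confidence of S232 and the dynamic weight of S233 could be computed (the exponential decay, the scale tau, the 1/HDOP form and the function names are illustrative assumptions, not the patent's exact formulas):

```python
import math

def gnss_confidence(pred, meas, tau=1.0):
    """Confidence of a synchronized GNSS fix from its Euclidean distance to
    the laser-odometry prediction; exponential decay with scale tau is an
    assumed form: larger disagreement -> lower confidence."""
    d = math.dist(pred, meas)
    return math.exp(-d / tau)

def gnss_weight(hdop, bel, w0=1.0):
    """Dynamic weight of the satellite positioning constraint: shrinks as
    HDOP grows and grows with the confidence bel (assumed form)."""
    return w0 * bel / max(hdop, 1e-6)
```

With this form, a fix that matches the odometry prediction and has a low HDOP dominates; a noisy fix is automatically down-weighted.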
further, referring to fig. 6, in the process of operating the carrier in the indoor environment, the step S2 loops the following steps:
S21, a key frame V_i is extracted from the data acquired by the laser radar according to the preset carrier relative movement distance threshold Dist, where i denotes the key frame number;
S22, point cloud registration is performed between the point cloud of key frame V_i and that of key frame V_{i-1} to obtain the corresponding relative carrier pose transformation ΔT_{i,i-1} between the two key frames; the average Euclidean distance of the valid matching point pairs of the two registered point clouds is calculated and the laser mileage constraint is constructed; using the relative pose transformation ΔT_{i,i-1} as the pose increment, the pose T_i of key frame V_i is calculated as the initial value subsequently added to the pose graph;
S24, taking the timestamp t_i corresponding to key frame V_i as the target synchronization timestamp, interpolation synchronization is performed on the inertial measurement data acquired by the inertial measurement unit to obtain inertial measurement data I_i and I′_i time-synchronized with key frame V_i; the confidence bel(I_i) that the carrier is in a stationary or uniform-motion state is estimated from the inertial measurement data I′_i and the identification bit F_i^G is set; the attitude constraint is then constructed from the inertial measurement data I_i, the stationary/uniform-state confidence bel(I_i) and the identification bit F_i^G;
The step S24 includes the steps of:
S241, Gaussian-weighted interpolation is applied to the inertial measurement data queue to obtain inertial measurement data synchronized with the timestamp of key frame V_i, as follows:
the timestamp t_i corresponding to key frame V_i is taken as the target synchronization timestamp;
binary search is used in the time-ordered inertial measurement data queue to find the inertial measurement datum whose timestamp is the upper time bound of the target synchronization timestamp t_i, denoted I_k; n frames of inertial measurement data before and after it are taken to form the candidate interpolation inertial measurement data sequence [I_{k−n}, …, I_k, …, I_{k+n}];
the candidate interpolation inertial measurement data sequence is interpolated with a Gaussian probability density function to obtain the inertial measurement data I_i synchronized with the timestamp of key frame V_i:
I_i = Σ_k w_k I_k / Σ_k w_k
where w_k is the weight of inertial measurement datum I_k, calculated from the inertial measurement data timestamp according to the Gaussian probability density function
w_k = exp(−(t_k^{IMU} − t_i)² / (2σ²)) / (σ·√(2π))
in which the target synchronization timestamp t_i is the Gaussian distribution mean, a single inertial measurement period σ serves as the Gaussian distribution variance, and t_k^{IMU} denotes the timestamp of the kth frame in the inertial measurement data queue;
with t_i as the target interpolation time, linear interpolation between the inertial measurement data bracketing t_i yields another set of inertial measurement data synchronized with key frame V_i, denoted I′_i;
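The Gaussian-weighted interpolation of S241 can be sketched for a single scalar IMU channel (the function name, window size n and σ are illustrative; the patent interpolates full IMU frames):

```python
import math

def gaussian_interpolate_imu(queue, t_i, n=3, sigma=0.01):
    """Gaussian-weighted interpolation of IMU samples at timestamp t_i.

    queue: time-ordered list of (t, value) tuples (value could be one axis
    of acceleration). n frames on each side of the sample bounding t_i from
    above are combined with weights from a Gaussian pdf centred at t_i;
    sigma plays the role of the single measurement period.
    """
    # binary search for the first sample with t >= t_i (upper time bound)
    lo, hi = 0, len(queue)
    while lo < hi:
        mid = (lo + hi) // 2
        if queue[mid][0] < t_i:
            lo = mid + 1
        else:
            hi = mid
    k = lo
    window = queue[max(0, k - n):k + n]
    weights = [math.exp(-(t - t_i) ** 2 / (2.0 * sigma ** 2)) for t, _ in window]
    total = sum(weights)
    return sum(w * v for w, (_, v) in zip(weights, window)) / total
```

For samples placed symmetrically around t_i with linearly varying values, the weighted average recovers the exact midpoint value.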
S242, the confidence bel(I_i) that the carrier is in a stationary or uniform-motion state is estimated from the inertial measurement data I′_i, and the identification bit F_i^G is set according to how far the extracted specific force deviates from the local gravity vector,
in the formula, the (·)_[acc] function extracts the accelerometer data from the inertial measurement data, and G_0 denotes the local gravity vector;
S243, from the interpolation-synchronized inertial measurement data I_i, the attitude constraint on the carrier pose T_i corresponding to key frame V_i is constructed using the gravity direction,
in the formula, the (·)_[r,p] function extracts the pitch and roll angles from the estimated carrier pose data, the (·)_[acc] function extracts the accelerometer data from the inertial measurement data, the H(·) function computes the estimated gravity direction from the carrier pitch angle, roll angle and inertially measured acceleration, and the local gravity vector is normalized to obtain the actual gravity direction vector; the weight of this attitude constraint is composed of the carrier motion-state identification bit F_i^G and the confidence bel(I_i).
Specifically, in an alternative embodiment, the carrier relative movement distance threshold Dist may be set to 1 meter, i.e. a key frame is extracted for every 1 meter of carrier movement; however, since key frame extraction depends on the computation over the laser radar data, the satellite positioning device and the inertial measurement unit do not necessarily output data exactly at time i, so the data at time i can be obtained by linear interpolation.
After the inertial measurement data at time i are obtained, the motion state of the carrier can be judged from the angular velocity and specific force in those data: when both are smaller than the preset thresholds, the carrier can be judged to be stationary or in a uniform-motion state at that moment, and the attitude constraint flag bit F^G takes effect. The attitude constraint G_i can be constructed from the relation between the acceleration in the inertial measurement data at time i and the gravity direction, with the confidence of the carrier motion-state detection as the weight of the attitude constraint: the closer the carrier is to uniform motion or rest, the larger the weight.
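The stationary/uniform-motion test described above can be sketched as follows (the thresholds, the confidence form and the assumed gravity magnitude are illustrative, not the patent's formulas):

```python
import math

G0 = 9.81  # magnitude of the local gravity vector [m/s^2], an assumed value

def motion_state(acc, gyro, acc_tol=0.05, gyro_tol=0.01):
    """Flag F and confidence bel that the carrier is stationary or in uniform
    motion, judged from how close the specific force magnitude is to gravity
    alone and how small the angular rate is (illustrative thresholds)."""
    acc_err = abs(math.sqrt(sum(a * a for a in acc)) - G0)
    gyro_mag = math.sqrt(sum(w * w for w in gyro))
    flag = acc_err < acc_tol and gyro_mag < gyro_tol
    bel = math.exp(-(acc_err / acc_tol + gyro_mag / gyro_tol)) if flag else 0.0
    return flag, bel
```

A carrier at rest measures only gravity and zero angular rate, so the flag is set with full confidence; any significant rotation or extra acceleration disables the attitude constraint.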
Further, referring to fig. 7, after the carrier returns from the indoor environment to the outdoor environment, the following steps are performed in step S2:
S21, a key frame V_i is extracted from the data acquired by the laser radar according to the preset carrier relative movement distance threshold Dist, where i denotes the key frame number;
S25, a local sub-map centred on historical key frame V_j is built, and closed-loop scan matching is performed between the point cloud of key frame V_i and the local sub-map to obtain the relative carrier pose transformation ΔT_{ij} between key frame V_i and historical key frame V_j; the average Euclidean distance of the valid matching point pairs of the two registered point clouds is calculated, and the laser closed-loop scan constraint is constructed;
Still further, referring to fig. 8, the step S25 includes the following steps:
S251, using the satellite positioning data corresponding to key frame V_i as the search target, a KD-Tree built from all key frames V is searched for the historical key frame V_j whose corresponding carrier position has the shortest Euclidean distance to that of key frame V_i;
S252, centred on historical key frame V_j, N key frames on each side are indexed from the historical key frame sequence to obtain the candidate closed-loop key frame sequence [V_{j−N}, …, V_j, …, V_{j+N}]; according to the corresponding carrier pose sequence [T_{j−N}, …, T_j, …, T_{j+N}], the point clouds in the candidate closed-loop key frame sequence undergo the following coordinate transformation to construct the local sub-map:
p̃ = R_j · p + t_j
where p̃ denotes a map point in the local sub-map, p denotes a point of the point cloud of historical key frame V_j in the candidate closed-loop key frame sequence, and {R_j, t_j} denote respectively the rotation matrix and position translation that together form the carrier pose T_j corresponding to historical key frame V_j;
S253, point cloud registration is performed between the point cloud of key frame V_i and the local sub-map, the average Euclidean distance of the valid matching point pairs of the two registered point clouds is calculated, and the laser closed-loop identification bit is set according to that Euclidean distance;
S254, from the relative carrier pose observation ΔT_{ji} between key frame V_i and historical key frame V_j, obtained by registering the point cloud of key frame V_i with the local sub-map, the laser closed-loop scan constraint on the carrier poses T_i and T_j is constructed,
where the L(·) function maps pose data T = {R, t}, consisting of rotation and translation, into the tangent space of the SE(3) manifold, and the weight of the laser closed-loop scan constraint is determined by the average Euclidean distance of the valid matching point pairs of the two registered point clouds.
specifically, KD-Tree is an abbreviation for K-dimension Tree, which is a Tree data structure that stores instance points in K-Dimensional space for quick retrieval. The method is mainly applied to searching of multidimensional space key data, such as range searching, nearest neighbor searching and the like.
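The nearest-historical-key-frame search of S251 and the candidate sequence of S252 can be sketched with a linear scan standing in for the KD-Tree (function and variable names are illustrative):

```python
def find_loop_candidate(history, query_pos, index_range=5):
    """Find the historical key frame whose carrier position is closest (in
    Euclidean distance) to the GNSS position of the current key frame, and
    return its index together with its +/- index_range neighbours as the
    candidate closed-loop sequence. A linear scan replaces the KD-Tree."""
    best_j = min(range(len(history)),
                 key=lambda j: sum((a - b) ** 2
                                   for a, b in zip(history[j], query_pos)))
    lo = max(0, best_j - index_range)
    hi = min(len(history), best_j + index_range + 1)
    return best_j, list(range(lo, hi))
```

A KD-Tree would turn the O(m) scan into an O(log m) query, which matters once the trajectory contains thousands of key frames.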
Further, referring to fig. 9, the step S3 includes the following steps:
s31, sequentially connecting the poses of the key frames serving as vertexes to form a trunk part of a pose graph;
s32, connecting among vertexes in the pose graph according to the constraint relation of the laser mileage constraint and the laser closed-loop constraint, and distinguishing the weight of the constraint according to different thicknesses;
and S33, connecting corresponding vertexes in the pose graph according to the satellite positioning constraint and the pose constraint, and distinguishing the weight of the constraint according to different thicknesses.
Specifically, the carrier is started outdoors, enters the indoor environment of the signal-shielded area after collecting outdoor environment data, and finally leaves the indoor environment and returns to the outdoor environment, forming the closed-loop trajectory shown in fig. 10, in which the vertices between vertex V_2 and vertex V_k are denoted by an ellipsis.
Referring to FIG. 11, the pose graph consists mainly of vertices and edges. Vertex V_k is the carrier pose T_k corresponding to the kth moment (i.e. key frame). A binary edge E_ij is the relative pose observation corresponding to two vertices {V_i, V_j} in the pose graph, and comprises the laser mileage constraints and laser closed-loop constraints; a unary edge E_ii is an observation of vertex V_i itself, comprising the satellite positioning constraints and attitude constraints. In the figure, dark solid lines on vertices represent satellite positioning constraints, gray solid lines represent attitude constraints, black dashed lines between connected vertices represent laser mileage constraints, gray dashed lines between connected vertices represent laser closed-loop constraints, and different thicknesses represent different weights.
Further, referring to fig. 9, the step S3 further includes the following steps:
s34, summing all constraints in the pose graph to obtain a maximum posterior probability objective function related to the carrier track, wherein variables to be solved of the maximum posterior probability objective function are carrier poses corresponding to all historical key frames;
S35, the objective function is solved to obtain the carrier pose T* corresponding to each vertex in the pose graph.
Further, the objective function is expressed as follows:
T* = argmin_T [ Σ_i e_gnss,i(T) + Σ_i e_att,i(T) + Σ_i e_odom,i(T) + Σ_{i,j} e_loop,ij(T) ]
Specifically, the parameter T represents the carrier poses corresponding to all key frames; the first summation term is the GNSS constraint error term, the second the attitude constraint error term, the third the laser mileage constraint error term, and the fourth the closed-loop constraint error term. T* denotes the carrier poses corresponding to all vertices in the pose graph, i.e. the parameters to be solved.
The objective function is essentially a nonlinear least-squares model, which can be solved with nonlinear optimization methods such as Gauss-Newton or Levenberg-Marquardt to obtain the carrier pose T* corresponding to each vertex in the pose graph. The goal of the optimization is to minimize the overall objective function by adjusting the parameter T.
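As an illustration of this solving step, the following toy sketch (not the patent's implementation; all values and weights are hypothetical) builds a one-dimensional pose graph with absolute GNSS-like constraints and relative odometry-like constraints, and minimizes the weighted least-squares objective with Gauss-Newton iterations:

```python
# Toy 1-D "pose graph": absolute (GNSS-like) and relative (odometry-like)
# constraints, solved as weighted least squares by Gauss-Newton.
import numpy as np

def solve_pose_graph(n, abs_meas, rel_meas, iters=10):
    """abs_meas: list of (i, z, w) absolute constraints on pose i.
    rel_meas: list of (i, j, z, w) relative constraints z ~ x_j - x_i."""
    x = np.zeros(n)
    for _ in range(iters):
        H = np.zeros((n, n))
        b = np.zeros(n)
        for i, z, w in abs_meas:
            r = x[i] - z                      # residual of absolute constraint
            H[i, i] += w
            b[i] += w * r
        for i, j, z, w in rel_meas:
            r = (x[j] - x[i]) - z             # residual of relative constraint
            H[i, i] += w; H[j, j] += w
            H[i, j] -= w; H[j, i] -= w
            b[i] -= w * r; b[j] += w * r
        H[0, 0] += 1e6                        # strong prior fixing the first pose
        x -= np.linalg.solve(H, b)            # Gauss-Newton step
    return x

poses = solve_pose_graph(
    3,
    abs_meas=[(0, 0.0, 100.0), (2, 2.1, 1.0)],
    rel_meas=[(0, 1, 1.0, 10.0), (1, 2, 1.0, 10.0)],
)
```

Because the toy problem is linear, Gauss-Newton converges in a single step; the same normal-equation structure generalizes to SE(3) poses with Lie-algebra increments.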
Further, referring to fig. 12, the step S4 includes the following steps:
S41, obtaining the carrier pose Ti corresponding to each key frame according to the pose graph solving result;
S42, for each key frame Vi (i = 0, 1, ..., m) and its point cloud Ci, converting each point pn ∈ Ci (n = 0, 1, ..., k) into the global coordinate system to achieve stitching of the point clouds and generate the point cloud map.
Specifically, in step S42, the point pn is converted into the global coordinate system according to the following equation:

p̂n = Ri · pn + ti

where p̂n represents the point transformed into the global map, and Ri and ti together form the carrier pose Ti, representing respectively the rotation and translation transformation from the carrier coordinate system to the global coordinate system.
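A minimal sketch of this per-point transformation (hypothetical pose and points; not the patent's code):

```python
# Each point of a key frame's cloud is rotated and translated by that
# key frame's solved pose {R_i, t_i} into the global coordinate system.
import numpy as np

def transform_cloud(points, R, t):
    """points: (N,3) array in the carrier frame; returns (N,3) global points."""
    return points @ R.T + t   # p_hat = R * p + t, vectorized over rows

# example pose: 90-degree yaw rotation and a 1 m translation along x
R_i = np.array([[0., -1., 0.],
                [1.,  0., 0.],
                [0.,  0., 1.]])
t_i = np.array([1., 0., 0.])
cloud = np.array([[1., 0., 0.],
                  [0., 2., 0.]])
global_cloud = transform_cloud(cloud, R_i, t_i)
```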
Finally, the indoor and outdoor point cloud map generated in this embodiment is shown in fig. 13, where the upper right corner is a partial enlargement of the indoor portion.
It should be understood that the above examples of the present invention are provided by way of illustration only and do not limit the embodiments of the present invention. Other variations or modifications based on the above description will be apparent to those of ordinary skill in the art; it is neither necessary nor possible to exhaustively list all embodiments here. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the invention shall fall within the protection scope of the claims of the invention.
Claims (10)
1. A multi-sensor dynamic weighting fusion point cloud map construction method for a special scene, characterized in that the method performs data acquisition and laser scanning in a preset indoor and outdoor environment through a carrier equipped with a laser radar, satellite positioning equipment and an inertial measurement unit, and constructs a point cloud map within the range of the carrier's movement track according to the point cloud obtained by the laser scanning; the method comprises the following steps:
S1, starting the carrier from outdoors, adding the carrier pose at the starting point S0 into the pose graph in the form of a fixed vertex, constructing a Cartesian rectangular coordinate system as the global coordinate system with the starting point S0 as the origin of coordinates and the east, north and up directions as reference directions, and converting subsequent satellite positioning data into the global coordinate system;
s2, when the carrier movement distance is greater than a set threshold value, marking a scanning result of the laser radar as a key frame, and adding a carrier pose corresponding to the key frame as a vertex to be solved into a pose graph; constructing laser mileage constraint according to an interframe scanning registration result of the laser radar, and adjusting dynamic weight of the laser mileage constraint according to point cloud registration coincidence degree; when the carrier is positioned in an outdoor environment, constructing satellite positioning constraint according to Euclidean distance between a satellite positioning result and the carrier position, and adjusting dynamic weight of the satellite positioning constraint according to a horizontal precision factor and laser mileage information; when the carrier is positioned in an indoor environment, calculating an included angle between an expected gravity direction and an actual gravity direction according to data of an inertial measurement unit and combining the carrier posture to construct posture constraint, and adjusting dynamic weight of the posture constraint according to the confidence coefficient of the carrier in a static or uniform state; when the carrier returns to the outdoor environment from the indoor environment, searching the historical key frame by utilizing a satellite positioning result to obtain a closed-loop candidate key frame, carrying out point cloud registration by utilizing the current key frame and a local sub-map taking the closed-loop candidate key frame as the center, constructing a laser closed-loop constraint according to the registration result, and adjusting the dynamic weight of the laser closed-loop constraint according to the point cloud registration coincidence degree;
S3, connecting variable vertexes related to each constraint according to constraint items comprising the laser mileage constraint, satellite positioning constraint, attitude constraint and laser closed-loop constraint and dynamic weights corresponding to the constraint items, constructing a pose graph, summing all constraint items in the pose graph to obtain a maximum posterior probability objective function related to a carrier track, and solving the pose graph by using a nonlinear optimization method;
and S4, splicing the point clouds of the historical key frames according to the solving result of the pose graph, and generating a point cloud map.
2. The method for constructing the point cloud map by dynamically weighted fusion of multiple sensors of the special scene according to claim 1, wherein the step S1 comprises the following steps:
S11, selecting a position with good positioning signals in the outdoor environment as the starting point S0 of the carrier, using the satellite positioning equipment to acquire the longitude λ0, latitude φ0, height h0 and heading angle ψ0 of the starting point S0, adding the carrier pose T0 at the starting point S0 to the pose graph in the form of a fixed vertex, and recording the laser radar scan at point S0 as key frame V0;
S12, according to the longitude, latitude, altitude and heading angle, establishing a Cartesian rectangular coordinate system as the global coordinate system with the starting point as the origin and east-north-up as the reference directions: first converting the longitude λ0, latitude φ0 and height h0 of the starting point into the ECEF coordinate system to obtain the ECEF coordinates (ecef_x0, ecef_y0, ecef_z0) corresponding to the starting point S0, and calculating the rotation matrix corresponding to the longitude and latitude;
S13, for the longitude, latitude and altitude information (λi, φi, hi) output by the satellite positioning device, first converting the coordinates into the ECEF coordinate system to obtain (ecef_xi, ecef_yi, ecef_zi), and then calculating the corresponding satellite positioning coordinate values in the Cartesian rectangular coordinate system according to the following formula:
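A common way to realize this conversion chain is sketched below (standard WGS-84 constants; the function and variable names are illustrative, not from the patent): convert longitude/latitude/height to ECEF, then rotate the offset from the origin's ECEF position into a local east-north-up (ENU) frame anchored at the starting point S0.

```python
# WGS-84 geodetic -> ECEF -> local ENU conversion sketch.
import math

A = 6378137.0                      # WGS-84 semi-major axis [m]
E2 = 6.69437999014e-3              # WGS-84 first eccentricity squared

def lla_to_ecef(lam_deg, phi_deg, h):
    lam, phi = math.radians(lam_deg), math.radians(phi_deg)
    n = A / math.sqrt(1.0 - E2 * math.sin(phi) ** 2)   # prime vertical radius
    x = (n + h) * math.cos(phi) * math.cos(lam)
    y = (n + h) * math.cos(phi) * math.sin(lam)
    z = (n * (1.0 - E2) + h) * math.sin(phi)
    return x, y, z

def ecef_to_enu(ecef, ecef0, lam0_deg, phi0_deg):
    lam, phi = math.radians(lam0_deg), math.radians(phi0_deg)
    dx, dy, dz = (ecef[k] - ecef0[k] for k in range(3))
    e = -math.sin(lam) * dx + math.cos(lam) * dy
    n = (-math.sin(phi) * math.cos(lam) * dx
         - math.sin(phi) * math.sin(lam) * dy
         + math.cos(phi) * dz)
    u = (math.cos(phi) * math.cos(lam) * dx
         + math.cos(phi) * math.sin(lam) * dy
         + math.sin(phi) * dz)
    return e, n, u

origin = lla_to_ecef(0.0, 0.0, 0.0)            # hypothetical starting point S0
point = lla_to_ecef(0.0, 0.001, 0.0)           # ~110 m north of S0
e, n, u = ecef_to_enu(point, origin, 0.0, 0.0)
```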
3. The method for constructing a point cloud map by dynamically weighted fusion of multiple sensors of a special scene according to claim 2, wherein step S2 cycles through the following steps during the period from when the carrier starts from the starting point until it enters the indoor environment:
S21, extracting a key frame Vi from the data acquired by the laser radar according to a preset carrier relative movement distance threshold Dist, where i represents the key frame number;
S22, performing point cloud registration between the point cloud of key frame Vi and that of key frame Vi-1 to obtain the corresponding carrier relative pose transformation ΔTi,i-1 between the two key frames, calculating the average Euclidean distance of the valid matching point pairs of the two registered point clouds, and constructing the laser mileage constraint accordingly; using the carrier relative pose transformation ΔTi,i-1 as the pose increment, calculating the carrier pose Ti of key frame Vi as the initial value to be subsequently added to the pose graph;
S23, taking the timestamp ti corresponding to key frame Vi as the target synchronization timestamp, interpolating the satellite positioning data to obtain the satellite positioning Pi and horizontal precision factor HDOPi time-synchronized with key frame Vi; performing confidence evaluation on the interpolated, synchronized satellite positioning result Pi according to the laser odometer to obtain the confidence evaluation bel(Pi); and constructing the satellite positioning constraint based on the satellite positioning Pi, the horizontal precision factor HDOPi and the confidence evaluation bel(Pi);
Wherein, the step S22 includes the following steps:
S221, performing point cloud registration between the point cloud of key frame Vi and that of key frame Vi-1 to obtain the corresponding carrier relative pose transformation ΔTi,i-1 between the two key frames;
S222, calculating the average Euclidean distance of the valid matching point pairs between the two registered point clouds, and constructing the laser mileage constraint according to the carrier relative pose transformation ΔTi,i-1:
Where the L(·) function represents mapping the pose data T = {R, t} composed of rotation and translation into the tangent space of the SE(3) manifold to obtain the corresponding Lie algebra, and the weight of the laser mileage constraint is determined by the average Euclidean distance of the valid matching point pairs of the two registered point clouds, expressed as:
S223, using the carrier relative pose transformation ΔTi,i-1 as the pose increment, calculating the initial value of the pose Ti of key frame Vi as Ti-1·ΔTi,i-1, and connecting pose Ti, as a vertex subsequently added to the pose graph, with the vertex corresponding to key frame Vi-1;
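The chaining of S223 can be sketched with 4×4 homogeneous transforms (illustrative values; not the patent's code): the registration result ΔTi,i-1 is applied as a pose increment to the previous key-frame pose.

```python
# Compose the previous key-frame pose with the registration increment.
import numpy as np

def se3(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

yaw90 = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
T_prev = se3(np.eye(3), [2., 0., 0.])          # pose of key frame V_{i-1}
dT = se3(yaw90, [1., 0., 0.])                  # registration increment
T_i = T_prev @ dT                              # initial value for key frame V_i
```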
wherein, the step S23 includes the following steps:
S231, performing interpolation synchronization between the satellite positioning data and key frame Vi according to the timestamps, as follows:
Taking the timestamp ti corresponding to key frame Vi as the target synchronization timestamp; checking whether the timestamp of the head data of the satellite positioning data queue is equal to the timestamp ti corresponding to key frame Vi; if so, the satellite positioning data is considered time-synchronized with laser radar key frame Vi and is taken as the timestamp-synchronized satellite positioning Pi of key frame Vi; otherwise, the satellite positioning data queue is checked:
If the timestamp of the head data of the queue shows that this frame of satellite positioning data does not meet the interpolation requirement, the data is popped from the queue and the satellite positioning data queue is checked again;
If the timestamps of both the 0th and 1st frame data in the satellite positioning data queue are earlier than ti, the 0th data is popped from the queue and the satellite positioning data queue is checked again;
If the timestamp of the 1st data of the queue indicates that the satellite positioning data is temporally later than the timestamp ti corresponding to key frame Vi, the interpolation synchronization fails;
After the satellite positioning data queue has been checked, if the amount of data in the queue is not less than 2, the interpolation synchronization condition is satisfied; the timestamps of the 0th and 1st frame data in the satellite positioning data queue are taken, the interpolation scale factors sf and sb are calculated respectively, and the satellite positioning information in the Cartesian coordinate system described in step S13 is interpolated according to the interpolation scale factors to obtain the satellite positioning Pi and the horizontal precision factor HDOPi synchronized with the timestamp of key frame Vi:
wherein the former represents the i-th frame data in the satellite positioning data queue, and HDOPi represents the corresponding horizontal precision factor;
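A linear time-interpolation sketch for this step (the field names and the exact scale-factor convention are assumptions; the idea is that the two queue entries bracketing ti are blended by their timestamp distances):

```python
# Interpolate GNSS position and HDOP at the key-frame timestamp t_i
# from the two bracketing fixes in the queue.
def interpolate_gnss(t_i, fix0, fix1):
    """fix0, fix1: dicts with 't' (stamp), 'pos' (x, y, z), 'hdop'."""
    t0, t1 = fix0["t"], fix1["t"]
    assert t0 <= t_i <= t1 and t1 > t0, "interpolation condition not met"
    s_b = (t_i - t0) / (t1 - t0)        # weight of the later fix
    s_f = 1.0 - s_b                     # weight of the earlier fix
    pos = tuple(s_f * p0 + s_b * p1 for p0, p1 in zip(fix0["pos"], fix1["pos"]))
    hdop = s_f * fix0["hdop"] + s_b * fix1["hdop"]
    return pos, hdop

pos_i, hdop_i = interpolate_gnss(
    10.5,
    {"t": 10.0, "pos": (0.0, 0.0, 0.0), "hdop": 0.8},
    {"t": 11.0, "pos": (2.0, 4.0, 0.0), "hdop": 1.2},
)
```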
S232, performing confidence evaluation on the interpolated, synchronized satellite positioning result Pi according to the laser mileage of S22, as follows:
Using the registration of the point cloud of laser radar key frame Vi with that of laser radar key frame Vi-1 in step S22, the corresponding carrier relative pose transformation (ΔTi,i-1)^-1 between the two key frames is obtained as the estimate of the carrier pose increment from ti-1 to ti;
Taking the carrier pose Ti-1 at time ti-1 as the reference and using the above pose increment estimate ΔTi,i-1, the satellite positioning prediction corresponding to time ti can be obtained:
where (·)[x,y,z] represents the position information in the pose data;
According to the Euclidean distance between the satellite positioning prediction corresponding to time ti and the time-synchronized satellite positioning Pi actually acquired from the GNSS/RTK receiver, the confidence of Pi is evaluated:
S233, according to the interpolated, synchronized satellite positioning Pi, the satellite positioning constraint on the carrier pose Ti corresponding to key frame Vi can be constructed:
where the weight of the satellite positioning constraint is composed of the interpolated, synchronized horizontal precision factor HDOPi and the laser-mileage-based confidence evaluation bel(Pi):
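The exact weight formula is given only as a figure in the source; the sketch below merely illustrates the stated behavior, namely that the weight should shrink as HDOPi grows and as the confidence bel(Pi) drops. The functional form is an assumption.

```python
# Illustrative dynamic weight for the satellite positioning constraint:
# higher HDOP or lower confidence -> smaller weight (assumed form).
def gnss_weight(hdop, bel, eps=1e-6):
    return bel / max(hdop, eps)

w_good = gnss_weight(hdop=0.8, bel=1.0)   # strong fix, trusted by odometry
w_bad = gnss_weight(hdop=5.0, bel=0.3)    # poor geometry, low confidence
```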
4. The method for constructing a point cloud map by dynamically weighting and fusing multiple sensors in a special scene according to claim 3, wherein step S2 cycles through the following steps while the carrier operates in the indoor environment:
S21, extracting a key frame Vi from the data acquired by the laser radar according to a preset carrier relative movement distance threshold Dist, where i represents the key frame number;
S22, performing point cloud registration between the point cloud of key frame Vi and that of key frame Vi-1 to obtain the corresponding carrier relative pose transformation ΔTi,i-1 between the two key frames, calculating the average Euclidean distance of the valid matching point pairs of the two registered point clouds, and constructing the laser mileage constraint accordingly; using the relative pose transformation ΔTi,i-1 as the pose increment, calculating the pose Ti of key frame Vi as the initial value to be subsequently added to the pose graph;
S24, taking the timestamp ti corresponding to key frame Vi as the target synchronization timestamp, performing interpolation synchronization on the inertial measurement data acquired by the inertial measurement unit to obtain the inertial measurement data Ii and I'i time-synchronized with key frame Vi; estimating from the inertial measurement data I'i the confidence bel(Ii) that the carrier is in a stationary or uniform motion state and setting the identification bit; and constructing the attitude constraint based on the inertial measurement data Ii, the confidence bel(Ii) of the stationary or uniform velocity state, and the identification bit;
The step S24 includes the steps of:
S241, using Gaussian-weighted interpolation on the inertial measurement data queue to obtain the inertial measurement data synchronized with the timestamp of key frame Vi, processed as follows:
Taking the timestamp ti corresponding to key frame Vi as the target synchronization timestamp;
Using binary search in the time-ordered inertial measurement data queue, finding the inertial measurement data whose timestamp is the upper time bound of the target synchronization timestamp ti, defined as Ik, and taking n frames of inertial measurement data before and after it to form the candidate interpolation inertial measurement data sequence;
Interpolating the candidate interpolation inertial measurement data sequence using a Gaussian probability density function to obtain the inertial measurement data Ii synchronized with the timestamp of key frame Vi:
where wk represents the weight of the inertial measurement data, calculated from the inertial measurement data timestamp according to the Gaussian probability density function:
where the target synchronization timestamp ti is the mean of the Gaussian distribution, a single inertial measurement period σ serves as the variance of the Gaussian distribution, and the remaining symbol denotes the timestamp of the k-th frame data in the inertial measurement data queue;
Taking ti as the target interpolation time, performing linear interpolation on the inertial measurement data to obtain another set of inertial measurement data synchronized with the timestamp of key frame Vi, denoted I'i;
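The Gaussian-weighted interpolation of S241 can be sketched as follows (the IMU values and the period σ are hypothetical; each candidate sample is weighted by a Gaussian of its timestamp distance to the target time ti and the weights are normalized):

```python
# Gaussian-weighted interpolation of an IMU sample at target time t_i.
import math

def gaussian_interp(t_i, samples, sigma):
    """samples: list of (timestamp, value); returns weighted mean value."""
    weights = [math.exp(-0.5 * ((t - t_i) / sigma) ** 2) for t, _ in samples]
    total = sum(weights)
    return sum(w * v for w, (_, v) in zip(weights, samples)) / total

imu = [(0.98, 9.7), (0.99, 9.8), (1.01, 9.9), (1.02, 10.0)]  # hypothetical accel-z
value = gaussian_interp(1.0, imu, sigma=0.01)
```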
S242, estimating from the inertial measurement data I'i the confidence bel(Ii) that the carrier is in a stationary or uniform motion state, and setting the identification bit:
where the (·)[acc] function represents extracting the accelerometer data from the inertial measurement data, and G0 represents the local gravity vector;
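The concrete formula for bel(Ii) appears only as a figure in the source; the sketch below captures the described idea — in a stationary or uniform-velocity state the accelerometer should measure gravity alone, so the closer |acc| is to |G0|, the higher the confidence. The Gaussian shape and spread are assumptions.

```python
# Confidence that the carrier is stationary or in uniform motion,
# from how close the accelerometer norm is to local gravity.
import math

G0 = 9.81  # magnitude of the local gravity vector [m/s^2]

def motion_confidence(acc, sigma=0.1):
    norm = math.sqrt(sum(a * a for a in acc))
    return math.exp(-0.5 * ((norm - G0) / sigma) ** 2)

bel_static = motion_confidence((0.05, -0.02, 9.80))   # nearly pure gravity
bel_moving = motion_confidence((1.5, 0.3, 9.9))       # extra acceleration
flag = 1 if bel_static > 0.5 else 0                   # identification bit
```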
S243, according to the interpolated, synchronized inertial measurement data Ii, constructing the attitude constraint on the carrier pose Ti corresponding to key frame Vi using the gravity direction:
where the (·)[r,p] function represents extracting the pitch and roll angles from the estimated carrier pose data, the (·)[acc] function represents extracting the accelerometer data from the inertial measurement data, the H(·) function represents calculating the estimated gravity direction from the carrier pitch angle, roll angle and inertial measurement acceleration information, the local gravity vector is normalized to obtain the actual gravity direction vector, and the weight of this attitude constraint is composed of the carrier motion state identification bit and the confidence bel(Ii):
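The residual angle between the estimated and actual gravity directions can be sketched as follows (the body-frame convention — roll about x, pitch about y, z up — is an assumption, not stated in the claim):

```python
# Angle between the gravity direction predicted from estimated roll/pitch
# and the normalized accelerometer reading (actual gravity direction).
import math

def predicted_gravity(roll, pitch):
    # gravity direction in the body frame for the assumed convention
    return (-math.sin(pitch),
            math.sin(roll) * math.cos(pitch),
            math.cos(roll) * math.cos(pitch))

def gravity_angle(roll, pitch, acc):
    g_pred = predicted_gravity(roll, pitch)
    norm = math.sqrt(sum(a * a for a in acc))
    g_meas = tuple(a / norm for a in acc)          # actual gravity direction
    dot = sum(p * m for p, m in zip(g_pred, g_meas))
    return math.acos(max(-1.0, min(1.0, dot)))     # residual angle [rad]

# a level carrier with an accelerometer reading along +z: zero residual
angle = gravity_angle(0.0, 0.0, (0.0, 0.0, 9.81))
# a 10-degree pitch error shows up as a 10-degree gravity-direction angle
tilted = gravity_angle(0.0, math.radians(10.0), (0.0, 0.0, 9.81))
```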
5. The method for constructing a point cloud map by dynamically weighted fusion of multiple sensors in a special scene according to claim 4, wherein step S2 cycles through the following steps after the carrier returns from the indoor environment to the outdoor environment:
S21, extracting a key frame Vi from the data acquired by the laser radar according to a preset carrier relative movement distance threshold Dist, where i represents the key frame number;
S25, building a local sub-map centered on historical key frame Vj, performing closed-loop scan matching between the point cloud of key frame Vi and the local sub-map to obtain the carrier relative pose transformation ΔTij between key frame Vi and historical key frame Vj, calculating the average Euclidean distance of the valid matching point pairs of the two registered point clouds, and constructing the laser closed-loop scan constraint;
6. The method for constructing a point cloud map by dynamically weighted fusion of multiple sensors of a special scene according to claim 5, wherein the step S25 comprises the steps of:
S251, using the satellite positioning data corresponding to key frame Vi as the search target, searching the KD-Tree formed from all key frames V to find the historical key frame Vj whose corresponding carrier position has the shortest Euclidean distance;
S252, centered on historical key frame Vj, indexing N key frames on each side from the historical key frame sequence to obtain the candidate closed-loop key frame sequence [Vj-N, …, Vj, …, Vj+N], and, according to the corresponding carrier pose sequence [Tj-N, …, Tj, …, Tj+N], applying the following coordinate transformation to the point clouds in the candidate closed-loop key frame sequence to construct the local sub-map:
where the left-hand quantity represents a map point in the local sub-map, the transformed quantity represents a point of historical key frame Vj in the candidate closed-loop key frame sequence, and {Rj, tj} respectively represent the rotation matrix and position translation, together forming the carrier pose Tj corresponding to historical key frame Vj;
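Steps S251–S252 can be sketched with toy data as follows (a brute-force nearest-neighbor search stands in for the KD-Tree, and all values are hypothetical): find the historical key frame nearest to the current satellite fix, then merge the neighbouring key frames' clouds into one local sub-map using their solved poses {Rj, tj}.

```python
# Closed-loop candidate search plus local sub-map construction sketch.
import numpy as np

def nearest_keyframe(query_pos, keyframe_positions):
    d = np.linalg.norm(keyframe_positions - query_pos, axis=1)
    return int(np.argmin(d))

def build_submap(clouds, rotations, translations, j, n):
    pts = []
    for k in range(max(0, j - n), min(len(clouds), j + n + 1)):
        pts.append(clouds[k] @ rotations[k].T + translations[k])  # to global frame
    return np.vstack(pts)

positions = np.array([[0., 0., 0.], [5., 0., 0.], [10., 0., 0.]])
clouds = [np.array([[1., 0., 0.]]) for _ in positions]
rotations = [np.eye(3)] * 3
j = nearest_keyframe(np.array([9.0, 0.5, 0.0]), positions)   # current GNSS fix
submap = build_submap(clouds, rotations, positions, j, n=1)
```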
S253, performing point cloud registration between the point cloud of key frame Vi and the local sub-map, calculating the average Euclidean distance of the valid matching point pairs of the two registered point clouds, and setting the laser closed-loop identification bit according to this Euclidean distance;
S254, according to the carrier relative pose observation ΔTji between key frame Vi and historical key frame Vj obtained by registering the point cloud of key frame Vi with the local sub-map, constructing the laser closed-loop scan constraint on carrier poses Ti and Tj:
Where the L(·) function represents mapping the pose data T = {R, t} composed of rotation and translation to the SE(3) manifold space, and the weight of the laser closed-loop scan constraint is determined by the average Euclidean distance of the valid matching point pairs of the two registered point clouds, expressed as:
7. the method for constructing a point cloud map by dynamically weighted fusion of multiple sensors of a special scene according to claim 5 or 6, wherein the step S3 comprises the following steps:
S31, sequentially connecting the poses of the key frames, serving as vertices, to form the trunk of the pose graph;
S32, connecting vertices in the pose graph according to the constraint relations of the laser mileage constraint and the laser closed-loop constraint, distinguishing the weights of the constraints by different line thicknesses;
S33, connecting the corresponding vertices in the pose graph according to the satellite positioning constraint and the attitude constraint, distinguishing the weights of the constraints by different line thicknesses.
8. The method for constructing a point cloud map by dynamically weighting and fusing multiple sensors of a special scene according to claim 7, wherein the step S3 further comprises the following steps:
S34, summing all constraints in the pose graph to obtain a maximum posterior probability objective function of the carrier track, where the variables to be solved of the maximum posterior probability objective function are the carrier poses corresponding to all historical key frames;
S35, solving the objective function to obtain the carrier pose T* corresponding to each vertex in the pose graph.
10. the method for constructing a point cloud map by dynamically weighting and fusing multiple sensors of a special scene according to claim 7, wherein the step S4 comprises the following steps:
S41, obtaining the carrier pose Ti corresponding to each key frame according to the pose graph solving result;
S42, for each key frame Vi (i = 0, 1, ..., m) and its point cloud Ci, converting each point pn ∈ Ci (n = 0, 1, ..., k) into the global coordinate system to achieve stitching of the point clouds and generate the point cloud map.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110296526.9A CN112950781B (en) | 2021-03-19 | 2021-03-19 | Multi-sensor dynamic weighting fusion point cloud map construction method for special scene |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112950781A CN112950781A (en) | 2021-06-11 |
CN112950781B true CN112950781B (en) | 2023-04-25 |
Family
ID=76226933
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110296526.9A Active CN112950781B (en) | 2021-03-19 | 2021-03-19 | Multi-sensor dynamic weighting fusion point cloud map construction method for special scene |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112950781B (en) |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113340296B (en) * | 2021-06-21 | 2024-04-09 | 上海仙工智能科技有限公司 | Method and device for automatically updating mobile robot map |
CN113516750B (en) * | 2021-06-30 | 2022-09-27 | 同济大学 | Three-dimensional point cloud map construction method and system, electronic equipment and storage medium |
CN113534227B (en) * | 2021-07-26 | 2022-07-01 | 中国电子科技集团公司第五十四研究所 | Multi-sensor fusion absolute positioning method suitable for complex non-cooperative scene |
CN113628335A (en) * | 2021-07-28 | 2021-11-09 | 深圳优艾智合机器人科技有限公司 | Point cloud map construction method and device and computer readable storage medium |
CN113538410B (en) * | 2021-08-06 | 2022-05-20 | 广东工业大学 | Indoor SLAM mapping method based on 3D laser radar and UWB |
CN113850864B (en) * | 2021-09-14 | 2024-04-12 | 中南大学 | GNSS/LIDAR loop detection method for outdoor mobile robot |
CN113819912A (en) * | 2021-09-30 | 2021-12-21 | 中科测试(深圳)有限责任公司 | High-precision point cloud map generation method based on multi-sensor data |
CN114838726A (en) * | 2022-04-20 | 2022-08-02 | 哈尔滨理工大学 | GPS data correction algorithm based on multi-sensor data fusion |
WO2024001339A1 (en) * | 2022-07-01 | 2024-01-04 | 华为云计算技术有限公司 | Pose determination method and apparatus, and computing device |
CN115631314B (en) * | 2022-12-19 | 2023-06-09 | 中汽研(天津)汽车工程研究院有限公司 | Point cloud map construction method based on multi-feature and self-adaptive key frames |
CN116222544B (en) * | 2023-05-09 | 2023-08-04 | 浙江大学湖州研究院 | Automatic navigation and positioning method and device for feeding vehicle facing to feeding farm |
CN116772887B (en) * | 2023-08-25 | 2023-11-14 | 北京斯年智驾科技有限公司 | Vehicle course initialization method, system, device and readable storage medium |
CN117268373B (en) * | 2023-11-21 | 2024-02-13 | 武汉大学 | Autonomous navigation method and system for multi-sensor information fusion |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111815765A (en) * | 2020-07-21 | 2020-10-23 | 西北工业大学 | Heterogeneous data fusion-based image three-dimensional reconstruction method |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018026544A1 (en) * | 2016-07-22 | 2018-02-08 | Regents Of The University Of Minnesota | Square-root multi-state constraint kalman filter for vision-aided inertial navigation system |
CN106599108B (en) * | 2016-11-30 | 2019-12-31 | 浙江大学 | Method for constructing multi-modal environment map in three-dimensional environment |
KR102647351B1 (en) * | 2017-01-26 | 2024-03-13 | 삼성전자주식회사 | Modeling method and modeling apparatus using 3d point cloud |
CN111210518B (en) * | 2020-01-15 | 2022-04-05 | 西安交通大学 | Topological map generation method based on visual fusion landmark |
CN112086010B (en) * | 2020-09-03 | 2022-03-18 | 中国第一汽车股份有限公司 | Map generation method, map generation device, map generation equipment and storage medium |
CN112419501A (en) * | 2020-12-10 | 2021-02-26 | 中山大学 | Method for constructing geospatial heterogeneous collaborative map |
CN112507056B (en) * | 2020-12-21 | 2023-03-21 | 华南理工大学 | Map construction method based on visual semantic information |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112950781B (en) | Multi-sensor dynamic weighting fusion point cloud map construction method for special scene | |
CN110118549B (en) | Multi-source information fusion positioning method and device | |
Maaref et al. | Lane-level localization and mapping in GNSS-challenged environments by fusing lidar data and cellular pseudoranges | |
JP6002126B2 (en) | Method and apparatus for image-based positioning | |
CN110044354A (en) | A kind of binocular vision indoor positioning and build drawing method and device | |
JP5589900B2 (en) | Local map generation device, global map generation device, and program | |
CN110243358A (en) | The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion | |
CN110187375A (en) | A kind of method and device improving positioning accuracy based on SLAM positioning result | |
CN105352509A (en) | Unmanned aerial vehicle motion target tracking and positioning method under geographic information space-time constraint | |
CN109443348A (en) | It is a kind of based on the underground garage warehouse compartment tracking for looking around vision and inertial navigation fusion | |
CN110232683A (en) | A kind of landslide detection method based on unmanned plane point cloud | |
CN109596121A (en) | A kind of motor-driven station Automatic Targets and space-location method | |
Ruotsalainen | Vision-aided pedestrian navigation for challenging GNSS environments | |
Dill et al. | Seamless indoor-outdoor navigation for unmanned multi-sensor aerial platforms | |
CN108613675B (en) | Low-cost unmanned aerial vehicle movement measurement method and system | |
CN112525197A (en) | Ultra-wideband inertial navigation fusion pose estimation method based on graph optimization algorithm | |
Kaijaluoto | Precise indoor localization for mobile laser scanner | |
CN115183762A (en) | Airport warehouse inside and outside mapping method, system, electronic equipment and medium | |
Grejner-Brzezinska et al. | From Mobile Mapping to Telegeoinformatics | |
Xian et al. | Fusing stereo camera and low-cost inertial measurement unit for autonomous navigation in a tightly-coupled approach | |
Zahedian et al. | Localization of autonomous vehicles: proof of concept for a computer vision approach | |
Ellum et al. | Land-based integrated systems for mapping and GIS applications | |
Rodarmel et al. | Rigorous error modeling for sUAS acquired image-derived point clouds | |
Zhang et al. | OW-LOAM: Observation-weighted LiDAR odometry and mapping | |
Nielsen et al. | Development and flight test of a robust optical-inertial navigation system using low-cost sensors |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |