CN111612829B - High-precision map construction method, system, terminal and storage medium - Google Patents

High-precision map construction method, system, terminal and storage medium

Info

Publication number
CN111612829B
CN111612829B
Authority
CN
China
Prior art keywords
point
point cloud
points
precision map
ground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010495762.9A
Other languages
Chinese (zh)
Other versions
CN111612829A (en)
Inventor
Wang Fan (王凡)
Tang Rui (唐锐)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zongmu Technology Shanghai Co Ltd
Original Assignee
Zongmu Technology Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zongmu Technology Shanghai Co Ltd filed Critical Zongmu Technology Shanghai Co Ltd
Priority to CN202010495762.9A priority Critical patent/CN111612829B/en
Publication of CN111612829A publication Critical patent/CN111612829A/en
Application granted granted Critical
Publication of CN111612829B publication Critical patent/CN111612829B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30248 Vehicle exterior or interior
    • G06T2207/30252 Vehicle exterior; Vicinity of vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Navigation (AREA)

Abstract

The invention provides a method, a system, a terminal and a storage medium for constructing a high-precision map, comprising the following steps: S01: acquiring radar scan point clouds, registering the point clouds pairwise to generate a reconstructed point cloud, and removing the point cloud distortion generated during radar motion with an inertial measurement unit; S02: extracting the feature points of each frame of point cloud, performing association matching on the feature points of adjacent frames, and acquiring the transformation of the radar attitude during the point cloud acquisition interval; S03: extracting the ground points in the three-dimensional point cloud to form a ground feature set and generate the high-precision map. Feature point extraction does not rely on existing feature descriptors; instead, edge and planar features are extracted according to curvature. Point cloud distortion is removed with an inertial measurement unit, feature points are extracted and matched across frames, and features within a certain distance from the ground and below a certain height are mapped onto the ground plane to form the high-precision map.

Description

High-precision map construction method, system, terminal and storage medium
Technical Field
The invention relates to the technical field of automobile electronics, in particular to a method, a system, a terminal and a storage medium for constructing a high-precision map.
Background
A fine map is an electronic map comprising spatial vector data and attribute information, where the spatial vector data serves as the carrier of the attribute information. Traditional electronic maps are produced either by abstracting and extracting spatial vector data from raster data, or by using GPS and robot positioning-and-tracking devices to record the spatial position and field-of-view information of an area and process it into spatial vector data. Such fine maps cannot meet the requirements of L4 or even L5 autonomous driving.
A high-precision map contains not only spatial vector data but also rich semantic information: it can report the meaning of the different colors of a traffic light and indicate the speed limit of a road or the position of a left-turn lane. One of the important characteristics of a high-precision map is its precision: navigation on a mobile phone reaches only meter-level precision, whereas a high-precision map reaches centimeter-level precision, the highest level of map-making precision, which is essential for an unmanned vehicle. Keeping these maps up to date is a significant task, and survey fleets must continually verify and update them. A high-precision map is designed specifically for unmanned vehicles and contains road definitions, intersections, traffic signals, lane rules and other elements for vehicle navigation.
Disclosure of Invention
In order to solve the above and other potential technical problems, the present invention provides a method, a system, a terminal and a storage medium for constructing a high-precision map. Feature point extraction does not rely on existing feature descriptors; instead, edge and planar features are extracted according to curvature. Point cloud distortion is removed with an inertial measurement unit, feature points are extracted and matched across frames, and features within a certain distance from the ground and below a certain height are mapped onto the ground plane to form the high-precision map.
A method for constructing a high-precision map comprises the following steps:
S01: acquiring radar scan point clouds, registering the point clouds pairwise to generate a reconstructed point cloud, and removing the point cloud distortion generated during radar motion with an inertial measurement unit;
S02: extracting the feature points of each frame of point cloud, performing association matching on the feature points of adjacent frames, and acquiring the transformation of the radar attitude during the point cloud acquisition interval;
S03: extracting the ground points in the three-dimensional point cloud to form a ground feature set and generate the high-precision map.
Further, the specific method for acquiring the radar scan point clouds in step S01 is as follows:
S011: determine, according to the requirements, whether integrated navigation needs to be added for mapping; if there are no special requirements, the lidar and the video should preferably be kept synchronized;
S012: after wiring is completed, turn on the board power supply and the lidar power supply, and confirm whether the synchronization signal is received. Connect the computer to the network port of the lidar box with a network cable, set the computer's IP manually, open the wireless connection, select the corresponding network card, and filter the port;
S013: synchronize the integrated navigation and the lidar: connect the lidar's own GPS module to the lidar box, so that the GPS time at the port is real GPS time. The integrated navigation must be configured as follows: 1. measure the 3D distances among the IMU, the GNSS antenna and the lidar device; 2. initialize the base station and acquire its accurate position. After confirming synchronization, plan the scanning path. A smaller parking lot can be collected in a single pass; for example, the underground parking lot at Naxian 800 can be planned as one path. For a large parking lot, collecting everything at once is difficult, so it can be collected section by section, ensuring overlapping paths between sections to facilitate later splicing;
S014: record the video data: connect 5 camera channels to the board and connect a screen to the board; the board must enter a ubuntu system or a ubuntu virtual machine. After adb connects successfully, enter the command line and input the commands in sequence to start recording; to stop recording, input the corresponding commands in the same command line as in the previous step. Problems that may occur when recording starts, and their solutions: display connection error: check that the screen power cable and the HDMI cable are connected; camera connection timeout: check the connection order and stability of the cameras; .sh start/stop script failure: restart the board.
S015: record the radar data: control the vehicle speed to about 10 km/h. Open Veloview, select the small lidar icon, and a driver selection interface pops up; select the driver corresponding to the lidar, click OK, and the real-time point cloud is displayed. Click Record to start recording; after the lidar is connected successfully, the button turns red and becomes available; click it and select a storage path. To stop recording, click the button again.
S016: integrated navigation recording: (1) device installation: a. mount the integrated navigation device horizontally on a bracket in the trunk of the car and fix it; b. connect the integrated navigation and GNSS antennas, and measure and record the lever arm between the two. (2) power connection: connect the power supply of the integrated navigation device to the power strip. (3) start recording: a. choose an open outdoor area, power on the device, and prepare for recording; b. observe the GNSS signal indicator, and once it is normal, press the data recording button to start recording the raw IMU and GNSS data. (4) end recording: after the test is completed, turn off the data recording button to finish recording. (5) data export: connect the device's USB cable to a PC and export the raw IMU and GNSS data. (6) data processing: perform integrated navigation post-processing using the integrated navigation device's professional post-processing software (GINS). (7) result analysis and export: analyze the results with the GINS software and, if the requirements are met, export the integrated navigation post-processing results as required.
Further, the specific method for reconstructing the point cloud in step S01 is as follows:
the process of point cloud reconstruction can be generalized to the process of two-by-two point cloud registration (point cloud registration), preferably with the use of nearest point iterations (Iterative Closest Point, ICP).
Further, the specific steps of the closest point iteration (Iterative Closest Point, ICP) in the step S01 are:
s017: obtaining data;
s018: feature point estimation, typically downsampling the point cloud, or by other means of computation
S019: characteristic value calculation, wherein the adopted characteristic value is NARF, FPFH, SIFT, and the purpose of calculating the characteristic value is mainly matching
S020: the association points are matched, and rigid transformation of two frames of point clouds can be calculated according to the matched points after the association points are matched in a neighbor searching mode, and a method such as Singular Value Decomposition (SVD) is adopted.
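As an illustration of S020, once corresponding point pairs have been matched, the rigid transformation between the two frames can be recovered in closed form by SVD. The following is a minimal sketch, assuming the matched pairs are given as two (N, 3) NumPy arrays; the function name and array layout are illustrative, not taken from the patent:

```python
import numpy as np

def rigid_transform_svd(src: np.ndarray, dst: np.ndarray):
    """Estimate rotation R and translation t such that dst ~ R @ src + t."""
    src_c = src - src.mean(axis=0)           # center both point sets
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # correct an improper rotation (reflection) if one is returned
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Applying the returned R and t to the source frame and re-searching neighbors gives the next ICP iteration.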
Further, to avoid the shortcomings of iterative closest point (ICP), the initial values are typically calculated with another algorithm, such as greedy initial alignment or sample consensus initial alignment, followed by fine registration with ICP.
Further, in step S01, the specific steps of removing the point cloud distortion generated during radar motion with the inertial measurement unit are:
S021: obtain the IMU state imuStart at the initial time of the point cloud, and the IMU state imuCur at the current time after the actual scan time relSweepTime has elapsed;
S022: calculate the drift generated by the current point over the scan time: imuPositionShift = imuCur.position - imuStart.position - imuStart.velocity × relSweepTime. This equation assumes that the vehicle moves at a constant speed during the scan time relSweepTime, which holds approximately because the IMU frequency is high.
Preferably, the lidar scans by rotating while the vehicle moves. At the next scanning instant, if the vehicle has moved forward, the origin of the lidar has also moved forward, so the coordinates returned at the current instant are actually displaced backward relative to the original position, and the whole point cloud tends to be stretched. Because the IMU frequency is high, the drift of the current point can be calculated from the IMU velocity, thereby correcting the point cloud distortion. However, because the current acquisition vehicle travels relatively slowly, the algorithm can also run normally without removing the point cloud distortion.
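A minimal sketch of the correction in S021 and S022 follows, assuming the IMU states expose position and velocity vectors and that each return carries its time offset within the sweep; all names are illustrative:

```python
import numpy as np

def undistort_point(point_xyz, rel_time, sweep_time,
                    imu_start_pos, imu_start_vel, imu_cur_pos):
    """Remove motion distortion from one lidar return.

    rel_time   : time of this return relative to the start of the sweep
    sweep_time : total sweep duration (relSweepTime)
    """
    # drift accumulated over the whole sweep beyond constant-velocity motion
    shift = imu_cur_pos - imu_start_pos - imu_start_vel * sweep_time
    # interpolate the drift linearly to this return's timestamp and remove it
    return np.asarray(point_xyz) - shift * (rel_time / sweep_time)
```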
Further, the specific step of extracting the feature points of each frame of point cloud in step S02 is: extracting edge and planar features according to curvature. Preferably, let point $i$ be a point in the point cloud $P_k$ with coordinates $X_{(k,i)}$, and let $S$ be the set of consecutive points on the same scan line as $i$. The smoothness of the local surface can then be defined by the following function:

$$c = \frac{1}{|S| \cdot \left\| X_{(k,i)} \right\|} \left\| \sum_{j \in S,\, j \neq i} \left( X_{(k,i)} - X_{(k,j)} \right) \right\|$$
if the c value of one point is larger than the set maximum value, the point is the edge characteristic point; and if the value is smaller than the set minimum value, the surface characteristic point is obtained. For better describing the environment, we divide a scan line into four regions, each region retaining two edge points and four face points.
Further, the specific steps of performing the association matching of the feature points of adjacent frames in step S02 are as follows:
For each edge feature point of the current frame, search the nearest neighbor point $j$ of the current point in the adjacent frame, and additionally search the next-nearest neighbor point $l$. For each planar feature point of the current frame, search the nearest neighbor point $j$ in the adjacent frame, and additionally search two further neighbor points $l$ and $m$. Their relationship is shown in fig. 4. Let $P_k$ be the point cloud at time $t_k$, $\bar{P}_k$ its projection to time $t_{k+1}$, and $P_{k+1}$ the point cloud at time $t_{k+1}$; $\bar{P}_k$ and $P_{k+1}$ are used to estimate the motion of the lidar. Let $\mathcal{E}_{k+1}$ and $\mathcal{H}_{k+1}$ be the edge feature point set and the planar feature point set, respectively; their associated points are found in $\bar{P}_k$. At each iteration, the feature point sets are projected to the previous time, giving $\bar{\mathcal{E}}_{k+1}$ and $\bar{\mathcal{H}}_{k+1}$.
For an edge point $i \in \bar{\mathcal{E}}_{k+1}$, if $(j, l)$ is its associated edge, the point-to-line distance can be expressed as:

$$d_{\mathcal{E}} = \frac{\left| \left( \bar{X}_{(k+1,i)} - \bar{X}_{(k,j)} \right) \times \left( \bar{X}_{(k+1,i)} - \bar{X}_{(k,l)} \right) \right|}{\left| \bar{X}_{(k,j)} - \bar{X}_{(k,l)} \right|}$$

where $X$ denotes the three-dimensional coordinates of a point. Similarly, for a planar point $i \in \bar{\mathcal{H}}_{k+1}$ and the associated plane formed by $(j, l, m)$, the point-to-plane distance is:

$$d_{\mathcal{H}} = \frac{\left| \left( \bar{X}_{(k+1,i)} - \bar{X}_{(k,j)} \right) \cdot \left( \left( \bar{X}_{(k,j)} - \bar{X}_{(k,l)} \right) \times \left( \bar{X}_{(k,j)} - \bar{X}_{(k,m)} \right) \right) \right|}{\left| \left( \bar{X}_{(k,j)} - \bar{X}_{(k,l)} \right) \times \left( \bar{X}_{(k,j)} - \bar{X}_{(k,m)} \right) \right|}$$
further, the specific steps of acquiring the radar motion in the acquisition point cloud interval time in the step S02 are as follows:
s023: is provided withIs [ t ] k+1 ,t k ]Posture change between->Is a 6-DOF rigid body transformation;t x ,t y ,t z is a triaxial translation, θ x ,θ y ,θ z Is the rotation angle of the three axes, and a point i epsilon P is given k+1 ,t i For the time of this point, +.>Is [ t ] k+1 ,t i ]The change between->The interpolation can be used for calculation:
s024: in order to estimate the motion of the radar we need to know epsilon k+1 Andor->And->The relationship between them, we can define:
for three-axis translation, R is a rotation matrix, which can be defined by the rondrigas formula:
in the above formula, θ is the degree of rotation:
ω is the unit vector related to the rotation direction:
is an oblique symmetric matrix of ω.
S025: by combining the above distance calculation formula and the above formula, we can obtain the relationship between the edge feature points and the edge lines related to the edge feature points:
similarly, the following relationship exists between the surface feature points and the associated surfaces:
calculating lidar motion by adopting an L-M algorithm, and combining two distance formulas to obtain a nonlinear equation:
s026: solving the above formula forJacobian matrix of (2), defined as +.>And (3) obtaining a high-precision map by iteration and minimizing d, wherein an iteration formula is as follows:
lambda is a coefficient defined by the L-M algorithm.
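The damped update in S026 can be written compactly once a residual function f is available. The following sketch assumes f stacks the edge and plane distances for a candidate pose vector T and approximates the Jacobian by forward differences; all names are illustrative:

```python
import numpy as np

def lm_step(f, T, lam=1e-3, eps=1e-6):
    """One Levenberg-Marquardt update of the 6-DOF pose vector T."""
    d = f(T)                                  # stacked distances, shape (M,)
    J = np.empty((d.size, T.size))
    for k in range(T.size):                   # forward-difference Jacobian
        dT = np.zeros_like(T)
        dT[k] = eps
        J[:, k] = (f(T + dT) - d) / eps
    H = J.T @ J + lam * np.diag(np.diag(J.T @ J))
    return T - np.linalg.solve(H, J.T @ d)    # drive the distances toward zero
```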
Further, step S02 further includes the step of acquiring the odometer data of the radar: the distortion removal, feature extraction, association matching and pose estimation above are repeated for each successive sweep, and the resulting pose increments are accumulated as the odometry. The algorithm flow is outlined in the sketch below.
further, the assumption is that a cloth is laid on the ground, and the cloth falls down due to gravity. If the cloth is soft enough, it will completely fit the ground. The algorithm process is as follows: a) Turning the point cloud; b) Covering the cloth. In the cloth model, cloth is a grid model composed of interconnected nodes, which is called a spring model. The nodes can be defined as "virtual springs", following hooke's law (f= -kx)
According to newton's second law, the relationship between displacement and force:
x is the displacement at time t; f (F) ext (X, t) is the force of gravity and the impact caused by the obstacle; f (F) int The (X, t) elasticity has the following rule in the point cloud:
movement of the node is limited to the vertical direction
After the node reaches the correct position, the ground is immovable
The force is divided into two steps to improve efficiency
The movement of the points can be divided into four phases, as shown in fig. 8: a) Initializing; b) Descending under the action of gravity; c) Internal detection, namely, whether a critical point is reached or not; d) The movable point continues to move.
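In this toy sketch, gravity pulls one cloth node down, movement is restricted to the vertical axis, and a node that reaches the surface is pinned; internal spring forces are omitted and all constants are illustrative:

```python
def update_node(z, z_ground, velocity, movable, dt=0.1, g=-9.8):
    """Advance one cloth node; z is its height, z_ground the surface below."""
    if not movable:
        return z, velocity, movable           # pinned nodes no longer move
    velocity += g * dt                        # external force: gravity
    z += velocity * dt                        # movement is vertical only
    if z <= z_ground:                         # intersection check (phase c)
        z, velocity, movable = z_ground, 0.0, False   # critical point reached
    return z, velocity, movable
```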
Further, the specific way of forming the ground feature set in step S03 to generate the high-precision map is as follows (a sketch of these steps follows the list):
S031: set the gridding resolution to 0.03 m x 0.03 m;
S032: calculate the point cloud boundary and the size of the raster image;
S033: count the number of points in each grid cell and accumulate their heights and reflectivities;
S034: calculate the average height and reflectivity within each grid cell;
S035: normalize the average reflectivity over the entire image and multiply by 255 to obtain the gray value of the image.
Preferably, the rasterized image is as shown in fig. 5.
A high-precision map construction system, comprising:
a point cloud reconstruction unit for pairwise point cloud registration;
the inertial measurement unit is used for removing point cloud distortion generated in the radar moving process;
the feature extraction module is used for extracting feature points of each frame point cloud;
the association matching module is used for performing association matching on the extracted feature points of adjacent frames and acquiring the transformation of the radar attitude during the point cloud acquisition interval;
and the map generation module is used for generating a high-precision map from the feature points and the characteristics of their associated regions.
Further, the system further comprises a ground feature set generation module, which acquires the feature points located on the ground from the feature extraction module and forms a ground feature set used to characterize ground contours and obstacles.
A terminal device, such as a smart phone capable of executing the above high-precision map construction method, or a vehicle-mounted terminal control device capable of executing a program of the above high-precision map construction method.
A server comprising an implementation of the above high-precision map construction method and/or the above high-precision map construction system.
A computer storage medium for storing a software program corresponding to the above high-precision map construction method and/or a data management system for the high-precision map.
As described above, the present invention has the following advantageous effects:
the characteristic point extraction does not adopt the existing characteristic, but extracts the corner and plane characteristics according to the curvature; and removing point cloud distortion by using an inertial measurement unit, extracting characteristic points, performing characteristic point association matching, and mapping the characteristics with a certain distance from the ground and a certain height onto the ground plane to form a high-precision map.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments are briefly introduced below. It is apparent that the drawings described below are only some embodiments of the present invention, and that other drawings can be derived from them by a person skilled in the art without inventive effort.
Fig. 1 shows a schematic flow chart of the present invention.
Fig. 2 shows a flow chart of the closest point iteration.
Fig. 3 is a schematic flow chart of the map construction according to the present invention.
Fig. 4 shows a schematic diagram of the association of edge feature points and face feature points.
Fig. 5 is a schematic diagram showing the implementation result of the map construction method of the present invention.
FIG. 6 shows a graphical representation of ARCMAP map generation in accordance with the present invention.
Fig. 7 shows a diagram of a vector map that is processed according to the present invention.
Fig. 8 shows a schematic diagram of cloth simulation filtering.
Detailed Description
The embodiments of the present invention are described below with reference to specific examples, and other advantages and effects of the present invention will be readily apparent to those skilled in the art from the disclosure of this specification. The invention may also be implemented or applied through other different embodiments, and the details of this specification may be modified or changed in various ways based on different viewpoints and applications without departing from the spirit of the invention. It should be noted that, in the absence of conflict, the following embodiments and the features in the embodiments may be combined with each other.
It should be understood that the structures, proportions and sizes shown in the drawings are only used to illustrate the content disclosed in this specification for the understanding and reading of those skilled in the art, and are not intended to limit the conditions under which the invention can be implemented; any modification of structure, change of proportion or adjustment of size that does not affect the effects and objectives achievable by the invention shall still fall within the scope covered by the technical content disclosed herein. Terms such as "upper", "lower", "left", "right", "middle" and "a" recited in this specification are for clarity of description only and are not intended to limit the implementable scope of the invention; changes or adjustments of their relative relationships, without substantive changes to the technical content, shall also be regarded as falling within the implementable scope of the invention.
With reference to figures 1 to 8 of the drawings, the high-precision map construction method and system are implemented as set forth above; preferred embodiments of the terminal, server and storage medium are described below.
As a preferred embodiment, this embodiment further provides a terminal device capable of executing the program, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a rack server, a blade server, a tower server or a cabinet server (including an independent server, or a server cluster composed of multiple servers). The terminal device of this embodiment includes at least, but is not limited to, a memory and a processor that are communicatively connected to each other through a system bus. It should be noted that although a terminal device with a memory and a processor is described, not all of the illustrated components need be implemented; alternative implementations of the data management of high-precision maps may include more or fewer components.
As a preferred embodiment, the memory (i.e. a readable storage medium) includes a flash memory, a hard disk, a multimedia card, a card-type memory (e.g. SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the memory may be an internal storage unit of a computer device, such as the hard disk or memory of the computer device. In other embodiments, the memory may also be an external storage device of the computer device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card or a flash card provided on the computer device. Of course, the memory may also include both an internal storage unit of the computer device and an external storage device. In this embodiment, the memory is typically used to store the operating system installed on the computer device and various types of application software, such as the program code of the high-precision map construction method of the embodiment. In addition, the memory can be used to temporarily store various types of data that have been output or are to be output.
A computer-readable storage medium having a computer program stored thereon, characterized in that the program, when executed by a processor, implements the steps of the above high-precision map construction method.
This embodiment also provides a computer-readable storage medium, such as a flash memory, a hard disk, a multimedia card, a card-type memory (e.g. SD or DX memory), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server or an App application store, on which a computer program is stored that performs the corresponding functions when executed by a processor. The computer-readable storage medium of this embodiment is used to store a data management program based on the high-precision map, and the program, when executed by a processor, implements the high-precision map construction method of the embodiments.
The above embodiments merely illustrate the principles and effects of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the invention. Accordingly, all equivalent modifications and changes made by persons of ordinary skill in the art without departing from the spirit and technical ideas disclosed by the invention shall still be covered by the claims of the invention.

Claims (9)

1. A method for constructing a high-precision map, characterized by comprising the following steps:
S01: acquiring radar scan point clouds, registering the point clouds pairwise to generate a reconstructed point cloud, and removing the point cloud distortion generated during radar motion with an inertial measurement unit;
S02: extracting the feature points of each frame of point cloud, performing association matching on the feature points of adjacent frames, and acquiring the transformation of the radar attitude during the point cloud acquisition interval;
S03: extracting the ground points in the three-dimensional point cloud to form a ground feature set and generate the high-precision map;
the step S01 of registering the point clouds in pairs includes: obtaining data;
estimating characteristic points, and downsampling the point cloud;
calculating a characteristic value, wherein the adopted characteristic value is NARF, FPFH, SIFT;
and after the association points are matched in a neighbor searching mode, the rigid transformation of the two-frame point cloud can be calculated according to the matched points.
2. The method for constructing a high-precision map according to claim 1, wherein in step S01 the specific steps of removing the point cloud distortion generated during radar motion with the inertial measurement unit are:
S021: acquiring the IMU state imuStart at the initial time of the point cloud, and acquiring the IMU state imuCur at the current time after the actual scan time relSweepTime has elapsed;
S022: calculating the drift generated by the current point over the scan time: imuPositionShift = imuCur.position - imuStart.position - imuStart.velocity × relSweepTime;
S023: the above equation assumes that the vehicle moves at a constant speed during the scan time relSweepTime.
3. The method for constructing a high-precision map according to claim 1, wherein the pairwise point cloud registration in step S01 is implemented with iterative closest point (ICP);
the feature values are calculated for matching;
and the rigid transformation adopts a singular value decomposition (SVD) method.
4. The method for constructing a high-precision map according to claim 3, wherein, to avoid the drawbacks of iterative closest point (ICP), the initial values are calculated using algorithms including greedy initial alignment and sample consensus, followed by fine registration with ICP.
5. A high-precision map construction system, comprising:
the point cloud reconstruction unit is used for pairwise point cloud registration;
the inertial measurement unit is used for removing point cloud distortion generated in the radar moving process;
the feature extraction module is used for extracting feature points of each frame point cloud;
the association matching module is used for performing association matching on the extracted feature points of adjacent frames and acquiring the transformation of the radar attitude during the point cloud acquisition interval;
the map generation module is used for generating a high-precision map from the feature points and the characteristics of their associated regions;
the registering of the point clouds in the point cloud reconstruction unit comprises the following steps: obtaining data;
estimating characteristic points, and downsampling the point cloud;
calculating a characteristic value, wherein the adopted characteristic value is NARF, FPFH, SIFT;
and after the association points are matched in a neighbor searching mode, the rigid transformation of the two-frame point cloud can be calculated according to the matched points.
6. The system of claim 5, further comprising a ground feature set generation module, which acquires the feature points located on the ground from the feature extraction module and forms a ground feature set used to characterize ground contours and obstacles.
7. A server, characterized in that it comprises means for implementing the high-precision map construction method according to any one of claims 1 to 4 and/or the high-precision map construction system according to any one of claims 5 to 6.
8. A terminal device, characterized in that the terminal device is a vehicle-mounted terminal control device for scheduling the high-precision map construction system according to any one of claims 5 to 6.
9. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when being executed by a processor, realizes the steps in the method of constructing a high-precision map as claimed in any one of claims 1 to 4.
CN202010495762.9A 2020-06-03 2020-06-03 High-precision map construction method, system, terminal and storage medium Active CN111612829B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010495762.9A CN111612829B (en) 2020-06-03 2020-06-03 High-precision map construction method, system, terminal and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010495762.9A CN111612829B (en) 2020-06-03 2020-06-03 High-precision map construction method, system, terminal and storage medium

Publications (2)

Publication Number Publication Date
CN111612829A CN111612829A (en) 2020-09-01
CN111612829B true CN111612829B (en) 2024-04-09

Family

ID=72198438

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010495762.9A Active CN111612829B (en) 2020-06-03 2020-06-03 High-precision map construction method, system, terminal and storage medium

Country Status (1)

Country Link
CN (1) CN111612829B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111710039A (en) * 2020-06-03 2020-09-25 纵目科技(上海)股份有限公司 Method, system, terminal and storage medium for constructing high-precision map
CN115079128B (en) * 2022-08-23 2022-12-09 深圳市欢创科技有限公司 Method and device for distortion removal of laser radar point cloud data and robot

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106296693A (en) * 2016-08-12 2017-01-04 浙江工业大学 Based on 3D point cloud FPFH feature real-time three-dimensional space-location method
CN108225348A (en) * 2017-12-29 2018-06-29 百度在线网络技术(北京)有限公司 Map building and the method and apparatus of movement entity positioning
CN109308737A (en) * 2018-07-11 2019-02-05 重庆邮电大学 A kind of mobile robot V-SLAM method of three stage point cloud registration methods
CN109345574A (en) * 2018-08-31 2019-02-15 西安电子科技大学 Laser radar three-dimensional based on semantic point cloud registering builds drawing method
CN109709801A (en) * 2018-12-11 2019-05-03 智灵飞(北京)科技有限公司 A kind of indoor unmanned plane positioning system and method based on laser radar
WO2019099802A1 (en) * 2017-11-17 2019-05-23 DeepMap Inc. Iterative closest point process based on lidar with integrated motion estimation for high definitions maps
CN109974742A (en) * 2017-12-28 2019-07-05 沈阳新松机器人自动化股份有限公司 A kind of laser Method for Calculate Mileage and map constructing method
CN110223379A (en) * 2019-06-10 2019-09-10 于兴虎 Three-dimensional point cloud method for reconstructing based on laser radar
CN110261870A (en) * 2019-04-15 2019-09-20 浙江工业大学 It is a kind of to synchronize positioning for vision-inertia-laser fusion and build drawing method
CN110658530A (en) * 2019-08-01 2020-01-07 北京联合大学 Map construction method and system based on double-laser-radar data fusion and map
CN111220993A (en) * 2020-01-14 2020-06-02 长沙智能驾驶研究院有限公司 Target scene positioning method and device, computer equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10539676B2 (en) * 2017-03-22 2020-01-21 Here Global B.V. Method, apparatus and computer program product for mapping and modeling a three dimensional structure
US10481267B2 (en) * 2017-06-13 2019-11-19 TuSimple Undistorted raw LiDAR scans and static point extractions method for ground truth static scene sparse flow generation
US10866101B2 (en) * 2017-06-13 2020-12-15 Tusimple, Inc. Sensor calibration and time system for ground truth static scene sparse flow generation
US10964053B2 (en) * 2018-07-02 2021-03-30 Microsoft Technology Licensing, Llc Device pose estimation using 3D line clouds


Also Published As

Publication number Publication date
CN111612829A (en) 2020-09-01

Similar Documents

Publication Publication Date Title
CN111710040B (en) High-precision map construction method, system, terminal and storage medium
CN109974693B (en) Unmanned aerial vehicle positioning method and device, computer equipment and storage medium
Daneshmand et al. 3d scanning: A comprehensive survey
US8437501B1 (en) Using image and laser constraints to obtain consistent and improved pose estimates in vehicle pose databases
CN110033489A (en) A kind of appraisal procedure, device and the equipment of vehicle location accuracy
US20090154793A1 (en) Digital photogrammetric method and apparatus using intergrated modeling of different types of sensors
CN111540048A (en) Refined real scene three-dimensional modeling method based on air-ground fusion
CN105096386A (en) Method for automatically generating geographic maps for large-range complex urban environment
KR102200299B1 (en) A system implementing management solution of road facility based on 3D-VR multi-sensor system and a method thereof
Jiang et al. Unmanned Aerial Vehicle-Based Photogrammetric 3D Mapping: A survey of techniques, applications, and challenges
CN113593017A (en) Method, device and equipment for constructing surface three-dimensional model of strip mine and storage medium
Verykokou et al. Oblique aerial images: a review focusing on georeferencing procedures
CN113192193A (en) High-voltage transmission line corridor three-dimensional reconstruction method based on Cesium three-dimensional earth frame
CN111612829B (en) High-precision map construction method, system, terminal and storage medium
Wendel et al. Automatic alignment of 3D reconstructions using a digital surface model
CN114494618A (en) Map generation method and device, electronic equipment and storage medium
JP2007193850A (en) Change region recognition apparatus
CN111710039A (en) Method, system, terminal and storage medium for constructing high-precision map
CN113129422A (en) Three-dimensional model construction method and device, storage medium and computer equipment
CN115601517A (en) Rock mass structural plane information acquisition method and device, electronic equipment and storage medium
JP3966419B2 (en) Change area recognition apparatus and change recognition system
Al-Durgham The registration and segmentation of heterogeneous Laser scanning data
Rumpler et al. Rapid 3d city model approximation from publicly available geographic data sources and georeferenced aerial images
Brenner et al. Automated reconstruction of 3D city models
CN110148205A (en) A kind of method and apparatus of the three-dimensional reconstruction based on crowdsourcing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant