CN109470255B - High-precision map automatic generation method based on high-precision positioning and lane line identification - Google Patents

High-precision map automatic generation method based on high-precision positioning and lane line identification

Info

Publication number
CN109470255B
CN109470255B (application CN201811468535.6A; publication CN109470255A)
Authority
CN
China
Prior art keywords
frame
map
lane line
point set
precision
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811468535.6A
Other languages
Chinese (zh)
Other versions
CN109470255A (en)
Inventor
胡禹超
戴震
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heduo Technology Guangzhou Co ltd
Original Assignee
HoloMatic Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HoloMatic Technology Beijing Co Ltd filed Critical HoloMatic Technology Beijing Co Ltd
Priority to CN201811468535.6A priority Critical patent/CN109470255B/en
Publication of CN109470255A publication Critical patent/CN109470255A/en
Application granted granted Critical
Publication of CN109470255B publication Critical patent/CN109470255B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 - Map- or contour-matching
    • G01C21/32 - Structuring or formatting of map data

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Instructional Devices (AREA)

Abstract

The invention discloses a high-precision map automatic generation method based on high-precision positioning and lane line identification, which comprises the following steps: synchronizing the high-precision positioning data acquired each time with the lane line data; establishing a map frame from the synchronized data and storing it in a map frame database; matching newly acquired lane line data against all established map frames, establishing a new map frame if the matching fails, and updating the information of the matched map frame if the matching succeeds; performing inter-frame smoothing on all existing frames among the established map frames to obtain smoothed cubic curves; and splicing the cubic curves to generate the high-precision map. The method generates a high-precision map that is accurate to the lane line, is based on high-precision positioning, and splices lane lines automatically, reducing the complexity of high-precision map generation and avoiding the heavy manual labor and high error rate of conventional high-precision map production.

Description

High-precision map automatic generation method based on high-precision positioning and lane line identification
Technical Field
The invention relates to the technical field of unmanned driving, in particular to a high-precision map automatic generation method based on high-precision positioning and lane line identification.
Background
High-precision maps are a scarce and indispensable resource in the unmanned-driving field and play a core role throughout it. They allow an unmanned vehicle to perceive complex road information such as gradient, curvature and heading in advance and, combined with intelligent path planning, to make correct decisions; they are therefore an essential data source for unmanned driving. Information collected by the sensors must be compared with a stored high-precision map to determine position and heading so that the unmanned vehicle can drive safely to its destination, which makes the accuracy of high-precision map data acquisition critical for unmanned driving. Conventional high-precision map production requires a large amount of manual annotation; it is time-consuming and labor-intensive, and the error rate caused by manual annotation is high, which hinders the development of unmanned driving. Because current high-precision map production demands high accuracy, its computation is complex and the process is slow. It is therefore highly desirable to provide a fast high-precision map generation method that can automatically splice lane lines on the basis of high-precision positioning and lane line identification.
Disclosure of Invention
An object of the present invention is to solve at least the above problems and to provide at least the advantages described later.
It is still another object of the present invention to provide a high-precision map automatic generation method based on high-precision positioning and lane line identification, so as to generate a high-precision map that is accurate to a lane line based on high-precision positioning and can automatically splice the lane lines, reduce the complexity of generating the high-precision map, and avoid the problems of large amount of labor consumption and high error rate required by the conventional high-precision map production.
To achieve these objects and other advantages in accordance with the present invention, there is provided a high-precision map automatic generation method based on high-precision positioning and lane line recognition, comprising:
Step 1, synchronizing the high-precision positioning data acquired each time with the lane line data through time alignment processing to obtain the position and attitude of the synchronized high-precision positioning data.
Step 2, establishing a map frame from the synchronized lane line data and high-precision positioning data obtained in step 1, and storing the map frame in a map frame database.
Step 3, matching the newly acquired lane line data against all the map frames established from previously acquired lane line data; if the matching fails, establishing a new map frame from the newly acquired lane line data; if the matching succeeds, updating the information of the map frame established in step 2 until the update is finished.
Step 4, performing inter-frame smoothing on all existing frames among the map frames established in steps 2 and 3, and recalculating, from the point sets of the map frames updated in step 3, the cubic curves representing the lane line information of the smoothed frames.
Step 5, splicing the cubic curves to generate the high-precision map.
Preferably, step 1 further comprises:
taking the timestamp at which the lane line data are acquired as the time alignment point, the position p′ and attitude r′ of the high-precision positioning data aligned to the lane line data timestamp are respectively:
p′ = p + v(t_m - t_l); (1)
r′ = r·ω(t_m - t_l); (2)
where t_m denotes the timestamp at which the lane line data are acquired, t_l denotes the timestamp at which the high-precision positioning data are acquired, and p, r, v and ω denote, respectively, the position, attitude, linear velocity and angular velocity of the high-precision positioning data before alignment.
Preferably, step 2 further comprises:
the map frame mainly comprises the following elements:
P_F: the position of the frame spatial information; R_F: the attitude of the frame spatial information;
C_F: the cubic curve of the lane line information; S_F: the lane line sampling point set;
L_F: the inter-frame topology information associating the preceding and following frames.
All of these elements are expressed in a frame coordinate system, in which the lateral direction is the x axis, the longitudinal direction is the y axis, and the direction perpendicular to the x and y axes is the z axis.
Preferably, step 2 further comprises:
the map frame is established on the premise that the lane line type and/or color changes, and/or the lane line is broken, and/or the unmanned vehicle changes lanes while driving, and/or the length of the map frame in the y-axis direction exceeds a threshold value.
Preferably, the conditions for successful matching in step 3 are as follows:
the newly acquired lane line data and one or more of all the map frames established according to the acquired lane line data have an overlapping part, and the overlapping part reaches the threshold value; and/or
And the newly acquired lane line data is connected with any one of the map frames which are established in a front-back mode.
Preferably, the information updating of the map frame in step 3 further includes:
and step C, sampling the cubic curve representing the lane line at certain intervals to obtain a sampling point set.
And D, calculating the position and the posture of the sampling point set relative to the non-updated map frame, and projecting the position of the sampling point set into a frame coordinate system of the non-updated map frame to obtain the sampling point set in the frame coordinate system.
And E, if the position of the sampling point set in the frame coordinate system exceeds the length limit of the frame coordinate system, cutting out an excess part to serve as new lane line data, and entering the step C to update the map frame.
And F, merging the sampling point set of the non-updated map frame with the sampling point set in the frame coordinate system in the step D to obtain a sampling point set which is re-sampled at a certain interval after merging, and updating the sampling point set of the non-updated map frame into the re-sampled sampling point set.
And G, carrying out cubic curve fitting on the sampling point set re-sampled in the step F to obtain a fitted sampling point set, and updating the result of cubic curve fitting of the sampling point set of the non-updated map frame into the fitted sampling point set.
And H, finishing the incidence relation between the front frame and the rear frame of the connected frame by calculating the connection relation between the new map frame and the established map frame so as to finish the information updating of the established map frame.
Preferably, the inter-frame smoothing process in step 4 is performed on the premise that no new lane line data is input.
Preferably, the inter-frame smoothing process in step 4 further includes:
and establishing a cubic curve, wherein the point set for fitting the cubic curve is obtained by mixing a second half point set in the point sets of the previous frame and a first half point set in the point sets of the subsequent frame in the previous frame and the subsequent frame which have a correlation.
And projecting the second half point set onto the cubic curve along a direction perpendicular to a y axis of the frame coordinate system to obtain a projected point set.
Smoothing the projection points corresponding to the points under the former frame coordinate system and the latter half point set on the y axis of the frame coordinate system to obtain smoothed points, thereby obtaining smoothed frames; the smoothing treatment of the point set under the frame coordinate system is similar; the formula of the point-to-point smoothing process is as follows:
P″=((1-a)x+ax′,y,(1-a)z+az′) (3)
wherein, the smoothing coefficient a of the previous frame point set is y/L; l represents the length of the frame.
And the smoothing coefficient a of the subsequent frame point set is 1-y/L.
And P ═ x, y, z denotes any point in the latter half set of points in the former frame coordinate system.
P ' ═ x ', y, z ' denotes any point in the first half set of points in the frame-after coordinate system.
The invention at least comprises the following beneficial effects:
According to the method, a time interval exists between the high-precision positioning data and the lane line data because the positioning data lead or lag the lane line data; time alignment processing aligns the positioning data to the acquisition timestamp of the lane line data, ensuring consistency of data time during processing, which is a precondition of high-precision map generation, since the coordinate transformation between the high-precision positioning data and the lane line data must be performed first. When new lane line data are acquired and a map frame is established, the position and attitude of the spatial information in the frame are, taking the high-precision positioning data and the lane line data as prior conditions, the position and attitude of the time-aligned high-precision positioning data. When the lane line data change, they are first matched against the established map frames for overlapping or front-to-back associated parts, so that the acquired lane line data update and replace the existing data; if the matching is unsuccessful, a new map frame is established from the newly acquired lane line data, and lane line splicing is thus completed automatically through matching, updating and re-creation. Inter-frame smoothing then provides smooth transitions between frames and connects the overlapping parts of the point sets within the frames, producing a high-precision map that is accurate to the lane line, is based on high-precision positioning, and splices lane lines automatically. Generating the map from the prior conditions of high-precision positioning and lane line acquisition reduces the complexity of high-precision map generation and avoids the heavy manual labor and high error rate of conventional high-precision map production, which is important for reliable and safe unmanned driving.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention.
Drawings
FIG. 1 is a flow chart of a high-precision map automatic generation method based on high-precision positioning and lane line identification according to the present invention;
FIG. 2 is a diagram illustrating elements included in a map frame in a frame coordinate system according to the present invention;
FIG. 3 is a schematic diagram of the lane line data of the present invention when there is a blind area;
FIG. 4 is a diagram illustrating a process for updating map frame information according to the present invention;
fig. 5 is a schematic diagram of the updated map frame information according to the present invention.
Detailed Description
The present invention is further described in detail below with reference to the attached drawings so that those skilled in the art can implement the invention by referring to the description text.
It will be understood that terms such as "having," "including," and "comprising," as used herein, do not preclude the presence or addition of one or more other elements or groups thereof.
As shown in fig. 1, the present invention provides a high-precision map automatic generation method based on high-precision positioning and lane line identification, including:
Step 1, synchronizing the high-precision positioning data acquired each time with the lane line data through time alignment processing to obtain the position and attitude of the synchronized high-precision positioning data.
Step 2, establishing a map frame from the synchronized lane line data and high-precision positioning data obtained in step 1, and storing the map frame in a map frame database.
Step 3, matching the newly acquired lane line data against all the map frames established from previously acquired lane line data; if the matching fails, establishing a new map frame from the newly acquired lane line data; if the matching succeeds, updating the information of the map frame established in step 2 until the update is finished.
Step 4, performing inter-frame smoothing on all existing frames among the map frames established in steps 2 and 3, and recalculating, from the point sets of the map frames updated in step 3, the cubic curves representing the lane line information of the smoothed frames.
Step 5, splicing the cubic curves to generate the high-precision map.
In this scheme, a time interval exists between the high-precision positioning data and the lane line data because the positioning data lead or lag; time alignment processing aligns the positioning data to the acquisition timestamp of the lane line data to ensure consistency of data time during processing, which is a precondition of high-precision map generation, since the coordinate transformation between the high-precision positioning data and the lane line data must be performed first. When new lane line data are acquired and a map frame is established, the position and attitude of the spatial information in the frame are, with the high-precision positioning data and lane line data as prior conditions, the position and attitude of the time-aligned high-precision positioning data. When the lane line data change, they are first matched against the established map frames for overlapping or front-to-back associated parts, so that the acquired lane line data update and replace the existing data; if the matching is unsuccessful, a new map frame is established from the newly acquired lane line data, and lane line splicing is thus completed automatically through matching, updating and re-creation. Inter-frame smoothing then provides smooth transitions between frames and connects the overlapping parts of the point sets within the frames, producing a high-precision map that is accurate to the lane line, is based on high-precision positioning, and splices lane lines automatically. This reduces the complexity of high-precision map generation, avoids the heavy manual labor and high error rate of conventional high-precision map production, and is important for reliable and safe unmanned driving.
In a preferred embodiment, step 1 further comprises:
taking the timestamp at which the lane line data are acquired as the time alignment point, the position p′ and attitude r′ of the high-precision positioning data aligned to the lane line data timestamp are respectively:
p′ = p + v(t_m - t_l); (1)
r′ = r·ω(t_m - t_l); (2)
where t_m denotes the timestamp at which the lane line data are acquired, t_l denotes the timestamp at which the high-precision positioning data are acquired, and p, r, v and ω denote, respectively, the position, attitude, linear velocity and angular velocity of the high-precision positioning data before alignment.
In the above scheme, a timestamp is recorded for the moment at which the lane line data are acquired and is used as the reference. Because the high-precision positioning data are sampled at a different moment, they lead or lag the lane line data to some extent, so the generation method synchronizes the lane line data with the high-precision positioning data and converts the latter by formulas (1) and (2) to obtain the synchronized high-precision positioning data.
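As an illustration of formulas (1) and (2), the sketch below aligns one positioning sample to a lane line timestamp; the variable names are assumptions, and the attitude update is reduced to a single yaw angle, whereas the patent composes the full attitude r with the angular velocity over the time offset.

```python
import numpy as np

def align_positioning_to_lane_timestamp(p, r_yaw, v, omega, t_l, t_m):
    """Align a high-precision positioning sample to the lane line timestamp.

    p      : np.ndarray, shape (3,), position at time t_l
    r_yaw  : float, attitude reduced to a yaw angle (an illustrative simplification)
    v      : np.ndarray, shape (3,), linear velocity
    omega  : float, yaw rate (angular velocity)
    t_l    : float, timestamp of the positioning sample
    t_m    : float, timestamp of the lane line data (the alignment point)
    """
    dt = t_m - t_l
    p_aligned = p + v * dt            # formula (1): p' = p + v(t_m - t_l)
    r_aligned = r_yaw + omega * dt    # formula (2), applied to yaw only in this sketch
    return p_aligned, r_aligned

# Example: a positioning sample 40 ms older than the lane line frame.
p_new, yaw_new = align_positioning_to_lane_timestamp(
    p=np.array([10.0, 5.0, 0.0]), r_yaw=0.1,
    v=np.array([8.0, 0.2, 0.0]), omega=0.01,
    t_l=100.00, t_m=100.04)
```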
In a preferred embodiment, step 2 further comprises:
the map frame mainly comprises the following elements:
P_F: the position of the frame spatial information; R_F: the attitude of the frame spatial information;
C_F: the cubic curve of the lane line information; S_F: the lane line sampling point set;
L_F: the inter-frame topology information associating the preceding and following frames.
All of these elements are expressed in a frame coordinate system, in which the lateral direction is the x axis, the longitudinal direction is the y axis, and the direction perpendicular to the x and y axes is the z axis.
In the above scheme, as shown in fig. 2, the map frame includes elements in the frame coordinate system.
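The frame elements can be captured in a small data structure; the following sketch is only illustrative. The field types, the representation of the cubic curve as polynomial coefficients of x = f(y), and the storage of the front/back association L_F as frame indices are assumptions, not specified by the patent.

```python
from dataclasses import dataclass
from typing import List, Optional
import numpy as np

@dataclass
class MapFrame:
    """One map frame, expressed in its own frame coordinate system
    (x lateral, y longitudinal, z perpendicular to x and y)."""
    P_F: np.ndarray                 # position of the frame spatial information, shape (3,)
    R_F: np.ndarray                 # attitude of the frame spatial information, e.g. a 3x3 rotation
    C_F: np.ndarray                 # cubic curve of the lane line: coefficients [c3, c2, c1, c0] of x = f(y)
    S_F: np.ndarray                 # lane line sampling point set, shape (N, 3)
    L_F_prev: Optional[int] = None  # inter-frame topology: index of the associated preceding frame
    L_F_next: Optional[int] = None  # inter-frame topology: index of the associated following frame

# the map frame database can simply be a list of frames
map_frame_db: List[MapFrame] = []
```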
In a preferred embodiment, step 2 further comprises:
the map frame is established on the premise that the lane line type and/or color changes, and/or the lane line is broken, and/or the unmanned vehicle changes lanes while driving, and/or the length of the map frame in the y-axis direction exceeds a threshold value.
In this scheme, each map frame corresponds to the position and attitude at which its lane line data were acquired; the different positions and attitudes, i.e. the different map frames, together form the high-precision map based on lane line recognition. When the lane line is recognized and the lane line data change, a new map frame is established on the premise that matching is unsuccessful. When a map frame is established, if the length of the lane line in the map frame along the y-axis direction exceeds the threshold, the position P_F and attitude R_F of the map frame are assigned the values p′ and r′ respectively. As shown in fig. 3, the lane line data in the area indicated by W are extracted to the map frame database. Because the acquired lane line data have a blind area [0, y_b] in the y-axis direction, i.e. the range indicated by G, the existing map frames are searched for data corresponding to that area; if such data are found, they are filled into the blind area.
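A minimal sketch of the frame-creation check and the blind-area fill described above might look as follows; the observation fields, the 50 m length threshold, and the omission of coordinate transforms between frames are illustrative assumptions, and the frames are assumed to follow the MapFrame sketch given earlier.

```python
import numpy as np

def should_create_new_frame(obs, frame_length_y, max_frame_length_y=50.0):
    """True when any frame-creation trigger holds: lane line type/color change,
    broken lane line, ego lane change, or frame length over the threshold.
    `obs` is assumed to expose these observations as booleans."""
    return (obs.lane_type_or_color_changed
            or obs.lane_line_broken
            or obs.ego_changed_lane
            or frame_length_y > max_frame_length_y)

def fill_blind_area(new_points, y_blind_end, map_frame_db):
    """If the new lane line observation has a blind area [0, y_b] along y, look for
    an existing frame whose point set S_F covers that range and prepend those
    points (coordinate transforms between frames are omitted for brevity)."""
    for frame in map_frame_db:
        mask = frame.S_F[:, 1] <= y_blind_end
        if mask.any():
            return np.vstack([frame.S_F[mask], new_points])
    return new_points
```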
In a preferred embodiment, the conditions for successful matching in step 3 are as follows:
the newly acquired lane line data overlap one or more of the map frames established from previously acquired lane line data, and the overlap reaches the threshold value; and/or
the newly acquired lane line data connect front-to-back with any one of the established map frames.
In the above solution, the newly acquired lane line data are first matched against the established map frames to find a map frame that overlaps or connects with them, so as to determine the position of the newly acquired lane line data and update that map frame.
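A sketch of such a matching test, assuming both point sets are already expressed in a common coordinate system and measuring overlap along the y axis, could be written as below; the overlap measure, the 0.3 threshold and the 1 m gap tolerance are assumptions for illustration.

```python
import numpy as np

def overlap_ratio(new_pts, frame_pts):
    """Fraction of the new lane line points whose y coordinate falls inside the
    y range covered by the frame's point set (both arrays of shape (N, 3))."""
    y0, y1 = frame_pts[:, 1].min(), frame_pts[:, 1].max()
    inside = (new_pts[:, 1] >= y0) & (new_pts[:, 1] <= y1)
    return inside.mean()

def matches_existing_frame(new_pts, frame_pts, overlap_threshold=0.3, gap_tolerance=1.0):
    """Matching succeeds if the overlap reaches a threshold, and/or the new data
    connect front-to-back with the frame (start roughly where the frame ends)."""
    overlaps = overlap_ratio(new_pts, frame_pts) >= overlap_threshold
    connects = abs(new_pts[:, 1].min() - frame_pts[:, 1].max()) <= gap_tolerance
    return overlaps or connects
```

If no established frame satisfies this test, a new map frame is created from the new data, as described above.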
In a preferred embodiment, the information updating of the map frame in step 3 further includes:
and step C, sampling the cubic curve representing the lane line at certain intervals to obtain a sampling point set.
And D, calculating the position and the posture of the sampling point set relative to the non-updated map frame, and projecting the position of the sampling point set into a frame coordinate system of the non-updated map frame to obtain the sampling point set in the frame coordinate system.
And E, if the position of the sampling point set in the frame coordinate system exceeds the length limit of the frame coordinate system, cutting out an excess part to serve as new lane line data, and entering the step C to update the map frame.
And F, merging the sampling point set of the non-updated map frame with the sampling point set in the frame coordinate system in the step D to obtain a sampling point set which is re-sampled at a certain interval after merging, and updating the sampling point set of the non-updated map frame into the re-sampled sampling point set.
And G, carrying out cubic curve fitting on the sampling point set re-sampled in the step F to obtain a fitted sampling point set, and updating the result of cubic curve fitting of the sampling point set of the non-updated map frame into the fitted sampling point set.
And H, finishing the incidence relation between the front frame and the rear frame of the connected frame by calculating the connection relation between the new map frame and the established map frame so as to finish the information updating of the established map frame.
In the above solution, as shown in fig. 4 and 5, the basic flow of the map frame information update is as follows:
sample the cubic curve representing lane line A at a fixed interval to obtain the sampling point set S_A;
calculate the position and attitude of A relative to the current map frame F and project the sampling point set S_A onto F, the result being recorded as S_A^F;
if S_A^F exceeds the length limit of the current map frame F, cut the excess part out and use it as new lane line data to update the map frames again;
merge the sampling point set S_F of the current map frame F with S_A^F, re-sample at a fixed interval to obtain S_F′, and update S_F to S_F′;
fit a cubic curve to S_F′ to obtain C_F′, and update C_F to C_F′;
calculate the connection relation between the new map frame and the existing map frames, and update the front-to-back association L_F of the connected frames accordingly.
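Assuming the map frame follows the MapFrame sketch given earlier and the lane line curve is a polynomial x = f(y), steps C to G of this flow could be sketched as follows; the sampling range, the 0.5 m interval and the pose handling are illustrative choices, not values taken from the patent.

```python
import numpy as np

def update_frame_with_lane_line(frame, curve_coeffs, curve_pose, step=0.5):
    """Sketch of steps C-G: sample the new lane line curve, project it into the
    frame coordinate system, cut off any part beyond the frame length, merge with
    the frame's point set, re-sample and re-fit the cubic curve.
    `curve_coeffs` are cubic coefficients of x = f(y); `curve_pose` = (R, t) maps
    the new curve's coordinates into the frame coordinate system (an assumption)."""
    # Step C: sample the cubic curve at a fixed interval (30 m range is illustrative)
    ys = np.arange(0.0, 30.0, step)
    xs = np.polyval(curve_coeffs, ys)
    S_A = np.column_stack([xs, ys, np.zeros_like(ys)])

    # Step D: project the samples into the frame coordinate system
    R, t = curve_pose
    S_A_F = S_A @ R.T + t

    # Step E: cut off samples beyond the frame's length limit along y
    frame_length = frame.S_F[:, 1].max()
    excess = S_A_F[S_A_F[:, 1] > frame_length]     # would seed a further update
    S_A_F = S_A_F[S_A_F[:, 1] <= frame_length]

    # Step F: merge with the frame's point set and re-sample at a fixed interval
    merged = np.vstack([frame.S_F, S_A_F])
    merged = merged[np.argsort(merged[:, 1])]
    resampled_y = np.arange(merged[:, 1].min(), merged[:, 1].max(), step)
    resampled_x = np.interp(resampled_y, merged[:, 1], merged[:, 0])
    frame.S_F = np.column_stack([resampled_x, resampled_y, np.zeros_like(resampled_y)])

    # Step G: re-fit the cubic curve x = f(y) to the re-sampled point set
    frame.C_F = np.polyfit(frame.S_F[:, 1], frame.S_F[:, 0], 3)
    return excess
```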
In a preferred embodiment, the inter-frame smoothing in step 4 is performed on the premise that no new lane line data is input.
In the above scheme, the update of the established map frame information is considered complete only when no new lane line data are acquired; otherwise, as soon as new lane line data are acquired they enter matching with the established map frames, either creating a new map frame or updating an established one. This guarantees that inter-frame smoothing is performed only after the map frames have been updated and no new lane line data are being input.
In a preferred embodiment, the inter-frame smoothing process in step 4 further includes:
and establishing a cubic curve, wherein the point set for fitting the cubic curve is obtained by mixing a second half point set in the point sets of the previous frame and a first half point set in the point sets of the subsequent frame in the previous frame and the subsequent frame which have a correlation.
And projecting the second half point set onto the cubic curve along a direction perpendicular to a y axis of the frame coordinate system to obtain a projected point set.
Smoothing the projection points corresponding to the points under the former frame coordinate system and the latter half point set on the y axis of the frame coordinate system to obtain smoothed points, thereby obtaining smoothed frames; the smoothing treatment of the point set under the frame coordinate system is similar; the formula of the point-to-point smoothing process is as follows:
P″=((1-a)x+ax′,y,(1-a)z+az′) (3)
wherein, the smoothing coefficient a of the previous frame point set is y/L; l represents the length of the frame.
And the smoothing coefficient a of the subsequent frame point set is 1-y/L.
And P ═ x, y, z denotes any point in the latter half set of points in the former frame coordinate system.
P ' ═ x ', y, z ' denotes any point in the first half set of points in the frame-after coordinate system.
In the above scheme, let the two frames with a front-to-back relationship be F_1 and F_2. The second-half point set of the preceding frame's point set S_{F1} and the first-half point set of the following frame's point set S_{F2} are mixed, and a cubic curve C_m is fitted to the mixture. The second-half point set is projected onto the cubic curve C_m along the direction perpendicular to the y axis, giving a set of projection points. For each point P = (x, y, z) of the second-half point set in the coordinate system of frame F_1 and the projection point P′ = (x′, y, z′) corresponding to it, the smoothed point is computed as P″ = ((1 - a)x + ax′, y, (1 - a)z + az′), where the smoothing coefficient is a = y/L and L is the length of frame F_1.
The point set smoothing of frame F_2 is similar, except that the smoothing coefficient becomes a = 1 - y/L. Specifically, the first-half point set of F_2 is projected onto the cubic curve C_m along the direction perpendicular to the y axis to obtain its projection points; for each point P = (x, y, z) of the first-half point set in the coordinate system of frame F_2 and the projection point P′ = (x′, y, z′) corresponding to it, the smoothed point P″ = ((1 - a)x + ax′, y, (1 - a)z + az′) is computed, where a = 1 - y/L and L is the length of frame F_2.
For a frame whose point set has been smoothed, the lane line cubic curve C_F of the frame is recalculated from the updated point set.
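The inter-frame smoothing of formula (3) can be sketched as below, assuming both frames' point sets are expressed so that mixing them is meaningful and that the mixing curve C_m is fitted as polynomials x = f(y) and z = f(y); these representational choices are illustrative rather than prescribed by the patent.

```python
import numpy as np

def smooth_frame_pair(S_f1, S_f2, L1, L2):
    """Sketch of inter-frame smoothing for an associated frame pair F1, F2.
    S_f1, S_f2 : (N, 3) point sets of the preceding / following frame;
    L1, L2     : frame lengths along the y axis."""
    # mix the second half of F1's points with the first half of F2's points
    back_half = S_f1[S_f1[:, 1] >= L1 / 2.0]
    front_half = S_f2[S_f2[:, 1] <= L2 / 2.0]
    mixed = np.vstack([back_half, front_half])

    # fit the mixing cubic curve C_m as x = f(y) and z = f(y)
    cm_x = np.polyfit(mixed[:, 1], mixed[:, 0], 3)
    cm_z = np.polyfit(mixed[:, 1], mixed[:, 2], 3)

    def smooth(points, coeff_fn):
        # project each point onto C_m perpendicular to y: keep y, take the curve's x and z
        x_proj = np.polyval(cm_x, points[:, 1])
        z_proj = np.polyval(cm_z, points[:, 1])
        a = coeff_fn(points[:, 1])
        # formula (3): P'' = ((1 - a)x + a x', y, (1 - a)z + a z')
        x_new = (1 - a) * points[:, 0] + a * x_proj
        z_new = (1 - a) * points[:, 2] + a * z_proj
        return np.column_stack([x_new, points[:, 1], z_new])

    S_f1_smoothed = S_f1.copy()
    S_f1_smoothed[S_f1[:, 1] >= L1 / 2.0] = smooth(back_half, lambda y: y / L1)       # a = y/L
    S_f2_smoothed = S_f2.copy()
    S_f2_smoothed[S_f2[:, 1] <= L2 / 2.0] = smooth(front_half, lambda y: 1 - y / L2)  # a = 1 - y/L
    return S_f1_smoothed, S_f2_smoothed
```

After smoothing, the frame's cubic curve C_F would be re-fitted from the updated point set, as stated above.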
While embodiments of the invention have been described above, the invention is not limited to the applications set forth in the description and the embodiments; it is fully applicable in various fields to which the invention pertains, and further modifications may readily be made by those skilled in the art. The invention is therefore not limited to the details shown and described herein, without departing from the general concept defined by the appended claims and their equivalents.

Claims (6)

1. A high-precision map automatic generation method based on high-precision positioning and lane line identification comprises the following steps:
step 1, synchronizing the high-precision positioning data acquired each time with lane line data through time alignment processing to obtain the position and the posture of the synchronized high-precision positioning data;
step 2, establishing a map frame through the synchronized lane line data and the high-precision positioning data obtained in the step 1, and storing the map frame in a map frame database;
step 3, matching the newly acquired lane line data with all the map frames established according to the acquired lane line data, and if the matching fails, establishing a new map frame by using the newly acquired lane line data; if the matching is successful, the information of the map frame established in the step 2 is updated until the updating is finished;
step 4, performing interframe smoothing treatment on all the existing frames in the map frames established in the step 2 and the step 3, and recalculating by combining the point set of the map frame updated in the step 3 to obtain a cubic curve representing lane line information corresponding to the frames subjected to interframe smoothing treatment;
step 5, splicing the cubic curves to generate a high-precision map;
wherein, the condition of successful matching in the step 3 is as follows:
the newly acquired lane line data and one or more of all the map frames established according to the acquired lane line data have an overlapping part, and the overlapping part reaches a threshold value; and/or
The newly acquired lane line data is connected with any one of the established map frames in a front-back manner;
the information updating of the map frame in step 3 further comprises:
step C, sampling the cubic curve representing the lane line at a fixed interval to obtain a sampling point set;
step D, calculating the position and attitude of the sampling point set relative to the not-yet-updated map frame, and projecting the sampling point set into the frame coordinate system of that map frame to obtain a sampling point set in the frame coordinate system;
step E, if the sampling point set in the frame coordinate system exceeds the length limit of the frame coordinate system, cutting off the excess part as new lane line data and returning to step C to update the map frame;
step F, merging the sampling point set of the not-yet-updated map frame with the sampling point set in the frame coordinate system obtained in step D, re-sampling the merged set at a fixed interval, and updating the sampling point set of the map frame to the re-sampled set;
step G, fitting a cubic curve to the sampling point set re-sampled in step F, and updating the cubic-curve fitting result of the not-yet-updated map frame to the fitted curve;
step H, completing the front-to-back association of the connected frames by calculating the connection relation between the new map frame and the established map frames, thereby completing the information update of the established map frame.
2. The high-precision map automatic generation method based on high-precision positioning and lane line identification as claimed in claim 1, wherein the step 1 further comprises:
taking the timestamp at which the lane line data are acquired as the time alignment point, the position p′ and attitude r′ of the high-precision positioning data aligned to the lane line data timestamp are respectively:
p′ = p + v(t_m - t_l); (1)
r′ = r·ω(t_m - t_l); (2)
wherein t_m denotes the timestamp at which the lane line data are acquired;
t_l denotes the timestamp at which the high-precision positioning data are acquired;
p, r, v, and ω represent the position, attitude, linear velocity, and angular velocity, respectively, of the high-precision positioning data before alignment.
3. The high-precision map automatic generation method based on high-precision positioning and lane line identification as claimed in claim 1, wherein the step 2 further comprises:
the map frame mainly comprises the following elements:
P_F: the position of the frame spatial information; R_F: the attitude of the frame spatial information;
C_F: the cubic curve of the lane line information; S_F: the lane line sampling point set;
L_F: the inter-frame topology information associating the preceding and following frames;
the elements are all expressed in a frame coordinate system, wherein the frame coordinate system takes the lateral direction as the x axis, the longitudinal direction as the y axis, and the direction perpendicular to the x and y axes as the z axis.
4. The method for automatically generating a high-precision map based on high-precision positioning and lane line identification as claimed in claim 1, wherein the step 2 further comprises:
the map frame is established on the premise that the type and/or the color of the lane line are changed and/or the lane line is disconnected and/or an unmanned vehicle changes lanes in driving and/or the y-axis direction length of the map frame exceeds a threshold value.
5. The high-precision map automatic generation method based on high-precision positioning and lane line identification as claimed in claim 1, wherein the inter-frame smoothing process in step 4 is performed on the premise that no new lane line data are input.
6. The method for automatically generating a high-precision map based on high-precision positioning and lane line identification as claimed in claim 1, wherein the inter-frame smoothing process in step 4 further comprises:
step a, establishing a cubic curve, the point set used to fit the curve being obtained by mixing the second-half point set of the preceding frame with the first-half point set of the following frame, for a preceding frame and following frame that are associated with each other;
step b, projecting the second-half point set onto the cubic curve along the direction perpendicular to the y axis of the frame coordinate system to obtain a projected point set;
step c, smoothing, on the y axis of the frame coordinate system, the points of the second-half point set in the preceding frame's coordinate system with their corresponding projected points to obtain smoothed points, thereby obtaining a smoothed frame; the point set in the following frame's coordinate system is smoothed similarly; the point-wise smoothing formula is:
P″ = ((1 - a)x + ax′, y, (1 - a)z + az′) (3)
wherein the smoothing coefficient of the preceding frame's point set is a = y/L, and L denotes the length of the frame;
the smoothing coefficient of the following frame's point set is a = 1 - y/L;
P = (x, y, z) denotes any point of the second-half point set in the preceding frame's coordinate system;
P′ = (x′, y, z′) denotes any point of the first-half point set in the following frame's coordinate system.
CN201811468535.6A 2018-12-03 2018-12-03 High-precision map automatic generation method based on high-precision positioning and lane line identification Active CN109470255B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811468535.6A CN109470255B (en) 2018-12-03 2018-12-03 High-precision map automatic generation method based on high-precision positioning and lane line identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811468535.6A CN109470255B (en) 2018-12-03 2018-12-03 High-precision map automatic generation method based on high-precision positioning and lane line identification

Publications (2)

Publication Number Publication Date
CN109470255A CN109470255A (en) 2019-03-15
CN109470255B true CN109470255B (en) 2022-03-29

Family

ID=65675001

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811468535.6A Active CN109470255B (en) 2018-12-03 2018-12-03 High-precision map automatic generation method based on high-precision positioning and lane line identification

Country Status (1)

Country Link
CN (1) CN109470255B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110160540B (en) * 2019-06-12 2020-12-18 禾多科技(北京)有限公司 Lane line data fusion method based on high-precision map
CN110954128B (en) * 2019-12-03 2021-11-16 阿波罗智能技术(北京)有限公司 Method, device, electronic equipment and storage medium for detecting lane line position change
CN111559373B (en) * 2020-04-26 2021-08-13 东风汽车集团有限公司 Vehicle active steering control method
CN111626206A (en) * 2020-05-27 2020-09-04 北京百度网讯科技有限公司 High-precision map construction method and device, electronic equipment and computer storage medium
CN114238354A (en) * 2021-12-21 2022-03-25 高德软件有限公司 Map data updating method, device and computer storage medium
CN114577225B (en) * 2022-04-28 2022-07-22 北京百度网讯科技有限公司 Map drawing method and device, electronic equipment and storage medium
CN114719872B (en) * 2022-05-13 2022-09-23 高德软件有限公司 Lane line processing method and device and electronic equipment
CN114719873B (en) * 2022-06-02 2022-09-02 四川省公路规划勘察设计研究院有限公司 Low-cost fine map automatic generation method and device and readable medium
CN116793369B (en) * 2023-02-10 2024-03-08 北京斯年智驾科技有限公司 Path planning method, device, equipment and computer readable storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111542860B (en) * 2016-12-30 2024-08-27 辉达公司 Sign and lane creation for high definition maps of autonomous vehicles
CN108955670B (en) * 2017-05-25 2021-02-09 百度在线网络技术(北京)有限公司 Information acquisition method and device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103954275A (en) * 2014-04-01 2014-07-30 西安交通大学 Lane line detection and GIS map information development-based vision navigation method
CN104535070A (en) * 2014-12-26 2015-04-22 上海交通大学 High-precision map data structure, high-precision map data acquiringand processing system and high-precision map data acquiringand processingmethod
CN104573733A (en) * 2014-12-26 2015-04-29 上海交通大学 High-precision map generation system and method based on high-definition ortho-photo map
JP2017533482A (en) * 2015-09-10 2017-11-09 バイドゥ オンライン ネットワーク テクノロジー (ベイジン) カンパニー リミテッド Lane data processing method, apparatus, storage medium and equipment
CN106441319A (en) * 2016-09-23 2017-02-22 中国科学院合肥物质科学研究院 System and method for generating lane-level navigation map of unmanned vehicle
CN106525057A (en) * 2016-10-26 2017-03-22 陈曦 Generation system for high-precision road map
CN108036794A (en) * 2017-11-24 2018-05-15 华域汽车系统股份有限公司 A kind of high accuracy map generation system and generation method
CN107976182A (en) * 2017-11-30 2018-05-01 深圳市隐湖科技有限公司 A kind of Multi-sensor Fusion builds drawing system and its method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
High-precision lane-level road map building for vehicle navigation; Anning Chen et al.; IEEE/ION Position, Location and Navigation Symposium; 2010-05-06; pp. 1035-1042 *
Lane-level high-precision map production method based on multiple sensors (基于多传感器的车道级高精细地图制作方法); He Yong et al.; Journal of Chang'an University (Natural Science Edition); 2015-01-15; full text *
Research on key technologies of high-precision navigation maps based on laser point cloud scanning (基于激光点云扫描的高精导航地图关键技术研究); Yang Yurong et al.; Modern Computer (Professional Edition); 2018-03-25 (No. 09); full text *

Also Published As

Publication number Publication date
CN109470255A (en) 2019-03-15

Similar Documents

Publication Publication Date Title
CN109470255B (en) High-precision map automatic generation method based on high-precision positioning and lane line identification
CN110389348B (en) Positioning and navigation method and device based on laser radar and binocular camera
CN109166149A (en) A kind of positioning and three-dimensional wire-frame method for reconstructing and system of fusion binocular camera and IMU
CN109165680B (en) Single-target object dictionary model improvement method in indoor scene based on visual SLAM
CN109816696A (en) A kind of robot localization and build drawing method, computer installation and computer readable storage medium
CN103824049A (en) Cascaded neural network-based face key point detection method
CN112734841B (en) Method for realizing positioning by using wheel type odometer-IMU and monocular camera
CN103954275A (en) Lane line detection and GIS map information development-based vision navigation method
CN109059930A (en) A kind of method for positioning mobile robot of view-based access control model odometer
WO2020131498A1 (en) Systems and methods for automatic labeling of images for supervised machine learning
CN108241623B (en) Automatic assignment method and device, electronic map intelligent production system and navigation equipment
WO2021036587A1 (en) Positioning method and system for electric power patrol scenario
CN108564657A (en) A kind of map constructing method, electronic equipment and readable storage medium storing program for executing based on high in the clouds
CN107564065B (en) The measuring method of man-machine minimum range under a kind of Collaborative environment
CN109218524A (en) The cell phone application and method that house measurement generates floor plan are carried out based on video recording of taking pictures
CN109465830B (en) Robot monocular stereoscopic vision calibration system and method
CN114115298B (en) Unmanned vehicle path smoothing method and system
CN104778465A (en) Target tracking method based on feature point matching
CN109917430A (en) A kind of satellite positioning track drift method for correcting error based on smooth trajectory algorithm
CN114279434B (en) Picture construction method and device, electronic equipment and storage medium
CN109285163A (en) Lane line based on laser point cloud or so contour line interactive mode extracting method
CN110405731A (en) A kind of quick double mechanical arms basis coordinates system scaling method
CN110717141A (en) Lane line optimization method and device and storage medium
CN107710229A (en) Shape recognition process, device, equipment and computer-readable storage medium in image
CN109358624B (en) Coupling positioning method for robot

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP03 Change of name, title or address
CP03 Change of name, title or address

Address after: 201, 202, 301, No. 56-4 Fenghuang South Road, Huadu District, Guangzhou City, Guangdong Province, 510806

Patentee after: Heduo Technology (Guangzhou) Co.,Ltd.

Address before: 100089 21-14, 1st floor, building 21, Enji West Industrial Park, No.1, liangjiadian, Fuwai, Haidian District, Beijing

Patentee before: HOLOMATIC TECHNOLOGY (BEIJING) Co.,Ltd.

PE01 Entry into force of the registration of the contract for pledge of patent right
PE01 Entry into force of the registration of the contract for pledge of patent right

Denomination of invention: A high-precision map automatic generation method based on high-precision positioning and lane recognition

Granted publication date: 20220329

Pledgee: Bank of Shanghai Co.,Ltd. Beijing Branch

Pledgor: Heduo Technology (Guangzhou) Co.,Ltd.

Registration number: Y2024980009891