CN114399587A - Three-dimensional lane line generation method and device, electronic device and computer readable medium - Google Patents

Three-dimensional lane line generation method and device, electronic device and computer readable medium Download PDF

Info

Publication number
CN114399587A
Authority
CN
China
Prior art keywords
point cloud
positioning information
cloud data
group
lane line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111559581.9A
Other languages
Chinese (zh)
Other versions
CN114399587B (en)
Inventor
胡禹超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Heduo Technology Guangzhou Co ltd
Original Assignee
HoloMatic Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by HoloMatic Technology Beijing Co Ltd filed Critical HoloMatic Technology Beijing Co Ltd
Priority to CN202111559581.9A priority Critical patent/CN114399587B/en
Publication of CN114399587A publication Critical patent/CN114399587A/en
Application granted granted Critical
Publication of CN114399587B publication Critical patent/CN114399587B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C5/00 - Registering or indicating the working of vehicles
    • G07C5/08 - Registering or indicating performance data other than driving, working, idle, or waiting time, with or without registering driving, working, idle or waiting time
    • G07C5/0808 - Diagnosing performance data

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Image Generation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the disclosure disclose a three-dimensional lane line generation method and apparatus, an electronic device, and a computer readable medium. One embodiment of the method comprises: acquiring a point cloud data set; selecting, from a preset positioning information sequence, the positioning information corresponding to the point cloud timestamp in each point cloud data of the point cloud data set to generate a positioning information group, thereby obtaining a set of positioning information groups; generating an interpolated positioning information group based on the set of positioning information groups; generating a correction rotation matrix group based on the interpolated positioning information group and the point cloud data set; optimizing the point cloud coordinate value of each point cloud data in the point cloud data set by using the correction rotation matrix group to obtain an optimized point cloud coordinate value group; and performing lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group. This embodiment can improve the accuracy of the three-dimensional lane line equations generated by the vehicle.

Description

Three-dimensional lane line generation method and device, electronic device and computer readable medium
Technical Field
The embodiment of the disclosure relates to the technical field of computers, in particular to a three-dimensional lane line generation method, a three-dimensional lane line generation device, electronic equipment and a computer readable medium.
Background
The generation of the three-dimensional lane lines is of great significance to the stable and safe driving of the automatic driving vehicle. At present, when generating a three-dimensional lane line, the following method is generally adopted: firstly, a point cloud data set is generated through a laser radar and a high-precision positioning device which are pre-installed on a vehicle, and then a three-dimensional lane line is extracted from the point cloud data set.
However, when the three-dimensional lane line generation is performed in the above manner, there are often technical problems as follows:
Jolting of the vehicle while it is moving affects the relative pose between the laser radar and the high-precision positioning device, and the generation of three-dimensional lane lines is sensitive to such pose changes. Although the displacement between the two devices is usually considered unchanged, their relative orientation changes considerably, which reduces the accuracy of the generated point cloud data, in turn reduces the accuracy of the generated three-dimensional lane lines, and ultimately reduces driving safety.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose a three-dimensional lane line generation method, apparatus, electronic device, and computer readable medium to solve the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a three-dimensional lane line generation method, including: acquiring a point cloud data set, wherein each point cloud data in the point cloud data set comprises a point cloud coordinate value and a point cloud timestamp; selecting, from a preset positioning information sequence, positioning information corresponding to the point cloud timestamp in each point cloud data of the point cloud data set to generate a positioning information group, obtaining a set of positioning information groups, wherein the positioning information in the positioning information sequence comprises a pose matrix; generating an interpolated positioning information group based on the set of positioning information groups, wherein the interpolated positioning information in the interpolated positioning information group corresponds to each point cloud data in the point cloud data set; generating a correction rotation matrix group based on the interpolated positioning information group and the point cloud data set; optimizing the point cloud coordinate value of each point cloud data in the point cloud data set by using the correction rotation matrix group to obtain an optimized point cloud coordinate value group; and performing lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group.
In a second aspect, some embodiments of the present disclosure provide a three-dimensional lane line generation apparatus, including: an acquisition unit configured to acquire a point cloud data set, wherein each point cloud data in the point cloud data set includes a point cloud coordinate value and a point cloud timestamp; a selection unit configured to select, from a preset positioning information sequence, positioning information corresponding to the point cloud timestamp in each point cloud data of the point cloud data set to generate a positioning information group, obtaining a set of positioning information groups, wherein the positioning information in the positioning information sequence includes a pose matrix; a first generation unit configured to generate an interpolated positioning information group based on the set of positioning information groups, wherein the interpolated positioning information in the interpolated positioning information group corresponds to each point cloud data in the point cloud data set; a second generation unit configured to generate a correction rotation matrix group based on the interpolated positioning information group and the point cloud data set; an optimization unit configured to optimize the point cloud coordinate value of each point cloud data in the point cloud data set by using the correction rotation matrix group to obtain an optimized point cloud coordinate value group; and an extraction unit configured to perform lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; a storage device having one or more programs stored thereon, which when executed by one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect.
In a fourth aspect, some embodiments of the present disclosure provide a computer readable medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first aspect.
The above embodiments of the present disclosure have the following advantages: the three-dimensional lane line generation method of some embodiments of the present disclosure can improve the accuracy of the generated three-dimensional lane lines. Specifically, the accuracy of the generated three-dimensional lane line equations is reduced because the pose between the laser radar and the high-precision positioning device is affected by vehicle jolting during movement, and three-dimensional lane line generation is sensitive to such pose changes; although the displacement between the two devices can be regarded as unchanged, their relative orientation changes considerably. Based on this, in the three-dimensional lane line generation method of some embodiments of the present disclosure, a point cloud data set is first acquired, where each point cloud data in the point cloud data set includes a point cloud coordinate value and a point cloud timestamp. Then, the positioning information corresponding to the point cloud timestamp in each point cloud data of the point cloud data set is selected from a preset positioning information sequence to generate a positioning information group, obtaining a set of positioning information groups, where the positioning information in the positioning information sequence includes a pose matrix. Next, an interpolated positioning information group is generated based on the set of positioning information groups, where the interpolated positioning information in the interpolated positioning information group corresponds to each point cloud data in the point cloud data set. Generating the interpolated positioning information group associates each point cloud data with positioning information at the same time instant, which facilitates generating the three-dimensional lane lines. Then, a correction rotation matrix group is generated based on the interpolated positioning information group and the point cloud data set. The correction rotation matrix group can be used to determine the pose error between the laser radar and the high-precision positioning device for each point cloud data, and therefore the pose change between the two caused by vehicle jolting can be determined. Next, the point cloud coordinate value of each point cloud data in the point cloud data set is optimized by using the correction rotation matrix group to obtain an optimized point cloud coordinate value group; optimizing the point cloud data with the correction rotation matrix group reduces the error of the point cloud data. Finally, lane line extraction processing is performed on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group. Compared with the point cloud data before optimization, the optimized point cloud data make the extracted three-dimensional lane line equations more accurate, and driving safety can therefore be improved.
Drawings
The above and other features, advantages and aspects of various embodiments of the present disclosure will become more apparent by referring to the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and elements are not necessarily drawn to scale.
FIG. 1 is a schematic diagram of one application scenario of a three-dimensional lane line generation method of some embodiments of the present disclosure;
FIG. 2 is a flow diagram of some embodiments of a three-dimensional lane line generation method according to the present disclosure;
FIG. 3 is a flow diagram of further embodiments of a three-dimensional lane line generation method according to the present disclosure;
FIG. 4 is a schematic structural diagram of some embodiments of a three-dimensional lane line generating device according to the present disclosure;
FIG. 5 is a schematic structural diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings. The embodiments and features of the embodiments in the present disclosure may be combined with each other without conflict.
It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram of an application scenario of the three-dimensional lane line generation method of some embodiments of the present disclosure.
In the application scenario of fig. 1, first, the computing device 101 may acquire a point cloud data set 102, where each point cloud data in the point cloud data set 102 includes a point cloud coordinate value and a point cloud timestamp. Next, the computing device 101 may select, from a preset positioning information sequence 103, the positioning information corresponding to the point cloud timestamp in each point cloud data of the point cloud data set 102 to generate a positioning information group, resulting in a set of positioning information groups 104. Then, the computing device 101 may generate an interpolated positioning information group 105 based on the set of positioning information groups 104, where the interpolated positioning information in the interpolated positioning information group corresponds to each point cloud data in the point cloud data set. Thereafter, the computing device 101 may generate a correction rotation matrix group 106 based on the interpolated positioning information group 105 and the point cloud data set 102. Then, the computing device 101 may optimize the point cloud coordinate values of each point cloud data in the point cloud data set by using the correction rotation matrix group 106 to obtain an optimized point cloud coordinate value group 107. Finally, the computing device 101 may perform lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group 107 to obtain a three-dimensional lane line equation group 108.
The computing device 101 may be hardware or software. When it is hardware, it may be implemented as a distributed cluster composed of multiple servers or terminal devices, or as a single server or a single terminal device. When it is software, it may be installed in the hardware devices enumerated above and implemented, for example, as multiple pieces of software or software modules to provide distributed services, or as a single piece of software or software module. No specific limitation is made here.
It should be understood that the number of computing devices in FIG. 1 is merely illustrative. There may be any number of computing devices, as implementation needs dictate.
With continued reference to fig. 2, a flow 200 of some embodiments of a three-dimensional lane line generation method according to the present disclosure is shown. The flow 200 of the three-dimensional lane line generation method includes the following steps:
step 201, a point cloud data set is obtained.
In some embodiments, the execution body of the three-dimensional lane line generation method (e.g., the computing device 101 shown in fig. 1) may acquire the point cloud data set in a wired or wireless manner. Each point cloud data in the point cloud data set may include a point cloud coordinate value and a point cloud timestamp, and may be point cloud data scanned by a laser radar on the current vehicle. Since each point cloud data includes a point cloud timestamp, the point cloud data in the point cloud data set can also be arranged in timestamp order. Each point cloud data may additionally include a laser reflection intensity value.
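As a concrete illustration of the data this step assumes, the sketch below defines a minimal point cloud record with the coordinate value, timestamp, and optional laser reflection intensity mentioned above, and sorts a data set by timestamp. The field names and types are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PointCloudDatum:
    # 3-D coordinate value of one scanned point in the laser radar frame, in meters
    x: float
    y: float
    z: float
    # acquisition timestamp of this point, in seconds
    timestamp: float
    # optional laser reflection intensity value
    intensity: Optional[float] = None

def sort_by_timestamp(points: List[PointCloudDatum]) -> List[PointCloudDatum]:
    """Arrange the point cloud data set in timestamp order, as the method assumes."""
    return sorted(points, key=lambda p: p.timestamp)
```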
Step 202, selecting positioning information corresponding to the point cloud time stamp in each point cloud data in the point cloud data set from a preset positioning information sequence to generate a positioning information set, and obtaining a positioning information set.
In some embodiments, the execution body may select, from a preset positioning information sequence, the positioning information corresponding to the point cloud timestamp in each point cloud data of the point cloud data set to generate a positioning information group, obtaining a set of positioning information groups. The positioning information in the positioning information sequence may include a pose matrix. The pose matrix may be the pose matrix of the current vehicle at a certain time point and is used to represent the position and attitude of the current vehicle. The preset positioning information sequence may be the positioning information of the current vehicle measured by the high-precision positioning device on the current vehicle. The high-precision positioning device may include, but is not limited to, a global positioning system, an inertial measurement unit, or the like. Each piece of positioning information in the positioning information sequence corresponds to a different timestamp. For each point cloud data in the point cloud data set, the two pieces of positioning information whose timestamps are adjacent to the timestamp in that point cloud data can be selected from the positioning information sequence as a positioning information group: the time point corresponding to the timestamp of one of the two pieces of positioning information is before the timestamp in the point cloud data, and the time point corresponding to the timestamp of the other is after it. A set of positioning information groups is thus obtained.
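A minimal sketch of the selection step described above, assuming the positioning information sequence is already sorted by timestamp; the function names and the use of Python's bisect module are illustrative choices, not part of the disclosed method.

```python
from bisect import bisect_left
from typing import List, Tuple

def select_bracketing_pair(pose_times: List[float], cloud_time: float) -> Tuple[int, int]:
    """Return indices (i, i + 1) of the two positioning entries whose timestamps
    bracket cloud_time, i.e. pose_times[i] <= cloud_time <= pose_times[i + 1].
    Assumes pose_times is sorted ascending and cloud_time lies within its range."""
    j = bisect_left(pose_times, cloud_time)
    i = max(j - 1, 0)
    return i, min(i + 1, len(pose_times) - 1)

def build_positioning_groups(pose_times: List[float], cloud_times: List[float]):
    """One positioning information group (a before/after index pair) per point cloud datum."""
    return [select_bracketing_pair(pose_times, t) for t in cloud_times]
```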
And step 203, generating an interpolation positioning information group based on the positioning information group set.
In some embodiments, the execution body may generate the interpolated positioning information group based on the set of positioning information groups. The interpolated positioning information in the interpolated positioning information group may correspond to each point cloud data in the point cloud data set. For each positioning information group, an interpolation algorithm can be used to interpolate the pose matrices included in the positioning information of that group, obtaining a pose matrix corresponding to the target timestamp as the interpolated positioning information. The target timestamp may be the timestamp in the point cloud data corresponding to that positioning information group. The interpolation algorithm may include, but is not limited to, at least one of a linear interpolation algorithm, a bilinear interpolation algorithm, and the like. An interpolated positioning information group can thus be obtained.
Specifically, the timestamps of the point cloud data differ from the timestamps of the positioning information, but the positioning information corresponding to each point cloud data, i.e., its pose matrix, needs to be determined. The pose matrices in each positioning information group are therefore interpolated, so that a pose matrix corresponding to the timestamp of the point cloud data can be determined and the interpolated positioning information group is obtained. Errors caused by mismatched time points are thus avoided, which in turn helps improve the accuracy of three-dimensional lane line generation.
In some optional implementations of some embodiments, the executing body generating the interpolated positioning information group based on the positioning information group set may include:
firstly, interpolating the pose matrix included in each positioning information group in the positioning information group set to generate an interpolation pose matrix, and obtaining an interpolation pose matrix group. The pose matrix included in each positioning information in the positioning information group set may be composed of a rotation matrix and a displacement vector. For each positioning information group, the interpolation process may be: firstly, the rotation matrix in the pose matrix included by the two positioning information in the positioning information group can be interpolated through a spherical linear interpolation algorithm to obtain an interpolated rotation matrix. Then, the translation vectors included in the two positioning information in the positioning information group may be interpolated through the above-mentioned linear interpolation or bilinear interpolation, etc., to obtain interpolated translation vectors. Finally, the interpolated rotation matrix and the interpolated translation vector may be combined into an interpolated pose matrix. Therefore, an interpolation pose matrix group can be obtained.
And secondly, determining the interpolation pose matrix group as an interpolation positioning information group.
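The following sketch illustrates the two interpolation steps just described, assuming each positioning entry is given as a 3 × 3 rotation matrix, a translation vector, and a timestamp; the SLERP implementation comes from SciPy and is an illustrative choice rather than something the patent prescribes.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_pose(t0, R0, p0, t1, R1, p1, t_query):
    """Interpolate the pose bracketing t_query.

    R0, R1  : 3x3 rotation matrices of the two positioning entries
    p0, p1  : 3-vector translations of the two positioning entries
    t0, t1  : their timestamps, with t0 <= t_query <= t1
    Returns (R_interp, p_interp) at the point cloud timestamp t_query.
    """
    # Spherical linear interpolation of the rotation part.
    slerp = Slerp([t0, t1], Rotation.from_matrix([R0, R1]))
    R_interp = slerp([t_query]).as_matrix()[0]
    # Linear interpolation of the translation part.
    w = (t_query - t0) / (t1 - t0)
    p_interp = (1.0 - w) * np.asarray(p0) + w * np.asarray(p1)
    return R_interp, p_interp
```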
And step 204, generating a correction rotation matrix group based on the interpolation positioning information group and the point cloud data group.
In some embodiments, the executing entity may generate the correction rotation matrix set by any means based on the interpolation positioning information set and the point cloud data set.
In some optional implementations of some embodiments, the executing body generating a set of corrective rotation matrices based on the set of interpolated positioning information and the set of point cloud data may include:
in a first step, an angular velocity value variation of a current vehicle is determined. Wherein the angular velocity variation of the current vehicle may be acquired by the inertial measurement unit.
And secondly, generating a correction parameter based on the angular velocity variation. The angular velocity variation may be input to a preset conversion equation to obtain a correction parameter. The conversion equation may be a linear equation or a curved equation. The value range of the dependent variable of the conversion equation can be [0, 1 ]. In addition, the amount of angular velocity change is inversely proportional to the above-described correction parameter. The angular velocity variation amount may be used to characterize the degree of pitching of the present vehicle.
And thirdly, based on the point cloud data set, the correction parameters and the interpolation positioning information set, correcting each initial rotation matrix in a preset initial rotation matrix set to obtain a corrected rotation matrix set. Each initial rotation matrix in the initial rotation matrix set may correspond to each point cloud data in the point cloud data set. First, a quadric equation is constructed that can be used to characterize the ground in the vicinity of the current vehicle. Each correction rotation matrix in the set of correction rotation matrices may then be generated by the following equation:
(The correction formula is reproduced only as an image in the original publication; it is the optimization objective used to generate each correction rotation matrix, with its variables defined below.)
where i denotes a serial number; ΔR′ denotes a correction rotation matrix in the correction rotation matrix group, and ΔR′_i denotes the i-th correction rotation matrix in the group; n denotes the number of point cloud data in the point cloud data set, which equals the number of correction rotation matrices in the correction rotation matrix group; ΔR denotes an initial rotation matrix in the initial rotation matrix group, and ΔR_i denotes the i-th initial rotation matrix; R denotes an interpolated rotation matrix in an interpolation pose matrix in the interpolated positioning information group, and R_i denotes the interpolated rotation matrix in the interpolation pose matrix of the i-th interpolated positioning information; P denotes a 3 × 1 matrix composed of the x-, y-, and z-coordinate values of a point cloud coordinate value in the point cloud data set, and P_i denotes the 3 × 1 matrix formed from the point cloud coordinate value in the i-th point cloud data; Y denotes an interpolated translation vector in an interpolation pose matrix in the interpolated positioning information group, and Y_i denotes the interpolated translation vector in the interpolation pose matrix of the i-th interpolated positioning information; L denotes an interpolation pose matrix in the interpolated positioning information group, and L_i denotes the interpolation pose matrix of the i-th interpolated positioning information.
L_i^T denotes the transpose of the interpolation pose matrix of the i-th interpolated positioning information in the interpolated positioning information group; Q denotes the quadratic-term coefficient of the above quadric surface equation and may be a 3 × 3 symmetric matrix initialized with random data; d denotes the constant term of the quadric surface equation; λ denotes the above correction parameter; arccos() denotes the inverse cosine function; tr() denotes the trace of a matrix; ΔR_{i+1} denotes the interpolated rotation matrix in the interpolation pose matrix of the positioning information whose timestamp is next adjacent to that of the i-th interpolated positioning information; ‖·‖₂ denotes the 2-norm; X denotes a 3 × 1 matrix composed of the x-, y-, and z-coordinate values of a coordinate in the quadric surface equation, and X^T denotes its transpose; S denotes the first-order-term coefficient of the quadric surface equation and may be a vector of one row and three columns, and S^T denotes its transpose; A denotes a substitution matrix (introduced to allow the formula to be written across lines); B denotes a regularization matrix.
In practice, the regularization matrix limits the variation between the rotation matrices corresponding to adjacent timestamps on smooth road segments; on bumpy road segments, the correction parameter is smaller, so the limitation on the variation between the correction rotation matrix and the interpolated rotation matrix is reduced. The above formula can be solved with a nonlinear optimization method. The attitude change caused by vehicle jolting mainly affects the pitch angle, so the rotation matrix in the interpolation pose matrix of each interpolated positioning information in the interpolated positioning information group can be reduced to a single degree of freedom:
(The single-degree-of-freedom rotation is reproduced only as an image in the original publication.)
where Δθ denotes the pitch angle corresponding to the interpolated positioning information, i.e., the pitch angle of the current vehicle at the time point corresponding to the timestamp of that interpolated positioning information.
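Since the conversion equation and the single-degree-of-freedom rotation appear only as images in the original publication, the sketch below rests on two explicit assumptions: the correction parameter is obtained from the angular velocity variation by a clamped linear map into [0, 1] that decreases as the variation grows, and the remaining degree of freedom is a rotation about the vehicle's pitch (y) axis.

```python
import numpy as np

def correction_parameter(delta_omega: float, omega_max: float = 1.0) -> float:
    """Map the angular velocity variation to a correction parameter in [0, 1].

    Assumed conversion equation: lambda = 1 - |delta_omega| / omega_max, clamped to
    [0, 1], so a larger variation (a bumpier segment) gives a smaller correction
    parameter, as stated above. omega_max is a tuning constant, not a value from the patent.
    """
    return float(np.clip(1.0 - abs(delta_omega) / omega_max, 0.0, 1.0))

def pitch_rotation(delta_theta: float) -> np.ndarray:
    """One-degree-of-freedom rotation parameterized by the pitch angle.

    Assumes the pitch axis is the y-axis of the vehicle frame; the axis
    convention in the patent's formula image may differ.
    """
    c, s = np.cos(delta_theta), np.sin(delta_theta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])
```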
And step 205, optimizing the point cloud coordinate value of each point cloud data in the point cloud data set by using the correction rotation matrix set to obtain an optimized point cloud coordinate value set.
In some embodiments, the executing entity may optimize the point cloud coordinate values of each point cloud data in the point cloud data set by using the correction rotation matrix set, and may obtain the optimized point cloud coordinate value set in any manner.
In some optional implementations of some embodiments, the optimizing, by the executing body, the point cloud coordinate values of each point cloud data in the point cloud data set by using the correction rotation matrix set may include:
firstly, a target rotation matrix group is generated based on the correction rotation matrix group and the interpolation positioning information group. For each correction rotation matrix and corresponding interpolation positioning information, a product of the correction rotation matrix and an interpolation rotation matrix in the interpolation positioning information can be determined as a target rotation matrix. Thereby, a target rotation matrix set can be generated.
And secondly, determining an optimized point cloud coordinate value of each point cloud data in the point cloud data set based on the target rotation matrix set to obtain an optimized point cloud coordinate value set. For the point cloud data, an optimized point cloud coordinate value can be generated through the following formula:
(The optimization formula is reproduced only as an image in the original publication; its variables are defined below.)
where the left-hand-side symbol (shown as an image in the original publication) denotes a 3 × 1 matrix composed of the x-, y-, and z-coordinate values of the optimized point cloud coordinate value; ΔR* denotes the target rotation matrix corresponding to the point cloud data in the target rotation matrix group; R* denotes the interpolated rotation matrix in the interpolation pose matrix of the interpolated positioning information corresponding to the point cloud data; P* denotes a 3 × 1 matrix composed of the x-, y-, and z-coordinate values of the point cloud coordinate value in the point cloud data; and Y′ denotes the interpolated translation vector in the interpolation pose matrix of the interpolated positioning information corresponding to the point cloud data.
The above three formulas and their related content are regarded as an inventive point of the embodiments of the present disclosure, and they address the technical problem mentioned in the background: vehicle jolting during movement affects the pose between the laser radar and the high-precision positioning device; three-dimensional lane line generation is sensitive to such pose changes, and although the displacement between the two devices is usually considered unchanged, their relative orientation changes considerably, which reduces the accuracy of the generated point cloud data, then the accuracy of the generated three-dimensional lane lines, and ultimately driving safety. If these factors are addressed, the accuracy of the generated three-dimensional lane lines can be improved. To this end, the first formula is introduced to detect the error between the laser radar and the high-precision positioning device caused by vehicle jolting, yielding the correction rotation matrix corresponding to each point cloud data. In addition, since the attitude change caused by vehicle jolting mainly affects the pitch angle, the rotation matrix in the interpolation pose matrix of each interpolated positioning information can be reduced to a single degree of freedom through the second formula. This reduces the number of optimization variables, improves the efficiency of generating the correction rotation matrices, and avoids inaccurate or failed solving caused by excessive data. Finally, the correction rotation matrix is substituted into the third formula to optimize the point cloud coordinate values, so that the optimized point cloud coordinate value corresponding to each point cloud coordinate value is obtained and every point cloud data in the point cloud data set is optimized.
And step 206, carrying out lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation set.
In some embodiments, the execution body may perform lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group through a lane line extraction algorithm to obtain a three-dimensional lane line equation group. The lane line extraction algorithm may include, but is not limited to, UFLD (Ultra Fast Structure-aware Lane Detection) or LaneNet (a multi-branch lane line detection network), among others.
In some optional implementations of some embodiments, the execution body performing lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group may include the following steps:
and step one, selecting the optimized coordinate values meeting preset screening conditions from the optimized coordinate value groups as target coordinate value groups to obtain the target coordinate value groups. Wherein, the point cloud coordinate value corresponds to the laser reflection intensity value. Thus, each optimized coordinate value may also correspond to a laser reflection intensity value. The predetermined screening condition may be to select an optimized coordinate value having a laser reflection intensity value within a predetermined intensity range (e.g., [8, 50]) as the target coordinate value. Thus, a target coordinate value group can be obtained.
In the second step, lane line identification processing is performed on the target coordinate value group to obtain a three-dimensional lane line equation group. The lane line identification processing may be performed on the target coordinate value group through a random sample consensus (RANSAC) algorithm to obtain the three-dimensional lane line equation group.
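A minimal sketch of the two steps above, combining the intensity-range screening with a hand-rolled random sample consensus fit; the cubic polynomial form of the lane line equation, the iteration count, and the inlier tolerance are illustrative assumptions, since the patent only names the screening range and a random sample consensus procedure.

```python
import numpy as np

def extract_lane_line(points: np.ndarray, intensity: np.ndarray,
                      intensity_range=(8.0, 50.0),
                      n_iter: int = 200, tol: float = 0.1):
    """Screen optimized points by reflection intensity, then RANSAC-fit one
    three-dimensional lane line as y = f(x), z = g(x) with cubic polynomials.

    points    : (N, 3) optimized point cloud coordinate values
    intensity : (N,) laser reflection intensity values
    Returns (coeff_y, coeff_z) of the best-fitting polynomials.
    """
    lo, hi = intensity_range
    cand = points[(intensity >= lo) & (intensity <= hi)]        # target coordinate values
    best, best_inliers = None, -1
    rng = np.random.default_rng(0)
    for _ in range(n_iter):
        sample = cand[rng.choice(len(cand), size=4, replace=False)]
        cy = np.polyfit(sample[:, 0], sample[:, 1], 3)          # y = f(x)
        cz = np.polyfit(sample[:, 0], sample[:, 2], 3)          # z = g(x)
        err = np.hypot(cand[:, 1] - np.polyval(cy, cand[:, 0]),
                       cand[:, 2] - np.polyval(cz, cand[:, 0]))
        inliers = int((err < tol).sum())
        if inliers > best_inliers:
            best, best_inliers = (cy, cz), inliers
    return best
```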
Optionally, the execution main body may further send the three-dimensional lane line equation set to a display terminal for displaying a three-dimensional lane line.
The above embodiments of the present disclosure have the following advantages: the three-dimensional lane line generation method of some embodiments of the present disclosure can improve the accuracy of the generated three-dimensional lane lines. Specifically, the accuracy of the generated three-dimensional lane line equations is reduced because the pose between the laser radar and the high-precision positioning device is affected by vehicle jolting during movement, and three-dimensional lane line generation is sensitive to such pose changes; although the displacement between the two devices can be regarded as unchanged, their relative orientation changes considerably. Based on this, in the three-dimensional lane line generation method of some embodiments of the present disclosure, a point cloud data set is first acquired, where each point cloud data in the point cloud data set includes a point cloud coordinate value and a point cloud timestamp. Then, the positioning information corresponding to the point cloud timestamp in each point cloud data of the point cloud data set is selected from a preset positioning information sequence to generate a positioning information group, obtaining a set of positioning information groups, where the positioning information in the positioning information sequence includes a pose matrix. Next, an interpolated positioning information group is generated based on the set of positioning information groups, where the interpolated positioning information in the interpolated positioning information group corresponds to each point cloud data in the point cloud data set. Generating the interpolated positioning information group associates each point cloud data with positioning information at the same time instant, which facilitates generating the three-dimensional lane lines. Then, a correction rotation matrix group is generated based on the interpolated positioning information group and the point cloud data set. The correction rotation matrix group can be used to determine the pose error between the laser radar and the high-precision positioning device for each point cloud data, and therefore the pose change between the two caused by vehicle jolting can be determined. Next, the point cloud coordinate value of each point cloud data in the point cloud data set is optimized by using the correction rotation matrix group to obtain an optimized point cloud coordinate value group; optimizing the point cloud data with the correction rotation matrix group reduces the error of the point cloud data. Finally, lane line extraction processing is performed on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group. Compared with the point cloud data before optimization, the optimized point cloud data make the extracted three-dimensional lane line equations more accurate, and driving safety can therefore be improved.
With further reference to fig. 3, a flow 300 of further embodiments of a three-dimensional lane line generation method is illustrated. The flow 300 of the three-dimensional lane line generation method includes the following steps:
step 301, a point cloud data set is obtained.
In some embodiments, the specific implementation manner and technical effects of step 301 may refer to step 201 in those embodiments corresponding to fig. 2, and are not described herein again.
Step 302, selecting point cloud data with point cloud coordinate values within a preset coordinate range from the point cloud data set to obtain a target point cloud data set.
In some embodiments, an execution body of the three-dimensional lane line generation method (e.g., the computing device 101 shown in fig. 1) may select, from the point cloud data set, the point cloud data whose point cloud coordinate values fall within a preset coordinate range to obtain a target point cloud data set. Selecting only the point cloud data within the preset coordinate range reduces interference from other point clouds and facilitates extraction of the three-dimensional lane line.
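A one-function sketch of this selection, assuming the point cloud coordinates are stacked in an (N, 3) array; the numeric ranges are placeholders, since the patent leaves the preset coordinate range unspecified.

```python
import numpy as np

def select_target_points(points: np.ndarray,
                         x_range=(0.0, 80.0), y_range=(-10.0, 10.0)) -> np.ndarray:
    """Keep only the point cloud data whose coordinate values fall inside the
    preset coordinate range (the range values here are placeholders)."""
    mask = ((points[:, 0] >= x_range[0]) & (points[:, 0] <= x_range[1]) &
            (points[:, 1] >= y_range[0]) & (points[:, 1] <= y_range[1]))
    return points[mask]
```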
Step 303, selecting positioning information corresponding to the point cloud time stamp in each target point cloud data in the target point cloud data set from a preset positioning information sequence to generate a positioning information set, so as to obtain a positioning information set.
And 304, generating an interpolation positioning information group based on the positioning information group set.
And 305, generating a correction rotation matrix group based on the interpolation positioning information group and the point cloud data group.
And step 306, optimizing the point cloud coordinate value of each point cloud data in the point cloud data set by using the correction rotation matrix set to obtain an optimized point cloud coordinate value set.
And 307, performing lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group.
In some embodiments, the specific implementation manner and technical effects of steps 303-307 can refer to steps 202-206 in those embodiments corresponding to fig. 2, which are not described herein again.
As can be seen from fig. 3, compared with the description of some embodiments corresponding to fig. 2, the flow 300 of the three-dimensional lane line generation method in some embodiments corresponding to fig. 3 adds the step of obtaining the target point cloud data set. The interference of other point clouds can therefore be further reduced, which facilitates extraction of the three-dimensional lane line and improves the accuracy of the generated three-dimensional lane line equations.
With further reference to fig. 4, as an implementation of the methods shown in the above figures, the present disclosure provides some embodiments of a three-dimensional lane line generation apparatus, which correspond to those shown in fig. 2, and which may be applied in various electronic devices in particular.
As shown in fig. 4, the three-dimensional lane line generation apparatus 400 of some embodiments includes: an acquisition unit 401, a selection unit 402, a first generation unit 403, a second generation unit 404, an optimization unit 405, and an extraction unit 406. The acquisition unit 401 is configured to acquire a point cloud data set, where each point cloud data in the point cloud data set includes a point cloud coordinate value and a point cloud timestamp; the selection unit 402 is configured to select, from a preset positioning information sequence, positioning information corresponding to the point cloud timestamp in each point cloud data of the point cloud data set to generate a positioning information group, obtaining a set of positioning information groups, where the positioning information in the positioning information sequence includes a pose matrix; the first generation unit 403 is configured to generate an interpolated positioning information group based on the set of positioning information groups, where the interpolated positioning information in the interpolated positioning information group corresponds to each point cloud data in the point cloud data set; the second generation unit 404 is configured to generate a correction rotation matrix group based on the interpolated positioning information group and the point cloud data set; the optimization unit 405 is configured to optimize the point cloud coordinate value of each point cloud data in the point cloud data set by using the correction rotation matrix group to obtain an optimized point cloud coordinate value group; and the extraction unit 406 is configured to perform lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group.
It will be understood that the elements described in the apparatus 400 correspond to various steps in the method described with reference to fig. 2. Thus, the operations, features and resulting advantages described above with respect to the method are also applicable to the apparatus 400 and the units included therein, and will not be described herein again.
Referring now to FIG. 5, a block diagram of an electronic device (e.g., computing device 101 of FIG. 1)500 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 5, the electronic device 500 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501 that may perform various appropriate actions and processes in accordance with a program stored in a read-only memory (ROM) 502 or a program loaded from a storage means 508 into a random access memory (RAM) 503. The RAM 503 also stores various programs and data necessary for the operation of the electronic device 500. The processing means 501, the ROM 502, and the RAM 503 are connected to each other through a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
Generally, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 507 including, for example, a Liquid Crystal Display (LCD), speakers, vibrators, and the like; storage devices 508 including, for example, magnetic tape, hard disk, etc.; and a communication device 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 illustrates an electronic device 500 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 5 may represent one device or may represent multiple devices as desired.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In some such embodiments, the computer program may be downloaded and installed from a network via the communication means 509, or installed from the storage means 508, or installed from the ROM 502. The computer program, when executed by the processing device 501, performs the above-described functions defined in the methods of some embodiments of the present disclosure.
It should be noted that the computer readable medium described above in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected by digital data communication in any form or medium (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed network.
The computer readable medium may be embodied in the electronic device, or it may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire a point cloud data set, wherein each point cloud data in the point cloud data set comprises a point cloud coordinate value and a point cloud timestamp; select, from a preset positioning information sequence, positioning information corresponding to the point cloud timestamp in each point cloud data of the point cloud data set to generate a positioning information group, obtaining a set of positioning information groups, wherein the positioning information in the positioning information sequence comprises a pose matrix; generate an interpolated positioning information group based on the set of positioning information groups, wherein the interpolated positioning information in the interpolated positioning information group corresponds to each point cloud data in the point cloud data set; generate a correction rotation matrix group based on the interpolated positioning information group and the point cloud data set; optimize the point cloud coordinate value of each point cloud data in the point cloud data set by using the correction rotation matrix group to obtain an optimized point cloud coordinate value group; and perform lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, or C++, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software, and may also be implemented by hardware. The described units may also be provided in a processor, and may be described as: a processor includes an acquisition unit, a selection unit, a first generation unit, a second generation unit, an optimization unit, and an extraction unit. The names of the units do not in some cases form a limitation on the units themselves, and for example, the acquisition unit may also be described as a "unit for acquiring a point cloud data set".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
The foregoing description is merely a description of preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above technical features, and also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the embodiments of the present disclosure.

Claims (10)

1. A three-dimensional lane line generation method includes:
acquiring a point cloud data group, wherein each point cloud data in the point cloud data group comprises a point cloud coordinate value and a point cloud timestamp;
selecting positioning information corresponding to a point cloud timestamp in each point cloud data in the point cloud data group from a preset positioning information sequence to generate a positioning information group, and obtaining a positioning information group set, wherein the positioning information in the positioning information sequence comprises a pose matrix;
generating an interpolation positioning information group based on the positioning information group set, wherein the interpolation positioning information in the interpolation positioning information group corresponds to each point cloud data in the point cloud data group;
generating a correction rotation matrix group based on the interpolation positioning information group and the point cloud data group;
optimizing the point cloud coordinate value of each point cloud data in the point cloud data group by using the correction rotation matrix group to obtain an optimized point cloud coordinate value group;
and carrying out lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group.
2. The method of claim 1, wherein the method further comprises:
sending the three-dimensional lane line equation group to a display terminal for three-dimensional lane line display.
3. The method of claim 1, wherein the generating an interpolation positioning information group based on the positioning information group set comprises:
interpolating a pose matrix included in each positioning information group in the positioning information group set to generate an interpolation pose matrix, and obtaining an interpolation pose matrix group;
and determining the interpolation pose matrix group as an interpolation positioning information group.
4. The method of claim 1, wherein the selecting positioning information corresponding to the point cloud timestamp in each point cloud data in the point cloud data group from a preset positioning information sequence to generate a positioning information group, and obtaining a positioning information group set comprises:
selecting point cloud data with point cloud coordinate values within a preset coordinate range from the point cloud data group to obtain a target point cloud data group;
and selecting positioning information corresponding to the point cloud timestamp in each target point cloud data in the target point cloud data group from the preset positioning information sequence to generate a positioning information group, so as to obtain the positioning information group set.
5. The method of claim 1, wherein the generating a correction rotation matrix group based on the interpolation positioning information group and the point cloud data group comprises:
determining an angular velocity variation of the current vehicle;
generating a correction parameter based on the angular velocity variation;
and correcting each initial rotation matrix in a preset initial rotation matrix group based on the point cloud data group, the correction parameter and the interpolation positioning information group to obtain the correction rotation matrix group, wherein each initial rotation matrix in the initial rotation matrix group corresponds to each point cloud data in the point cloud data group.
6. The method of claim 1, wherein the optimizing the point cloud coordinate value of each point cloud data in the point cloud data group by using the correction rotation matrix group to obtain an optimized point cloud coordinate value group comprises:
generating a target rotation matrix group based on the correction rotation matrix group and the interpolation positioning information group;
and determining the optimized point cloud coordinate value of each point cloud data in the point cloud data group based on the target rotation matrix group to obtain the optimized point cloud coordinate value group.
7. The method of claim 1, wherein the performing lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group comprises:
selecting an optimized point cloud coordinate value meeting a preset screening condition from the optimized point cloud coordinate value group as a target coordinate value to obtain a target coordinate value group;
and carrying out lane line identification processing on the target coordinate value group to obtain the three-dimensional lane line equation group.
8. A three-dimensional lane line generation apparatus comprising:
an acquisition unit configured to acquire a point cloud data group, wherein each point cloud data in the point cloud data group comprises a point cloud coordinate value and a point cloud timestamp;
a selection unit configured to select positioning information corresponding to the point cloud timestamp in each point cloud data in the point cloud data group from a preset positioning information sequence to generate a positioning information group, so as to obtain a positioning information group set, wherein the positioning information in the positioning information sequence comprises a pose matrix;
a first generation unit configured to generate an interpolation positioning information group based on the positioning information group set, wherein the interpolation positioning information in the interpolation positioning information group corresponds to each point cloud data in the point cloud data group;
a second generation unit configured to generate a correction rotation matrix group based on the interpolation positioning information group and the point cloud data group;
an optimization unit configured to optimize the point cloud coordinate value of each point cloud data in the point cloud data group by using the correction rotation matrix group to obtain an optimized point cloud coordinate value group;
and an extraction unit configured to perform lane line extraction processing on each optimized point cloud coordinate value in the optimized point cloud coordinate value group to obtain a three-dimensional lane line equation group.
9. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method of any one of claims 1-7.
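Purely as a hedged illustration of the kind of operations recited in claims 3 and 7 (interpolating a pose matrix at a point cloud timestamp, then screening optimized points and fitting a three-dimensional lane line equation), one possible sketch is shown below. The SLERP-based rotation interpolation, the lateral screening window, and the cubic polynomial lane model are assumptions for illustration, not details taken from this patent.

```python
# Hedged sketch relating to claims 3 and 7. The interpolation scheme, screening
# rule, and polynomial lane model are assumptions, not the patented method.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp


def interpolate_pose_matrix(t, t0, pose0, t1, pose1):
    """Interpolate a 4x4 pose matrix between two timestamped poses (cf. claim 3)."""
    if t1 == t0:
        return pose0.copy()
    alpha = (t - t0) / (t1 - t0)
    slerp = Slerp([t0, t1], Rotation.from_matrix([pose0[:3, :3], pose1[:3, :3]]))
    out = np.eye(4)
    out[:3, :3] = slerp(t).as_matrix()
    out[:3, 3] = (1 - alpha) * pose0[:3, 3] + alpha * pose1[:3, 3]
    return out


def extract_lane_line(optimized_points, y_min=-2.0, y_max=2.0, degree=3):
    """Screen points near the ego lane and fit y(x), z(x) polynomials (cf. claim 7)."""
    mask = (optimized_points[:, 1] > y_min) & (optimized_points[:, 1] < y_max)
    pts = optimized_points[mask]
    return np.polyfit(pts[:, 0], pts[:, 1], degree), np.polyfit(pts[:, 0], pts[:, 2], degree)
```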
CN202111559581.9A 2021-12-20 2021-12-20 Three-dimensional lane line generation method and device, electronic device and computer readable medium Active CN114399587B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111559581.9A CN114399587B (en) 2021-12-20 2021-12-20 Three-dimensional lane line generation method and device, electronic device and computer readable medium

Publications (2)

Publication Number Publication Date
CN114399587A true CN114399587A (en) 2022-04-26
CN114399587B CN114399587B (en) 2022-11-11

Family

ID=81227580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111559581.9A Active CN114399587B (en) 2021-12-20 2021-12-20 Three-dimensional lane line generation method and device, electronic device and computer readable medium

Country Status (1)

Country Link
CN (1) CN114399587B (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111936821A (en) * 2019-07-12 2020-11-13 北京航迹科技有限公司 System and method for positioning
CN110705543A (en) * 2019-08-23 2020-01-17 芜湖酷哇机器人产业技术研究院有限公司 Method and system for recognizing lane lines based on laser point cloud
US20210166421A1 (en) * 2019-12-03 2021-06-03 Beijing Baidu Netcom Science And Technology Co., Ltd. Method, electronic device and storage medium for detecting a position change of lane line
CN111882612A (en) * 2020-07-21 2020-11-03 武汉理工大学 Vehicle multi-scale positioning method based on three-dimensional laser detection lane line
CN112598762A (en) * 2020-09-16 2021-04-02 禾多科技(北京)有限公司 Three-dimensional lane line information generation method, device, electronic device, and medium
CN112698302A (en) * 2020-12-16 2021-04-23 南京航空航天大学 Sensor fusion target detection method under bumpy road condition
CN113269837A (en) * 2021-04-27 2021-08-17 西安交通大学 Positioning navigation method suitable for complex three-dimensional environment

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114842448A (en) * 2022-05-11 2022-08-02 禾多科技(北京)有限公司 Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN114842448B (en) * 2022-05-11 2023-03-24 禾多科技(北京)有限公司 Three-dimensional lane line generation method and device, electronic device and computer readable medium

Also Published As

Publication number Publication date
CN114399587B (en) 2022-11-11

Similar Documents

Publication Publication Date Title
CN112598762B (en) Three-dimensional lane line information generation method, device, electronic device, and medium
CN112561990B (en) Positioning information generation method, device, equipment and computer readable medium
CN113607185B (en) Lane line information display method, lane line information display device, electronic device, and computer-readable medium
CN113674357B (en) Camera external reference calibration method and device, electronic equipment and computer readable medium
CN112643665B (en) Calibration method and device for installation error of absolute pose sensor
CN113327318B (en) Image display method, image display device, electronic equipment and computer readable medium
CN114399588A (en) Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN114399587B (en) Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN114399589A (en) Three-dimensional lane line generation method and device, electronic device and computer readable medium
CN111898815A (en) Typhoon track prediction method and device, electronic equipment and computer readable medium
CN113781478B (en) Oil tank image detection method, oil tank image detection device, electronic equipment and computer readable medium
CN114993328A (en) Vehicle positioning evaluation method, device, equipment and computer readable medium
CN114894205A (en) Three-dimensional lane line information generation method, device, equipment and computer readable medium
CN117191080A (en) Calibration method, device, equipment and storage medium for camera and IMU external parameters
CN112590929A (en) Correction method, apparatus, electronic device, and medium for steering wheel of autonomous vehicle
CN113379852B (en) Method, device, electronic equipment and medium for verifying camera calibration result
CN115817163B (en) Method, apparatus, electronic device and computer readable medium for adjusting wheel speed of vehicle
CN113808134A (en) Oil tank layout information generation method, oil tank layout information generation device, electronic apparatus, and medium
CN113407045A (en) Cursor control method and device, electronic equipment and storage medium
CN110207699B (en) Positioning method and device
CN110033088B (en) Method and device for estimating GPS data
CN116630436B (en) Camera external parameter correction method, camera external parameter correction device, electronic equipment and computer readable medium
CN115937046B (en) Road ground information generation method, device, equipment and computer readable medium
CN115993137B (en) Vehicle positioning evaluation method, device, electronic equipment and computer readable medium
CN112649017A (en) Method and device for calibrating system error of odometer

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 201, 202, 301, No. 56-4 Fenghuang South Road, Huadu District, Guangzhou City, Guangdong Province, 510806

Patentee after: Heduo Technology (Guangzhou) Co.,Ltd.

Address before: 100099 101-15, 3rd floor, building 9, yard 55, zique Road, Haidian District, Beijing

Patentee before: HOLOMATIC TECHNOLOGY (BEIJING) Co.,Ltd.