CN113835101A - Vehicle positioning method and device based on radar point cloud and storage medium - Google Patents

Vehicle positioning method and device based on radar point cloud and storage medium

Info

Publication number
CN113835101A
CN113835101A (application CN202111157408.6A)
Authority
CN
China
Prior art keywords: point cloud, error, feature, characteristic, radar
Prior art date
Legal status (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis): Pending
Application number
CN202111157408.6A
Other languages
Chinese (zh)
Inventor
刘祥勇
陈广
熊璐
卓桂荣
卢凡
Current Assignee (the listed assignees may be inaccurate; Google has not performed a legal analysis): Tongji University
Original Assignee: Tongji University
Application filed by Tongji University filed Critical Tongji University
Priority to CN202111157408.6A priority Critical patent/CN113835101A/en
Publication of CN113835101A publication Critical patent/CN113835101A/en

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a vehicle positioning method, device and storage medium based on radar point cloud. The method comprises the following steps: S1, performing point cloud feature extraction on the point cloud data acquired by the radar to obtain feature point clouds; S2, establishing a point cloud feature error model and using it to perform error evaluation on the feature point clouds; S3, determining the error weight of each feature point cloud from the error evaluation result, and registering the feature point clouds against the map features using those weights to locate the vehicle. Compared with the prior art, extracting feature point clouds allows the method to focus on stable structural features in the environment and improves the reliability of environment perception; evaluating the reliability of the point cloud features through the error model, and using that evaluation to set the error weights for the subsequent registration, improves the registration precision and therefore the vehicle positioning precision.

Description

Vehicle positioning method and device based on radar point cloud and storage medium
Technical Field
The invention relates to the technical field of automatic driving, in particular to a vehicle positioning method and device based on radar point cloud and a storage medium.
Background
An autonomous vehicle runs in an unsupervised state: the autonomous-driving "brain" replaces the driver, simulating human observation (the sensor system), thinking (the driving brain) and operation (planning and control), and then controlling the wheels to complete the driving task. The main functions involved in automatic driving include ego-vehicle positioning, moving-target detection, route detection, path planning and vehicle tracking control.
Accurate positioning is of great significance for automatic driving. Accurate vehicle position information is the prerequisite for vehicle path planning and tracking control, and a key technology for guaranteeing automatic driving. GNSS/GPS positioning systems cannot reach centimeter-level accuracy, and visual positioning is easily affected by factors such as light, weather, night conditions and camera parameter calibration. Lidar, in contrast, offers strong anti-interference capability, high detection precision and long detection range, so laser positioning is an important branch of autonomous driving.
During automatic driving, laser positioning is the computation that turns the environmental features acquired by the lidar into the relative position (distance and angle) of the radar. If n environmental features are detected at a given driving location and each feature provides a direction and a distance constraint, the number of constraints observed over the whole environment reaches 2n, which leads to redundant reference information and positioning deviations. In addition, the measurement accuracy of the same observed feature varies with the measurement direction and distance, so the same feature can yield different positioning results. At present, existing automatic driving systems lack a stability evaluation of the point cloud environment around the vehicle and apply a uniform registration method to all laser points, so the precision and robustness of laser positioning are poor, which poses a certain potential safety hazard to automatic driving.
Disclosure of Invention
The present invention is directed to a method, an apparatus and a storage medium for vehicle location based on radar point cloud to overcome the above-mentioned drawbacks of the prior art.
The purpose of the invention can be realized by the following technical scheme:
a method for radar point cloud based vehicle localization, the method comprising:
S1, performing point cloud feature extraction on the point cloud data acquired by the radar to obtain feature point clouds;
S2, establishing a point cloud feature error model, and performing error evaluation on the feature point clouds using this model;
S3, determining the error weight of each feature point cloud based on the error evaluation result, and registering the feature point clouds with the map features based on the error weights to locate the vehicle.
Preferably, the feature point cloud acquired in step S1 includes a surface feature point cloud and a line feature point cloud.
Preferably, the specific way of performing the feature extraction in step S1 includes:
S11, compute the roughness c_i of each point r_i:

c_i = (1 / (|s| · ‖r_i‖)) · ‖ Σ_{j∈s, j≠i} (r_j − r_i) ‖

wherein s denotes the set of points adjacent to and including r_i, r_j denotes the j-th point in the set s, |s| denotes the total number of points in the set s, ‖r_i‖ denotes the norm of the laser beam vector corresponding to point r_i, and (r_j − r_i) denotes the offset between points r_j and r_i;

S12, sort the points by roughness c_i from large to small;

S13, if c_i > c_th, point r_i belongs to the line feature point cloud; if c_i < c_th, point r_i belongs to the surface feature point cloud.
Preferably, the point cloud characteristic error model includes a plane error model and a stereo error model, the plane error model is used for calculating a plane distribution error of the characteristic point cloud, and the stereo error model is used for calculating a spatial distribution error of the characteristic point cloud.
Preferably, the plane error model is:

S(l, α) = (π/8) · d · D, with d = 2l·tan(β/2) + D0

wherein S is the point cloud plane error entropy representing the plane distribution error (half the elliptical spot area); l is the measurement distance corresponding to the point cloud; β is the measurement angle corresponding to the point cloud; α is the included angle between the laser beam and the projection surface; D is the major-axis diameter of the spot under incidence angle α; D0 is the diameter of the radar beam emitting hole.
Preferably, the stereo error model is:

V(l) = (4/3) · π · ρ³ · θ · φ · h · l³

wherein V is the point cloud probability volume representing the spatial distribution error, l is the measurement distance corresponding to the point cloud, θ and φ are the horizontal and vertical angular resolutions of the lidar, h is the air-medium coefficient, and ρ is the radius of the error ellipsoid, an empirical constant.
Preferably, the error weight of the feature point cloud of step S3 is obtained through the following formulas (rendered only as images in the source): the weight w_i^S corresponding to the plane error model of the i-th feature point cloud and the weight w_i^V corresponding to its stereo error model are combined into the error weight w_i, wherein S_i(l, α) is the plane distribution error of the i-th feature point cloud, V_i(l) is its spatial distribution error, n is the total number of feature point clouds, l is the measurement distance corresponding to the point cloud, and α is the included angle between the laser beam and the projection surface.
Preferably, in step S3, the registration of the feature point clouds with the map features based on the error weights, and hence the vehicle positioning, is performed as follows:

calculate the minimum least-squares matching error E(R, H):

E(R, H) = Σ_{i=1}^{n} w_i · ‖G_i − (R·P_i + H)‖²

wherein w_i is the error weight of the i-th feature point cloud, G_i is the map feature matched to the i-th feature point cloud, P_i is the i-th acquired feature point cloud, R is the rotation matrix of the feature matching, and H is the translation matrix of the feature matching;

if E(R, H) is smaller than a threshold value, the feature matching is successful and the matrices R, H are obtained; the vehicle position is then calculated using R and H.
A vehicle positioning device based on radar point cloud comprises a memory and a processor, the memory storing a computer program and the processor implementing the above vehicle positioning method based on radar point cloud when executing the computer program.
A storage medium on which a computer program is stored which, when being executed by a processor, implements the radar point cloud based vehicle localization method.
Compared with the prior art, the invention has the following advantages:
(1) the method evaluates the lidar point cloud features through a point cloud feature error model, thereby improving the laser positioning precision;
(2) because the raw point cloud is scattered, extracting feature point clouds lets the method focus on stable structural features in the environment, which improves the reliability of environment perception and reduces the computation load;
(3) the projected point cloud exhibits spot enlargement, with the spot diameter easily affected by the projection distance and angle, so a plane error model is established; at the same time the point cloud is sparse, so the extracted features carry spatial distribution errors and a stereo error model is established. The reliability of the point cloud features is evaluated through these two models and used to set the error weights for the subsequent registration, which improves the registration accuracy and therefore the vehicle positioning accuracy.
Drawings
Fig. 1 is an architecture diagram of a vehicle positioning system based on radar point cloud provided in embodiment 1 of the present invention.
FIG. 2 is a schematic diagram of an elliptical model of spot error of a laser radar projection point.
FIG. 3 is a schematic diagram of the error entropy of the point cloud error ellipse model.
Fig. 4 is a schematic view of a solid error model of the position distribution of the laser projection points.
FIG. 5 is a schematic diagram of the distribution ellipsoid and probability volume of a laser spot.
Fig. 6 is a schematic view of the observed positioning reference.
Fig. 7 is a schematic diagram of weight calculation for environmental references.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. Note that the following description of the embodiments is merely illustrative; the present invention is not limited to these applications or uses, nor to the embodiments below.
Example 1
This embodiment provides a vehicle positioning method based on radar point cloud. Because of the point cloud projection spot problem, oblique projection of the beam enlarges the spot diameter; meanwhile, sparsity and multi-scale characteristics exist among the multiple beams of the lidar, so the features extracted from the point cloud carry spatial distribution errors, causing point cloud registration and positioning errors.
Therefore, the method provided by this embodiment builds on a radar point cloud based vehicle positioning system. As shown in Fig. 1, the system comprises a point cloud feature extraction system, a feature evaluation system and a feature-optimized registration positioning system, in a progressive relationship. The point cloud feature extraction system covers point cloud roughness computation and line/surface feature extraction; the feature evaluation system evaluates the feature point clouds with a plane error model and a stereo error model; the feature-optimized registration positioning system covers the error-entropy distribution of the current features and the weight of each single feature within the overall registration.
(1) Point cloud feature extraction system
The point cloud feature extraction system operates on the point cloud collected by a 16-line lidar, which yields 28800 scan points per scanning period. To reduce the cost of feature extraction on this volume of data, the points are processed in groups of 10: for each point, the average positional offset to its neighboring points is computed and taken as the point's roughness. The points are then sorted by roughness; a point whose roughness exceeds a threshold is a line feature, and a point below the threshold is a surface feature. The specific feature extraction proceeds as follows:
First, compute the roughness c_i of point r_i:

c_i = (1 / (|s| · ‖r_i‖)) · ‖ Σ_{j∈s, j≠i} (r_j − r_i) ‖

wherein s denotes the set of points adjacent to and including r_i, r_j denotes the j-th point in the set s, |s| denotes the total number of points in the set s, ‖r_i‖ denotes the norm of the laser beam vector corresponding to point r_i, and (r_j − r_i) denotes the offset between points r_j and r_i.

Then, sort the points by roughness c_i from large to small.

Finally, if c_i > c_th, point r_i belongs to the line feature point cloud; if c_i < c_th, point r_i belongs to the surface feature point cloud.
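The roughness computation and line/surface split above can be sketched in Python. This is an illustrative sketch assuming the LOAM-style roughness formula described in the extraction steps; the window size k and threshold c_th are hypothetical values, not taken from the patent:

```python
import numpy as np

def classify_points(points, k=5, c_th=0.1):
    """Split a scan line into line/surface feature points by local roughness.

    Sketch of the extraction steps, assuming the roughness
    c_i = || sum_{j in s, j != i} (r_j - r_i) || / (|s| * ||r_i||)
    over a symmetric window s of 2k+1 points; k and c_th are illustrative.
    """
    n = len(points)
    rough = np.full(n, np.nan)                    # borders get no roughness value
    for i in range(k, n - k):
        nbrs = points[i - k : i + k + 1]          # window s around r_i (2k+1 points)
        diff = (nbrs - points[i]).sum(axis=0)     # sum of (r_j - r_i); the j = i term is zero
        rough[i] = np.linalg.norm(diff) / (len(nbrs) * np.linalg.norm(points[i]))
    line_idx = np.where(rough > c_th)[0]          # high roughness -> line (edge) features
    surf_idx = np.where(rough < c_th)[0]          # low roughness  -> surface (planar) features
    return rough, line_idx, surf_idx
```

On a perfectly straight, evenly sampled segment the symmetric offsets cancel and the roughness is near zero, so such points fall into the surface class, while corners and poles score high and become line features.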
(2) Point cloud characteristic evaluation system
When the laser beam is projected onto the surface of an object, a light spot is generated; the elliptical model of the laser spot is shown in Fig. 2. The spot diameter d of the point cloud projection is given by formula (1):

d = 2l·tan(β/2) + D0 (1)

wherein β is the measurement angle corresponding to the point cloud, D0 is the diameter of the radar beam emitting hole, and l is the measurement distance corresponding to the point cloud, i.e. the distance from the emission source to the projection surface.
In the projection of the beam, when there is an incidence angle the spot appears as an ellipse; the major-axis length of the spot is given by formula (2), wherein D denotes the major-axis diameter of the spot and α denotes the incidence angle, i.e. the angle of the laser beam with respect to the projection plane:

D = d / sin α (2)
The shape and area of the spot affect the accuracy of the position fed back by the laser point. The probability distribution function of the laser position over the spot is given by formula (3), wherein d and D denote the minor- and major-axis diameters of the spot respectively:

[formula (3) rendered only as an image in the source]
From the probability distribution function of the laser spot, the information entropy of the spot can be calculated by formula (4), and the error entropy of the laser spot by formula (5):

[formulas (4) and (5) rendered only as images in the source]

wherein S denotes the error entropy, l the distance, and α the incidence angle; the relationship between the three is shown in Fig. 3.
The spot area is calculated as in formula (6), and the error entropy is half of the spot area, formula (7), wherein S_α denotes the spot area:

S_α = (π/4) · d · D (6)

S = S_α / 2 = (π/8) · d · D (7)
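The spot-geometry relations above, formula (1) for the spot diameter and an error entropy equal to half the spot area, can be sketched as follows. Since formula (2) is rendered only as an image in the source, the oblique-incidence stretch D = d/sin(α) used here is an assumption consistent with the elliptical-spot description, and the numeric parameters are illustrative:

```python
import math

def spot_diameter(l, beta, D0):
    """Formula (1): spot diameter of a beam of divergence beta at range l."""
    return 2.0 * l * math.tan(beta / 2.0) + D0

def plane_error_entropy(l, alpha, beta, D0):
    """Plane error entropy S(l, alpha): half the elliptical spot area.

    Assumes the major axis stretches as D = d / sin(alpha) for a beam
    hitting the surface at angle alpha (angle between beam and surface);
    this reading of formula (2) is an assumption, not reproduced source text.
    """
    d = spot_diameter(l, beta, D0)     # minor axis: normal-incidence diameter
    D = d / math.sin(alpha)            # major axis under oblique incidence (assumed)
    area = math.pi * d * D / 4.0       # ellipse area from its two diameters, formula (6)
    return area / 2.0                  # error entropy = half the spot area, formula (7)
```

The sketch reproduces the qualitative behavior described in the text: the plane error grows with measurement distance and with obliqueness of the incidence angle.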
The lidar beams are sparse, so a distribution deviation exists between the extracted features and the actual features. The deviations are distributed along the width, height and depth directions. The spatial distribution of the laser points is ellipsoidal, as shown in Fig. 4; along each of the three directions the points follow a normal distribution.
In a single-axis normal distribution there is only one parameter: x ~ N(0, λ²), and the probability region is calculated by the plane integral of formula (8). In formula (9), λ1², λ2² and λ3² are the normal-distribution variances on the three spatial axes. Assuming the extracted features carry distribution errors, the error vector is p_e = [δx, δy, δz]: δx is related to the horizontal angular resolution θ and the scan distance l; δy to the vertical angular resolution φ and the scan distance; δz to the air-medium coefficient h and the scan distance. That is, δx = θl = λ1, δy = φl = λ2, δz = hl = λ3, and the three axes of the ellipsoid consist of δx, δy and δz respectively. In the three-axis ellipsoidal normal distribution there are three parameters, (x, y, z) ~ N(0, λ1²; 0, λ2²; 0, λ3²), and the probability volume is calculated by volume integration. Formula (9) gives the probability distribution and the probability volume of the laser points.
f(x) = (1/(√(2π)·λ)) · exp(−x²/(2λ²)),  S = ∫∫ f dx dy (8)

f(x, y, z) = (1/((2π)^(3/2)·λ1·λ2·λ3)) · exp(−(x²/(2λ1²) + y²/(2λ2²) + z²/(2λ3²))),  V = ∫∫∫ f(x, y, z) dx dy dz (9)

wherein f(x) and f(x, y, z) denote the probability densities of the single-parameter and three-parameter distributions, and S and V denote the probability area and the probability volume respectively.
Normalizing each axis by its standard deviation (substituting x/λ1, y/λ2 and z/λ3), formula (9) can be converted into formula (10):

[formula (10) rendered only as an image in the source]
According to normal-distribution theory, the probability integral over the entire ellipsoid is 1. Formula (10) can be further converted by spherical-coordinate integration into formula (11), which gives the probability volume; different distance values correspond to different probability volumes, and the distribution ellipsoid and probability volume are shown in Fig. 5.

V(l) = (4/3) · π · ρ³ · λ1 · λ2 · λ3 = (4/3) · π · ρ³ · θ · φ · h · l³ (11)

wherein θ denotes the horizontal angular resolution, φ the vertical angular resolution, and ρ the radius of the error ellipsoid, an empirical constant; θ and φ are configuration parameters of the lidar.
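The probability volume can be sketched numerically. Because the source renders formula (11) only as an image, the closed form used here, the volume of the error ellipsoid with semi-axes ρλ1, ρλ2, ρλ3, is one consistent reading of the surrounding derivation, and all parameter values are illustrative:

```python
import math

def probability_volume(l, theta, phi, h, rho=1.0):
    """Spatial-distribution error V(l) as the volume of the error ellipsoid.

    Assumes semi-axes proportional to the per-axis deviations
    lambda1 = theta*l, lambda2 = phi*l, lambda3 = h*l, scaled by the
    empirical radius rho, giving V = (4/3)*pi*rho^3*theta*phi*h*l^3.
    theta/phi are the horizontal/vertical angular resolutions and h an
    air-medium coefficient; this reading and the values are assumptions.
    """
    lam1, lam2, lam3 = theta * l, phi * l, h * l   # per-axis deviations at range l
    return (4.0 / 3.0) * math.pi * (rho ** 3) * lam1 * lam2 * lam3
```

The cubic growth in l matches the observation that farther returns carry larger spatial-distribution errors.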
(3) Point cloud feature registration optimization system
In the registration optimization of the point cloud features, several features may be observed at a given moment, each with a different point precision, so matching errors occur during feature registration and positioning, as shown in Fig. 6, where ξ denotes the ego vehicle. Evaluating the feedback information with the error models therefore yields a more accurate matching result.
In Fig. 7, layer 0 indicates the error between the detected position transform of an element and the real map element; the feature registration calculation is shown in formulas (12)–(14):

[formulas (12)–(14) rendered only as images in the source]

wherein W[l] denotes the weight matrix of layer l, m[l] denotes the output of layer l, R denotes the rotation matrix and H denotes the translation matrix.
The weight w_i^S corresponding to the plane error model is calculated from the distance and angle of the point feature, and the weight w_i^V corresponding to the stereo error model is calculated from the distance. The combined weight is calculated as in formula (15), wherein n is the number of extracted feature point clouds:

[formula (15) rendered only as an image in the source]
Substituting the weights of formula (15) into formula (16) gives the least-squares matching error E(R, H):

E(R, H) = Σ_{i=1}^{n} w_i · ‖G_i − (R·P_i + H)‖² (16)

wherein G_i denotes a map feature and P_i denotes the corresponding feature currently acquired by the lidar. If the least-squares error is smaller than the threshold, the feature matching has succeeded, the feature-matching transform (R, H) is obtained, and the position is calculated.
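The acceptance test above can be sketched as follows. The residual form E(R, H) = Σ w_i·‖G_i − (R·P_i + H)‖² is an assumption: one standard weighted point-to-point reading of formula (16), which the source renders only as an image:

```python
import numpy as np

def matching_error(G, P, w, R, H):
    """Weighted least-squares matching error E(R, H).

    Assumes the standard weighted point-to-point residual
    E = sum_i w_i * ||G_i - (R @ P_i + H)||^2; this reading of
    formula (16) is an assumption, not reproduced source text.
    """
    residuals = G - (P @ R.T + H)                 # G_i - (R P_i + H), row per feature
    return float(np.sum(w * np.sum(residuals ** 2, axis=1)))

def match_succeeds(G, P, w, R, H, threshold):
    """Accept the transform (R, H) when the weighted error is below the threshold."""
    return matching_error(G, P, w, R, H) < threshold
```

In practice (R, H) would come from a weighted ICP or SVD-based alignment step; this sketch only scores a candidate transform against the weighted features.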
Based on the above, the present embodiment provides a vehicle positioning method based on radar point cloud, including:
and S1, performing point cloud feature extraction on the point cloud data acquired by the radar to acquire feature point clouds, wherein the acquired feature point clouds comprise surface feature point clouds and line feature point clouds, and the specific extraction method is described in detail above and is not repeated herein.
S2, establishing a point cloud feature error model and using it to perform error evaluation on the feature point clouds. The point cloud feature error model comprises a plane error model and a stereo error model: the plane error model calculates the plane distribution error of a feature point cloud, and the stereo error model calculates its spatial distribution error. The plane error model is:

S(l, α) = (π/8) · d · D, with d = 2l·tan(β/2) + D0

wherein S is the point cloud plane error entropy representing the plane distribution error, l is the measurement distance corresponding to the point cloud, β is the measurement angle corresponding to the point cloud, α is the incidence angle (the included angle between the laser beam and the projection surface), D is the major-axis diameter of the spot under incidence angle α, and D0 is the diameter of the radar beam emitting hole.
The stereo error model is:

V(l) = (4/3) · π · ρ³ · θ · φ · h · l³

wherein V is the point cloud probability volume representing the spatial distribution error, l is the measurement distance corresponding to the point cloud, θ and φ are the horizontal and vertical angular resolutions of the lidar, h is the air-medium coefficient, and ρ is the radius of the error ellipsoid, an empirical constant.
S3, determining the error weight of each feature point cloud based on the error evaluation result, and registering the feature point clouds with the map features based on the error weights to locate the vehicle.
The error weight of each feature point cloud is obtained through the following formulas (rendered only as images in the source): the weight w_i^S corresponding to the plane error model of the i-th feature point cloud and the weight w_i^V corresponding to its stereo error model are combined into the error weight w_i, wherein S_i(l, α) is the plane distribution error of the i-th feature point cloud, V_i(l) is its spatial distribution error, n is the total number of feature point clouds, l is the measurement distance corresponding to the point cloud, and α is the included angle between the laser beam and the projection surface.
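Since the weight formulas are rendered only as images in the source, the sketch below assumes one natural reading, with each feature weighted by the normalized inverse of its plane and stereo errors, combined multiplicatively, purely for illustration:

```python
import numpy as np

def feature_weights(S, V):
    """Per-feature error weights w_i from plane errors S_i and stereo errors V_i.

    The source formulas are images; this sketch assumes normalized inverse
    errors, combined multiplicatively and renormalized to sum to 1, so that
    features with larger errors receive smaller weights. This combination
    rule is an assumption, not the patent's reproduced formula.
    """
    S = np.asarray(S, dtype=float)
    V = np.asarray(V, dtype=float)
    wS = (1.0 / S) / np.sum(1.0 / S)   # plane-error weight, normalized over n features
    wV = (1.0 / V) / np.sum(1.0 / V)   # stereo-error weight, normalized over n features
    w = wS * wV                        # combine the two error sources
    return w / w.sum()                 # renormalize the combined weights
```

Whatever the exact normalization, the intent stated in the text is preserved: features with smaller plane and spatial errors dominate the subsequent registration.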
The registration of the feature point clouds with the map features based on the error weights, and hence the vehicle positioning, is performed as follows:

calculate the minimum least-squares matching error E(R, H):

E(R, H) = Σ_{i=1}^{n} w_i · ‖G_i − (R·P_i + H)‖²

wherein w_i is the error weight of the i-th feature point cloud, G_i is the map feature matched to the i-th feature point cloud, P_i is the i-th acquired feature point cloud, R is the rotation matrix of the feature matching, and H is the translation matrix of the feature matching.

If E(R, H) is smaller than the threshold, the feature matching is successful and the feature-matching transform matrices R, H are obtained; the vehicle position is then calculated using this transform.
In this method, because the raw point cloud is scattered, extracting feature point clouds focuses the computation on stable structural features in the environment, improving the reliability of environment perception and reducing the computation load. The projected point cloud exhibits spot enlargement, with the spot diameter easily affected by the projection distance and angle, so a plane error model is established; the point cloud is also sparse, so the extracted features carry spatial distribution errors and a stereo error model is established. The reliability of the point cloud features is evaluated through these two models and used to set the error weights for the subsequent registration, which improves the registration accuracy and therefore the vehicle positioning accuracy.
Example 2
The embodiment provides a vehicle positioning device based on radar point cloud, which includes a memory and a processor, wherein the memory is used for storing a computer program, and the processor is used for implementing the vehicle positioning method based on radar point cloud in embodiment 1 when executing the computer program, and the method is specifically described in embodiment 1, and is not repeated in this embodiment.
Example 3
This embodiment provides a storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the method for positioning a vehicle based on a radar point cloud is implemented, which has been specifically described in embodiment 1, and is not described in detail in this embodiment.
The above embodiments are merely examples and do not limit the scope of the present invention. These embodiments may be implemented in other various manners, and various omissions, substitutions, and changes may be made without departing from the technical spirit of the present invention.

Claims (10)

1. A vehicle positioning method based on radar point cloud is characterized by comprising the following steps:
S1, performing point cloud feature extraction on the point cloud data acquired by the radar to obtain feature point clouds;
S2, establishing a point cloud feature error model, and performing error evaluation on the feature point clouds using this model;
S3, determining the error weight of each feature point cloud based on the error evaluation result, and registering the feature point clouds with the map features based on the error weights to locate the vehicle.
2. The radar point cloud-based vehicle positioning method according to claim 1, wherein the feature point cloud obtained in step S1 includes a surface feature point cloud and a line feature point cloud.
3. The method of claim 2, wherein the specific manner of performing feature extraction in step S1 includes:
S11, compute the roughness c_i of each point r_i:

c_i = (1 / (|s| · ‖r_i‖)) · ‖ Σ_{j∈s, j≠i} (r_j − r_i) ‖

wherein s denotes the set of points adjacent to and including r_i, r_j denotes the j-th point in the set s, |s| denotes the total number of points in the set s, ‖r_i‖ denotes the norm of the laser beam vector corresponding to point r_i, and (r_j − r_i) denotes the offset between points r_j and r_i;

S12, sort the points by roughness c_i from large to small;

S13, if c_i > c_th, point r_i belongs to the line feature point cloud; if c_i < c_th, point r_i belongs to the surface feature point cloud.
4. The method as claimed in claim 1, wherein the point cloud feature error model includes a plane error model and a stereo error model, the plane error model is used for calculating the plane distribution error of the feature point cloud, and the stereo error model is used for calculating the spatial distribution error of the feature point cloud.
5. The method of claim 4, wherein the plane error model is:

S(l, α) = (π/8) · d · D, with d = 2l·tan(β/2) + D0

wherein S is the point cloud plane error entropy representing the plane distribution error; l is the measurement distance corresponding to the point cloud; β is the measurement angle corresponding to the point cloud; α is the included angle between the laser beam and the projection surface; D is the major-axis diameter of the spot under incidence angle α; D0 is the diameter of the radar beam emitting hole.
6. The method of claim 4, wherein the stereo error model is:

V(l) = (4/3) · π · ρ³ · θ · φ · h · l³

wherein V is the point cloud probability volume representing the spatial distribution error, l is the measurement distance corresponding to the point cloud, θ and φ are the horizontal and vertical angular resolutions of the lidar, h is the air-medium coefficient, and ρ is the radius of the error ellipsoid, an empirical constant.
7. The method of claim 4, wherein the error weight of the feature point cloud of step S3 is obtained through formulas (rendered only as images in the source) combining the weight w_i^S corresponding to the plane error model of the i-th feature point cloud and the weight w_i^V corresponding to its stereo error model into the error weight w_i, wherein S_i(l, α) is the plane distribution error of the i-th feature point cloud, V_i(l) is its spatial distribution error, n is the total number of feature point clouds, l is the measurement distance corresponding to the point cloud, and α is the included angle between the laser beam and the projection surface.
8. The method of claim 1, wherein in step S3, the registration between the feature point cloud and the map feature is performed based on the error weight, and the specific way to locate the vehicle position is as follows:
calculating the minimum squared error E(R, H) of the matching:
E(R, H) = Σ_{i=1}^{n} w_i · ||G_i − (R·P_i + H)||²
where w_i is the error weight of the i-th feature point cloud, G_i is the map feature matched with the i-th feature point cloud, P_i is the i-th acquired feature point cloud, R is the rotation matrix of the feature matching, and H is the translation (migration) matrix of the feature matching;
if E (R, H) is smaller than the threshold value, the feature matching is successful, and a matrix R, H is obtained; and calculates the vehicle position using the matrix R, H.
9. A radar point cloud-based vehicle positioning device, characterized by comprising a memory storing a computer program and a processor that, when executing the computer program, implements the radar point cloud-based vehicle positioning method according to any one of claims 1 to 8.
10. A storage medium having a computer program stored thereon, characterized in that the computer program, when executed by a processor, implements the radar point cloud-based vehicle positioning method according to any one of claims 1 to 8.
CN202111157408.6A 2021-09-30 2021-09-30 Vehicle positioning method and device based on radar point cloud and storage medium Pending CN113835101A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111157408.6A CN113835101A (en) 2021-09-30 2021-09-30 Vehicle positioning method and device based on radar point cloud and storage medium

Publications (1)

Publication Number Publication Date
CN113835101A 2021-12-24

Family

ID=78967825

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111157408.6A Pending CN113835101A (en) 2021-09-30 2021-09-30 Vehicle positioning method and device based on radar point cloud and storage medium

Country Status (1)

Country Link
CN (1) CN113835101A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007069724A1 (en) * 2005-12-16 2007-06-21 Ihi Corporation Three-dimensional shape data aligning method and device
US20200158869A1 (en) * 2018-11-19 2020-05-21 Elmira Amirloo Abolfathi System, device and method of generating a high resolution and high accuracy point cloud
US11002859B1 (en) * 2020-02-27 2021-05-11 Tsinghua University Intelligent vehicle positioning method based on feature point calibration

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
XIANGYONG LIU ET AL.: "LiDAR point's elliptical error model and laser positioning for autonomous vehicles", Measurement Science and Technology, pages 1-12 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116772887A (en) * 2023-08-25 2023-09-19 北京斯年智驾科技有限公司 Vehicle course initialization method, system, device and readable storage medium
CN116772887B (en) * 2023-08-25 2023-11-14 北京斯年智驾科技有限公司 Vehicle course initialization method, system, device and readable storage medium

Similar Documents

Publication Publication Date Title
CN107340522B (en) Laser radar positioning method, device and system
US7446766B2 (en) Multidimensional evidence grids and system and methods for applying same
CN111429574A (en) Mobile robot positioning method and system based on three-dimensional point cloud and vision fusion
CN108921947A (en) Generate method, apparatus, equipment, storage medium and the acquisition entity of electronic map
CN112346463B (en) Unmanned vehicle path planning method based on speed sampling
CN111273312B (en) Intelligent vehicle positioning and loop detection method
US20230251097A1 (en) Efficient map matching method for autonomous driving and apparatus thereof
CN114549738A (en) Unmanned vehicle indoor real-time dense point cloud reconstruction method, system, equipment and medium
CN111830534B (en) Method for selecting optimal landing points by applying laser radar
CN114119920A (en) Three-dimensional point cloud map construction method and system
JP2018036053A (en) Laser measurement system and laser measurement method
Zhang et al. A framework of using customized LIDAR to localize robot for nuclear reactor inspections
CN113759928B (en) Mobile robot high-precision positioning method for complex large-scale indoor scene
CN113835101A (en) Vehicle positioning method and device based on radar point cloud and storage medium
CN112146627B (en) Aircraft imaging system using projection patterns on featureless surfaces
US20230204363A1 (en) Method for improving localization accuracy of a self-driving vehicle
CN110148218A (en) A kind of method of high-volume airborne lidar point cloud data global optimization
Huang et al. Ground filtering algorithm for mobile LIDAR using order and neighborhood point information
CN113888463A (en) Wheel rotation angle detection method and device, electronic device and storage medium
KR102094773B1 (en) Method for map matching using observed map of moving apparatus, and computing device using the same
US20240087094A1 (en) Systems And Methods For Combining Multiple Depth Maps
CN113204003A (en) Method and device for determining installation attitude of laser radar
CN117649614A (en) Linear array laser imaging target identification method based on simulation point cloud data set
Li et al. Kalman Filtering Jitter Cancellation Based on Lidar Localization
CN116679314A (en) Three-dimensional laser radar synchronous mapping and positioning method and system for fusion point cloud intensity

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination