CN110609290A - Laser radar matching positioning method and device - Google Patents


Info

Publication number
CN110609290A
CN110609290A (application CN201910887644.XA)
Authority
CN
China
Prior art keywords
grid
ground
probability
distribution
points
Prior art date
Legal status
Granted
Application number
CN201910887644.XA
Other languages
Chinese (zh)
Other versions
CN110609290B (en)
Inventor
梁宝华
黄友
张国龙
张放
李晓飞
张德兆
王肖
霍舒豪
Current Assignee
Chongqing Landshipu Information Technology Co ltd
Original Assignee
Beijing Idriverplus Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Idriverplus Technologies Co Ltd
Priority to CN201910887644.XA
Publication of CN110609290A
Application granted
Publication of CN110609290B
Legal status: Active


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26: Navigation specially adapted for navigation in a road network
    • G01C21/28: Navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30: Map- or contour-matching
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02: Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06: Systems determining position data of a target
    • G01S17/08: Systems determining position data of a target for measuring distance only

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention provides a laser radar matching positioning method, which comprises the following steps: acquiring current laser radar point cloud information; mapping the ground points and the points above the ground onto a preset grid map according to their position coordinates; calculating the distribution of ground points and the distribution of points above the ground for each grid in the grid map; calculating the probability of each point in each grid according to these two distributions; determining a grid set within a preset search range centered on a first grid of the grid map; when the vehicle is located in any grid of the grid set, calculating the matching probabilities corresponding to different yaw angles according to the probability of the point corresponding to the vehicle height; determining a probability matrix from the matching probabilities corresponding to the different yaw angles in each grid of the grid set; evaluating the positioning result according to the probability matrix to generate an evaluation result; and determining the positioning result of the vehicle according to the evaluation result. The method thereby eliminates scene-dependent differences and adapts well to multiple scenes.

Description

Laser radar matching positioning method and device
Technical Field
The invention relates to the field of automatic driving, in particular to a laser radar matching positioning method and device.
Background
In recent years, continuous breakthroughs in key unmanned-driving technologies have greatly accelerated their deployment and popularization. The main positioning technologies include high-precision differential GNSS (Global Navigation Satellite System) positioning, laser radar matching positioning, visual positioning, and the like. For laser radar matching positioning, an important positioning means, fast and stable positioning together with accurate evaluation of the positioning result are the most critical factors.
The laser radar matching positioning technology generally comprises two parts of calculating an optimal matching positioning result and evaluating the positioning result.
At present, most laser radar matching methods use nonlinear optimization, computing the optimal matching result with maximization of the laser matching probability as the objective. When evaluating whether the laser matching result is usable, the matching probability of the optimal result is typically compared with a fixed threshold: if the matching probability is greater than the threshold, the match is considered usable; if it is less than the threshold, the match is considered failed.
Calculating the positioning result based only on the optimized matching probability has the following problem: the method provides only the probability value corresponding to the optimal solution and cannot provide the probability distribution over the search space, so a large amount of information is lost. This information could enable a more effective judgment of the positioning result, and even scene recognition.
Using the matching probability as the evaluation index for matching has the following problem: in different scenes, the probability threshold for judging whether the matching result is valid often differs. For example, some scenes are dominated by trees and vegetation, others by buildings, and different scenes often yield different matching probabilities. Even within the same scene, the matching probability changes with the seasons: in a vegetation-heavy scene, leaves fall in winter and flourish in summer, so the matching probability changes accordingly. Therefore, the matching probability cannot serve as a stable and uniform criterion for judging whether positioning is available.
Disclosure of Invention
The embodiment of the invention aims to provide a laser radar matching positioning method and device, so as to solve the problems in the prior art that a large amount of information is lost and the positioning result cannot be effectively judged when the matching probability is used as the criterion for deciding whether positioning is available.
In order to solve the above problem, in a first aspect, the present invention provides a laser radar matching positioning method, where the method includes:
acquiring current laser radar point cloud information; the laser radar point cloud information comprises ground points and points above the ground, and each point has a position coordinate;
mapping the ground point and the points above the ground on a preset grid map according to the position coordinates;
calculating the distribution of ground points and the distribution of points above the ground of each grid in the grid map;
calculating the probability of each point in each grid according to the distribution of the ground point of each grid and the distribution of the points above the ground;
determining a grid set in a preset search range by taking a first grid in a grid map as a center; the first grid is a grid where a positioning result predicted value is located;
when the vehicle is located in any grid in the grid set, calculating matching probabilities corresponding to different yaw angles according to the probability of points corresponding to the height of the vehicle;
determining a probability matrix according to the matching probability corresponding to different yaw angles in each grid in the grid set;
evaluating a positioning result according to the probability matrix to generate an evaluation result;
and determining the positioning result of the vehicle according to the evaluation result.
In one possible implementation, before the obtaining the current lidar point cloud information, the method further includes:
acquiring original laser point cloud information;
and performing coordinate conversion on the original laser point cloud information to obtain the current laser point cloud information under a world coordinate system.
In a possible implementation manner, the calculating the probability of each point in each grid according to the distribution of the ground point and the distribution of the points above the ground for each grid specifically includes:
according to the formulaCalculating the probability of each point in each grid;
wherein (mu)g,σg) Distribution of ground points (μ)o,σo) For the distribution of points above the ground, the position of each grid on the grid map is (i, j), the position of each grid on the world coordinate system is (x, y), and (i, j) ═ x/r, y/r, r is the resolution of the grid map, p is the resolution of the grid mapz|i,jIs the probability, p, of a point in the (i, j) grid at a height zz|i,j=p(x,y,z)
In a possible implementation manner, when the vehicle is located in any one of the grids in the grid set, the calculating, according to the probability of the point corresponding to the vehicle height, the matching probabilities corresponding to different yaw angles to obtain a matching probability set of each grid specifically includes:
according to the formulaDifference of calculationMatching probability corresponding to the yaw angle;
wherein points is the points of the original point cloudrawAfter coordinate change, point cloud under a world coordinate system; in the world coordinate system, the number of search grids in the x and y directions is m, the search range in the yaw direction is da, the search range of grids in the x direction is (r-m, r + m), the search range of grids in the y direction is (c-m, c + m), the search range of yaw is (yaw-da, yaw + da), and yaw is a yaw angle.
In a possible implementation manner, the determining a probability matrix according to matching probabilities corresponding to different yaw angles in each grid in the grid set specifically includes:
determining the maximum matching probability in each grid according to the matching probability corresponding to different yaw angles in each grid in the grid set;
the maximum match probability in each grid constitutes a probability matrix.
In a possible implementation manner, the evaluating the positioning result according to the probability matrix to generate an evaluation result specifically includes:
carrying out normalization processing on the probability matrix;
respectively calculating Gaussian distribution parameters in the x direction and the y direction according to the probability matrix after normalization processing;
and comparing the Gaussian distribution parameters in the x and y directions with preset thresholds; if each distribution parameter is smaller than its corresponding preset threshold, the positioning result is evaluated as available.
In one possible implementation, the probability matrix is normalized according to the formula P0_ij = P_ij / sum(P); wherein P_ij is the probability matrix, P0_ij is the normalized probability matrix, and sum(P) is the sum of all elements of the probability matrix;
the Gaussian distribution parameters in the x and y directions are calculated according to μx = Σ_{i,j} P0_ij · i, σx = sqrt( Σ_{i,j} P0_ij · (i - μx)² ), and likewise μy = Σ_{i,j} P0_ij · j, σy = sqrt( Σ_{i,j} P0_ij · (j - μy)² ); wherein μx is the mean in the x direction, σx the variance in the x direction, μy the mean in the y direction, and σy the variance in the y direction;
if abs(μx - m) < μ0, abs(μy - m) < μ0, σx < σ0 and σy < σ0, the positioning result is available, where abs() is the absolute-value function and μ0, σ0 are the mean threshold and the mean-square-error threshold, respectively.
In a possible implementation manner, the determining a positioning result of the vehicle according to the evaluation result specifically includes:
according to the formula x ═ μx*r,y=σyObtaining a target first coordinate and a target second coordinate of the vehicle in a world coordinate system; wherein r is the resolution of the grid map;
according to the probability matrixObtaining a target yaw angle of the vehicle in a world coordinate system corresponding to yaw _ max;
calculating the mean value mu of grids corresponding to the target first coordinate and the target second coordinate according to the target first coordinate and the target second coordinate of the vehicle in the world coordinate systemgObtaining a target third coordinate of the vehicle in a world coordinate system; the first coordinate, the second coordinate and the third coordinate of the target form a target position, and the target position and the target yaw angle form a positioning result.
In a second aspect, the present invention provides a lidar matching positioning apparatus, comprising:
the system comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring current laser radar point cloud information; the laser radar point cloud information comprises ground points and points above the ground, and each point has a position coordinate;
the mapping unit is used for mapping the ground point and the points above the ground on a preset grid map according to the position coordinates;
a calculation unit for calculating a distribution of ground points and a distribution of points above the ground for each grid in the grid map;
the calculation unit is also used for calculating the probability of each point in each grid according to the distribution of the ground point of each grid and the distribution of the points above the ground;
the determining unit is used for determining a grid set in a preset search range by taking a first grid in a grid map as a center; the first grid is a grid where a positioning result predicted value is located;
the calculation unit is further configured to calculate matching probabilities corresponding to different yaw angles according to probabilities of points corresponding to the vehicle height when the vehicle is located in any one of the grids in the grid set;
the determining unit is further configured to determine a probability matrix according to matching probabilities corresponding to different yaw angles in each grid in the grid set;
the evaluation unit is used for evaluating the positioning result according to the probability matrix to generate an evaluation result;
the determining unit is further used for determining the positioning result of the vehicle according to the evaluation result.
In a third aspect, the invention provides an apparatus comprising a memory for storing a program and a processor for performing the method of any of the first aspects.
In a fourth aspect, the present invention provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method according to any one of the first aspect.
In a fifth aspect, the invention provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the method of any of the first aspects.
According to the laser radar matching positioning method and device provided by the embodiment of the invention, the real-time point cloud is divided into the ground part and the part above the ground, so that the decoupling of six degrees of freedom of the vehicle position and attitude is realized, the searching speed is accelerated, and the obtained probability matrix can provide more referenceable information for the positioning result evaluation. And fitting the normalized probability matrix into Gaussian distribution of probability, judging the usability of the positioning result by judging the mean value and the variance of the Gaussian distribution, wherein the mean value represents the deviation of the matching result and the predicted value, and the index is irrelevant to the scene and only related to the precision of the predicted value. Because the probability matrix is normalized, the difference of scenes is eliminated, and therefore, the method can be well suitable for multiple scenes.
Drawings
Fig. 1 is a schematic flow chart of a laser radar matching positioning method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a probability matrix according to a first embodiment of the present invention;
fig. 3 is a schematic structural diagram of a lidar matching positioning apparatus according to a second embodiment of the present invention.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be further noted that, for the convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 is a schematic flow chart of a laser radar matching positioning method according to an embodiment of the present invention. The method is applied to a terminal equipped with a laser radar, such as an unmanned vehicle or a robot, and its execution subject is a terminal, a server, or a processor with computing capability. The present application is described taking an unmanned vehicle as an example; in that case the execution subject is the Automated Vehicle Control Unit (AVCU), i.e., the central processing unit of the unmanned vehicle, corresponding to its "brain". As shown in fig. 1, the method includes the following steps:
step 101, acquiring current laser radar point cloud information; the lidar point cloud information includes ground points and points above the ground, each point having a location coordinate.
Specifically, the laser radar is provided with vertically arranged laser probes that measure distance using time-of-flight (ToF) technology. During operation, the laser radar rotates around a vertical axis at high speed while the probes measure environment information at high frequency, so that one full rotation acquires data of the complete surroundings. This surrounding-environment information is provided in the form of discrete, sparse three-dimensional space coordinates and is called the original laser point cloud information.
The surrounding environment information includes, but is not limited to, a scene mainly including tree vegetation and a scene mainly including buildings, and the present application is not limited thereto, and may be applied to various scenes.
In this step, the original laser point cloud information may be based on the laser coordinate system, or point cloud information in the vehicle coordinate system may be obtained according to the installation position of the laser radar on the vehicle; after coordinate conversion, the current laser point cloud information in the world coordinate system, referred to as laser point cloud information for short, is obtained.
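The coordinate conversion in this step can be sketched as follows. This is a minimal sketch: roll and pitch are omitted, and the sensor is assumed to sit at the vehicle origin; both simplifications, and the function name, are illustrative rather than from the patent.

```python
import math

def to_world(points_raw, x, y, z, yaw):
    """Transform raw lidar points (iterable of (px, py, pz) in the
    sensor/vehicle frame) into the world frame for a planar vehicle
    pose (x, y, z, yaw): rotate about the vertical axis, then translate."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [(c * px - s * py + x, s * px + c * py + y, pz + z)
            for px, py, pz in points_raw]
```
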
And 102, mapping the ground point and the point above the ground on a preset grid map according to the position coordinates.
Specifically, before the unmanned vehicle formally operates on the road, a test can be performed, and during the test stage a laser point cloud map can be constructed. A specific construction method may adopt a 3D laser matching algorithm for inter-frame point cloud matching and build the laser point cloud map with a Simultaneous Localization and Mapping (SLAM) method based on a laser sensor, which is not limited in the present application.
The constructed laser point cloud map is divided to obtain a grid map. The grid map is an XY grid plane that divides the surrounding environment into a series of grids, i.e., the plane discretized with a certain resolution r. Each grid represents its position in the grid map by an integer pair (i, j), where the position refers to the upper-left corner of the grid. The grid map position (i, j) corresponding to a position (x, y) in the world coordinate system is then (x/r, y/r).
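The world-to-grid index mapping (i, j) = (x/r, y/r) can be sketched as below. Flooring (rather than truncation toward zero) is an assumption made here so the mapping stays consistent for negative coordinates.

```python
import math

def world_to_grid(x, y, r):
    """Map a world-frame position (x, y) to integer grid indices (i, j)
    for a grid map of resolution r, following (i, j) = (x/r, y/r)."""
    return math.floor(x / r), math.floor(y / r)
```
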
Step 103, calculating the distribution of the ground points and the distribution of the points above the ground of each grid in the grid map.
Specifically, the laser point cloud information is mapped onto the grid map, and one grid contains a number of points, including ground points and points above the ground. The Gaussian distribution (μg, σg) of the ground points is calculated from the position coordinates of the ground points in the grid, where μg is the mean and σg the variance of the Gaussian distribution of the ground points.
Similarly, the Gaussian distribution (μo, σo) of the points above the ground is calculated from the position coordinates of those points in the grid, where μo is the mean and σo the variance of the Gaussian distribution of the points above the ground.
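The per-grid Gaussian fitting can be sketched as follows. `grid_height_stats` is an illustrative helper, and the plain sample mean/standard deviation is an assumed estimator; the patent does not name one.

```python
import math

def grid_height_stats(points, r):
    """Group world-frame (x, y, z) points by grid cell and fit a 1-D
    Gaussian (mean, std) to their heights.  Call once with ground
    points for (mu_g, sigma_g) and once with above-ground points for
    (mu_o, sigma_o)."""
    cells = {}
    for x, y, z in points:
        key = (math.floor(x / r), math.floor(y / r))
        cells.setdefault(key, []).append(z)
    stats = {}
    for key, zs in cells.items():
        mu = sum(zs) / len(zs)
        var = sum((z - mu) ** 2 for z in zs) / len(zs)
        stats[key] = (mu, math.sqrt(var))
    return stats
```
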
And 104, calculating the probability of each point in each grid according to the distribution of the ground point of each grid and the distribution of the points above the ground.
Specifically, for a point falling in the (i, j) grid, the probability corresponding to the point is the probability p_{z|i,j} corresponding to the z coordinate of the point.
The probability of each point in each grid is calculated according to the formula p_{z|i,j} = max( N(z; μg, σg), N(z; μo, σo) ), where N(z; μ, σ) = exp(-(z - μ)²/(2σ²));
wherein (μg, σg) is the distribution of the ground points, (μo, σo) is the distribution of the points above the ground, the position of each grid on the grid map is (i, j), the position of each grid in the world coordinate system is (x, y), (i, j) = (x/r, y/r), r is the resolution of the grid map, and p_{z|i,j} is the probability of a point in the (i, j) grid at height z.
Thus, the probability of a point at position (x, y, z) can be expressed as p(x, y, z) = p_{z|i,j}.
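The per-point probability lookup can be sketched as follows. The patent's formula is published as an image, so taking the better of the two normalised Gaussian responses is one plausible reading and is assumed here, not confirmed by the text.

```python
import math

def point_prob(z, mu_g, sigma_g, mu_o, sigma_o):
    """Probability p_{z|i,j} of a point of height z in a grid whose
    ground and above-ground height Gaussians are (mu_g, sigma_g) and
    (mu_o, sigma_o): the larger of the two unnormalised responses."""
    def response(mu, sigma):
        return math.exp(-0.5 * ((z - mu) / sigma) ** 2)
    return max(response(mu_g, sigma_g), response(mu_o, sigma_o))
```
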
And 105, determining a grid set in a preset search range by taking a first grid in the grid map as a center.
Laser radar matching positioning needs to determine the position coordinates (x, y, z) and the attitude (roll, pitch, yaw) of the vehicle in space; the three degrees of freedom (x, y, yaw) are the most important for the vehicle and the ones that chiefly need to be calculated. The roll angle and pitch angle can be obtained from an Inertial Measurement Unit (IMU) mounted on the vehicle. Once the vehicle's (x, y) coordinates are calculated, its z is the ground Gaussian parameter μg of the grid corresponding to (x, y). Therefore, the following description focuses on the calculation of the three degrees of freedom (x, y, yaw).
The first grid is the grid where the predicted value is located. The preset search range may be given by the number of search grids m in the x and y directions and the search range da in the yaw direction: with the predicted grid at (r, c), the search range of grids in the x direction is (r - m, r + m), in the y direction (c - m, c + m), and the yaw search range is (yaw - da, yaw + da). The grids within this range, together with the first grid, constitute the grid set.
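Enumerating the grid set and yaw candidates over the preset search range can be sketched as below. The yaw sample count `n_yaw` is an assumed discretisation parameter; the patent only specifies the ranges.

```python
def search_set(r0, c0, yaw0, m, da, n_yaw):
    """Enumerate candidate poses around the predicted grid (r0, c0) and
    predicted yaw yaw0: rows in (r0-m, r0+m), columns in (c0-m, c0+m),
    and n_yaw evenly spaced yaw samples spanning (yaw0-da, yaw0+da)."""
    step = 2.0 * da / (n_yaw - 1) if n_yaw > 1 else 0.0
    yaws = [yaw0 - da + k * step for k in range(n_yaw)]
    return [(i, j, yaw)
            for i in range(r0 - m, r0 + m + 1)
            for j in range(c0 - m, c0 + m + 1)
            for yaw in yaws]
```
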
And 106, when the vehicle is positioned in any grid in the grid set, calculating the matching probability corresponding to different yaw angles according to the probability of the point corresponding to the height of the vehicle.
Specifically, when the vehicle is located in the grid (i, j) = (xi/r, yi/r) with yaw = yawi, the ground Gaussian parameter μg of grid (i, j) gives the vehicle height zi, and the vehicle matching probability P_{i,j,yawi} can be expressed as P_{i,j,yawi} = (1/|points|) Σ_{(x,y,z) ∈ points} p_{z|i,j};
wherein points is the point cloud in the world coordinate system obtained from the original point cloud points_raw after coordinate transformation under the candidate pose.
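Scoring one candidate pose against the map can be sketched as follows. Averaging the per-point probabilities, scoring unmapped grids as zero, and the max-of-two-Gaussians point score are all assumptions of this sketch; the patent's formula is not reproduced in the published text.

```python
import math

def match_probability(points_world, stats, r):
    """Matching probability for one candidate pose: score the
    already-transformed world-frame cloud `points_world` against the
    per-grid Gaussians in `stats`, a dict mapping (i, j) to
    (mu_g, sigma_g, mu_o, sigma_o), and average over all points."""
    total = 0.0
    for x, y, z in points_world:
        key = (math.floor(x / r), math.floor(y / r))
        if key in stats:
            mu_g, sg, mu_o, so = stats[key]
            total += max(math.exp(-0.5 * ((z - mu_g) / sg) ** 2),
                         math.exp(-0.5 * ((z - mu_o) / so) ** 2))
    return total / len(points_world) if points_world else 0.0
```
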
And step 107, determining a probability matrix according to the matching probability corresponding to different yaw angles in each grid in the grid set.
Specifically, when the vehicle is located in the grid (i, j), the matching probabilities corresponding to the different yaw angles are calculated, and the maximum of these is taken as the matching probability P_ij corresponding to the grid (i, j), i.e. P_ij = max_{yawi} P_{i,j,yawi}, where the yawi corresponding to the maximum probability is denoted yaw_max.
All P_ij within the preset search range form the probability matrix of the match. Referring to fig. 2, fig. 2 shows a grid set; each grid in the grid set corresponds to a cell, and the maximum matching probability of each grid within the preset search range is obtained, forming the probability matrix.
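Collapsing the per-pose probabilities to the per-grid maximum over yaw, while recording the best yaw (yaw_max) per grid, can be sketched as:

```python
def probability_matrix(P):
    """Given per-pose probabilities P keyed by (i, j, yaw), return the
    per-grid maximum over yaw and the corresponding best yaw per grid."""
    best, yaw_max = {}, {}
    for (i, j, yaw), p in P.items():
        if p > best.get((i, j), -1.0):
            best[(i, j)] = p
            yaw_max[(i, j)] = yaw
    return best, yaw_max
```
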
And 108, evaluating the positioning result according to the probability matrix to generate an evaluation result.
Wherein step 108 comprises the following:
firstly, carrying out normalization processing on a probability matrix; then, respectively calculating Gaussian distribution parameters in the x direction and the y direction according to the probability matrix after normalization processing; and finally, comparing the Gaussian distribution parameters in the x and y directions with a preset threshold, and if each distribution parameter is smaller than the parameter in the preset threshold, evaluating the result as a positioning result.
Specifically, the probability matrix is normalized according to the formula P0_ij = P_ij / sum(P); wherein P_ij is the probability matrix, P0_ij is the normalized probability matrix, and sum(P) is the sum of all elements of the probability matrix. The purpose of the normalization is to eliminate the influence of differences in the environmental scene on the positioning result.
The Gaussian distribution parameters in the x and y directions are then calculated according to μx = Σ_{i,j} P0_ij · i, σx = sqrt( Σ_{i,j} P0_ij · (i - μx)² ), and likewise μy = Σ_{i,j} P0_ij · j, σy = sqrt( Σ_{i,j} P0_ij · (j - μy)² ); wherein μx is the mean in the x direction, σx the variance in the x direction, μy the mean in the y direction, and σy the variance in the y direction; the sums run over the (2m + 1) × (2m + 1) elements of the normalized matrix, indexed from 0, so that the window center corresponds to index m.
If abs(μx - m) < μ0, abs(μy - m) < μ0, σx < σ0 and σy < σ0, the positioning result is available, where abs() is the absolute-value function and μ0, σ0 are the mean threshold and the mean-square-error threshold, respectively.
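The evaluation steps (normalising the probability matrix, fitting marginal Gaussians in x and y, thresholding) can be sketched as follows. Comparing the fitted means against the window-centre index m is an assumed reading of the criterion, consistent with the statement that the mean represents the deviation of the matching result from the predicted value.

```python
import math

def evaluate(P, m, mu0, sigma0):
    """Normalise a (2m+1) x (2m+1) probability matrix P (list of rows),
    fit marginal Gaussians along x and y, and accept the match when
    both means stay within mu0 of the centre index m and both standard
    deviations stay below sigma0."""
    s = sum(sum(row) for row in P)
    P0 = [[v / s for v in row] for row in P]
    n = len(P0)
    px = [sum(P0[i]) for i in range(n)]                     # marginal over rows (x)
    py = [sum(P0[i][j] for i in range(n)) for j in range(n)]  # marginal over cols (y)
    mu_x = sum(i * px[i] for i in range(n))
    mu_y = sum(j * py[j] for j in range(n))
    sigma_x = math.sqrt(sum(px[i] * (i - mu_x) ** 2 for i in range(n)))
    sigma_y = math.sqrt(sum(py[j] * (j - mu_y) ** 2 for j in range(n)))
    ok = (abs(mu_x - m) < mu0 and abs(mu_y - m) < mu0
          and sigma_x < sigma0 and sigma_y < sigma0)
    return ok, (mu_x, sigma_x, mu_y, sigma_y)
```
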
And step 109, determining the positioning result of the vehicle according to the evaluation result.
Specifically, the target first coordinate and the target second coordinate of the vehicle in the world coordinate system can be obtained according to the formula x = μx · r, y = μy · r, where r is the resolution of the grid map.
According to the probability matrix, the grid with the maximum matching probability is taken, and the target yaw angle of the vehicle in the world coordinate system is the yaw_max corresponding to that grid.
The ground mean value μg of the grid corresponding to the target first coordinate and the target second coordinate of the vehicle in the world coordinate system is calculated, yielding the target third coordinate of the vehicle in the world coordinate system. The target first, second and third coordinates form the target position, and the target position together with the target yaw angle forms the positioning result.
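Reading off the final pose can be sketched as below. Taking the argmax grid's indices (rather than the probability-weighted mean μx, μy) and the lookup `ground_mu[(i, j)] = μg` are simplifications assumed for this sketch.

```python
def final_pose(best, yaw_max, ground_mu, r):
    """Recover the pose from the probability matrix: the best grid
    gives (x, y) via the map resolution r, yaw_max of that grid gives
    the heading, and the grid's ground Gaussian mean mu_g gives z."""
    i, j = max(best, key=best.get)
    return i * r, j * r, ground_mu[(i, j)], yaw_max[(i, j)]
```
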
According to the laser radar matching positioning method provided by the embodiment of the invention, the real-time point cloud is divided into the ground part and the part above the ground, so that the decoupling of six degrees of freedom of the vehicle position and attitude is realized, the searching speed is accelerated, and the obtained probability matrix can provide more referenceable information for the positioning result evaluation. And fitting the normalized probability matrix into Gaussian distribution of probability, judging the usability of the positioning result by judging the mean value and the variance of the Gaussian distribution, wherein the mean value represents the deviation of the matching result and the predicted value, and the index is irrelevant to the scene and only related to the precision of the predicted value. Because the probability matrix is normalized, the difference of scenes is eliminated, and therefore, the method can be well suitable for multiple scenes.
Fig. 3 is a schematic structural view of a lidar matching positioning apparatus according to a second embodiment of the present invention, where the lidar matching positioning apparatus is applied to the lidar matching positioning method according to the first embodiment of the present invention, and as shown in fig. 3, the lidar matching positioning apparatus includes: an acquisition unit 301, a mapping unit 302, a calculation unit 303, a determination unit 304, and an evaluation unit 305.
The obtaining unit 301 is configured to obtain current lidar point cloud information; the laser radar point cloud information comprises ground points and points above the ground, and each point has a position coordinate.
The mapping unit 302 is configured to map the ground point and the point above the ground on a preset grid map according to the position coordinates.
The calculation unit 303 is configured to calculate a distribution of ground points and a distribution of points above the ground for each grid in the grid map.
The calculating unit 303 is further configured to calculate the probability of each point in each grid according to the distribution of the ground point and the distribution of the points above the ground of each grid.
The determining unit 304 is configured to determine a grid set within a preset search range by taking a first grid in the grid map as a center; and the first grid is the grid where the positioning result predicted value is located.
The calculation unit 303 is further configured to calculate, when the vehicle is located in any grid in the grid set, the matching probabilities corresponding to different yaw angles according to the probability of the point corresponding to the vehicle height.
The determining unit 304 is further configured to determine a probability matrix according to the matching probability corresponding to different yaw angles in each grid in the grid set.
The evaluation unit 305 is configured to evaluate the positioning result according to the probability matrix, and generate an evaluation result.
The determining unit 304 is further configured to determine a positioning result of the vehicle according to the evaluation result.
The specific function of each unit corresponds to the method in the first embodiment, and is not described herein again.
With the laser radar matching positioning apparatus provided by the embodiment of the invention, the real-time point cloud is divided into a ground part and an above-ground part, so that the six degrees of freedom of the vehicle pose are decoupled and the search is accelerated, and the resulting probability matrix provides additional reference information for evaluating the positioning result. The normalized probability matrix is fitted to a Gaussian distribution of the probabilities, and the availability of the positioning result is judged from the mean and variance of that distribution: the mean represents the deviation between the matching result and the predicted value, an index that is independent of the scene and related only to the accuracy of the predicted value. Because the probability matrix is normalized, differences between scenes are eliminated, so the apparatus adapts well to multiple scenes.
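The probability-matrix construction performed by the determining unit 304 (keeping, for each grid, the maximum matching probability over the sampled yaw angles) can be sketched as follows; the dict-based representation and the function name are illustrative assumptions, not the patent's implementation.

```python
def probability_matrix(match_probs):
    """For each grid cell in the search set, keep the maximum matching
    probability over all sampled yaw angles; these maxima form the
    probability matrix that is later normalized and evaluated.

    match_probs: dict mapping (i, j) -> list of probabilities, one per yaw sample.
    """
    return {cell: max(probs) for cell, probs in match_probs.items()}
```

This reduction collapses the yaw dimension, so the subsequent Gaussian fit only has to model the two translational directions, consistent with the decoupling described above.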
The third embodiment of the invention provides a device comprising a memory and a processor, where the memory is used to store a program and may be connected to the processor through a bus. The memory may be a non-volatile memory, such as a hard disk drive or a flash memory, in which a software program and a device driver are stored. The software program can perform the various functions of the methods provided by the embodiments of the invention; the device driver may be a network or interface driver. When executed, the software program implements the method provided by the first embodiment of the invention.
A fourth embodiment of the present invention provides a computer program product including instructions which, when the computer program product runs on a computer, cause the computer to execute the method provided in the first embodiment of the present invention.
The fifth embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the method provided in the first embodiment of the present invention is implemented.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both, and that the various illustrative components and steps have been described above generally in terms of their functionality in order to clearly illustrate this interchangeability of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above embodiments are provided to further explain the objects, technical solutions and advantages of the present invention in detail, it should be understood that the above embodiments are merely exemplary embodiments of the present invention and are not intended to limit the scope of the present invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A laser radar matching positioning method is characterized by comprising the following steps:
acquiring current laser radar point cloud information; the laser radar point cloud information comprises ground points and points above the ground, and each point has a position coordinate;
mapping the ground point and the points above the ground on a preset grid map according to the position coordinates;
calculating the distribution of ground points and the distribution of points above the ground of each grid in the grid map;
calculating the probability of each point in each grid according to the distribution of the ground point of each grid and the distribution of the points above the ground;
determining a grid set in a preset search range by taking a first grid in a grid map as a center; the first grid is a grid where a positioning result predicted value is located;
when the vehicle is located in any grid in the grid set, calculating matching probabilities corresponding to different yaw angles according to the probability of points corresponding to the height of the vehicle;
determining a probability matrix according to the matching probability corresponding to different yaw angles in each grid in the grid set;
evaluating a positioning result according to the probability matrix to generate an evaluation result;
and determining the positioning result of the vehicle according to the evaluation result.
2. The method of claim 1, wherein prior to the obtaining current lidar point cloud information, the method further comprises:
acquiring original laser point cloud information;
and performing coordinate conversion on the original laser point cloud information to obtain the current laser point cloud information under a world coordinate system.
3. The method according to claim 1, wherein calculating the probability of each point in each grid according to the distribution of the ground point and the distribution of the points above the ground for each grid comprises:
according to the formulaCalculating the probability of each point in each grid;
wherein (μg, σg) is the distribution of ground points and (μo, σo) is the distribution of points above the ground; the position of each grid on the grid map is (i, j), the position of each grid in the world coordinate system is (x, y), and (i, j) = (x/r, y/r), where r is the resolution of the grid map; p(z|i,j) is the probability of a point at height z in grid (i, j), and p(z|i,j) = p(x, y, z).
4. The method according to claim 1, wherein, when the vehicle is located in any grid in the grid set, calculating the matching probabilities corresponding to different yaw angles according to the probability of the point corresponding to the vehicle height, to obtain the matching probability set of each grid, specifically comprises:
according to the formulaCalculating the matching probability corresponding to different yaw angles;
wherein points_raw is the original point cloud and points is the point cloud in the world coordinate system obtained by coordinate transformation of points_raw; in the world coordinate system, the number of search grids in the x and y directions is m, the search range in the yaw direction is da, the grid search range in the x direction is (r-m, r+m), the grid search range in the y direction is (c-m, c+m), the yaw search range is (yaw-da, yaw+da), and yaw is the yaw angle.
5. The method according to claim 1, wherein determining a probability matrix according to the matching probabilities corresponding to different yaw angles in each grid in the grid set specifically includes:
determining the maximum matching probability in each grid according to the matching probability corresponding to different yaw angles in each grid in the grid set;
the maximum match probability in each grid constitutes a probability matrix.
6. The method according to claim 1, wherein the evaluating the positioning result according to the probability matrix to generate an evaluation result specifically includes:
carrying out normalization processing on the probability matrix;
respectively calculating Gaussian distribution parameters in the x direction and the y direction according to the probability matrix after normalization processing;
and comparing the Gaussian distribution parameters in the x direction and the y direction with preset thresholds; if each distribution parameter is smaller than its corresponding preset threshold, evaluating the positioning result as available.
7. The method of claim 6, wherein the probability matrix is normalized according to the formula P0ij = Pij / Sum(P); wherein Pij is an element of the probability matrix, P0ij is the corresponding normalized element, and Sum(P) is the sum of all elements of the probability matrix;
calculating, according to the formula, the Gaussian distribution parameters in the x direction and the y direction; wherein μx is the mean in the x direction, σx is the variance in the x direction, μy is the mean in the y direction, and σy is the variance in the y direction;
if abs(μx) < μ0, abs(μy) < μ0, σx < σ0, and σy < σ0, the positioning result is available; wherein abs() is the absolute-value function, and μ0 and σ0 are the mean threshold and the mean square error threshold, respectively.
8. The method according to claim 1, wherein determining a positioning result of the vehicle according to the evaluation result specifically comprises:
obtaining a target first coordinate and a target second coordinate of the vehicle in the world coordinate system according to the formula x = μx*r, y = μy*r; wherein r is the resolution of the grid map;
obtaining, according to the probability matrix, the target yaw angle of the vehicle in the world coordinate system as the yaw angle corresponding to yaw_max;
taking the mean value μg of the ground distribution of the grid corresponding to the target first coordinate and the target second coordinate as a target third coordinate of the vehicle in the world coordinate system; the target first coordinate, the target second coordinate and the target third coordinate form a target position, and the target position and the target yaw angle form the positioning result.
9. A lidar matched positioning apparatus, wherein the apparatus comprises:
an acquisition unit, configured to acquire current laser radar point cloud information; the laser radar point cloud information comprises ground points and points above the ground, and each point has a position coordinate;
the mapping unit is used for mapping the ground point and the points above the ground on a preset grid map according to the position coordinates;
a calculation unit for calculating a distribution of ground points and a distribution of points above the ground for each grid in the grid map;
the calculation unit is also used for calculating the probability of each point in each grid according to the distribution of the ground point of each grid and the distribution of the points above the ground;
the determining unit is used for determining a grid set in a preset search range by taking a first grid in a grid map as a center; the first grid is a grid where a positioning result predicted value is located;
the calculation unit is further configured to calculate matching probabilities corresponding to different yaw angles according to probabilities of points corresponding to the vehicle height when the vehicle is located in any one of the grids in the grid set;
the determining unit is further configured to determine a probability matrix according to matching probabilities corresponding to different yaw angles in each grid in the grid set;
the evaluation unit is used for evaluating the positioning result according to the probability matrix to generate an evaluation result;
the determining unit is further used for determining the positioning result of the vehicle according to the evaluation result.
10. An apparatus, comprising a memory for storing a program and a processor for executing the program to perform the method of any one of claims 1 to 8.
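The grid mapping of claim 3 ((i, j) = (x/r, y/r)) and the per-grid height distributions can be sketched in Python as follows. The floor-based indexing, the function names, and the use of a single combined height distribution per grid (rather than separate ground and above-ground distributions) are simplifying assumptions for illustration only.

```python
import math
from collections import defaultdict

def grid_index(x, y, r):
    """Map a world coordinate (x, y) to a grid cell (i, j) = (x/r, y/r),
    where r is the grid-map resolution (cf. claim 3)."""
    return int(math.floor(x / r)), int(math.floor(y / r))

def grid_height_stats(points, r):
    """Accumulate a per-grid mean/standard deviation of point heights,
    a simplified stand-in for the per-grid (mu, sigma) distributions
    of the claims. points: iterable of (x, y, z) tuples."""
    acc = defaultdict(list)
    for x, y, z in points:
        acc[grid_index(x, y, r)].append(z)
    stats = {}
    for cell, zs in acc.items():
        mu = sum(zs) / len(zs)
        var = sum((z - mu) ** 2 for z in zs) / len(zs)
        stats[cell] = (mu, math.sqrt(var))
    return stats
```

In the claimed method the ground-point mean of a cell also supplies the vehicle's third (height) coordinate, which is why the per-grid statistics are kept rather than only a binary occupancy value.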
CN201910887644.XA 2019-09-19 2019-09-19 Laser radar matching positioning method and device Active CN110609290B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910887644.XA CN110609290B (en) 2019-09-19 2019-09-19 Laser radar matching positioning method and device


Publications (2)

Publication Number Publication Date
CN110609290A true CN110609290A (en) 2019-12-24
CN110609290B CN110609290B (en) 2021-07-23

Family

ID=68891615

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910887644.XA Active CN110609290B (en) 2019-09-19 2019-09-19 Laser radar matching positioning method and device

Country Status (1)

Country Link
CN (1) CN110609290B (en)


Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104933708A (en) * 2015-06-07 2015-09-23 浙江大学 Barrier detection method in vegetation environment based on multispectral and 3D feature fusion
CN106530380A (en) * 2016-09-20 2017-03-22 长安大学 Ground point cloud segmentation method based on three-dimensional laser radar
CN106650640A (en) * 2016-12-05 2017-05-10 浙江大学 Negative obstacle detection method based on local structure feature of laser radar point cloud
US20170309268A1 (en) * 2017-06-20 2017-10-26 Signal/Noise Solutions L.L.C. Systems and Methods for Enhancing A Signal-To-Noise Ratio
CN107341819A (en) * 2017-05-09 2017-11-10 深圳市速腾聚创科技有限公司 Method for tracking target and storage medium
CN108427124A (en) * 2018-02-02 2018-08-21 北京智行者科技有限公司 A kind of multi-line laser radar ground point separation method and device, vehicle
CN108732582A (en) * 2017-04-20 2018-11-02 百度在线网络技术(北京)有限公司 Vehicle positioning method and device
CN108801268A (en) * 2018-06-27 2018-11-13 广州视源电子科技股份有限公司 Localization method, device and the robot of target object
CN109541571A (en) * 2018-12-29 2019-03-29 北京智行者科技有限公司 The combined calibrating method of EPS zero bias and multi-line laser radar
CN109557925A (en) * 2018-12-29 2019-04-02 北京智行者科技有限公司 Automatic driving vehicle barrier preventing collision method and device
CN110120070A (en) * 2019-05-15 2019-08-13 南京林业大学 Filtering method based on airborne laser radar point cloud volume elements Continuity Analysis
CN110163871A (en) * 2019-05-07 2019-08-23 北京易控智驾科技有限公司 A kind of ground dividing method of multi-line laser radar
CN110210389A (en) * 2019-05-31 2019-09-06 东南大学 A kind of multi-targets recognition tracking towards road traffic scene
CN110223314A (en) * 2019-06-06 2019-09-10 电子科技大学 A kind of single wooden dividing method based on the distribution of tree crown three-dimensional point cloud
CN110221616A (en) * 2019-06-25 2019-09-10 清华大学苏州汽车研究院(吴江) A kind of method, apparatus, equipment and medium that map generates


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WENFANG YE ET AL.: "Real time UGV positioning based on Reference beacons aided LiDAR scan matching", 《2018 UBIQUITOUS POSITIONING, INDOOR NAVIGATION AND LOCATION-BASED SERVICES (UPINLBS)》 *
TAN Zhiguo et al.: "Laser radar target recognition based on point cloud-model matching", Computer Engineering & Science (《计算机工程与科学》) *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021143286A1 (en) * 2020-01-14 2021-07-22 华为技术有限公司 Method and apparatus for vehicle positioning, controller, smart car and system
CN111429574A (en) * 2020-03-06 2020-07-17 上海交通大学 Mobile robot positioning method and system based on three-dimensional point cloud and vision fusion
CN113435227A (en) * 2020-03-23 2021-09-24 阿里巴巴集团控股有限公司 Map generation and vehicle positioning method, system, device and storage medium
CN111707279A (en) * 2020-05-19 2020-09-25 上海有个机器人有限公司 Matching evaluation method, medium, terminal and device of laser point cloud and map
CN111707279B (en) * 2020-05-19 2023-09-08 上海有个机器人有限公司 Matching evaluation method, medium, terminal and device for laser point cloud and map
CN113916243B (en) * 2020-07-07 2022-10-14 长沙智能驾驶研究院有限公司 Vehicle positioning method, device, equipment and storage medium for target scene area
CN113916243A (en) * 2020-07-07 2022-01-11 长沙智能驾驶研究院有限公司 Vehicle positioning method, device, equipment and storage medium for target scene area
WO2022007776A1 (en) * 2020-07-07 2022-01-13 长沙智能驾驶研究院有限公司 Vehicle positioning method and apparatus for target scene region, device and storage medium
CN111812658A (en) * 2020-07-09 2020-10-23 北京京东乾石科技有限公司 Position determination method, device, system and computer readable storage medium
CN111812658B (en) * 2020-07-09 2021-11-02 北京京东乾石科技有限公司 Position determination method, device, system and computer readable storage medium
WO2022007602A1 (en) * 2020-07-09 2022-01-13 北京京东乾石科技有限公司 Method and apparatus for determining location of vehicle
CN111812613A (en) * 2020-08-06 2020-10-23 常州市贝叶斯智能科技有限公司 Mobile robot positioning monitoring method, device, equipment and medium
CN112147635A (en) * 2020-09-25 2020-12-29 北京亮道智能汽车技术有限公司 Detection system, method and device
CN112147635B (en) * 2020-09-25 2024-05-31 北京亮道智能汽车技术有限公司 Detection system, method and device
WO2022095438A1 (en) * 2020-11-05 2022-05-12 珠海一微半导体股份有限公司 Laser repositioning system based on hardware acceleration, and chip
CN112686934A (en) * 2020-12-29 2021-04-20 广州广电研究院有限公司 Point cloud data registration method, device, equipment and medium
CN114688996A (en) * 2020-12-31 2022-07-01 北京华航无线电测量研究所 Method for measuring rotation precision angle of rotary table
CN114688996B (en) * 2020-12-31 2023-11-03 北京华航无线电测量研究所 Method for measuring rotation precision angle of turntable
CN113147738A (en) * 2021-02-26 2021-07-23 重庆智行者信息科技有限公司 Automatic parking positioning method and device
CN113436336A (en) * 2021-06-22 2021-09-24 京东鲲鹏(江苏)科技有限公司 Ground point cloud segmentation method and device and automatic driving vehicle
CN113436336B (en) * 2021-06-22 2024-01-12 京东鲲鹏(江苏)科技有限公司 Ground point cloud segmentation method and device and automatic driving vehicle

Also Published As

Publication number Publication date
CN110609290B (en) 2021-07-23

Similar Documents

Publication Publication Date Title
CN110609290B (en) Laser radar matching positioning method and device
CN109270545B (en) Positioning true value verification method, device, equipment and storage medium
CN110673115B (en) Combined calibration method, device, equipment and medium for radar and integrated navigation system
CN109214248B (en) Method and device for identifying laser point cloud data of unmanned vehicle
EP3620823B1 (en) Method and device for detecting precision of internal parameter of laser radar
CN111812658B (en) Position determination method, device, system and computer readable storage medium
CN112985842B (en) Parking performance detection method, electronic device and readable storage medium
CN109407073B (en) Reflection value map construction method and device
CN110930495A (en) Multi-unmanned aerial vehicle cooperation-based ICP point cloud map fusion method, system, device and storage medium
CN111563450B (en) Data processing method, device, equipment and storage medium
CN110673107B (en) Road edge detection method and device based on multi-line laser radar
CN113147738A (en) Automatic parking positioning method and device
CN110031825B (en) Laser positioning initialization method
CN111915675B (en) Particle drift-based particle filtering point cloud positioning method, device and system thereof
CN114485698B (en) Intersection guide line generation method and system
CN112146682B (en) Sensor calibration method and device for intelligent automobile, electronic equipment and medium
CN114111774B (en) Vehicle positioning method, system, equipment and computer readable storage medium
CN112684432A (en) Laser radar calibration method, device, equipment and storage medium
CN114820749A (en) Unmanned vehicle underground positioning method, system, equipment and medium
CN115436920A (en) Laser radar calibration method and related equipment
CN111812669A (en) Winding inspection device, positioning method thereof and storage medium
CN113822944B (en) External parameter calibration method and device, electronic equipment and storage medium
WO2022078342A1 (en) Dynamic occupancy grid estimation method and apparatus
CN113296120B (en) Obstacle detection method and terminal
CN112154355B (en) High-precision map positioning method, system, platform and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220727

Address after: 401122 No.1, 1st floor, building 3, No.21 Yunzhu Road, Yubei District, Chongqing

Patentee after: Chongqing landshipu Information Technology Co.,Ltd.

Address before: B4-006, maker Plaza, 338 East Street, Huilongguan town, Changping District, Beijing 100096

Patentee before: Beijing Idriverplus Technology Co.,Ltd.
