CN113538699A - Positioning method, device and equipment based on three-dimensional point cloud and storage medium

Positioning method, device and equipment based on three-dimensional point cloud and storage medium

Info

Publication number
CN113538699A
CN113538699A
Authority
CN
China
Prior art keywords
vehicle
point cloud
pose
matrix
information
Prior art date
Legal status
Pending
Application number
CN202110687387.2A
Other languages
Chinese (zh)
Inventor
Li Xiaohuan (李晓欢)
Su Zhaoyu (苏昭宇)
Chen Qian (陈倩)
Tang Xin (唐欣)
Current Assignee
Guangxi Comprehensive Transportation Big Data Research Institute
Guilin University of Electronic Technology
Original Assignee
Guangxi Comprehensive Transportation Big Data Research Institute
Guilin University of Electronic Technology
Priority date
Filing date
Publication date
Application filed by Guangxi Comprehensive Transportation Big Data Research Institute, Guilin University of Electronic Technology filed Critical Guangxi Comprehensive Transportation Big Data Research Institute
Priority to CN202110687387.2A
Publication of CN113538699A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/003 Navigation within 3D models or images
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S 17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G06T 2207/10044 Radar image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle


Abstract

The application provides a positioning method, apparatus, device and storage medium based on three-dimensional point cloud, relating to the technical field of automatic driving. The method comprises the following steps: respectively acquiring the integrated navigation data of a vehicle and the point cloud data measured by a laser radar, and processing the two separately; determining the current pose information of the vehicle based on the integrated navigation data, and from it the transformation matrix and translation matrix used in point cloud positioning; determining the laser radar pose information of the vehicle based on the transformation matrix and the translation matrix; determining the pose state calculation equation of the vehicle based on the laser radar pose information and the attitude information of the vehicle; and positioning the vehicle based on the pose state calculation equation by means of an extended Kalman filtering method. This is more consistent with the actual motion pattern of the vehicle and ensures the accuracy of vehicle positioning.

Description

Positioning method, device and equipment based on three-dimensional point cloud and storage medium
Technical Field
The application relates to the technical field of automatic driving, in particular to a positioning method, a positioning device, positioning equipment and a storage medium based on three-dimensional point cloud.
Background
In positioning based on a high-precision map, the non-real-time nature of the map and sensor noise mean that the data acquired in real time are inconsistent with the map. This problem often degrades the accuracy of map-matching-based localization algorithms, and can even cause the localization to fall into a local optimum or fail to converge. To address it, many positioning methods that fuse different sensors have been proposed, mostly based on KF (Kalman filtering). However, a real vehicle in motion is not a linear system; moreover, because the point cloud is noisy, classical registration algorithms such as ICP or NDT accumulate error, so the point cloud positioning fails to converge and the final localization is inaccurate.
Disclosure of Invention
The objective of the present application is to solve at least one of the above technical defects, in particular the defect of the prior art that accumulated error prevents the point cloud positioning from converging and finally makes the positioning inaccurate.
In a first aspect, a three-dimensional point cloud-based positioning method is provided, including:
acquiring integrated navigation data of a vehicle and point cloud data measured by a laser radar;
analyzing the integrated navigation data to obtain current integrated navigation pose information of the vehicle, and determining a point cloud transformation matrix and a translation matrix of the vehicle in point cloud positioning based on the current integrated navigation pose information;
matching the current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix to obtain laser radar pose information of the vehicle;
analyzing the integrated navigation data to obtain current attitude information of the vehicle, combining the current attitude information with the laser radar pose information, and determining a pose state calculation equation of the vehicle;
and positioning the vehicle based on the pose state calculation equation by adopting an extended Kalman filtering method.
As a possible embodiment of the present application, in this embodiment, the laser radar pose information of the vehicle is:
X = [L^T, W^T, V^T]^T
wherein L = (l_x, l_y, l_z)^T represents the position of the laser radar, with l_x, l_y, l_z the x-, y- and z-axis coordinates of the laser radar; W = (φ, θ, ψ)^T represents the Euler angles of the laser radar, with φ the pitch angle, θ the roll angle and ψ the heading angle; and V = (v_x, v_y, v_z)^T represents the speed of the laser radar, with v_x, v_y, v_z the x-, y- and z-axis velocities.
As one possible embodiment of the present application, in this embodiment, the current attitude information of the vehicle is:
U = [w^T, a^T]^T
wherein w = (w_x, w_y, w_z)^T represents the angular velocity of the vehicle attitude, with w_x, w_y, w_z the x-, y- and z-axis angular velocities; and a = (a_x, a_y, a_z)^T represents the acceleration of the vehicle attitude, with a_x, a_y, a_z the x-, y- and z-axis accelerations.
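For concreteness, the sketch below shows one way to lay out the state X and the input U defined above as flat numpy vectors; the ordering follows the definitions in this section, and everything else (names, example values) is illustrative rather than taken from the application.

```python
import numpy as np

def make_state(l, euler, v):
    """Stack lidar position L = (lx, ly, lz), Euler angles W = (phi, theta, psi)
    and velocity V = (vx, vy, vz) into the 9-dim state X = [L^T, W^T, V^T]^T."""
    return np.concatenate([np.asarray(l, float),
                           np.asarray(euler, float),
                           np.asarray(v, float)])

def make_input(w, a):
    """Stack angular velocity w = (wx, wy, wz) and acceleration a = (ax, ay, az)
    into the 6-dim input U = [w^T, a^T]^T."""
    return np.concatenate([np.asarray(w, float), np.asarray(a, float)])

# illustrative values only
X = make_state(l=[0.0, 0.0, 0.0], euler=[0.0, 0.0, 0.0], v=[1.0, 0.0, 0.0])
U = make_input(w=[0.0, 0.0, 0.01], a=[0.1, 0.0, 0.0])
```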
As a possible embodiment of the present application, in this embodiment, the pose state calculation equation of the vehicle is:
dX/dt = (dL/dt, dW/dt, dV/dt)^T = (V, E_w·w, R_w·a + G)^T
wherein dL/dt = V represents the vehicle pose velocity state equation, with V the vehicle velocity; dW/dt = E_w·w represents the vehicle pose angular velocity state equation, with w the vehicle pose angular velocity; R_w is the direction cosine matrix; E_w is the state transition matrix; and G is the gravity matrix, where g is the acceleration of gravity. (The explicit entries of R_w, E_w and G are printed only as images in the original publication.)
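Read literally, the equation propagates the position by the velocity, the Euler angles by E_w·w, and the velocity by R_w·a plus gravity. The sketch below is a hedged reading of that propagation: since the entries of R_w, E_w and G survive only as images, it assumes the standard ZYX (yaw-pitch-roll) direction cosine matrix, the matching Euler-rate matrix, and a gravity vector (0, 0, -g); the integration is a plain forward Euler step.

```python
import numpy as np

G_VEC = np.array([0.0, 0.0, -9.81])  # assumed gravity vector G

def rot_zyx(roll, pitch, yaw):
    """Body-to-navigation direction cosine matrix R_w (assumed ZYX convention;
    the publication prints R_w only as an image)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def euler_rate_matrix(roll, pitch):
    """E_w mapping body angular rate to Euler-angle rates (same assumed
    convention, in place of the image-only state transition matrix)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, tp = np.cos(pitch), np.tan(pitch)
    return np.array([[1.0, sr * tp, cr * tp],
                     [0.0, cr, -sr],
                     [0.0, sr / cp, cr / cp]])

def state_derivative(X, U):
    """Pose state equation: dL/dt = V, dW/dt = E_w.w, dV/dt = R_w.a + G.
    For simplicity this sketch stores W as (roll, pitch, yaw); the
    application lists the angles as (pitch, roll, heading)."""
    roll, pitch, yaw = X[3:6]
    V = X[6:9]
    w, a = U[0:3], U[3:6]
    return np.concatenate([V,
                           euler_rate_matrix(roll, pitch) @ w,
                           rot_zyx(roll, pitch, yaw) @ a + G_VEC])

def propagate(X, U, dt):
    """One forward-Euler step of the continuous-time state equation."""
    return X + dt * state_derivative(X, U)
```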
As a possible embodiment of the present application, in this embodiment, before matching the current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix, the method further includes:
and performing down-sampling processing on the point cloud data measured by the laser radar, and performing feature extraction on the point cloud data subjected to the down-sampling processing.
As a possible implementation manner of the present application, in this implementation manner, positioning the vehicle based on the pose state calculation equation by using the extended Kalman filtering method includes:
calculating the prior estimate covariance according to the formula (P^+)^(-1) = k·P^(-1) + (1-k)·C^T·R^(-1)·C;
determining the pose prediction equation, by which the prior pose estimate X^- is propagated;
determining the Kalman gain as K = P·C^T·(k·R + C^T·P·C)^(-1);
updating the pose of the laser radar from the prior pose estimate X^-, the Kalman gain K and the laser radar measurement F;
determining the updated covariance as P^+ = P - (1-k)^(-1)·K·C·P;
wherein P^+ is the prior estimate covariance matrix, P is the covariance matrix, X^- is the prior pose estimate, R is the system error matrix, C is the pose estimation information projection matrix at the initial registration of the point cloud, F is the pose information matrix of the laser radar, and k is the matching degree of the point cloud between frames, computed from each point P_j in the current frame point cloud and the nearest corresponding point P_i in the map. (The pose prediction and update formulas and the formula for k are printed only as images in the original publication.)
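The update step above can be transcribed almost directly into code. The sketch below assumes the projection matrix C is square, so that every product reads exactly as in the printed formulas, and it fills the image-only pose update with the usual EKF correction using the lidar pose F as the measurement; both are assumptions, not statements of the application.

```python
import numpy as np

def ekf_step(X_prior, P, F_meas, C, R, k):
    """One k-weighted EKF step following the printed formulas (naming follows
    the application; k is the inter-frame matching degree, 0 < k < 1)."""
    # prior estimate covariance: (P^+)^-1 = k.P^-1 + (1-k).C^T.R^-1.C
    P_plus = np.linalg.inv(k * np.linalg.inv(P)
                           + (1.0 - k) * C.T @ np.linalg.inv(R) @ C)
    # Kalman gain: K = P.C^T.(k.R + C^T.P.C)^-1
    K = P_plus @ C.T @ np.linalg.inv(k * R + C.T @ P_plus @ C)
    # pose update (assumed standard EKF correction; image-only in the original)
    X_post = X_prior + K @ (F_meas - C @ X_prior)
    # updated covariance: P^+ = P - (1-k)^-1 . K.C.P
    P_post = P_plus - (K @ C @ P_plus) / (1.0 - k)
    return X_post, P_post

# toy usage with a 9-dim state and an identity projection
rng = np.random.default_rng(0)
X0, P0 = np.zeros(9), 0.1 * np.eye(9)
F = rng.normal(0.0, 0.05, 9)                 # fake lidar pose measurement
X1, P1 = ekf_step(X0, P0, F, np.eye(9), 0.01 * np.eye(9), k=0.6)
```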
In a second aspect, a three-dimensional point cloud based positioning apparatus is provided, the apparatus comprising:
the data acquisition module is used for acquiring the integrated navigation data of the vehicle and the point cloud data measured by the laser radar;
the matrix determination module is used for analyzing the integrated navigation data to obtain current integrated navigation pose information of the vehicle, and determining a point cloud transformation matrix and a translation matrix of the vehicle in point cloud positioning based on the current integrated navigation pose information;
the position and orientation determining module is used for matching the current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix to obtain laser radar position and orientation information of the vehicle;
the equation determination module is used for analyzing the integrated navigation data to obtain current attitude information of the vehicle, combining the current attitude information with the laser radar pose information, and determining a pose state calculation equation of the vehicle;
and the positioning module is used for positioning the vehicle based on the pose state calculation equation by adopting an extended Kalman filtering method.
In a third aspect, an electronic device is provided, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the above-mentioned three-dimensional point cloud-based positioning method when executing the program.
In a fourth aspect, a computer-readable storage medium is provided, which stores at least one instruction, at least one program, a set of codes, or a set of instructions, loaded and executed by a processor to implement the above-mentioned three-dimensional point cloud-based positioning method.
According to the method and the device, the integrated navigation data of the vehicle and the point cloud data measured by the laser radar are respectively acquired and processed; the current pose information of the vehicle is determined based on the integrated navigation data, and from it the transformation matrix and the translation matrix used in point cloud positioning; the laser radar pose information of the vehicle is determined based on the transformation matrix and the translation matrix; the pose state calculation equation of the vehicle is then determined based on the laser radar pose information and the attitude information of the vehicle; and the vehicle is positioned based on the pose state calculation equation by the extended Kalman filtering method. This is more consistent with the actual motion pattern of the vehicle and ensures the accuracy of vehicle positioning.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic flowchart of a three-dimensional point cloud-based positioning method according to an embodiment of the present disclosure;
fig. 2 is a block flow diagram of a positioning method according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of a three-dimensional point cloud-based positioning apparatus according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
The above and other features, advantages and aspects of various embodiments of the present application will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of stated features, integers, steps, operations, elements and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The application provides a three-dimensional point cloud-based positioning method, a three-dimensional point cloud-based positioning device, an electronic device and a computer-readable storage medium, which aim to solve the above technical problems in the prior art.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
The embodiment of the application provides a positioning method based on three-dimensional point cloud, and as shown in fig. 1, the method comprises the following steps:
step S101, acquiring integrated navigation data of a vehicle and point cloud data measured by a laser radar;
step S102, analyzing the integrated navigation data to obtain current integrated navigation pose information of the vehicle, and determining a point cloud transformation matrix and a translation matrix of the vehicle in point cloud positioning based on the current integrated navigation pose information;
step S103, matching the current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix to obtain laser radar pose information of the vehicle;
step S104, analyzing the integrated navigation data to obtain current attitude information of the vehicle, combining the current attitude information with the laser radar pose information, and determining a pose state calculation equation of the vehicle;
and S105, positioning the vehicle based on the pose state calculation equation by adopting an extended Kalman filtering method.
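Taken together, steps S101 to S105 amount to the loop sketched below. It reuses propagate and ekf_step from the earlier sketches and pose_to_rt from the sketch given a few paragraphs further on; parse_nav and match_to_map are hypothetical placeholders for the operations the steps describe, not APIs defined by the application.

```python
from typing import Iterator, Tuple
import numpy as np

def parse_nav(nav_msg) -> Tuple[np.ndarray, np.ndarray]:
    """S102/S104 placeholder: split one integrated-navigation message into the
    pose Z0 = (x0, y0, z0, yaw0, pitch0, roll0) and the input U = [w, a]."""
    raise NotImplementedError

def match_to_map(cloud, map_cloud, R0, t0) -> Tuple[np.ndarray, float]:
    """S103 placeholder: register the current frame against the prebuilt map
    starting from (R0, t0); return the lidar pose measurement F and the
    inter-frame matching degree k."""
    raise NotImplementedError

def localize(nav_msgs, clouds, map_cloud) -> Iterator[np.ndarray]:
    """Sketch of the S101-S105 loop."""
    X = np.zeros(9)         # state [L, W, V] as defined above
    P = 0.1 * np.eye(9)     # illustrative initial covariance
    C = np.eye(9)           # illustrative projection matrix
    R = 0.01 * np.eye(9)    # illustrative system error matrix
    for nav_msg, cloud in zip(nav_msgs, clouds):            # S101
        Z0, U = parse_nav(nav_msg)                          # S102
        R0, t0 = pose_to_rt(Z0)                             # S102
        F_meas, k = match_to_map(cloud, map_cloud, R0, t0)  # S103
        X = propagate(X, U, dt=0.1)                         # S104 prediction
        X, P = ekf_step(X, P, F_meas, C, R, k)              # S105 EKF update
        yield X
```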
In the embodiment of the application, the positioning method based on three-dimensional point cloud is suitable for the technical field of automatic driving or navigation. In positioning based on a high-precision map, the non-real-time nature of the map and sensor noise mean that the data acquired in real time are inconsistent with the map. This problem often degrades the accuracy of map-matching-based localization algorithms, and can even cause the localization to fall into a local optimum or fail to converge. To address it, many positioning methods that fuse different sensors have been proposed, mostly based on KF (Kalman filtering). However, a real vehicle in motion is not a linear system; moreover, because the point cloud is noisy, classical registration algorithms such as ICP or NDT accumulate error, so the point cloud positioning fails to converge and the final localization is inaccurate.
For convenience of illustration, take a specific embodiment as an example. A laser radar and a GPS integrated navigation system are installed on an autonomous vehicle or a vehicle with an autonomous navigation function. The onboard sensors receive the integrated navigation data and the point cloud data measured by the laser radar and process them separately. The integrated navigation data are analyzed to obtain the pose information of the vehicle at the current time, Z_0 = (x_0, y_0, z_0, yaw_0, pitch_0, roll_0), which serves as the initial positioning pose of the vehicle. From this pose information, the point cloud transformation (rotation) matrix R_0 and the translation matrix t_0 of the vehicle in point cloud positioning are calculated for the point cloud data measured by the laser radar. Then, based on the transformation matrix and the translation matrix, the current frame point cloud is matched with a pre-established map to determine the laser radar pose information of the vehicle: X = [L^T, W^T, V^T]^T, wherein L = (l_x, l_y, l_z)^T represents the position of the laser radar, W = (φ, θ, ψ)^T represents its Euler angles (pitch, roll and heading), and V = (v_x, v_y, v_z)^T represents its speed.
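The conversion from Z_0 to (R_0, t_0) is not printed in the publication; a plausible sketch, assuming a ZYX (yaw, pitch, roll) Euler convention:

```python
import numpy as np

def pose_to_rt(Z0):
    """Turn Z0 = (x0, y0, z0, yaw0, pitch0, roll0) into the initial
    registration rotation R0 and translation t0 (ZYX convention assumed)."""
    x, y, z, yaw, pitch, roll = Z0
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx, np.array([x, y, z])
```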
The integrated navigation data are then analyzed to obtain the current attitude information of the vehicle, which is combined with the laser radar pose information to determine the pose state calculation equation of the vehicle:
dX/dt = (dL/dt, dW/dt, dV/dt)^T = (V, E_w·w, R_w·a + G)^T
wherein R_w is the direction cosine matrix, E_w is the state transition matrix, and G is the gravity matrix.
Positioning the vehicle based on the pose state calculation equation by the extended Kalman filtering method comprises the following steps: calculating the prior estimate covariance according to the formula (P^+)^(-1) = k·P^(-1) + (1-k)·C^T·R^(-1)·C; determining the pose prediction equation; determining the Kalman gain as K = P·C^T·(k·R + C^T·P·C)^(-1); updating the pose of the laser radar; and determining the updated covariance as P^+ = P - (1-k)^(-1)·K·C·P; wherein P is the prior estimate covariance matrix, R is the system error matrix, C is the pose estimation information projection matrix at the initial registration of the point cloud, F is the pose information matrix of the laser radar, and k is the matching degree of the point cloud between frames, computed from each point P_j in the current frame point cloud and the nearest corresponding point P_i in the map.
According to the method and the device, the point cloud data measured by the laser radar and the point cloud data measured by the combined navigation data are respectively obtained and processed, the current pose information of the vehicle is determined based on the combined navigation data, the transformation matrix and the translation matrix in the point cloud positioning are determined, the laser radar pose information of the vehicle is determined based on the transformation matrix and the translation matrix, the attitude calculation equation of the vehicle is determined based on the laser radar pose information and the attitude information of the vehicle, the vehicle is positioned based on the pose state calculation equation by adopting an extended Kalman filtering method, the form rule of the vehicle is better met, and the accuracy of the vehicle positioning is ensured.
As a possible embodiment of the present application, in this embodiment, the current attitude information of the vehicle, obtained by analyzing the integrated navigation data, is:
U = [w^T, a^T]^T
wherein w = (w_x, w_y, w_z)^T represents the angular velocity of the vehicle attitude, and a = (a_x, a_y, a_z)^T represents the acceleration of the vehicle attitude.
As a possible embodiment of the present application, in this embodiment, before matching the current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix, the method further includes:
and performing down-sampling processing on the point cloud data measured by the laser radar, and performing feature extraction on the point cloud data subjected to the down-sampling processing.
In the embodiment of the application, the vehicle acquires the point cloud data measured by the laser radar and the integrated navigation data, and the two sets of data are processed differently. The acquired point cloud data are down-sampled; the purpose of down-sampling is to filter out points that are too far from the laser radar and points too close to its center, so as to improve the accuracy and efficiency of registration. Feature points are then extracted from the point cloud data: the points of one scan are classified by curvature value, points whose curvature is greater than the feature-point threshold being edge points and points whose curvature is smaller than the threshold being plane points. In order to distribute the feature points evenly in the environment, one scan is divided into 4 independent sub-regions, each providing at most 2 edge points and 4 plane points; in addition, unstable feature points are excluded. Meanwhile, the acquired integrated navigation data are analyzed to obtain GPS data, on which a UTM transformation is performed, i.e. a conversion from longitude and latitude to local coordinates; the result of this transformation is used to calculate the rotation matrix R and translation matrix t of the initial point cloud registration. Once the current frame point cloud is matched with the pre-constructed map, the laser radar pose measurement of the current vehicle can be calculated, and the GPS data are no longer needed.
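A hedged sketch of the curvature split just described, in the spirit of LOAM-style feature extraction; the smoothness formula, window size and thresholds are illustrative assumptions, and the UTM step is noted at the end via the third-party utm package.

```python
import numpy as np

def classify_scan_line(line_pts, edge_thresh=0.1, n_regions=4,
                       max_edge=2, max_plane=4, half_window=5):
    """Split one scan line into edge and plane feature points by curvature.
    A LOAM-style smoothness value is computed from each point's neighbours;
    the line is cut into n_regions sub-regions, each contributing at most
    max_edge edge points and max_plane plane points."""
    pts = np.asarray(line_pts, float)
    n = len(pts)
    curv = np.full(n, np.nan)
    for i in range(half_window, n - half_window):
        nbrs = np.vstack([pts[i - half_window:i], pts[i + 1:i + 1 + half_window]])
        diff = (nbrs - pts[i]).sum(axis=0)
        curv[i] = np.linalg.norm(diff) / max(np.linalg.norm(pts[i]), 1e-9)
    edges, planes = [], []
    valid = np.arange(half_window, n - half_window)
    for region in np.array_split(valid, n_regions):
        order = region[np.argsort(curv[region])]      # ascending curvature
        planes += [j for j in order[:max_plane] if curv[j] < edge_thresh]
        edges += [j for j in order[::-1][:max_edge] if curv[j] > edge_thresh]
    return pts[edges], pts[planes]

# The UTM step (longitude/latitude to local coordinates) can be done with the
# third-party `utm` package:
#   import utm
#   easting, northing, zone, letter = utm.from_latlon(lat, lon)
```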
In the embodiment of the present application, as shown in fig. 2, after the point cloud data measured by the laser radar and the integrated navigation data have been processed, EKF fusion is performed: an extended Kalman filter (EKF) state model and observation model are established. The laser radar pose information of the vehicle is X = [L^T, W^T, V^T]^T, wherein L = (l_x, l_y, l_z)^T represents the position of the laser radar, W = (φ, θ, ψ)^T represents the Euler angles of the laser radar (pitch, roll and heading), and V = (v_x, v_y, v_z)^T represents the speed of the vehicle. After the raw integrated navigation data are analyzed, the attitude information U = [w^T, a^T]^T is obtained, wherein w = (w_x, w_y, w_z)^T represents the angular velocity of the laser radar and a = (a_x, a_y, a_z)^T represents its acceleration. The vehicle pose estimation state equation can then be established:
dX/dt = (dL/dt, dW/dt, dV/dt)^T = (V, E_w·w, R_w·a + G)^T
wherein E_w is the state transition matrix and R_w is the direction cosine matrix; the x-axis directions of the integrated navigation and of the laser radar navigation are kept consistent while the vehicle moves, and the corresponding matrices E_w, R_w and G are printed only as images in the original publication. Finally, the prediction and update of the EKF are carried out. Prior estimate covariance: (P^+)^(-1) = k·P^(-1) + (1-k)·C^T·R^(-1)·C. Laser radar pose prediction equation: the prior pose estimate X^- is propagated through the state equation. Kalman gain: K = P·C^T·(k·R + C^T·P·C)^(-1). Laser radar pose update: the pose is corrected from X^-, the gain K and the measurement F. Updated posterior estimate covariance: P^+ = P - (1-k)^(-1)·K·C·P; wherein P is the prior estimate covariance matrix, R is the system error matrix, C is the pose estimation information projection matrix at the initial registration of the point cloud, F is the pose information matrix output by the laser radar, and k is the matching degree of the point cloud between frames, with P_j representing a point in the current frame point cloud and P_i representing the nearest corresponding point in the map. The optimal estimate of the pose measured by the laser radar is finally obtained from the EKF, and the positioning information of the current vehicle is obtained through coordinate transformation.
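The formula for the matching degree k survives only as an image, so the sketch below simply squashes the mean nearest-neighbour distance between each current-frame point P_j and its closest map point P_i into (0, 1]; the exponential form and the scale constant are assumptions, not the application's definition.

```python
import numpy as np
from scipy.spatial import cKDTree

def matching_degree(frame_pts, map_pts, scale=1.0):
    """Inter-frame matching degree k in (0, 1]: 1 means a perfect match.
    d[j] is the distance from current-frame point P_j to its nearest map
    point P_i; `scale` is an illustrative tuning constant."""
    d, _ = cKDTree(np.asarray(map_pts, float)).query(np.asarray(frame_pts, float))
    return float(np.exp(-d.mean() / scale))
```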
According to the method and the device, the integrated navigation data of the vehicle and the point cloud data measured by the laser radar are respectively acquired and processed; the current pose information of the vehicle is determined based on the integrated navigation data, and from it the transformation matrix and the translation matrix used in point cloud positioning; the laser radar pose information of the vehicle is determined based on the transformation matrix and the translation matrix; the pose state calculation equation of the vehicle is then determined based on the laser radar pose information and the attitude information of the vehicle; and the vehicle is positioned based on the pose state calculation equation by the extended Kalman filtering method. This is more consistent with the actual motion pattern of the vehicle and ensures the accuracy of vehicle positioning.
The embodiment of the present application further provides a positioning apparatus based on three-dimensional point cloud. As shown in fig. 3, the three-dimensional point cloud-based positioning apparatus 30 may include: a data acquisition module 301, a matrix determination module 302, a pose determination module 303, an equation determination module 304 and a positioning module 305, wherein,
the data acquisition module 301 is used for acquiring the integrated navigation data of the vehicle and the point cloud data measured by the laser radar;
a matrix determining module 302, configured to analyze the integrated navigation data to obtain current integrated navigation pose information of the vehicle, and determine a point cloud transformation matrix and a translation matrix in point cloud positioning of the vehicle based on the current integrated navigation pose information;
a pose determining module 303, configured to match a current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix to obtain laser radar pose information of the vehicle;
an equation determining module 304, configured to analyze the integrated navigation data to obtain current attitude information of the vehicle, and combine the current attitude information with the laser radar pose information to determine a pose state calculation equation of the vehicle;
a positioning module 305, configured to position the vehicle based on the pose state calculation equation by using an extended kalman filter method.
As one possible embodiment of the present application, in this embodiment, the current attitude information of the vehicle is:
U = [w^T, a^T]^T
wherein w = (w_x, w_y, w_z)^T represents the angular velocity of the vehicle attitude, and a = (a_x, a_y, a_z)^T represents the acceleration of the vehicle attitude.
As a possible embodiment of the present application, in this embodiment, the pose state calculation equation of the vehicle is:
dX/dt = (dL/dt, dW/dt, dV/dt)^T = (V, E_w·w, R_w·a + G)^T
wherein R_w is the direction cosine matrix, E_w is the state transition matrix, and G is the gravity matrix.
As a possible embodiment of the present application, in this embodiment, before matching the current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix, the method further includes:
and performing down-sampling processing on the point cloud data measured by the laser radar, and performing feature extraction on the point cloud data subjected to the down-sampling processing.
As a possible implementation manner of the present application, in this implementation manner, positioning the vehicle based on the pose state calculation equation by using the extended Kalman filtering method includes:
calculating the prior estimate covariance according to the formula (P^+)^(-1) = k·P^(-1) + (1-k)·C^T·R^(-1)·C;
determining the pose prediction equation;
determining the Kalman gain as K = P·C^T·(k·R + C^T·P·C)^(-1);
updating the pose of the laser radar;
determining the updated covariance as P^+ = P - (1-k)^(-1)·K·C·P;
wherein P is the prior estimate covariance matrix, R is the system error matrix, C is the pose estimation information projection matrix at the initial registration of the point cloud, F is the pose information matrix of the laser radar, and k is the matching degree of the point cloud between frames, with P_j representing a point in the current frame point cloud and P_i representing the nearest corresponding point in the map.
According to the method and the device, the integrated navigation data of the vehicle and the point cloud data measured by the laser radar are respectively acquired and processed; the current pose information of the vehicle is determined based on the integrated navigation data, and from it the transformation matrix and the translation matrix used in point cloud positioning; the laser radar pose information of the vehicle is determined based on the transformation matrix and the translation matrix; the pose state calculation equation of the vehicle is then determined based on the laser radar pose information and the attitude information of the vehicle; and the vehicle is positioned based on the pose state calculation equation by the extended Kalman filtering method. This is more consistent with the actual motion pattern of the vehicle and ensures the accuracy of vehicle positioning.
An embodiment of the present application provides an electronic device, including a memory and a processor, the memory storing at least one program which, when executed by the processor, performs the following: acquiring integrated navigation data of a vehicle and point cloud data measured by a laser radar; analyzing the integrated navigation data to obtain current integrated navigation pose information of the vehicle, and determining a point cloud transformation matrix and a translation matrix of the vehicle in point cloud positioning based on the current integrated navigation pose information; matching the current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix to obtain laser radar pose information of the vehicle; analyzing the integrated navigation data to obtain current attitude information of the vehicle, combining the current attitude information with the laser radar pose information, and determining a pose state calculation equation of the vehicle; and positioning the vehicle based on the pose state calculation equation by the extended Kalman filtering method.
Compared with the prior art, the present application can achieve the following: the integrated navigation data of the vehicle and the point cloud data measured by the laser radar are respectively acquired and processed; the current pose information of the vehicle is determined based on the integrated navigation data, and from it the transformation matrix and the translation matrix used in point cloud positioning; the laser radar pose information of the vehicle is determined based on the transformation matrix and the translation matrix; the pose state calculation equation of the vehicle is then determined based on the laser radar pose information and the attitude information of the vehicle; and the vehicle is positioned based on the pose state calculation equation by the extended Kalman filtering method. This is more consistent with the actual motion pattern of the vehicle and ensures the accuracy of vehicle positioning.
In an alternative embodiment, an electronic device is provided, as shown in fig. 4, the electronic device 4000 shown in fig. 4 comprising: a processor 4001 and a memory 4003. Processor 4001 is coupled to memory 4003, such as via bus 4002. Optionally, the electronic device 4000 may further comprise a transceiver 4004. In addition, the transceiver 4004 is not limited to one in practical applications, and the structure of the electronic device 4000 is not limited to the embodiment of the present application.
The Processor 4001 may be a CPU (Central Processing Unit), a general-purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other Programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor 4001 may also be a combination that performs a computational function, including, for example, a combination of one or more microprocessors, a combination of a DSP and a microprocessor, or the like.
Bus 4002 may include a path that carries information between the aforementioned components. The bus 4002 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 4002 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 4, but this does not indicate only one bus or one type of bus.
The Memory 4003 may be a ROM (Read Only Memory) or other types of static storage devices that can store static information and instructions, a RAM (Random Access Memory) or other types of dynamic storage devices that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), a magnetic Disc storage medium or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these.
The memory 4003 is used for storing application codes for executing the scheme of the present application, and the execution is controlled by the processor 4001. Processor 4001 is configured to execute application code stored in memory 4003 to implement what is shown in the foregoing method embodiments.
The present application provides a computer-readable storage medium on which a computer program is stored; when run on a computer, the program enables the computer to execute the corresponding content of the foregoing method embodiments. Compared with the prior art, the integrated navigation data of the vehicle and the point cloud data measured by the laser radar are respectively acquired and processed; the current pose information of the vehicle is determined based on the integrated navigation data, and from it the transformation matrix and the translation matrix used in point cloud positioning; the laser radar pose information of the vehicle is determined based on the transformation matrix and the translation matrix; the pose state calculation equation of the vehicle is then determined based on the laser radar pose information and the attitude information of the vehicle; and the vehicle is positioned based on the pose state calculation equation by the extended Kalman filtering method, which is more consistent with the actual motion pattern of the vehicle and ensures the accuracy of vehicle positioning.
It should be understood that, although the steps in the flowcharts of the figures are shown in the order indicated by the arrows, they are not necessarily performed in that order; unless explicitly stated herein, there is no strict restriction on their order, and they may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different moments, and not necessarily in sequence: they may be performed in turn or alternately with other steps, or with at least a portion of the sub-steps or stages of other steps.
The foregoing is only a partial embodiment of the present application. It should be noted that those skilled in the art can make several improvements and refinements without departing from the principle of the present application, and these improvements and refinements should also be regarded as falling within the protection scope of the present application.

Claims (9)

1. A positioning method based on three-dimensional point cloud is characterized by comprising the following steps:
acquiring integrated navigation data of a vehicle and point cloud data measured by a laser radar;
analyzing the integrated navigation data to obtain current integrated navigation pose information of the vehicle, and determining a point cloud transformation matrix and a translation matrix of the vehicle in point cloud positioning based on the current integrated navigation pose information;
matching the current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix to obtain laser radar pose information of the vehicle;
analyzing the integrated navigation data to obtain current attitude information of the vehicle, combining the current attitude information with the laser radar pose information, and determining a pose state calculation equation of the vehicle;
and positioning the vehicle based on the pose state calculation equation by adopting an extended Kalman filtering method.
2. The three-dimensional point cloud-based positioning method according to claim 1, wherein the laser radar pose information of the vehicle is:
X = [L^T, W^T, V^T]^T
wherein L = (l_x, l_y, l_z)^T represents the position of the laser radar, with l_x, l_y, l_z the x-, y- and z-axis coordinates of the laser radar; W = (φ, θ, ψ)^T represents the Euler angles of the laser radar, with φ the pitch angle, θ the roll angle and ψ the heading angle; and V = (v_x, v_y, v_z)^T represents the speed of the laser radar, with v_x, v_y, v_z the x-, y- and z-axis velocities.
3. The three-dimensional point cloud-based positioning method according to claim 1, wherein the current attitude information of the vehicle is:
U = [w^T, a^T]^T
wherein w = (w_x, w_y, w_z)^T represents the angular velocity of the vehicle attitude, with w_x, w_y, w_z the x-, y- and z-axis angular velocities; and a = (a_x, a_y, a_z)^T represents the acceleration of the vehicle attitude, with a_x, a_y, a_z the x-, y- and z-axis accelerations.
4. The three-dimensional point cloud-based positioning method according to claim 1, wherein the pose state calculation equation of the vehicle is:
dX/dt = (dL/dt, dW/dt, dV/dt)^T = (V, E_w·w, R_w·a + G)^T
wherein dL/dt = V represents the vehicle pose velocity state equation, with V the vehicle velocity; dW/dt = E_w·w represents the vehicle pose angular velocity state equation, with w the vehicle pose angular velocity; R_w is the direction cosine matrix; E_w is the state transition matrix; and G is the gravity matrix, where g is the acceleration of gravity.
5. The three-dimensional point cloud-based positioning method according to claim 1, wherein before matching the current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix, the method further comprises:
and performing down-sampling processing on the point cloud data measured by the laser radar, and performing feature extraction on the point cloud data subjected to the down-sampling processing.
6. The three-dimensional point cloud-based positioning method according to claim 1, wherein positioning the vehicle based on the pose state calculation equation by using the extended Kalman filtering method comprises:
calculating the prior estimate covariance according to the formula (P^+)^(-1) = k·P^(-1) + (1-k)·C^T·R^(-1)·C;
determining the pose prediction equation;
determining the Kalman gain as K = P·C^T·(k·R + C^T·P·C)^(-1);
updating the pose of the laser radar;
determining the updated covariance as P^+ = P - (1-k)^(-1)·K·C·P;
wherein P^+ is the prior estimate covariance matrix, P is the covariance matrix, R is the system error matrix, C is the pose estimation information projection matrix at the initial registration of the point cloud, F is the pose information matrix of the laser radar, and k is the matching degree of the point cloud between frames, wherein P_j represents a point in the current frame point cloud and P_i represents the nearest corresponding point in the map to P_j.
7. A positioning device based on three-dimensional point cloud is characterized by comprising:
the data acquisition module is used for acquiring the integrated navigation data of the vehicle and the point cloud data measured by the laser radar;
the matrix determination module is used for analyzing the integrated navigation data to obtain current integrated navigation pose information of the vehicle, and determining a point cloud transformation matrix and a translation matrix of the vehicle in point cloud positioning based on the current integrated navigation pose information;
the position and orientation determining module is used for matching the current frame point cloud of the vehicle with a pre-established map based on the transformation matrix and the translation matrix to obtain laser radar position and orientation information of the vehicle;
the equation determination module is used for analyzing the integrated navigation data to obtain current attitude information of the vehicle, combining the current attitude information with the laser radar pose information, and determining a pose state calculation equation of the vehicle;
and the positioning module is used for positioning the vehicle based on the pose state calculation equation by adopting an extended Kalman filtering method.
8. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the three-dimensional point cloud based localization method of any one of claims 1 to 6 when executing the program.
9. A computer-readable storage medium, characterized in that the storage medium stores at least one instruction, at least one program, a set of codes, or a set of instructions that is loaded and executed by a processor to implement the three-dimensional point cloud based localization method according to any of claims 1-6.
CN202110687387.2A 2021-06-21 2021-06-21 Positioning method, device and equipment based on three-dimensional point cloud and storage medium Pending CN113538699A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110687387.2A CN113538699A (en) 2021-06-21 2021-06-21 Positioning method, device and equipment based on three-dimensional point cloud and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110687387.2A CN113538699A (en) 2021-06-21 2021-06-21 Positioning method, device and equipment based on three-dimensional point cloud and storage medium

Publications (1)

Publication Number Publication Date
CN113538699A 2021-10-22

Family

ID=78125442

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110687387.2A Pending CN113538699A (en) 2021-06-21 2021-06-21 Positioning method, device and equipment based on three-dimensional point cloud and storage medium

Country Status (1)

Country Link
CN (1) CN113538699A (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108320329A (en) * 2018-02-02 2018-07-24 维坤智能科技(上海)有限公司 A kind of 3D map creating methods based on 3D laser
CN111161353A (en) * 2019-12-31 2020-05-15 深圳一清创新科技有限公司 Vehicle positioning method and device, readable storage medium and computer equipment
CN111915675A (en) * 2020-06-17 2020-11-10 广西综合交通大数据研究院 Particle filter point cloud positioning method based on particle drift, and device and system thereof

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YAN, Xiaoyi: "Research on Navigation and Positioning Technology for Mobile Robots Based on LiDAR/IMU Fusion" (基于LiDAR/IMU融合的移动机器人导航定位技术研究), China Master's Theses Full-text Database, Information Science and Technology, pages 16-47 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114429515A (en) * 2021-12-21 2022-05-03 北京地平线机器人技术研发有限公司 Point cloud map construction method, device and equipment
WO2024001649A1 (en) * 2022-06-29 2024-01-04 深圳市海柔创新科技有限公司 Robot positioning method, apparatus and computing readable storage medium
CN118226401A (en) * 2024-05-22 2024-06-21 南京航空航天大学 Fixed wing aircraft pose measurement method and device based on laser radar three-dimensional point cloud


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination