CN118311561A - Fusion positioning method of laser radar and millimeter wave radar and related equipment - Google Patents

Fusion positioning method of laser radar and millimeter wave radar and related equipment

Info

Publication number
CN118311561A
CN118311561A
Authority
CN
China
Prior art keywords
millimeter wave
laser radar
point cloud
radar
Prior art date
Legal status
Pending
Application number
CN202410432672.3A
Other languages
Chinese (zh)
Inventor
姜梦馨
程宇威
王培栋
Current Assignee
Shaanxi Orca Electronic Intelligent Technology Co ltd
Original Assignee
Shaanxi Orca Electronic Intelligent Technology Co ltd
Priority date
Application filed by Shaanxi Orca Electronic Intelligent Technology Co ltd filed Critical Shaanxi Orca Electronic Intelligent Technology Co ltd
Priority to CN202410432672.3A
Publication of CN118311561A

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a fusion positioning method of a laser radar and a millimeter wave radar and related equipment. The method comprises the following steps: constructing a millimeter wave radar point cloud velocity estimation model; estimating the platform velocity and screening the millimeter wave radar point cloud information of dynamic targets from the millimeter wave radar point cloud through the model; filtering dynamic target points out of the laser radar point cloud using the millimeter wave radar point cloud information of the dynamic targets; extracting and matching laser radar feature points on the remaining laser radar point cloud to obtain the pose variation; and comparing the inter-frame displacement based on the millimeter wave radar velocity estimate with the inter-frame displacement based on laser radar point cloud matching, judging the degradation of the laser radar odometry, and outputting the final fusion positioning result based on the degradation. The beneficial effect of the invention is that positioning stability can be improved in scenes with moving targets, scenes where the laser radar point cloud degenerates, and scenes with many rotations.

Description

Fusion positioning method of laser radar and millimeter wave radar and related equipment
Technical Field
The invention relates to the technical field of radar fusion, in particular to a fusion positioning method of a laser radar and a millimeter wave radar and related equipment.
Background
In recent years, the rapid development of robotics and autonomous driving has raised ever higher requirements on stable and robust positioning systems: mapping and planning tasks depend on accurate positioning results, which allow an autonomous platform to carry out its tasks effectively. However, conventional satellite positioning cannot cover all scenes, especially indoor scenes or occluded outdoor scenes. Positioning technologies based on other exteroceptive sensors, such as the laser radar and the millimeter wave radar, have therefore attracted increasing attention from researchers in the field. A laser radar point cloud captures accurate three-dimensional structural information of the external environment, and in environments with distinct structural features, matching laser radar point clouds yields good positioning results. The millimeter wave radar is low in cost and small in volume and has been widely adopted in autonomous driving in recent years; it can measure the Doppler velocity of targets, from which the platform velocity can be inferred, so it is also commonly used in positioning tasks.
However, relying on either of these two sensors alone has drawbacks. For a positioning algorithm based on the laser radar, positioning often fails in scenes with insufficient structure, such as long corridors and tunnels. For a positioning algorithm based on the millimeter wave radar, the structural information of its point cloud is weak: although the platform velocity can be estimated from the Doppler velocities of the points, a good angle estimate cannot be obtained, so without assistance from other sensors the estimation results are poor in scenes with many rotations.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to address the inaccurate fusion positioning of the laser radar and the millimeter wave radar.
In order to solve the above technical problem, the invention adopts the following technical scheme: a fusion positioning method of a laser radar and a millimeter wave radar, comprising the following steps:
constructing a millimeter wave radar point cloud velocity estimation model;
estimating the platform velocity and screening the millimeter wave radar point cloud information of dynamic targets from the millimeter wave radar point cloud through the velocity estimation model;
filtering dynamic target points out of the laser radar point cloud using the millimeter wave radar point cloud information of the dynamic targets;
extracting and matching laser radar feature points on the remaining laser radar point cloud to obtain the pose variation;
comparing the inter-frame displacement based on the millimeter wave radar velocity estimate with the inter-frame displacement based on laser radar point cloud matching, and judging the degradation of the laser radar odometry;
and outputting the final fusion positioning result based on the degradation of the laser radar odometry.
Further, the constructing of the millimeter wave radar point cloud velocity estimation model includes:
representing the millimeter wave radar point cloud at time $t$ as $P_t=\{p_i\}_{i=1}^{m_t}$, where $p_i$ denotes the $i$-th point in $P_t$ and $p_i=\{r_i,\theta_i,v_i\}$, with $r_i$, $\theta_i$ and $v_i$ the distance of the point from the origin, the horizontal azimuth of the point and the Doppler velocity value of the point, respectively, and $m_t$ the total number of points in $P_t$;
if the millimeter wave radar is mounted at the middle of the platform with the radar y-axis pointing forward and the x-axis pointing right, the platform velocity consists of the velocity along the forward direction of the platform, $v_y^t$, and the velocity perpendicular to the forward direction, toward the right side of the platform, $v_x^t$; if all targets in the scene are static, the relationship between the Doppler velocity of a millimeter wave radar point and the platform velocity can be expressed as
$$v_i = -\left(v_x^t \sin\theta_i + v_y^t \cos\theta_i\right),$$
where $v_i$ and $\theta_i$ are known quantities and $(v_x^t, v_y^t)$ is the quantity to be solved.
Further, performing the platform velocity estimation based on the millimeter wave radar point cloud through the millimeter wave radar point cloud velocity estimation model includes:
if $t=0$, i.e. $t$ is the starting time, assuming there is no dynamic target in the scene, $(v_x^0, v_y^0)$ is obtained by least-squares fitting;
if $t>0$, the velocity estimated at the previous time $t-1$, $(v_x^{t-1}, v_y^{t-1})$, is used as a prior to filter out dynamic target points that may exist in the millimeter wave radar point cloud at time $t$: the velocity estimated at time $t-1$ is substituted into the model to compute the expected Doppler velocity of each point, $\hat v_i = -\left(v_x^{t-1}\sin\theta_i + v_y^{t-1}\cos\theta_i\right)$;
the expected Doppler velocity is compared with the actual Doppler velocity of each point; if the difference exceeds a set threshold $v^{Thre}$, i.e. $\left|\hat v_i - v_i\right| > v^{Thre}$, the point $p_i$ is regarded as a dynamic target point and is pre-filtered out, and the pre-filtered point set is recorded as $\tilde P_t$;
for the points in the pre-filtered set $\tilde P_t$, $(v_x^t, v_y^t)$ is obtained by iteratively reweighted least-squares fitting.
Further, the screening of the millimeter wave radar point cloud information of dynamic targets includes:
based on the platform velocity estimate $(v_x^t, v_y^t)$, calculating the theoretical Doppler velocity of each point, $\bar v_i = -\left(v_x^t\sin\theta_i + v_y^t\cos\theta_i\right)$;
extracting the millimeter wave radar points that may belong to dynamic targets according to the difference between the theoretical and the actual Doppler velocity: if $\left|\bar v_i - v_i\right| > v_{Doppler}^{Thre}$, then $p_i$ is regarded as a dynamic target point, where $v_{Doppler}^{Thre}$ is the velocity-difference threshold; the set of all dynamic target points in $P_t$ is recorded as $P_t^{dyn}$;
clustering all points in $P_t^{dyn}$ with the density-based clustering algorithm DBSCAN to obtain $n_t$ dynamic-target millimeter wave radar point cloud clusters, recorded as $\{c_1^t, \ldots, c_{n_t}^t\}$.
Further, filtering the dynamic target points out of the laser radar point cloud using the millimeter wave radar point cloud information of the dynamic targets includes:
according to the computed dynamic-target millimeter wave radar point cloud clusters, calculating the enclosing boundary of each cluster: for the $i$-th cluster $c_i^t$, its 3-dimensional enclosing boundary is recorded as the minimum and maximum x-axis values $x_{min}^i$ and $x_{max}^i$, the minimum and maximum y-axis values $y_{min}^i$ and $y_{max}^i$, and the minimum and maximum z-axis values $z_{min}^i$ and $z_{max}^i$ over all points in $c_i^t$;
for the frame of laser radar point cloud nearest to time $t$, $L_t=\{q_j\}$ with $q_j=(x_L, y_L, z_L)$ the coordinates of the point on the x, y and z axes, the laser radar points lying inside the 3-dimensional enclosing boundary of any dynamic target extracted by the millimeter wave radar are filtered out, giving the point cloud set $\tilde L_t$.
Further, extracting and matching the laser radar feature points based on the remaining laser radar point cloud to obtain the pose variation includes:
using a 4×4 homogeneous matrix $T_i=\begin{bmatrix} R_i & t_i \\ 0 & 1 \end{bmatrix}$ to represent the pose at time $i$, where $R_i$ is a 3×3 rotation matrix and $t_i$ is a 3×1 displacement vector; since the pose-estimation object performs only two-dimensional planar motion, $R_i(1,3)=R_i(2,3)=R_i(3,1)=R_i(3,2)=0$, $R_i(3,3)=1$, and $t_i(3,1)=0$;
if $t=0$, i.e. the initial time, the pose at the initial time is set to $T_0$ with $R_0=I_{3\times 3}$ and $t_0=[0,0,0]^T$, i.e. the position of the millimeter wave radar coordinate system at the initial time is taken as the coordinate origin in which the pose at each time is expressed;
if $t>0$, feature extraction and matching between $\tilde L_{t-1}$ and $\tilde L_t$ are performed with the feature-extraction and inter-frame point cloud matching scheme of the laser radar odometry method LOAM, giving the pose transformation of time $t$ relative to time $t-1$, $\Delta T_t$, with relative rotation $\Delta R_t$ and relative displacement $\Delta t_t$.
Further, comparing the inter-frame displacement based on the millimeter wave radar velocity estimate with the inter-frame displacement based on laser radar point cloud matching to judge the degradation of the laser radar odometry includes:
calculating the inter-frame displacement based on the millimeter wave radar velocity estimate: if the time difference between time $t-1$ and time $t$ is $\Delta t$, then $\Delta d_t^R = \Delta t\,(v_x^t, v_y^t)$;
the relative displacement obtained by laser radar point cloud matching is $\Delta t_t$ and the inter-frame displacement calculated from the millimeter wave radar velocity estimate is $\Delta d_t^R$; the ratio of the magnitudes of the two displacement estimates, $s_t=\left|\Delta t_t\right| / \left|\Delta d_t^R\right|$, is calculated, where $|\cdot|$ denotes the displacement length; if $s_{minThre} < s_t < s_{maxThre}$, the laser radar point cloud matching is considered not degraded, where $s_{minThre}$ and $s_{maxThre}$ are the set lower and upper thresholds, respectively.
Further, outputting the final fusion positioning result based on the degradation of the laser radar odometry includes:
if the laser radar point cloud matching is judged not degraded, the pose transformation between time $t-1$ and time $t$ is taken to be that obtained by laser radar point cloud matching, $\Delta T_t$; with the pose at the previous time $T_{t-1}$, the pose at the current time is $T_t=T_{t-1}\,\Delta T_t$;
if the laser radar point cloud matching is judged degraded, the pose transformation between time $t-1$ and time $t$ is recalculated from the inter-frame displacement based on the millimeter wave radar velocity estimate.
The invention also provides a fusion positioning device of a laser radar and a millimeter wave radar, comprising:
a velocity estimation model construction module for constructing the millimeter wave radar point cloud velocity estimation model;
a velocity estimation and point cloud screening module for estimating the platform velocity and screening the millimeter wave radar point cloud information of dynamic targets from the millimeter wave radar point cloud through the velocity estimation model;
a dynamic target point cloud filtering module for filtering dynamic target points out of the laser radar point cloud using the millimeter wave radar point cloud information of the dynamic targets;
a pose variation acquisition module for extracting and matching laser radar feature points on the remaining laser radar point cloud to obtain the pose variation;
a laser radar odometry degradation judgment module for comparing the inter-frame displacement based on the millimeter wave radar velocity estimate with the inter-frame displacement based on laser radar point cloud matching and judging the degradation of the laser radar odometry;
and a fusion positioning result output module for outputting the final fusion positioning result based on the degradation of the laser radar odometry.
The invention also provides a computer device comprising a memory and a processor, wherein the memory stores a computer program and the processor, when executing the computer program, implements the above fusion positioning method of a laser radar and a millimeter wave radar.
The beneficial effects of the invention are as follows: the platform velocity is first estimated and the millimeter wave radar point cloud information of dynamic targets is screened from the millimeter wave radar point cloud through the millimeter wave radar point cloud velocity estimation model; dynamic target points are then filtered out of the laser radar point cloud using this information; laser radar feature points are then extracted and matched on the remaining laser radar point cloud to obtain the pose variation; finally, the inter-frame displacement based on the millimeter wave radar velocity estimate is compared with the inter-frame displacement based on laser radar point cloud matching, the degradation of the laser radar odometry is judged, and the final fusion positioning result is output. The scheme improves positioning stability in scenes with moving targets, scenes where the laser radar point cloud degenerates and scenes with many rotations, better supporting the stable operation of robot platforms and autonomous driving platforms.
Drawings
The specific structure of the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a flow chart of a method for positioning a laser radar and a millimeter wave radar in a fused manner according to an embodiment of the present invention;
FIG. 2 is a block diagram of a fusion positioning device of a laser radar and a millimeter wave radar according to an embodiment of the present invention;
Fig. 3 is a schematic block diagram of a computer device in accordance with an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be understood that the terms "comprises" and "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in the present specification and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
As shown in fig. 1, the embodiment of the present invention is as follows: a fusion positioning method of a laser radar and a millimeter wave radar comprises the following steps:
S10, constructing a millimeter wave radar point cloud velocity estimation model;
S20, estimating the platform velocity and screening the millimeter wave radar point cloud information of dynamic targets from the millimeter wave radar point cloud through the velocity estimation model;
S30, filtering dynamic target points out of the laser radar point cloud using the millimeter wave radar point cloud information of the dynamic targets;
S40, extracting and matching laser radar feature points on the remaining laser radar point cloud to obtain the pose variation;
S50, comparing the inter-frame displacement based on the millimeter wave radar velocity estimate with the inter-frame displacement based on laser radar point cloud matching, and judging the degradation of the laser radar odometry;
S60, outputting the final fusion positioning result based on the degradation of the laser radar odometry.
This embodiment proceeds in three stages: velocity estimation based on the millimeter wave radar; millimeter wave radar-assisted pose transformation estimation based on the laser radar point cloud; and output of the fusion positioning result of the laser radar and the millimeter wave radar. The method improves positioning stability in scenes with moving targets, scenes where the laser radar point cloud degenerates and scenes with many rotations, better supporting the stable operation of robot platforms and autonomous driving platforms.
1. Velocity estimation based on the millimeter wave radar:
In a specific embodiment, in step S10, constructing the millimeter wave radar point cloud velocity estimation model includes:
representing the millimeter wave radar point cloud at time $t$ as $P_t=\{p_i\}_{i=1}^{m_t}$, where $p_i$ denotes the $i$-th point in $P_t$ and $p_i=\{r_i,\theta_i,v_i\}$, with $r_i$, $\theta_i$ and $v_i$ the distance of the point from the origin, the horizontal azimuth of the point and the Doppler velocity value of the point, respectively, and $m_t$ the total number of points in $P_t$;
if the millimeter wave radar is mounted at the middle of the platform with the radar y-axis pointing forward and the x-axis pointing right, the platform velocity consists of the velocity along the forward direction of the platform, $v_y^t$, and the velocity perpendicular to the forward direction, toward the right side of the platform, $v_x^t$; if all targets in the scene are static, the relationship between the Doppler velocity of a millimeter wave radar point and the platform velocity follows from the Doppler velocity projection relationship:
$$v_i = -\left(v_x^t \sin\theta_i + v_y^t \cos\theta_i\right),$$
where $v_i$ and $\theta_i$ are known quantities and $(v_x^t, v_y^t)$ is the quantity to be solved.
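As a small numerical illustration of the model above, the static-world Doppler relation can be fitted by ordinary least squares. This sketch is not the patent's implementation; it assumes the azimuth is measured from the forward (y) axis and that a static target's Doppler velocity is the negative projection of the platform velocity onto the beam direction.

```python
import numpy as np

def solve_platform_velocity(theta, doppler):
    """Least-squares fit of (vx, vy) from v_i = -(vx*sin(theta_i) + vy*cos(theta_i))."""
    A = -np.column_stack([np.sin(theta), np.cos(theta)])  # m x 2 design matrix
    v, *_ = np.linalg.lstsq(A, doppler, rcond=None)
    return v

# Synthetic static scene: platform moving forward at 2.0 m/s, rightward at 0.5 m/s
theta = np.linspace(-np.pi / 3, np.pi / 3, 50)
doppler = -(0.5 * np.sin(theta) + 2.0 * np.cos(theta))
vx, vy = solve_platform_velocity(theta, doppler)
```

With at least two distinct azimuths the 2-parameter system is well determined, which is why a single radar frame suffices for an ego-velocity estimate.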
In a specific embodiment, in step S20, performing the platform velocity estimation based on the millimeter wave radar point cloud through the millimeter wave radar point cloud velocity estimation model includes:
for the constructed millimeter wave radar point cloud velocity estimation model, the scene may contain not only static targets but also a small number of dynamic targets, i.e. interference noise in solving the model, so the interference of dynamic target points with the solution for $(v_x^t, v_y^t)$ must be considered;
if $t=0$, i.e. $t$ is the starting time, assuming there is no dynamic target in the scene, $(v_x^0, v_y^0)$ is obtained by least-squares fitting;
if $t>0$, the velocity estimated at the previous time $t-1$, $(v_x^{t-1}, v_y^{t-1})$, is used as a prior to filter out dynamic target points that may exist in the millimeter wave radar point cloud at time $t$: the velocity estimated at time $t-1$ is substituted into the model to compute the expected Doppler velocity of each point, $\hat v_i = -\left(v_x^{t-1}\sin\theta_i + v_y^{t-1}\cos\theta_i\right)$;
the expected Doppler velocity is compared with the actual Doppler velocity of each point; if the difference exceeds a set threshold $v^{Thre}$, i.e. $\left|\hat v_i - v_i\right| > v^{Thre}$, the point $p_i$ is regarded as a dynamic target point and is pre-filtered out, and the pre-filtered point set is recorded as $\tilde P_t$;
for the points in the pre-filtered set $\tilde P_t$, $(v_x^t, v_y^t)$ is obtained by iteratively reweighted least-squares fitting.
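The $t>0$ branch (prior-based pre-filtering followed by reweighted fitting) might be sketched as follows. The sign convention and the inverse-residual weighting scheme are assumptions for illustration, not the authors' exact choices.

```python
import numpy as np

def expected_doppler(theta, v):
    # Doppler predicted for a static point given platform velocity v = (vx, vy)
    return -(v[0] * np.sin(theta) + v[1] * np.cos(theta))

def estimate_velocity(theta, doppler, v_prev, v_thre=0.5, iters=5):
    """Pre-filter likely dynamic points with the t-1 velocity prior, then
    refine (vx, vy) by iteratively reweighted least squares."""
    keep = np.abs(expected_doppler(theta, v_prev) - doppler) <= v_thre
    A = -np.column_stack([np.sin(theta[keep]), np.cos(theta[keep])])
    b = doppler[keep]
    w = np.ones(len(b))
    v = v_prev
    for _ in range(iters):
        v, *_ = np.linalg.lstsq(A * w[:, None], b * w, rcond=None)
        w = 1.0 / (1.0 + np.abs(A @ v - b))  # down-weight large residuals
    return v, keep

# 40 static points plus one moving target with a +3 m/s Doppler offset
theta = np.linspace(-1.0, 1.0, 41)
doppler = expected_doppler(theta, np.array([0.3, 1.5]))
doppler[20] += 3.0
v, keep = estimate_velocity(theta, doppler, v_prev=np.array([0.25, 1.4]))
```

The moving point is rejected by the prior check even though the prior velocity is slightly wrong, and the remaining static points yield the exact platform velocity.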
In a specific embodiment, in step S20, screening the millimeter wave radar point cloud information of dynamic targets includes:
based on the platform velocity estimate $(v_x^t, v_y^t)$, calculating the theoretical Doppler velocity of each point, $\bar v_i = -\left(v_x^t\sin\theta_i + v_y^t\cos\theta_i\right)$;
extracting the millimeter wave radar points that may belong to dynamic targets according to the difference between the theoretical and the actual Doppler velocity: if $\left|\bar v_i - v_i\right| > v_{Doppler}^{Thre}$, then $p_i$ is regarded as a dynamic target point, where $v_{Doppler}^{Thre}$ is the velocity-difference threshold; the set of all dynamic target points in $P_t$ is recorded as $P_t^{dyn}$;
clustering all points in $P_t^{dyn}$ with the density-based clustering algorithm DBSCAN to obtain $n_t$ dynamic-target millimeter wave radar point cloud clusters, recorded as $\{c_1^t, \ldots, c_{n_t}^t\}$.
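The dynamic-point extraction and clustering step could look like the sketch below. To stay self-contained it uses a tiny hand-rolled density flood fill in place of a library DBSCAN (e.g. `sklearn.cluster.DBSCAN`), and the `eps`/`min_pts` values are illustrative, not the patent's parameters.

```python
import numpy as np

def dynamic_mask(theta, doppler, v, v_doppler_thre=0.5):
    """True for points whose measured Doppler deviates from the static-world
    prediction under platform velocity v by more than the threshold."""
    pred = -(v[0] * np.sin(theta) + v[1] * np.cos(theta))
    return np.abs(pred - doppler) > v_doppler_thre

def cluster(points, eps=0.8, min_pts=3):
    """Very small DBSCAN-style flood fill; returns index arrays, one per cluster."""
    labels = -np.ones(len(points), dtype=int)
    n_clusters = 0
    for i in range(len(points)):
        if labels[i] != -1:
            continue
        stack, members = [i], []
        while stack:
            j = stack.pop()
            if labels[j] != -1:
                continue
            labels[j] = n_clusters
            members.append(j)
            near = np.where(np.linalg.norm(points - points[j], axis=1) < eps)[0]
            if len(near) >= min_pts:              # j is a core point
                stack.extend(k for k in near if labels[k] == -1)
        if len(members) >= min_pts:
            n_clusters += 1
        else:
            labels[members] = -2                   # too small: noise
    return [np.where(labels == c)[0] for c in range(n_clusters)]

pts = np.array([[0.0, 0.0], [0.3, 0.0], [0.0, 0.3], [0.3, 0.3],
                [5.0, 5.0], [5.3, 5.0], [5.0, 5.3], [5.3, 5.3]])
clusters = cluster(pts)  # two dynamic-target clusters of four points each
```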
2. Millimeter wave radar-assisted pose transformation estimation based on the laser radar point cloud:
An odometry based on laser radar point cloud matching always assumes that the targets in the scene are static, so dynamic targets in the scene reduce the accuracy of the point cloud matching or even make it fail completely. Dynamic targets are therefore filtered out of the laser radar point cloud using the dynamic-target regions extracted by the millimeter wave radar, improving the accuracy of the laser radar point cloud matching.
In a specific embodiment, in step S30, filtering the dynamic target points out of the laser radar point cloud using the millimeter wave radar point cloud information of the dynamic targets includes:
according to the computed dynamic-target millimeter wave radar point cloud clusters, calculating the enclosing boundary of each cluster: for the $i$-th cluster $c_i^t$, its 3-dimensional enclosing boundary is recorded as the minimum and maximum x-axis values $x_{min}^i$ and $x_{max}^i$, the minimum and maximum y-axis values $y_{min}^i$ and $y_{max}^i$, and the minimum and maximum z-axis values $z_{min}^i$ and $z_{max}^i$ over all points in $c_i^t$;
for the frame of laser radar point cloud nearest to time $t$, $L_t=\{q_j\}$ with $q_j=(x_L, y_L, z_L)$ the coordinates of the point on the x, y and z axes, the laser radar points lying inside the 3-dimensional enclosing boundary of any dynamic target extracted by the millimeter wave radar are filtered out, giving the point cloud set $\tilde L_t$. Specifically:
first, the laser radar point cloud $L_t$ is transferred into the millimeter wave radar coordinate system through the laser radar-to-millimeter wave radar extrinsic matrix $T_{L2R}$, giving the laser radar point coordinates in the millimeter wave radar coordinate system, $L_t^R=\{q_j^R\}$ with $q_j^R = T_{L2R}\,q_j$;
for the points of $L_t^R$, the points inside the enclosing boundary of every millimeter wave radar dynamic-target cluster are filtered out: for the enclosing boundary of a dynamic-target cluster $c_i^t$, a point $q_j^R=(x^R, y^R, z^R)$ is considered a dynamic-target laser radar point if it satisfies $x_{min}^i \le x^R \le x_{max}^i$, $y_{min}^i \le y^R \le y_{max}^i$ and $z_{min}^i \le z^R \le z_{max}^i$;
all dynamic-target laser radar points are deleted from $L_t^R$, giving the point cloud set $\tilde L_t$.
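The extrinsic transfer and box filtering above might be sketched as follows. The example extrinsic (radar mounted 0.2 m forward of the laser radar along y, axes aligned) is purely illustrative, not a real calibration.

```python
import numpy as np

def lidar_to_radar(points_xyz, T_l2r):
    """Apply the 4x4 lidar-to-radar extrinsic to an N x 3 point array."""
    homo = np.hstack([points_xyz, np.ones((len(points_xyz), 1))])
    return (T_l2r @ homo.T).T[:, :3]

def filter_dynamic(points_xyz, boxes):
    """Drop points inside any (lo, hi) axis-aligned dynamic-target box."""
    keep = np.ones(len(points_xyz), dtype=bool)
    for lo, hi in boxes:
        keep &= ~np.all((points_xyz >= lo) & (points_xyz <= hi), axis=1)
    return points_xyz[keep]

# Illustrative extrinsic: radar 0.2 m ahead of the lidar along y, axes aligned
T_l2r = np.eye(4)
T_l2r[1, 3] = -0.2
cloud = lidar_to_radar(np.array([[1.5, 1.7, 0.5], [5.0, 5.2, 0.0]]), T_l2r)

# One dynamic-target box computed from a radar cluster (min and max corners)
box = (np.array([1.0, 1.0, 0.0]), np.array([2.0, 2.0, 1.0]))
static_cloud = filter_dynamic(cloud, [box])  # only the far point survives
```

Working in the radar frame lets the lidar points be tested directly against the cluster bounds without transforming every box.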
In a specific embodiment, in step S40, extracting and matching the laser radar feature points based on the remaining laser radar point cloud to obtain the pose variation includes:
using a 4×4 homogeneous matrix $T_i=\begin{bmatrix} R_i & t_i \\ 0 & 1 \end{bmatrix}$ to represent the pose at time $i$, where $R_i$ is a 3×3 rotation matrix and $t_i$ is a 3×1 displacement vector; since the pose-estimation object performs only two-dimensional planar motion, $R_i(1,3)=R_i(2,3)=R_i(3,1)=R_i(3,2)=0$, $R_i(3,3)=1$, and $t_i(3,1)=0$;
if $t=0$, i.e. the initial time, the pose at the initial time is set to $T_0$ with $R_0=I_{3\times 3}$ and $t_0=[0,0,0]^T$, i.e. the position of the millimeter wave radar coordinate system at the initial time is taken as the coordinate origin in which the pose at each time is expressed;
if $t>0$, feature extraction and matching between $\tilde L_{t-1}$ and $\tilde L_t$ are performed with the feature-extraction scheme of the laser radar odometry method LOAM (i.e. extraction of laser radar point cloud corner points and plane points) and its inter-frame point cloud matching scheme, giving the pose transformation of time $t$ relative to time $t-1$, $\Delta T_t$, with relative rotation $\Delta R_t$ and relative displacement $\Delta t_t$.
3. Output of the fusion positioning result of the laser radar and the millimeter wave radar:
In a specific embodiment, in step S50, comparing the inter-frame displacement based on the millimeter wave radar velocity estimate with the inter-frame displacement based on laser radar point cloud matching to judge the degradation of the laser radar odometry includes:
calculating the inter-frame displacement based on the millimeter wave radar velocity estimate: if the time difference between time $t-1$ and time $t$ is $\Delta t$, then $\Delta d_t^R = \Delta t\,(v_x^t, v_y^t)$;
the relative displacement obtained by laser radar point cloud matching is $\Delta t_t$ and the inter-frame displacement calculated from the millimeter wave radar velocity estimate is $\Delta d_t^R$; the ratio of the magnitudes of the two displacement estimates, $s_t=\left|\Delta t_t\right| / \left|\Delta d_t^R\right|$, is calculated; if $s_{minThre} < s_t < s_{maxThre}$, the laser radar point cloud matching is considered not degraded, where $s_{minThre}$ and $s_{maxThre}$ are the set lower and upper thresholds; that is, if the ratio of the displacement obtained by laser radar point cloud matching to the displacement calculated from the millimeter wave radar velocity estimate lies within the set range, the laser radar point cloud matching is considered not degraded, otherwise it is considered degraded.
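The degradation test reduces to a single ratio check. In the sketch below the thresholds 0.5 and 2.0 are stand-ins for the unspecified $s_{minThre}$ and $s_{maxThre}$.

```python
import numpy as np

def lidar_degraded(dt_lidar, v_radar, delta_t, s_min=0.5, s_max=2.0):
    """Compare the lidar-matched displacement length with the radar-predicted
    inter-frame displacement ||v|| * delta_t; a ratio outside (s_min, s_max)
    flags the lidar odometry as degraded."""
    d_radar = np.linalg.norm(v_radar) * delta_t
    s = np.linalg.norm(dt_lidar) / d_radar
    return not (s_min < s < s_max)

# Radar says the platform moved ~0.2 m between frames (2 m/s forward, 0.1 s)
ok = lidar_degraded(np.array([0.02, 0.19, 0.0]), np.array([0.0, 2.0]), 0.1)
bad = lidar_degraded(np.array([0.001, 0.002, 0.0]), np.array([0.0, 2.0]), 0.1)
```

The second call models a degenerate corridor-like scene where lidar matching collapses toward zero motion while the radar still reports real displacement.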
In a specific embodiment, in step S60, outputting the final fusion positioning result based on the degradation of the laser radar odometry includes:
if the laser radar point cloud matching is judged not degraded, the pose transformation between time $t-1$ and time $t$ is taken to be that obtained by laser radar point cloud matching, $\Delta T_t$; with the pose at the previous time $T_{t-1}$, the pose at the current time is $T_t=T_{t-1}\,\Delta T_t$;
if the laser radar point cloud matching is judged degraded, the pose transformation between time $t-1$ and time $t$ is recalculated from the inter-frame displacement based on the millimeter wave radar velocity estimate. Specifically:
the displacement estimated by the millimeter wave radar, $\Delta d_t^R$, is taken as the initial value of the displacement part of the pose transformation between time $t-1$ and time $t$, i.e. the initial pose transformation $\Delta T_t^0$ is set with its relative displacement initialized from $\Delta d_t^R$;
with this pose transformation $\Delta T_t^0$ as the initial value, the matching of the laser radar point cloud $\tilde L_t$ at time $t$ against the laser radar point cloud $\tilde L_{t-1}$ at time $t-1$ is re-executed; the matching method is that of 2.2 above, with the difference that during the matching the displacement estimate between time $t-1$ and time $t$ is constrained so that its deviation from the initial value does not exceed the set thresholds, i.e. the absolute differences on the x and y axes do not exceed $\Delta x_{Thre}$ and $\Delta y_{Thre}$, where $\Delta x_{Thre}$ and $\Delta y_{Thre}$ are set thresholds;
through this matching, the constrained pose transformation $\Delta T_t'$ is obtained; with the pose at the previous time $T_{t-1}$, the pose at the current time is $T_t=T_{t-1}\,\Delta T_t'$.
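The final update in both branches is a chain of homogeneous transforms. The sketch below shows the non-degraded branch $T_t = T_{t-1}\,\Delta T_t$ for planar motion, plus a fallback $\Delta T$ built from the radar displacement; initializing the fallback's relative rotation to zero is an assumption for illustration (the patent refines it by constrained re-matching).

```python
import numpy as np

def planar_T(dtheta, dx, dy):
    """4x4 homogeneous transform for planar motion (rotation about z only)."""
    T = np.eye(4)
    c, s = np.cos(dtheta), np.sin(dtheta)
    T[:2, :2] = [[c, -s], [s, c]]
    T[0, 3], T[1, 3] = dx, dy
    return T

T_prev = planar_T(0.0, 1.0, 0.0)              # pose at time t-1

# Non-degraded branch: compose with the lidar-matched transform
dT_lidar = planar_T(np.pi / 2, 0.5, 0.0)
T_t = T_prev @ dT_lidar

# Degraded branch (illustrative): displacement from the radar velocity,
# relative rotation initialized to zero before constrained re-matching
v_radar, delta_t = np.array([0.0, 2.0]), 0.1
dT_radar = planar_T(0.0, v_radar[0] * delta_t, v_radar[1] * delta_t)
T_t_fallback = T_prev @ dT_radar
```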
In this way, the pose estimation result $T_t$ at time $t$ is obtained whether or not the laser radar odometry has degraded.
As shown in fig. 2, another embodiment of the invention is a fusion positioning device of a laser radar and a millimeter wave radar, comprising:
a velocity estimation model construction module 10 for constructing the millimeter wave radar point cloud velocity estimation model;
a velocity estimation and point cloud screening module 20 for estimating the platform velocity and screening the millimeter wave radar point cloud information of dynamic targets from the millimeter wave radar point cloud through the velocity estimation model;
a dynamic target point cloud filtering module 30 for filtering dynamic target points out of the laser radar point cloud using the millimeter wave radar point cloud information of the dynamic targets;
a pose variation acquisition module 40 for extracting and matching laser radar feature points on the remaining laser radar point cloud to obtain the pose variation;
a laser radar odometry degradation judgment module 50 for comparing the inter-frame displacement based on the millimeter wave radar velocity estimate with the inter-frame displacement based on laser radar point cloud matching and judging the degradation of the laser radar odometry;
and a fusion positioning result output module 60 for outputting the final fusion positioning result based on the degradation of the laser radar odometry.
It should be noted that, it can be clearly understood by those skilled in the art that the specific implementation process of the fusion positioning device of the laser radar and the millimeter wave radar can refer to the corresponding description in the foregoing method embodiment, and for convenience and brevity of description, the description is omitted here.
The above fusion positioning device of a laser radar and a millimeter wave radar may be implemented in the form of a computer program that is executable on a computer device as shown in fig. 3.
Referring to fig. 3, fig. 3 is a schematic block diagram of a computer device according to an embodiment of the present application. The computer device 500 may be a terminal or a server, where the terminal may be an electronic device with a communication function, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a personal digital assistant, and a wearable device. The server may be an independent server or a server cluster formed by a plurality of servers.
Referring to fig. 3, the computer device 500 includes a processor 502, a memory and a network interface 505 connected by a system bus 501, where the memory may include a non-volatile storage medium 503 and an internal memory 504.
The non-volatile storage medium 503 may store an operating system 5031 and a computer program 5032. The computer program 5032 includes program instructions that, when executed, cause the processor 502 to perform a method of fusion localization of lidar and millimeter-wave radar.
The processor 502 is used to provide computing and control capabilities to support the operation of the overall computer device 500.
The internal memory 504 provides an environment for the execution of a computer program 5032 in the non-volatile storage medium 503, which computer program 5032, when executed by the processor 502, causes the processor 502 to perform a method for fusion localization of lidar and millimeter-wave radar.
The network interface 505 is used for network communication with other devices. It will be appreciated by those skilled in the art that the architecture shown in fig. 3 is merely a block diagram of some of the architecture relevant to the present inventive arrangements and is not limiting of the computer device 500 to which the present inventive arrangements may be implemented, and that a particular computer device 500 may include more or fewer components than shown, or may combine some of the components, or have a different arrangement of components.
The processor 502 is configured to execute a computer program 5032 stored in a memory, so as to implement the fusion positioning method of the laser radar and the millimeter wave radar as described above.
It should be appreciated that in embodiments of the present application, the processor 502 may be a central processing unit (CPU); the processor 502 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or any conventional processor.
Those skilled in the art will appreciate that all or part of the flow in the methods of the above embodiments may be accomplished by a computer program instructing the relevant hardware. The computer program comprises program instructions and may be stored in a storage medium, which is a computer-readable storage medium. The program instructions are executed by at least one processor in the computer system to implement the flow steps of the embodiments of the method described above.
Accordingly, the present invention also provides a storage medium. The storage medium may be a computer readable storage medium. The storage medium stores a computer program, wherein the computer program includes program instructions. The program instructions, when executed by the processor, cause the processor to perform the fusion positioning method of the laser radar and the millimeter wave radar as described above.
The storage medium may be a U-disk, a removable hard disk, a Read-Only Memory (ROM), a magnetic disk, or an optical disk, or other various computer-readable storage media that can store program codes.
Those of ordinary skill in the art will appreciate that the units and algorithm steps described in connection with the embodiments disclosed herein may be implemented in electronic hardware, in computer software, or in a combination of the two. To clearly illustrate the interchangeability of hardware and software, the elements and steps of the examples have been described above generally in terms of function. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the device embodiments described above are merely illustrative. For example, the division of each unit is only one logic function division, and there may be another division manner in actual implementation. For example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed.
The steps in the method of the embodiment of the invention can be sequentially adjusted, combined and deleted according to actual needs. The units in the device of the embodiment of the invention can be combined, divided and deleted according to actual needs. In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The integrated unit may be stored in a storage medium if implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solution of the present invention is essentially or a part contributing to the prior art, or all or part of the technical solution may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a terminal, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention.
While the invention has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and equivalent substitutions may be made without departing from the scope of the invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (10)

1. A fusion positioning method of a laser radar and a millimeter wave radar is characterized by comprising the following steps:
constructing a millimeter wave radar point cloud speed estimation model;
performing platform speed estimation and screening millimeter wave radar point cloud information of dynamic targets based on the millimeter wave radar point cloud through the millimeter wave radar point cloud speed estimation model;
filtering out dynamic target point clouds in the laser radar point cloud by using the millimeter wave radar point cloud information of the dynamic targets;
extracting and matching laser radar feature points based on the remaining laser radar point cloud to obtain the pose change amount;
comparing the inter-frame displacement based on the millimeter wave radar speed estimation result with the inter-frame displacement based on laser radar point cloud matching, and judging the degradation condition of the laser radar odometer;
and outputting a final fusion positioning result based on the degradation condition of the laser radar odometer.
2. The fusion positioning method of a laser radar and a millimeter wave radar according to claim 1, wherein the constructing a millimeter wave radar point cloud speed estimation model comprises:
representing the millimeter wave radar point cloud at time t as P_t = {p_1, p_2, …, p_{m_t}}, wherein p_i denotes the i-th point, p_i = {r_i, θ_i, v_i}, where r_i, θ_i and v_i denote the distance of the point from the origin, the horizontal azimuth of the point and the Doppler velocity value of the point respectively, and m_t denotes the total number of points in P_t;
if the millimeter wave radar is mounted in the middle of the platform, with the y-axis of the radar pointing forward and the x-axis pointing to the right, the platform speed consists of the speed along the forward direction of the platform, v_y^p, and the speed perpendicular to the forward direction and toward the right side of the platform, v_x^p; if the targets in the scene are all static targets, the relationship between the Doppler velocity of the millimeter wave radar point cloud and the platform speed can be expressed as:
v_i = v_x^p·sin θ_i + v_y^p·cos θ_i,
wherein v_i and θ_i are known quantities and v_x^p, v_y^p are the quantities to be solved.
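Because the model above is linear in (v_x^p, v_y^p), a static scene allows a direct least-squares solve over all points. A minimal NumPy sketch under the convention v_i = v_x·sin θ_i + v_y·cos θ_i (the exact sign convention is not fixed by the text):

```python
import numpy as np

# Synthetic static scene: platform moving with vx = 0.5 m/s, vy = 2.0 m/s
rng = np.random.default_rng(0)
theta = rng.uniform(-np.pi / 3, np.pi / 3, 50)         # horizontal azimuths
doppler = 0.5 * np.sin(theta) + 2.0 * np.cos(theta)    # v_i of static points

# Design matrix [sin(theta_i), cos(theta_i)]; solve for (vx, vy)
A = np.column_stack([np.sin(theta), np.cos(theta)])
(vx_hat, vy_hat), *_ = np.linalg.lstsq(A, doppler, rcond=None)
```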
3. The fusion positioning method of a laser radar and a millimeter wave radar according to claim 2, wherein the performing platform speed estimation based on the millimeter wave radar point cloud through the millimeter wave radar point cloud speed estimation model comprises:
if t = 0, i.e. t is the starting time, assuming that there is no dynamic target in the scene, and obtaining (v_x^p, v_y^p) by least-squares fitting;
if t > 0, using the speed estimated at the previous time, i.e. time t-1, as a prior to filter out dynamic target point clouds possibly present in the millimeter wave radar point cloud at time t, namely substituting the speed estimated at time t-1 into the model and calculating the expected Doppler velocity v̂_i of each point;
comparing the expected Doppler velocity with the actual Doppler velocity of each point; if the difference exceeds a set threshold v_Thre, i.e. |v_i − v̂_i| > v_Thre, considering the point p_i a dynamic target point cloud, pre-filtering the dynamic target point clouds, and recording the point cloud set after pre-filtering as P_t^s;
for the points in P_t^s, obtaining (v_x^p, v_y^p) by iterative weighted least-squares fitting.
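The iterative weighted least-squares fit can be sketched as below; the Huber-style weight function and its cutoff c are my assumptions, since the claim only names the technique:

```python
import numpy as np

def irls_velocity(theta, doppler, iters=10, c=0.2):
    # Iteratively reweighted least squares for (vx, vy): residual outliers
    # (leftover dynamic points) are down-weighted on each pass.
    A = np.column_stack([np.sin(theta), np.cos(theta)])
    w = np.ones_like(doppler)
    v = np.zeros(2)
    for _ in range(iters):
        sw = np.sqrt(w)
        v, *_ = np.linalg.lstsq(A * sw[:, None], doppler * sw, rcond=None)
        r = np.abs(doppler - A @ v)
        w = np.where(r <= c, 1.0, c / np.maximum(r, 1e-12))  # Huber weights
    return v

# 60 static points from (vx, vy) = (0.3, 1.5), plus 6 dynamic outliers
rng = np.random.default_rng(1)
theta = rng.uniform(-1.0, 1.0, 66)
doppler = 0.3 * np.sin(theta) + 1.5 * np.cos(theta)
doppler[:6] += 3.0                                   # dynamic targets
vx_hat, vy_hat = irls_velocity(theta, doppler)
```

After a few passes the six contaminated points receive near-zero weight, so the fit converges close to the true platform velocity despite them.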
4. The fusion positioning method of a laser radar and a millimeter wave radar according to claim 3, wherein the screening of the millimeter wave radar point cloud information of dynamic targets comprises:
based on the platform speed estimate (v_x^p, v_y^p), calculating the theoretical Doppler velocity v̂_i of each point;
according to the difference between the theoretical Doppler velocity and the actual Doppler velocity, extracting the millimeter wave radar points that may be dynamic targets, namely, if |v_i − v̂_i| > v_DopplerThre, considering p_i a dynamic target point, where v_DopplerThre is the speed difference threshold; recording the set of all dynamic target points in P_t as D_t;
clustering all points in D_t by using the density-based clustering algorithm DBSCAN to obtain the dynamic target millimeter wave radar point cloud cluster groups, recorded as {C_1, C_2, …, C_k}.
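The claim names DBSCAN for grouping the dynamic points; below is a compact, dependency-free sketch of the algorithm (the eps and min_pts values in the usage are illustrative):

```python
import numpy as np

def dbscan(points, eps=0.5, min_pts=3):
    # Minimal DBSCAN: label -1 means noise; clusters are numbered from 0.
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    neighbors = [np.flatnonzero(d[i] <= eps) for i in range(n)]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neighbors[i]) < min_pts:
            continue                      # already assigned, or not a core point
        labels[i] = cluster
        stack = list(neighbors[i])
        while stack:                      # flood-fill density-reachable points
            j = stack.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbors[j]) >= min_pts:
                    stack.extend(neighbors[j])
        cluster += 1
    return labels
```

In practice a library implementation (e.g. scikit-learn's `DBSCAN`) would be used; the sketch exists only to make the grouping step concrete.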
5. The fusion positioning method of a laser radar and a millimeter wave radar according to claim 4, wherein the filtering out of dynamic target point clouds in the laser radar point cloud by using the millimeter wave radar point cloud information of the dynamic targets comprises:
according to the calculated dynamic target millimeter wave radar point cloud cluster groups {C_1, …, C_k}, calculating the outer bounding box of each millimeter wave radar dynamic target point cloud cluster group; for the i-th point cloud cluster group C_i, its 3-dimensional outer bounding box is recorded as the x-axis minimum x_min^i, x-axis maximum x_max^i, y-axis minimum y_min^i, y-axis maximum y_max^i, z-axis minimum z_min^i and z-axis maximum z_max^i over all points in C_i;
for the frame of laser radar point cloud nearest to time t, L_t = {q_1, …, q_n}, where each point q_j = {x_L, y_L, z_L} and x_L, y_L, z_L are the coordinates of the point in the x, y and z axes respectively, filtering out the laser radar points lying within the 3-dimensional outer bounding box of any dynamic target extracted by the millimeter wave radar, obtaining the point cloud set L_t^s.
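The box test can be sketched directly with per-axis min/max; the optional `margin` parameter is an added assumption (the text describes the tight box):

```python
import numpy as np

def cluster_bounds(cluster):
    # Per-axis min/max of one dynamic-target cluster: its 3-D bounding box
    return cluster.min(axis=0), cluster.max(axis=0)

def filter_lidar(lidar, clusters, margin=0.0):
    # Drop lidar points inside any dynamic-target box; `margin` (assumption)
    # pads the box to absorb radar/lidar misalignment.
    keep = np.ones(len(lidar), dtype=bool)
    for c in clusters:
        lo, hi = cluster_bounds(c)
        inside = np.all((lidar >= lo - margin) & (lidar <= hi + margin), axis=1)
        keep &= ~inside
    return lidar[keep]
```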
6. The fusion positioning method of a laser radar and a millimeter wave radar according to claim 5, wherein the extracting and matching of laser radar feature points based on the remaining laser radar point cloud to obtain the pose change amount comprises:
using a homogeneous matrix T_i of dimension 4×4 to represent the pose at time i, T_i = [R_i, t_i; 0, 1], wherein R_i is a rotation matrix of dimension 3×3 and t_i is a displacement vector of dimension 3×1; the pose estimation object performs only two-dimensional planar motion, so that R_i(1,3), R_i(2,3), R_i(3,1) and R_i(3,2) in R_i are 0, R_i(3,3) is 1, and t_i(3,1) in t_i is 0;
if t = 0, i.e. the initial time, setting the pose at the initial time to T_0 with R_0 = I_{3×3} and t_0 = [0, 0, 0]^T, that is, representing the pose at each moment with the position of the millimeter wave radar coordinate system at the initial moment as the origin of the coordinate system;
if t > 0, performing feature extraction and matching on L_t^s and L_{t-1}^s by adopting the feature extraction and inter-frame point cloud matching of the laser radar odometry method LOAM, obtaining the pose transformation amount ΔT_t of time t relative to time t-1, in which the relative rotation amount is ΔR_t and the relative displacement amount is Δt_t.
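The planar pose constraint (rotation about z only, zero z displacement) and the composition T_t = T_{t-1}·ΔT_t can be sketched as:

```python
import numpy as np

def planar_T(yaw, tx, ty):
    # 4x4 homogeneous pose restricted to planar motion: the rotation block
    # acts in the x-y plane (R(1,3)=R(2,3)=R(3,1)=R(3,2)=0, R(3,3)=1) and
    # the z component of the displacement is 0.
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    T[0, 3], T[1, 3] = tx, ty
    return T

T_prev = planar_T(0.10, 1.0, 0.0)   # pose at time t-1
dT = planar_T(0.05, 0.5, 0.1)       # transform from inter-frame matching
T_t = T_prev @ dT                   # pose at time t
```

Under this composition the yaw angles simply add and the z displacement stays zero, which is exactly the planar constraint the claim imposes.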
7. The fusion positioning method of a laser radar and a millimeter wave radar according to claim 6, wherein the comparing of the inter-frame displacement based on the millimeter wave radar speed estimation result with the inter-frame displacement based on laser radar point cloud matching comprises:
calculating the inter-frame displacement t_t^R based on the millimeter wave radar speed estimation result: if the time difference between time t-1 and time t is Δt, then t_t^R = [v_x^p·Δt, v_y^p·Δt, 0]^T;
the relative displacement obtained by laser radar point cloud matching is Δt_t and the inter-frame displacement calculated based on the millimeter wave radar speed estimation result is t_t^R; calculating the ratio of the two estimated displacement magnitudes, s_t = |Δt_t| / |t_t^R|, where |·| denotes the displacement length; if s_minThre < s_t < s_maxThre, the laser radar point cloud matching is considered not degraded, where s_minThre and s_maxThre are the set lower and upper thresholds respectively.
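The ratio test can be sketched as follows; the default values 0.5 and 2.0 for s_minThre and s_maxThre are illustrative assumptions:

```python
import numpy as np

def degradation_check(dt_lidar, v_platform, dt, s_min=0.5, s_max=2.0):
    # Inter-frame displacement predicted from the radar velocity estimate
    t_radar = np.array([v_platform[0] * dt, v_platform[1] * dt, 0.0])
    # Ratio of lidar-matched to radar-predicted displacement length
    s = np.linalg.norm(dt_lidar) / np.linalg.norm(t_radar)
    return s, not (s_min < s < s_max)   # (ratio, degraded?)
```

A near-zero lidar displacement while the radar reports motion (the classic degenerate corridor case) drives the ratio below the lower threshold and flags degradation.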
8. The fusion positioning method of a laser radar and a millimeter wave radar according to claim 7, wherein the outputting of the final fusion positioning result based on the degradation condition of the laser radar odometer comprises:
if the laser radar point cloud matching is judged not to be degraded, taking the pose transformation amount between time t-1 and time t as the pose transformation amount ΔT_t obtained through laser radar point cloud matching; the pose at the previous moment is T_{t-1}, and the pose at the current moment is T_t = T_{t-1}·ΔT_t;
if the laser radar point cloud matching is judged to be degraded, recalculating the pose transformation amount ΔT_t between time t-1 and time t according to the inter-frame displacement t_t^R based on the millimeter wave radar speed estimation result.
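When the match is flagged as degraded, the translation of the inter-frame transform can be rebuilt from the radar velocity; keeping the matched rotation in that case is my assumption, since the text only specifies recomputing the displacement:

```python
import numpy as np

def fused_delta(dT_lidar, v_platform, dt, degraded):
    # Not degraded: keep the transform from lidar point cloud matching.
    if not degraded:
        return dT_lidar
    # Degraded: replace the translation with the radar-predicted displacement
    # t_t^R = [vx*dt, vy*dt, 0]; the rotation block is left as matched
    # (an assumption -- the patent does not specify the rotation handling).
    dT = dT_lidar.copy()
    dT[0, 3] = v_platform[0] * dt
    dT[1, 3] = v_platform[1] * dt
    dT[2, 3] = 0.0
    return dT
```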
9. A fusion positioning device of a laser radar and a millimeter wave radar, characterized by comprising:
the speed estimation model construction module, used for constructing a millimeter wave radar point cloud speed estimation model;
the speed estimation and point cloud screening module, used for performing platform speed estimation and screening millimeter wave radar point cloud information of dynamic targets based on the millimeter wave radar point cloud through the millimeter wave radar point cloud speed estimation model;
the dynamic target point cloud filtering module, used for filtering out dynamic target point clouds in the laser radar point cloud by using the millimeter wave radar point cloud information of the dynamic targets;
the pose change amount acquisition module, used for extracting and matching laser radar feature points based on the remaining laser radar point cloud to obtain the pose change amount;
the laser radar odometer degradation judgment module, used for comparing the inter-frame displacement based on the millimeter wave radar speed estimation result with the inter-frame displacement based on laser radar point cloud matching and judging the degradation condition of the laser radar odometer;
and the fusion positioning result output module, used for outputting the final fusion positioning result based on the degradation condition of the laser radar odometer.
10. A computer device, characterized in that the computer device comprises a memory and a processor, the memory storing a computer program, and the processor implementing, when executing the computer program, the fusion positioning method of a laser radar and a millimeter wave radar according to any one of claims 1 to 8.
CN202410432672.3A 2024-04-11 2024-04-11 Fusion positioning method of laser radar and millimeter wave radar and related equipment Pending CN118311561A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410432672.3A CN118311561A (en) 2024-04-11 2024-04-11 Fusion positioning method of laser radar and millimeter wave radar and related equipment


Publications (1)

Publication Number Publication Date
CN118311561A (en) 2024-07-09

Family

ID=91721813

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410432672.3A Pending CN118311561A (en) 2024-04-11 2024-04-11 Fusion positioning method of laser radar and millimeter wave radar and related equipment

Country Status (1)

Country Link
CN (1) CN118311561A (en)


Legal Events

Date Code Title Description
PB01 Publication