CN116359873A - Method, device, processor and storage medium for realizing SLAM processing of vehicle-end 4D millimeter wave radar by combining fisheye camera - Google Patents
- Publication number: CN116359873A (application CN202310318253.2A)
- Authority: CN (China)
- Legal status: Pending (an assumption by Google, not a legal conclusion)
Classifications
- G01S7/41 - Radar echo-signal analysis for target characterisation
- G01S13/931 - Radar or analogous anti-collision systems for land vehicles
- G06T7/73 - Determining position or orientation of objects or cameras using feature-based methods
- Y02T10/40 - Engine management systems
Abstract
The invention relates to a method for realizing SLAM processing of a vehicle-end 4D millimeter wave radar by combining a fisheye camera, comprising the following steps: (1) the intelligent driving vehicle collects point cloud data from the 4D millimeter wave radar and image data from the fisheye camera in real time, and preprocesses both; (2) pose estimation is performed on the point cloud data and the image data simultaneously, yielding a current vehicle pose based on the 4D millimeter wave radar and a current vehicle pose based on the fisheye image; (3) EKF filtering is applied to the two poses to obtain a more accurate pose; (4) the accurate vehicle pose and the point cloud keyframes of the 4D millimeter wave radar are transformed to obtain a point cloud map; (5) the accurate pose and the point cloud map are output as the SLAM result. The invention also relates to a corresponding device, a processor and a storage medium. The method, device, processor and storage medium greatly improve mapping quality and positioning accuracy.
Description
Technical Field
The invention relates to the technical field of intelligent driving, in particular to the field of millimeter wave radar and SLAM, and specifically to a method, a device, a processor and a computer readable storage medium for realizing SLAM processing of a vehicle-end 4D millimeter wave radar by combining a fisheye camera.
Background
In the field of intelligent driving, a vehicle often needs to sense its surrounding environment in real time and determine its own position, so as to perceive the surroundings accurately and output the result to a motion planning and decision module. This requires SLAM technology, which determines the position of the vehicle in a global reference frame while constructing a map of the vehicle's surroundings.
At present, SLAM realized with a vehicle-end pure 4D millimeter wave radar is limited by the sparsity of the millimeter wave radar point cloud and the inaccuracy of its elevation (pitch-angle) measurements. The resulting environment map is of poor quality (object outlines are unclear and show ghosting) and the determined position is inaccurate (it deviates noticeably from the true position); the phenomenon is especially obvious when the vehicle turns. Poor map quality and low positioning accuracy often lead to jitter in subsequent motion control and a poor ride experience for passengers.
Disclosure of Invention
The invention aims to overcome the above defects of the prior art and provides a method, a device, a processor and a computer readable storage medium for realizing SLAM processing of a vehicle-end 4D millimeter wave radar by combining a fisheye camera, which improve mapping quality and positioning accuracy.
To achieve the above object, the method, device, processor and computer readable storage medium for realizing SLAM processing of the vehicle-end 4D millimeter wave radar by combining the fisheye camera according to the present invention are as follows:
the method for realizing SLAM processing of the 4D millimeter wave radar at the vehicle end by combining the fisheye camera is mainly characterized by comprising the following steps of:
(1) The intelligent driving vehicle collects point cloud data of the 4D millimeter wave radar and image data of the fisheye camera in real time, and preprocesses both;
(2) Simultaneously carrying out pose estimation processing on the preprocessed point cloud data and the preprocessed image data to respectively obtain the current pose of the vehicle based on the 4D millimeter wave radar and the current pose of the vehicle based on the fisheye image;
(3) Performing EKF filtering processing on the two poses, and obtaining the pose with higher accuracy after filtering;
(4) Transforming the accurate pose of the vehicle and a point cloud key frame of the 4D millimeter wave radar to obtain a key frame point cloud under a world coordinate system, and splicing a plurality of frames of the key frame point cloud to obtain a point cloud map;
(5) And outputting the obtained accurate pose and the point cloud map as SLAM.
Preferably, the step (1) specifically includes the following steps:
the method comprises the steps of (1.1) carrying out time synchronization on collected data of each 4D millimeter wave radar, and splicing the data into a 360-degree point cloud around a vehicle in a sequential fusion mode;
and (1.2) performing time alignment synchronization on the point cloud data of the 4D millimeter wave radar and the image data of the fish-eye camera, and performing alignment according to the time of the 4D millimeter wave Lei Dadian cloud.
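As an illustration of the time alignment in step (1.2), the sketch below pairs each radar point-cloud timestamp with the nearest camera timestamp. The function name `align_to_radar`, the `max_gap` tolerance, and the sample timestamps are all hypothetical, not from the patent.

```python
# Hypothetical sketch: align camera frames to radar point-cloud timestamps,
# since the patent aligns data to the radar point-cloud time.
from bisect import bisect_left

def align_to_radar(radar_stamps, camera_stamps, max_gap=0.05):
    """For each radar timestamp, pick the closest camera timestamp.

    radar_stamps, camera_stamps: sorted lists of times in seconds.
    Returns (radar_t, camera_t) pairs whose gap is within max_gap.
    """
    pairs = []
    for rt in radar_stamps:
        i = bisect_left(camera_stamps, rt)
        # candidates: the camera stamp just before and just after rt
        candidates = camera_stamps[max(i - 1, 0):i + 1]
        if not candidates:
            continue
        ct = min(candidates, key=lambda c: abs(c - rt))
        if abs(ct - rt) <= max_gap:
            pairs.append((rt, ct))
    return pairs

print(align_to_radar([0.00, 0.10, 0.20], [0.01, 0.12, 0.35]))
```

Radar frames with no camera frame close enough (the 0.20 s frame above) are simply dropped from the paired stream.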
Preferably, the pose estimation based on the 4D millimeter wave radar performed in the step (2) specifically includes the following steps:
(2.1.a) Extracting point cloud keyframes: according to the motion information of the vehicle, namely the distance Δd travelled or the angle Δα rotated since the last keyframe was extracted, if Δd or Δα is greater than a threshold θ, the current 4D millimeter wave radar point cloud P_i{x_0, x_1, …, x_m} is added as a keyframe, denoted K_i;
(2.2.a) Point cloud keyframe-to-keyframe matching: keyframe K_i is matched against the previous keyframe K_{i-1} by the normal distributions transform (NDT). The point cloud P_i{x_0, x_1, …, x_m} of keyframe K_i and the point cloud P_{i-1}{x_0, x_1, …, x_n} of keyframe K_{i-1} are divided into voxel cells, and the normal distribution p_voxel ~ N(μ, σ) obeyed by all points in each voxel cell is estimated, where μ is the mean and σ the covariance of the normal distribution. A transformation is then solved for that maximizes the probability p_voxel of the voxel cells of K_{i-1} into which the transformed points of K_i fall:

$$(\Delta t_i, \Delta R_i) = \arg\max_{\Delta t,\, \Delta R} \sum_{x_j \in P_i} p_{voxel}(\Delta R\, x_j + \Delta t)$$

This finally yields the relative translation Δt_i and the relative rotation ΔR_i between 4D millimeter wave radar frames;
(2.3.a) Calculating the current pose of the vehicle based on the 4D millimeter wave radar using the following formula:

$$X_i^r = X_{i-1}^r \cdot \Delta T_i, \qquad \Delta T_i = \begin{bmatrix} \Delta R_i & \Delta t_i \\ 0 & 1 \end{bmatrix}$$
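The NDT matching of step (2.2.a) can be sketched as follows: the reference keyframe is voxelized, each voxel cell is fitted with a normal distribution N(μ, σ), and a candidate transformation is scored by the Gaussian likelihood of the transformed points. This is a minimal 2-D toy with a crude grid search standing in for the real optimizer; function names, the cell size, and the data are all illustrative, not the patent's implementation.

```python
import math
import numpy as np

def build_ndt(ref_points, cell=1.0):
    """Voxelize the reference keyframe and fit N(mu, sigma) per voxel cell."""
    cells = {}
    for p in ref_points:
        key = (math.floor(p[0] / cell), math.floor(p[1] / cell))
        cells.setdefault(key, []).append(p)
    grid = {}
    for key, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) < 3:                 # too few points for a stable covariance
            continue
        mu = pts.mean(axis=0)
        sigma = np.cov(pts.T) + 1e-6 * np.eye(2)   # regularized covariance
        grid[key] = (mu, np.linalg.inv(sigma))
    return grid

def ndt_score(grid, points, theta, t, cell=1.0):
    """Score a candidate (rotation theta, translation t): sum of per-point
    Gaussian likelihoods in whichever reference voxel each mapped point hits."""
    c, s = math.cos(theta), math.sin(theta)
    R = np.array([[c, -s], [s, c]])
    score = 0.0
    for p in points:
        q = R @ p + t
        key = (math.floor(q[0] / cell), math.floor(q[1] / cell))
        if key in grid:                  # points in empty voxels contribute nothing
            mu, sigma_inv = grid[key]
            d = q - mu
            score += math.exp(-0.5 * d @ sigma_inv @ d)
    return score

rng = np.random.default_rng(0)
ref = rng.normal([0.5, 0.5], 0.15, size=(200, 2))   # previous keyframe K_{i-1}
cur = ref - np.array([0.3, 0.0])                    # current keyframe, shifted 0.3 m
grid = build_ndt(ref)
# Crude grid search over the x-translation stands in for the real optimizer:
best_dx = max((0.0, 0.3, 0.6),
              key=lambda dx: ndt_score(grid, cur, 0.0, np.array([dx, 0.0])))
print(best_dx)
```

The candidate that maps the current cloud back onto the reference scores highest, which is exactly the voxel-probability-maximization criterion of the step above.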
Preferably, the estimating of the pose based on the fisheye image in the step (2) specifically includes the following steps:
(2.1.b) Extracting visual features: visual ORB features are extracted on each frame of the fisheye image; the features extracted from the current frame are denoted F_i and those from the previous image frame F_{i-1}; the pose of the current fisheye image frame is denoted X_i^c and the pose of the previous image frame X_{i-1}^c;
(2.2.b) Visual feature matching: corresponding matching points between the two frames are found according to the Hamming distance between the features F_i and F_{i-1}, and the relative translation Δt_i and relative rotation ΔR_i of the current frame with respect to the previous frame are computed using PnP;
(2.3.b) Calculating the current pose based on the fisheye image using the following formula:

$$X_i^c = X_{i-1}^c \cdot \Delta T_i, \qquad \Delta T_i = \begin{bmatrix} \Delta R_i & \Delta t_i \\ 0 & 1 \end{bmatrix}$$
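The Hamming-distance matching of step (2.2.b) reduces to a popcount of the XOR of two binary ORB descriptors. The toy sketch below uses small made-up integers as descriptors; a real pipeline would take 256-bit descriptors from an ORB extractor and feed the resulting matches to PnP.

```python
# Toy Hamming-distance matcher; descriptor values are invented for illustration.
def hamming(a, b):
    """Hamming distance between two binary descriptors stored as ints."""
    return bin(a ^ b).count("1")

def match_features(desc_cur, desc_prev, max_dist=10):
    """Greedy nearest-neighbour matching: for each current descriptor, pick the
    previous-frame descriptor with the smallest Hamming distance."""
    matches = []
    for i, d in enumerate(desc_cur):
        j = min(range(len(desc_prev)), key=lambda k: hamming(d, desc_prev[k]))
        if hamming(d, desc_prev[j]) <= max_dist:
            matches.append((i, j))
    return matches

F_prev = [0b10110010, 0b01001101, 0b11110000]
F_cur = [0b10110011, 0b11110001]      # near-copies of prev[0] and prev[2]
print(match_features(F_cur, F_prev, max_dist=2))
```

The `max_dist` cutoff plays the role of a match-quality gate; descriptors with no close neighbour produce no correspondence.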
Preferably, the step (3) specifically includes:
The pose X_i^r estimated from the 4D millimeter wave radar and the rotation part R_i^c of the pose X_i^c estimated from the fisheye image are obtained; then EKF filtering is applied to the two pose estimates: according to the covariances of X_i^r and R_i^c, the Kalman gain is computed and used to update the pose state, yielding the more accurate filtered vehicle pose X_i.
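The EKF update of step (3) can be illustrated in the simplest scalar case: treat the radar heading as the predicted state and the fisheye rotation as the measurement, then apply the standard Kalman gain K = P/(P + R). The variances below are invented for illustration; the patent's filter operates on full poses with their covariances.

```python
# Scalar stand-in for the pose fusion: fuse two heading estimates by a
# Kalman-style update (H = 1). Variance values are hypothetical.
def fuse_heading(theta_radar, var_radar, theta_cam, var_cam):
    """Fuse two heading estimates; returns (fused_angle, fused_variance)."""
    K = var_radar / (var_radar + var_cam)          # Kalman gain
    theta = theta_radar + K * (theta_cam - theta_radar)
    var = (1.0 - K) * var_radar                    # covariance update
    return theta, var

theta, var = fuse_heading(0.10, 0.04, 0.12, 0.01)
print(round(theta, 4), round(var, 4))
```

Because the fisheye rotation is the lower-variance source here, the fused heading moves toward it, and the fused variance drops below both inputs' effective uncertainty, mirroring why fusing the camera rotation sharpens the radar pose.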
Preferably, the step (4) specifically includes the following steps:
(4.1) Extracting the point cloud P_i{x_0, x_1, …, x_m} corresponding to keyframe K_i of the 4D millimeter wave radar and the accurate pose X_i, and calculating the keyframe point cloud in the world coordinate system according to the following formula:

$$P_i^w = X_i \cdot P_i$$

where the keyframe point cloud P_i^w is the keyframe point cloud obtained by transformation into the global (world) reference coordinate system;
(4.2) Accumulating and splicing the transformed keyframe point clouds to form the point cloud map M_i of the whole environment:

$$M_i = \bigcup_{k=0}^{i} P_k^w$$
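Steps (4.1) and (4.2) amount to mapping each keyframe cloud into the world frame with its filtered pose and appending the result to the growing map. A minimal 2-D sketch with made-up keyframes and poses:

```python
import math

def to_world(points, theta, t):
    """Apply pose (rotation theta, translation t) to 2-D points: X_i * P_i."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y + t[0], s * x + c * y + t[1]) for x, y in points]

world_map = []                                    # the accumulated point cloud map
keyframes = [                                     # (cloud, theta, t) per keyframe
    ([(1.0, 0.0)], 0.0, (0.0, 0.0)),
    ([(1.0, 0.0)], math.pi / 2, (2.0, 0.0)),
]
for cloud, theta, t in keyframes:
    world_map.extend(to_world(cloud, theta, t))   # splice frame into the map
print([(round(x, 6), round(y, 6)) for x, y in world_map])
```

The same local point (1, 0) lands in two different world positions because each keyframe carries its own pose, which is exactly why pose accuracy determines splicing quality.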
Preferably, the step (5) specifically includes:
The accurate pose X_i and the point cloud map M_i are output as the SLAM result.
The device for realizing SLAM processing of the 4D millimeter wave radar at the vehicle end by combining the fisheye camera is mainly characterized by comprising the following components:
a processor configured to execute computer-executable instructions;
and the memory stores one or more computer executable instructions which, when executed by the processor, implement the steps of the method for implementing the SLAM processing of the vehicle-end 4D millimeter wave radar by combining the fisheye camera.
The processor for realizing SLAM processing of the 4D millimeter wave radar at the vehicle end by combining the fisheye camera is mainly characterized in that the processor is configured to execute computer executable instructions, and when the computer executable instructions are executed by the processor, the steps of the method for realizing SLAM processing of the 4D millimeter wave radar at the vehicle end by combining the fisheye camera are realized.
The computer readable storage medium is mainly characterized in that a computer program is stored on the computer readable storage medium, and the computer program can be executed by a processor to realize the steps of the method for realizing SLAM processing of the 4D millimeter wave radar at the vehicle end by combining the fisheye camera.
With the method, device, processor and computer readable storage medium for realizing SLAM processing of the vehicle-end 4D millimeter wave radar by combining the fisheye camera, the rotation R estimated by the calibrated fisheye camera of the intelligent driving vehicle is fused by filtering with the pose estimated by the pure millimeter wave radar, so that a more accurate pose estimate is obtained and map quality is improved. Because the fisheye-camera VO exploits richer environmental information, its estimated rotation R is more accurate than that of the pure millimeter wave radar; however, a monocular camera cannot estimate depth, so the translation part of the VO is not fused. Fusing the fisheye-estimated rotation R to correct and update the pose of the pure millimeter wave radar achieves higher pose-estimation accuracy, and the improved pose accuracy in turn improves the quality of map splicing. In addition, the technical scheme has good adaptability and extensibility: the image-frame-based pose estimation can be extended to a pinhole camera and is not limited to a fisheye camera, and the scheme can be applied to the SLAM methods of different intelligent driving vehicles, making their pose estimation more accurate.
Drawings
Fig. 1 is a flow chart of a method for realizing SLAM processing of a vehicle-end 4D millimeter wave radar by combining a fisheye camera according to the present invention.
Detailed Description
In order to more clearly describe the technical contents of the present invention, a further description will be made below in connection with specific embodiments.
Before describing in detail embodiments that are in accordance with the present invention, it should be observed that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
Before describing the present technical solution in detail, the relevant abbreviations used in this disclosure are explained:
- SLAM: Simultaneous Localization and Mapping
- VO: Visual Odometry
- NDT: Normal Distributions Transform
- t: the translation part of a pose
- R: the rotation part of a pose
- radar: herein refers to the 4D millimeter wave radar
- EKF: Extended Kalman Filter
- ORB: Oriented FAST and Rotated BRIEF (a FAST corner with orientation plus a rotation-aware BRIEF descriptor)
- PnP: Perspective-n-Point
- voxel: volume cell
- w: world reference frame (superscript)
- r: radar coordinate system (superscript)
- M: map
Referring to fig. 1, the method for implementing SLAM processing of a 4D millimeter wave radar at a vehicle end by combining a fisheye camera includes the following steps:
(1) The intelligent driving vehicle collects point cloud data of the 4D millimeter wave radar and image data of the fisheye camera in real time, and preprocesses both;
(2) Simultaneously carrying out pose estimation processing on the preprocessed point cloud data and the preprocessed image data to respectively obtain the current pose of the vehicle based on the 4D millimeter wave radar and the current pose of the vehicle based on the fisheye image;
(3) Performing EKF filtering processing on the two poses, and obtaining the pose with higher accuracy after filtering;
(4) Transforming the accurate pose of the vehicle and a point cloud key frame of the 4D millimeter wave radar to obtain a key frame point cloud under a world coordinate system, and splicing a plurality of frames of the key frame point cloud to obtain a point cloud map;
(5) And outputting the obtained accurate pose and the point cloud map as SLAM.
As a preferred embodiment of the present invention, the step (1) specifically includes the steps of:
the method comprises the steps of (1.1) carrying out time synchronization on collected data of each 4D millimeter wave radar, and splicing the data into a 360-degree point cloud around a vehicle in a sequential fusion mode;
and (1.2) performing time alignment synchronization on the point cloud data of the 4D millimeter wave radar and the image data of the fish-eye camera, and performing alignment according to the time of the 4D millimeter wave Lei Dadian cloud.
As a preferred embodiment of the present invention, the estimating of the pose based on the 4D millimeter wave radar in the step (2) specifically includes the following steps:
(2.1.a) Extracting point cloud keyframes: according to the motion information of the vehicle, namely the distance Δd travelled or the angle Δα rotated since the last keyframe was extracted, if Δd or Δα is greater than a threshold θ, the current 4D millimeter wave radar point cloud P_i{x_0, x_1, …, x_m} is added as a keyframe, denoted K_i;
(2.2.a) Point cloud keyframe-to-keyframe matching: keyframe K_i is matched against the previous keyframe K_{i-1} by the normal distributions transform (NDT). The point cloud P_i{x_0, x_1, …, x_m} of keyframe K_i and the point cloud P_{i-1}{x_0, x_1, …, x_n} of keyframe K_{i-1} are divided into voxel cells, and the normal distribution p_voxel ~ N(μ, σ) obeyed by all points in each voxel cell is estimated, where μ is the mean and σ the covariance of the normal distribution. A transformation is then solved for that maximizes the probability p_voxel of the voxel cells of K_{i-1} into which the transformed points of K_i fall:

$$(\Delta t_i, \Delta R_i) = \arg\max_{\Delta t,\, \Delta R} \sum_{x_j \in P_i} p_{voxel}(\Delta R\, x_j + \Delta t)$$

This finally yields the relative translation Δt_i and the relative rotation ΔR_i between 4D millimeter wave radar frames;
(2.3.a) Calculating the current pose of the vehicle based on the 4D millimeter wave radar using the following formula:

$$X_i^r = X_{i-1}^r \cdot \Delta T_i, \qquad \Delta T_i = \begin{bmatrix} \Delta R_i & \Delta t_i \\ 0 & 1 \end{bmatrix}$$
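The chain rule X_i^r = X_{i-1}^r · ΔT_i of step (2.3.a) is just a product of homogeneous transforms. A 2-D sketch follows (the patent works in 3-D; 2-D keeps the matrices small, and all numeric values are illustrative):

```python
import math
import numpy as np

def se2(theta, tx, ty):
    """Build a 3x3 homogeneous 2-D pose [R t; 0 1]."""
    c, s = math.cos(theta), math.sin(theta)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

X_prev = se2(0.0, 1.0, 0.0)          # previous keyframe pose X_{i-1}^r
dT = se2(math.pi / 2, 1.0, 0.0)      # NDT result (DeltaR_i, Deltat_i)
X_cur = X_prev @ dT                  # current pose X_i^r
print(np.round(X_cur[:2, 2], 6))     # translation part of the composed pose
```

The relative motion is expressed in the previous frame's coordinates, so composing by right-multiplication both rotates the heading and advances the translation, which is why an accurate ΔR is so important for where subsequent translations end up.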
As a preferred embodiment of the present invention, the estimating of the pose based on the fisheye image in the step (2) specifically includes the following steps:
(2.1.b) Extracting visual features: visual ORB features are extracted on each frame of the fisheye image; the features extracted from the current frame are denoted F_i and those from the previous image frame F_{i-1}; the pose of the current fisheye image frame is denoted X_i^c and the pose of the previous image frame X_{i-1}^c;
(2.2.b) Visual feature matching: corresponding matching points between the two frames are found according to the Hamming distance between the features F_i and F_{i-1}, and the motion transformation Δt_i and ΔR_i of the current frame relative to the previous frame is computed using PnP;
(2.3.b) Calculating the current pose based on the fisheye image using the following formula:

$$X_i^c = X_{i-1}^c \cdot \Delta T_i, \qquad \Delta T_i = \begin{bmatrix} \Delta R_i & \Delta t_i \\ 0 & 1 \end{bmatrix}$$
As a preferred embodiment of the present invention, the step (3) specifically includes:
The pose X_i^r estimated from the 4D millimeter wave radar and the rotation part R_i^c of the pose X_i^c estimated from the fisheye image are obtained; then EKF filtering is applied to the two pose estimates: according to the covariances of X_i^r and R_i^c, the Kalman gain is computed and used to update the pose state, yielding the more accurate filtered vehicle pose X_i.
As a preferred embodiment of the present invention, the step (4) specifically includes the following steps:
(4.1) Extracting the point cloud P_i{x_0, x_1, …, x_m} corresponding to keyframe K_i of the 4D millimeter wave radar and the accurate pose X_i, and calculating the keyframe point cloud in the world coordinate system according to the following formula:

$$P_i^w = X_i \cdot P_i$$

where the keyframe point cloud P_i^w is the keyframe point cloud obtained by transformation into the global (world) reference coordinate system;
(4.2) Accumulating and splicing the transformed keyframe point clouds to form the point cloud map M_i of the whole environment:

$$M_i = \bigcup_{k=0}^{i} P_k^w$$
As a preferred embodiment of the present invention, the step (5) specifically includes:
The accurate pose X_i and the point cloud map M_i are output as the SLAM result.
In practical application, the main steps of the technical scheme for carrying out SLAM by combining the fisheye camera and the 4D millimeter wave radar include:
1) Data input. The intelligent driving vehicle collects point cloud data from the six 4D millimeter wave radars and image data from the single fisheye camera in real time, and preprocesses both, where the preprocessing includes:
a. time-synchronizing the data of each millimeter wave radar and splicing them, by sequential fusion, into a 360-degree point cloud around the vehicle;
b. time-aligning the point cloud data of the millimeter wave radar with the fisheye image data, using the timestamps of the radar point cloud as the reference;
2) Pose estimation, i.e., estimating the current translation t and rotation R of the vehicle at frame i. Pose estimation based on the millimeter wave radar point cloud and pose estimation based on the fisheye image are performed simultaneously and computed in parallel; architecturally, this is a post-processing (post-fusion) scheme. This yields the pose X_i^r estimated from the millimeter wave radar and the pose X_i^c estimated from the fisheye image, respectively;
a. Pose estimation based on the millimeter wave radar includes the following steps:
i. Keyframe extraction. According to the motion information of the vehicle, namely the distance Δd travelled or the angle Δα rotated since the last keyframe was extracted, if Δd or Δα is greater than a threshold θ, the current millimeter wave radar point cloud P_i{x_0, x_1, …, x_m} is added as a keyframe, denoted K_i;
ii. Point cloud keyframe matching. Keyframe K_i is NDT-matched against the previous keyframe K_{i-1}. Based on the per-voxel normal distribution assumption, NDT divides the point cloud P_i{x_0, x_1, …, x_m} of keyframe K_i and the point cloud P_{i-1}{x_0, x_1, …, x_n} of keyframe K_{i-1} into voxel cells and estimates the normal distribution p_voxel ~ N(μ, σ) obeyed by all points in each voxel cell, with μ the mean and σ the covariance. A transformation is solved for that maximizes the probability p_voxel of the voxels of K_{i-1} into which the transformed points of K_i fall, finally yielding the inter-frame relative motion Δt_i and ΔR_i;
iii. The current radar-based pose is then obtained using the formula

$$X_i^r = X_{i-1}^r \cdot \Delta T_i, \qquad \Delta T_i = \begin{bmatrix} \Delta R_i & \Delta t_i \\ 0 & 1 \end{bmatrix}$$
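The keyframe rule of step i above can be sketched as a simple threshold test. The threshold values here are illustrative; the patent only names a generic θ:

```python
import math

def is_keyframe(delta_d, delta_alpha, d_thresh=0.5, alpha_thresh=math.radians(10)):
    """Promote the current cloud to a keyframe when the motion since the last
    keyframe exceeds a distance or rotation threshold.

    delta_d in metres, delta_alpha in radians since the last keyframe.
    """
    return delta_d > d_thresh or abs(delta_alpha) > alpha_thresh

print(is_keyframe(0.6, 0.0))               # moved far enough
print(is_keyframe(0.1, math.radians(15)))  # turned far enough
print(is_keyframe(0.1, math.radians(2)))   # neither: not a keyframe
```

Including the rotation criterion matters because the pure-radar pose degrades most during turns, which is exactly when fresh keyframes are needed.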
b. Pose estimation based on the fisheye image includes the following steps:
i. Extracting visual features. Visual ORB features are extracted on each frame of the fisheye image; the features extracted from the current frame are denoted F_i, those from the previous image frame F_{i-1}; the pose of the current fisheye frame is denoted X_i^c and the pose of the previous image frame X_{i-1}^c;
ii. Visual feature matching. Corresponding matching points between the two frames are found according to the Hamming distance between F_i and F_{i-1}, and the motion transformation Δt_i and ΔR_i of the current frame relative to the previous frame is computed using PnP;
iii. Using the formula

$$X_i^c = X_{i-1}^c \cdot \Delta T_i$$

the current fisheye-image-based pose estimate X_i^c is obtained, but only its rotation part R_i^c is output. Only the rotation is used to assist the millimeter wave radar pose estimation; the translation part is estimated by the millimeter wave radar itself, which is more accurate, so no auxiliary translation estimate is needed;
3) Pose fusion. Through step 2), the pose X_i^r estimated from the millimeter wave radar and the rotation part R_i^c of the pose X_i^c estimated from the fisheye image are obtained; then EKF filtering is applied to the two pose estimates: according to the covariances of X_i^r and R_i^c, the Kalman gain is computed and used to update the pose state, yielding the more accurate filtered pose X_i;
4) Map splicing. After the accurate vehicle pose X_i is obtained, the point cloud P_i{x_0, x_1, …, x_m} corresponding to the keyframe K_i extracted in step 2) and the pose X_i are transformed into the global reference coordinate system according to the formula

$$P_i^w = X_i \cdot P_i$$

to obtain the keyframe point cloud P_i^w in world coordinates; the transformed keyframe point clouds are then accumulated frame by frame to form the map M_i of the whole environment. The resulting map is a point cloud map of the millimeter wave radar, not an image feature-point map, and is relatively dense.
5) SLAM output. The filtered accurate pose X_i (t_i, R_i) obtained in step 3) and the map M_i spliced in step 4) constitute the SLAM output of the present embodiment.
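The whole loop of steps 1) through 5) can be summarized as a data-flow skeleton. Every component below is a degenerate stand-in (1-D numbers instead of poses and clouds), purely to show how radar odometry, fisheye VO, the EKF, and map stitching connect; none of it is the patent's actual implementation.

```python
# Hypothetical pipeline skeleton: radar NDT odometry and fisheye VO run per
# frame, the EKF fuses them (rotation only from the camera), and keyframe
# clouds are stitched into the map.
def run_slam(frames, estimate_radar, estimate_cam, fuse, stitch):
    pose, world_map, poses = 0.0, [], []
    for cloud, image in frames:
        pose_r = estimate_radar(pose, cloud)   # NDT keyframe odometry
        rot_c = estimate_cam(pose, image)      # fisheye VO, rotation part only
        pose = fuse(pose_r, rot_c)             # EKF-filtered pose X_i
        world_map = stitch(world_map, cloud, pose)
        poses.append(pose)
    return poses, world_map                    # the SLAM output of step 5)

# Degenerate 1-D stand-ins, just to exercise the data flow:
poses, m = run_slam(
    frames=[([1], None), ([2], None)],
    estimate_radar=lambda p, c: p + 1.0,
    estimate_cam=lambda p, i: p + 1.2,
    fuse=lambda r, c: 0.5 * (r + c),
    stitch=lambda m, c, p: m + [(p, c)],
)
print(poses)
```

Swapping any stand-in for a real component (e.g. the NDT scorer or the scalar EKF sketched earlier) leaves the loop structure unchanged, which is the loose-coupling property the scheme claims.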
In practical application, compared with the prior art, the technical scheme has the following advantages:
(1) The technical scheme provides a vehicle-end 4D millimeter wave radar SLAM method combined with a fisheye camera; compared with the original pure 4D millimeter wave radar SLAM method, the mapping quality and positioning accuracy are greatly improved;
(2) The technical scheme has good applicability and can be used on intelligent driving vehicles of different types and different structural designs;
(3) By incorporating the fisheye sensor, the technical scheme makes maximum use of every sensor on the vehicle, improves the utilization of the hardware, and is highly cost-effective;
(4) In the technical scheme, only the rotation R of the fisheye VO is taken out and fused with the pose estimated by the 4D millimeter wave radar; no visual loop-closure detection or global pose optimization is needed; the rotation R from the fisheye VO complements the pose estimation of the millimeter wave radar, making it more robust;
(5) The method proposed by the technical scheme is based on a 4D millimeter wave radar rather than a lidar sensor;
(6) The SLAM method in the technical scheme is a loosely coupled framework with 4D millimeter wave radar pose estimation as the primary component and fisheye visual estimation as the auxiliary one, i.e., a post-fusion processing scheme;
(7) The 4D millimeter wave radar point cloud in the technical scheme is obtained through operations such as multi-radar time synchronization, extrinsic calibration and filtering;
(8) The technical scheme fuses millimeter wave point clouds around the intelligent vehicle rather than panoramic images, so the vision sensor need only be a fisheye sensor with a small viewing angle;
(9) In the technical scheme, the pose is estimated by a voxel-probability-maximization method tailored to the characteristics of the 4D millimeter wave radar point cloud.
The device for realizing SLAM processing of the 4D millimeter wave radar at the vehicle end by combining the fisheye camera comprises:
a processor configured to execute computer-executable instructions;
and the memory stores one or more computer executable instructions which, when executed by the processor, implement the steps of the method for implementing the SLAM processing of the vehicle-end 4D millimeter wave radar by combining the fisheye camera.
The processor for realizing SLAM processing of the 4D millimeter wave radar at the vehicle end by combining the fisheye camera is configured to execute computer executable instructions, and when the computer executable instructions are executed by the processor, the steps of the method for realizing SLAM processing of the 4D millimeter wave radar at the vehicle end by combining the fisheye camera are realized.
The computer-readable storage medium has stored thereon a computer program executable by a processor to perform the steps of the above method for realizing SLAM processing of the vehicle-end 4D millimeter wave radar in combination with the fisheye camera.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code that include one or more executable instructions for implementing specific logical functions or steps of the process. Further implementations in which functions are executed out of the order shown or discussed, including substantially concurrently or in reverse order depending on the functionality involved, are also included within the scope of the preferred embodiments of the present invention, as would be understood by those reasonably skilled in the art.
It is to be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution device.
Those of ordinary skill in the art will appreciate that all or a portion of the steps carried out in the methods of the above-described embodiments may be implemented by a program instructing related hardware; the program may be stored in a computer-readable storage medium and, when executed, includes one or a combination of the steps of the method embodiments.
The above-mentioned storage medium may be a read-only memory, a magnetic disk or an optical disk, or the like.
In the description of the present specification, reference to the terms "one embodiment," "some embodiments," "examples," "specific examples," or "embodiments," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiments or examples. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present invention have been shown and described above, it will be understood that the above embodiments are illustrative and not to be construed as limiting the invention, and that variations, modifications, alternatives and variations may be made to the above embodiments by one of ordinary skill in the art within the scope of the invention.
Aiming at the imaging characteristics of the 4D imaging millimeter wave radar, whose output is a point cloud, the technical scheme designs a fusion SLAM system with radar as primary and vision as auxiliary. The system therefore has a certain novelty, and its SLAM output is far better than that of a traditional radar.
According to the method, using only a small-field-of-view fisheye camera rather than a more complex wide-angle panoramic camera, the rotational part of the intelligent vehicle's pose can be estimated and used to complement the pose estimation of the millimeter wave radar, so that the pose estimate of the intelligent vehicle is more accurate and the resulting environment map is better.
The SLAM method in the technical scheme is essentially a loosely coupled post-fusion framework with the 4D millimeter wave radar point cloud as primary and the fisheye image as auxiliary, rather than a tightly coupled pre-fusion framework with panoramic vision as primary and lidar as auxiliary. The loosely coupled framework is more robust: a failed visual sensor will not crash or invalidate the whole system.
The SLAM method provided by the technical scheme obtains a pose with higher precision, greatly improving the precision of vehicle motion planning and control and giving passengers a better driving experience, comfort and safety.
The method, the device, the processor and the computer-readable storage medium for realizing SLAM processing of the vehicle-end 4D millimeter wave radar in combination with the fisheye camera use the calibrated fisheye camera of the intelligent driving vehicle to fuse, by filtering, the rotation R with the pose estimated by the pure millimeter wave radar, thereby obtaining a more accurate pose estimate and improving map quality. Because the fisheye VO exploits richer environmental information, its estimated rotation R is more accurate than that of the pure millimeter wave radar; however, a monocular camera cannot estimate depth, so the translation part of the VO is not fused. Fusing the fisheye-estimated R corrects and updates the pose of the pure millimeter wave radar, achieving higher pose estimation precision and hence better map splicing quality. In addition, the technical scheme has good adaptability and extensibility: the image-frame-based pose estimation can be extended to a pinhole camera rather than being limited to a fisheye camera, and can be applied to the SLAM methods of different intelligent driving vehicles, making their pose estimation more accurate.
In this specification, the invention has been described with reference to specific embodiments thereof. It will be apparent, however, that various modifications and changes may be made without departing from the spirit and scope of the invention. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.
Claims (10)
1. The method for realizing SLAM processing of the 4D millimeter wave radar at the vehicle end by combining a fisheye camera is characterized by comprising the following steps of:
(1) The intelligent driving vehicle collects point cloud data of the 4D millimeter wave radar and image data of the fisheye camera in real time, and preprocesses both the point cloud data and the image data;
(2) Pose estimation is performed on the preprocessed point cloud data and image data simultaneously, respectively obtaining the current pose of the vehicle based on the 4D millimeter wave radar and the current pose of the vehicle based on the fisheye image;
(3) EKF filtering is performed on the two poses, and a more accurate pose is obtained after filtering;
(4) The accurate pose of the vehicle and a point cloud key frame of the 4D millimeter wave radar are transformed to obtain a key frame point cloud in the world coordinate system, and multiple frames of the key frame point cloud are spliced to obtain a point cloud map;
(5) The obtained accurate pose and the point cloud map are output as the SLAM result.
2. The method for realizing SLAM processing of the vehicle-end 4D millimeter wave radar in combination with the fisheye camera according to claim 1, wherein the step (1) specifically comprises the following steps:
(1.1) time-synchronizing the collected data of each 4D millimeter wave radar, and splicing the data into a 360-degree point cloud around the vehicle by sequential fusion;
(1.2) time-aligning the point cloud data of the 4D millimeter wave radar with the image data of the fisheye camera, the alignment being driven by the timestamps of the 4D millimeter wave radar point cloud.
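As an illustrative sketch of step (1.2) (not part of the claims), the alignment can be realized as a nearest-timestamp search driven by the radar clock; the function name and timestamp values below are hypothetical:

```python
import bisect

def align_to_radar(radar_ts, image_ts):
    """For each radar point-cloud timestamp, pick the fisheye image whose
    timestamp is closest; the radar clock drives the alignment, as the
    claim specifies. Both lists are sorted, in seconds."""
    pairs = []
    for t in radar_ts:
        i = bisect.bisect_left(image_ts, t)
        # Compare the neighbours on either side of the insertion point.
        best = min(
            (j for j in (i - 1, i) if 0 <= j < len(image_ts)),
            key=lambda j: abs(image_ts[j] - t),
        )
        pairs.append((t, image_ts[best]))
    return pairs
```

In practice the per-radar clouds would already have been merged into the 360-degree cloud of step (1.1) before this alignment runs.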
3. The method for realizing SLAM processing of the vehicle-end 4D millimeter wave radar in combination with the fisheye camera according to claim 2, wherein the pose estimation based on the 4D millimeter wave radar in the step (2) specifically comprises the following steps:
(2.1.a) extracting point cloud key frames: according to the motion information of the vehicle, namely the distance Δd or the rotation angle Δα traveled since the last key frame, if Δd or Δα is greater than a certain threshold θ, the current point cloud P_i = {x_0, x_1, …, x_m} of the 4D millimeter wave radar is added as a key frame, denoted K_i;
(2.2.a) point cloud key-frame-to-key-frame matching: the key frame K_i and the previous key frame K_{i-1} are matched by the normal distributions transform (NDT): the point cloud P_i = {x_0, x_1, …, x_m} of K_i and the point cloud P_{i-1} = {x_0, x_1, …, x_n} of K_{i-1} are divided into voxel cells, and the normal distribution p_voxel ~ N(μ, σ) obeyed by all points in each voxel cell is estimated, where μ is the mean and σ is the covariance of the normal distribution; a transformation T_i(R_i, t_i) is then solved so that the probability p_voxel of the voxel cells of K_{i-1} into which the transformed points of K_i fall is maximal:

T_i = argmax_T Π_j p_voxel(T · x_j), x_j ∈ P_i,

finally obtaining the relative translation transformation t_i^radar and the relative rotation transformation R_i^radar between the 4D millimeter wave radar frames;
(2.3.a) calculating the current pose X_i^radar of the vehicle based on the 4D millimeter wave radar using the following formula:

X_i^radar = X_{i-1} · T_i(R_i^radar, t_i^radar).
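The voxel-probability objective maximized in step (2.2.a) can be sketched as follows. This is illustrative only: a production NDT solver (e.g. PCL's NormalDistributionsTransform) optimizes T with Newton iterations, whereas this sketch merely evaluates the score being maximized; all names and the voxel size are hypothetical.

```python
import numpy as np
from collections import defaultdict

VOXEL = 1.0  # hypothetical voxel edge length in metres

def voxel_key(p):
    """Integer grid cell containing point p."""
    return tuple(np.floor(p / VOXEL).astype(int))

def fit_voxels(points):
    """Fit a Gaussian N(mu, sigma) per voxel of the previous key frame."""
    cells = defaultdict(list)
    for p in points:
        cells[voxel_key(p)].append(p)
    stats = {}
    for k, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) >= 3:  # need several points for a usable covariance
            stats[k] = (pts.mean(0), np.cov(pts.T) + 1e-6 * np.eye(3))
    return stats

def ndt_score(stats, points, R, t):
    """Sum of per-point Gaussian likelihoods after applying (R, t)."""
    score = 0.0
    for p in points:
        q = R @ p + t
        cell = stats.get(voxel_key(q))
        if cell is None:
            continue  # transformed point left the mapped voxels
        mu, cov = cell
        d = q - mu
        score += np.exp(-0.5 * d @ np.linalg.solve(cov, d))
    return score
```

A correct relative transform keeps the points inside the voxels of the previous key frame, so it scores strictly higher than a badly offset one.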
4. The method for realizing SLAM processing of the vehicle-end 4D millimeter wave radar in combination with the fisheye camera according to claim 3, wherein the pose estimation based on the fisheye image in the step (2) specifically comprises the following steps:
(2.1.b) extracting visual features: visual ORB features are extracted on each frame of the fisheye image; the features extracted from the frame at the current time are denoted F_i and those extracted from the image frame at the previous time F_{i-1}; the pose of the current fisheye image frame is denoted X_i^fish and the pose of the image frame at the previous time X_{i-1}^fish;
(2.2.b) visual feature matching: according to the Hamming distance between the features F_i and F_{i-1}, the corresponding matching points between the two frames are found, and the relative translation transformation t_i^fish and the relative rotation transformation R_i^fish of the current frame relative to the previous frame are calculated with PnP;
(2.3.b) calculating the current pose X_i^fish based on the fisheye image using the following formula:

X_i^fish = X_{i-1} · T_i(R_i^fish, t_i^fish).
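The Hamming-distance matching of binary ORB descriptors in step (2.2.b) can be sketched as below. This is illustrative, not the claimed implementation: the subsequent PnP solve (e.g. OpenCV's solvePnPRansac) is omitted, and the function names are hypothetical.

```python
import numpy as np

def hamming(a, b):
    """Hamming distance between two uint8 descriptor rows (ORB descriptors
    are 256-bit binary strings, i.e. 32 uint8 bytes)."""
    return int(np.unpackbits(np.bitwise_xor(a, b)).sum())

def match_descriptors(desc_curr, desc_prev, max_dist=64):
    """Greedy nearest-neighbour matching of F_i against F_{i-1} under a
    distance threshold; returns (index_curr, index_prev, distance)."""
    matches = []
    for i, d in enumerate(desc_curr):
        dists = [hamming(d, p) for p in desc_prev]
        j = int(np.argmin(dists))
        if dists[j] <= max_dist:
            matches.append((i, j, dists[j]))
    return matches
```

A real pipeline would add a ratio or cross-check test before feeding the matched points to PnP.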
5. The method for realizing SLAM processing of the vehicle-end 4D millimeter wave radar in combination with the fisheye camera according to claim 4, wherein the step (3) is specifically:
the pose X_i^radar estimated from the 4D millimeter wave radar and the rotation part R_i^fish of the pose X_i^fish estimated from the fisheye image are subjected to EKF filtering: according to X_i^radar and X_i^fish, the pose covariance of the vehicle is updated to obtain the Kalman gain, the pose state is then updated accordingly, and the more accurate filtered pose X_i of the vehicle is obtained.
6. The method for realizing SLAM processing of the vehicle-end 4D millimeter wave radar in combination with the fisheye camera according to claim 5, wherein the step (4) specifically comprises the following steps:
(4.1) extracting the point cloud P_i = {x_0, x_1, …, x_m} corresponding to the key frame K_i of the 4D millimeter wave radar and the accurate pose X_i, and calculating the key frame point cloud P_i^W according to the following formula:

P_i^W = X_i · P_i = {X_i · x_0, X_i · x_1, …, X_i · x_m},

wherein the key frame point cloud P_i^W is the key frame point cloud in the world coordinate system, obtained by transformation to the global reference coordinate system;
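The transformation of step (4.1) and the splicing of the map can be sketched with homogeneous coordinates; the function names are hypothetical and the sketch is illustrative only.

```python
import numpy as np

def to_world(pose, points):
    """Apply the filtered 4x4 homogeneous pose X_i to every point of the
    key-frame cloud P_i, yielding P_i^W in the world coordinate system."""
    pts_h = np.hstack([points, np.ones((len(points), 1))])
    return (pose @ pts_h.T).T[:, :3]

def splice_map(map_points, world_cloud):
    """Stack a new world-frame key-frame cloud onto the map M_i."""
    return np.vstack([map_points, world_cloud]) if len(map_points) else world_cloud
```

Splicing successive key-frame clouds transformed this way yields the point cloud map M_i output in step (5).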
7. The method for realizing SLAM processing of the vehicle-end 4D millimeter wave radar in combination with the fisheye camera according to claim 6, wherein the step (5) is specifically:
outputting the accurate pose X_i and the point cloud map M_i as the SLAM output.
8. The device for realizing SLAM processing of the 4D millimeter wave radar at the vehicle end by combining a fisheye camera is characterized by comprising the following components:
a processor configured to execute computer-executable instructions;
memory storing one or more computer-executable instructions which, when executed by the processor, implement the steps of the method of any one of claims 1 to 7 for realizing SLAM processing of the vehicle-end 4D millimeter wave radar in combination with the fisheye camera.
9. A processor for implementing a car-end 4D millimeter wave radar SLAM process in combination with a fisheye camera, wherein the processor is configured to execute computer executable instructions that, when executed by the processor, implement the steps of the method for implementing a car-end 4D millimeter wave radar SLAM process in combination with a fisheye camera as set forth in any one of claims 1 to 7.
10. A computer-readable storage medium, having stored thereon a computer program executable by a processor to perform the steps of the method of any one of claims 1 to 7 for realizing SLAM processing of the vehicle-end 4D millimeter wave radar in combination with the fisheye camera.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310318253.2A CN116359873A (en) | 2023-03-29 | 2023-03-29 | Method, device, processor and storage medium for realizing SLAM processing of vehicle-end 4D millimeter wave radar by combining fisheye camera |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116359873A true CN116359873A (en) | 2023-06-30 |
Family
ID=86914462
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310318253.2A Pending CN116359873A (en) | 2023-03-29 | 2023-03-29 | Method, device, processor and storage medium for realizing SLAM processing of vehicle-end 4D millimeter wave radar by combining fisheye camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116359873A (en) |
Cited By (5)

Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117011179A (en) * | 2023-08-09 | 2023-11-07 | 北京精英路通科技有限公司 | Image conversion method and device, electronic equipment and storage medium |
CN117590371A (en) * | 2024-01-18 | 2024-02-23 | 上海几何伙伴智能驾驶有限公司 | Method for realizing global parking space state detection based on 4D millimeter wave imaging radar |
CN117590371B (en) * | 2024-01-18 | 2024-03-29 | 上海几何伙伴智能驾驶有限公司 | Method for realizing global parking space state detection based on 4D millimeter wave imaging radar |
CN117789198A (en) * | 2024-02-28 | 2024-03-29 | 上海几何伙伴智能驾驶有限公司 | Method for realizing point cloud degradation detection based on 4D millimeter wave imaging radar |
CN117789198B (en) * | 2024-02-28 | 2024-05-14 | 上海几何伙伴智能驾驶有限公司 | Method for realizing point cloud degradation detection based on 4D millimeter wave imaging radar |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |