CN113866742B - Method for point cloud processing and target classification of 4D millimeter wave radar - Google Patents

Method for point cloud processing and target classification of 4D millimeter wave radar

Info

Publication number
CN113866742B
CN113866742B CN202111466169.2A
Authority
CN
China
Prior art keywords
point
target
trace
track
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111466169.2A
Other languages
Chinese (zh)
Other versions
CN113866742A (en)
Inventor
宋玛君
王奇
朱彦博
吴军
张洁
张我弓
张吉
汪玮喆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Chuhang Technology Co ltd
Original Assignee
Nanjing Chuhang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Chuhang Technology Co ltd filed Critical Nanjing Chuhang Technology Co ltd
Priority to CN202111466169.2A priority Critical patent/CN113866742B/en
Publication of CN113866742A publication Critical patent/CN113866742A/en
Application granted granted Critical
Publication of CN113866742B publication Critical patent/CN113866742B/en
Priority to PCT/CN2022/092002 priority patent/WO2023097971A1/en
Priority to DE112022000017.1T priority patent/DE112022000017T5/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to group G01S13/00
    • G01S7/41 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/414 Discriminating targets with respect to background clutter
    • G01S7/415 Identification of targets based on measurements of movement associated with the target
    • G01S7/418 Theoretical aspects
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 Systems determining position data of a target
    • G01S13/42 Simultaneous measurement of distance and other co-ordinates
    • G01S13/50 Systems of measurement based on relative movement of target
    • G01S13/58 Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G01S13/72 for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 by using numerical data
    • G01S13/726 Multiple target tracking
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 for anti-collision purposes
    • G01S13/931 of land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a method for point cloud processing and target classification of a 4D millimeter wave radar. The method comprises the steps of point trace input, point trace preprocessing, Kalman filtering prediction, association of point traces with tracks, clustering of point traces, track initiation, track update and track management. The invention realizes the transition from a two-dimensional plane to three-dimensional space, making the point trace characteristics of the target more distinct. A virtual measurement point trace of the target is constructed by signal-to-noise-ratio weighting; the associated point traces are projected onto the xoy plane and rotated clockwise by the heading angle about the virtual point trace taken as the origin; and the size information of the target is calculated in a multi-frame sliding window from the displacement of the virtual point positions between frames, alleviating the problem that the size of the target is not apparent because millimeter wave point clouds are sparse. The class probability of the single-frame target is given according to the features of the target, the classification probability of the current-frame target is obtained by weighting the historical probability with the single-frame probability, and the class with the largest probability is taken as the final classification result of the frame.

Description

Method for point cloud processing and target classification of 4D millimeter wave radar
Technical Field
The invention relates to the technical field of 4D millimeter wave radar point cloud processing and target classification, and in particular to a method for point cloud processing and target classification of a 4D millimeter wave radar.
Background
At present, the point clouds of millimeter wave radars are sparse and contain few target features, and classification methods based on point trace features do not fully consider the influence that the different orientations a target takes while driving have on target classification. As a result, target classification methods based on millimeter wave radar have low accuracy and poor practicability, which poses great challenges to the development of millimeter wave radar in practical applications. Existing millimeter-wave-radar-based target classification is mainly applied to distinguishing pedestrians from vehicles, while market demand for target classification goes well beyond this. The 4D millimeter wave radar adds pitch angle information to the original distance, horizontal angle and velocity, expanding a target from a two-dimensional plane into three-dimensional space, so that the shape characteristics of the target become more distinct. Given that the point cloud of a traditional millimeter wave radar is sparse and that, in practice, targets at different azimuth angles complicate classification decisions, the 4D radar further improves the number and quality of point traces. Meanwhile, point cloud processing is performed in three-dimensional space to extract the heading information of the target; the point trace information associated with the target over a multi-frame sliding window is rotated according to the heading angle; and the length, width and height shape features of the target are calculated from the point trace displacements based on the positional relation of the multi-frame virtual point traces. This alleviates the indistinct size features of the target and improves the applicability of classification to targets driving in different directions.
Meanwhile, the length, width, height, RCS, volume and other quantities of the target are taken as its features, and the probability of each class is given for the single-frame target; the classification probability of the current-frame target is obtained by weighting the historical probability with the single-frame probability, and the class with the largest probability in the frame is taken as the final classification result of the frame. Pedestrians, two-wheeled vehicles, cars and commercial vehicles can thus be distinguished in real time, further improving the accuracy and universality of target classification.
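By way of illustration only, the historical/single-frame probability weighting described above can be sketched as follows; the history weight w_hist and the four-class ordering (pedestrian, two-wheeler, car, commercial vehicle) are assumptions, as the patent does not fix them:

```python
import numpy as np

def fuse_probabilities(history, single_frame, w_hist=0.7):
    """Weighted fusion of historical class probabilities with single-frame
    class probabilities. The weight w_hist is an illustrative assumption; the
    patent only states that the two are combined by weighting and that the
    class with the largest fused probability is the frame's result."""
    fused = w_hist * np.asarray(history) + (1 - w_hist) * np.asarray(single_frame)
    return fused / fused.sum(), int(np.argmax(fused))
```

A target whose history strongly favors one class keeps that label even when a single noisy frame favors another, which is the stabilizing effect the weighting is meant to provide.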
Disclosure of Invention
In view of the defects in the prior art, the invention aims to provide a method for point cloud processing and target classification of a 4D millimeter wave radar.
In order to achieve the above object, the present invention provides a method for point cloud processing and target classification of a 4D millimeter wave radar, comprising:
inputting a point trace of a target acquired by a 4D millimeter wave radar, and preprocessing the point trace, wherein the measurement of the point trace comprises the distance r of the target, the horizontal angle θ, the pitch angle φ and the radial velocity v_r;
performing Kalman filtering prediction on all existing tracks; after prediction, adding 1 to the loss count n_loss of every track and to the life cycle n_life of every track, and adding the radar period T to the extrapolated time of every track;
associating the preprocessed point traces with the tracks existing before the current frame;
clustering, by density clustering, the preprocessed point traces that were not associated with any track;
initiating new tracks from the clustering result;
updating the tracks, wherein the track update comprises a Kalman filtering update performed as follows: traversing all valid tracks and judging whether each track has an associated point trace in the current frame; if so, initializing a Kalman filter for the track, and, if the track has existed for more than two frames, performing the Kalman filtering update; then performing the following operations on all tracks with associated point traces in the current frame: updating the radar cross section of the track to the maximum radar cross section among the associated point traces of the frame, resetting the number of associated point traces of the track to 0, subtracting 1 from the loss count n_loss of the track, and resetting the extrapolation time of the track to 0;
and deleting tracks that do not correspond to real targets, tracks that cannot be stably tracked, and tracks whose targets are no longer within the radar detection range.
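As an illustrative sketch (not part of the claims), the deletion rule above might look as follows; the field names and thresholds are assumptions:

```python
def should_delete(track, max_loss=5, max_extrapolation=0.5, max_range=300.0):
    """Delete tracks that are not real targets, cannot be tracked stably, or
    have left the detection range. All thresholds are illustrative
    assumptions; the patent does not specify their values."""
    return (track['loss_count'] > max_loss                # lost too many frames
            or track['extrapolation_time'] > max_extrapolation  # coasting too long
            or track['range'] > max_range)                # outside detection range
```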
Further, the preprocessing comprises angle correction; the horizontal angle θ' and the pitch angle φ' after angle correction are respectively:
θ' = θ - θ_0, φ' = φ - φ_0
wherein θ_0 and φ_0 are respectively the horizontal and pitch installation angles of the calibrated radar.
Further, the preprocessing also comprises dynamic and static separation, performed as follows: the radial velocity v_r of the point trace is decomposed into the xoy plane to obtain the velocity v_1; the velocity of the vehicle carrying the radar is decomposed along the radial direction of the point trace and projected into the xoy plane to obtain the velocity v_2; the radial velocity v_r of the point trace is decomposed along the Z-axis direction to obtain the velocity v_3; if the sum of the velocity v_1 and the velocity v_2 is less than a threshold ε_1 and the velocity v_3 is less than a threshold ε_2, the point trace is judged to be a static target; otherwise, the point trace is judged to be a moving target.
Further, associating the preprocessed point traces with the tracks existing before the current frame specifically comprises: when a preprocessed point trace falls within the distance gate, horizontal angle gate, pitch angle gate and radial velocity gate set for a track, recording the distance between the point trace and the track; finally, associating each point trace with its nearest track by nearest neighbor association.
Further, initiating new tracks from the clustering result specifically comprises:
traversing the stored clustering results and tracks, and, if a position in the array storing the tracks is empty, storing a new track at that position;
performing a dynamic and static check on the point traces in each cluster, as follows: if the number of moving points exceeds a ratio threshold P of the total number of associated point traces, marking the track as non-stationary; otherwise, marking the track as stationary;
constructing a virtual measurement point trace of the initiated track, as follows:
r_v = Σ_i w_i · r_i
θ_v = Σ_i w_i · θ_i
φ_v = Σ_i w_i · φ_i
v_v = Σ_i w_i · v_i
wherein the sums run over the n point traces in the cluster; r_v, θ_v, φ_v and v_v are respectively the distance, horizontal angle, pitch angle and radial velocity of the constructed virtual measurement trace; r_i, θ_i, φ_i and v_i are respectively the distance, horizontal angle, pitch angle and radial velocity of the i-th point trace in the cluster; and w_i is the weight of the i-th point trace, namely its signal-to-noise ratio divided by the sum of the signal-to-noise ratios of all point traces in the cluster;
initializing the life cycle n_life of the initiated track to 1, its loss count n_loss to 0, its filtering state to uninitialized, and its number of associated point traces to 0; the distance of the initiated track is r_v, its horizontal angle is θ_v, its pitch angle is φ_v, and its radial velocity is v_v.
Further, the track update further comprises target classification, performed as follows:
a sliding window stores the track information of the target over multiple frames, including the heading angle α in the xoy plane, the position of the virtual measurement trace of the track, and the associated point traces of the track; for each frame, all associated point traces of the track are projected onto the xoy plane and rotated clockwise by the heading angle α, and the maxima x_max, y_max and minima x_min, y_min of the projected points on the X axis and Y axis are calculated, together with the maximum z_max and minimum z_min of all associated point traces of the frame on the Z axis;
the multi-frame values x_max, x_min, y_max and y_min of the associated point traces are each translated so that the virtual measurement trace coordinates of each frame coincide with those of the first frame, giving after translation the overall maxima X_max, Y_max and minima X_min, Y_min on the X axis and Y axis; the multi-frame values z_max and z_min are likewise translated so that the Z-axis coordinate of the virtual measurement trace of each frame coincides with that of the first frame, giving after translation the maximum Z_max and minimum Z_min; then the following are calculated:
the length of the target L = X_max - X_min, the width of the target W = Y_max - Y_min, the height of the target H = Z_max - Z_min, and the volume of the target V = L · W · H;
thresholds of length, radar cross section and volume are assigned to each class according to the distance and horizontal angle of the target; taking the length L, the velocity v, the radar cross section and the volume V as features of the target, the class probability of the single-frame target is calculated from these features; the classification probability of the current-frame target is calculated as a weighted combination of the historical probability and the single-frame class probability, and the class with the largest classification probability is taken as the final classification result of the frame, wherein the velocity v is obtained from vx_k and vy_k, the velocities on the x axis and y axis after the k-th Kalman filter update of the target, k being an integer greater than zero.
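A minimal numerical sketch of the multi-frame size estimation described above (not part of the claims); aligning frames on the first frame's virtual trace and rotating clockwise by the heading angle follow the text, while the data layout and names are assumptions:

```python
import numpy as np

def target_size(frames, heading):
    """Estimate target length, width, height and volume from a sliding window.

    frames: list of (points, virtual) pairs, where points is an (N, 3) array
    of associated point traces in xyz and virtual is the (3,) virtual
    measurement trace of that frame. heading is the course angle alpha (rad).
    """
    # Clockwise rotation by the heading angle in the xoy plane.
    c, s = np.cos(-heading), np.sin(-heading)
    rot = np.array([[c, -s], [s, c]])
    ref = frames[0][1]  # virtual trace of the first frame is the reference
    xs, ys, zs = [], [], []
    for pts, virt in frames:
        shifted = pts - (virt - ref)   # overlap each frame's virtual trace on frame 1
        xy = shifted[:, :2] @ rot.T    # rotate xoy projections clockwise by alpha
        xs.extend(xy[:, 0]); ys.extend(xy[:, 1]); zs.extend(shifted[:, 2])
    length = max(xs) - min(xs)
    width = max(ys) - min(ys)
    height = max(zs) - min(zs)
    return length, width, height, length * width * height
```

Accumulating extrema over several aligned frames is what compensates for the sparsity of a single millimeter wave frame.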
Beneficial effects: the target tracking based on 4D millimeter wave point cloud data realizes the transition from a two-dimensional plane to three-dimensional space, making the point trace characteristics of the target more distinct. On the basis of target tracking, the heading angle α in the xoy plane is calculated from the filtered velocity of a moving track; a virtual measurement point trace of the target is constructed by signal-to-noise-ratio weighting; the associated point traces are projected onto the xoy plane and rotated clockwise by α about the virtual point trace taken as the origin; and the size information of the target is calculated in a multi-frame sliding window from the displacement of the virtual point positions between frames, alleviating the problem that the size of the target is not apparent because millimeter wave point clouds are sparse. Meanwhile, the length, RCS, volume and other quantities of the target are taken as its features to give the class probability of the single-frame target; the classification probability of the current-frame target is obtained by weighting the historical probability with the single-frame probability, and the class with the largest probability is taken as the final classification result of the frame, achieving classification of pedestrians, two-wheeled vehicles, cars and commercial vehicles. The invention meets the future intelligence requirements of vehicle-mounted millimeter wave radar development and has important practical significance for research on autonomous driving perception capability.
Drawings
FIG. 1 is a schematic diagram of a coordinate system employed for trace point processing;
FIG. 2 is a flow chart of a method of 4D millimeter wave radar point cloud processing and target classification in an embodiment of the invention;
FIG. 3 is a flow chart of trace point clustering according to an embodiment of the invention;
FIG. 4 is a flow diagram illustrating object classification according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a sliding window storing multiple frames of target information;
FIG. 6 is a schematic diagram of calculating the length, width and height of a target;
FIG. 7 is a flow diagram of classification based on calculating class probabilities.
Detailed Description
The present invention will be further illustrated with reference to the accompanying drawings and specific embodiments. These embodiments are based on the technical solution of the present invention; it should be understood that they only illustrate the invention and are not intended to limit its scope.
As shown in FIGS. 1 to 7, an embodiment of the present invention provides a method for point cloud processing and target classification of a 4D millimeter wave radar. Point trace processing is performed in the coordinate system shown in FIG. 1, where the radar is installed at the coordinate origin; FIG. 1 depicts a correctly installed radar, i.e., the radar normal coincides with the X axis. The method specifically comprises the following steps:
A point trace of a target acquired by the 4D millimeter wave radar is input and preprocessed; the measurement of the point trace comprises the distance r of the target, the horizontal angle θ, the pitch angle φ and the radial velocity v_r. The preprocessing of the point trace comprises angle correction; the horizontal angle θ' and the pitch angle φ' after angle correction are respectively:
θ' = θ - θ_0, φ' = φ - φ_0
wherein θ_0 and φ_0 are respectively the horizontal and pitch installation angles of the calibrated radar.
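As an illustrative sketch of this preprocessing step, the following assumes the subtraction convention for the installation angles and a spherical-to-Cartesian convention with x along the radar normal (as in FIG. 1); both are assumptions consistent with the text, not claimed formulas:

```python
import math

def preprocess(r, theta, phi, theta0, phi0):
    """Correct the measured angles by the calibrated installation angles and
    convert the point trace to Cartesian radar coordinates."""
    theta_c = theta - theta0          # corrected horizontal angle
    phi_c = phi - phi0                # corrected pitch angle
    x = r * math.cos(phi_c) * math.cos(theta_c)   # along the radar normal
    y = r * math.cos(phi_c) * math.sin(theta_c)
    z = r * math.sin(phi_c)
    return theta_c, phi_c, (x, y, z)
```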
The preprocessing of the point trace also preferably comprises dynamic and static separation, performed as follows: the radial velocity v_r of the point trace is decomposed into the xoy plane to obtain the velocity v_1; the velocity of the vehicle carrying the radar is decomposed along the radial direction of the point trace and projected into the xoy plane to obtain the velocity v_2; the radial velocity v_r of the point trace is decomposed along the Z-axis direction to obtain the velocity v_3; if the sum of the velocity v_1 and the velocity v_2 is less than a threshold ε_1 and the velocity v_3 is less than a threshold ε_2, the point trace is judged to be a static target; otherwise, the point trace is judged to be a moving target.
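The separation can be sketched as follows; the sign convention (radial velocity negative when the point is closing) and the threshold values are assumptions for illustration:

```python
import math

def is_static(v_r, theta, phi, ego_speed, eps_plane=0.5, eps_z=0.5):
    """Dynamic/static separation sketch. v_r is the measured radial velocity,
    theta and phi the corrected horizontal and pitch angles, ego_speed the
    speed of the vehicle carrying the radar along its x axis. For a stationary
    point the plane component of v_r cancels the projected ego speed."""
    v_plane = v_r * math.cos(phi)          # radial velocity decomposed into the xoy plane
    v_ego = ego_speed * math.cos(theta)    # ego speed projected on the trace's radial direction
    v_z = v_r * math.sin(phi)              # radial velocity resolved on the Z axis
    return abs(v_plane + v_ego) < eps_plane and abs(v_z) < eps_z
```

With the ego vehicle at 10 m/s, a stationary point straight ahead is measured closing at 10 m/s, so the two plane terms sum to zero and the point is classified as static.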
Kalman filtering prediction is performed on all existing tracks: the prediction covariance and the predicted distance, horizontal angle, pitch angle and radial velocity of the target are calculated; the loss count n_loss of every track and the life cycle n_life of every track are each increased by 1, and the radar period T is added to the extrapolated time of every track. Specifically, the Kalman filtering prediction uses a uniform linear (constant-velocity) motion model, whose equation is:
X_{k+1} = F · X_k + w_k
wherein X_{k+1} is the predicted state vector of the (k+1)-th measurement point trace of the target, w_k is the process noise, and X_k is the state vector of the k-th measurement point trace of the target, which can be expressed as:
X_k = [x_k, y_k, z_k, vx_k, vy_k, vz_k]^T
wherein x_k, y_k, z_k, vx_k, vy_k and vz_k are respectively the position on the x axis, the position on the y axis, the position on the z axis, the velocity on the x axis, the velocity on the y axis and the velocity on the z axis of the target after the k-th filtering update, k being an integer greater than 0; F is the state transition matrix, expressed as:
F =
[1 0 0 T 0 0]
[0 1 0 0 T 0]
[0 0 1 0 0 T]
[0 0 0 1 0 0]
[0 0 0 0 1 0]
[0 0 0 0 0 1]
wherein T is the radar single-frame time.
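The prediction step with this state transition matrix can be sketched as follows; the process-noise covariance q·I is an illustrative assumption, since the text does not specify Q:

```python
import numpy as np

def cv_predict(x, P, T, q=0.1):
    """Constant-velocity Kalman prediction for the state
    [x, y, z, vx, vy, vz]. T is the radar single-frame time; q scales an
    assumed isotropic process-noise covariance."""
    F = np.eye(6)
    F[0, 3] = F[1, 4] = F[2, 5] = T   # positions advance by velocity * T
    x_pred = F @ x
    P_pred = F @ P @ F.T + q * np.eye(6)
    return x_pred, P_pred
```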
During the update, the observation Z_k input to the Kalman filter as the measurement is:
Z_k = [r_k, θ_k, φ_k, v_k]^T
wherein r_k, θ_k, φ_k and v_k are respectively the distance, horizontal angle, pitch angle and radial velocity of the k-th measurement point trace of the target. The observation equation is:
Z_k = h(X_k) + u_k
wherein u_k is the measurement noise and h is the observation function:
h(X_k) = [ sqrt(x_k^2 + y_k^2 + z_k^2), arctan2(y_k, x_k), arctan2(z_k, sqrt(x_k^2 + y_k^2)), (x_k·vx_k + y_k·vy_k + z_k·vz_k) / sqrt(x_k^2 + y_k^2 + z_k^2) ]^T
wherein arctan2(a, b) denotes the two-argument arctangent of the variables a and b.
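A sketch of this observation function, mapping the Cartesian state to range, horizontal angle, pitch angle and radial velocity:

```python
import numpy as np

def h(state):
    """Observation function for the Cartesian state [x, y, z, vx, vy, vz]:
    range, azimuth via arctan2(y, x), elevation via arctan2(z, sqrt(x^2+y^2)),
    and radial velocity as the projection of the velocity on the line of
    sight."""
    x, y, z, vx, vy, vz = state
    r = np.sqrt(x * x + y * y + z * z)
    theta = np.arctan2(y, x)
    phi = np.arctan2(z, np.hypot(x, y))
    v_r = (x * vx + y * vy + z * vz) / r   # line-of-sight velocity
    return np.array([r, theta, phi, v_r])
```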
The preprocessed point traces are associated with the tracks existing before the current frame, as follows: when a preprocessed point trace falls within the distance gate, horizontal angle gate, pitch angle gate and radial velocity gate set for a track, the distance between the point trace and the track is recorded; finally, nearest neighbor association is used to associate each point trace with its nearest track. The association principle is that a single point trace can only be associated with a single track, while a single track can be associated with multiple point traces.
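The gated nearest-neighbor association can be sketched as follows; representing the gates as per-dimension difference limits and using the range difference as the distance measure are assumptions for illustration:

```python
import math

def associate(traces, tracks, gates):
    """Nearest-neighbor association sketch. traces/tracks are lists of dicts
    with keys 'r', 'theta', 'phi', 'v'; gates maps each key to its maximum
    allowed difference. Each trace joins at most one track (its nearest gated
    one), while a track may collect several traces."""
    assignments = {}
    for i, pt in enumerate(traces):
        best, best_d = None, math.inf
        for j, tr in enumerate(tracks):
            if all(abs(pt[k] - tr[k]) < gates[k] for k in gates):  # gate check
                d = abs(pt['r'] - tr['r'])   # distance between trace and track
                if d < best_d:
                    best, best_d = j, d
        if best is not None:
            assignments[i] = best
    return assignments
```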
Referring to FIG. 3, the embodiment of the present invention preferably uses Density-Based Spatial Clustering of Applications with Noise (DBSCAN) to cluster the preprocessed point traces that were not associated with any track. Specifically, the variables ε (neighborhood radius) and minPts (minimum number of points) are set; if the Euclidean distance between point trace A and point trace B is smaller than ε, point trace B is said to lie in the neighborhood of point trace A; if the number of point traces in the neighborhood of point trace A is not less than minPts, point trace A is called a core object, and the points in the neighborhood of A are all directly density-reachable from A.
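A minimal DBSCAN sketch following these definitions (labels: -1 for noise, otherwise a cluster id); this is a generic illustration, not the embodiment's implementation:

```python
import numpy as np

def dbscan(points, eps, min_pts):
    """Cluster points: B is in A's neighborhood when their Euclidean distance
    is below eps, and A is a core object when its neighborhood holds at least
    min_pts points; clusters grow through core objects."""
    pts = np.asarray(points, dtype=float)
    labels = np.full(len(pts), -1)
    cluster = 0
    for i in range(len(pts)):
        if labels[i] != -1:
            continue
        nbrs = np.where(np.linalg.norm(pts - pts[i], axis=1) < eps)[0]
        if len(nbrs) < min_pts:
            continue                      # not a core object (may join later)
        labels[i] = cluster
        seeds = list(nbrs)
        while seeds:                       # expand the cluster through cores
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster
            elif labels[j] != cluster:
                continue                   # already claimed by another cluster
            nbrs_j = np.where(np.linalg.norm(pts - pts[j], axis=1) < eps)[0]
            if len(nbrs_j) >= min_pts:     # j is a core object
                seeds.extend(k for k in nbrs_j if labels[k] == -1)
        cluster += 1
    return labels
```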
New tracks are initiated from the clustering results. Specifically, the clustering results and the tracks are traversed; if a position in the array storing the tracks is empty, a new track is stored at that position.
A dynamic and static check is performed on the point traces in each cluster, as follows: if the number of moving points exceeds the ratio threshold P of the total number of associated point traces, the track is marked as non-stationary; otherwise, the track is marked as stationary.
A virtual measurement trace is constructed for the initiated track. The signal-to-noise ratio of a trace reflects the strength of its reflected signal, and a trace with a higher signal-to-noise ratio is generally considered more accurate in all respects, so the virtual measurement trace is preferably constructed by signal-to-noise-ratio weighting, as follows:

R = Σ_{i=1}^{n} w_i · r_i
Θ = Σ_{i=1}^{n} w_i · θ_i
Φ = Σ_{i=1}^{n} w_i · φ_i
V = Σ_{i=1}^{n} w_i · v_i

where n is the number of all traces in the class; R, Θ, Φ and V are respectively the range, horizontal angle, pitch angle and radial velocity of the constructed virtual measurement trace; r_i, θ_i, φ_i and v_i are respectively the range, horizontal angle, pitch angle and radial velocity of the i-th trace in the class; and the weight w_i = SNR_i / Σ_{j=1}^{n} SNR_j is the proportion of the i-th trace's signal-to-noise ratio in the sum of the signal-to-noise ratios of all traces in the class.
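The SNR-weighted construction above can be sketched directly; a minimal illustration in which the field names (`r`, `az`, `el`, `vr`, `snr`) are assumptions.

```python
def virtual_trace(traces):
    """SNR-weighted virtual measurement trace of one cluster.

    Each trace is a dict with range, azimuth, elevation, radial velocity
    and signal-to-noise ratio; the weight of trace i is its SNR divided
    by the SNR sum of the whole class.
    """
    snr_sum = sum(t["snr"] for t in traces)
    w = [t["snr"] / snr_sum for t in traces]
    return {
        "r":  sum(wi * t["r"]  for wi, t in zip(w, traces)),
        "az": sum(wi * t["az"] for wi, t in zip(w, traces)),
        "el": sum(wi * t["el"] for wi, t in zip(w, traces)),
        "vr": sum(wi * t["vr"] for wi, t in zip(w, traces)),
    }
```

High-SNR traces thus pull the virtual trace toward themselves, which is the stated rationale for the weighting.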
For the initiated track, the life cycle (generation period) is initialized to 1 and the loss count to 0; the filtering state is set to uninitialized; the number of track-associated traces is initialized to 0; and the range, horizontal angle, pitch angle and radial velocity of the initial track are set to those of the constructed virtual measurement trace.
The tracks are then updated, which includes a Kalman filtering update. Specifically, all valid tracks are traversed, and each track is checked for an associated trace in the current frame; if one exists, a Kalman filter is initialized for the track, and if the track is older than two frames, a Kalman filtering update is performed, thereby updating the track's filter state and filter covariance. Then, for every track with an associated trace in the current frame: the track's radar cross section (RCS) is updated to the maximum RCS among the frame's associated traces, the track's associated-trace count is reset to 0, the track's loss count is decremented by 1, and the track's extrapolation time is reset to 0.
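The per-track Kalman update can be illustrated with a deliberately tiny one-dimensional sketch (state = range only); this is not the patent's filter, and the process/measurement noise values `q` and `r_noise` are illustrative assumptions.

```python
def kf_update(x, p, z, q=0.1, r_noise=1.0):
    """One predict-then-correct Kalman cycle on a scalar state.

    x: filter state, p: filter covariance, z: associated measurement.
    Returns the updated state and covariance.
    """
    p = p + q                # prediction step inflates the covariance
    k = p / (p + r_noise)    # Kalman gain
    x = x + k * (z - x)      # correct the state with the associated trace
    p = (1.0 - k) * p        # covariance shrinks after the correction
    return x, p
```

A real implementation would carry a multi-dimensional state (position, velocity) with matrix covariances, but the state/covariance bookkeeping the text describes follows the same cycle.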
Track management is performed to delete tracks of unreal targets, or tracks that cannot be tracked stably because the target is no longer within the radar detection range. Specifically, all tracks are traversed and each valid track's loss count and extrapolation time are checked: if a track is still within the first L frames of its initiation stage and its loss count is greater than or equal to m, the track is deleted; if a track's extrapolation time is greater than s radar cycle periods, the track is deleted. In general, L can be 4 to 8, m can be 2 to 4, and s can be 5 to 8.
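The two deletion rules above can be sketched as a single predicate; the field names and the default values of L, m and s are assumptions chosen from within the ranges the text gives.

```python
def should_delete(track, L=6, m=3, s=6, radar_period=0.05):
    """Deletion rules for track management.

    track: dict with 'age' (frames since initiation), 'lost' (loss count)
    and 'extrapolated' (seconds of extrapolation). radar_period is the
    cycle time T in seconds (illustrative value).
    """
    # Rule 1: an immature track that keeps losing its target is unreal.
    if track["age"] <= L and track["lost"] >= m:
        return True
    # Rule 2: a track extrapolated longer than s radar cycles has left
    # the detection range and cannot be tracked stably.
    if track["extrapolated"] > s * radar_period:
        return True
    return False
```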
The track update of the embodiment of the present invention further includes target classification; see fig. 4. The classification proceeds as follows:
A sliding window stores several frames of track information, including the heading angle α computed on the xoy plane, the position of the track's virtual measurement trace, and the track's associated traces. For each frame, the projection points of all of the track's associated traces on the xoy plane are rotated clockwise by the heading angle α; the resulting maxima on the X and Y axes are x_max and y_max, and the resulting minima are x_min and y_min. Specifically, referring to fig. 5, the small black dots are the projections of the track's associated traces onto the xoy plane, and the small grey dot is the projection of the track's virtual measurement trace; α is the heading angle computed from the track's velocity on the xoy plane. All black dots are rotated clockwise by α about the grey dot, and after the rotation the maximum and minimum positions on the X and Y axes give x_max, y_max, x_min and y_min. The maximum and minimum of all associated traces of each frame on the Z axis, z_max and z_min, are taken directly from the maximum and minimum heights of the associated traces.
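The per-frame rotate-and-measure step can be sketched as follows; a minimal illustration rotating the projected traces clockwise by α about the virtual measurement trace (the rotation convention is the standard clockwise one, an assumption consistent with the text).

```python
import math

def rotated_extents(points_xy, center, alpha):
    """Rotate xoy projections clockwise by alpha about the virtual trace,
    then return (x_max, x_min, y_max, y_min) of the rotated points."""
    cx, cy = center
    ca, sa = math.cos(alpha), math.sin(alpha)
    xs, ys = [], []
    for x, y in points_xy:
        dx, dy = x - cx, y - cy
        # clockwise rotation by alpha about (cx, cy)
        xs.append(cx + dx * ca + dy * sa)
        ys.append(cy - dx * sa + dy * ca)
    return max(xs), min(xs), max(ys), min(ys)
```

After this rotation the X axis is aligned with the target's heading, so the X extent tracks the target's length and the Y extent its width.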
Referring to fig. 6, the black dot is the projection of the first frame's measurement trace onto the xoy plane, the white dots are the projections of measurement traces of non-first frames, and the grey dots are the projections of the virtual measurement trace constructed in each frame. The multi-frame values x_max, y_max, x_min and y_min of the associated traces are each translated so that each frame's virtual measurement trace coordinates coincide with the first frame's measurement trace coordinates; after translation, the overall maxima on the X and Y axes are X_max and Y_max, and the overall minima are X_min and Y_min. Likewise, the multi-frame z_max and z_min are translated so that the Z coordinate of each frame's virtual measurement trace coincides with that of the first frame, and the maximum after translation is Z_max and the minimum is Z_min. The following are then computed:

length of the target: l = X_max − X_min
width of the target: w = Y_max − Y_min
height of the target: h = Z_max − Z_min
volume of the target: Vol = l · w · h
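The multi-frame alignment and dimension computation can be sketched as follows; the per-frame dict layout is an assumption, and the extents are taken as already rotated into the heading-aligned frame as described above.

```python
def target_dimensions(frames):
    """Align each frame's extents to the first frame's virtual trace,
    then derive length/width/height/volume from the overall extents.

    frames: list of dicts with 'virtual' = (x, y, z) of the frame's
    virtual measurement trace and per-frame extents x_max/x_min/
    y_max/y_min/z_max/z_min.
    """
    ref = frames[0]["virtual"]
    xs_max, xs_min, ys_max, ys_min, zs_max, zs_min = [], [], [], [], [], []
    for f in frames:
        # Shift so this frame's virtual trace coincides with the first frame's.
        dx = ref[0] - f["virtual"][0]
        dy = ref[1] - f["virtual"][1]
        dz = ref[2] - f["virtual"][2]
        xs_max.append(f["x_max"] + dx); xs_min.append(f["x_min"] + dx)
        ys_max.append(f["y_max"] + dy); ys_min.append(f["y_min"] + dy)
        zs_max.append(f["z_max"] + dz); zs_min.append(f["z_min"] + dz)
    length = max(xs_max) - min(xs_min)
    width = max(ys_max) - min(ys_min)
    height = max(zs_max) - min(zs_min)
    return length, width, height, length * width * height
```

Accumulating extents over several aligned frames fills in parts of the target that any single sparse radar frame misses.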
Referring to fig. 7, empirical values obtained from offline analysis give, for each class, thresholds on length, radar cross section and volume as functions of the target's range and horizontal angle. The target's length l, speed v, radar cross section and volume Vol are taken as target features; the class probability of the single-frame target is computed from these features; the classification probability of the current-frame target is computed as a weighted combination of the historical probability and the single-frame class probability; and the class with the maximum current-frame classification probability is taken as the final classification result of the current frame, where

v = sqrt(v_x(k)² + v_y(k)²)

and v_x(k), v_y(k) are respectively the velocities on the x axis and the y axis after the k-th Kalman filter update of the target, k being an integer greater than zero. The classes include pedestrians, motorcycles, cars, and commercial vehicles (trucks, buses), etc.
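The speed computation and the weighted probability fusion can be sketched as follows; the blending factor `beta` and the class names are illustrative assumptions (the text does not give the weighting constant).

```python
import math

def frame_speed(vx, vy):
    """Target speed from the latest Kalman-updated x/y velocities."""
    return math.hypot(vx, vy)

def fuse_probabilities(history, single_frame, beta=0.7):
    """Blend historical and single-frame class probabilities.

    history / single_frame: dicts mapping class name -> probability.
    Returns the fused distribution and the maximum-probability class,
    which is the frame's final classification result.
    """
    fused = {c: beta * history[c] + (1 - beta) * single_frame[c]
             for c in history}
    best = max(fused, key=fused.get)
    return fused, best
```

The historical term keeps the label stable across frames, while the single-frame term lets the decision recover when the feature evidence changes persistently.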
The foregoing is only a preferred embodiment of the present invention; parts not specifically described belong to the prior art or to the common general knowledge of those of ordinary skill in the art. Without departing from the principle of the invention, several improvements and modifications can be made, and these should also be construed as falling within the scope of the invention.

Claims (5)

1. A method for point cloud processing and target classification of a 4D millimeter wave radar is characterized by comprising the following steps:
inputting traces of a target acquired by a 4D millimeter wave radar and preprocessing the traces, wherein the measurement of a trace comprises the target's range r, horizontal angle θ, pitch angle φ and radial velocity v;
performing Kalman filtering prediction on all existing tracks, then adding 1 to the loss count and to the life cycle of every predicted track, and adding the radar period T to the extrapolation time of every track;
associating the preprocessed trace points with the existing flight tracks;
clustering, by density-based clustering, the preprocessed traces that were not associated with any track;
performing track initiation on the clustering result, specifically comprising:
traversing the stored clustering results and tracks, and if a position in the array storing the tracks is empty, storing a new track at that position;
performing a dynamic/static check on the traces in each class, as follows: if the ratio of moving traces to the total number of associated traces exceeds a preset threshold, marking the track as non-stationary; otherwise, marking the track as stationary;
constructing a virtual measurement trace for the initiated track, specifically as follows:

R = Σ_{i=1}^{n} w_i · r_i
Θ = Σ_{i=1}^{n} w_i · θ_i
Φ = Σ_{i=1}^{n} w_i · φ_i
V = Σ_{i=1}^{n} w_i · v_i

wherein n is the number of all traces in the class; R, Θ, Φ and V are respectively the range, horizontal angle, pitch angle and radial velocity of the constructed virtual measurement trace; r_i, θ_i, φ_i and v_i are respectively the range, horizontal angle, pitch angle and radial velocity of the i-th trace in the class; and the weight w_i = SNR_i / Σ_{j=1}^{n} SNR_j is the proportion of the i-th trace's signal-to-noise ratio in the sum of the signal-to-noise ratios of all traces in the class;
initializing the life cycle (generation period) of the initiated track to 1 and its loss count to 0, setting the filtering state to uninitialized, initializing the number of track-associated traces to 0, and setting the range, horizontal angle, pitch angle and radial velocity of the initial track to those of the constructed virtual measurement trace;
And updating the flight path, wherein the flight path updating comprises Kalman filtering updating, and the Kalman filtering updating mode is as follows: traversing all the effective tracks, judging whether each track has an associated point track in the frame, if so, initializing a Kalman filter for the track, and if the track is more than two frames, updating Kalman filtering; then, the following operations are carried out on all tracks with associated point tracks in the frame: updating the radar scattering cross section of the flight path to be the maximum value of the radar scattering cross section in the associated point path of the frame, resetting the number of the associated point paths of the flight path to be 0, and losing the flight path
Figure 244103DEST_PATH_IMAGE025
Subtracting 1, and resetting the extrapolation time of the flight path to 0;
and deleting tracks of unreal targets, or tracks that cannot be tracked stably because the target is not within the radar detection range.
2. The method for 4D millimeter wave radar point cloud processing and target classification according to claim 1, wherein the preprocessing comprises angle rectification, and the horizontal angle θ′ and pitch angle φ′ after angle rectification are respectively:

θ′ = θ − θ_0,  φ′ = φ − φ_0

wherein θ_0 and φ_0 are respectively the calibrated horizontal and pitch installation angles of the radar.
3. The method for point cloud processing and target classification of a 4D millimeter wave radar according to claim 1, wherein the preprocessing further comprises dynamic/static separation, performed as follows: the radial velocity v of a trace is decomposed onto the xoy plane to obtain a velocity v_1; the speed of the vehicle carrying the radar is decomposed along the radial direction of the trace and projected onto the xoy plane to obtain a velocity v_2; the radial velocity v of the trace is decomposed along the Z axis to obtain a velocity v_3; if the sum of v_1 and v_2 is less than a threshold ε_1 and v_3 is less than a threshold ε_2, the trace is judged to be a stationary target; otherwise, the trace is judged to be a moving target.
4. The method for point cloud processing and target classification of a 4D millimeter wave radar according to claim 1, wherein associating the preprocessed traces with the existing tracks of the current frame specifically comprises: when a preprocessed trace passes the range gate, horizontal-angle gate, pitch-angle gate and radial-velocity gate set by a track, recording the distance between the trace and that track, and finally using nearest-neighbor association to associate the trace with the closest predicted track.
5. The method of 4D millimeter wave radar point cloud processing and target classification of claim 1, wherein the track update further comprises a target classification by:
a sliding window stores several frames of track information, including the heading angle α computed on the xoy plane; for each frame, the projection points of all of the track's associated traces on the xoy plane are rotated clockwise by the heading angle α, and the resulting maxima on the X and Y axes are x_max and y_max and the resulting minima are x_min and y_min; the maximum and minimum of all associated traces of each frame on the Z axis are z_max and z_min;
the multi-frame values x_max, y_max, x_min and y_min of the associated traces are each translated so that each frame's virtual measurement trace coordinates coincide with the first frame's measurement trace coordinates; after translation, the maxima on the X and Y axes are X_max and Y_max and the minima are X_min and Y_min; the multi-frame z_max and z_min are translated so that the Z coordinate of each frame's virtual measurement trace coincides with that of the first frame, the maximum after translation being Z_max and the minimum Z_min; the following are then respectively computed:
length of the target: l = X_max − X_min;
width of the target: w = Y_max − Y_min;
height of the target: h = Z_max − Z_min;
volume of the target: Vol = l · w · h;
thresholds on the length, radar cross section and volume of each class are given according to the target's range and horizontal angle; the target's length l, speed v, radar cross section and volume Vol are taken as target features; the class probability of the single-frame target is computed from these features; the classification probability of the current-frame target is computed as a weighted combination of the historical probability and the single-frame class probability; and the class with the maximum classification probability is taken as the final classification result of the current frame, wherein v = sqrt(v_x(k)² + v_y(k)²), and v_x(k), v_y(k) are respectively the velocities on the x axis and the y axis after the k-th Kalman filter update of the target, k being an integer greater than zero.
CN202111466169.2A 2021-12-03 2021-12-03 Method for point cloud processing and target classification of 4D millimeter wave radar Active CN113866742B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN202111466169.2A CN113866742B (en) 2021-12-03 2021-12-03 Method for point cloud processing and target classification of 4D millimeter wave radar
PCT/CN2022/092002 WO2023097971A1 (en) 2021-12-03 2022-05-10 4d millimeter wave radar data processing method
DE112022000017.1T DE112022000017T5 (en) 2021-12-03 2022-05-10 DATA PROCESSING METHODS FOR 4D MILLIMETER WAVE RADAR

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111466169.2A CN113866742B (en) 2021-12-03 2021-12-03 Method for point cloud processing and target classification of 4D millimeter wave radar

Publications (2)

Publication Number Publication Date
CN113866742A CN113866742A (en) 2021-12-31
CN113866742B true CN113866742B (en) 2022-02-22

Family

ID=78985803

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111466169.2A Active CN113866742B (en) 2021-12-03 2021-12-03 Method for point cloud processing and target classification of 4D millimeter wave radar

Country Status (3)

Country Link
CN (1) CN113866742B (en)
DE (1) DE112022000017T5 (en)
WO (1) WO2023097971A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113866742B (en) * 2021-12-03 2022-02-22 南京楚航科技有限公司 Method for point cloud processing and target classification of 4D millimeter wave radar
CN115236674B (en) * 2022-06-15 2024-06-04 北京踏歌智行科技有限公司 Mining area environment sensing method based on 4D millimeter wave radar
CN115656962B (en) * 2022-12-26 2023-03-31 南京楚航科技有限公司 Method for identifying height-limited object based on millimeter wave radar
CN115825912B (en) * 2023-01-09 2023-05-23 南京隼眼电子科技有限公司 Radar signal processing method, device and storage medium
CN115840221B (en) * 2023-02-20 2023-04-25 上海几何伙伴智能驾驶有限公司 Method for achieving target feature extraction and multi-target tracking based on 4D millimeter wave radar
CN116593973A (en) * 2023-03-29 2023-08-15 深圳承泰科技有限公司 Method and system for automatically calibrating installation angle of vehicle-mounted millimeter wave radar
CN116881385B (en) * 2023-09-08 2023-12-01 中国铁塔股份有限公司 Track smoothing method, track smoothing device, electronic equipment and readable storage medium
CN116990773A (en) * 2023-09-27 2023-11-03 广州辰创科技发展有限公司 Low-speed small target detection method and device based on self-adaptive threshold and storage medium
CN117250595B (en) * 2023-11-20 2024-01-12 长沙莫之比智能科技有限公司 False alarm suppression method for vehicle-mounted millimeter wave radar metal well lid target
CN117491965B (en) * 2024-01-02 2024-03-19 上海几何伙伴智能驾驶有限公司 Target track starting method based on 4D millimeter wave radar
CN117647806B (en) * 2024-01-30 2024-04-12 安徽隼波科技有限公司 Point trace condensation and target tracking method based on millimeter wave radar
CN117647807B (en) * 2024-01-30 2024-04-19 安徽隼波科技有限公司 Motor vehicle size estimation method based on millimeter wave radar

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110031834A (en) * 2018-01-12 2019-07-19 西安艾索信息技术有限公司 A kind of improved multiple target radar track processing method
CN111428573A (en) * 2020-03-02 2020-07-17 南京莱斯电子设备有限公司 Infrared weak and small target detection false alarm suppression method under complex background
CN111929655A (en) * 2020-09-08 2020-11-13 中国电子科技集团公司第三十八研究所 Automobile millimeter wave radar road target tracking method and system
CN112166336A (en) * 2019-09-27 2021-01-01 深圳市大疆创新科技有限公司 Method and device for calibrating pitching installation angle of millimeter wave radar, vehicle control system and vehicle
CN113671481A (en) * 2021-07-21 2021-11-19 西安电子科技大学 3D multi-target tracking processing method based on millimeter wave radar
CN113721234A (en) * 2021-08-30 2021-11-30 南京慧尔视智能科技有限公司 Vehicle-mounted millimeter wave radar point cloud data dynamic and static separation filtering method and device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10705198B2 (en) * 2018-03-27 2020-07-07 Infineon Technologies Ag System and method of monitoring an air flow using a millimeter-wave radar sensor
WO2019234795A1 (en) * 2018-06-04 2019-12-12 三菱電機株式会社 Light impingement device
CN113866742B (en) * 2021-12-03 2022-02-22 南京楚航科技有限公司 Method for point cloud processing and target classification of 4D millimeter wave radar


Also Published As

Publication number Publication date
CN113866742A (en) 2021-12-31
WO2023097971A1 (en) 2023-06-08
DE112022000017T5 (en) 2023-07-27


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant