CN113420743A - Radar-based target classification method, system and storage medium - Google Patents


Publication number
CN113420743A
Authority
CN
China
Prior art keywords
target
category
class
final
radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110978264.4A
Other languages
Chinese (zh)
Inventor
胡溢鑫
郭坤鹏
张燎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Hawkeye Electronic Technology Co Ltd
Original Assignee
Nanjing Hawkeye Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Hawkeye Electronic Technology Co Ltd filed Critical Nanjing Hawkeye Electronic Technology Co Ltd
Priority to CN202110978264.4A
Publication of CN113420743A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2411 Classification techniques relating to the classification model based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G06F 18/243 Classification techniques relating to the number of classes
    • G06F 18/24323 Tree-organised classifiers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a radar-based target classification method, system and storage medium. The target classification method comprises the following steps: acquiring target parameters of a target object through the radar; inputting the target parameters into a preset classification model to obtain a reference category of the target object; and determining the final category of the target object based on the reference category and according to a preset output strategy. By setting an output strategy, the radar-based target classification method effectively improves the classification accuracy of the radar and broadens the fields in which the radar can be applied.

Description

Radar-based target classification method, system and storage medium
Technical Field
The present application relates to the field of radar identification technologies, and in particular, to a radar-based target classification method, system, and storage medium.
Background
In the prior art, in practical application scenarios of the millimeter-wave radar, the radar can only provide information such as the direction and speed of a target object; it cannot identify the type of the target object. In other words, a user relying on the millimeter-wave radar alone cannot determine what kind of object the target is. The root cause is that the classification accuracy of existing millimeter-wave-radar classification schemes is too low, so their classification results have no reference value. This rules out mass production of millimeter-wave radars for target classification and greatly limits the development of the technology.
Therefore, it is necessary to provide a solution to the problems in the prior art.
Disclosure of Invention
The invention aims to provide a radar-based target classification method, system and storage medium, so as to solve the problem of the low classification accuracy of the millimeter-wave radar.
In order to achieve the above object, an embodiment of the present invention provides a target classification method based on radar, including: acquiring target parameters of a target object through the radar; inputting the target parameters into a preset classification model to obtain a reference class of the target object; and determining the final category of the target object based on the reference category and according to a preset output strategy.
Further, the step of acquiring target parameters of a target object through the radar is preceded by: acquiring at least two sets of training samples through the radar, wherein each set of training samples carries a sample identifier indicating reference parameters of reference objects of the same category; and importing the training samples into a preset machine learning model for training so as to construct the classification model. And the step of inputting the target parameters into a preset classification model to obtain the reference category of the target object comprises: outputting the reference category of the target object according to the constructed classification model and the target parameters of the target object.
Further, the step of acquiring target parameters of a target object through the radar is preceded by: judging whether the target identifier of the target object indicates a final category; and, when it is determined that the target identifier of the target object indicates a final category, re-executing the judging step.
Further, the target parameters include: at least one of the number of radar reflection points, the size of the target object, the signal-to-noise ratio, the radar scattering cross section area, the ground speed of the target object, the acceleration of the target object and the azimuth of the target object.
Further, the step of determining the final category of the target object based on the reference category and according to a preset output strategy comprises: saving at least one of the reference category and a confidence score corresponding to the reference category into a target position corresponding to the target object.
Further, the step of determining a final class of the target object based on the reference class and according to a preset output strategy further includes: obtaining a total number of reference categories in the target location; determining whether the total number of reference categories is equal to a first threshold; acquiring a total number of reference categories belonging to the same category in the target position when it is determined that the total number of reference categories is equal to the first threshold; judging whether the total number of the reference categories belonging to the same category in the target position is greater than or equal to a second threshold value; when the total number of the reference categories belonging to the same category in the target position is determined to be greater than or equal to the second threshold value, determining the value of the final category of the target object as the value of the reference category of the same category.
Further, the step of determining a final class of the target object based on the reference class and according to a preset output strategy further includes: acquiring all confidence scores corresponding to reference categories belonging to the same category in a target position corresponding to the target object; calculating an average of all confidence scores corresponding to reference categories belonging to the same category; comparing the average values corresponding to the reference classes belonging to the different classes to obtain a maximum average value; acquiring a reference category corresponding to the maximum average value; and determining the value of the final class of the target object as the value of the reference class corresponding to the maximum average value.
Further, the step of determining a final class of the target object based on the reference class and according to a preset output strategy further includes: judging whether a final category exists in a target position corresponding to the target object; when the final category is judged to exist, acquiring the priority of the reference category and the priority of the final category; judging whether the priority of the reference category is greater than that of the final category; replacing the value of the final class with the value of the reference class when it is determined that the priority of the reference class is greater than the priority of the final class.
The embodiment of the invention also provides a target classification system based on radar, which comprises: the target parameter acquisition unit is used for acquiring target parameters of a target object through the radar; a reference category acquiring unit, configured to input the target parameter into a preset classification model to acquire a reference category of the target object; and a final class determination unit for determining a final class of the target object based on the reference class and according to a preset output strategy.
The embodiment of the invention also provides a storage medium, wherein the storage medium stores a computer program, and when the computer program is read and operated by a processor, the computer program executes any step in the target classification method based on the radar.
The radar-based target classification method has the advantage that, by setting an output strategy, it effectively improves the classification accuracy of the radar and also broadens the fields in which the radar can be applied. The same holds for the radar-based target classification system described in the embodiments of the present invention.
Drawings
The technical solution and other advantages of the present application will become apparent from the detailed description of the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a flowchart of a target classification method based on radar according to an embodiment of the present disclosure.
Fig. 2 is a flowchart of the steps preceding the radar-based target classification method according to the above embodiment of the present application.
Fig. 3 is a first flowchart of step S300 shown in fig. 1.
Fig. 4 is a second flowchart of step S300 shown in fig. 1.
Fig. 5 is a third flowchart of step S300 shown in fig. 1.
Fig. 6 is a schematic structural diagram of a target classification system based on radar according to an embodiment of the present application.
Fig. 7 is a schematic diagram of a first structure of the final category determination unit shown in fig. 6.
Fig. 8 is a schematic diagram of a second structure of the final category determination unit shown in fig. 6.
Description of reference numerals:
100. a target classification system; 110. A target parameter acquisition unit;
120. a reference category acquisition unit; 130. a final category determination unit;
131A, a category total number acquisition unit; 132A, a first threshold comparison unit;
133A, a same-category acquisition unit; 134A, a second threshold comparison unit;
135A, a final type setting unit; 131B, a confidence score obtaining unit;
132B, a whole mean value acquisition unit; 133B, a maximum mean value obtaining unit;
134B, a mean value type acquisition unit; 135B, final category corresponding unit.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, the present embodiment provides a target classification method based on radar, including the following steps.
Step S100, acquiring target parameters of a target object through a radar. The target object can be any object or pedestrian within the radar detection range. The target parameters include: at least one of the number of radar reflection points, the size of the target object, the signal-to-noise ratio, the radar cross-section, the ground speed of the target object, the acceleration of the target object and the azimuth of the target object.
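For illustration only, these parameters map naturally onto a small record type; the field names and units below are assumptions of this sketch, not terms from the patent:

```python
from dataclasses import dataclass

@dataclass
class TargetParameters:
    """One radar observation of a tracked object (hypothetical field names/units)."""
    reflection_points: int   # number of radar reflection points
    size: float              # estimated object size, m
    snr: float               # signal-to-noise ratio, dB
    rcs: float               # radar cross-section, m^2
    ground_speed: float      # speed over ground, m/s
    acceleration: float      # m/s^2
    azimuth: float           # bearing of the object, degrees
```

Such a record would be the input handed to the classification model in step S200.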
As shown in fig. 2, step S100 may include steps S10 and S20.
Step S10, acquiring at least two sets of training samples through the radar. Each set of training samples carries a sample identifier indicating reference parameters of reference objects of the same category. The reference parameters include: at least one of the number of radar reflection points, the size of the reference object, the signal-to-noise ratio, the radar cross-section, the ground speed of the reference object, the acceleration of the reference object and the azimuth of the reference object. Specifically, each sample identifier corresponds to one target object type, and the correspondence can be set by the user. Illustratively, the training samples may include training samples labeled as pedestrians, motor vehicles, non-motor vehicles and irrelevant objects. The more types of sample identifiers there are, the more target object types the trained machine learning model can identify.
Illustratively, when the reference parameter is the ground speed of the reference object, the sample identifiers are assigned according to the ground-speed ranges of the different reference objects. For example: the maximum speed of the first ground-speed range is greater than that of the second, the maximum of the second is greater than that of the third, and the maximum of the third is greater than that of the fourth. Taking trains, motor vehicles, non-motor vehicles and pedestrians as the sample identifiers, the first ground-speed range is reachable only by trains; the second is reachable by trains and motor vehicles; the third by trains, motor vehicles and non-motor vehicles; and the fourth by trains, motor vehicles, non-motor vehicles and pedestrians. The sample identifier corresponding to a reference object can then be generated by comparing its ground speed against the first to fourth ranges.
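The nested speed ranges above can be sketched as follows; the numeric limits are invented for illustration (the description fixes only the ordering of the four ranges by maximum speed, not their values):

```python
# Hypothetical ground-speed limits in km/h; only the ordering
# 1st > 2nd > 3rd > 4th (by maximum speed) is taken from the description.
SPEED_RANGES = {
    "train": (0.0, 300.0),             # first range: top speeds reachable only by trains
    "motor_vehicle": (0.0, 120.0),     # second range
    "non_motor_vehicle": (0.0, 25.0),  # third range
    "pedestrian": (0.0, 8.0),          # fourth range
}

def candidate_classes(ground_speed):
    """Return every class whose ground-speed range covers the measured speed."""
    return {cls for cls, (lo, hi) in SPEED_RANGES.items() if lo <= ground_speed <= hi}
```

A measured speed of 60 km/h, for instance, falls in the ranges reachable by trains and motor vehicles but excludes non-motor vehicles and pedestrians.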
Illustratively, when the reference parameter is the size of the reference object, the sample identifiers are assigned according to the size ranges of the different reference objects. For example: the maximum size of the first size range is greater than that of the second, the maximum of the second is greater than that of the third, and the maximum of the third is greater than that of the fourth. Taking trains, trucks/commercial vehicles, cars/minibuses and non-motor vehicles/pedestrians as the sample identifiers, the first size range covers trains, the second covers trucks/commercial vehicles, the third covers cars/minibuses, and the fourth covers non-motor vehicles/pedestrians. The sample identifier corresponding to a reference object can then be generated by comparing its size against the first to fourth ranges.
Step S20, importing the training samples into a preset machine learning model for training so as to construct a classification model. The classification model is applied in the subsequent step S200. The description here is only illustrative and does not limit the type of machine learning model; a manufacturer may select whichever model suits its actual requirements.
Illustratively, the preset machine learning model may be an SVM (support vector machine) model. The training samples of the SVM model can use the most representative reference parameters, for example: the number of detection points, the target size, the total RCS (radar cross-section), the longitudinal ground speed, the longitudinal distance, and so on. When the SVM model is applied to the radar-based target classification method provided by this embodiment, the accuracy of the obtained reference category is between 80% and 90%, and obtaining a reference category takes about 70 microseconds. The SVM model also generalizes well; in other words, it copes well with target parameters unlike those it was trained on.
Illustratively, the preset machine learning model may be a random forest model. The training samples of the random forest model can use the most representative reference parameters, for example: the number of detection points, the target size, the total RCS (radar cross-section), the longitudinal ground speed, the longitudinal distance, and so on. When the random forest model is applied to the radar-based target classification method provided by this embodiment, the accuracy of the obtained reference category is between 80% and 95%, and obtaining a reference category takes about 60 microseconds. The random forest model handles large reference-parameter sets, resists overfitting and generalizes well, so it can be applied to most traffic scenes.
Illustratively, the preset machine learning model may be a linear discriminant model. The training samples of the linear discriminant model can use the most representative reference parameters, for example: the number of detection points, the target size, the total RCS (radar cross-section), the longitudinal ground speed, the longitudinal distance, and so on. When the linear discriminant model is applied to the radar-based target classification method provided by this embodiment, the accuracy of the obtained reference category is between 80% and 90%, and obtaining a reference category takes about 40 microseconds. The linear discriminant model obtains reference categories faster than the other machine learning models, with acceptable accuracy, so it can be applied to traffic scenes with a large number of target objects.
Illustratively, the preset machine learning model may be a stepwise boosting model. The training samples of the stepwise boosting model can use the most representative reference parameters, for example: the number of detection points, the target size, the total RCS (radar cross-section), the longitudinal ground speed, the longitudinal distance, and so on. When the stepwise boosting model is applied to the radar-based target classification method provided by this embodiment, the accuracy of the obtained reference category is between 80% and 95%, and obtaining a reference category takes about 80 microseconds. The classification precision of the stepwise boosting model is extremely high, so it can be applied to traffic scenes with many types of target object.
Illustratively, the preset machine learning model may be a KNN (k-nearest-neighbour) model. The training samples of the KNN model may use the most representative reference parameters, for example: the number of detection points, the target size, the total RCS (radar cross-section), the longitudinal ground speed, the longitudinal distance and so on, and may also use less representative reference parameters. When the KNN model is applied to the radar-based target classification method provided by this embodiment, the accuracy of the obtained reference category is between 80% and 95%, and obtaining a reference category takes about 35 microseconds.
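As a concrete illustration of the KNN option, a from-scratch k-nearest-neighbour vote over feature vectors might look like this (a minimal sketch; the feature encoding is an assumption, not the patent's implementation):

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs; query: a feature vector.
    Returns the majority label among the k training samples nearest to query."""
    nearest = sorted(train, key=lambda sample: math.dist(sample[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]
```

In the patent's setting, the feature vector would hold reference parameters such as the detection-point count, target size and total RCS.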
Further, step S100 may also be preceded by the following steps: judging whether the target identifier of the target object indicates a final category; and, when it does, re-executing the judging step. That is, once the target identifier indicates a final category, the category of that object is not discriminated again, which ensures the stability of the system and avoids wasted computation.
Step S200, inputting the target parameters into the preset classification model to obtain a reference category of the target object. Step S300, determining the final category of the target object based on the reference category and according to a preset output strategy. Further, step S300 may be preceded by the following step: saving the reference category and/or the confidence score corresponding to the reference category into a target position corresponding to the target object. The target position may be, for example, a database; in this embodiment one target position corresponds to one target object, so that the parameters of different target objects are kept apart. The confidence score reflects how accurate the machine learning model itself believes a reference category to be; in other words, the higher the confidence score of a reference category, the more accurate the model considers it (this is the model's own estimate, not an objective accuracy). The preset output strategy is a strategy for improving the accuracy of the final category; if no such improvement is needed, the output strategy can simply be to take the reference category of the target object as its final category. Other output strategies are described in detail below.
Referring to fig. 3, step S300 optionally includes steps S3101 to S3501, that is, the preset output policy is the output policy provided in steps S3101 to S3501.
Step S3101, the total number of reference categories in the target position is acquired.
Step S3201, it is determined whether the total number of reference categories is equal to a first threshold. The selection range of the first threshold may be set according to actual road condition requirements, and illustratively, the larger the first threshold is, the higher the accuracy of the final category is, but the more time is consumed for the calculation.
Step S3301, when it is determined that the total number of reference categories is equal to the first threshold, acquiring the total number of reference categories belonging to the same category in the target position. Illustratively, when it is determined that the total number of reference categories is smaller than the first threshold, step S3101 is re-executed.
Step S3401, determine whether the total number of reference categories belonging to the same category in the target position is greater than or equal to a second threshold. The selection range of the second threshold value can be set according to actual road condition requirements. Illustratively, the larger the second threshold, the higher the accuracy of the final class, but the more time-consuming the operation. Optionally, the total number of the reference categories belonging to the same category in the target position may also be compared to obtain the maximum total number, and the reference category corresponding to the maximum total number is determined as the final category.
Step S3501, when it is determined that the total number of reference categories belonging to the same category in the target position is greater than or equal to the second threshold, setting the value of the final category of the target object to the value of that reference category. Illustratively, when the total number of reference categories belonging to the same category in the target position is smaller than the second threshold, the reference category count is reset and step S3101 is re-executed.
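The counting strategy of steps S3101 to S3501 reduces to one decision function. The sketch below returns None when the caller should keep accumulating (or reset and start over); the category names and threshold values used in the test are placeholders:

```python
from collections import Counter

def decide_final_category(reference_categories, first_threshold, second_threshold):
    """Steps S3101-S3501: once first_threshold reference categories have been
    stored for a target position, output the category occurring at least
    second_threshold times; otherwise return None so the caller resets."""
    if len(reference_categories) != first_threshold:
        return None  # total not yet equal to the first threshold (S3201)
    category, count = Counter(reference_categories).most_common(1)[0]
    return category if count >= second_threshold else None
```

Raising either threshold improves the accuracy of the final category at the cost of more accumulation time, matching the trade-off described above.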
As shown in fig. 4, step S300 optionally includes steps S3102 through S3502.
Step S3102, all confidence scores corresponding to the reference categories belonging to the same category in the target position corresponding to the target object are acquired.
Step S3202 calculates an average of all the confidence scores corresponding to the reference categories belonging to the same category. Alternatively, the sum of all the confidence scores corresponding to the reference classes belonging to the same class may also be calculated.
Step S3302, average values corresponding to reference categories belonging to different categories are compared to obtain a maximum average value. Optionally, the sum values corresponding to reference classes belonging to different classes may also be compared to obtain a maximum sum value.
And step S3402, acquiring a reference category corresponding to the maximum average value. Alternatively, a reference category corresponding to the maximum sum value may also be acquired.
Step S3502, determining that the final class value of the target object is the reference class value corresponding to the maximum average value. Alternatively, the value of the final class of the target object may also be determined as the value of the reference class corresponding to the maximum sum value.
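Steps S3102 to S3502 amount to picking the category with the highest mean confidence score; a minimal sketch (the categories and scores in the test are placeholders):

```python
from collections import defaultdict

def final_by_mean_confidence(scored_refs):
    """Steps S3102-S3502: scored_refs is a list of (category, confidence)
    pairs stored at the target position; returns the category whose
    confidence scores have the highest average."""
    buckets = defaultdict(list)
    for category, score in scored_refs:
        buckets[category].append(score)
    return max(buckets, key=lambda c: sum(buckets[c]) / len(buckets[c]))
```

Replacing the mean with a plain sum of scores gives the sum-based variant mentioned as an alternative in each step above.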
Referring to fig. 5, step S300 optionally includes steps S3103 to S3403.
Step S3103, judging whether a final category already exists in the target position corresponding to the target object.
Step S3203, when it is determined that the final category exists, acquires the priority of the reference category and the priority of the final category.
And step S3303, judging whether the priority of the reference category is greater than that of the final category.
Step S3403, when it is determined that the priority of the reference category is greater than that of the final category, replacing the value of the final category with the value of the reference category. Illustratively, the priority is positively correlated with object volume: the pedestrian category has priority 1, the non-motor-vehicle category priority 2, and the motor-vehicle category priority 3. If the final category of a target object has been identified as pedestrian but its reference categories include non-motor vehicle, the final category is replaced with non-motor vehicle, since the non-motor-vehicle category has higher priority than the pedestrian category (a non-motor vehicle is normally larger than a pedestrian). If the final category has been identified as motor vehicle and the reference categories include non-motor vehicle, the final category is left unchanged, since the non-motor-vehicle category has lower priority than the motor-vehicle category (a motor vehicle is normally larger than a non-motor vehicle). Further, step S3403 may include the following steps: when the priority of the reference category is judged to be greater than that of the final category, judging whether the confidence score of the reference category is greater than or equal to a threshold; and replacing the value of the final category with the value of the reference category only when it is, so as to improve the accuracy of the final category.
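The priority-replacement strategy of steps S3103 to S3403, including the optional confidence check, can be sketched as follows (the priority table and the default threshold value are illustrative, not specified by the patent):

```python
# Priority grows with typical object volume, as in the example above.
PRIORITY = {"pedestrian": 1, "non_motor_vehicle": 2, "motor_vehicle": 3}

def maybe_replace_final(final_category, reference_category, confidence, threshold=0.8):
    """Replace the stored final category with the new reference category only
    if the reference category has strictly higher priority and its confidence
    score reaches the threshold; otherwise keep the existing final category."""
    if (PRIORITY[reference_category] > PRIORITY[final_category]
            and confidence >= threshold):
        return reference_category
    return final_category
```

The confidence check prevents a single high-priority but low-confidence observation from overwriting an established final category.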
According to the target classification method based on the radar, the classification accuracy of the radar is effectively improved in a mode of setting an output strategy, and the application field of the radar is expanded.
Referring to fig. 6, based on the same inventive concept, the present embodiment further provides a target classification system 100 based on radar, where the target classification system 100 includes: a target parameter acquiring unit 110, a reference category acquiring unit 120, and a final category determining unit 130.
The target parameter acquiring unit 110 is configured to acquire target parameters of a target object through the radar. The target object can be any object or pedestrian within the radar detection range. The target parameters include: at least one of the number of radar reflection points, the size of the target object, the signal-to-noise ratio, the radar cross-section, the ground speed of the target object, the acceleration of the target object and the azimuth of the target object. The reference category acquiring unit 120 is configured to input the target parameters into the preset classification model to obtain the reference category of the target object. The final category determination unit 130 is configured to determine the final category of the target object based on the reference category and according to a preset output strategy. Further, the target classification system 100 includes a reference category storage unit configured to store the reference category and/or the confidence score corresponding to the reference category in a target position corresponding to the target object. The target position may be, for example, a database. The confidence score reflects how accurate the machine learning model itself believes a reference category to be; in other words, the higher the confidence score of a reference category, the more accurate the model considers it (this is the model's own estimate, not an objective accuracy). The preset output strategy is a strategy for improving the accuracy of the final category; if no such improvement is needed, the output strategy can simply be to take the reference category of the target object as its final category. Other output strategies are described in detail above.
Further, the target classification system 100 further includes a training sample obtaining unit and a classification model constructing unit.
The training sample acquiring unit is configured to acquire at least two groups of training samples by radar, where each group of training samples carries a sample identifier indicating that its reference parameters belong to reference objects of the same category. The reference parameters include at least one of: the number of radar reflection points, the size of the reference object, the signal-to-noise ratio, the radar cross-section area, the ground speed of the reference object, the acceleration of the reference object, and the azimuth of the reference object. Specifically, each sample identifier corresponds to one target object category, and the correspondence can be set by the user. Illustratively, the training samples may include samples labeled as pedestrians, motor vehicles, non-motor vehicles, and irrelevant objects. The more types of sample identifiers there are, the more target categories the trained machine learning model can recognize.
Illustratively, when the reference parameter is the ground speed of the reference object, the sample identifier is assigned according to the ground speed ranges of different reference objects. For example: the maximum speed of the first ground speed range is greater than that of the second, the maximum speed of the second is greater than that of the third, and the maximum speed of the third is greater than that of the fourth. Taking bullet trains, motor vehicles, non-motor vehicles, and pedestrians as the sample identifiers: the first ground speed range is reachable only by bullet trains; the second is reachable by bullet trains and motor vehicles; the third by bullet trains, motor vehicles, and non-motor vehicles; and the fourth by bullet trains, motor vehicles, non-motor vehicles, and pedestrians. The sample identifier corresponding to a reference object can then be generated with reference to the first, second, third, and fourth ground speed ranges.
Illustratively, when the reference parameter is the size of the reference object, the sample identifier is assigned according to the size ranges of different reference objects. For example: the maximum size of the first size range is greater than that of the second, the maximum size of the second is greater than that of the third, and the maximum size of the third is greater than that of the fourth. Taking bullet trains, trucks/commercial vehicles, cars/minibuses, and non-motor vehicles/pedestrians as the sample identifiers: the first size range covers sizes reachable by bullet trains, the second by trucks/commercial vehicles, the third by cars/minibuses, and the fourth by non-motor vehicles/pedestrians. The sample identifier corresponding to a reference object can then be generated with reference to the first, second, third, and fourth size ranges.
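The nested-range labeling described above can be sketched as follows. The concrete speed bounds (in m/s) are invented for illustration; the text only specifies that the ranges are nested from bullet train down to pedestrian:

```python
# Hedged sketch of labeling training samples by ground-speed ranges.
# Upper bounds are assumed values; the patent gives only the ordering.
SPEED_UPPER_BOUNDS = [  # (max ground speed in m/s, sample identifier)
    (2.5, "pedestrian"),        # fourth range: reachable by all four classes
    (15.0, "non_motor_vehicle"),
    (50.0, "motor_vehicle"),
    (100.0, "bullet_train"),    # first range: only bullet trains reach it
]

def sample_identifier_for_speed(ground_speed):
    """Return the identifier of the narrowest class whose range covers the speed."""
    for upper, label in SPEED_UPPER_BOUNDS:
        if ground_speed <= upper:
            return label
    return "bullet_train"

print(sample_identifier_for_speed(1.0))   # pedestrian
print(sample_identifier_for_speed(30.0))  # motor_vehicle
```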
The classification model constructing unit is configured to feed the training samples into a preset machine learning model for training so as to construct the classification model. The models below are only examples; the type of machine learning model is not limited, and a manufacturer may select whichever model suits its actual requirements.
Illustratively, the preset machine learning model may be an SVM (support vector machine) algorithm model. The training samples of the SVM algorithm model may use the most representative reference parameters, for example: the number of detection points, the target size, the total RCS (radar cross-section), the longitudinal ground speed, the longitudinal distance, and so on. When the SVM algorithm model is applied to the radar-based target classification method provided by this embodiment, the accuracy of the obtained reference category is between 80% and 90%, and the time to obtain the reference category is about 70 microseconds. The SVM algorithm model generalizes well; in other words, it can handle target parameters it was not trained on.
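A minimal SVM training sketch using scikit-learn and the representative reference parameters named above. The synthetic clusters stand in for real radar training samples; the feature values are assumptions:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Feature columns: detection points, target size (m), RCS total,
# longitudinal ground speed (m/s), longitudinal distance (m).
pedestrians = rng.normal([3, 0.5, 1.0, 1.5, 20], 0.3, size=(50, 5))
vehicles = rng.normal([20, 4.5, 15.0, 15.0, 40], 0.3, size=(50, 5))
X = np.vstack([pedestrians, vehicles])
y = np.array([0] * 50 + [1] * 50)  # 0 = pedestrian, 1 = motor vehicle

clf = SVC(kernel="rbf").fit(X, y)
print(clf.predict([[20.1, 4.4, 15.2, 15.1, 40.2]])[0])  # 1 (motor vehicle)
```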
Illustratively, the preset machine learning model may be a random forest algorithm model. The training samples of the random forest algorithm model may likewise use the most representative reference parameters, such as the number of detection points, the target size, the total RCS (radar cross-section), the longitudinal ground speed, and the longitudinal distance. When the random forest algorithm model is applied to the radar-based target classification method provided by this embodiment, the accuracy of the obtained reference category is between 80% and 95%, and the time to obtain the reference category is about 60 microseconds. The random forest algorithm model can handle a large set of reference parameters, resists overfitting, and generalizes well, so it is applicable to most traffic scenes.
Illustratively, the preset machine learning model may be a linear discriminant algorithm model. The training samples of the linear discriminant algorithm model may likewise use the most representative reference parameters, such as the number of detection points, the target size, the total RCS (radar cross-section), the longitudinal ground speed, and the longitudinal distance. When the linear discriminant algorithm model is applied to the radar-based target classification method provided by this embodiment, the accuracy of the obtained reference category is between 80% and 90%, and the time to obtain the reference category is about 40 microseconds. The linear discriminant algorithm model obtains the reference category faster than the other machine learning models, with acceptable accuracy, so it is suitable for traffic scenes containing a large number of target objects.
Illustratively, the preset machine learning model may be a boosting (stepwise enhancement) algorithm model. The training samples of the boosting algorithm model may likewise use the most representative reference parameters, such as the number of detection points, the target size, the total RCS (radar cross-section), the longitudinal ground speed, and the longitudinal distance. When the boosting algorithm model is applied to the radar-based target classification method provided by this embodiment, the accuracy of the obtained reference category is between 80% and 95%, and the time to obtain the reference category is about 80 microseconds. The boosting algorithm model has very high classification precision, so it is suitable for traffic scenes with many target object categories.
Illustratively, the preset machine learning model may be a KNN algorithm model (i.e., a k-nearest-neighbour algorithm model). The training samples of the KNN algorithm model may use the most representative reference parameters, such as the number of detection points, the target size, the total RCS (radar cross-section), the longitudinal ground speed, and the longitudinal distance, but may also use less representative reference parameters. When the KNN model is applied to the radar-based target classification method provided by this embodiment, the accuracy of the obtained reference category is between 80% and 95%, and the time to obtain the reference category is about 35 microseconds.
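A dependency-free sketch of the k-nearest-neighbour idea over the same representative parameters. The feature values, labels, and k are illustrative assumptions:

```python
import math

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label) pairs. Returns the majority
    label among the k training samples nearest to the query by Euclidean
    distance."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    nearest = [label for _, label in dists[:k]]
    return max(set(nearest), key=nearest.count)

# Columns: detection points, target size (m), RCS total (assumed values).
train = [
    ([3, 0.5, 1.0], "pedestrian"),
    ([4, 0.6, 1.2], "pedestrian"),
    ([2, 0.4, 0.9], "pedestrian"),
    ([20, 4.5, 15.0], "motor_vehicle"),
    ([22, 4.8, 16.0], "motor_vehicle"),
    ([18, 4.2, 14.0], "motor_vehicle"),
]
print(knn_predict(train, [19, 4.4, 15.5]))  # motor_vehicle
```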
Further, the target classification system 100 includes a final category judging unit configured to determine whether the target identifier of the target object is already marked as a final category. That is, once the target identifier is marked as a final category, the category of the target object is no longer re-evaluated, which ensures the stability of the system and the efficiency of the computation.
Referring to fig. 7, optionally, the final category determining unit 130 includes a category total acquiring unit 131A, a first threshold comparing unit 132A, a same category acquiring unit 133A, a second threshold comparing unit 134A, and a final category setting unit 135A.
The category total number acquisition unit 131A is configured to acquire the total number of reference categories in the target position.
The first threshold comparing unit 132A is configured to determine whether the total number of reference categories is equal to a first threshold. The first threshold can be set according to actual road-condition requirements; illustratively, the larger the first threshold, the higher the accuracy of the final category, but the more time the computation consumes.
The same category acquisition unit 133A is configured to acquire the total number of reference categories belonging to the same category in the target position.
The second threshold comparing unit 134A is configured to determine whether the total number of reference categories belonging to the same category in the target position is greater than or equal to a second threshold. The second threshold can likewise be set according to actual road-condition requirements; illustratively, the larger the second threshold, the higher the accuracy of the final category, but the more time the computation consumes. Optionally, the per-category totals of reference categories in the target position may instead be compared to find the largest total, and the reference category with that largest total determined as the final category.
The final category setting unit 135A is configured to set the value of the final category of the target object to the value of that same-category reference category.
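The count-based output strategy implemented by units 131A-135A can be sketched as follows; the concrete threshold values are illustrative assumptions:

```python
# Once the number of stored reference categories reaches the first threshold,
# a category whose per-class count reaches the second threshold becomes the
# final category; otherwise no final category is set yet.
from collections import Counter

def count_based_final_category(stored_refs, first_threshold=5, second_threshold=3):
    if len(stored_refs) != first_threshold:
        return None  # not enough reference categories accumulated yet
    label, count = Counter(stored_refs).most_common(1)[0]
    return label if count >= second_threshold else None

refs = ["pedestrian", "pedestrian", "non_motor_vehicle", "pedestrian", "pedestrian"]
print(count_based_final_category(refs))  # pedestrian
```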
Referring to fig. 8, the final class determination unit 130 may optionally include a confidence score obtaining unit 131B, an overall mean obtaining unit 132B, a maximum mean obtaining unit 133B, a mean class obtaining unit 134B, and a final class corresponding unit 135B.
The confidence score acquiring unit 131B is configured to acquire all the confidence scores corresponding to the reference categories belonging to the same category in the target position corresponding to the target object.
The overall average acquiring unit 132B is configured to calculate the average of all confidence scores corresponding to the reference categories belonging to the same category. Alternatively, the overall average acquiring unit 132B may calculate the sum of all those confidence scores.
The maximum average value obtaining unit 133B is configured to compare average values corresponding to reference categories belonging to different categories to obtain a maximum average value. Optionally, the maximum average acquisition unit 133B is configured to compare the sum values corresponding to the reference categories belonging to different categories to obtain a maximum sum value.
The mean class acquiring unit 134B is configured to acquire a reference class corresponding to the maximum mean value. Optionally, the mean category acquiring unit 134B is configured to acquire a reference category corresponding to the maximum sum value.
The final category corresponding unit 135B is configured to set the value of the final category of the target object to the value of the reference category corresponding to the maximum average value. Optionally, the final category corresponding unit 135B sets it to the value of the reference category corresponding to the maximum sum value.
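The confidence-score strategy of units 131B-135B can be sketched as follows: group the stored (category, confidence) pairs by category, reduce each group by mean (or, in the optional variant, by sum), and pick the category with the largest value. Names and scores are illustrative:

```python
from collections import defaultdict

def confidence_final_category(scored_refs, use_sum=False):
    """scored_refs: list of (category, confidence_score) pairs stored in the
    target position. Returns the category with the highest mean (or sum)."""
    groups = defaultdict(list)
    for category, score in scored_refs:
        groups[category].append(score)
    reduce = sum if use_sum else (lambda s: sum(s) / len(s))
    return max(groups, key=lambda c: reduce(groups[c]))

refs = [("pedestrian", 0.9), ("pedestrian", 0.8), ("non_motor_vehicle", 0.95)]
print(confidence_final_category(refs))                # non_motor_vehicle (mean 0.95 > 0.85)
print(confidence_final_category(refs, use_sum=True))  # pedestrian (sum 1.7 > 0.95)
```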
Optionally, the final category determining unit 130 includes a category existence unit, a priority obtaining unit, a priority comparison unit, and a final category replacing unit.
The category existence unit is configured to determine whether a final category already exists in the target position corresponding to the target object. The priority acquiring unit is configured to acquire the priority of the reference category and the priority of the final category. The priority comparing unit is configured to determine whether the priority of the reference category is greater than that of the final category. The final category replacing unit is configured to replace the value of the final category with the value of the reference category when it is.
Illustratively, the priority is positively correlated with volume: the pedestrian category has priority 1, the non-motor vehicle category priority 2, and the motor vehicle category priority 3. If the final category of a target object has been identified as pedestrian but its reference category includes non-motor vehicle, the final category is replaced with non-motor vehicle, since the priority of the non-motor vehicle category is greater than that of the pedestrian category (a non-motor vehicle is normally larger than a pedestrian). If the final category of a target object has been identified as motor vehicle and its reference category includes non-motor vehicle, the final category is not changed, since the priority of the non-motor vehicle category is less than that of the motor vehicle category (a motor vehicle is normally larger than a non-motor vehicle).
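The priority-based replacement rule above can be sketched directly; the priority numbers follow the example, and the handling of a missing final category is an assumption:

```python
# Priority is positively correlated with typical volume, per the example above.
PRIORITY = {"pedestrian": 1, "non_motor_vehicle": 2, "motor_vehicle": 3}

def apply_priority_strategy(final_category, reference_category):
    """Replace the final category with the reference category only when the
    reference category has the higher (larger-volume) priority."""
    if final_category is None:
        return reference_category  # assumed: no final category yet, adopt reference
    if PRIORITY[reference_category] > PRIORITY[final_category]:
        return reference_category
    return final_category

print(apply_priority_strategy("pedestrian", "non_motor_vehicle"))     # non_motor_vehicle
print(apply_priority_strategy("motor_vehicle", "non_motor_vehicle"))  # motor_vehicle
```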
Optionally, the final category determining unit 130 includes a threshold judging unit configured to determine whether the confidence score of the reference category is greater than or equal to a threshold and, when it is, to replace the value of the final category with the value of the reference category, thereby improving the accuracy of the final category.
The radar-based target classification system provided by this embodiment effectively improves the classification accuracy of the radar by setting the output strategy, and also broadens the fields in which radar can be applied.
Based on the same inventive concept, the present invention further provides a storage medium storing a computer program which, when read and executed by a processor, performs the steps of any of the above radar-based target classification methods.
The radar-based target classification method, system, and storage medium provided by the embodiments of the present application have been described in detail above, with specific examples used to illustrate the principles and implementation of the present application; the description of the embodiments is only intended to help understand the technical solutions and core ideas of the present application. Those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, and some technical features may be replaced by equivalents; such modifications or substitutions do not depart from the spirit and scope of the present disclosure as defined by the appended claims.

Claims (10)

1. A radar-based target classification method, comprising:
acquiring target parameters of a target object through the radar;
inputting the target parameters into a preset classification model to obtain a reference class of the target object; and
determining the final category of the target object based on the reference category and according to a preset output strategy.
2. The object classification method according to claim 1, characterized in that said step of obtaining target parameters of the object by said radar is preceded by:
obtaining at least two sets of training samples by the radar; wherein each set of the training samples comprises a sample identifier for representing a reference parameter of a reference object of the same category;
introducing the training samples into a preset machine learning model for training so as to construct the classification model; and
the step of inputting the target parameter into a preset classification model to obtain a reference class of the target object includes:
outputting the reference category of the target object according to the constructed classification model and the target parameters of the target object.
3. The object classification method according to claim 1, characterized in that said step of obtaining target parameters of the object by said radar is preceded by:
judging whether the target identifier of the target object is marked as a final category;
when it is determined that the target identifier of the target object is marked as the final category, re-executing the step of judging whether the target identifier of the target object is marked as a final category.
4. The object classification method according to claim 1, characterized in that the object parameters comprise: at least one of the number of radar reflection points, the size of the target object, the signal-to-noise ratio, the radar scattering cross section area, the ground speed of the target object, the acceleration of the target object and the azimuth of the target object.
5. The object classification method according to claim 1, wherein the step of determining the final class of the object based on the reference class and according to a preset output strategy is preceded by: saving at least one of the reference category, a confidence score corresponding to the reference category, into a target location corresponding to the target object.
6. The object classification method according to claim 5, wherein the step of determining a final class of the object based on the reference class and according to a preset output strategy further comprises:
obtaining a total number of reference categories in the target location;
determining whether the total number of reference categories is equal to a first threshold;
acquiring a total number of reference categories belonging to the same category in the target position when it is determined that the total number of reference categories is equal to the first threshold;
judging whether the total number of the reference categories belonging to the same category in the target position is greater than or equal to a second threshold value;
when the total number of the reference categories belonging to the same category in the target position is determined to be greater than or equal to the second threshold value, determining the value of the final category of the target object as the value of the reference category of the same category.
7. The object classification method according to claim 5, wherein the step of determining a final class of the object based on the reference class and according to a preset output strategy further comprises:
acquiring all confidence scores corresponding to reference categories belonging to the same category in a target position corresponding to the target object;
calculating an average of all confidence scores corresponding to reference categories belonging to the same category;
comparing the average values corresponding to reference categories belonging to different categories to obtain a maximum average value;
acquiring a reference category corresponding to the maximum average value;
and determining the value of the final class of the target object as the value of the reference class corresponding to the maximum average value.
8. The method for classifying an object according to claim 1, wherein the step of determining the final class of the object based on the reference class and according to a preset output strategy further comprises:
judging whether a final category exists in a target position corresponding to the target object;
when the final category is judged to exist, acquiring the priority of the reference category and the priority of the final category;
judging whether the priority of the reference category is greater than that of the final category;
replacing the value of the final class with the value of the reference class when it is determined that the priority of the reference class is greater than the priority of the final class.
9. A radar-based object classification system, characterized in that the object classification system comprises:
the target parameter acquisition unit is used for acquiring target parameters of a target object through the radar;
a reference category acquiring unit, configured to input the target parameter into a preset classification model to acquire a reference category of the target object; and
a final category determining unit, configured to determine the final category of the target object based on the reference category and according to a preset output strategy.
10. A storage medium, characterized in that it stores a computer program which, when read and executed by a processor, performs the steps of the radar-based object classification method according to any one of claims 1 to 8.
CN202110978264.4A 2021-08-25 2021-08-25 Radar-based target classification method, system and storage medium Pending CN113420743A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110978264.4A CN113420743A (en) 2021-08-25 2021-08-25 Radar-based target classification method, system and storage medium

Publications (1)

Publication Number Publication Date
CN113420743A true CN113420743A (en) 2021-09-21

Family

ID=77719434

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110978264.4A Pending CN113420743A (en) 2021-08-25 2021-08-25 Radar-based target classification method, system and storage medium

Country Status (1)

Country Link
CN (1) CN113420743A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114114266A (en) * 2022-01-24 2022-03-01 北京宏锐星通科技有限公司 Detection method and device for synthetic aperture radar

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105550636A (en) * 2015-12-04 2016-05-04 中国电子科技集团公司第三研究所 Method and device for identifying target types
CN106874889A (en) * 2017-03-14 2017-06-20 西安电子科技大学 Multiple features fusion SAR target discrimination methods based on convolutional neural networks
CN109753874A (en) * 2018-11-28 2019-05-14 南京航空航天大学 A kind of low slow small classification of radar targets method based on machine learning
CN111985349A (en) * 2020-07-30 2020-11-24 河海大学 Radar received signal type classification and identification method and system

Similar Documents

Publication Publication Date Title
CN109087510B (en) Traffic monitoring method and device
CN112133089B (en) Vehicle track prediction method, system and device based on surrounding environment and behavior intention
CN107985189B (en) Early warning method for lane changing depth of driver in high-speed driving environment
JP2019185347A (en) Object recognition device and object recognition method
CN113109802B (en) Target motion state judging method, device, radar equipment and storage medium
CN111796286B (en) Brake grade evaluation method and device, vehicle and storage medium
CN108960074B (en) Small-size pedestrian target detection method based on deep learning
US11619946B2 (en) Method and apparatus for generating U-turn path in deep learning-based autonomous vehicle
CN113536850B (en) Target object size testing method and device based on 77G millimeter wave radar
CN111695619A (en) Multi-sensor target fusion method and device, vehicle and storage medium
CN111284501A (en) Apparatus and method for managing driving model based on object recognition, and vehicle driving control apparatus using the same
CN113420743A (en) Radar-based target classification method, system and storage medium
CN115099051A (en) Automatic driving simulation test scene generation method and device, vehicle and storage medium
CN117809458A (en) Real-time assessment method and system for traffic accident risk
CN111695820B (en) Engineering vehicle electronic coupon management method and device, terminal and storage medium
CN113642114A (en) Modeling method for humanoid random car following driving behavior capable of making mistakes
CN116383689A (en) Parking area identification method and device, vehicle and storage medium
CN115805865A (en) Heavy-duty car blind area early warning method, device, equipment and storage medium
CN113643512B (en) Fatigue driving detection method and device, electronic equipment and storage medium
CN115520216A (en) Driving state judging method and device, computer equipment and storage medium
CN114662691A (en) Characteristic knowledge base system and method for automatic driving vehicle
CN113658435A (en) Processing method for road vehicle behavior prediction priority
CN110674853A (en) Ultrasonic data processing method and device and vehicle
CN116626630B (en) Object classification method and device, electronic equipment and storage medium
CN113627562A (en) Target detection method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination