CN114384483A - Radar sensor model fidelity assessment method based on deep learning - Google Patents

Radar sensor model fidelity assessment method based on deep learning

Info

Publication number
CN114384483A
Authority
CN
China
Prior art keywords
data
evaluation
radar
fidelity
radar sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210020097.7A
Other languages
Chinese (zh)
Other versions
CN114384483B (en)
Inventor
孟康
曹阳
刘瑜平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Research Institute Of Tsinghua Pearl River Delta
Original Assignee
Research Institute Of Tsinghua Pearl River Delta
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Research Institute Of Tsinghua Pearl River Delta filed Critical Research Institute Of Tsinghua Pearl River Delta
Priority to CN202210020097.7A
Publication of CN114384483A
Application granted
Publication of CN114384483B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40 Means for monitoring or calibrating
    • G01S7/4052 Means for monitoring or calibrating by simulation of echoes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a deep-learning-based method for evaluating the fidelity of a radar sensor model, which comprises the following steps: acquiring radar point cloud data, wherein the radar point cloud data comprise real data and simulation data; performing similarity evaluation on the real data and the simulation data to obtain an evaluation result; and evaluating the fidelity of the radar sensor model based on the evaluation result. The method evaluates the fidelity of the radar model with both traditional, manually specified indexes and an implicit metric learned by a deep neural network: the deep neural network learns the characteristics of the radar point cloud, and combined with the traditional indexes the realism of the radar model can be evaluated comprehensively. The method can be applied to various radar sensor models, so the validity of virtual testing methods for automated driving can be assessed, which brings considerable economic and social benefits.

Description

Radar sensor model fidelity assessment method based on deep learning
Technical Field
The invention belongs to the field of radar fidelity assessment, and particularly relates to a method for assessing the fidelity of a radar sensor model based on deep learning.
Background
A radar illuminates a target by transmitting electromagnetic waves and receiving the echoes, thereby obtaining information such as the distance, the range rate (radial velocity), the azimuth, and the altitude of the target relative to the transmission point. Simulation-based virtual testing of automated driving is now widely applied, and the difference between simulation and reality must be quantified to verify whether the fidelity of the adopted sensor model meets the intended application. Generating the simulation data mainly involves two steps: a virtual scene of the environment is built from recorded ground-truth data from the perspective of the sensor, and a simulated radar point cloud is then produced. There is currently no reliable method for measuring the fidelity of a radar sensor model, nor is there a suitable metric. Traditional measurement approaches comprise evaluation at the raw-data level and evaluation at the detection level after the point cloud has been formed. Raw-data-level evaluation only covers simple scenes and basic functions; detection-level evaluation is currently mostly qualitative and lacks quantitative assessment, and the evaluation indexes are mainly manually defined explicit indexes, lacking any measurement of implicit characteristics.
Disclosure of Invention
In order to solve the above problems, the present invention provides the following solutions: a method for evaluating the fidelity of a radar sensor model based on deep learning comprises the following steps:
acquiring radar point cloud data, wherein the radar point cloud data comprises real data and simulation data;
carrying out similarity evaluation on the basis of the real data and the simulation data to obtain an evaluation result;
and realizing fidelity evaluation of the radar sensor model based on the evaluation result.
Preferably, acquiring the real data comprises
performing real driving in a test scenario and generating real radar point cloud information to obtain the real data;
and obtaining the simulation data comprises
simulating based on the real data to obtain the simulation data; or generating a virtual scene from the perspective of the radar sensor to obtain simulated radar point cloud information, and obtaining the simulation data from that radar point cloud information.
Preferably, the similarity evaluation based on the real data and the simulation data includes a conventional index evaluation and a depth index evaluation.
Preferably, the traditional index evaluation calculates the similarity between the real data and the simulated data based on the differences in two components, the two-dimensional distance and the Doppler velocity.
Preferably, the evaluation indexes of the similarity comprise at least the point-cloud-to-point-cloud distance and the Wasserstein distance;
the point-cloud-to-point-cloud distance is the normalized sum of the minimum Euclidean distances from the real point cloud to the simulated point cloud.
Preferably, the depth index evaluation comprises randomly mixing the real data and radar model data to obtain a first data set; enhancing the first data set and perturbing it with random Gaussian noise to obtain a second data set; and performing a depth evaluation metric on the point cloud data of the second data set based on a PointNet++ network model to obtain a metric result.
Preferably, the point cloud data input into the PointNet++ network model comprise at least two spatial coordinates and a Doppler velocity.
Preferably, the depth index evaluation further comprises randomly repeating points when a point cloud has too few points (oversampling) and randomly drawing a subset when it has too many (undersampling), so that the number of input points per point cloud is fixed.
Preferably, the depth evaluation metric performs depth index evaluation based on a prediction confidence score of a real radar point cloud class.
Preferably, the evaluation method further comprises standardizing the evaluation results: the metric results are scaled, z-score standardized, and mapped into the 0-1 fidelity-evaluation space, with adjustable importance coefficients for the traditional index and the depth index.
The invention discloses the following technical effects:
according to the method for evaluating the fidelity of the radar sensor model based on deep learning, provided by the invention, from the detection aspect, the fidelity of the radar model is evaluated through the traditional manually specified index and the implicit measurement index based on deep neural network learning. The method utilizes the deep neural network to learn the characteristics of the radar point cloud, and can comprehensively evaluate the authenticity of the radar model by combining with the traditional indexes. And the method can be applied to various radar sensor models, so that the effectiveness of the automatic driving virtual test method can be evaluated, and the method has high economic and social benefits.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings without creative efforts.
FIG. 1 is a flowchart of a method according to an embodiment of the present invention;
FIG. 2 is a diagram of a training process configuration according to an embodiment of the present invention;
FIG. 3 is a block diagram of a test process according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
With the development of automated driving test technology, virtual testing is becoming increasingly important. The radar sensor model is an important component of virtual testing, and its realism is decisive for the reliability of the virtual test. To determine the reliability of the sensor model, the gap between the simulated data and the real data of the radar sensor model has to be measured. Many radar sensor simulation methods exist, but the problem of verifying and quantitatively evaluating the overall fidelity of a radar model has not yet been solved. Therefore, a multi-level assessment method is proposed that combines traditional indexes with an implicit metric learned by deep learning and is used for an overall quantitative assessment of radar sensor model fidelity.
As shown in fig. 1, the present invention provides a method for evaluating the fidelity of a deep learning-based radar sensor model, comprising:
acquiring radar point cloud data, wherein the radar point cloud data comprises real data and simulation data;
carrying out similarity evaluation on the basis of the real data and the simulation data to obtain an evaluation result;
and realizing fidelity evaluation of the radar sensor model based on the evaluation result.
Acquiring the real data comprises
performing real driving in a test scenario and generating real radar point cloud information to obtain the real data.
Obtaining the simulation data comprises
simulating based on the real data to obtain the simulation data; or generating a virtual scene from the perspective of the radar sensor to obtain simulated radar point cloud information, and obtaining the simulation data from that radar point cloud information.
Similarity evaluation based on the real data and the simulation data comprises traditional index evaluation and depth index evaluation.
The traditional index evaluation calculates the similarity of the real data and the simulated data based on the differences in two-dimensional distance and Doppler velocity.
The evaluation indexes of the similarity comprise at least the point-cloud-to-point-cloud distance and the Wasserstein distance;
the point-cloud-to-point-cloud distance is the normalized sum of the minimum Euclidean distances from the real point cloud to the simulated point cloud.
The depth index evaluation comprises randomly mixing the real data and radar model data to obtain a first data set; enhancing the first data set and perturbing it with random Gaussian noise to obtain a second data set; and performing a depth evaluation metric on the point cloud data of the second data set based on a PointNet++ network model to obtain a metric result.
The point cloud data input into the PointNet++ network model comprise at least two spatial coordinates and the Doppler velocity.
The depth index evaluation further comprises randomly repeating points when a point cloud has too few points and randomly drawing a subset when it has too many, so that the number of input points per point cloud is fixed.
The depth evaluation metric performs the depth index evaluation based on the prediction confidence score of the 'real radar point cloud' class.
The evaluation method further comprises standardizing the evaluation results: the metric results are scaled, z-score standardized, and mapped into the 0-1 fidelity-evaluation space, with adjustable importance coefficients for the traditional index and the depth index.
Example one
Further, the method for evaluating the fidelity of the radar sensor model based on deep learning provided by the invention comprises the following steps:
the method comprises the following steps of firstly, selecting a test scene, such as a lane change cut-in scene of a front vehicle.
Step two: and carrying out real driving and generating real radar point cloud information.
Step three: and simulating driving and generating radar point cloud data by a simulation method to be evaluated.
Step four: and evaluating the traditional indexes of the detection layer.
And comparing the similarity of the two-dimensional distance and the Doppler velocity of real and simulated radar detection data, wherein two evaluation indexes are respectively the distance Dpp between point clouds and the Waserstein distance EMD.
According to the method, from the detection aspect, the fidelity of the radar model is evaluated through two indexes, namely a traditional index and a depth index.
Traditional index evaluation:
the traditional index evaluation calculates the similarity between a real radar point cloud and a radar point cloud generated by simulation through the difference of two components, namely a two-dimensional distance and a Doppler velocity, and mainly comprises two calculation measures.
The first metric is the normalized sum of the minimum Euclidean distances from the real point cloud to the simulated point cloud, denoted Dpp (point cloud to point cloud distance). The real point cloud is written as $X = (x_1, x_2, \ldots, x_M)$ and the simulated point cloud as $Y = (y_1, y_2, \ldots, y_N)$, where $x_m, y_n \in \mathbb{R}^3$ are three-dimensional points. Since the directed distance is asymmetric, each direction is normalized by its respective number of points and the worst case of the two directions is assumed:

$$D_{pp}(X, Y) = \max\!\left( \frac{1}{M} \sum_{m=1}^{M} \min_{1 \le n \le N} \lVert x_m - y_n \rVert_2 ,\; \frac{1}{N} \sum_{n=1}^{N} \min_{1 \le m \le M} \lVert y_n - x_m \rVert_2 \right)$$
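For illustration, a minimal NumPy sketch of the Dpp computation follows; the symmetrization by taking the worst case of both directions matches the description above, and the function name and array shapes are assumptions for this example only.

```python
import numpy as np

def dpp(real: np.ndarray, sim: np.ndarray) -> float:
    """Point-cloud-to-point-cloud distance Dpp.

    real: (M, 3) real radar points, sim: (N, 3) simulated points,
    each row being (x, y, Doppler velocity).
    """
    # pairwise Euclidean distances, shape (M, N)
    d = np.linalg.norm(real[:, None, :] - sim[None, :, :], axis=-1)
    real_to_sim = d.min(axis=1).mean()  # mean nearest-neighbour distance, real -> simulated
    sim_to_real = d.min(axis=0).mean()  # mean nearest-neighbour distance, simulated -> real
    return max(real_to_sim, sim_to_real)  # worst case of the two directions
```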
The second metric, the Wasserstein distance, also known as the Earth Mover's Distance (EMD), compares the point distributions of the real and simulated radar point clouds and is determined by the optimal cost of rearranging one distribution into the other. Besides the point clouds X and Y, M and N denote the numbers of points in the two sets, $f_{m,n}$ is the optimal flow solving the transport problem between the two point distributions, and the Euclidean distance is chosen as the ground distance $d_{m,n}$. The EMD naturally extends the notion of a distance between single points to a distance between point distributions:

$$EMD(X, Y) = \frac{\sum_{m=1}^{M} \sum_{n=1}^{N} f_{m,n}\, d_{m,n}}{\sum_{m=1}^{M} \sum_{n=1}^{N} f_{m,n}}$$
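A corresponding sketch of the EMD, assuming the Python Optimal Transport (POT) package is available; uniform point weights are assumed, which the description does not state explicitly.

```python
import numpy as np
import ot  # Python Optimal Transport (POT) package, assumed available

def emd(real: np.ndarray, sim: np.ndarray) -> float:
    """Wasserstein / Earth Mover's Distance between two radar point clouds."""
    a = np.full(len(real), 1.0 / len(real))  # uniform weights on the real points
    b = np.full(len(sim), 1.0 / len(sim))    # uniform weights on the simulated points
    cost = ot.dist(real, sim, metric='euclidean')  # ground-distance matrix d_{m,n}
    return ot.emd2(a, b, cost)  # cost of the optimal flow f_{m,n}
```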
Step five: and training a PointNet + + network model, and evaluating from the depth index. The processing mode of the training data is as follows: and randomly mixing the real data and the data of various radar models, and enhancing the data set to avoid overfitting the models. And testing the point cloud data generated by the specific simulation method by using the trained classifier model, and taking the prediction confidence score of the 'real radar point cloud' class as Depth Evaluation Measurement (DEM).
Depth index evaluation:
Because the traditional evaluation methods rely on manually defined indexes that target specific characteristics, implicit characteristics are not considered, and the comprehensiveness of manually defined indexes as a metric cannot be guaranteed. To address this, the depth index evaluation trains a neural network to classify real and simulated radar data. In contrast to the traditional index evaluation, the aim is to learn the latent features that distinguish real from simulated radar point clouds. The metric is the prediction confidence score of the classifier's 'real radar point cloud' class. The network uses the hierarchical PointNet++ architecture, which operates directly on point clouds and learns local features.
Data acquisition: to simplify the verification, only radar detections around the target vehicle are considered, i.e. sensor data are recorded in an otherwise empty test scenario, and the real driving conditions are reproduced in simulation to generate the simulated radar data. When the radar model contains random components to approximate real-world behaviour, their influence can be reduced by running the simulation several times and averaging. The resulting data set is comparatively balanced between real and simulated point clouds. Each radar point input to the network contains two spatial coordinates and the Doppler velocity.
Training and testing. When training the model, the real data and the data of the various radar models are first randomly mixed. The data set is then enhanced to avoid model overfitting: it is perturbed with random Gaussian noise with a mean of 0 and a standard deviation of 0.1, changing the spatial coordinates and the Doppler velocity. To keep the number of input points of each point cloud fixed, points are randomly repeated when a cloud has too few points and a random subset is drawn when it has too many. The whole data set is randomly divided into a training set and a test set at a 7:3 ratio. The trained classifier is used to classify the simulated radar data, and the prediction confidence score of the 'real radar point cloud' class is the Depth Evaluation Metric (DEM). The training and testing process is shown in figs. 2 and 3.
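A minimal sketch of the data preparation just described (Gaussian perturbation, fixed point count, random 7:3 split); the point count of 64 and the helper names are illustrative assumptions only.

```python
import random
import numpy as np

rng = np.random.default_rng(0)
NUM_POINTS = 64  # assumed fixed number of input points per cloud

def augment(points: np.ndarray) -> np.ndarray:
    """Perturb spatial coordinates and Doppler velocity with Gaussian noise (mean 0, std 0.1)."""
    return points + rng.normal(0.0, 0.1, size=points.shape)

def resample(points: np.ndarray, k: int = NUM_POINTS) -> np.ndarray:
    """Randomly repeat points when there are too few, randomly draw a subset when too many."""
    idx = rng.choice(len(points), size=k, replace=len(points) < k)
    return points[idx]

def build_dataset(real_clouds, sim_clouds, train_ratio=0.7):
    """Randomly mix real (label 1) and simulated (label 0) clouds and split 7:3."""
    data = [(resample(augment(c)), 1) for c in real_clouds] + \
           [(resample(augment(c)), 0) for c in sim_clouds]
    random.shuffle(data)
    split = int(train_ratio * len(data))
    return data[:split], data[split:]
```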
Step six: the results of the measurements are normalized. The directly calculated traditional index and the depth index have different units and need to be scaled, the result is normalized by z-score, and then the numerical value is mapped in a space of 0-1, and the importance coefficients of the traditional index and the depth index can be adjusted according to the needs.
According to the deep-learning-based radar sensor model fidelity evaluation method provided by the invention, the fidelity of the radar model is evaluated at the detection level through traditional, manually specified indexes and an implicit metric learned by a deep neural network. The method uses the deep neural network to learn the characteristics of the radar point cloud and, combined with the traditional indexes, can comprehensively evaluate the realism of the radar model. The method can be applied to various radar sensor models, so the validity of virtual testing methods for automated driving can be assessed, which brings considerable economic and social benefits.
The above-described embodiments are merely illustrative of the preferred embodiments of the present invention, and do not limit the scope of the present invention, and various modifications and improvements of the technical solutions of the present invention can be made by those skilled in the art without departing from the spirit of the present invention, and the technical solutions of the present invention are within the scope of the present invention defined by the claims.

Claims (10)

1. A method for evaluating the fidelity of a radar sensor model based on deep learning is characterized by comprising the following steps:
acquiring radar point cloud data, wherein the radar point cloud data comprises real data and simulation data;
carrying out similarity evaluation on the basis of the real data and the simulation data to obtain an evaluation result;
and realizing fidelity evaluation of the radar sensor model based on the evaluation result.
2. The deep learning-based radar sensor model fidelity assessment method according to claim 1,
the acquiring of the real data may include,
real driving is carried out based on a test scene, real radar point cloud information is generated, and the real data are obtained;
the obtaining of the analog data may include,
simulating based on the real data to obtain the simulation data; or generating a virtual scene based on the radar sensor to obtain simulated radar point cloud information; and obtaining the simulation data based on the radar point cloud information.
3. The deep learning-based radar sensor model fidelity assessment method according to claim 1,
and performing similarity evaluation based on the real data and the simulation data, wherein the similarity evaluation comprises traditional index evaluation and depth index evaluation.
4. The deep learning based radar sensor model fidelity assessment method of claim 3,
the conventional index evaluation is to calculate the similarity of the real data and the simulated data based on the difference of two-dimensional distance and doppler velocity.
5. The deep learning-based radar sensor model fidelity assessment method according to claim 4, wherein the assessment indexes of similarity comprise at least a point-cloud-to-point-cloud distance and a Wasserstein distance;
the distance between the point clouds is the normalized sum of the minimum Euclidean distances from the real point clouds to the simulated point clouds.
6. The deep learning based radar sensor model fidelity assessment method of claim 3,
the depth index evaluation comprises the steps of randomly mixing the real data and radar model data to obtain a first data set; enhancing the first data set, and disturbing through random Gaussian noise based on the enhanced data set to obtain a second data set; and performing depth evaluation measurement on the point cloud data of the second data set based on a PointNet + + network model to obtain a measurement result.
7. The deep learning-based radar sensor model fidelity assessment method according to claim 6, wherein the point cloud data inputted into the PointNet + + network model comprises at least two spatial coordinates and Doppler velocity.
8. The method for evaluating the fidelity of the deep learning-based radar sensor model according to claim 3, wherein the depth index evaluation further comprises a random repetition method in the case of oversampling and a random drawing method in the case of undersampling, so as to fix the number of input points of the point cloud.
9. The method of claim 6, wherein the depth evaluation metric performs depth index evaluation based on a prediction confidence score of the real radar point cloud class.
10. The method for evaluating the fidelity of a deep learning-based radar sensor model according to claim 1, further comprising an evaluation result normalization step, which comprises z-score standardization after scaling of the measurement results and mapping of the values into the 0-1 fidelity-evaluation space, with adjustable importance coefficients for the traditional index and the depth index.
CN202210020097.7A 2022-01-10 2022-01-10 Assessment method for radar sensor model fidelity based on deep learning Active CN114384483B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210020097.7A CN114384483B (en) 2022-01-10 2022-01-10 Assessment method for radar sensor model fidelity based on deep learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210020097.7A CN114384483B (en) 2022-01-10 2022-01-10 Assessment method for radar sensor model fidelity based on deep learning

Publications (2)

Publication Number Publication Date
CN114384483A true CN114384483A (en) 2022-04-22
CN114384483B CN114384483B (en) 2024-07-02

Family

ID=81199676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210020097.7A Active CN114384483B (en) 2022-01-10 2022-01-10 Assessment method for radar sensor model fidelity based on deep learning

Country Status (1)

Country Link
CN (1) CN114384483B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170307732A1 (en) * 2015-10-22 2017-10-26 Uniquesec Ab System for generating virtual radar signatures
CN108107413A (en) * 2018-01-09 2018-06-01 中国空空导弹研究院 A kind of radar simulator calibration system
CN110246112A (en) * 2019-01-21 2019-09-17 厦门大学 Three-dimensional point cloud quality evaluating method in the room laser scanning SLAM based on deep learning

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170307732A1 (en) * 2015-10-22 2017-10-26 Uniquesec Ab System for generating virtual radar signatures
CN108107413A (en) * 2018-01-09 2018-06-01 中国空空导弹研究院 A kind of radar simulator calibration system
CN110246112A (en) * 2019-01-21 2019-09-17 厦门大学 Three-dimensional point cloud quality evaluating method in the room laser scanning SLAM based on deep learning

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
曹聪聪: "Research on simulation technology of radar pulse description words in complex electromagnetic environments", 信息通信 (Information & Communications), no. 01, 15 January 2020 (2020-01-15) *

Also Published As

Publication number Publication date
CN114384483B (en) 2024-07-02

Similar Documents

Publication Publication Date Title
Hanke et al. Generation and validation of virtual point cloud data for automated driving systems
CN109284280A (en) Emulate data optimization methods, device and storage medium
CN111722199A (en) Radar signal detection method based on convolutional neural network
CN103020978A (en) SAR (synthetic aperture radar) image change detection method combining multi-threshold segmentation with fuzzy clustering
CN104318593A (en) Simulation method and system of radar sea clusters
CN109446894A (en) The multispectral image change detecting method clustered based on probabilistic segmentation and Gaussian Mixture
Ngo et al. A multi-layered approach for measuring the simulation-to-reality gap of radar perception for autonomous driving
Huang et al. Wave height estimation from X-band nautical radar images using temporal convolutional network
CN108399430A (en) A kind of SAR image Ship Target Detection method based on super-pixel and random forest
CN115731350A (en) Simulation method and device for virtual laser radar of vehicle
Rajasekaran et al. PTRM: Perceived terrain realism metric
Jasiński A generic validation scheme for real-time capable automotive radar sensor models integrated into an autonomous driving simulator
CN115290596A (en) FCN-ACGAN data enhancement-based hidden dangerous goods identification method and equipment
Li et al. Automotive radar modeling for virtual simulation based on mixture density network
CN114384547A (en) Radar sensor model-based fidelity detection evaluation method and system
CN114298299A (en) Model training method, device, equipment and storage medium based on course learning
CN113780346A (en) Method and system for adjusting prior constraint classifier and readable storage medium
Müller et al. Robustness evaluation and improvement for vision-based advanced driver assistance systems
Ngo et al. Deep evaluation metric: Learning to evaluate simulated radar point clouds for virtual testing of autonomous driving
CN110824478B (en) Automatic classification method and device for precipitation cloud types based on diversified 3D radar echo characteristics
Zhang et al. FRS-Net: An efficient ship detection network for thin-cloud and FOG-covered high-resolution optical satellite imagery
CN114384483B (en) Assessment method for radar sensor model fidelity based on deep learning
KR102558609B1 (en) Method for evaluating wind speed patterns to ensure structural integrity of buildings, and computing apparatus for performing the method
Ngo A methodology for validation of a radar simulation for virtual testing of autonomous driving
CN113806920B (en) Unmanned aerial vehicle cluster electromagnetic scattering simulation method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant