CN113128315A - Sensor model performance evaluation method, device, equipment and storage medium - Google Patents

Sensor model performance evaluation method, device, equipment and storage medium

Info

Publication number
CN113128315A
Authority
CN
China
Prior art keywords
data
physical model
performance evaluation
performance
algorithm
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010041616.9A
Other languages
Chinese (zh)
Inventor
侯皓阳
雷文辉
陈兴国
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bayerische Motoren Werke AG
Original Assignee
Bayerische Motoren Werke AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bayerische Motoren Werke AG filed Critical Bayerische Motoren Werke AG
Priority to CN202010041616.9A priority Critical patent/CN113128315A/en
Publication of CN113128315A publication Critical patent/CN113128315A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to group G01S13/00
    • G01S7/41 Details of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/411 Identification of targets based on measurements of radar reflectivity
    • G01S7/412 Identification of targets based on a comparison between measured values and known or stored values
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N17/00 Diagnosis, testing or measuring for television systems or their details
    • H04N17/002 Diagnosis, testing or measuring for television cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a method, a device, equipment and a storage medium for evaluating the performance of a sensor model. The method comprises the following steps: acquiring raw data output by a physical model of a sensor; inputting the raw data into at least two algorithms respectively, so that each algorithm analyzes and processes the raw data; acquiring the environment perception result data respectively output by the at least two algorithms; and determining performance evaluation data of the physical model according to the acquired environment perception result data and environment truth value data. In the application, the performance of the physical model is evaluated using the environment perception result data output by at least two algorithms. Compared with evaluating the performance of the physical model using the environment perception result data output by a single algorithm, the influence that factors intrinsic to an algorithm exert on the performance evaluation of the physical model can be avoided as much as possible, so that the performance evaluation data reflects the performance of the physical model more truly and the performance of the physical model is effectively evaluated.

Description

Sensor model performance evaluation method, device, equipment and storage medium
Technical Field
The embodiments of the invention relate to simulation technology, and in particular to a method, a device, equipment and a storage medium for evaluating the performance of a sensor model.
Background
Engineering practice and technical research have shown that testing Fully Automated Driving (FAD) functions by means of simulation technology is a time-saving and economical approach.
Simulation systems generally consist of a simulation environment, a sensor model, etc. The sensor model is typically composed of a physical model and corresponding algorithms for data analysis processing. The physical model processes the environment data input by the simulator and outputs original data, and the algorithm analyzes and processes the original data and outputs environment perception result data. For example, the sensor model may be a camera model, in which the physical model outputs image data, and the algorithm performs entity object recognition on the image data and outputs a recognized entity object list; for another example, the sensor model may also be a laser radar model, the physical model of the laser radar outputs point cloud data, and the algorithm identifies the point cloud data and then outputs an identified entity object list. The entity object list includes entity objects existing in the simulation environment.
In order to confirm the accuracy of the simulation system, the performance of the physical model in the sensor model needs to be effectively evaluated, and how to perform such an evaluation is a problem that currently needs to be solved.
Disclosure of Invention
The invention provides a sensor model performance evaluation method, device, equipment and storage medium, which are used for effectively evaluating the performance of a physical model in a sensor model.
In a first aspect, an embodiment of the present invention provides a method for evaluating performance of a sensor model, including:
acquiring original data output by a physical model of a sensor;
inputting the original data into at least two algorithms respectively so that each algorithm analyzes and processes the original data respectively;
acquiring environment perception result data respectively output by the at least two algorithms;
and determining performance evaluation data of the physical model according to the acquired environment perception result data and the acquired environment truth value data.
One embodiment in the above application has the following advantages or benefits: by using at least two algorithms to analyze and process the raw data output by the physical model of the sensor, the performance of the physical model can be effectively evaluated according to the environment perception result data output by each algorithm. Because the performance of the physical model is evaluated using the environment perception result data output by at least two algorithms, compared with evaluating it using the output of a single algorithm, the influence that factors intrinsic to an algorithm exert on the performance evaluation of the physical model can be avoided as much as possible, so that the performance evaluation data reflects the performance of the physical model more truly, thereby realizing an effective evaluation of the performance of the physical model. The scheme of this embodiment is particularly suitable for effectively evaluating the performance of the physical model of a sensor when the data analysis and processing algorithm adopted by the manufacturer of the sensor hardware cannot be obtained, so that the real performance of the physical model can be evaluated.
Optionally, determining performance evaluation data of the physical model according to the obtained environmental perception result data and the environmental truth value data includes:
determining correct environment sensing result data in the environment sensing result data output by each algorithm by comparing the environment sensing result data output by each algorithm with environment true value data;
and determining the performance evaluation data of the physical model according to the correct environment perception result data and the environment truth value data.
One embodiment in the above application has the following advantages or benefits: the correct environmental perception result data output by each algorithm can be determined by comparing the environmental perception result data output by each algorithm with the environmental truth value data, so that the performance of the physical model is evaluated according to the correct environmental perception result data output by each algorithm, and a more accurate performance evaluation result can be obtained.
Optionally, determining performance evaluation data of the physical model according to the correct environmental perception result data and the environmental truth value data includes:
determining the intersection of the correct environment sensing result data output by each algorithm;
and determining the proportion data of the environmental perception result data in the intersection to the environmental truth value data as the performance evaluation data of the physical model.
One embodiment in the above application has the following advantages or benefits: by determining the intersection of the correct environment perception result data output by each algorithm, the correct environment perception results that the physical model can obtain under any of the algorithms are obtained, and the proportion of the environment perception result data in the intersection to the environment truth value data is determined as the performance evaluation data of the physical model. The performance evaluation data thus characterizes the lowest performance the physical model can guarantee under any algorithm condition, so that different physical models can be evaluated in the same way in order to screen out the physical model with the best performance irrespective of algorithm factors.
Optionally, determining performance evaluation data of the physical model according to the correct environmental perception result data and the environmental truth value data includes:
for each algorithm, determining the proportion of the correct environment perception result data output by that algorithm to the environment truth value data;
and determining performance evaluation data of the physical model according to the proportional data.
One embodiment in the above application has the following advantages or benefits: for each algorithm, the proportion of the correct environment perception result data it outputs to the environment truth value data is determined, the performance of the physical model is evaluated using these proportion data, and a more accurate performance evaluation result can be obtained.
Optionally, determining performance evaluation data of the physical model according to each of the ratio data includes:
determining the average value of the proportion data as the performance evaluation data of the physical model; or,
determining the minimum proportion data among the proportion data as the performance evaluation data of the physical model.
One embodiment in the above application has the following advantages or benefits: the average value of the proportional data is determined as the performance evaluation data of the physical model, the influence of different algorithms on the performance of the physical model is considered, and the average performance of the physical model under any algorithm can be measured based on the performance evaluation data. Or, the minimum proportion data in each proportion data is determined as the performance evaluation data of the physical model, and the minimum performance of the physical model under any algorithm can be measured based on the performance evaluation data, so that the performance evaluation of the physical model is prevented from being influenced by the algorithm, and the performance evaluation data has higher reference value.
Optionally, the method further comprises:
determining the similarity between the environmental perception result data respectively output by each algorithm by comparing the environmental perception result data respectively output by each algorithm;
and sending prompt information containing the similarity so as to enable a user to determine the confidence of the performance evaluation data according to the similarity.
One embodiment in the above application has the following advantages or benefits: by determining the similarity among the environment perception result data output by various algorithms and sending prompt information containing the similarity to the user, the user can be prompted to determine the confidence of the performance evaluation data based on the similarity, so that the user can better know the reference value degree of the performance evaluation data.
Optionally, after determining the performance assessment data of the physical model, the method further comprises:
acquiring performance evaluation data of the vehicle for at least one drivability angle when a driving test is performed on a vehicle equipped with the physical model;
determining driving performance evaluation data of the vehicle based on the performance evaluation data of the physical model and the performance evaluation data of the at least one drivability angle.
One embodiment in the above application has the following advantages or benefits: a driving test is carried out on the vehicle equipped with the physical model, so that performance evaluation data of the vehicle for at least one drivability angle is obtained, and a comprehensive evaluation of the driving performance of the vehicle is realized based on the performance evaluation data of the physical model and the performance evaluation data of each drivability angle.
Optionally, determining driving performance evaluation data of the vehicle based on the performance evaluation data of the physical model and the performance evaluation data of the at least one drivability angle comprises:
and determining driving performance evaluation data of the vehicle according to the performance evaluation data of the physical model, the weight corresponding to the performance evaluation data of the physical model, the performance evaluation data of each driving performance angle and the weight corresponding to the performance evaluation data of each driving performance angle.
One embodiment in the above application has the following advantages or benefits: by adopting the scheme, the performance evaluation data of the physical model and the performance evaluation data of each drivability angle can be adjusted by utilizing the weight of the performance evaluation data of the physical model and the weight of the performance evaluation data of each drivability angle, so that the influence degree on the drivability of the vehicle is determined, and the driving performance condition of the vehicle can be determined more accurately.
Optionally, the physical model of the sensor is a physical model of a camera, the raw data is image data, the environment perception result data is a list of entity objects perceived by the model, and the environment truth value data is a list of entity objects actually existing in the environment; or,
the physical model of the sensor is a physical model of a radar, the raw data is point cloud data, the environment perception result data is a list of entity objects perceived by the model, and the environment truth value data is a list of entity objects actually existing in the environment.
One embodiment in the above application has the following advantages or benefits: the embodiment can effectively evaluate the performance of the physical model of the camera or the physical model of the radar.
In a second aspect, an embodiment of the present invention further provides a sensor model performance evaluation apparatus, including:
the raw data acquisition module is used for acquiring raw data output by a physical model of the sensor;
the original data analysis processing module is used for inputting the original data into at least two algorithms respectively so that each algorithm can analyze and process the original data respectively;
the environment perception result data acquisition module is used for acquiring environment perception result data respectively output by the at least two algorithms;
and the performance evaluation data determining module is used for determining the performance evaluation data of the physical model according to the acquired environment sensing result data and the acquired environment truth value data.
In a third aspect, an embodiment of the present invention further provides an apparatus, where the apparatus includes:
one or more processors;
a memory for storing one or more programs;
when the one or more programs are executed by the one or more processors, the one or more processors are caused to implement the sensor model performance evaluation method provided by any embodiment of the invention.
In a fourth aspect, embodiments of the present invention further provide a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the sensor model performance evaluation method provided in any of the embodiments of the present invention.
Drawings
FIG. 1 is a flow chart of a method for evaluating performance of a sensor model according to an embodiment of the present invention;
FIG. 2 is an example of a sensor model performance evaluation process according to an embodiment of the present invention;
FIG. 3 is a flowchart of a method for evaluating performance of a sensor model according to a second embodiment of the present invention;
FIG. 4 is a flowchart of a method for evaluating performance of a sensor model according to a third embodiment of the present invention;
FIG. 5 is a flowchart of a method for evaluating performance of a sensor model according to a fourth embodiment of the present invention;
fig. 6 is a flowchart of a sensor model performance evaluation method according to a fifth embodiment of the present invention;
fig. 7 is a schematic structural diagram of a sensor model performance evaluation apparatus according to a sixth embodiment of the present invention;
fig. 8 is a schematic structural diagram of an apparatus according to a seventh embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
Example one
Fig. 1 is a flowchart of a sensor model performance evaluation method according to an embodiment of the present invention, where the embodiment is applicable to a situation where a physical model in a sensor model is subjected to performance evaluation, and the method may be executed by a sensor model performance evaluation device, which may be implemented by software and/or hardware and integrated in a device with a simulation function.
Before introducing the sensor model performance evaluation method provided by the embodiment of the invention, the application scenario of the embodiment of the invention is briefly described. The simulation system may be composed of a simulation environment and a sensor model. The sensor model may be composed of a physical model and an algorithm for data analysis and processing. The environment simulation is realized by a simulator. The physical model in the sensor model can simulate the function of the sensor hardware through simulation so as to restore the physical parameters of the sensor hardware. Generally, the data analysis and processing algorithm corresponding to sensor hardware is designed by the manufacturer of the sensor hardware and may be converted into a logic circuit integrated in the sensor hardware. As a result, the source code of the algorithm designed by the manufacturer cannot be obtained, and the algorithm cannot be restored through simulation in the simulation system. For this reason, this embodiment can effectively evaluate the performance of the physical model of the sensor especially when the algorithm inside the sensor hardware cannot be restored in the simulation system, so as to evaluate the real performance of the physical model of the sensor.
As shown in fig. 1, the method for evaluating the performance of the sensor model provided in this embodiment specifically includes the following steps:
and S110, acquiring raw data output by a physical model of the sensor.
A sensor may refer to a hardware device for sensing surrounding environment information. For example, the sensor may be, but is not limited to, a camera and a radar. The radar may include, but is not limited to, a laser radar and a millimeter-wave radar. The laser radar may be a rotary laser radar or a solid-state laser radar. The camera may be a telephoto camera or a wide-angle camera. A telephoto camera has a long focal length and a small field of view, and is suitable for capturing images of distant objects; a wide-angle camera has a short focal length, a wide field of view and a deep depth of field, and is suitable for capturing images of large scenes. The physical model of the sensor may be a hardware virtual model based on simulation of the physical parameters of the sensor hardware, used for simulating the hardware function of the actual sensor. The raw data may be the data obtained by processing, with the physical model, the environment data input by the simulator. For example, when the physical model of the sensor is that of a camera, the raw data output may be image data; when the physical model of the sensor is that of a radar, the raw data output is point cloud data. For example, the point cloud data may include a set of position vectors in a three-dimensional coordinate system, and may also include RGB (Red, Green, Blue) color information, gray value information, depth information and the like for each position point.
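For illustration only, the raw point cloud output described above could be represented by a data structure along the following lines; the class name and field names are hypothetical and not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class PointCloudPoint:
    """Hypothetical representation of one point in the raw point cloud output
    by a radar/lidar physical model; the optional fields mirror the attributes
    mentioned above (RGB colour, gray value, depth)."""
    position: Tuple[float, float, float]        # position vector in 3-D coordinates
    rgb: Optional[Tuple[int, int, int]] = None  # RGB (Red, Green, Blue) colour information
    gray: Optional[float] = None                # gray value information
    depth: Optional[float] = None               # depth information

# Example: a single point 10 m ahead of the sensor
p = PointCloudPoint(position=(10.0, 0.0, 1.2), depth=10.0)
```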
And S120, respectively inputting the original data into at least two algorithms so that each algorithm respectively analyzes and processes the original data.
The algorithm may be a general algorithm for analyzing and processing environment perception data. For example, the algorithm may be a common object detection algorithm that detects and identifies object information in the surrounding environment. For example, the object detection algorithm may be a neural-network-based object recognition algorithm, such as a Region-based Convolutional Neural Network (R-CNN) algorithm, a Fast Region-based Convolutional Neural Network (Fast R-CNN) algorithm, a Single Shot MultiBox Detector (SSD) algorithm, a YOLO (You Only Look Once) algorithm, a RetinaNet algorithm, an end-to-end 3D object detection network for point clouds (VoxelNet) algorithm, a deep-learning-based 3D point cloud classification and segmentation model (FPointNet) algorithm, or an FPGA-based Constant False Alarm Rate (CFAR) detection algorithm.
Specifically, at least two matched algorithms for analyzing and processing the environmental perception data can be selected from the existing general algorithms based on the business requirements and the categories of the sensors, and each algorithm is used for analyzing and processing the raw data output by the physical model, so that the existing at least two known algorithms can be used for replacing the real algorithms in the sensors.
It should be noted that, since the algorithms used are not the real algorithm inside the sensor, the measured performance of the physical model may appear low because of the algorithm itself rather than because of the physical model, in which case the performance of the physical model cannot be accurately evaluated. Therefore, at least two algorithms are used to analyze the raw data output by the physical model. Compared with analyzing the raw data with a single algorithm, this avoids, as much as possible, the influence that factors intrinsic to an algorithm exert on the performance evaluation of the physical model, thereby ensuring the accuracy of the performance evaluation of the physical model.
And S130, acquiring environment perception result data respectively output by at least two algorithms.
The environmental sensing result data may be an environmental sensing result obtained by processing environmental data input by the simulator by the physical model of the sensor. For example, when the physical model of the sensor is a physical model of a camera or a physical model of a radar, the output environment perception result data may be a list of physical objects perceived by the model. The physical object may refer to any object that actually exists in the simulation environment, such as a human body, a vehicle, a tree, and the like. The list of entity objects may include information of all entity objects perceived by the model stored in a tabular manner.
Specifically, each algorithm may output environment perception result data after analyzing and processing the raw data output by the physical model. It should be noted that the environment perception result data output by different algorithms may be the same or different. For example, the environment perception result data output by one algorithm includes: pedestrians, vehicles and street lights, while the environment perception result data output by another algorithm includes: pedestrians and vehicles. When the similarity between the environment perception result data output by different algorithms is higher, the difference in the perception capability of the physical model under the different algorithms is smaller, so that the output environment perception result data reflects the performance of the physical model more truly and the performance evaluation is more accurate.
And S140, determining performance evaluation data of the physical model according to the acquired environmental perception result data and the acquired environmental truth value data.
The environment truth data may include, among other things, information of all objects that are actually present in the simulated environment. For example, when the physical model of the sensor is a physical model of a camera or a physical model of a radar, the environment truth data may be a list of physical objects that are actually present in the environment. The environment truth data may include information of all entity objects that actually exist in the environment stored in a tabular manner. The performance evaluation data may refer to an index for evaluating the performance of the physical model. For example, the performance assessment data may include a perceived accuracy of the physical model, and the like.
Specifically, in this embodiment, based on the environment truth value data, the environment perception result data output by the various algorithms can be comparatively analyzed to determine the performance of the physical model under each algorithm. Based on the performance under the various algorithms, the performance evaluation data of the physical model under any algorithm can be determined comprehensively, so that the influence of the selected algorithm on the apparent performance of the physical model can be avoided and the performance represented by the performance evaluation data is closer to the real performance of the physical model. Furthermore, when the algorithm inside the sensor cannot be restored in the simulation system, the performance of the physical model of the sensor can still be evaluated by using at least two general algorithms, thereby ensuring the accuracy of the performance evaluation of the physical model.
FIG. 2 shows an example of a sensor model performance evaluation process. As shown in fig. 2, the sensor model performance evaluation process may be: after physical models in a simulation environment and a sensor model are built, environment data can be input into the physical models through a simulator to obtain original data output by the physical models, the original data are respectively input into at least two pre-selected universal algorithms, each algorithm analyzes and processes the original data, and corresponding environment perception result data are output. And determining the evaluation indexes according to the environment truth value data corresponding to the simulation environment and the environment perception result data, so that the performance evaluation data of the physical model can be obtained, the effective evaluation of the performance of the physical model is realized, and the physical model with better performance can be accurately selected.
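As a concrete illustration of the flow in FIG. 2, the following Python sketch wires the steps together under simplifying assumptions: perception results and truth data are modelled as sets of object labels, and the metric shown is the intersection-based proportion of the later embodiments. All names are placeholders, not part of the disclosed system.

```python
from typing import Callable, Iterable, Set

# Hypothetical type aliases: a physical model maps environment data to raw data,
# an algorithm maps raw data to a set of perceived object labels.
PhysicalModel = Callable[[object], object]
Algorithm = Callable[[object], Set[str]]

def evaluate_physical_model(env_data: object,
                            physical_model: PhysicalModel,
                            algorithms: Iterable[Algorithm],
                            ground_truth: Set[str]) -> float:
    # S110: the physical model converts simulated environment data into raw data
    raw_data = physical_model(env_data)
    # S120/S130: the same raw data is analysed by at least two generic algorithms
    results = [algo(raw_data) for algo in algorithms]
    # S140 (one possible metric): objects correctly perceived under every
    # algorithm, as a fraction of the ground-truth objects
    correct_per_algo = [r & ground_truth for r in results]
    common_correct = set.intersection(*correct_per_algo)
    return len(common_correct) / len(ground_truth)
```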
According to the technical scheme of this embodiment, the raw data output by the physical model of the sensor is analyzed and processed by at least two algorithms respectively, and the performance of the physical model can be effectively evaluated according to the environment perception result data output by each algorithm. Because the performance of the physical model is evaluated using the environment perception result data output by at least two algorithms, compared with using the output of a single algorithm, the influence that factors intrinsic to an algorithm exert on the performance evaluation can be avoided as much as possible, so that the performance evaluation data reflects the performance of the physical model more truly and an effective evaluation of the performance of the physical model is realized. The scheme of this embodiment is particularly suitable for effectively evaluating the performance of the physical model of a sensor when the data analysis and processing algorithm adopted by the manufacturer of the sensor hardware cannot be obtained, so that the real performance of the physical model can be evaluated.
On the basis of the above technical solution, when determining the performance evaluation data of the physical model or after determining the performance evaluation data of the physical model in S140, the method may further include: determining the similarity between the environmental perception result data respectively output by each algorithm by comparing the environmental perception result data respectively output by each algorithm; and sending prompt information containing the similarity so that the user can determine the confidence of the performance evaluation data according to the similarity.
The similarity may refer to the degree of similarity between the environment perception result data output by the various algorithms, and may be used to reflect the reference value of the obtained performance evaluation data. For example, the higher the similarity between the environment perception result data corresponding to the algorithms, the smaller the influence of algorithm-specific factors on each output, and the more truly the performance evaluation data can reflect the real performance of the physical model, so the higher its reference value. The confidence may refer to the degree to which the determined performance evaluation data reflects the true performance of the physical model. The higher the similarity between the environment perception result data output by the algorithms, the higher the confidence of the performance evaluation data.
Specifically, the similarity between the environment perception result data output by the various algorithms may be determined based on a similarity calculation algorithm. The similarity calculation algorithm may include, but is not limited to, a Euclidean distance algorithm, a Manhattan distance algorithm, and a cosine similarity algorithm. This embodiment can also prompt the user to determine the confidence of the performance evaluation data of the physical model based on the similarity by sending prompt information containing the similarity to the user, so that the user can better understand the reference value of the performance evaluation data, which improves the user's evaluation experience.
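A minimal sketch of how such a similarity (and the resulting prompt) could be computed when each algorithm's output is modelled as a set of object labels; the Jaccard measure used here is an assumption, standing in for the Euclidean, Manhattan or cosine measures mentioned above.

```python
from itertools import combinations
from statistics import mean

def average_pairwise_similarity(results):
    """Average Jaccard similarity over all pairs of algorithm outputs."""
    sims = [len(a & b) / len(a | b) if (a | b) else 1.0
            for a, b in combinations(results, 2)]
    return mean(sims)

# Hypothetical outputs of two algorithms on the same raw data
results = [{"pedestrian", "vehicle", "street light"},
           {"pedestrian", "vehicle"}]
similarity = average_pairwise_similarity(results)
# Prompt information containing the similarity, from which the user judges
# the confidence of the performance evaluation data
print(f"Similarity between algorithm outputs: {similarity:.2f}")
```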
Example two
Fig. 3 is a flowchart of a sensor model performance evaluation method according to a second embodiment of the present invention, where "determining performance evaluation data of a physical model according to acquired environmental sensing result data and environmental truth data" is optimized in this embodiment based on the above embodiments. Wherein explanations of the same or corresponding terms as those of the above-described embodiments are omitted.
Referring to fig. 3, the method for evaluating the performance of the sensor model provided by the embodiment specifically includes the following steps:
and S210, acquiring raw data output by a physical model of the sensor.
And S220, respectively inputting the original data into at least two algorithms so that each algorithm respectively analyzes and processes the original data.
And S230, acquiring environment perception result data respectively output by at least two algorithms.
S240, determining correct environment sensing result data in the environment sensing result data output by each algorithm by comparing the environment sensing result data output by each algorithm with the environment true value data.
Specifically, for the environment perception result data output by each algorithm, the data is compared with the environment truth value data. If a piece of sub-data in the environment perception result data is included in the environment truth value data, that sub-data can be determined to be correct environment perception result data; otherwise it is erroneous environment perception result data. In this way, all the correct environment perception result data in the output of each algorithm can be determined. For example, if the environment truth value data includes a pedestrian, a vehicle and a street lamp, and the environment perception result data output by an algorithm includes a pedestrian, a vehicle and a tree, it can be determined that the correct environment perception result data output by the algorithm includes the pedestrian and the vehicle. Since each algorithm operates on the same simulation environment, the perception accuracy of the physical model under each algorithm can be measured based on the amount of correct environment perception result data output by that algorithm: the more correct environment perception result data an algorithm outputs, the higher the perception accuracy of the physical model under that algorithm.
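The comparison described in S240 can be sketched as a simple set intersection when the perception results and the truth data are modelled as sets of object labels (an assumption made only for illustration):

```python
def correct_results(perceived: set, ground_truth: set) -> set:
    """Sub-data that appears both in the algorithm output and in the
    environment truth value data is counted as correct; the rest is
    treated as erroneous."""
    return perceived & ground_truth

truth = {"pedestrian", "vehicle", "street lamp"}
algo_output = {"pedestrian", "vehicle", "tree"}   # example from the text
assert correct_results(algo_output, truth) == {"pedestrian", "vehicle"}
```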
And S250, determining performance evaluation data of the physical model according to the correct environment sensing result data and the environment truth value data.
Specifically, the performance evaluation data of the physical model can be comprehensively evaluated by outputting correct environment perception result data and environment truth value data according to each algorithm, so that the performance of the physical model represented by the performance evaluation data is closer to the real performance condition of the physical model, and the accuracy of the performance evaluation result is improved.
According to the technical scheme of the embodiment, the correct environmental perception result data output by each algorithm can be determined by comparing the environmental perception result data output by each algorithm with the environmental truth value data, so that the performance of the physical model is evaluated according to the correct environmental perception result data output by each algorithm, and a more accurate performance evaluation result can be obtained.
EXAMPLE III
Fig. 4 is a flowchart of a sensor model performance evaluation method according to a third embodiment of the present invention, and in this embodiment, based on the second embodiment, optimization is performed on "determining performance evaluation data of a physical model according to correct environmental sensing result data and environmental truth value data". Wherein explanations of the same or corresponding terms as those of the above embodiments are omitted.
Referring to fig. 4, the method for evaluating the performance of the sensor model provided by the embodiment specifically includes the following steps:
and S310, acquiring raw data output by the physical model of the sensor.
And S320, respectively inputting the original data into at least two algorithms so that each algorithm respectively analyzes and processes the original data.
And S330, acquiring environment perception result data respectively output by at least two algorithms.
S340, determining correct environment sensing result data in the environment sensing result data output by each algorithm by comparing the environment sensing result data output by each algorithm with the environment truth value data.
And S350, determining the intersection of the correct environment perception result data output by each algorithm.
Specifically, the correct environmental perception result data which can be identified by each algorithm can be determined by comparing the correct environmental perception result data output by each algorithm, so that the intersection of the correct environmental perception result data output by each algorithm is obtained. For example, correct context awareness result data output by an algorithm includes: pedestrians, vehicles, and street lights; the correct environmental perception result data output by another algorithm comprises: pedestrians, vehicles and trees, it can be determined that the intersection of the correct environment sensing result data output by the two algorithms comprises the pedestrians and the vehicles.
And S360, determining the proportion data of the environmental perception result data in the intersection to the environmental truth value data as the performance evaluation data of the physical model.
Specifically, the environment perception result data in the intersection is the environment perception results that the physical model can correctly recognize under any of the algorithms. The proportion of the environment perception result data in the intersection to the environment truth value data can therefore be used to reflect the lowest perception accuracy the physical model can guarantee under any algorithm, and taking this proportion data as the performance evaluation index of the physical model avoids the influence that algorithm-specific factors exert on the apparent performance of the physical model. The larger the proportion data, the higher the lowest perception accuracy the physical model can achieve, i.e. the better the performance of the physical model. For example, by evaluating different physical models of the sensor in this way, the lowest perception accuracy corresponding to each physical model is obtained, so that the physical model with the best performance can be accurately screened out irrespective of algorithm factors, for example by taking the physical model corresponding to the maximum of the lowest perception accuracies as the best-performing physical model.
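A minimal sketch of this intersection-based metric, again modelling the correct perception results of each algorithm as sets of object labels (an illustrative assumption):

```python
def lowest_guaranteed_accuracy(correct_per_algorithm, ground_truth):
    """Proportion of the objects recognised correctly under every algorithm
    to the environment truth value data: the lowest perception accuracy the
    physical model can guarantee regardless of the algorithm used."""
    common = set.intersection(*correct_per_algorithm)
    return len(common) / len(ground_truth)

truth = {"pedestrian", "vehicle", "tree", "street lamp"}
algo_a_correct = {"pedestrian", "vehicle", "street lamp"}
algo_b_correct = {"pedestrian", "vehicle", "tree"}
print(lowest_guaranteed_accuracy([algo_a_correct, algo_b_correct], truth))  # 0.5
```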
According to the technical scheme of this embodiment, the correct environment perception results the physical model can obtain under any of the algorithms are obtained by determining the intersection of the correct environment perception result data output by each algorithm, and the proportion of the environment perception result data in the intersection to the environment truth value data is determined as the performance evaluation data of the physical model. The performance evaluation data thus characterizes the lowest performance the physical model can guarantee under any algorithm condition, so that different physical models can be evaluated in the same way in order to screen out the physical model with the best performance irrespective of algorithm factors.
Example four
Fig. 5 is a flowchart of a sensor model performance evaluation method according to a fourth embodiment of the present invention, where the embodiment optimizes "determining performance evaluation data of a physical model according to correct environmental sensing result data and environmental truth value data" based on the second embodiment. Wherein explanations of the same or corresponding terms as those of the above embodiments are omitted.
Referring to fig. 5, the method for evaluating the performance of the sensor model provided by the embodiment specifically includes the following steps:
and S410, acquiring raw data output by a physical model of the sensor.
And S420, respectively inputting the original data into at least two algorithms so that each algorithm respectively analyzes and processes the original data.
And S430, acquiring environment perception result data respectively output by at least two algorithms.
S440, determining correct environment sensing result data in the environment sensing result data output by each algorithm by comparing the environment sensing result data output by each algorithm with the environment truth value data.
S450, for each algorithm, determining the proportion of the correct environment perception result data output by that algorithm to the environment truth value data.
Specifically, for each algorithm, the proportion of the correct environment perception result data it outputs to the environment truth value data can be determined. For example, a first number, namely the amount of environment truth value data, and a second number, namely the amount of correct environment perception result data output by each algorithm, may be obtained, and the ratio of the second number to the first number may be used as the proportion data corresponding to that algorithm. The proportion data corresponding to each algorithm reflects the perception accuracy the physical model can achieve under that algorithm. For example, if the environment truth value data includes a pedestrian, a vehicle, a tree and a street lamp, and the correct environment perception result data output by an algorithm includes a pedestrian and a vehicle, it can be determined that the proportion of the correct environment perception result data output by the algorithm to the environment truth value data is 50%.
And S460, determining performance evaluation data of the physical model according to the proportional data.
Specifically, the performance evaluation data of the physical model can be comprehensively determined according to the proportional data corresponding to each algorithm, so that the performance of the physical model represented by the performance evaluation data is closer to the real performance condition of the physical model, and the accuracy of the performance evaluation result is improved.
Exemplarily, S460 may include: determining the average value of the proportion data as the performance evaluation data of the physical model; or determining the minimum proportion data among the proportion data as the performance evaluation data of the physical model.
Specifically, determining the average value of the proportion data as the performance evaluation data of the physical model takes into account the influence of the different algorithms on the performance of the physical model, and the average performance of the physical model under any algorithm can be measured based on this performance evaluation data. For example, if the proportion data corresponding to three algorithms are 50%, 60% and 70% respectively, the average of the three proportion data is 60%, i.e. the performance evaluation data of the physical model is a perception accuracy of 60%.
Alternatively, the minimum proportion data among the proportion data is determined as the performance evaluation data of the physical model, and the lowest performance of the physical model under any algorithm can be measured based on this performance evaluation data, so that the performance evaluation of the physical model is not distorted by the algorithm and the performance evaluation data has a higher reference value. For example, if the proportion data corresponding to three algorithms are 50%, 60% and 70% respectively, the minimum proportion data is 50%, i.e. the performance evaluation data of the physical model is a perception accuracy of 50%.
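The two aggregation options of S460 can be sketched as follows; the per-algorithm proportions are assumed to have been computed as in S450.

```python
def per_algorithm_proportions(correct_per_algorithm, ground_truth):
    """Proportion of each algorithm's correct results to the truth data (S450)."""
    return [len(c) / len(ground_truth) for c in correct_per_algorithm]

def aggregate(proportions, mode="mean"):
    """S460: either the average (average performance under any algorithm)
    or the minimum (lowest performance under any algorithm)."""
    return sum(proportions) / len(proportions) if mode == "mean" else min(proportions)

proportions = [0.5, 0.6, 0.7]           # example values from the text
print(aggregate(proportions, "mean"))   # ~0.6 -> average perception accuracy
print(aggregate(proportions, "min"))    # 0.5  -> lowest perception accuracy
```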
According to the technical scheme of this embodiment, for each algorithm the proportion of the correct environment perception result data it outputs to the environment truth value data is determined, the performance of the physical model is evaluated using these proportion data, and a more accurate performance evaluation result can be obtained.
EXAMPLE five
Fig. 6 is a flowchart of a sensor model performance evaluation method according to a fifth embodiment of the present invention, and this embodiment describes in detail a driving performance evaluation process of a vehicle equipped with a physical model after determining performance evaluation data of the physical model based on the above embodiments. Wherein explanations of the same or corresponding terms as those of the above embodiments are omitted.
Referring to fig. 6, the method for evaluating the performance of the sensor model provided by the embodiment specifically includes the following steps:
and S510, acquiring raw data output by a physical model of the sensor.
And S520, respectively inputting the original data into at least two algorithms so that each algorithm respectively analyzes and processes the original data.
And S530, acquiring environment perception result data respectively output by at least two algorithms.
And S540, determining performance evaluation data of the physical model according to the acquired environmental perception result data and the acquired environmental truth value data.
And S550, acquiring performance evaluation data of the vehicle for at least one drivability angle when a driving test is performed on the vehicle equipped with the physical model.
The drivability angle may be an aspect used to measure the level of driving performance. For example, drivability angles may include, but are not limited to, whether a collision occurred and whether the vehicle completed the full driving route. Accordingly, the performance evaluation data for the drivability angles may include a collision score and a driving completion score.
Specifically, the physical model of the sensor is loaded in the vehicle, and each algorithm is adopted to analyze and process the raw data output by the loaded physical model, so that the vehicle can carry out the driving test based on the environment perception result data output by the algorithm. This yields a vehicle driving test result under each algorithm, such as whether the vehicle collides under that algorithm and whether it completes the full driving route. According to the vehicle driving test results under the various algorithms, performance evaluation data for at least one drivability angle can be determined.
For example, when the performance evaluation data for a drivability angle is the collision score, it may be determined according to whether the vehicle equipped with the physical model collides under the various algorithms. For example, a first number of algorithms under which a collision occurs may be counted. When the first number is equal to 0, it indicates that the vehicle equipped with the physical model does not collide under any of the algorithms, and the collision score may be determined as a maximum score value, such as 100. When the first number is greater than or equal to 1, it indicates that at least one algorithm results in a vehicle collision, and the collision score may be determined directly as a minimum value, such as 0; alternatively, the collision score may be made to decrease as the first number increases, for example linearly or exponentially, until it reaches the minimum value, such as 0, when the first number equals the total number of algorithms.
For example, when the performance evaluation data for a drivability angle is the driving completion score, it may be determined according to whether the vehicle equipped with the physical model completes the full driving route under the various algorithms. For example, a second number of algorithms under which the vehicle cannot complete the full route may be counted. When the second number is equal to 0, it indicates that the vehicle equipped with the physical model completes the full route under every algorithm, and the driving completion score may be determined as a maximum score value, such as 100. When the second number is greater than or equal to 1, it indicates that at least one algorithm prevents the vehicle from completing the full route, and the driving completion score may be determined directly as a minimum value, such as 0; alternatively, the driving completion score may be made to decrease as the second number increases, for example linearly or exponentially, until it reaches the minimum value, such as 0, when the second number equals the total number of algorithms.
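One of the scoring variants described above (maximum score when no algorithm fails, linear decay to the minimum as more algorithms fail) can be sketched as follows; the linear decay and the score range of 0 to 100 are only one of the options the text mentions.

```python
def drivability_score(n_failing_algorithms: int, n_algorithms: int,
                      max_score: float = 100.0) -> float:
    """Linear-decay variant for either the collision score (failing = caused a
    collision) or the driving completion score (failing = could not complete
    the full route)."""
    return max_score * (1.0 - n_failing_algorithms / n_algorithms)

print(drivability_score(0, 3))  # 100.0 -> no algorithm led to a failure
print(drivability_score(1, 3))  # ~66.7 under the linear-decay variant
print(drivability_score(3, 3))  # 0.0   -> every algorithm led to a failure
```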
And S560, determining driving performance evaluation data of the vehicle according to the performance evaluation data of the physical model and the performance evaluation data of at least one driving performance angle.
Specifically, the performance evaluation data of the physical model and the performance evaluation data of each drivability angle may be added, and the obtained addition result is determined as driving performance evaluation data of the vehicle, so that the driving performance of the vehicle may be evaluated, and the performance of the physical model in the driving scene of the vehicle may be further reflected.
Exemplarily, S560 may include: and determining driving performance evaluation data of the vehicle according to the performance evaluation data of the physical model, the weight corresponding to the performance evaluation data of the physical model, the performance evaluation data of each driving performance angle and the weight corresponding to the performance evaluation data of each driving performance angle.
The influence degree of the performance evaluation data of the physical model and the performance evaluation data of each drivability angle on the overall drivability of the vehicle can be adjusted by using the weight of the performance evaluation data of the physical model and the weight of the performance evaluation data of each drivability angle. For example, the higher the weight of the performance evaluation data, the greater the degree of influence of the performance evaluation data on the overall drivability of the vehicle. Different service requirements can be met by setting corresponding weight for each performance evaluation data.
Specifically, the embodiment may perform weighted superposition on the performance evaluation data of the physical model and the performance evaluation data of each drivability angle, and determine the obtained superposition result as the driving performance evaluation data of the vehicle, so as to determine the driving performance of the vehicle more accurately.
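A minimal sketch of the weighted superposition, with hypothetical weights and scores (the weight values are not specified in the text and would be chosen per business requirement):

```python
def driving_performance(model_score: float, model_weight: float,
                        drivability_scores: list, drivability_weights: list) -> float:
    """Weighted superposition of the physical-model performance evaluation data
    and the performance evaluation data of each drivability angle."""
    total = model_weight * model_score
    total += sum(w * s for w, s in zip(drivability_weights, drivability_scores))
    return total

# Hypothetical example: perception accuracy 60, collision score 100,
# driving completion score 100, with assumed weights
print(driving_performance(60.0, 0.5, [100.0, 100.0], [0.25, 0.25]))  # 80.0
```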
According to the technical scheme of this embodiment, a driving test is carried out on the vehicle equipped with the physical model, performance evaluation data of the vehicle for at least one drivability angle is obtained, and a comprehensive evaluation of the driving performance of the vehicle is realized based on the performance evaluation data of the physical model and the performance evaluation data of each drivability angle.
It should be noted that the way of determining the performance evaluation data of the physical model according to the acquired environment perception result data and the environment truth value data is not limited to the schemes provided in the embodiments of the present application; any other scheme capable of obtaining the performance evaluation data of the physical model from the environment perception result data output by a plurality of algorithms and the environment truth value data falls within the protection scope of the present application. For example, the erroneous environment perception result data in the output of each algorithm can be determined by comparing the environment perception result data output by each algorithm with the environment truth value data, and the performance evaluation data of the physical model can then be determined according to the erroneous environment perception result data output by each algorithm and the environment truth value data. Specifically, the maximum among the proportions of the erroneous environment perception result data output by each algorithm to the environment truth value data may be taken as the performance evaluation data of the physical model, or the average of those proportions may be taken as the performance evaluation data of the physical model, and so on, which is not described in detail herein.
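For completeness, the error-based alternative described above could be sketched as follows (sets of object labels again assumed for illustration only):

```python
def error_based_metric(outputs, ground_truth, mode="max"):
    """Proportion of erroneous (not-in-truth) results per algorithm relative to
    the environment truth value data, aggregated by maximum or by average."""
    error_proportions = [len(o - ground_truth) / len(ground_truth) for o in outputs]
    if mode == "max":
        return max(error_proportions)
    return sum(error_proportions) / len(error_proportions)

truth = {"pedestrian", "vehicle", "street lamp"}
outputs = [{"pedestrian", "vehicle", "tree"},         # one erroneous result
           {"pedestrian", "vehicle", "street lamp"}]  # no erroneous results
print(error_based_metric(outputs, truth, "max"))   # ~0.33
print(error_based_metric(outputs, truth, "mean"))  # ~0.17
```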
The following is an embodiment of the sensor model performance evaluation apparatus provided in the embodiments of the present invention, which belongs to the same inventive concept as the sensor model performance evaluation methods of the above embodiments, and details that are not described in detail in the embodiments of the sensor model performance evaluation apparatus may refer to the embodiments of the sensor model performance evaluation methods described above.
EXAMPLE six
Fig. 7 is a schematic structural diagram of a sensor model performance evaluation apparatus according to a sixth embodiment of the present invention, which is applicable to performance evaluation of a physical model in a sensor model. As shown in fig. 7, the apparatus specifically includes: a raw data acquisition module 610, a raw data analysis processing module 620, an environmental perception result data acquisition module 630 and a performance evaluation data determination module 640.
The raw data acquiring module 610 is configured to acquire raw data output by a physical model of a sensor; the raw data analyzing and processing module 620 is configured to input the raw data into at least two algorithms respectively, so that each algorithm analyzes and processes the raw data respectively; an environmental perception result data obtaining module 630, configured to obtain environmental perception result data output by at least two algorithms respectively; and the performance evaluation data determining module 640 is configured to determine performance evaluation data of the physical model according to the acquired environmental sensing result data and the acquired environmental truth value data.
According to the embodiment of the invention, the raw data output by the physical model of the sensor are analyzed and processed by at least two algorithms respectively, and the performance of the physical model can be effectively evaluated from the environment perception result data output by each algorithm. Because the performance of the physical model is evaluated with the environment perception result data output by at least two algorithms rather than by a single algorithm, the influence that shortcomings of any one algorithm would have on the performance evaluation of the physical model is avoided as far as possible, so that the performance evaluation data reflect the performance of the physical model more truly and the performance of the physical model is effectively evaluated. The scheme of this embodiment is particularly suitable for the case in which the data analysis processing algorithm adopted by the manufacturer of the sensor hardware cannot be obtained, since it still allows the real performance of the physical model to be evaluated.
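For orientation only, the module flow of Fig. 7 could be sketched as follows; the function and attribute names (including `physical_model.output()` and `scorer`) are assumptions made for this example and are not defined by the embodiment:

```python
# Illustrative pipeline: raw data from the physical model is analyzed by at
# least two independent algorithms, and their outputs are scored against the
# environment truth value data.

def evaluate_physical_model(physical_model, algorithms, truth, scorer):
    raw = physical_model.output()                # raw data acquisition module 610
    results = [alg(raw) for alg in algorithms]   # raw data analysis processing module 620
    # the `results` list corresponds to the environment perception result data (module 630)
    return scorer(results, truth)                # performance evaluation data module 640
```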
Optionally, the performance evaluation data determining module 640 includes:
the correct environment sensing result data determining unit is used for determining correct environment sensing result data in the environment sensing result data output by each algorithm by comparing the environment sensing result data output by each algorithm with the environment truth value data;
and the performance evaluation data determining unit is used for determining the performance evaluation data of the physical model according to the correct environment sensing result data and the environment truth value data.
Optionally, the performance evaluation data determining unit is specifically configured to: determine the intersection of the correct environment sensing result data output by each algorithm; and determine the proportion data of the environment sensing result data in the intersection to the environment truth value data as the performance evaluation data of the physical model.
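A minimal sketch of this intersection-based proportion, under the assumption (made only for illustration) that each perception result and each truth entry can be represented by a hashable identifier:

```python
# Keep each algorithm's correct detections, intersect them, and report the
# share of the environment truth value data covered by that intersection.

def intersection_ratio(per_algorithm_results, truth):
    truth_set = set(truth)
    correct_sets = [set(results) & truth_set for results in per_algorithm_results]
    common = set.intersection(*correct_sets) if correct_sets else set()
    return len(common) / len(truth_set)
```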
Optionally, the performance evaluation data determining unit includes:
the proportion data determining subunit is used for determining, for the correct environment sensing result data output by each algorithm respectively, the proportion data of that correct environment sensing result data to the environment truth value data;
and the performance evaluation data determining subunit is used for determining the performance evaluation data of the physical model according to the proportional data.
Optionally, the performance evaluation data determining subunit is specifically configured to: determine the average value of the proportion data as the performance evaluation data of the physical model; or determine the minimum proportion data among the proportion data as the performance evaluation data of the physical model.
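By way of illustration (all names assumed), the two aggregation options named above can be sketched as:

```python
# Per-algorithm correct-detection ratios, aggregated either by the average
# value or by the minimum value, as described above.

def per_algorithm_ratios(per_algorithm_results, truth):
    truth_set = set(truth)
    return [len(set(r) & truth_set) / len(truth_set) for r in per_algorithm_results]

def aggregate(ratios, strategy="mean"):
    return min(ratios) if strategy == "min" else sum(ratios) / len(ratios)
```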
Optionally, the apparatus further comprises:
the similarity determining module is used for determining the similarity between the environment sensing result data respectively output by each algorithm by comparing those environment sensing result data with one another;
and the prompt information sending module is used for sending prompt information containing the similarity so that a user can determine the confidence of the performance evaluation data according to the similarity.
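The embodiment does not prescribe a particular similarity measure; one possible choice, shown here purely as an assumed example, is the mean pairwise Jaccard overlap of the object lists output by the algorithms:

```python
from itertools import combinations

def mean_pairwise_similarity(per_algorithm_results):
    """Mean Jaccard similarity over all pairs of algorithm outputs; a higher
    value suggests higher confidence in the performance evaluation data."""
    sets = [set(results) for results in per_algorithm_results]
    pairs = list(combinations(sets, 2))
    sims = [len(a & b) / len(a | b) if (a | b) else 1.0 for a, b in pairs]
    return sum(sims) / len(sims)
```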
Optionally, the apparatus further comprises:
the data acquisition module is used for acquiring performance evaluation data of the vehicle at at least one driving performance angle when a driving test is carried out on the vehicle carrying the physical model, after the performance evaluation data of the physical model have been determined;
and the driving performance evaluation data determining module is used for determining the driving performance evaluation data of the vehicle according to the performance evaluation data of the physical model and the performance evaluation data of at least one driving performance angle.
Optionally, the driving performance evaluation data determining module is specifically configured to: and determining driving performance evaluation data of the vehicle according to the performance evaluation data of the physical model, the weight corresponding to the performance evaluation data of the physical model, the performance evaluation data of each driving performance angle and the weight corresponding to the performance evaluation data of each driving performance angle.
Optionally, the physical model of the sensor is a physical model of a camera, the raw data are image data, the environment sensing result data are a list of entity objects perceived by the model, and the environment truth value data are a list of entity objects actually present in the environment; or,
the physical model of the sensor is a physical model of a radar, the raw data are point cloud data, the environment sensing result data are a list of entity objects perceived by the model, and the environment truth value data are a list of entity objects actually present in the environment.
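The data shapes assumed for both sensor types can be illustrated as follows; the class and field names are assumptions, and only the two lists of entity objects are needed for the evaluation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EntityObject:
    object_id: str      # e.g. "pedestrian_03"
    object_class: str   # e.g. "pedestrian", "vehicle"

# Camera case: raw data are image data; result and truth are entity-object lists.
camera_truth = [EntityObject("veh_01", "vehicle"), EntityObject("ped_03", "pedestrian")]
camera_perceived = [EntityObject("veh_01", "vehicle")]

# Radar case: raw data are point-cloud data; result and truth are again entity-object lists.
radar_truth = list(camera_truth)
```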
The sensor model performance evaluation device provided by the embodiment of the invention can execute the sensor model performance evaluation method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of executing the sensor model performance evaluation method.
It should be noted that, in the embodiment of the sensor model performance evaluation apparatus, the modules included in the embodiment are only divided according to functional logic, but are not limited to the above division as long as the corresponding functions can be implemented; in addition, specific names of the functional units are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present invention.
EXAMPLE seven
Fig. 8 is a schematic structural diagram of a device according to a seventh embodiment of the present invention. Fig. 8 illustrates a block diagram of an exemplary device 12 suitable for use in implementing embodiments of the present invention. The device 12 shown in Fig. 8 is only an example and should not impose any limitation on the function or scope of use of the embodiments of the present invention.
As shown in FIG. 8, device 12 is in the form of a general purpose computing device. The components of device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor bus or local bus using any of a variety of bus architectures. By way of example, such architectures include, but are not limited to, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system memory 28 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM)30 and/or cache memory 32. Device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 8, and commonly referred to as a "hard drive"). Although not shown in FIG. 8, a magnetic disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a CD-ROM, DVD-ROM, or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. System memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the invention.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in system memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally carry out the functions and/or methodologies of the described embodiments of the invention.
Device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with device 12, and/or with any devices (e.g., network card, modem, etc.) that enable device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Also, the device 12 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet) via the network adapter 20. As shown, the network adapter 20 communicates with the other modules of the device 12 via the bus 18. It should be understood that although not shown in the figures, other hardware and/or software modules may be used in conjunction with device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, implementing steps of a sensor model performance evaluation method provided by the embodiment of the present invention, the method including:
acquiring original data output by a physical model of a sensor;
respectively inputting the original data into at least two algorithms so that each algorithm respectively analyzes and processes the original data;
acquiring environment sensing result data respectively output by at least two algorithms;
and determining performance evaluation data of the physical model according to the acquired environment sensing result data and the acquired environment truth value data.
Of course, those skilled in the art will understand that the processor may also implement the technical solution of the sensor model performance evaluation method provided in any embodiment of the present invention.
EXAMPLE eight
The present embodiment provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of a sensor model performance evaluation method as provided by any of the embodiments of the invention, the method comprising:
acquiring original data output by a physical model of a sensor;
respectively inputting the original data into at least two algorithms so that each algorithm respectively analyzes and processes the original data;
acquiring environment sensing result data respectively output by at least two algorithms;
and determining performance evaluation data of the physical model according to the acquired environment sensing result data and the acquired environment truth value data.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. The computer-readable storage medium may be, for example but not limited to: an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination thereof. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It will be understood by those skilled in the art that the modules or steps of the invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of computing devices, and optionally they may be implemented by program code executable by a computing device, such that it may be stored in a memory device and executed by a computing device, or it may be separately fabricated into various integrated circuit modules, or it may be fabricated by fabricating a plurality of modules or steps thereof into a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (12)

1. A method for evaluating performance of a sensor model, comprising:
acquiring original data output by a physical model of a sensor;
inputting the original data into at least two algorithms respectively so that each algorithm analyzes and processes the original data respectively;
acquiring environment perception result data respectively output by the at least two algorithms;
and determining performance evaluation data of the physical model according to the acquired environment perception result data and the acquired environment truth value data.
2. The method of claim 1, wherein determining performance evaluation data for the physical model based on the obtained environmental perception result data and environmental truth data comprises:
determining correct environment sensing result data in the environment sensing result data output by each algorithm by comparing the environment sensing result data output by each algorithm with environment true value data;
and determining the performance evaluation data of the physical model according to the correct environment perception result data and the environment truth value data.
3. The method of claim 2, wherein determining performance assessment data for the physical model based on the correct environmental perception result data and environmental truth data comprises:
determining the intersection of the correct environment sensing result data output by each algorithm;
and determining the proportion data of the environmental perception result data in the intersection to the environmental truth value data as the performance evaluation data of the physical model.
4. The method of claim 2, wherein determining performance assessment data for the physical model based on the correct environmental perception result data and environmental truth data comprises:
determining, for the correct environmental perception result data output by each algorithm respectively, proportion data of the correct environmental perception result data to the environmental truth value data;
and determining performance evaluation data of the physical model according to the proportional data.
5. The method of claim 4, wherein determining performance assessment data for the physical model based on each of the proportion data comprises:
determining the average value of the proportion data as the performance evaluation data of the physical model; or,
and determining the minimum proportion data in the proportion data as the performance evaluation data of the physical model.
6. The method of claim 1, further comprising:
determining the similarity between the environmental perception result data respectively output by each algorithm by comparing the environmental perception result data respectively output by each algorithm;
and sending prompt information containing the similarity so as to enable a user to determine the confidence of the performance evaluation data according to the similarity.
7. The method of any of claims 1-6, wherein after determining the performance assessment data for the physical model, the method further comprises:
acquiring performance evaluation data of the vehicle at least one driving performance angle when the vehicle carrying the physical model is subjected to a driving test;
determining driving performance evaluation data of the vehicle based on the performance evaluation data of the physical model and the performance evaluation data of the at least one drivability angle.
8. The method of claim 7, wherein determining driving performance assessment data for the vehicle based on the performance assessment data for the physical model and the performance assessment data for the at least one drivability angle comprises:
and determining driving performance evaluation data of the vehicle according to the performance evaluation data of the physical model, the weight corresponding to the performance evaluation data of the physical model, the performance evaluation data of each driving performance angle and the weight corresponding to the performance evaluation data of each driving performance angle.
9. The method according to any one of claims 1-6, wherein the physical model of the sensor is a physical model of a camera, the raw data is image data, the environment perception result data is a list of physical objects perceived by the model, and the environment truth data is a list of physical objects that are actually present in the environment; or,
the physical model of the sensor is a physical model of a radar, the original data is point cloud data, the environment perception result data is a list of entity objects perceived by the model, and the environment truth value data is a list of entity objects really existing in the environment.
10. A sensor model performance evaluation apparatus, characterized by comprising:
the raw data acquisition module is used for acquiring raw data output by a physical model of the sensor;
the original data analysis processing module is used for inputting the original data into at least two algorithms respectively so that each algorithm can analyze and process the original data respectively;
the environment perception result data acquisition module is used for acquiring environment perception result data respectively output by the at least two algorithms;
and the performance evaluation data determining module is used for determining the performance evaluation data of the physical model according to the acquired environment sensing result data and the acquired environment truth value data.
11. An apparatus, characterized in that the apparatus comprises:
one or more processors;
a memory for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the sensor model performance evaluation method of any one of claims 1-9.
12. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a method of sensor model performance evaluation according to any one of claims 1 to 9.
CN202010041616.9A 2020-01-15 2020-01-15 Sensor model performance evaluation method, device, equipment and storage medium Pending CN113128315A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010041616.9A CN113128315A (en) 2020-01-15 2020-01-15 Sensor model performance evaluation method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010041616.9A CN113128315A (en) 2020-01-15 2020-01-15 Sensor model performance evaluation method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN113128315A true CN113128315A (en) 2021-07-16

Family

ID=76771326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010041616.9A Pending CN113128315A (en) 2020-01-15 2020-01-15 Sensor model performance evaluation method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113128315A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060007308A1 (en) * 2004-07-12 2006-01-12 Ide Curtis E Environmentally aware, intelligent surveillance device
US20150112570A1 (en) * 2013-10-22 2015-04-23 Honda Research Institute Europe Gmbh Confidence estimation for predictive driver assistance systems based on plausibility rules
CN106407281A (en) * 2016-08-26 2017-02-15 北京奇艺世纪科技有限公司 Image retrieval method and device
US20190329773A1 (en) * 2017-01-12 2019-10-31 Mobileye Vision Technologies Ltd. Navigation based on bahaviors of following vehicles
CN109522825A (en) * 2018-10-31 2019-03-26 蔚来汽车有限公司 The Performance Test System and its performance test methods of visual perception system
CN109520744A (en) * 2018-11-12 2019-03-26 百度在线网络技术(北京)有限公司 The driving performance test method and device of automatic driving vehicle
CN110532856A (en) * 2019-07-16 2019-12-03 公安部第一研究所 A kind of face identification method of more algorithm fusions

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
DANYANG TIAN et al.: "Performance Measurement Evaluation Framework and Co-Benefit/Tradeoff Analysis for Connected and Automated Vehicles (CAV) Applications: A Survey", IEEE Intelligent Transportation Systems Magazine, vol. 10, no. 3, pages 110-122 *

Similar Documents

Publication Publication Date Title
CN109188457B (en) Object detection frame generation method, device, equipment, storage medium and vehicle
CN109145680B (en) Method, device and equipment for acquiring obstacle information and computer storage medium
CN109343061B (en) Sensor calibration method and device, computer equipment, medium and vehicle
CN109087510B (en) Traffic monitoring method and device
WO2020147500A1 (en) Ultrasonic array-based obstacle detection result processing method and system
CN113807350A (en) Target detection method, device, equipment and storage medium
CN112863187B (en) Detection method of perception model, electronic equipment, road side equipment and cloud control platform
CN115797736B (en) Training method, device, equipment and medium for target detection model and target detection method, device, equipment and medium
US9613272B2 (en) Image analyzing device, image analyzing method, and recording medium storing image analyzing program
CN116964588A (en) Target detection method, target detection model training method and device
CN113255516A (en) Living body detection method and device and electronic equipment
CN110363193B (en) Vehicle weight recognition method, device, equipment and computer storage medium
CN113177497B (en) Training method of visual model, vehicle identification method and device
CN109635868B (en) Method and device for determining obstacle type, electronic device and storage medium
CN114120071A (en) Detection method of image with object labeling frame
CN111860623A (en) Method and system for counting tree number based on improved SSD neural network
CN111091099A (en) Scene recognition model construction method, scene recognition method and device
CN114429631B (en) Three-dimensional object detection method, device, equipment and storage medium
KR20130013462A (en) Foreground extraction apparatus and method using ccb and mt lbp
CN113591543B (en) Traffic sign recognition method, device, electronic equipment and computer storage medium
CN113128315A (en) Sensor model performance evaluation method, device, equipment and storage medium
CN113111692B (en) Target detection method, target detection device, computer readable storage medium and electronic equipment
CN111124862B (en) Intelligent device performance testing method and device and intelligent device
CN114638947A (en) Data labeling method and device, electronic equipment and storage medium
CN113762001B (en) Target detection method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination