CN117349063A - Method, device, equipment and storage medium for determining detection performance of detector - Google Patents
- Publication number
- Publication number: CN117349063A; Application number: CN202311277179.0A
- Authority
- CN
- China
- Prior art keywords
- data
- detected
- obstacle
- matching result
- detection
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/07—Responding to the occurrence of a fault, e.g. fault tolerance
- G06F11/0703—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation
- G06F11/0706—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation the processing taking place on a specific hardware platform or in a specific software environment
- G06F11/0736—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation the processing taking place on a specific hardware platform or in a specific software environment in functional embedded systems, i.e. in a data processing system designed as a combination of hardware and software dedicated to performing a certain function
- G06F11/0739—Error or fault processing not based on redundancy, i.e. by taking additional measures to deal with the error or fault not making use of redundancy in operation, in hardware, or in data representation the processing taking place on a specific hardware platform or in a specific software environment in functional embedded systems, i.e. in a data processing system designed as a combination of hardware and software dedicated to performing a certain function in a data processing system embedded in automotive or aircraft systems
Abstract
The invention discloses a method, a device, equipment and a storage medium for determining detection performance of a detector. The method comprises the following steps: obtaining multi-frame data to be detected, and respectively inputting the multi-frame data to be detected into a target detector to obtain obstacle detection information corresponding to each frame of data to be detected output by the detector; for each frame of data to be detected, acquiring standard obstacle information corresponding to the data to be detected, determining obstacle difference information based on the obstacle detection information and the standard obstacle information, and determining a preliminary matching result corresponding to the data to be detected based on the obstacle difference information; acquiring reference detection data corresponding to the data to be detected, and determining a target matching result of the data to be detected based on a preliminary matching result corresponding to the reference detection data and a preliminary matching result corresponding to the data to be detected; and determining the detection performance index of the detector based on the target matching results corresponding to the multi-frame data to be detected. This achieves the beneficial effect of improving the rationality of the detector's detection results.
Description
Technical Field
The present invention relates to the field of detection technologies, and in particular, to a method, an apparatus, a device, and a storage medium for determining detection performance of a detector.
Background
In the field of autonomous driving, the ability of the on-board system to detect objects around the vehicle is critical to vehicle safety. In the prior art, research and development tests of vehicle-mounted environment-perception detectors typically match and compare obstacles one by one between a single-frame detection target and a truth value, that is, one frame of detection data corresponds to one frame of truth data.
However, autonomous driving systems also have tracking strategies downstream of the detector; the occasional loss of a detected obstacle can be effectively compensated by means such as Kalman filtering. Consequently, only the obstacle information list output by the tracker influences the subsequent planning of the vehicle driving path.
Therefore, a performance evaluation mode based on single-frame detection not only tends to waste precision, but its test results also often fail to accurately reflect the real detection performance of the detector in scene detection.
Disclosure of Invention
The invention provides a method, a device, equipment and a storage medium for determining detection performance of a detector, which are used for solving the technical problems that a performance detection mode based on single-frame detection wastes precision and cannot accurately reflect the real-scene detection performance of the detector.
According to an aspect of the present invention, there is provided a method of determining detection performance of a detector, the method comprising:
obtaining multi-frame data to be detected, and respectively inputting the multi-frame data to be detected into a target detector to obtain obstacle detection information corresponding to each frame of data to be detected output by the detector;
for each frame of data to be detected, standard obstacle information corresponding to the data to be detected is obtained, obstacle difference information is determined based on the obstacle detection information and the standard obstacle information, and a preliminary matching result corresponding to the data to be detected is determined based on the obstacle difference information;
acquiring reference detection data corresponding to the data to be detected, and determining a target matching result of the data to be detected based on a preliminary matching result corresponding to the reference detection data and a preliminary matching result corresponding to the data to be detected, wherein the reference detection data is the data to be detected adjacent to the data to be detected;
and determining the detection performance index of the detector based on the target matching result corresponding to the multi-frame data to be detected.
According to another aspect of the present invention, there is provided a detector detection performance determining apparatus, the apparatus comprising:
a to-be-detected data acquisition module, configured to acquire multiple frames of to-be-detected data and respectively input the multiple frames of to-be-detected data into the target detector to obtain obstacle detection information corresponding to each frame of to-be-detected data output by the detector;
The preliminary matching result determining module is used for acquiring standard obstacle information corresponding to the data to be detected for each frame of the data to be detected, determining obstacle difference information based on the obstacle detection information and the standard obstacle information, and determining a preliminary matching result corresponding to the data to be detected based on the obstacle difference information;
the target matching result determining module is used for obtaining reference detection data corresponding to the data to be detected, and determining a target matching result of the data to be detected based on a preliminary matching result corresponding to the reference detection data and a preliminary matching result corresponding to the data to be detected, wherein the reference detection data is the data to be detected adjacent to the data to be detected;
and the detection performance index determining module is used for determining the detection performance index of the detector based on the target matching result corresponding to the multi-frame data to be detected.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of determining the detection performance of the detector of any of the embodiments of the invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to perform a method of determining the detection performance of a detector according to any of the embodiments of the present invention.
According to the technical scheme, firstly, multiple frames of data to be detected are obtained, the multiple frames of data to be detected are respectively input into the target detector, obstacle detection information corresponding to each frame of data to be detected output by the detector is obtained, and a data source is provided for multiple frames of detection. And then, for each frame of data to be detected, acquiring standard obstacle information corresponding to the data to be detected, determining obstacle difference information based on the obstacle detection information and the standard obstacle information, and determining a preliminary matching result corresponding to the data to be detected based on the obstacle difference information. And further acquiring reference detection data corresponding to the data to be detected, and determining a target matching result of the data to be detected based on the preliminary matching result corresponding to the reference detection data and the preliminary matching result corresponding to the data to be detected, wherein the reference detection data is the data to be detected adjacent to the data to be detected. By referring to the detection result of the adjacent frame detection data, the optimization and correction of the data to be detected are realized. And finally, determining the detection performance index of the detector based on the target matching result corresponding to the multi-frame data to be detected. The technical problems that precision is wasted and real scene detection performance of the detector cannot be accurately reflected due to a performance detection mode based on single-frame detection are solved. The method has the advantages of improving the rationality of the performance detection result of the detector, effectively guiding the performance improvement direction of the detector and providing more accurate detection data basis for vehicle driving path planning.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for determining detection performance of a detector according to a first embodiment of the present invention;
FIG. 2 is a logic block diagram of a determination of detection performance of a detector provided in accordance with a second embodiment of the present invention;
FIG. 3 is a logic block diagram of a determination of detector detection performance provided in accordance with a second embodiment of the present invention;
fig. 4 is a schematic structural diagram of a device for determining detection performance of a detector according to a third embodiment of the present invention;
fig. 5 is a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
Fig. 1 is a flowchart of a method for determining detection performance of a detector according to an embodiment of the present invention, where the method may be performed by a device for determining detection performance of a detector, and the device for determining detection performance of a detector may be implemented in hardware and/or software, and the device for determining detection performance of a detector may be configured in an electronic device. As shown in fig. 1, the method includes:
s110, acquiring multi-frame data to be detected, and respectively inputting the multi-frame data to be detected into a target detector to obtain obstacle detection information corresponding to each frame of data to be detected output by the detector.
In this embodiment, the data to be detected may be one frame in the scene segment imaging data acquired by an environment sensing device such as a radar imaging system. In particular, the environment sensing device may emit a pulsed signal to the surrounding environment and then receive the pulsed signal reflected by an object. According to the time delay and phase information of the received reflected signals, the environment sensing device can calculate the distance and azimuth of the target object relative to the radar; according to the amplitude information of the reflected signals, it can calculate the size, shape and other information of the target object, thereby realizing imaging and identification of the target object. After receiving the plurality of reflected signals, the system processes them with a signal processing algorithm, thereby identifying a scene comprising a plurality of target objects and acquiring the imaging data of the scene segment. The multi-frame data to be detected may be a plurality of frames in the scene segment imaging data. By way of example, the perceived frequency of the radar imaging system may be at least 10 Hz (so that the spacing between adjacent frames is no greater than 0.1 s), and the multi-frame data to be detected may be scene segment data comprising at least 5 consecutive frames. The target detector may be a terminal, a system or a detection module that performs detection on the data to be detected. The obstacle detection information may be information related to the obstacle detection result output after detecting each frame in the scene segment imaging data, such as the position and shape of the obstacle.
Specifically, a scene segment of at least 5 continuous frames can be obtained as 5 pieces of data to be detected based on the radar imaging system, and multiple frames of data to be detected can be respectively input into the target detector. Then, obstacle detection information corresponding to each frame of data to be detected output by the detector can be obtained based on the analysis detection of the target detector.
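As a minimal illustrative sketch (the detector interface and frame format here are assumptions, not part of the disclosure), step S110 amounts to running each frame of to-be-detected data through the target detector and collecting the per-frame output:

```python
def collect_detections(frames, detector):
    """Run the target detector on each frame of to-be-detected data and
    collect the obstacle detection information output for each frame
    (step S110). `detector` is any callable taking one frame."""
    return [detector(frame) for frame in frames]

# Hypothetical stand-in detector: returns an empty obstacle list per frame.
frames = [{"frame_id": i} for i in range(5)]  # e.g. 5 consecutive frames
detections = collect_detections(
    frames, lambda frame: {"frame_id": frame["frame_id"], "obstacles": []}
)
```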
S120, for each frame of data to be detected, standard obstacle information corresponding to the data to be detected is acquired, obstacle difference information is determined based on the obstacle detection information and the standard obstacle information, and a preliminary matching result corresponding to the data to be detected is determined based on the obstacle difference information.
In this embodiment, the standard obstacle information may be truth data information of the obstacle obtained by using a collection device such as a truth data collection device, a system or a collection module. For example, the standard obstacle information may be extracted obstacle truth value data information in a scene segment acquired by the evaluation system based on Ibeo Reference System. It can be appreciated that the truth data collection device and the environmental awareness apparatus may perform spatial calibration and time synchronization in advance. The standard obstacle information corresponding to the data to be detected can be objective truth data corresponding to each frame of data of multiple frames of data to be detected, which is acquired by the truth data acquisition device after space calibration and time synchronization are performed. The obstacle difference information may be data difference information of dimensions such as the shape and position of the obstacle, among the data of the obstacle detection information and the standard obstacle information. The preliminary matching result may be a preliminary matching result determined based on 1 frame of obstacle difference information in the multi-frame to-be-detected data and the standard obstacle information. The preliminary matching result may be a correct matching (positive class) with a small difference or a wrong matching (negative class) with a large difference according to the difference magnitude value in the obstacle difference information.
Specifically, for each frame of data to be detected, standard obstacle information corresponding to each frame of data to be detected can be obtained based on the truth value data acquisition device, wherein the standard obstacle information is obstacle truth value data obtained by the truth value data acquisition device after spatial calibration and time synchronization with the environment sensing equipment. Then 1 frame of multi-frame data to be detected can be selected, obstacle detection information is extracted based on the data to be detected, and standard obstacle information is extracted from corresponding 1 frame true value data. And finally, determining obstacle difference information based on information differences of the shape, the position and other data of the obstacle, and determining a preliminary matching result corresponding to the data to be detected based on the obstacle difference information.
S130, acquiring reference detection data corresponding to the data to be detected, and determining a target matching result of the data to be detected based on a preliminary matching result corresponding to the reference detection data and a preliminary matching result corresponding to the data to be detected, wherein the reference detection data is the data to be detected adjacent to the data to be detected.
In this embodiment, the reference detection data may be to-be-detected data corresponding to adjacent frames before and after 1 frame of to-be-detected data in the multi-frame to-be-detected data. The target matching result may be a matching result obtained by comprehensively judging after integrating all the preliminary matching results corresponding to the multi-frame data to be detected. Because the influence of measurement noise on the single-frame detection result cannot be eliminated by single-frame detection, the rationality of the detection result can be effectively improved based on the preliminary matching result corresponding to the reference detection data and the target matching result determined by the preliminary matching result corresponding to the data to be detected. The target matching result may be correct matching (positive class) or incorrect matching (negative class) after the preliminary matching result is corrected.
For example, the preliminary matching result corresponding to the data to be detected may be a matching result determined based on the 3 rd frame of the 5 consecutive frames of the multi-frame data to be detected. The preliminary matching result corresponding to the reference detection data may be a preliminary matching result corresponding to the data to be detected of the previous and subsequent 2 frames of the data to be detected. The preliminary matching result corresponding to the data to be detected can be corrected based on the preliminary matching result corresponding to the data to be detected of the previous and subsequent 2 frames, and the target matching result corresponding to the data to be detected is obtained.
And S140, determining the detection performance index of the detector based on the target matching result corresponding to the multi-frame data to be detected.
In this embodiment, the detection performance index of the detector is determined based on the target matching results corresponding to the multi-frame data to be detected; the detection performance of the detector may be evaluated by drawing PR curves and ROC curves based on statistics of the true positive, false positive, false negative and true negative classes over the multiple target matching results. The true positive class (TP) is the case where the preliminary matching result is positive and the target matching result is positive; the false positive class (FP) is the case where the preliminary matching result is positive and the target matching result is negative; the false negative class (FN) is the case where the preliminary matching result is negative and the target matching result is positive; and the true negative class (TN) is the case where the preliminary matching result is negative and the target matching result is negative.
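The statistics above can be sketched as follows (a non-authoritative illustration; the data layout and function name are assumptions). Each frame contributes a (preliminary, target) pair of matching results, which is binned into TP/FP/FN/TN and summarized into the precision and recall needed for PR-curve plotting:

```python
def detection_metrics(pairs):
    """pairs: list of (preliminary, target) booleans per frame, where
    True denotes a correct match (positive class). Bins each frame into
    TP/FP/FN/TN per the scheme above and computes precision/recall."""
    tp = sum(1 for p, t in pairs if p and t)
    fp = sum(1 for p, t in pairs if p and not t)
    fn = sum(1 for p, t in pairs if not p and t)
    tn = sum(1 for p, t in pairs if not p and not t)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return {"TP": tp, "FP": fp, "FN": fn, "TN": tn,
            "precision": precision, "recall": recall}
```

Sweeping the matching thresholds and recomputing these numbers would trace out the PR and ROC curves described above.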
Optionally, the obstacle detection information is an obstacle detection position, and the standard obstacle information is an obstacle standard position; determining obstacle difference information based on the obstacle detection information and the standard obstacle information, determining a preliminary matching result corresponding to the data to be detected based on the obstacle difference information, comprising: calculating the deviation distance between the obstacle detection position and the obstacle standard position, and determining a preliminary matching result corresponding to the data to be detected based on the deviation distance and a preset distance threshold.
In this embodiment, the obstacle detection position may be relative spatial position information of the obstacle and the environment sensing device, for example, may be a distance between the obstacle and the environment sensing device. The standard position of the obstacle may be information of relative spatial positions of the obstacle and the truth data acquisition device, for example, may be a distance between the obstacle and the truth data acquisition device. The preset distance threshold may be a distance threshold preset according to actual situations. For example, when the deviation distance is greater than the preset distance threshold, the preliminary matching result may be determined to be a matching error (negative class), and when the deviation distance is equal to or less than the preset distance threshold, the preliminary matching result may be determined to be a matching correct (positive class).
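A minimal sketch of this distance-based preliminary matching, assuming 2-D positions and a Euclidean deviation (the patent does not fix the distance metric, so that choice is an assumption):

```python
import math

def preliminary_match_by_distance(detected_pos, standard_pos, distance_threshold):
    """Return True (correct match / positive class) when the deviation
    between the obstacle detection position and the obstacle standard
    position is within the preset distance threshold, else False
    (matching error / negative class)."""
    deviation = math.dist(detected_pos, standard_pos)
    return deviation <= distance_threshold

# Example: detection 0.4 m from the truth position, threshold 0.5 m
match = preliminary_match_by_distance((10.0, 2.0), (10.4, 2.0), 0.5)
```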
Optionally, the obstacle detection information is an obstacle detection area, and the standard obstacle information is an obstacle standard area; determining obstacle difference information based on the obstacle detection information and the standard obstacle information, determining a preliminary matching result corresponding to the data to be detected based on the obstacle difference information, comprising: and calculating the region intersection ratio between the obstacle detection region and the obstacle standard region, and determining a preliminary matching result corresponding to the data to be detected based on the region intersection ratio and a preset intersection ratio threshold value.
In this embodiment, the obstacle detection area may be geometric information of the obstacle region determined from the obstacle detection information, for example, the shape of the obstacle. The obstacle standard region may be geometric information of the obstacle region acquired by the truth data acquisition device. The preset intersection ratio threshold may be an intersection ratio limit preset according to actual situations, for example, 0.6. It will be appreciated that a larger region intersection ratio between the obstacle detection region and the obstacle standard region means the data to be detected is closer to the truth data.
For example, when the area intersection ratio between the obstacle detection area and the obstacle standard area is greater than the preset intersection ratio threshold value of 0.6, the preliminary matching result may be determined to be correct (positive class). When the area intersection ratio between the obstacle detection area and the obstacle standard area is less than or equal to a preset intersection ratio threshold value of 0.6, the preliminary matching result can be determined to be a matching error (negative class).
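This intersection-ratio check can be sketched as follows, assuming axis-aligned boxes in (x1, y1, x2, y2) form (the patent does not specify the region representation, so the box encoding is an assumption):

```python
def iou(box_a, box_b):
    """Region intersection ratio (intersection over union) of two
    axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def preliminary_match_by_iou(det_box, std_box, iou_threshold=0.6):
    """Positive class when the region intersection ratio exceeds the
    preset intersection ratio threshold (0.6 in the example above)."""
    return iou(det_box, std_box) > iou_threshold
```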
Optionally, the determining the target matching result of the data to be detected based on the preliminary matching result corresponding to the reference detection data and the preliminary matching result corresponding to the data to be detected includes at least one of the following operations: under the condition that the number of frames of the reference detection data, which are different from the preliminary matching result corresponding to the data to be detected, reach the preset number of frames, the preliminary matching result of the reference detection data is used as a target matching result of the data to be detected; under the condition that the reference detection data is adjacent to the data to be detected and the preliminary matching result corresponding to the data to be detected is different, if the obstacle difference information corresponding to the preliminary matching result of the data to be detected meets a first preset condition, taking the preliminary matching result of the reference detection data as a target matching result of the data to be detected; and taking the preliminary matching result of the reference detection data as the target matching result of the data to be detected under the condition that the obstacle difference information corresponding to the preliminary matching result of the reference detection data meets a second preset condition.
In this embodiment, the preset frame number may be a continuous frame number preset according to an actual situation, for example, may be 3 continuous frames.
Specifically, when the preliminary matching result corresponding to the reference detection data of 3 continuous frames in one scene segment is correct (positive class) and the preliminary matching result corresponding to the data to be detected is incorrect (negative class), the correct matching (positive class) can be used as the target matching result of the data to be detected. In contrast, when the preliminary matching result corresponding to the reference detection data of 3 continuous frames in one scene segment is a matching error (negative class), and the preliminary matching result corresponding to the data to be detected is a matching correct (positive class), the matching error (negative class) may be used as the target matching result of the data to be detected.
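The consecutive-frame override described above can be sketched as follows (a hedged illustration; per-frame results are modeled as booleans, with True denoting a correct match / positive class, and the function name is an assumption):

```python
def target_match(prelim, i, preset_frames=3):
    """prelim: list of per-frame preliminary matching results for one
    scene segment. If `preset_frames` consecutive reference frames
    immediately before or after frame i all carry the opposite result,
    that opposite result is adopted as the target matching result;
    otherwise the frame's own preliminary result is kept."""
    opposite = not prelim[i]
    before = prelim[max(0, i - preset_frames):i]   # frames just before i
    after = prelim[i + 1:i + 1 + preset_frames]    # frames just after i
    for run in (before, after):
        if len(run) == preset_frames and all(r == opposite for r in run):
            return opposite
    return prelim[i]
```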
The first preset condition may be a preset value range of the difference information set according to the actual situation; for example, the first preset condition may be that the obstacle difference information is greater than a preset deviation distance threshold or greater than a preset intersection ratio threshold.
Specifically, if the reference detection data is adjacent to the data to be detected and its preliminary matching result differs from that of the data to be detected, and the obstacle difference information corresponding to the preliminary matching result of the data to be detected falls within a specific difference information value range, the preliminary matching result of the reference detection data may be used as the target matching result of the data to be detected.
For example, when the preliminary matching result of the reference detection data located in the 2 continuous frames immediately before or after the data to be detected is a correct match (positive class) and the preliminary matching result of the data to be detected is a matching error (negative class), if the intersection ratio in the obstacle difference information corresponding to the preliminary matching result of the data to be detected is greater than 0.3, the preliminary matching result of the reference detection data, a correct match (positive class), may be used as the target matching result of the data to be detected. In contrast, when the preliminary matching result of the reference detection data located in the 2 continuous frames immediately before or after the data to be detected is a matching error (negative class) and the preliminary matching result of the data to be detected is a correct match (positive class), if the intersection ratio in the obstacle difference information corresponding to the preliminary matching result of the data to be detected is less than 0.3, the preliminary matching result of the reference detection data, a matching error (negative class), may be used as the target matching result of the data to be detected.
The second preset condition may be a preset value range of the difference information set according to the actual situation; for example, the second preset condition may be that the obstacle difference information is greater than a preset deviation distance threshold or greater than a preset intersection ratio threshold. For example, the second preset condition may be that the intersection ratio in the obstacle difference information corresponding to the preliminary matching result of the reference detection data is greater than 0.9.
Specifically, when the obstacle difference information corresponding to the preliminary matching result of the reference detection data meets the second preset condition, for example, the intersection ratio is greater than 0.9, the preliminary matching result of the reference detection data, a correct match (positive class), may be used as the target matching result of the data to be detected.
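The second preset condition reduces to a simple guard. A hedged sketch with assumed names; the 0.9 threshold is taken from the example above and is illustrative only:

```python
def override_by_high_iou(ref_result, ref_iou, frame_result, iou_threshold=0.9):
    """If the reference frame's match is extremely confident (its IoU exceeds
    the second preset condition's threshold), trust its result for this frame."""
    if ref_iou > iou_threshold:
        return ref_result
    return frame_result
```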
Optionally, the acquiring the reference detection data corresponding to the data to be detected includes at least one of the following operations: acquiring the to-be-detected data of a first preset number of frames located before and/or after the data to be detected and adjacent to it in acquisition time, as the reference detection data corresponding to the data to be detected; and acquiring, from the multi-frame to-be-detected data, the to-be-detected data of a second preset number of frames that are continuous in acquisition time, as the reference detection data corresponding to the data to be detected.
In this embodiment, the first preset number of frames may be a certain number of frames of to-be-detected data located before and/or after, and adjacent in acquisition time to, the data to be detected; for example, it may be the 1 frame immediately before or the 1 frame immediately after the data to be detected. The second preset number of frames may be a number of frames preset according to the actual situation, for example, 2 frames. Acquiring the to-be-detected data of the second preset number of frames that are continuous in acquisition time may be, for example, acquiring the 2 continuous frames immediately before or after the data to be detected within one scene segment.
Optionally, the determining the target matching result of the data to be detected based on the preliminary matching result corresponding to the reference detection data and the preliminary matching result corresponding to the data to be detected includes at least one of the following operations: when the preliminary matching results corresponding to both the previous frame and the next frame of the data to be detected differ from the preliminary matching result of the data to be detected, taking the preliminary matching result corresponding to the previous frame or the next frame as the target matching result of the data to be detected; when the preliminary matching results of the to-be-detected data of a continuous second preset number of frames immediately before or after the data to be detected differ from the preliminary matching result of the data to be detected, if the obstacle difference information corresponding to the preliminary matching result of the data to be detected is in a preset first information value interval, taking the preliminary matching result of the reference detection data as the target matching result of the data to be detected; and when the preliminary matching result of the previous frame or the next frame of the data to be detected differs from the preliminary matching result of the data to be detected, and the obstacle difference information corresponding to the preliminary matching result of that previous or next frame is in a preset second information value interval, taking the preliminary matching result of the reference detection data as the target matching result of the data to be detected.
In this embodiment, the preset first and second information value intervals may be value ranges of the difference information preset according to the actual situation. For example, the first and second information value intervals may be value intervals of the deviation distance or value intervals of the region intersection ratio.
Specifically, in the case that the preliminary matching results corresponding to the data to be detected in the previous frame of the data to be detected and the preliminary matching results corresponding to the data to be detected in the next frame of the data to be detected are different from the preliminary matching results of the data to be detected, the preliminary matching results corresponding to the data to be detected in the previous frame of the data to be detected or the data to be detected in the next frame of the data to be detected may be used as the target matching results of the data to be detected. For example, when the preliminary matching result corresponding to the data to be detected is correct (positive class) and the preliminary matching result corresponding to the data to be detected in the previous frame and the data to be detected in the next frame is incorrect (negative class), the correct matching (positive class) may be used as the target matching result of the data to be detected; when the preliminary matching result corresponding to the data to be detected is a matching error (negative class) and the preliminary matching result corresponding to the data to be detected in the previous frame and the data to be detected in the next frame is a matching correct (positive class), the matching error (negative class) can be used as a target matching result of the data to be detected.
When the preliminary matching result of the to-be-detected data located in the 2 continuous frames immediately before or after the data to be detected is a correct match (positive class) and the preliminary matching result of the data to be detected is a matching error (negative class), if the obstacle difference information corresponding to the preliminary matching result of the data to be detected is greater than a preset deviation distance threshold or greater than a preset intersection ratio threshold, the preliminary matching result of the reference detection data, a correct match (positive class), may be used as the target matching result of the data to be detected; in contrast, when the preliminary matching result of the to-be-detected data located in the 2 continuous frames immediately before or after the data to be detected is a matching error (negative class) and the preliminary matching result of the data to be detected is a correct match (positive class), if the obstacle difference information corresponding to the preliminary matching result of the data to be detected is smaller than a preset deviation distance threshold or smaller than a preset intersection ratio threshold, the preliminary matching result of the reference detection data may be used as the target matching result of the data to be detected.
When the preliminary matching result corresponding to the data to be detected in the previous frame or the next frame is a correct match (positive class), the preliminary matching result of the data to be detected is a matching error (negative class), and the obstacle difference information corresponding to the preliminary matching result of that previous or next frame is greater than a preset deviation distance threshold or greater than a preset intersection ratio threshold, the preliminary matching result of the reference detection data, a correct match (positive class), may be used as the target matching result of the data to be detected. In contrast, if the preliminary matching result corresponding to the data to be detected in the previous frame or the next frame is a matching error (negative class), the preliminary matching result of the data to be detected is a correct match (positive class), and the obstacle difference information corresponding to the preliminary matching result of that previous or next frame is smaller than a preset deviation distance threshold or smaller than a preset intersection ratio threshold, the preliminary matching result of the reference detection data, a matching error (negative class), may be used as the target matching result of the data to be detected.
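The first of the rules above, where the frames immediately before and after both disagree with the current frame, reduces to a small comparison. A hedged sketch with assumed names:

```python
def correct_with_both_neighbors(prev_result, next_result, frame_result):
    """If the frames immediately before and after both carry the opposite
    preliminary matching result (True = correct match), adopt their result
    as the target matching result; otherwise keep the frame's own result."""
    if prev_result == next_result and prev_result != frame_result:
        return prev_result
    return frame_result
```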
Optionally, determining the detection performance index of the detector based on the target matching result corresponding to the multi-frame data to be detected includes: and constructing a receiver operation characteristic curve based on target matching results corresponding to the multi-frame data to be detected, and determining the detection performance index of the detector based on the area under the receiver operation characteristic curve.
In this embodiment, the horizontal axis of the receiver operating characteristic curve (ROC curve) may be the false positive rate FPR (False Positive Rate), the proportion of the data whose target matching result is the negative class for which the preliminary matching result is the positive class (false positives), where FPR = FP/(FP+TN); the vertical axis may be the true positive rate TPR (True Positive Rate), the proportion of the data whose target matching result is the positive class for which the preliminary matching result is also the positive class (true positives), where TPR = TP/(TP+FN). The detection performance index of the detector may be an index value determined from the area under the ROC curve (AUC, area under the curve). The larger the TPR, the larger the proportion of positive preliminary matching results among the data whose target matching result is the positive class, and the better the detector performance can be considered. Thus, the larger the AUC of the ROC curve, the higher the detection performance index of the detector.
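AUC can be computed without drawing the curve by ranking: it equals the fraction of (positive, negative) pairs in which the positive sample receives the higher score. A minimal sketch under the assumption, not stated in this embodiment, that each frame carries a detector confidence score; ties are not handled:

```python
def roc_auc(labels, scores):
    """labels: 1 if the target matching result is the positive class, else 0.
    scores: detector confidence per frame."""
    pos = sum(labels)
    neg = len(labels) - pos
    if pos == 0 or neg == 0:
        return 0.0  # AUC is undefined without both classes present
    ranked = sorted(zip(scores, labels), reverse=True)  # descending score
    seen_pos = 0
    correctly_ordered_pairs = 0
    for _, label in ranked:
        if label == 1:
            seen_pos += 1
        else:
            # every positive already seen outranks this negative sample
            correctly_ordered_pairs += seen_pos
    return correctly_ordered_pairs / (pos * neg)
```

A perfect detector, scoring every positive frame above every negative one, yields an AUC of 1.0; a detector with inverted scores yields 0.0.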
According to the technical scheme of this embodiment, the multi-frame data to be detected are obtained and respectively input into the target detector to obtain the obstacle detection information corresponding to each frame of data to be detected output by the detector, providing a data source for multi-frame detection. Then, for each frame of data to be detected, the standard obstacle information corresponding to the data to be detected is acquired, the obstacle difference information is determined based on the obstacle detection information and the standard obstacle information, and the preliminary matching result corresponding to the data to be detected is determined based on the obstacle difference information. The reference detection data corresponding to the data to be detected is further acquired, and the target matching result of the data to be detected is determined based on the preliminary matching result corresponding to the reference detection data and the preliminary matching result corresponding to the data to be detected, where the reference detection data is the data to be detected adjacent to the data to be detected. By referring to the detection results of adjacent frames, the matching results of the data to be detected are optimized and corrected. Finally, an ROC curve is determined based on the target matching results corresponding to the multi-frame data to be detected, and the detection performance index of the detector is determined from the area under the ROC curve (AUC). This solves the technical problem that a performance detection mode based on single-frame detection wastes accuracy and cannot accurately reflect the detector's detection performance in real scenes.
The method has the advantages of improving the rationality of the performance detection result of the detector, effectively guiding the performance improvement direction of the detector and providing more accurate detection data basis for vehicle driving path planning.
Example two
Fig. 2-3 are logic block diagrams of a method for determining the detection performance of a detector according to the second embodiment of the present invention. On the basis of the foregoing embodiments, this embodiment specifically describes how the statistical results of the true positive class (TP), false positive class (FP), false negative class (FN), and true negative class (TN) corresponding to the target matching results are determined, taking the area intersection ratio between the obstacle detection area and the obstacle standard area as an example. A region intersection ratio IOU greater than 0.6 may be considered a correct match, and a region intersection ratio IOU less than or equal to 0.6 may be considered a matching error. The frames immediately before and after the data to be detected correspond to the reference detection data of the foregoing embodiments when the first preset number of frames is 1; the 2 continuous frames immediately before or after correspond to the to-be-detected data of the continuous second preset number of frames when the second preset number is 2; and the 3 continuous frames correspond to the reference detection data of the foregoing embodiments, located in the same scene segment as the data to be detected with adjacent acquisition times, when the preset number of continuous frames is 3. Reference is made to the description of this embodiment for the specific implementation. Technical features that are the same as or similar to those of the foregoing embodiments are not described again here.
As shown in fig. 2, the logic block diagram includes:
1. and acquiring multi-frame data to be detected, and respectively inputting the multi-frame data to be detected into the target detector to obtain obstacle detection information.
2. And determining a preliminary matching result corresponding to the data to be detected for each frame of the data to be detected, and acquiring reference detection data corresponding to the data to be detected.
When the data to be detected is determined to be a match error,
3. when the front frame and the rear frame of the data to be detected are both correctly matched, the data to be detected is regarded as being correctly matched (FN);
4. when there is a correct match of three consecutive frames, the other frames are considered to be correctly matched (FN);
5. when the matching of the immediately preceding two continuous frames or the immediately following two frames is correct, the data iou to be detected is more than 0.3, and the matching is considered to be correct (FN);
6. when the immediately preceding frame or the immediately following frame is larger than 0.9 (extremely high matching degree), the data to be detected is regarded as matching correct (FN);
7. the above case is not satisfied and a match error (TN) is determined.
8. And determining the detection performance index of the detector based on the target matching result corresponding to the multi-frame data to be detected.
As shown in fig. 3, the logic block diagram includes:
1. and acquiring multi-frame data to be detected, and respectively inputting the multi-frame data to be detected into the target detector to obtain obstacle detection information.
2. And determining a preliminary matching result corresponding to the data to be detected for each frame of the data to be detected, and acquiring reference detection data corresponding to the data to be detected.
When the data to be detected is determined to match correctly,
3. when the front frame and the rear frame of the data to be detected are in error matching, the data to be detected is regarded as matching error (FP);
4. in the case of a consecutive three frame error match, the other frames are considered to be matching errors (FP);
5. when the immediately preceding two frames or the immediately following two frames are in matching error, the data iou to be detected is less than or equal to 0.3 and is regarded as matching error (FP);
6. when the immediately preceding or following frame is less than 0.1, the data to be detected is regarded as a match error (FP);
7. the above case is not satisfied, and the matching is determined to be correct (TP).
8. And determining the detection performance index of the detector based on the target matching result corresponding to the multi-frame data to be detected.
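The two flows of Fig. 2 and Fig. 3 can be summarized by mapping each frame's preliminary matching result and its continuity-corrected target result to a confusion-matrix label. A hedged sketch with illustrative names:

```python
def confusion_label(preliminary_correct, target_correct):
    """Map the (preliminary, target) matching results of one frame to a
    TP/FP/FN/TN label, following the Fig. 2 and Fig. 3 logic."""
    if preliminary_correct and target_correct:
        return "TP"  # correct match confirmed by neighboring frames
    if preliminary_correct and not target_correct:
        return "FP"  # correct match overruled as an error (Fig. 3)
    if not preliminary_correct and target_correct:
        return "FN"  # matching error overruled as correct (Fig. 2)
    return "TN"      # matching error confirmed
```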
According to the technical scheme of this embodiment, the target matching result of the data to be detected is determined based on the preliminary matching result corresponding to the reference detection data and the preliminary matching result corresponding to the data to be detected, and the statistical results of the true positive class (TP), false positive class (FP), false negative class (FN), and true negative class (TN) corresponding to the target matching results are obtained. By judging matching results for temporal continuity, the target detection problem is converted into a classification problem, providing a data basis for drawing the PR curve and the ROC curve and for calculating the area under the curve to determine the detection performance index of the detector. This has the beneficial effects of improving the rationality of the detector performance detection result, effectively guiding the direction of detector performance improvement, and providing a more accurate detection data basis for vehicle driving path planning.
Example III
Fig. 4 is a schematic structural diagram of a device for determining detection performance of a detector according to a third embodiment of the present invention. As shown in fig. 4, the apparatus includes: the system comprises a data acquisition module 410 to be detected, a preliminary matching result determining module 420, a target matching result determining module 430 and a detection performance index determining module 440.
Wherein the to-be-detected data obtaining module 410 is configured to obtain multiple frames of to-be-detected data, respectively input the multiple frames of to-be-detected data into the target detector, and obtain the obstacle detection information corresponding to each frame of to-be-detected data output by the detector;
the preliminary matching result determining module 420 is configured to obtain, for each frame of data to be detected, standard obstacle information corresponding to the data to be detected, determine obstacle difference information based on the obstacle detection information and the standard obstacle information, and determine a preliminary matching result corresponding to the data to be detected based on the obstacle difference information;
the target matching result determining module 430 is configured to obtain reference detection data corresponding to the data to be detected, and determine a target matching result of the data to be detected based on the preliminary matching result corresponding to the reference detection data and the preliminary matching result corresponding to the data to be detected, where the reference detection data is data to be detected adjacent to the data to be detected;
The detection performance index determining module 440 is configured to determine a detection performance index of the detector based on a target matching result corresponding to the multi-frame data to be detected.
On the basis of the technical scheme, optionally, the obstacle detection information is an obstacle detection position, and the standard obstacle information is an obstacle standard position; the preliminary matching result determining module 420 is specifically configured to calculate a deviation distance between the obstacle detection position and the obstacle standard position, and determine a preliminary matching result corresponding to the data to be detected based on the deviation distance and a preset distance threshold.
On the basis of the technical scheme, optionally, the obstacle detection information is an obstacle detection area, and the standard obstacle information is an obstacle standard area; the preliminary matching result determining module 420 is specifically configured to calculate an area intersection ratio between the obstacle detection area and the obstacle standard area, and determine a preliminary matching result corresponding to the data to be detected based on the area intersection ratio and a preset intersection ratio threshold.
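The area intersection ratio and the threshold-based preliminary match can be sketched as follows. The (x1, y1, x2, y2) box format is an assumption, and the default 0.6 threshold is taken from the example in the second embodiment:

```python
def area_iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def preliminary_match(detected_box, standard_box, iou_threshold=0.6):
    """Preliminary matching result: True (correct match) if the area
    intersection ratio exceeds the preset intersection ratio threshold."""
    return area_iou(detected_box, standard_box) > iou_threshold
```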
On the basis of the above technical solution, optionally, the target matching result determining module 430 is specifically configured to perform at least one of the following operations:
under the condition that the number of frames of the reference detection data which are different from the preliminary matching result corresponding to the data to be detected reaches the preset number of frames, taking the preliminary matching result of the reference detection data as a target matching result of the data to be detected;
Under the condition that the reference detection data is adjacent to the data to be detected and the preliminary matching result corresponding to the data to be detected is different, if the obstacle difference information corresponding to the preliminary matching result of the data to be detected meets a first preset condition, the preliminary matching result of the reference detection data is used as a target matching result of the data to be detected;
and taking the preliminary matching result of the reference detection data as a target matching result of the data to be detected under the condition that the obstacle difference information corresponding to the preliminary matching result of the reference detection data meets a second preset condition.
On the basis of the above technical solution, optionally, the target matching result determining module 430 is specifically configured to perform at least one of the following operations:
acquiring to-be-detected data of a first preset number of frames positioned before and/or after the to-be-detected data, which are adjacent to the acquisition time of the to-be-detected data, as reference detection data corresponding to the to-be-detected data;
and acquiring, from the multi-frame to-be-detected data, the to-be-detected data of a second preset number of frames that are continuous in acquisition time, as the reference detection data corresponding to the data to be detected.
Based on the above technical solution, optionally, the target matching result determining module 430 is specifically configured to perform at least one of the following operations:
Under the condition that the preliminary matching results corresponding to the data to be detected of the previous frame of the data to be detected and the data to be detected of the next frame are different from the preliminary matching results of the data to be detected, taking the preliminary matching results corresponding to the data to be detected of the previous frame or the data to be detected of the next frame as target matching results of the data to be detected;
under the condition that preliminary matching results of the to-be-detected data of a continuous second preset number of frames which are adjacent to and before or after the to-be-detected data are different from the preliminary matching results of the to-be-detected data, if obstacle difference information corresponding to the preliminary matching results of the to-be-detected data is in a preset first information value interval, taking the preliminary matching results of the reference detection data as target matching results of the to-be-detected data;
and taking the preliminary matching result of the reference detection data as the target matching result of the data to be detected when the preliminary matching result of the data to be detected in the previous frame of the data to be detected or the preliminary matching result of the data to be detected in the next frame of the data to be detected is different from the preliminary matching result of the data to be detected and the obstacle difference information corresponding to the preliminary matching result of the data to be detected in the previous frame of the data to be detected or the preliminary matching result of the data to be detected in the next frame of the data to be detected is in a preset second information value interval.
Based on the above technical solution, optionally, the detection performance index determining module 440 is specifically configured to construct a receiver operation characteristic curve based on a target matching result corresponding to the multi-frame data to be detected, and determine the detection performance index of the detector based on an area under the receiver operation characteristic curve.
The device for determining the detection performance of the detector provided by the embodiment of the invention can execute the method for determining the detection performance of the detector provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the execution method.
Example IV
Fig. 5 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic equipment may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 5, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the respective methods and processes described above, for example, a method of determining the detection performance of the detector.
In some embodiments, the method of determining the detector detection performance may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as the storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into the RAM 13 and executed by the processor 11, one or more steps of the above-described method of determining the detection performance of the detector may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the method of determining the detector detection performance in any other suitable way (e.g. by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs, the one or more computer programs may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special purpose or general-purpose programmable processor, that may receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer-readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer-readable medium may be a machine-readable signal medium. More specific examples of a machine-readable storage medium include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber, a portable Compact Disc Read-Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), a middleware component (e.g., an application server), a front-end component (e.g., a user computer with a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include Local Area Networks (LANs), Wide Area Networks (WANs), blockchain networks, and the Internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the drawbacks of difficult management and weak service scalability found in traditional physical hosts and Virtual Private Server (VPS) services.
It should be appreciated that the various flows shown above may be used with steps reordered, added, or deleted. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved; the present invention is not limited in this respect.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.
Claims (10)
1. A method for determining detection performance of a detector, comprising:
obtaining multi-frame data to be detected, and respectively inputting the multi-frame data to be detected into a target detector to obtain obstacle detection information corresponding to each frame of data to be detected output by the detector;
for each frame of the data to be detected, acquiring standard obstacle information corresponding to the data to be detected, determining obstacle difference information based on the obstacle detection information and the standard obstacle information, and determining a preliminary matching result corresponding to the data to be detected based on the obstacle difference information;
acquiring reference detection data corresponding to the data to be detected, and determining a target matching result of the data to be detected based on a preliminary matching result corresponding to the reference detection data and the preliminary matching result corresponding to the data to be detected, wherein the reference detection data is to-be-detected data adjacent to the data to be detected;
and determining the detection performance index of the detector based on the target matching result corresponding to the multi-frame data to be detected.
2. The method of claim 1, wherein the obstacle detection information is an obstacle detection location and the standard obstacle information is an obstacle standard location;
the determining obstacle difference information based on the obstacle detection information and the standard obstacle information, and determining a preliminary matching result corresponding to the data to be detected based on the obstacle difference information includes:
and calculating the deviation distance between the obstacle detection position and the obstacle standard position, and determining a preliminary matching result corresponding to the data to be detected based on the deviation distance and a preset distance threshold.
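The position-based preliminary matching above can be sketched as follows. The 2-D positions, Euclidean deviation metric, and 1.0-unit threshold are illustrative assumptions; the claim fixes none of these choices.

```python
import math

DISTANCE_THRESHOLD = 1.0  # illustrative value; the claim leaves the preset threshold open


def deviation_distance(detected_pos, standard_pos):
    """Euclidean deviation between the detected and standard obstacle positions."""
    return math.dist(detected_pos, standard_pos)


def preliminary_match_by_distance(detected_pos, standard_pos,
                                  threshold=DISTANCE_THRESHOLD):
    """Preliminary match: the detection lies within `threshold` of the annotation."""
    return deviation_distance(detected_pos, standard_pos) <= threshold
```

Under these assumptions, a detection 0.5 units from its annotation would count as a preliminary match, while one 5 units away would not.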
3. The method of claim 1, wherein the obstacle detection information is an obstacle detection area and the standard obstacle information is an obstacle standard area;
the determining obstacle difference information based on the obstacle detection information and the standard obstacle information, and determining a preliminary matching result corresponding to the data to be detected based on the obstacle difference information includes:
and calculating the region intersection ratio between the obstacle detection region and the obstacle standard region, and determining a preliminary matching result corresponding to the data to be detected based on the region intersection ratio and a preset intersection ratio threshold value.
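A minimal sketch of the region-based preliminary matching, assuming axis-aligned boxes in (x1, y1, x2, y2) form and an illustrative 0.5 intersection-over-union threshold (the claim fixes neither the box representation nor the threshold value):

```python
IOU_THRESHOLD = 0.5  # illustrative value; the claim leaves the preset threshold open


def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def preliminary_match_by_iou(detected_box, standard_box, threshold=IOU_THRESHOLD):
    """Preliminary match: detection and annotation overlap by at least `threshold`."""
    return iou(detected_box, standard_box) >= threshold
```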
4. The method of claim 1, wherein the determining the target match result for the data to be detected based on the preliminary match result for the reference detection data and the preliminary match result for the data to be detected comprises at least one of:
under the condition that the number of frames of the reference detection data whose preliminary matching results differ from the preliminary matching result corresponding to the data to be detected reaches the preset number of frames, taking the preliminary matching result of the reference detection data as the target matching result of the data to be detected;
under the condition that the reference detection data is adjacent to the data to be detected and the preliminary matching result corresponding to the data to be detected is different, if the obstacle difference information corresponding to the preliminary matching result of the data to be detected meets a first preset condition, taking the preliminary matching result of the reference detection data as a target matching result of the data to be detected;
and taking the preliminary matching result of the reference detection data as the target matching result of the data to be detected under the condition that the obstacle difference information corresponding to the preliminary matching result of the reference detection data meets a second preset condition.
5. The method of claim 4, wherein the acquiring the reference detection data corresponding to the data to be detected comprises at least one of:
acquiring to-be-detected data of a first preset number of frames which are positioned before and/or after the to-be-detected data and are adjacent to the acquisition time of the to-be-detected data, and taking the to-be-detected data as reference detection data corresponding to the to-be-detected data;
and acquiring the to-be-detected data of a second preset number of frames which are continuous in acquisition time in the multi-frame to-be-detected data.
6. The method of claim 5, wherein the determining the target match result for the data to be detected based on the preliminary match result for the reference detection data and the preliminary match result for the data to be detected comprises at least one of:
under the condition that the preliminary matching results corresponding to the data to be detected in the previous frame of the data to be detected and the preliminary matching results corresponding to the data to be detected in the next frame of the data to be detected are different from the preliminary matching results of the data to be detected, taking the preliminary matching results corresponding to the data to be detected in the previous frame or the data to be detected in the next frame as target matching results of the data to be detected;
under the condition that preliminary matching results of the to-be-detected data of a continuous second preset number of frames which are positioned immediately before or after the to-be-detected data are different from the preliminary matching results of the to-be-detected data, if the obstacle difference information corresponding to the preliminary matching results of the to-be-detected data is in a preset first information value interval, taking the preliminary matching results of the reference detection data as target matching results of the to-be-detected data;
and taking the preliminary matching result of the reference detection data as a target matching result of the data to be detected when the preliminary matching result of the data to be detected in the previous frame of the data to be detected or the preliminary matching result of the data to be detected in the next frame is different from the preliminary matching result of the data to be detected, and the obstacle difference information corresponding to the preliminary matching result of the data to be detected in the previous frame of the data to be detected or the preliminary matching result of the data to be detected in the next frame is in a preset second information value interval.
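The neighbour-based correction in claims 4–6 can be sketched as a per-frame smoothing pass over the sequence of preliminary matching results. The rule shown (adopt the neighbours' result only when the immediately preceding and following frames agree with each other and disagree with the current frame) is one illustrative instance of the claimed conditions, not the only one the claims cover.

```python
def correct_with_neighbors(prelim_results):
    """Smooth single-frame flips in a sequence of boolean preliminary match results.

    Reads from the original sequence so corrections do not cascade; boundary
    frames, which lack a neighbour on one side, are left unchanged.
    """
    target = list(prelim_results)
    for i in range(1, len(prelim_results) - 1):
        prev_r, next_r = prelim_results[i - 1], prelim_results[i + 1]
        if prev_r == next_r and prev_r != prelim_results[i]:
            target[i] = prev_r  # isolated disagreement: adopt the neighbours' result
    return target
```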
7. The method according to claim 1, wherein the determining the detection performance index of the detector based on the target matching result corresponding to the plurality of frames of the data to be detected includes:
and constructing a receiver operating characteristic curve based on target matching results corresponding to the multiple frames of data to be detected, and determining a detection performance index of the detector based on the area under the receiver operating characteristic curve.
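The area under the receiver operating characteristic curve in claim 7 can be computed without explicitly tracing the curve, via the rank-sum (Mann–Whitney) identity. Pairing each detection's confidence score with its corrected target matching label is an assumption here; the claim does not spell out how the curve is parameterised.

```python
def roc_auc(scores, labels):
    """AUC as the probability that a correct detection outscores an incorrect one.

    `scores` are detector confidences; `labels` are the corrected target
    matching results (True = detection matched ground truth). Ties count half.
    """
    pos = [s for s, ok in zip(scores, labels) if ok]
    neg = [s for s, ok in zip(scores, labels) if not ok]
    if not pos or not neg:
        return float("nan")  # AUC is undefined without both classes present
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A detector that ranks every correct detection above every incorrect one scores 1.0; uninformative confidences yield roughly 0.5.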
8. A detector detection performance determining apparatus, comprising:
a to-be-detected data acquisition module for acquiring a plurality of frames of to-be-detected data and respectively inputting the plurality of frames of to-be-detected data into a target detector to obtain obstacle detection information corresponding to each frame of to-be-detected data output by the detector;
the preliminary matching result determining module is used for obtaining standard obstacle information corresponding to the data to be detected for each frame of the data to be detected, determining obstacle difference information based on the obstacle detection information and the standard obstacle information, and determining a preliminary matching result corresponding to the data to be detected based on the obstacle difference information;
the target matching result determining module is used for obtaining reference detection data corresponding to the data to be detected, and determining a target matching result of the data to be detected based on a preliminary matching result corresponding to the reference detection data and a preliminary matching result corresponding to the data to be detected, wherein the reference detection data is the data to be detected adjacent to the data to be detected;
and a detection performance index determining module, used for determining the detection performance index of the detector based on target matching results corresponding to the multi-frame data to be detected.
9. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of determining the detection performance of the detector of any one of claims 1-7.
10. A computer readable storage medium storing computer instructions for causing a processor to perform the method of determining the detection performance of the detector of any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311277179.0A CN117349063A (en) | 2023-09-28 | 2023-09-28 | Method, device, equipment and storage medium for determining detection performance of detector |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117349063A true CN117349063A (en) | 2024-01-05 |
Family
ID=89360527
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311277179.0A Pending CN117349063A (en) | 2023-09-28 | 2023-09-28 | Method, device, equipment and storage medium for determining detection performance of detector |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117349063A (en) |
- 2023-09-28: CN application CN202311277179.0A filed, published as CN117349063A; status: Pending
Legal Events
Date | Code | Title | Description
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||