CN112581527B - Evaluation method, device, equipment and storage medium for obstacle detection - Google Patents

Evaluation method, device, equipment and storage medium for obstacle detection

Info

Publication number
CN112581527B
CN112581527B (application CN202011453126.6A)
Authority
CN
China
Prior art keywords
obstacle
image
matching
frame
matching pair
Prior art date
Legal status
Active
Application number
CN202011453126.6A
Other languages
Chinese (zh)
Other versions
CN112581527A (en)
Inventor
赵晓健
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202011453126.6A
Publication of CN112581527A
Application granted
Publication of CN112581527B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/5866 Retrieval characterised by using metadata using information manually generated, e.g. tags, keywords, comments, manually generated location and time information

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Library & Information Science (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an evaluation method, device, equipment and storage medium for obstacle detection, relating to the fields of computer vision, automatic driving, intelligent transportation and the like. The specific implementation scheme is as follows: determining a labeling result for multiple frames of images, wherein the labeling result of each frame of image comprises position information of a first obstacle; determining the moment corresponding to each frame of image, and acquiring the obstacle detection result output by the algorithm under evaluation at each moment, wherein the obstacle detection result comprises position information of a second obstacle; and evaluating the obstacle detection results output by the algorithm under evaluation using the labeling result of each frame of image. By evaluating continuous detection results, problems can be found quickly.

Description

Evaluation method, device, equipment and storage medium for obstacle detection
Technical Field
The application relates to the field of image processing, in particular to the fields of computer vision, automatic driving, intelligent transportation and the like.
Background
In an automatic driving scenario, an algorithm fuses the input information of different perception sources and outputs the type, position, speed, and other attributes of obstacles. A planning and decision module then judges, based on these results, whether the vehicle needs to change lanes, brake, etc., and issues the corresponding instructions.
In the related test method, testers evaluate the algorithm manually through on-board road tests in a real vehicle, which introduces a degree of subjectivity. Moreover, when the algorithm is not yet mature, such testing not only degrades the riding experience of the testers but also carries potential safety hazards.
Disclosure of Invention
The application provides an evaluation method and device for obstacle detection, an electronic device, a storage medium, and a computer program product.
According to an aspect of the present application, there is provided an evaluation method of obstacle detection, which may include the steps of:
determining a labeling result of a plurality of frames of images, wherein the labeling result of each frame of image comprises the position information of a first obstacle;
determining the moment corresponding to each frame of image, and acquiring an obstacle detection result output by an algorithm under evaluation at each moment, wherein the obstacle detection result comprises position information of a second obstacle;
and evaluating the obstacle detection result output by the algorithm under evaluation by using the labeling result of each frame of image.
According to another aspect of the present application, there is provided an evaluation device for obstacle detection, the device may include:
the marking result determining module is used for determining marking results of multiple frames of images, wherein the marking results of each frame of image comprise the position information of the first obstacle;
the detection result acquisition module is used for determining the moment corresponding to each frame of image and acquiring an obstacle detection result output by the algorithm under evaluation at each moment, wherein the obstacle detection result comprises position information of a second obstacle;
and the evaluation module is used for evaluating the obstacle detection result output by the algorithm under evaluation by using the labeling result of each frame of image.
In a third aspect, an embodiment of the present application provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the methods provided by any one of the embodiments of the present application.
In a fourth aspect, embodiments of the present application provide a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform a method provided by any one of the embodiments of the present application.
According to another aspect of the present application, there is provided a computer program product comprising computer instructions which, when executed by a processor, implement the method of any of the embodiments of the present application.
Since the algorithm under evaluation can be run in an offline environment, once the images have been labeled, each iterative upgrade of the algorithm can be evaluated against the same labeling information, saving labor and time costs. In addition, because no actual road test is needed, the testing cost is reduced on the one hand, and threats to the safety of testers are minimized on the other. Finally, by evaluating continuous detection results, problems can be found quickly.
It should be understood that the description of this section is not intended to identify key or critical features of the embodiments of the application or to delineate the scope of the application. Other features of the present application will become apparent from the description that follows.
Drawings
The drawings are for better understanding of the present solution and do not constitute a limitation of the present application. Wherein:
FIG. 1 is a flow chart of an assessment method for obstacle detection according to the present application;
FIG. 2 is a schematic illustration of image annotation according to the present application;
FIG. 3 is a schematic illustration of road area division according to the present application;
FIG. 4 is a flow chart of an assessment according to the present application;
FIG. 5 is a flow chart of an assessment according to the present application;
FIG. 6 is a flow chart of a composition matching pair according to the present application;
FIG. 7 is a flow chart of a composition matching pair according to the present application;
FIG. 8 is a flow chart for determining labeling results for multiple frames of images according to the present application;
FIG. 9 is a schematic diagram of an evaluation device for obstacle detection according to the present application;
FIG. 10 is a block diagram of an electronic device for implementing an evaluation method of obstacle detection according to an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present application to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
As shown in fig. 1, in one embodiment, the present application provides an evaluation method for obstacle detection, which may include the following steps:
s101: determining a labeling result of a plurality of frames of images, wherein the labeling result of each frame of image comprises the position information of a first obstacle;
s102: determining the corresponding moment of each frame of image, and acquiring an obstacle detection result output by an evaluation algorithm to be tested at each moment, wherein the obstacle detection result comprises the position information of a second obstacle;
s103: and evaluating the obstacle detection result output by the evaluation algorithm by using the labeling result of each frame of image.
In this embodiment of the present application, the multi-frame images may be 2D visual images acquired by an on-vehicle image acquisition device disposed on the host vehicle. The multiple frames may be successive in time.
An arbitrary frame image is described as an example with reference to fig. 2. Labeling includes determining lane information in the image and location information of the obstacle.
The lane in which the host vehicle is currently traveling may be taken as the main lane. The lane number may be denoted obs_id. That is, in the labeling process, the main lane may be labeled obs_id=0, and the lane lines on its two sides may be labeled obs_id=-10 and obs_id=10, respectively.
The left and right secondary lanes on either side of the main lane may be labeled obs_id=-2 and obs_id=2, respectively. The outer lane line of the left secondary lane may be labeled obs_id=-21, and that of the right secondary lane obs_id=21.
In the current embodiment, vehicles or other static obstacles in these 7 areas (the main lane, the two secondary lanes, and the four lane lines) can be selected for marking.
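By way of illustration, the labeling scheme above can be captured in a small lookup table. The following sketch is illustrative only; the Python data structure and the area names are assumptions, not part of the patent:

```python
# Hypothetical representation of the obs_id labeling scheme described above.
# Keys name the 7 labeled areas; values are the obs_id labels used in annotation.
LANE_LABELS = {
    "main_lane": 0,                  # lane the host vehicle is traveling in
    "main_lane_left_line": -10,
    "main_lane_right_line": 10,
    "left_secondary_lane": -2,
    "right_secondary_lane": 2,
    "left_secondary_lane_line": -21,
    "right_secondary_lane_line": 21,
}
```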
In the embodiment of the present application, an obstacle labeled in each frame of image is a first obstacle. The number of first obstacles labeled in each frame of image may be zero or more. In the case where a plurality of first obstacles are included in the current frame image, one identification may be assigned to each first obstacle. In addition, when the same first obstacle appears in successive frames, its identification is the same in each frame.
For example, the first obstacle may be marked in the image in the form of a detection frame by means of an image recognition technique. Further, a specific point within the detection frame range may be used as the characteristic point of the first obstacle. The feature points are used to represent position information of the first obstacle in the image. The location information may include lanes, coordinates, and the like. The position information of the first obstacle marked in the image may be used as a true value of the first obstacle.
For each frame of image, its moment can be determined. For example, the moment corresponding to the 1st frame image is t_1, and the moment corresponding to the N-th frame image is t_N.
For each moment, the algorithm under evaluation outputs a detection result. The detection result includes the position information of the second obstacle detected at that moment. In the embodiment of the present application, the algorithm under evaluation may be an environment modeling algorithm.
Taking the detection of obstacles ahead of the host vehicle as an example, the algorithm may fuse the detection information of sensors such as the forward wide-angle camera, the forward fisheye camera, the forward-side fisheye cameras, and the millimeter-wave radar, and output the position information of the obstacles ahead of the host vehicle. An obstacle output by the algorithm under evaluation is referred to as a second obstacle. The algorithm can be run in an offline state; that is, from the information detected by each sensor of the host vehicle at each moment, the algorithm can produce a detection result.
By comparing the position information of the first obstacle with that of the second obstacle, the detection result of the algorithm under evaluation can be evaluated. Ideally, the second obstacle and the first obstacle are the same obstacle, so the error between their position information reflects the quality of the algorithm under evaluation.
For example, the position detection precision of the algorithm under evaluation can be assessed against the labeled coordinate information. For another example, the detected speed can be evaluated from the change of the labeled coordinates across the frames. For another example, the jump rate of the identifiers output by the algorithm under evaluation can be evaluated against the labeled obstacle identifications.
In addition, in the embodiment of the present application, the road may be further divided according to the distance between the obstacle and the host vehicle. As shown in fig. 3, the division may be into three categories: near (e.g., 0 to 60 meters from the host vehicle), medium (e.g., 60 to 100 meters), and far (e.g., more than 100 meters). Combining these distance bands with the current lane and the left and right lanes yields a nine-grid region. Different weights are set for different regions, so that regions with higher weights can be evaluated preferentially; alternatively, a stricter evaluation criterion can be applied to regions with higher weights.
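A minimal sketch of this nine-grid division follows. The distance bands come from the example above, while the weight values and function names are illustrative assumptions, since the patent does not prescribe concrete weights:

```python
# Sketch of the nine-grid region division: 3 lanes x 3 distance bands.
def region_of(lane_id: int, distance_m: float) -> tuple:
    """Map an obstacle to (lane, band); lane_id uses the obs_id convention."""
    if distance_m < 60.0:
        band = "near"      # 0-60 m from the host vehicle
    elif distance_m < 100.0:
        band = "medium"    # 60-100 m
    else:
        band = "far"       # beyond 100 m
    return (lane_id, band)

# Illustrative weights: nearer regions in the main lane matter most.
REGION_WEIGHTS = {
    (0, "near"): 3.0,  (0, "medium"): 2.0,  (0, "far"): 1.0,
    (-2, "near"): 2.0, (-2, "medium"): 1.5, (-2, "far"): 1.0,
    (2, "near"): 2.0,  (2, "medium"): 1.5,  (2, "far"): 1.0,
}
```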
Since the algorithm under evaluation can be run in an offline environment, once the images have been labeled, each iterative upgrade of the algorithm can be evaluated against the same labeling information, saving labor and time costs. In addition, because no actual road test is needed, the testing cost is reduced on the one hand, and threats to the safety of testers are minimized on the other. Finally, by evaluating continuous detection results, problems can be found quickly.
As shown in connection with fig. 4, in one embodiment, step S103 may further include the sub-steps of:
s1031: for each moment, under the condition that the first obstacle and the second obstacle meet the matching rule, forming a matching pair by the first obstacle and the second obstacle;
s1032: acquiring the identification of a first obstacle and a second obstacle in a matched pair;
s1033: and determining the jump rate of the identification of the second obstacle output by the evaluation algorithm to be tested by utilizing the identification of the first obstacle and the identification of the second obstacle in the matching pair at each moment.
The matching rules may be the same at each moment. For example, the Euclidean distance between the first obstacle and the second obstacle may be used for matching: if the Euclidean distance is not greater than the corresponding threshold, the first obstacle and the second obstacle may constitute a matching pair. Alternatively, the matching may be performed using another spatial distance between the first obstacle and the second obstacle, or using the overlap ratio of their detection frames.
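As a minimal sketch of the distance-based variant of this rule (the threshold value and the function name are assumptions; the patent only requires that the distance not exceed the corresponding threshold):

```python
import math

MATCH_THRESHOLD_M = 2.5  # hypothetical threshold in metres

def is_match(first_xy, second_xy, threshold=MATCH_THRESHOLD_M) -> bool:
    """A labeled (first) and detected (second) obstacle match when their
    Euclidean distance does not exceed the threshold."""
    return math.dist(first_xy, second_xy) <= threshold
```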
For each moment, the identifications of the first obstacle and the second obstacle in the matching pair may be obtained. The jump rate of the identification of the second obstacle output by the algorithm under evaluation is then determined using the identifications of the first obstacle and the second obstacle in the matching pair at each moment.
For example, at moment 1, the first obstacle in the matching pair is identified as gta_1 and the second obstacle as pd_1. At moment 2, the first obstacle in the matching pair is again identified as gta_1 and the second obstacle as pd_1. If at moment 3 the first obstacle in the matching pair is identified as gta_1 but the second obstacle as pd_2, then the identifier of the second obstacle output by the algorithm under evaluation has jumped at the 3rd moment.
That is, each time the identifier of the second obstacle output by the algorithm under evaluation changes while matching remains successful, one jump is recorded.
The calculation formula of the jump rate (IdRate) can be expressed as:

IdRate = N_id_jmp / N_tp

wherein N_id_jmp represents the number of times the identifier of the second obstacle output by the algorithm under evaluation changes, and N_tp represents the number of successful matches; that is, if there are N moments and matching succeeds at each of them, the number of successful matches is N, and if matching fails at one moment, the number of successful matches is N-1.
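A minimal sketch of this computation follows; the per-moment input layout and the names are assumptions for illustration:

```python
# matched_pairs: one entry per moment for a tracked labeled obstacle, either
# (first_obstacle_id, second_obstacle_id) on a successful match or None.
def id_jump_rate(matched_pairs) -> float:
    n_tp = sum(1 for p in matched_pairs if p is not None)  # successful matches
    n_jmp = 0
    prev_second_id = None
    for pair in matched_pairs:
        if pair is None:
            continue
        _, second_id = pair
        if prev_second_id is not None and second_id != prev_second_id:
            n_jmp += 1  # the detector's identifier changed: record one jump
        prev_second_id = second_id
    return n_jmp / n_tp if n_tp else 0.0

# The example above: pd_1, pd_1, then pd_2 -> one jump over three matches.
assert abs(id_jump_rate([("gta1", "pd1"), ("gta1", "pd1"), ("gta1", "pd2")]) - 1/3) < 1e-9
```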
In addition, the jump rate can also be checked for each of the areas shown in fig. 3. For example, for the i-th region, the jump rate can be expressed as:

IdRate_i = N_id_jmp(reg=reg_i) / N_tp(reg=reg_i)

wherein N_id_jmp(reg=reg_i) represents the number of identifier changes, output by the algorithm under evaluation, for second obstacles appearing in the i-th region, and N_tp(reg=reg_i) represents the number of successful matches in the i-th region.
In the present embodiment, only one first obstacle and one second obstacle are illustrated at each moment. If there are a plurality of first obstacles and a plurality of second obstacles at a moment, the matching rule is the same as when there is only one of each.
Through this scheme, the identifications of the first obstacle and the second obstacle in the matching pairs established at each moment can be used to measure the identifier jump rate of the algorithm under evaluation. The method is suitable for continuous-frame evaluation of fused obstacles in autonomous-driving highway and urban road scenarios, can quickly find and locate problems, and helps the product optimize and iterate.
As shown in connection with fig. 5, in one embodiment, step S103 may further include the sub-steps of:
s1034: for each time instant, determining a number of second obstacles that fail to match the first obstacle composition;
s1035: acquiring the time interval between each moment and the speed of a main vehicle;
s1036: and determining the false detection mileage output by the evaluation algorithm to be tested by using the number of the second obstacles, the time interval and the speed of the main vehicle which are not matched with the first obstacle.
A second obstacle that fails to form a matching pair with a first obstacle reflects a detection error of the algorithm under evaluation. Detection errors include false detections and missed detections.
For example, at the current moment, the first obstacles of each lane may be acquired. Taking the main lane as an example, if a first obstacle exists in the main lane but the algorithm under evaluation does not detect a second obstacle, a missed detection of the algorithm under evaluation can be recorded. If a first obstacle exists in the main lane and the algorithm under evaluation detects a second obstacle, it can be further checked whether the first obstacle and the second obstacle satisfy the matching condition; if they do not, this can be counted as a missed detection, as a false detection, or as one missed detection and one false detection at the same time.
If no first obstacle is present in the main lane but the algorithm under evaluation detects a second obstacle, this can be determined to be a false detection of the algorithm under evaluation.
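A greedy sketch of this per-lane bookkeeping is given below. The greedy pairing and the names are simplifying assumptions; the patent's multi-obstacle case uses Hungarian matching, described later:

```python
def count_errors(labeled, detected, match_fn):
    """Count missed detections (n_fn) and false detections (n_fp) for one
    lane at one moment. labeled/detected are lists of positions."""
    matched_det = set()
    n_fn = 0
    for gt in labeled:
        hit = next((j for j, det in enumerate(detected)
                    if j not in matched_det and match_fn(gt, det)), None)
        if hit is None:
            n_fn += 1                 # labeled obstacle with no matching detection
        else:
            matched_det.add(hit)
    n_fp = len(detected) - len(matched_det)  # detections with no labeled match
    return n_fn, n_fp
```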
In the embodiment of the present application, the time interval between moments can be represented by the detection period of the algorithm under evaluation. The detection mileage is obtained from this interval and the speed of the host vehicle.
The false detection mileage output by the algorithm under evaluation is determined using the number of unmatched second obstacles, the time interval, and the speed of the host vehicle. The false detection mileage (M_fn), i.e., the mileage covered per detection error, can be expressed as:

M_fn = (V_veh × h_em × N) / (N_fn + N_fp)

wherein N_fn represents the number of missed detections; N_fp represents the number of false detections; V_veh represents the speed of the host vehicle; h_em represents the time interval between moments, i.e., the detection period of the algorithm under evaluation; and N is the number of evaluated moments, so that V_veh × h_em × N is the total mileage covered.
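A minimal sketch of this computation under the formulation above (the aggregation into total mileage and the function name are assumptions):

```python
def error_mileage(v_veh_mps: float, h_em_s: float, n_moments: int,
                  n_fn: int, n_fp: int) -> float:
    """Mileage (metres) per detection error; larger is better."""
    total_mileage = v_veh_mps * h_em_s * n_moments  # distance covered by the run
    n_err = n_fn + n_fp
    return total_mileage / n_err if n_err else float("inf")
```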
In addition, the missed detection mileage and the false detection mileage can be counted respectively.
In addition, the false detection mileage can also be computed for each of the areas shown in fig. 3. The specific calculation formula will not be described in detail.
Through this scheme, the false detection mileage of the algorithm under evaluation can be determined from the matching results at each moment and the detection mileage. The method is suitable for continuous-frame evaluation of fused obstacles in autonomous-driving highway and urban road scenarios, can quickly find and locate problems, and helps the product optimize and iterate.
As shown in connection with fig. 6, in one embodiment, step S1031 may further include the sub-steps of:
s10311: for the moment corresponding to the nth frame image, under the condition that n=1, respectively calculating the Euclidean distance between each first obstacle and each second obstacle based on the position information of each first obstacle and the position information of each second obstacle at the moment corresponding to the nth frame image;
s10312: selecting a plurality of candidate matching pairs based on the Euclidean distance so as to minimize the sum of the Euclidean distances satisfied by the plurality of candidate matching pairs;
s10313: and taking the candidate matching pairs with the Euclidean distance smaller than the corresponding threshold value as matching pairs conforming to the matching rule.
The case of n=1 is the first frame image, whose corresponding moment can be denoted t_1.
The position information of each first obstacle labeled in the image at moment t_1 is obtained. In the present embodiment, assume the number of first obstacles is M.
Similarly, the position information of each second obstacle output by the algorithm under evaluation at moment t_1 is obtained. In the present embodiment, assume the number of second obstacles is N.
The Euclidean distance between each first obstacle and each second obstacle is calculated, yielding an M×N Euclidean distance matrix.
In this step, a plurality of candidate matching pairs can be determined from the Euclidean distance matrix using Hungarian matching, such that the sum of the Euclidean distances over the candidate matching pairs is minimized. The number of candidate matching pairs is min(M, N). The Euclidean distance of each candidate matching pair is then compared with the corresponding threshold to obtain a comparison result.
A matching pair whose Euclidean distance is not greater than the corresponding threshold is retained as a target matching pair.
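A sketch of this first-frame step, using the Hungarian implementation in SciPy (the function name and array layout are assumptions):

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def match_first_frame(first_xy, second_xy, threshold):
    """first_xy: (M, 2) labeled positions; second_xy: (N, 2) detected positions.
    Returns target matching pairs as (first_index, second_index) tuples."""
    first_xy = np.asarray(first_xy, dtype=float)
    second_xy = np.asarray(second_xy, dtype=float)
    # M x N matrix of pairwise Euclidean distances
    dist = np.linalg.norm(first_xy[:, None, :] - second_xy[None, :, :], axis=-1)
    rows, cols = linear_sum_assignment(dist)  # Hungarian: minimise summed distance
    # keep only candidate pairs whose distance is within the threshold
    return [(i, j) for i, j in zip(rows, cols) if dist[i, j] <= threshold]
```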
Through this scheme, when there are multiple obstacles, a plurality of candidate matching pairs can be determined using Hungarian matching, and target matching pairs are then screened out of the candidates according to the corresponding threshold, enabling efficient determination of matching pairs.
As shown in connection with fig. 7, in one embodiment, step S1031 may further include the sub-steps of:
s10314: for the moment corresponding to the N-th frame image, under the condition that N is a positive integer larger than 1, acquiring a first obstacle and a second obstacle in a matching pair established at the moment corresponding to the N-1 th frame image;
s10315: under the condition that a first obstacle and a second obstacle in a matching pair established at the moment corresponding to the N-1 frame image appear at the moment corresponding to the N-1 frame image, calculating the Euclidean distance between the first obstacle and the second obstacle in the matching pair established at the moment corresponding to the N-1 frame image;
s10316: and under the condition that the Euclidean distance at the moment corresponding to the N frame image is not greater than the corresponding threshold value, reserving the matching pair, and taking the reserved matching pair as the matching pair conforming to the matching rule.
When N is a positive integer greater than 1, this is the case of a non-first frame image. For a non-first frame image, the corresponding moment is first determined and denoted t_N. Next, the matching pairs established at the moment corresponding to the (N-1)-th frame image are acquired; that moment can be denoted t_N-1.
For example, suppose the matching pair established at moment t_N-1 contains a first obstacle identified as gta_1 and a second obstacle identified as pd_1. It is then necessary to judge whether, at moment t_N, the obstacle identified as gta_1 still exists in the labeled image and the obstacle identified as pd_1 is still output by the algorithm under evaluation.
If both the first obstacle and the second obstacle still exist at moment t_N, the Euclidean distance between the two can be calculated directly. If the calculated Euclidean distance is not greater than the corresponding threshold, the matching pair is retained, and the retained pair is taken as a matching pair established at the corresponding moment (t_N).
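A sketch of this carry-over check follows (the dictionary layout and names are assumptions):

```python
import math

def carry_over_pairs(prev_pairs, labeled_by_id, detected_by_id, threshold):
    """Re-validate, at moment t_N, the matching pairs established at t_N-1.
    prev_pairs maps a labeled obstacle id to its matched detector id;
    labeled_by_id / detected_by_id map ids to positions at t_N."""
    kept = {}
    for gt_id, pd_id in prev_pairs.items():
        gt = labeled_by_id.get(gt_id)   # still labeled at t_N?
        pd = detected_by_id.get(pd_id)  # still output by the algorithm at t_N?
        if gt is not None and pd is not None and math.dist(gt, pd) <= threshold:
            kept[gt_id] = pd_id         # pair remains valid at t_N
    return kept
```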
Through this scheme, for the moment corresponding to a non-first frame image, the matching pairs established at the previous frame's moment can be consulted first, saving the computation of building the Euclidean distance matrix and running Hungarian matching, and improving overall efficiency.
As shown in fig. 8, in one embodiment, step S101 may include the steps of:
s1011: determining a detection frame corresponding to the first obstacle in the image;
s1012: acquiring coordinates of characteristic points of a detection frame in an image;
s1013: converting coordinates of the feature points in the image to a world coordinate system according to preset conversion parameters to obtain coordinates of the feature points in the world coordinate system;
s1014: and correcting coordinates of the characteristic points in a world coordinate system by using radar detection data, and taking a correction result as position information of the first obstacle.
The first obstacle may be marked in the image in the form of a detection frame by means of image recognition techniques. Further, a specific point within the detection frame range may be used as the characteristic point of the first obstacle. For example, the feature point may be the center point of the bottom edge of the detection frame. The feature points are used to represent position information of the first obstacle in the image. The location information may include lanes, coordinates, and the like.
The preset conversion parameters may be the extrinsic parameters of the camera; using them, the coordinates of the feature points in the image are converted into the world coordinate system to obtain the coordinates of the feature points in the world coordinate system.
Radar detection data of the host vehicle are acquired and compared with the coordinates of the feature points in the world coordinate system so as to correct those coordinates. For example, among the multiple detection points of an obstacle detected by the radar, the detection point closest to the feature point's coordinates in the world coordinate system is selected, and the feature point's coordinates in the world coordinate system are replaced with the coordinates of that detection point.
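A sketch of steps S1013-S1014 under simplifying assumptions: a flat ground plane, a precomputed 3x3 image-to-ground homography H derived offline from the camera extrinsics, and illustrative names. None of these specifics are mandated by the patent:

```python
import numpy as np

def pixel_to_world(H, u, v):
    """Project the feature point (u, v), e.g. the bottom-centre of the
    detection frame, through homography H onto the ground plane."""
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]  # (x, y) in world coordinates

def correct_with_radar(xy, radar_points):
    """Replace the projected point with the closest radar detection point."""
    radar_points = np.asarray(radar_points, dtype=float)
    d = np.linalg.norm(radar_points - np.asarray(xy, dtype=float), axis=1)
    return radar_points[np.argmin(d)]
```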
In this scheme, the higher-precision radar is used to correct the labeled data so that it is closer to the true value.
As shown in fig. 9, in one embodiment, the present application provides an evaluation device for obstacle detection, which may include the following components:
the marking result determining module 901 is configured to determine a marking result of a plurality of frames of images, where the marking result of each frame of image includes position information of a first obstacle;
the detection result acquisition module 902 is configured to determine the moment corresponding to each frame of image and acquire the obstacle detection result output by the algorithm under evaluation at each moment, where the obstacle detection result includes position information of the second obstacle;
the evaluation module 903 is configured to evaluate the obstacle detection result output by the algorithm under evaluation by using the labeling result of each frame of image.
In one embodiment, the evaluation module 903 may further include:
the matching pair forming sub-module is used for forming a matching pair of the first obstacle and the second obstacle under the condition that the first obstacle and the second obstacle accord with a matching rule at each moment;
the identification acquisition sub-module is used for acquiring the identification of the first obstacle and the second obstacle in the matched pair;
the identification jump rate determination submodule is used for determining the identification jump rate of the second obstacle output by the evaluation algorithm to be tested by utilizing the identification of the first obstacle and the identification of the second obstacle in the matching pair at each moment.
In one embodiment, the evaluation module 903 may further include:
a number determination submodule for determining, for each moment, the number of second obstacles that fail to form a matching pair with a first obstacle;
the information acquisition sub-module is used for acquiring the time interval between moments and the speed of the host vehicle;
and the false detection mileage determination submodule is used for determining the false detection mileage of the algorithm under evaluation by using the number of unmatched second obstacles, the time interval, and the speed of the host vehicle.
In one embodiment, the matching pair component sub-module may further include:
a euclidean distance calculating unit configured to calculate, for a time corresponding to the nth frame image, a euclidean distance between each first obstacle and each second obstacle based on the position information of each first obstacle and the position information of each second obstacle at the time corresponding to the nth frame image, in the case where n=1;
a candidate matching pair determining unit configured to select a plurality of candidate matching pairs based on the Euclidean distances, such that the sum of the Euclidean distances over the plurality of candidate matching pairs is minimized;
and the matching pair composition execution unit is used for taking the candidate matching pair with the Euclidean distance smaller than the corresponding threshold value as the matching pair conforming to the matching rule.
In one embodiment, the matching pair component sub-module may further include:
a history matching pair obtaining unit, configured to obtain, for a time corresponding to an nth frame image, a first obstacle and a second obstacle in a matching pair established at a time corresponding to an nth-1 frame image, where N is a positive integer greater than 1;
a history matching pair checking unit, configured to calculate, when a first obstacle and a second obstacle in a matching pair established at a time corresponding to an N-1 th frame image appear at a time corresponding to the N-th frame image, a euclidean distance between the first obstacle and the second obstacle in the matching pair established at the time corresponding to the N-1 th frame image at the time corresponding to the N-th frame image;
and the matching pair forming execution unit is used for reserving the matching pair and taking the reserved matching pair as the matching pair conforming to the matching rule under the condition that the Euclidean distance at the moment corresponding to the Nth frame image is not greater than the corresponding threshold value.
In one embodiment, the labeling result determining module 901 may further include:
the detection frame determining submodule is used for determining a detection frame corresponding to the first obstacle in the image;
the coordinate acquisition sub-module is used for acquiring coordinates of the feature points of the detection frame in the image;
the coordinate conversion sub-module is used for converting the coordinates of the feature points in the image into a world coordinate system according to preset conversion parameters to obtain the coordinates of the feature points in the world coordinate system;
and the coordinate correction sub-module is used for correcting the coordinates of the characteristic points in the world coordinate system by utilizing the radar detection data, and taking the correction result as the position information of the first obstacle.
According to embodiments of the present application, there is also provided an electronic device, a readable storage medium and a computer program product.
Fig. 10 shows a block diagram of an electronic device for the evaluation method of obstacle detection according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 10, the electronic device includes: one or more processors 1010, a memory 1020, and interfaces for connecting components, including a high-speed interface and a low-speed interface. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executing within the electronic device, including instructions stored in or on memory to display graphical information of the GUI on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories. Also, multiple electronic devices may be connected, each providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 1010 is illustrated in fig. 10.
Memory 1020 is a non-transitory computer-readable storage medium provided herein. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method for evaluating obstacle detection provided herein. The non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to execute the evaluation method of obstacle detection provided by the present application.
The memory 1020 is used as a non-transitory computer readable storage medium, and may be used to store a non-transitory software program, a non-transitory computer executable program, and modules, such as program instructions/modules (e.g., the labeling result determining module 901, the detection result acquiring module 902, and the evaluation module 903 shown in fig. 9) corresponding to the evaluation method of obstacle detection in the embodiments of the present application. The processor 1010 executes various functional applications of the server and data processing, i.e., implements the evaluation method of obstacle detection in the above-described method embodiment, by running non-transitory software programs, instructions, and modules stored in the memory 1020.
Memory 1020 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for a function; the storage data area may store data created by use of the electronic device according to an evaluation method of obstacle detection, and the like. In addition, memory 1020 may include high-speed random access memory and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 1020 may optionally include memory remotely located with respect to processor 1010, which may be connected to the electronic device of the method of evaluation of obstacle detection via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device of the evaluation method of obstacle detection may further include: an input device 1030 and an output device 1040. The processor 1010, memory 1020, input device 1030, and output device 1040 may be connected by a bus or other means, for example in fig. 10.
The input device 1030 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device, and may be, for example, a touch screen, a keypad, a mouse, a trackpad, a touchpad, a pointing stick, one or more mouse buttons, a trackball, a joystick, or the like. The output device 1040 may include a display device, auxiliary lighting devices (e.g., LEDs), tactile feedback devices (e.g., vibration motors), and the like. The display device may include, but is not limited to, a liquid crystal display (LCD), a light-emitting diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application-specific integrated circuits (ASICs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computing programs (also referred to as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, and is a host product in a cloud computing service system, so as to solve the defects of high management difficulty and weak service expansibility in the traditional physical host and Virtual Private Server (VPS) service. The server may also be a server of a distributed system or a server that incorporates a blockchain.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions disclosed in the present application can be achieved, and are not limited herein.
The above embodiments do not limit the scope of the application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application are intended to be included within the scope of the present application.

Claims (10)

1. An evaluation method for obstacle detection, comprising:
determining a labeling result of a plurality of frames of images, wherein the labeling result of each frame of images comprises position information of a first obstacle;
determining the moment corresponding to each frame of the image, and acquiring an obstacle detection result output by an algorithm under evaluation at each moment, wherein the obstacle detection result comprises position information of a second obstacle;
evaluating the obstacle detection result output by the algorithm under evaluation by using the labeling result of each frame of the image;
wherein the evaluating the obstacle detection result output by the algorithm under evaluation by using the labeling result of each frame of the image comprises the following steps:
for each moment, if the first obstacle and the second obstacle meet a matching rule, forming a matching pair by the first obstacle and the second obstacle;
acquiring the identification of the first obstacle and the second obstacle in the matched pair;
determining the jump rate of the identification of the second obstacle output by the algorithm under evaluation by utilizing the identification of the first obstacle and the identification of the second obstacle in the matching pair at each moment;
wherein for each of the moments, when the first obstacle and the second obstacle meet a matching rule, forming a matching pair of the first obstacle and the second obstacle includes:
for the moment corresponding to the nth frame image, under the condition that n=1, respectively calculating the Euclidean distance between each first obstacle and each second obstacle based on the position information of each first obstacle and the position information of each second obstacle at the moment corresponding to the nth frame image;
selecting a plurality of candidate matching pairs based on the Euclidean distances, such that the sum of the Euclidean distances over the plurality of candidate matching pairs is minimized;
and taking the candidate matching pairs with the Euclidean distance smaller than the corresponding threshold value as matching pairs conforming to the matching rule.
2. The method of claim 1, wherein the evaluating the obstacle detection result output by the algorithm under evaluation using the labeling result of each frame of the image further comprises:
determining, for each of the moments, the number of second obstacles that fail to form a matching pair with the first obstacle;
acquiring the time interval between the moments and the speed of the host vehicle;
and determining the false detection mileage of the algorithm under evaluation by utilizing the number of the second obstacles which are not matched with the first obstacle, the time interval and the speed of the host vehicle.
3. The method of claim 1, wherein for each of the time instants, if the first obstacle and the second obstacle meet a matching rule, grouping the first obstacle and the second obstacle into a matching pair comprises:
for the moment corresponding to the N-th frame image, under the condition that N is a positive integer larger than 1, acquiring a first obstacle and a second obstacle in a matching pair established at the moment corresponding to the N-1 th frame image;
under the condition that a first obstacle and a second obstacle in a matching pair established at the moment corresponding to the (N-1)-th frame image appear at the moment corresponding to the N-th frame image, calculating, at the moment corresponding to the N-th frame image, the Euclidean distance between the first obstacle and the second obstacle in the matching pair established at the moment corresponding to the (N-1)-th frame image;
and under the condition that the Euclidean distance of the moment corresponding to the N frame image is not greater than a corresponding threshold value, reserving the matching pair, and taking the reserved matching pair as the matching pair conforming to a matching rule.
4. The method of claim 1, wherein determining the manner of labeling the results comprises:
determining a detection frame corresponding to the first obstacle in the image;
acquiring coordinates of characteristic points of the detection frame in the image;
converting the coordinates of the feature points in the image to a world coordinate system according to preset conversion parameters to obtain the coordinates of the feature points in the world coordinate system;
and correcting the coordinates of the characteristic points in a world coordinate system by using radar detection data, and taking the correction result as the position information of the first obstacle.
5. An evaluation device for obstacle detection, comprising:
the marking result determining module is used for determining marking results of multiple frames of images, wherein the marking results of each frame of images comprise position information of a first obstacle;
the detection result acquisition module is used for determining the moment corresponding to each frame of the image and acquiring an obstacle detection result output by an algorithm under evaluation at each moment, wherein the obstacle detection result comprises position information of a second obstacle;
the evaluation module is used for evaluating the obstacle detection result output by the algorithm under evaluation by using the labeling result of each frame of the image;
wherein, the evaluation module includes:
a matching pair forming sub-module, configured to form a matching pair from the first obstacle and the second obstacle when the first obstacle and the second obstacle meet a matching rule for each time;
an identification acquisition sub-module for acquiring identifications of the first obstacle and the second obstacle in the matched pair;
the identification jump rate determination submodule is used for determining the jump rate of the identification of the second obstacle output by the algorithm under evaluation by utilizing the identification of the first obstacle and the identification of the second obstacle in the matching pair at each moment;
wherein, the matching pair constitutes a sub-module comprising:
a euclidean distance calculating unit configured to calculate, for a time corresponding to an nth frame image, in a case where n=1, a euclidean distance between each first obstacle and each second obstacle based on position information of each first obstacle and position information of each second obstacle at the time corresponding to the nth frame image;
a candidate matching pair determining unit configured to select a plurality of candidate matching pairs based on the Euclidean distances, such that the sum of the Euclidean distances over the plurality of candidate matching pairs is minimized;
and the matching pair composition execution unit is used for taking the candidate matching pair with the Euclidean distance smaller than the corresponding threshold value as the matching pair conforming to the matching rule.
6. The apparatus of claim 5, wherein the evaluation module comprises:
a number determination submodule for determining, for each of the moments, the number of second obstacles that fail to form a matching pair with the first obstacle;
the information acquisition sub-module is used for acquiring the time interval between the moments and the speed of the host vehicle;
and the false detection mileage determination submodule is used for determining the false detection mileage of the algorithm under evaluation by utilizing the number of the second obstacles which are not matched with the first obstacle, the time interval and the speed of the host vehicle.
7. The apparatus of claim 5, wherein the matched pair comprises sub-modules comprising:
a history matching pair obtaining unit, configured to obtain, for a time corresponding to an nth frame image, a first obstacle and a second obstacle in a matching pair established at a time corresponding to an nth-1 frame image, where N is a positive integer greater than 1;
a history matching pair checking unit, configured to calculate, when a first obstacle and a second obstacle in a matching pair established at the time corresponding to the (N-1)-th frame image appear at the time corresponding to the N-th frame image, the Euclidean distance, at the time corresponding to the N-th frame image, between the first obstacle and the second obstacle in the matching pair established at the time corresponding to the (N-1)-th frame image;
and the matching pair composition execution unit is used for reserving the matching pair and taking the reserved matching pair as the matching pair conforming to a matching rule under the condition that the Euclidean distance of the moment corresponding to the Nth frame image is not greater than a corresponding threshold value.
8. The apparatus of claim 5, wherein the labeling result determination module comprises:
the detection frame determining submodule is used for determining a detection frame corresponding to the first obstacle in the image;
the coordinate acquisition sub-module is used for acquiring coordinates of the characteristic points of the detection frame in the image;
the coordinate conversion sub-module is used for converting the coordinates of the feature points in the image to the world coordinate system according to preset conversion parameters to obtain the coordinates of the feature points in the world coordinate system;
and the coordinate correction sub-module is used for correcting the coordinates of the characteristic points in a world coordinate system by utilizing radar detection data, and taking the correction result as the position information of the first obstacle.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 4.
10. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of any one of claims 1 to 4.
CN202011453126.6A 2020-12-11 2020-12-11 Evaluation method, device, equipment and storage medium for obstacle detection Active CN112581527B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011453126.6A CN112581527B (en) 2020-12-11 2020-12-11 Evaluation method, device, equipment and storage medium for obstacle detection


Publications (2)

Publication Number Publication Date
CN112581527A CN112581527A (en) 2021-03-30
CN112581527B true CN112581527B (en) 2024-02-27

Family

ID=75131332

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011453126.6A Active CN112581527B (en) 2020-12-11 2020-12-11 Evaluation method, device, equipment and storage medium for obstacle detection

Country Status (1)

Country Link
CN (1) CN112581527B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008026999A (en) * 2006-07-18 2008-02-07 Sumitomo Electric Ind Ltd Obstacle detection system and obstacle detection method
CN110940979A (en) * 2019-10-28 2020-03-31 杭州飞步科技有限公司 Obstacle detection method, apparatus, device, and storage medium
CN111046809A (en) * 2019-12-16 2020-04-21 中科院微电子研究所昆山分所 Obstacle detection method, device and equipment and computer readable storage medium
CN111324115A (en) * 2020-01-23 2020-06-23 北京百度网讯科技有限公司 Obstacle position detection fusion method and device, electronic equipment and storage medium
CN111339996A (en) * 2020-03-20 2020-06-26 北京百度网讯科技有限公司 Method, device and equipment for detecting static obstacle and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Obstacle detection method for intelligent vehicles based on information fusion; 陆峰; 徐友春; 李永乐; 王德宇; 谢德胜; Journal of Computer Applications (Issue S2); full text *
Vision-based autonomous localization and obstacle detection method for robots; 丁斗建; 赵晓林; 王长根; 高关根; 寇磊; Journal of Computer Applications (Issue 06); full text *
Remote sensing detection and trafficability evaluation of road obstacles; 康晋洁; 戚浩平; 杨清华; 陈华; Remote Sensing for Land and Resources (Issue 02); full text *

Also Published As

Publication number Publication date
CN112581527A (en) 2021-03-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant