CN112799091A - Algorithm evaluation method, device and storage medium - Google Patents


Publication number
CN112799091A
Authority
CN
China
Prior art keywords
vehicle
point cloud
target
algorithm
cloud data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110120361.XA
Other languages
Chinese (zh)
Inventor
陈伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Imotion Automotive Technology Suzhou Co Ltd
Original Assignee
Imotion Automotive Technology Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Imotion Automotive Technology Suzhou Co Ltd filed Critical Imotion Automotive Technology Suzhou Co Ltd
Priority to CN202110120361.XA
Publication of CN112799091A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/80 Geometric correction
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The application relates to an algorithm evaluation method, device and storage medium, belonging to the field of computer technology. The method comprises: obtaining a vehicle-related prediction result of a target vehicle produced by an algorithm to be evaluated; acquiring point cloud data collected by a lidar mounted on the target vehicle; calculating a vehicle-related real result of the target vehicle based on the point cloud data; and comparing the vehicle-related prediction result with the vehicle-related real result to obtain an evaluation result for the algorithm to be evaluated. This addresses the problem that if a vehicle-related algorithm is put into use directly and turns out to be inaccurate, the safety of automated driving may suffer; evaluating the algorithm first improves that safety. Because the data acquisition attitude of the lidar ensures that the amount of point cloud data exceeds a preset threshold, more laser beams reach the target, which improves the accuracy of the real-value calculation and therefore the accuracy of the evaluation result.

Description

Algorithm evaluation method, device and storage medium
[ technical field ]
The application relates to an algorithm evaluation method, an algorithm evaluation device and a storage medium, and belongs to the technical field of computers.
[ background of the invention ]
With the development of automated driving technology, vehicles typically integrate a variety of algorithms to realize automated driving functions. For example: a target detection algorithm is integrated in the vehicle to detect targets; and/or a ranging algorithm is integrated in the vehicle to measure the distance between a target and the current vehicle.
In order to ensure the safety of autonomous driving, an algorithm integrated in the vehicle needs to be evaluated before the vehicle is put into use.
[ summary of the invention ]
The application provides an algorithm evaluation method, device and storage medium, which address the problem that if a vehicle-related algorithm is put into use directly and turns out to be inaccurate, the safety of automated driving may suffer. The application provides the following technical solution:
in a first aspect, an algorithm evaluation method is provided, the method comprising:
obtaining a vehicle-related prediction result of a target vehicle produced by an algorithm to be evaluated;
acquiring point cloud data collected by a lidar on the target vehicle, the data acquisition attitude of the lidar ensuring that the amount of point cloud data exceeds a preset threshold;
calculating a vehicle-related real result of the target vehicle based on the point cloud data, the preset threshold ensuring that the error between the vehicle-related real result and the actual result is below an error threshold;
and comparing the vehicle-related prediction result with the vehicle-related real result to obtain an evaluation result for the algorithm to be evaluated.
Optionally, the algorithm to be evaluated calculates the predicted distance of another target relative to the target vehicle, and the vehicle-related prediction result comprises the predicted distance; the data acquisition attitude includes the field angle and/or angular resolution of the lidar.
Optionally, an image sensor is further installed on the target vehicle, and an image acquisition range of the image sensor includes an area where the other target is located; the calculating of vehicle-related real results of the target vehicle based on the point cloud data comprises:
projecting the point cloud data to a sensing image acquired by the image sensor;
acquiring a target detection result of the sensing image, wherein the target detection result is used for indicating the positions of other targets in the sensing image;
screening the point cloud data based on the target detection result to obtain candidate point cloud data;
calculating a true distance between the candidate target and the target vehicle based on the candidate point cloud data, the vehicle-related true result including the true distance.
Optionally, the other targets comprise vehicles located forward in the direction of travel of the target vehicle, the candidate point cloud data being located at vehicle tails of the other targets.
Optionally, screening the point cloud data based on the target detection result to obtain candidate point cloud data includes:
for the target bounding box indicated by the target detection result, determining as candidate point cloud data the point cloud data that lies inside the target bounding box at a distance less than or equal to a preset distance from the bottom edge of the box;
and/or,
for the target bounding box indicated by the target detection result, calculating a point cloud normal vector for each point cloud datum inside the box, and eliminating the point cloud data whose normal vector makes an angle with the ground within a preset range, to obtain the candidate point cloud data.
Optionally, calculating the true distance between the candidate target and the target vehicle based on the candidate point cloud data comprises:
clustering the candidate point cloud data;
performing histogram statistics on the clustered point cloud data to obtain the point cloud set containing the largest number of points;
calculating the true distance between the candidate target and the target vehicle using that largest point cloud set.
Optionally, obtaining the vehicle-related prediction result of the target vehicle produced by the algorithm to be evaluated includes:
acquiring a target image including the other target;
and calculating the predicted distance between other targets in the target image and the target vehicle by using a monocular distance measurement algorithm to obtain the vehicle-related prediction result, wherein the algorithm to be evaluated comprises the monocular distance measurement algorithm.
Optionally, the algorithm to be evaluated detects whether other targets are near the target vehicle, and the vehicle-related prediction result comprises a predicted detection result; the data acquisition attitude includes the field angle and/or angular resolution of the lidar.
In a second aspect, an algorithm evaluation device is provided, the device comprising a processor and a memory; the memory stores therein a program that is loaded and executed by the processor to implement the algorithm evaluation method provided in the first aspect.
In a third aspect, a computer-readable storage medium is provided, in which a program is stored, which when executed by a processor is configured to implement the algorithm evaluation method provided in the first aspect.
The beneficial effects of this application are: a vehicle-related prediction result of the target vehicle produced by the algorithm to be evaluated is obtained; point cloud data collected by a lidar on the target vehicle is acquired, the data acquisition attitude of the lidar ensuring that the amount of point cloud data exceeds a preset threshold; a vehicle-related real result of the target vehicle is calculated from the point cloud data, the preset threshold ensuring that the error between the vehicle-related real result and the actual result is below an error threshold; and the vehicle-related prediction result is compared with the vehicle-related real result to obtain an evaluation result for the algorithm to be evaluated. This addresses the problem that if a vehicle-related algorithm is put into use directly and turns out to be inaccurate, the safety of automated driving may suffer. Because the real value is computed from lidar point cloud data and then used to assess the accuracy of the algorithm to be evaluated, the algorithm can be evaluated before deployment, improving the safety of automated driving.
In addition, because the data acquisition attitude of the lidar ensures that the amount of point cloud data exceeds the preset threshold, more laser beams reach the target, which improves the accuracy of the real-value calculation and therefore the accuracy of the evaluation result.
In addition, the nearest position on the target is extracted by clustering and histogram statistics and used as the distance true value, improving the accuracy of the true value.
The foregoing is only an overview of the technical solutions of the present application. To make them clearer and implementable according to the content of this description, preferred embodiments of the application are described in detail below with reference to the accompanying drawings.
[ description of the drawings ]
FIG. 1 is a flow chart of an algorithm evaluation method provided by one embodiment of the present application;
FIG. 2 is a schematic illustration of the laser beams of a low-beam-count lidar reaching a leading vehicle, provided by one embodiment of the present application;
FIG. 3 is a schematic diagram of an installation of a lidar and an image sensor provided by an embodiment of the present application;
FIG. 4 is a schematic illustration of a sensed image provided by one embodiment of the present application;
FIG. 5 is a schematic diagram of coordinate transformation provided by one embodiment of the present application;
FIG. 6 is a schematic diagram of point cloud data after mapping into a sensing image, provided by an embodiment of the present application;
FIG. 7 is a schematic diagram of point cloud data screening provided by an embodiment of the present application;
FIG. 8 is a block diagram of an algorithm evaluation device provided in one embodiment of the present application;
fig. 9 is a block diagram of an algorithm evaluation device according to an embodiment of the present application.
[ detailed description of the embodiments ]
Embodiments of the present application are described in detail below in conjunction with the accompanying drawings and examples. The following examples illustrate the application but do not limit its scope.
First, several terms referred to in the present application will be described.
Automated driving (self-driving): an intelligent vehicle that realizes driverless operation through a computer system.
Lidar (Light Detection and Ranging): a radar that uses laser light as its radiation source, consisting of a transmitter, antenna, receiver, tracking mount, and information processing unit.
Point Cloud: the lidar emits many laser beams, and the collection of points reflected from object surfaces is a point cloud; each point comprises three-dimensional coordinates (XYZ, the position relative to the lidar) and the laser reflection intensity at that point.
Monocular Ranging: capturing video with a single camera, locating the object to be measured in the image, and measuring its distance.
Distortion Correction: correcting the pincushion and barrel distortion of a lens using the camera's intrinsic matrix.
Clustering (Clustering): is an analytical process that groups a collection of physical or abstract objects into classes that are composed of objects that are similar to each other.
Histogram (Histogram): is a statistical report graph, and the condition of data distribution is represented by a series of vertical stripes or line segments with unequal heights.
Angular resolution: refers to the resolving power of the imaging system or a component of the system.
Optionally, the execution subject of each embodiment is an electronic device with computing capability. The electronic device may be a terminal or a server; the terminal may be a vehicle-mounted computer, mobile phone, desktop computer, notebook computer, tablet computer, and the like. This embodiment does not limit the type of terminal or electronic device.
In this embodiment, the electronic device is communicatively connected to a lidar sensor and an image sensor mounted on the target vehicle. In practice, the target vehicle may also carry other types of sensors; this embodiment does not limit the types of sensors installed on the target vehicle. The electronic device may be the on-board computer of the target vehicle or a device independent of it; this embodiment does not limit the relationship between the electronic device and the target vehicle.
Fig. 1 is a flowchart of an algorithm evaluation method according to an embodiment of the present application. The method at least comprises the following steps:
Step 101: obtain a vehicle-related prediction result of the target vehicle produced by the algorithm to be evaluated.
In this embodiment, the algorithm to be evaluated can produce the vehicle-related prediction result without using the point cloud data collected by the lidar; alternatively, the vehicle-related prediction result may itself be computed from point cloud data.
Optionally, the algorithm to be evaluated includes, but is not limited to, at least one of the following:
First: the algorithm to be evaluated calculates the predicted distance of another target relative to the target vehicle, and the vehicle-related prediction result comprises that predicted distance.
In a first implementation, obtaining the vehicle-related prediction result of the target vehicle produced by the algorithm to be evaluated includes: acquiring a target image containing the other target; and calculating the predicted distance between the other target in the target image and the target vehicle using a monocular ranging algorithm, yielding the vehicle-related prediction result. Here the algorithm to be evaluated includes the monocular ranging algorithm.
Optionally, the target image is acquired by an image sensor or acquired by a vehicle event data recorder, and the source of the target image is not limited in this embodiment.
Other objects may be vehicles, pedestrians, curbs, etc., and the present embodiment is not limited to the types of other objects.
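The patent does not spell out the internals of the monocular ranging algorithm under evaluation. A common ground-plane formulation, sketched below under the assumptions of a level road, a calibrated pinhole camera, and illustrative focal-length and mounting-height values (none of which come from the patent), recovers forward distance from the pixel row of the target's ground contact point:

```python
def monocular_ground_plane_distance(y_bottom_px, fy, cy, camera_height_m):
    """Estimate forward distance from the pixel row of a target's ground
    contact point, assuming a level road and a pinhole camera model."""
    dy = y_bottom_px - cy  # pixels below the principal point
    if dy <= 0:
        raise ValueError("contact point must lie below the horizon")
    return fy * camera_height_m / dy

# Illustrative values: fy = 1000 px, principal point row cy = 540,
# camera mounted 1.5 m above the road.
d = monocular_ground_plane_distance(615.0, fy=1000.0, cy=540.0,
                                    camera_height_m=1.5)
# d is 20.0 m
```

A real evaluation would take `fy` and `cy` from the intrinsic calibration of the image sensor described in this embodiment.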
Or, obtaining a vehicle-related prediction result of the target vehicle obtained by the algorithm to be evaluated, including: acquiring first positioning results of first positioning components installed on other targets; acquiring a second positioning result of a second positioning component installed on the target vehicle; and determining the predicted distance between other targets and the target vehicle according to the distance between the first positioning result and the second positioning result to obtain a vehicle-related predicted result, wherein the algorithm to be evaluated comprises a positioning algorithm of a positioning component.
Of course, in other embodiments the algorithm to be evaluated may be another algorithm, such as an acoustic ranging algorithm; this embodiment does not limit the type of algorithm to be evaluated.
Second: the algorithm to be evaluated detects whether other targets are present near the target vehicle; in this case the vehicle-related prediction result comprises a predicted detection result.
In a second implementation manner, obtaining a vehicle-related prediction result of a target vehicle obtained by an algorithm to be evaluated includes: acquiring a target image including other targets; and carrying out target detection on the target image by using a target detection algorithm to obtain a prediction detection result. The algorithm to be evaluated comprises a target detection algorithm.
Alternatively, the target detection algorithm may be the Single Shot MultiBox Detector (SSD), You Only Look Once (YOLO), YOLOv2, and the like; this embodiment does not limit the algorithm type of the target detection algorithm to be evaluated.
Other objects may be vehicles, pedestrians, curbs, etc., and the present embodiment is not limited to the types of other objects.
Step 102: acquire point cloud data collected by the lidar on the target vehicle; the data acquisition attitude of the lidar ensures that the amount of point cloud data exceeds a preset threshold.
The data acquisition attitude includes the field angle and/or angular resolution of the lidar.
When a conventional lidar with few beams (for example, a 16-beam lidar) collects point cloud data, the vertical angular resolution of most 16-beam lidars is 2°, so the point cloud is sparse in the vertical direction, and the interval scanned by each laser line grows with distance. If the lidar is mounted on the roof at a height of 1.5 m, the spacing between adjacent laser lines beyond 20 m exceeds 7 m, and many of the target point clouds to be collected are easily missed, as shown in (a) of FIG. 2. If the 16-beam lidar is instead mounted directly at the front of the vehicle, the road gradient still leaves the point cloud data insufficiently dense in the vertical direction, as shown in (b) of FIG. 2.
In this embodiment, given that lidar resolution and field angle cannot both be maximized, a lidar with a small vertical field angle but high vertical resolution (such as a 16-beam hybrid solid-state lidar) is custom-built and mounted on the vehicle to collect the point cloud data. Optionally, the acquisition range of the lidar includes but is not limited to: directly ahead of the target vehicle, to its left, to its right, and/or behind it.
In one example, the data acquisition attitude in this embodiment is a vertical field angle of −4° to +2° with a vertical angular resolution of 0.33°. As shown in (c) of FIG. 2, compared with the conventional lidar installation (a 16-beam mechanical radar with a vertical field angle of −15° to +15° and a vertical angular resolution of 2°), this data acquisition attitude places more laser beams on the target object, so more point cloud data can be collected and the accuracy of the true-value calculation improves.
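As a rough geometric check, on an upright target squarely facing the sensor the gap between adjacent laser lines is about range × tan(angular resolution); for beams that graze the road at shallow depression angles, the gap between successive ground intersections is far larger, which is why a roof mount loses so much of a distant target. A minimal sketch comparing the conventional 2° resolution with the customized 0.33° resolution at 20 m (values taken from the example above):

```python
import math

def spacing_on_target(range_m, angular_resolution_deg):
    """Vertical gap between adjacent laser lines on an upright target
    at the given range (simple pinhole geometry)."""
    return range_m * math.tan(math.radians(angular_resolution_deg))

coarse = spacing_on_target(20.0, 2.0)   # conventional 16-beam mechanical lidar
fine = spacing_on_target(20.0, 0.33)    # customized high-resolution unit
# coarse is roughly 0.70 m per line; fine is roughly 0.12 m per line
```

With roughly six times more lines crossing the same target, the tail surface used for the true-value calculation receives correspondingly more returns.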
Step 103: calculate a vehicle-related real result of the target vehicle based on the point cloud data; the preset threshold ensures that the error between the vehicle-related real result and the actual result is below an error threshold.
Alternatively, the vehicle-related real result may be calculated using only point cloud data; alternatively, the vehicle related real result may be calculated by using the point cloud data in combination with the sensing data obtained by other sensors, and the calculation method of the vehicle related real result is not limited in this embodiment.
Taking the first algorithm to be evaluated from step 101 as an example, the vehicle-related real result includes the real distance corresponding to the predicted distance.
In one example, an image sensor is also mounted on the target vehicle, and its image capture range covers the area where the other target is located; see the mounting schematic of the image sensor and lidar in FIG. 3. In FIG. 3, the lidar 31 is mounted directly at the front of the vehicle (at the license plate) with its field angle pointing horizontally forward, and the image sensor (a monocular camera) 32 is mounted at the center of the front windshield, also pointing horizontally forward. The vertical field angle of the image sensor is larger than that of the lidar. In FIG. 3 the ranging algorithm is applied to a road target ahead, so the lidar correspondingly samples that target. Calculating the vehicle-related real result of the target vehicle based on the point cloud data comprises the following steps:
and step 1031, projecting the point cloud data to a sensing image acquired by the image sensor.
Before this step, to ensure the accuracy of the real-distance calculation, the image sensor is first intrinsically calibrated, and the image sensor and lidar are jointly calibrated to obtain the extrinsic matrix from the lidar coordinate system to the camera coordinate system. The lidar and image sensor are then time-synchronized by timestamp alignment, and paired three-dimensional point clouds and sensing images are received in real time.
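The patent does not detail the timestamp-alignment step. One minimal sketch pairs each point cloud with the nearest image in time and discards pairs whose timestamps differ by more than a tolerance (the 50 ms default is an assumption, not a value from the patent):

```python
def pair_by_timestamp(cloud_stamps, image_stamps, max_dt=0.05):
    """Pair each point cloud with the nearest image in time, discarding
    pairs whose timestamps differ by more than max_dt seconds."""
    pairs = []
    for i, tc in enumerate(cloud_stamps):
        j = min(range(len(image_stamps)),
                key=lambda k: abs(image_stamps[k] - tc))
        if abs(image_stamps[j] - tc) <= max_dt:
            pairs.append((i, j))
    return pairs

pairs = pair_by_timestamp([0.00, 0.10, 0.20], [0.01, 0.12, 0.35])
# clouds 0 and 1 find images within 50 ms; cloud 2 has no match and is dropped
```

Hardware triggering or interpolation would tighten this further, but nearest-neighbor pairing is a common baseline.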
Optionally, after a sensing image is acquired, the electronic device corrects its pincushion and barrel distortion using the intrinsic matrix obtained from intrinsic calibration. Referring to FIG. 4, view (a) shows an undistorted sensing image; view (b) shows a sensing image with pincushion distortion; view (c) shows a sensing image with barrel distortion.
Referring to FIG. 5, the electronic device projects the point cloud data onto the sensing image acquired by the image sensor as follows: the point cloud data is converted, using the extrinsic matrix, from the coordinate system whose origin is the lidar to the coordinate system whose origin is the image sensor (both are three-dimensional coordinate systems); the data in the image-sensor coordinate system is then converted into the pixel coordinate system of the sensing image, which is two-dimensional.
After the point cloud is mapped into the sensing image, the index information of each point is recorded; FIG. 6 is a schematic diagram of the point cloud data after mapping into the sensing image.
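The two coordinate conversions above can be sketched with homogeneous coordinates. Here `T_cam_lidar` stands for the jointly calibrated extrinsic matrix and `K` for the camera intrinsic matrix; the toy values are illustrative, not from the patent:

```python
import numpy as np

def project_points(points_lidar, T_cam_lidar, K):
    """Project Nx3 lidar points into pixel coordinates.
    T_cam_lidar: 4x4 extrinsic matrix (lidar frame -> camera frame).
    K: 3x3 intrinsic matrix. Returns pixel coordinates and a mask of
    points in front of the camera; point indices are preserved."""
    n = points_lidar.shape[0]
    homo = np.hstack([points_lidar, np.ones((n, 1))])  # Nx4 homogeneous
    cam = (T_cam_lidar @ homo.T).T[:, :3]              # Nx3 in camera frame
    in_front = cam[:, 2] > 0
    uvw = (K @ cam.T).T                                # Nx3 image-plane coords
    px = uvw[:, :2] / uvw[:, 2:3]                      # perspective divide
    return px, in_front

# Toy check: identity extrinsics, f = 500 px, principal point (320, 240).
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
pts = np.array([[0.0, 0.0, 10.0], [1.0, 0.0, 10.0]])
px, mask = project_points(pts, np.eye(4), K)
# the point on the optical axis lands on the principal point (320, 240)
```

Keeping the returned mask alongside the pixel array plays the role of the recorded index information: each projected pixel can be traced back to its original 3D point.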
Step 1032: obtain a target detection result of the sensing image, the target detection result indicating the positions of other targets in the sensing image.
After the sensing image is acquired, step 1031 may be executed before, after, or simultaneously with step 1032; this embodiment does not limit the execution order of steps 1031 and 1032.
The target detection result of the sensing image may be calculated by the current electronic device using a stored target detection algorithm, or sent by another device, or manually marked, and the embodiment does not limit the manner of obtaining the target detection result.
Optionally, the target detection result may also indicate the type of other targets or other attribute information, and the content of the target detection result is not limited in this embodiment.
Optionally, the target detection result may indicate the position of another target by a circumscribed rectangular (bounding) box, or alternatively by coordinates; this embodiment does not limit how the target detection result indicates the position of another target.
Step 1033: screen the point cloud data based on the target detection result to obtain candidate point cloud data.
The point cloud data is filtered for screening out redundant point cloud data, i.e., point cloud data that does not contribute to calculating the true distance.
Take the case where the other target is a vehicle ahead of the target vehicle in its direction of travel; the candidate point cloud data then lies on the rear of that vehicle, as shown in FIG. 7, i.e., point cloud data not on the vehicle rear is screened out. Screening the point cloud data based on the target detection result to obtain candidate point cloud data then includes: for the target bounding box indicated by the target detection result, determining as candidate point cloud data the points that lie inside the box at a distance less than or equal to a preset distance from its bottom edge; and/or, for the target bounding box indicated by the target detection result, computing a normal vector for each point inside the box and eliminating the points whose normal vector makes an angle with the ground within a preset range, yielding the candidate point cloud data.
Illustratively, the preset range covers points whose normal vector lies within plus or minus 10 degrees of the ground (the Z = 0 plane); that is, the laser points of the candidate box close to the ground and the points on the roof and trunk lid are rejected.
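A minimal sketch of both screening rules. The translated phrasing of the normal-vector test is ambiguous, so this sketch rejects points whose unit normals are within 10° of vertical, i.e. roughly horizontal surfaces such as the ground, roof, and trunk lid, which matches the surfaces the text says are discarded; the pixel and angle thresholds are assumptions:

```python
import numpy as np

def screen_points(px, normals, bbox, max_rise_px=30, max_tilt_deg=10):
    """Keep points whose projection lies inside the target bounding box and
    near its bottom edge, then drop points on roughly horizontal surfaces.
    bbox = (u_min, v_min, u_max, v_max); thresholds are illustrative."""
    u_min, v_min, u_max, v_max = bbox
    in_box = ((px[:, 0] >= u_min) & (px[:, 0] <= u_max) &
              (px[:, 1] >= v_min) & (px[:, 1] <= v_max))
    near_bottom = (v_max - px[:, 1]) <= max_rise_px
    # |n_z| close to 1 means the surface is nearly parallel to the ground
    nz = np.abs(normals[:, 2]) / np.linalg.norm(normals, axis=1)
    horizontal_surface = nz >= np.cos(np.radians(max_tilt_deg))
    return in_box & near_bottom & ~horizontal_surface

px = np.array([[100.0, 195.0],   # in box, near bottom, vertical surface
               [100.0, 195.0],   # same pixel but upward-pointing normal
               [100.0, 120.0]])  # in box but far above the bottom edge
normals = np.array([[0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [0.0, 1.0, 0.0]])
mask = screen_points(px, normals, bbox=(80, 100, 160, 200))
# only the first point survives both filters
```

Only points that pass both filters go on to the distance calculation in step 1034.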
Step 1034: calculate the real distance between the candidate target and the target vehicle based on the candidate point cloud data; the vehicle-related real result includes the real distance.
Take again the case where the other target is a vehicle ahead in the direction of travel. Calculating the real distance between the candidate target and the target vehicle based on the candidate point cloud data includes: clustering the candidate point cloud data; performing histogram statistics on the clustered points to obtain the point cloud set containing the most points; and calculating the real distance between the candidate target and the target vehicle from that largest point cloud set.
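A minimal stand-in for the cluster-then-histogram step above, collapsing clustering into histogram peak-picking over the planar range of the candidate points; the 0.2 m bin width and the mean-of-peak reduction are assumed choices, not values from the patent:

```python
import numpy as np

def true_distance(candidate_points, bin_width_m=0.2):
    """Histogram the planar range of the candidate points and take the most
    populated bin as the tail surface, returning its mean range."""
    ranges = np.linalg.norm(candidate_points[:, :2], axis=1)
    edges = np.arange(ranges.min(), ranges.max() + 2 * bin_width_m,
                      bin_width_m)
    counts, edges = np.histogram(ranges, bins=edges)
    k = int(np.argmax(counts))
    in_peak = (ranges >= edges[k]) & (ranges < edges[k + 1])
    return float(ranges[in_peak].mean())

# Three returns on the tail at roughly 20 m plus one stray farther point.
pts = np.array([[20.0, 0.0, 1.0], [20.1, 0.5, 1.0],
                [19.95, -0.4, 1.0], [22.0, 0.0, 1.0]])
d = true_distance(pts)  # close to 20 m; the stray 22 m return is ignored
```

Because the most populated bin corresponds to the dense tail surface, isolated returns from farther structure do not bias the true value.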
Taking the second algorithm to be evaluated from step 101 as an example, the vehicle-related real result includes the real detection result corresponding to the predicted detection result. In this case, calculating the vehicle-related real result of the target vehicle based on the point cloud data includes: performing target detection on the point cloud data with a three-dimensional target detection algorithm to obtain the real detection result.
Optionally, the three-dimensional target detection algorithm includes but is not limited to: LaserNet, BirdNet, the single-stage deep convolutional neural network LMNet, and the like; this embodiment does not limit the type of the three-dimensional target detection algorithm.
It should be added that this embodiment only describes the algorithm to be evaluated performing target detection and/or ranging as examples; in practical implementations the algorithm to be evaluated may be another algorithm, which is not enumerated here.
Step 104, comparing the vehicle-related prediction result with the vehicle-related real result to obtain an evaluation result of the algorithm to be evaluated.
Taking an algorithm to be evaluated that calculates a predicted distance as an example, when comparing the vehicle-related prediction result with the vehicle-related real result, the evaluated range may be divided into a plurality of distance intervals, such as 0-20 meters, 20-40 meters, and 40-80 meters, and the error between the predicted distance and the real distance may be counted separately within each interval to obtain the evaluation result.
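A hedged sketch of the per-interval error statistic described above (the interval edges follow the 0-20/20-40/40-80 m example; using mean absolute error as the reported metric is an assumption, as the patent does not name a specific error measure):

```python
import numpy as np

def interval_errors(pred, truth, edges=(0, 20, 40, 80)):
    """Bucket each sample by its true distance and report the mean
    absolute error per interval, e.g. 0-20 m, 20-40 m, 40-80 m."""
    pred, truth = np.asarray(pred, float), np.asarray(truth, float)
    report = {}
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (truth >= lo) & (truth < hi)
        if mask.any():
            report[f"{lo}-{hi}m"] = float(np.mean(np.abs(pred[mask] - truth[mask])))
    return report
```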
Optionally, the evaluation result may be presented in the form of an evaluation report.
In summary, the algorithm evaluation method provided by this embodiment obtains the vehicle-related prediction result of the target vehicle produced by the algorithm to be evaluated; acquires point cloud data collected by a lidar on the target vehicle, the data acquisition attitude of the lidar being such that the number of point cloud data is greater than a preset threshold; calculates the vehicle-related real result of the target vehicle based on the point cloud data, the preset threshold being such that the error between the vehicle-related real result and the actual result is smaller than an error threshold; and compares the vehicle-related prediction result with the vehicle-related real result to obtain the evaluation result of the algorithm to be evaluated. This solves the problem that if a vehicle-related algorithm is put directly into use and turns out to be inaccurate, the safety of automatic driving may be compromised: the real value is calculated from point cloud data collected by the lidar, the accuracy of the algorithm to be evaluated is measured against that real value, and the algorithm can therefore be evaluated before use, improving the safety of automatic driving.
In addition, because the data acquisition attitude of the lidar makes the number of point cloud data greater than the preset threshold, the target is struck by more lidar beams, which improves the accuracy of the real-value calculation and, in turn, the accuracy of the evaluation result.
In addition, the position nearest to the target is extracted by clustering and histogram statistics and used as the distance truth value, which improves the accuracy of the truth value.
Fig. 8 is a block diagram of an algorithm evaluation device according to an embodiment of the present application. The device at least comprises the following modules:
a result prediction module 810, configured to obtain the vehicle-related prediction result of the target vehicle produced by the algorithm to be evaluated;
a point cloud obtaining module 820, configured to obtain point cloud data collected by a laser radar on the target vehicle; the data acquisition attitude of the laser radar enables the number of the point cloud data to be larger than a preset threshold value;
a true value calculation module 830, configured to calculate a vehicle-related real result of the target vehicle based on the point cloud data, where the preset threshold is such that an error between the vehicle-related real result and an actual result is smaller than an error threshold;
and the algorithm evaluation module 840 is used for comparing the vehicle-related prediction result with the vehicle-related real result to obtain an evaluation result of the algorithm to be evaluated.
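As an illustrative skeleton only (none of these names appear in the patent), the four modules can be wired together roughly like this, with the prediction and truth computations injected as callables so that any algorithm under test can be plugged in:

```python
class AlgorithmEvaluator:
    """Minimal skeleton mirroring modules 810-840: result prediction,
    truth computation from the point cloud, and comparison."""

    def __init__(self, predict_fn, truth_fn):
        self.predict_fn = predict_fn   # result prediction module (810)
        self.truth_fn = truth_fn       # true value calculation module (830)

    def evaluate(self, frame, point_cloud):
        predicted = self.predict_fn(frame)    # vehicle-related prediction
        actual = self.truth_fn(point_cloud)   # lidar-derived real result
        return abs(predicted - actual)        # algorithm evaluation module (840)
```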
For relevant details reference is made to the above-described method embodiments.
It should be noted that: in the above embodiment, when performing algorithm evaluation, the algorithm evaluation device is only illustrated by dividing the functional modules, and in practical applications, the function distribution may be completed by different functional modules according to needs, that is, the internal structure of the algorithm evaluation device is divided into different functional modules to complete all or part of the functions described above. In addition, the algorithm evaluation device and the algorithm evaluation method provided by the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiments in detail and are not described herein again.
Fig. 9 is a block diagram of an algorithm evaluation device according to an embodiment of the present application. The apparatus comprises at least a processor 901 and a memory 902.
Processor 901 may include one or more processing cores, such as a 4-core or 8-core processor. The processor 901 may be implemented in at least one hardware form among DSP (Digital Signal Processing), FPGA (Field-Programmable Gate Array), and PLA (Programmable Logic Array). The processor 901 may also include a main processor and a coprocessor: the main processor, also called the CPU (Central Processing Unit), processes data in the awake state, while the coprocessor is a low-power processor that processes data in the standby state. In some embodiments, the processor 901 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content to be displayed on the display screen. In some embodiments, the processor 901 may further include an AI (Artificial Intelligence) processor for handling computing operations related to machine learning.
Memory 902 may include one or more computer-readable storage media, which may be non-transitory. The memory 902 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 902 is used to store at least one instruction for execution by processor 901 to implement the algorithm evaluation methods provided by the method embodiments herein.
In some embodiments, the algorithm evaluating device may further include: a peripheral interface and at least one peripheral. The processor 901, memory 902 and peripheral interfaces may be connected by buses or signal lines. Each peripheral may be connected to the peripheral interface via a bus, signal line, or circuit board. Illustratively, peripheral devices include, but are not limited to: radio frequency circuit, touch display screen, audio circuit, power supply, etc.
Of course, the algorithm evaluating device may also include fewer or more components, which is not limited by the embodiment.
Optionally, the present application further provides a computer-readable storage medium, in which a program is stored, and the program is loaded and executed by a processor to implement the algorithm evaluation method of the above method embodiment.
Optionally, the present application further provides a computer product, which includes a computer-readable storage medium, in which a program is stored, and the program is loaded and executed by a processor to implement the algorithm evaluation method of the above-mentioned method embodiment.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not every possible combination of these technical features is described; nevertheless, any combination of them that contains no contradiction should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.
The above is only one specific embodiment of the present application, and any other modifications based on the concept of the present application are considered as the protection scope of the present application.

Claims (10)

1. An algorithm evaluation method, characterized in that the method comprises:
obtaining a vehicle-related prediction result of a target vehicle obtained by an algorithm to be evaluated;
acquiring point cloud data acquired by a laser radar on the target vehicle; the data acquisition attitude of the laser radar enables the number of the point cloud data to be larger than a preset threshold value;
calculating a vehicle-related real result of the target vehicle based on the point cloud data, wherein the preset threshold value enables an error between the vehicle-related real result and an actual result to be smaller than an error threshold value;
and comparing the vehicle-related prediction result with the vehicle-related real result to obtain an evaluation result of the algorithm to be evaluated.
2. The method according to claim 1, wherein the algorithm to be evaluated is used to calculate predicted distances of other targets relative to the target vehicle, the vehicle-related prediction result including the predicted distance; and the data acquisition attitude includes a field angle and/or an angular resolution of the laser radar.
3. The method according to claim 2, wherein the target vehicle is further provided with an image sensor, and the image acquisition range of the image sensor comprises the area of the other target; the calculating of vehicle-related real results of the target vehicle based on the point cloud data comprises:
projecting the point cloud data onto a sensing image acquired by the image sensor;
acquiring a target detection result of the sensing image, wherein the target detection result is used for indicating the positions of other targets in the sensing image;
screening the point cloud data based on the target detection result to obtain candidate point cloud data;
calculating a true distance between the candidate target and the target vehicle based on the candidate point cloud data, the vehicle-related true result including the true distance.
4. The method of claim 3, wherein the other targets comprise vehicles located forward in a direction of travel of the target vehicle, the candidate point cloud data being located at vehicle tails of the other targets.
5. The method of claim 4, wherein the screening the point cloud data based on the target detection result to obtain candidate point cloud data comprises:
for the target external frame indicated by the target detection result, determining, as the candidate point cloud data, point cloud data that is located within the target external frame and whose distance from the bottom edge of the target external frame is smaller than or equal to a preset distance;
and/or,
calculating a point cloud normal vector of each point cloud data in the target external frame for the target external frame indicated by the target detection result; and eliminating the point cloud data of which the angle between the point cloud normal vector and the ground is within a preset range to obtain the candidate point cloud data.
6. The method of claim 3, wherein said calculating a true distance between the candidate target and the target vehicle based on the candidate point cloud data comprises:
clustering the candidate point cloud data;
performing histogram statistics on the clustered point cloud data to obtain a most populated point cloud set;
calculating the true distance between the candidate target and the target vehicle using the most populated point cloud set.
7. The method according to claim 2, wherein the obtaining the vehicle-related prediction result of the target vehicle by the algorithm to be evaluated comprises:
acquiring a target image including the other target;
and calculating the predicted distance between other targets in the target image and the target vehicle by using a monocular distance measurement algorithm to obtain the vehicle-related prediction result, wherein the algorithm to be evaluated comprises the monocular distance measurement algorithm.
8. The method according to claim 1, wherein the algorithm to be evaluated is used for detecting whether other targets exist in the vicinity of the target vehicle, and the vehicle-related prediction result comprises a predicted detection result; and the data acquisition attitude includes a field angle and/or an angular resolution of the laser radar.
9. An algorithm evaluation device, comprising a processor and a memory; the memory stores therein a program that is loaded and executed by the processor to implement the algorithm evaluation method according to any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that the storage medium has stored therein a program which, when being executed by a processor, is adapted to implement the algorithm evaluation method according to any one of claims 1 to 8.
CN202110120361.XA 2021-01-28 2021-01-28 Algorithm evaluation method, device and storage medium Pending CN112799091A (en)

Publications (1)

Publication Number Publication Date
CN112799091A true CN112799091A (en) 2021-05-14


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114972824A (en) * 2022-06-24 2022-08-30 小米汽车科技有限公司 Rod detection method and device, vehicle and storage medium
WO2023273895A1 (en) * 2021-06-29 2023-01-05 苏州一径科技有限公司 Method for evaluating clustering-based target detection model
CN116413740A (en) * 2023-06-09 2023-07-11 广汽埃安新能源汽车股份有限公司 Laser radar point cloud ground detection method and device

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170003380A1 (en) * 2013-12-31 2017-01-05 Korea University Research And Business Foundation Method for evaluating type of distance measured by laser range finder and method for estimating position of mobile robot by using same
CN108196260A (en) * 2017-12-13 2018-06-22 北京汽车集团有限公司 The test method and device of automatic driving vehicle multi-sensor fusion system
US20180348374A1 (en) * 2017-05-31 2018-12-06 Uber Technologies, Inc. Range-View Lidar-Based Object Detection
CN110175576A (en) * 2019-05-29 2019-08-27 电子科技大学 A kind of driving vehicle visible detection method of combination laser point cloud data
CN110244322A (en) * 2019-06-28 2019-09-17 东南大学 Pavement construction robot environment sensory perceptual system and method based on Multiple Source Sensor
CN110411499A (en) * 2019-08-05 2019-11-05 上海汽车集团股份有限公司 The appraisal procedure and assessment system of sensor detection recognition capability
CN110988912A (en) * 2019-12-06 2020-04-10 中国科学院自动化研究所 Road target and distance detection method, system and device for automatic driving vehicle
CN112147632A (en) * 2020-09-23 2020-12-29 中国第一汽车股份有限公司 Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 215123 g2-1901 / 1902 / 2002, No. 88, Jinjihu Avenue, Suzhou Industrial Park, Suzhou City, Jiangsu Province

Applicant after: Zhixing Automotive Technology (Suzhou) Co.,Ltd.

Address before: 215123 g2-1901 / 1902 / 2002, No. 88, Jinjihu Avenue, Suzhou Industrial Park, Suzhou City, Jiangsu Province

Applicant before: IMOTION AUTOMOTIVE TECHNOLOGY (SUZHOU) Co.,Ltd.