CN114994646A - Laser radar fusion algorithm detection method, device and equipment and readable storage medium - Google Patents

Laser radar fusion algorithm detection method, device and equipment and readable storage medium

Info

Publication number
CN114994646A
CN114994646A (application CN202210513219.6A)
Authority
CN
China
Prior art keywords: detected, target object, fusion algorithm, detection, angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210513219.6A
Other languages
Chinese (zh)
Inventor
张峻荧
苏芮琦
周正
黄波
童建辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiangyang Daan Automobile Test Center Co Ltd
Original Assignee
Xiangyang Daan Automobile Test Center Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xiangyang Daan Automobile Test Center Co Ltd filed Critical Xiangyang Daan Automobile Test Center Co Ltd
Priority to CN202210513219.6A
Publication of CN114994646A
Legal status: Pending

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention provides a laser radar fusion algorithm detection method, device, and equipment, and a readable storage medium. The detection method comprises the following steps: acquiring the detection overlapping area of a plurality of laser radars in a simulation test environment; selecting a target object type to be detected from the identification object ranges of the laser radars; selecting a target object under the target object type to be detected as the target object to be detected; performing traversal detection on the target object to be detected in the detection overlapping area, and outputting a statistical result of the fusion algorithm's identification accuracy for the target object to be detected; and summarizing the statistical results of the identification accuracy for all target objects to be detected, and outputting a summary report of the identification accuracy of the fusion algorithm. By configuring the target object types and target objects to be detected for the radar products of different manufacturers, the method achieves efficient, comprehensive, and automatic detection of the fusion algorithm, and detecting target objects of different types also tests the adaptability of the fusion algorithm.

Description

Laser radar fusion algorithm detection method, device and equipment and readable storage medium
Technical Field
The invention relates to the field of intelligent driving tests, in particular to a laser radar fusion algorithm detection method, a device, equipment and a readable storage medium.
Background
As an active sensor, a laser radar can actively sense objects, is little affected by the external environment, and offers higher reliability and accuracy than a passive sensor, so it is widely applied in the environment sensing systems of intelligent driving vehicles. In the current testing scheme for a laser radar fusion algorithm, however, the test is repeated many times to improve the accuracy of the result, and a single test proceeds as follows: a static obstacle is placed in a test scene, a vehicle with an intelligent driving function drives towards the obstacle, and when the obstacle appears in the radar point cloud and is displayed as an obstacle, a tester obtains and analyzes the result sensed by the fusion algorithm and verifies whether the algorithm's identification of the obstacle is accurate, completing one test. This manual, repetitive process makes the detection inefficient and costly.
Disclosure of Invention
The invention mainly aims to provide a laser radar fusion algorithm detection method, a laser radar fusion algorithm detection device, laser radar fusion algorithm detection equipment and a readable storage medium, and aims to solve the technical problems of low efficiency and high cost of the existing laser radar fusion algorithm detection.
In a first aspect, the present invention provides a laser radar fusion algorithm detection method, including:
acquiring detection overlapping areas of a plurality of laser radars in a simulation test environment;
selecting the type of the target object to be detected from the identification object ranges of the laser radars;
selecting the target object under the type of the target object to be detected as the target object to be detected;
in the detection overlapping area, traversing detection is carried out on a target object to be detected, and a statistical result of the identification accuracy of the target object to be detected by a fusion algorithm is output;
if not all target objects under the target object type to be detected have been detected, selecting a new target object under the target object type to be detected as the target object to be detected, and returning to the step of performing traversal detection on the target object to be detected in the detection overlapping area and outputting a statistical result of the fusion algorithm's identification accuracy for the target object to be detected;
if not all target object types in the identification object ranges of the plurality of laser radars have been detected, selecting a new target object type from the identification object ranges of the plurality of laser radars as the target object type to be detected, and returning to the step of selecting a target object under the target object type to be detected as the target object to be detected;
and summarizing the statistical results of the identification accuracy of the target object to be detected, and outputting a summary report of the identification accuracy of the fusion algorithm.
Optionally, in the detection overlapping region, performing traversal detection on the target object to be detected, and outputting a statistical result of the fusion algorithm on the identification accuracy of the target object to be detected includes:
setting a position to be detected of a target object to be detected in the detection overlapping area;
setting a to-be-detected angle of the to-be-detected target object;
detecting the target object to be detected, and outputting a result of the accuracy rate of the fusion algorithm for identifying the target object to be detected;
if not all angles to be detected of the target object have been detected, adjusting the angle of the target object to be detected, taking the adjusted angle as the new angle to be detected, and returning to the step of detecting the target object to be detected and outputting the result of the fusion algorithm's identification accuracy for the target object to be detected; if all angles have been detected, moving the target object to be detected according to the traversal step length;
if the detection overlapping area has not been fully traversed, taking the position of the moved target object to be detected as the new position to be detected, and returning to the step of setting the angle to be detected of the target object to be detected;
and counting the result of the identification accuracy of each pose of the target object to be detected, and outputting the counting result of the identification accuracy of the target object to be detected by a fusion algorithm.
Optionally, in the detection overlapping area, setting a position to be detected of the target object to be detected includes:
and in the detection overlapping area, setting the position to be detected of the target object to be detected according to the width of the target object to be detected and the width of the detection overlapping area.
Optionally, the detecting the target object to be detected and outputting the result of the accuracy rate of the fusion algorithm for identifying the target object to be detected includes:
acquiring first pose information of the target object to be detected through the simulation test environment;
obtaining second pose information of the target object to be detected through the fusion algorithm;
and comparing the first pose information with the second pose information, and outputting the result of the fusion algorithm's identification accuracy for the target object to be detected.
Optionally, before the step of adjusting the angle of the target object to be detected if not all angles have been detected, taking the adjusted angle as the new angle to be detected and returning to the detection and accuracy-output step, and moving the target object to be detected according to the traversal step length if all angles have been detected, the method further includes:
and setting the traversal step length according to the width of the detection overlapping area, the width of the target object to be detected and the number of the line groups of the plurality of laser radars.
In a second aspect, the present invention further provides a laser radar fusion algorithm detection apparatus, where the laser radar fusion algorithm detection apparatus includes:
the acquisition module is used for acquiring detection overlapping areas of a plurality of laser radars in a simulation test environment;
the first selection module is used for selecting the type of the target object to be detected from the identification object ranges of the laser radars;
the second selection module is used for selecting the target object under the type of the target object to be detected as the target object to be detected;
the detection module is used for performing traversal detection on the target object to be detected in the detection overlapping area and outputting a statistical result of the identification accuracy of the fusion algorithm on the target object to be detected;
a third selection module, configured to, if not all target objects under the target object type to be detected have been detected, select a new target object under the target object type to be detected as the target object to be detected, and return to the step of performing traversal detection on the target object to be detected in the detection overlapping area and outputting a statistical result of the fusion algorithm's identification accuracy for the target object to be detected;
a fourth selection module, configured to, if not all target object types in the identification object ranges of the plurality of laser radars have been detected, select a new target object type from the identification object ranges as the target object type to be detected, and return to the step of selecting a target object under the target object type to be detected as the target object to be detected;
and the output module is used for summarizing the statistical result of the identification accuracy of the target object to be detected and outputting a summary report of the identification accuracy of the fusion algorithm.
Optionally, the detection module includes:
the first setting unit is used for setting the position to be detected of the target object to be detected in the detection overlapping area;
the second setting unit is used for setting the angle to be detected of the target object to be detected;
the detection unit is used for detecting the target object to be detected and outputting the result of the identification accuracy of the fusion algorithm on the target object to be detected;
a third setting unit, configured to adjust the angle of the target object to be detected if not all angles have been detected, take the adjusted angle as the new angle to be detected, and return to the step of detecting the target object to be detected and outputting the result of the fusion algorithm's identification accuracy for the target object to be detected; and to move the target object to be detected according to the traversal step length if all angles have been detected;
a fourth setting unit, configured to, if the detection overlapping area has not been fully traversed, take the position of the moved target object to be detected as the new position to be detected and return to the step of setting the angle to be detected of the target object to be detected;
and the output unit is used for counting the result of the identification accuracy of each pose of the target object to be detected and outputting the counting result of the identification accuracy of the target object to be detected by a fusion algorithm.
Optionally, the detection unit is configured to:
acquiring first pose information of the target object to be detected through the simulation test environment;
obtaining second pose information of the target object to be detected through the fusion algorithm;
and comparing the first pose information with the second pose information, and outputting the result of the fusion algorithm's identification accuracy for the target object to be detected.
In a third aspect, the present invention further provides a lidar fusion algorithm detection apparatus, where the lidar fusion algorithm detection apparatus includes a processor, a memory, and a lidar fusion algorithm detection program stored in the memory and executable by the processor, where the lidar fusion algorithm detection program, when executed by the processor, implements the steps of the lidar fusion algorithm detection method described above.
In a fourth aspect, the present invention further provides a readable storage medium, on which a lidar fusion algorithm detection program is stored, wherein when the lidar fusion algorithm detection program is executed by a processor, the steps of the lidar fusion algorithm detection method as described above are implemented.
In the invention, detection overlapping areas of a plurality of laser radars are obtained in a simulation test environment; a target object type to be detected is selected from the identification object ranges of the laser radars; a target object under that type is selected as the target object to be detected; traversal detection is performed on the target object to be detected in the detection overlapping area, and a statistical result of the fusion algorithm's identification accuracy for the target object is output; if not all target objects under the target object type to be detected have been detected, a new target object under the type is selected as the target object to be detected and the traversal detection step is executed again; if not all target object types in the identification object ranges of the plurality of laser radars have been detected, a new target object type is selected from the identification object ranges as the target object type to be detected and the target object selection step is executed again; finally, the statistical results of the identification accuracy are summarized and a summary report of the fusion algorithm's identification accuracy is output.
The method thus selects a target object type to be detected from the identification object ranges of a plurality of laser radars in a simulation test environment, selects a target object under that type as the target object to be detected, performs traversal detection on it in the detection overlapping area of the laser radars, and outputs a statistical result of the fusion algorithm's identification accuracy for it. The target object, and then the target object type, are replaced in turn until every target object type in the identification object ranges and every target object under each type has been detected; finally the statistical results are summarized and a summary report of the fusion algorithm's identification accuracy is output. Because the target object types and target objects to be detected are configured specifically for the radar products of different manufacturers, this process realizes efficient, comprehensive, and automatic detection of the fusion algorithm of a plurality of laser radars, and detecting target objects of different types also tests the adaptability of the fusion algorithm.
Drawings
FIG. 1 is a schematic diagram of a hardware structure of an embodiment of a laser radar fusion algorithm detection apparatus according to the present invention;
FIG. 2 is a schematic flow chart diagram of an embodiment of a laser radar fusion algorithm detection method of the present invention;
FIG. 3 is a schematic diagram of a detection overlap region according to an embodiment of the laser radar fusion algorithm detection method of the present invention;
FIG. 4 is a detailed flowchart of step S40 in FIG. 2;
fig. 5 is a schematic functional module diagram of an embodiment of the lidar fusion algorithm detection apparatus according to the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the invention.
In a first aspect, an embodiment of the present invention provides a laser radar fusion algorithm detection device.
Referring to fig. 1, fig. 1 is a schematic diagram of a hardware structure of an embodiment of a lidar fusion algorithm detection device of the present invention. In this embodiment of the present invention, the lidar fusion algorithm detection device may include a processor 1001 (e.g., a Central Processing Unit, CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used for realizing connection communication among the components; the user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard); the network interface 1004 may optionally include a standard wired interface or a wireless interface (e.g., a Wi-Fi (Wireless Fidelity) interface); the memory 1005 may be a Random Access Memory (RAM) or a non-volatile memory such as a magnetic disk memory, and may optionally be a storage device independent of the processor 1001. Those skilled in the art will appreciate that the hardware configuration depicted in fig. 1 does not limit the present invention; more or fewer components, combinations of components, or different arrangements of components may be included.
With continued reference to fig. 1, a memory 1005, which is one type of computer storage medium in fig. 1, may include an operating system, a network communication module, a user interface module, and a lidar fusion algorithm detection program. The processor 1001 may call a lidar fusion algorithm detection program stored in the memory 1005, and execute the lidar fusion algorithm detection method provided in the embodiment of the present invention.
In a second aspect, an embodiment of the present invention provides a laser radar fusion algorithm detection method.
In order to more clearly show the laser radar fusion algorithm detection method provided by the embodiment of the present application, an application scenario of the laser radar fusion algorithm detection method provided by the embodiment of the present application is first introduced.
The laser radar fusion algorithm detection method provided by the embodiment of the application is applied to an intelligent driving vehicle that uses a plurality of laser radars and identifies targets through a fusion algorithm. Since the identification accuracy of the fusion algorithm directly affects the driving safety of the vehicle, the identification accuracy and reliability of the laser radar fusion algorithm need to be detected.
In an embodiment, referring to fig. 2, fig. 2 is a schematic flowchart of an embodiment of a lidar fusion algorithm detection method according to the present invention, and as shown in fig. 2, the lidar fusion algorithm detection method includes:
step S10, in the simulation test environment, acquiring detection overlapping areas of a plurality of laser radars.
In this embodiment, the installation poses of a plurality of virtual laser radars in the simulation environment may be configured according to the poses of the corresponding laser radars installed on a real vehicle, so that the fusion algorithm of the plurality of laser radars can be tested. The configured laser radars therefore need to have a detection overlapping area, and the position and area of the detection overlapping area are obtained through the system functions of the simulation environment. The number of laser radars may be two or more. Referring to fig. 3, a schematic diagram of the detection overlapping area of an embodiment of the laser radar fusion algorithm detection method of the present invention, the detection overlapping area of laser radar A and laser radar B is indicated by the shaded portion.
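The overlap acquisition in step S10 can be approximated outside a full simulator by modelling each lidar's 2-D field of view as a circular sector and grid-sampling the points seen by every sensor. This is a minimal sketch; the mount positions, headings, fields of view, and ranges are illustrative assumptions loosely matching the two-radar layout of fig. 3, not values from the patent.

```python
import math

def in_fov(x, y, lidar):
    """True if point (x, y) lies inside a lidar's 2-D field of view,
    modelled as a circular sector (mount point, heading, half-angle, range)."""
    dx, dy = x - lidar["x"], y - lidar["y"]
    if math.hypot(dx, dy) > lidar["range"]:
        return False
    # Signed angle between the lidar heading and the ray to the point.
    diff = (math.atan2(dy, dx) - lidar["heading"] + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= lidar["half_fov"]

def overlap_cells(lidars, extent=20.0, step=0.5):
    """Approximate the detection overlapping area by grid sampling: keep the
    cells seen by every lidar (a stand-in for the simulator's own query)."""
    n = int(2 * extent / step)
    return [(-extent + i * step, -extent + j * step)
            for i in range(n) for j in range(n)
            if all(in_fov(-extent + i * step, -extent + j * step, l)
                   for l in lidars)]

# Illustrative mounting of two forward-facing lidars, as in fig. 3.
lidar_a = {"x": -1.0, "y": 0.0, "heading": 0.0,
           "half_fov": math.radians(60), "range": 15.0}
lidar_b = {"x": 1.0, "y": 0.0, "heading": 0.0,
           "half_fov": math.radians(60), "range": 15.0}
region = overlap_cells([lidar_a, lidar_b])
```

The grid resolution trades accuracy for speed; in a real test harness the simulator's own geometry query would replace this sampling.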
And step S20, selecting the type of the target object to be detected from the identification object ranges of the plurality of laser radars.
In this embodiment, laser radar products of different types from different manufacturers differ in the target object types they are good at identifying and in the corresponding accuracy. For example, the identification range of laser radar A may cover vehicles and pedestrians, while that of laser radar B covers vehicles and traffic signs. A targeted configuration is made according to the product type of each laser radar, and the target object type to be detected is selected from the identification ranges of the plurality of laser radars.
And step S30, selecting the target object under the type of the target object to be detected as the target object to be detected.
In this embodiment, the target object to be detected is selected from the target object type to be detected. For example, the typical target objects under the type "vehicle" may include a car, a passenger car, a tractor, and so on, and the number of typical target objects may be adjusted according to the actual test requirements.
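Steps S20 and S30 amount to a configuration lookup. The sketch below shows one hypothetical encoding; the identification ranges and the typical-target lists beyond the car/passenger car/tractor example are invented for illustration, and taking the union of the ranges (rather than their intersection) is an assumption.

```python
# Hypothetical configuration: each lidar's identification object range and
# the typical target objects tested under each target object type.
RECOGNITION_RANGES = {
    "lidar_A": ["vehicle", "pedestrian"],
    "lidar_B": ["vehicle", "traffic_sign"],
}
TYPICAL_TARGETS = {
    "vehicle": ["car", "passenger_car", "tractor"],   # example from the text
    "pedestrian": ["adult", "child"],                 # invented placeholders
    "traffic_sign": ["speed_limit", "stop"],          # invented placeholders
}

def types_to_detect(ranges):
    """All target object types covered by at least one configured lidar,
    in a stable order for the test loop."""
    return sorted({t for r in ranges.values() for t in r})
```

The outer test loop then iterates `types_to_detect(...)` and, for each type, the list in `TYPICAL_TARGETS`, matching steps S50 and S60.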
And step S40, traversing and detecting the target object to be detected in the detection overlapping area, and outputting a statistical result of the identification accuracy of the fusion algorithm on the target object to be detected.
In this embodiment, according to the position and area of the detection overlapping region of the plurality of laser radars, traversal detection is performed on the target object to be detected over the whole overlapping region, and a statistical result of the fusion algorithm's identification accuracy for the target object to be detected is output after detection.
And step S50, if not all the target objects under the target object type to be detected have been detected, selecting a new target object under the type as the target object to be detected, returning to the step of performing traversal detection on the target object to be detected in the detection overlapping area, and outputting a statistical result of the fusion algorithm's identification accuracy for the target object to be detected.
In this embodiment, if not all target objects under the target object type to be detected have been detected, a new target object is selected as the target object to be detected and step S40 is executed again, that is, detection continues with the new target object to be detected.
And step S60, if not all the target object types in the identification object ranges of the plurality of laser radars have been detected, selecting a new target object type from the identification object ranges as the target object type to be detected, and returning to the step of selecting a target object under the target object type to be detected as the target object to be detected.
In this embodiment, if not all target object types in the identification object ranges of the plurality of laser radars have been detected, a new target object type is selected as the target object type to be detected and step S30 is executed again, that is, a target object to be detected is selected from the new type and detection continues.
And step S70, summarizing the statistical results of the identification accuracy of the target object to be detected, and outputting a summary report of the identification accuracy of the fusion algorithm.
In the above steps, the fusion algorithm of the plurality of laser radars is detected for each target object type in the identification object ranges and for the typical target objects under each type. The statistical results of the identification accuracy are then summarized, a summary report of the fusion algorithm's identification accuracy is output, and the whole detection process is completed.
In this embodiment, in a simulation environment, the target object types to be detected and the typical target objects are configured specifically for the radar products of different manufacturers, and each target object type and the typical target objects under it are traversal-detected in turn in the detection overlapping area of the plurality of laser radars. After the detection results are summarized, a summary report of the fusion algorithm's identification accuracy is output. The whole detection process is efficient, comprehensive, and automatic, which significantly reduces the detection cost of the laser radar fusion algorithm; detecting targets of different types also tests the adaptability of the fusion algorithm; and the comprehensive detection report helps fusion algorithm designers improve and upgrade the algorithm according to the reported results, further improving the identification precision and reliability of the laser radar fusion algorithm.
Further, in an embodiment, referring to fig. 4, fig. 4 is a schematic detailed flowchart of step S40 in fig. 2, and as shown in fig. 4, step S40 includes:
step S401, setting the position to be detected of the target object to be detected in the detection overlapping area.
In this embodiment, in the detection overlapping area, a detection position is set for the target object to be detected.
Step S402, setting the angle to be detected of the target object to be detected.
In this embodiment, a detection angle is set for the target object to be detected.
And S403, detecting the target object to be detected, and outputting the result of the identification accuracy of the fusion algorithm on the target object to be detected.
In the embodiment, after the target object to be detected is detected, the result of the accuracy rate of the target object to be detected by the fusion algorithm is output.
Step S404, if not all the angles to be detected of the target object have been detected, adjusting the angle of the target object to be detected, taking the adjusted angle as the new angle to be detected, and returning to the step of detecting the target object to be detected and outputting the result of the fusion algorithm's identification accuracy; if all the angles have been detected, moving the target object to be detected according to the traversal step length.
In this embodiment, the angles of the target object to be detected include the roll angle, the pitch angle, and the course (heading) angle, and the angles to be detected may be selected according to the specific detection requirements. For example, at a given position to be detected, the course angle of the target object may be adjusted and detected in sequence at 0, 30, 60, 90, 120, 150, and 180 degrees. If not all angles have been detected, the angle is adjusted and detection continues at the current position; once all angles have been detected, the target object is moved to a new position to be detected according to the traversal step length.
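The position-and-angle loop of steps S401 to S405 can be sketched as follows, assuming a `detect_once(position, heading)` callback supplied by the test harness; the one-dimensional position sweep and the result format are simplifications of the traversal described above.

```python
# Course angles from the example in the description.
HEADINGS_DEG = [0, 30, 60, 90, 120, 150, 180]

def traverse(start, width, step, detect_once):
    """Sweep every heading angle at the current position to be detected,
    then move the target by the traversal step length until the overlap
    region of the given width has been covered."""
    results = []
    pos = start
    while pos <= start + width:
        for heading in HEADINGS_DEG:
            results.append({"position": pos, "heading": heading,
                            "accuracy": detect_once(pos, heading)})
        pos += step
    return results
```

A 60-degree sweep of roll or pitch would nest the same way as extra inner loops when the detection requirements call for them.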
Step S405, if the detection overlap region has not been fully traversed, taking the position of the moved target object to be detected as the new position to be detected, and returning to the step of setting the angle to be detected of the target object to be detected.
In this embodiment, detection is repeated at each new position of the target object to be detected, sweeping through every angle, until the entire detection overlap region has been traversed.
Step S406, counting the recognition-accuracy results for each pose of the target object to be detected, and outputting the statistical result of the fusion algorithm's recognition accuracy for the target object to be detected.
In this embodiment, once the fusion algorithm's recognition-accuracy result for the target object to be detected has been obtained at every position and every angle in the detection overlap region, the results are aggregated, and the statistical result of the fusion algorithm's recognition accuracy for the target object to be detected is output.
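Taken together, steps S401 to S406 form a nested traversal over positions and angles whose per-pose results are aggregated at the end. A minimal sketch follows; the `detect_once` callback and the mean-accuracy statistic are illustrative assumptions, since the patent does not prescribe a particular statistic.

```python
def traverse_region(positions, angles, detect_once):
    """Steps S401-S406 in miniature: visit every (position, angle) pose,
    collect the per-pose accuracy result, and aggregate at the end.

    detect_once(position, angle) is a hypothetical callback returning the
    recognition-accuracy result for one pose.
    """
    per_pose = {}
    for pos in positions:        # S404/S405: move by the traversal step
        for ang in angles:       # S402-S404: sweep angles at each position
            per_pose[(pos, ang)] = detect_once(pos, ang)
    # S406: summarize, here as the mean accuracy over all poses
    mean_acc = sum(per_pose.values()) / len(per_pose)
    return per_pose, mean_acc
```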
Further, in an embodiment, step S401 includes:
and in the detection overlapping area, setting the position to be detected of the target object to be detected according to the width of the target object to be detected and the width of the detection overlapping area.
In this embodiment, since the target object to be detected has a width, appropriate positions to be detected are set for it within the detection overlap region according to the width of the target object and the width of the detection overlap region, which improves the efficiency of the traversal detection.
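One simple way to realize this placement, assuming the target must lie fully inside the overlap region and positions are spaced one traversal step apart, is sketched below. The patent does not fix a formula, so this is an illustration only.

```python
def candidate_positions(region_width, target_width, step):
    """Lateral positions that keep a target of the given width fully
    inside a detection overlap region of the given width.

    Positions are measured from one edge of the region to the target's
    center and spaced one traversal step apart (simplifying assumption).
    """
    half = target_width / 2.0
    pos, out = half, []
    while pos <= region_width - half:
        out.append(pos)
        pos += step
    return out
```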
Further, in an embodiment, step S403 includes:
acquiring first pose information of the target object to be detected through the simulation test environment;
acquiring second pose information of the target object to be detected through the fusion algorithm;
and comparing the first pose information with the second pose information, and outputting the result of the fusion algorithm's recognition accuracy for the target object to be detected.
In this embodiment, the first pose information of the target object to be detected, usually called the "true value", is obtained from the simulation test environment, and the second pose information, called the "fusion value", is obtained from the fusion algorithm running on the plurality of laser radars. The true value and the fusion value are then compared according to the accuracy requirement and the accuracy calculation method specified by the designer of the fusion algorithm, yielding the result of the fusion algorithm's recognition accuracy for the target object to be detected.
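A sketch of the true-value/fusion-value comparison follows, assuming planar poses with a yaw angle and illustrative tolerances. The actual accuracy requirement and calculation method are set by the fusion algorithm's designer, so the tolerances and the match criterion here are assumptions.

```python
import math

def pose_match(truth, fused, pos_tol=0.2, ang_tol=5.0):
    """Compare a ground-truth pose against a fused pose.

    Poses are dicts with x, y (meters) and yaw (degrees); the tolerance
    values are illustrative, not prescribed by the patent.
    """
    pos_err = math.hypot(truth["x"] - fused["x"], truth["y"] - fused["y"])
    ang_err = abs(truth["yaw"] - fused["yaw"])
    return pos_err <= pos_tol and ang_err <= ang_tol

def accuracy(pairs):
    """Fraction of (truth, fused) pairs whose poses match."""
    hits = sum(pose_match(t, f) for t, f in pairs)
    return hits / len(pairs)
```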
Further, in an embodiment, before step S404, the method further includes:
and setting the traversal step length according to the width of the detection overlapping area, the width of the target object to be detected and the number of the line groups of the plurality of laser radars.
In this embodiment, the wider the width of the detection overlapping region is, the larger the traversal step can be set, the wider the width of the target object to be detected is, the smaller the traversal step can be set, the larger the number of the line groups of the plurality of laser radars is, the larger the traversal step can be set, the traversal step is set according to specific test requirements, and the efficiency of traversal detection can be better improved.
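The qualitative rules above can be condensed into a heuristic, shown here with an illustrative tuning constant `k`. The patent only states the monotonic relationships, not a formula, so this is an assumption.

```python
def traversal_step(region_width, target_width, beam_groups, k=0.1):
    """Heuristic traversal step length: grows with the overlap-region
    width and the lidar line-group count, shrinks with the target width.

    k is an illustrative tuning constant, not part of the disclosure.
    """
    step = k * region_width * beam_groups / max(target_width, 1e-6)
    # Never step further than the target is wide, so no position in the
    # overlap region is skipped entirely
    return min(step, target_width)
```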
In a third aspect, an embodiment of the present invention further provides a laser radar fusion algorithm detection apparatus.
Referring to fig. 5, fig. 5 is a schematic functional module diagram of an embodiment of the lidar fusion algorithm detection apparatus according to the present invention.
In this embodiment, the laser radar fusion algorithm detection apparatus includes:
an obtaining module 10, configured to obtain detection overlapping areas of multiple laser radars in a simulation test environment;
the first selection module 20 is configured to select a type of a target object to be detected from the range of the identification objects of the plurality of laser radars;
the second selection module 30 is used for selecting the target object under the type of the target object to be detected as the target object to be detected;
the detection module 40 is configured to perform traversal detection on the target object to be detected in the detection overlapping region, and output a statistical result of the recognition accuracy of the fusion algorithm on the target object to be detected;
a third selection module 50, configured to, if not all target objects of the type of target object to be detected have been detected, select a new target object of that type as the target object to be detected, and return to the step of performing traversal detection on the target object to be detected in the detection overlap region and outputting the statistical result of the fusion algorithm's recognition accuracy for the target object to be detected;
a fourth selection module 60, configured to, if not all target object types in the identification object ranges of the plurality of laser radars have been detected, select a new target object type from those ranges as the type of target object to be detected, and return to the step of selecting a target object under the type to be detected as the target object to be detected;
and the output module 70 is configured to summarize statistical results of the identification accuracy of the target object to be detected, and output a summary report of the identification accuracy of the fusion algorithm.
Further, in an embodiment, the detection module 40 includes:
the first setting unit is used for setting the position to be detected of the target object to be detected in the detection overlapping area;
the second setting unit is used for setting the angle to be detected of the target object to be detected;
the detection unit is used for detecting the target object to be detected and outputting the result of the identification accuracy of the fusion algorithm on the target object to be detected;
a third setting unit, configured to, if not all angles to be detected of the target object have been detected at the current position, adjust the angle of the target object to be detected, take the adjusted angle as the new angle to be detected, and return to the step of detecting the target object to be detected and outputting the result of the fusion algorithm's recognition accuracy for the target object; and, if all angles have been detected, move the target object to be detected according to the traversal step length;
a fourth setting unit, configured to, if the detection overlap region has not been fully traversed, take the position of the moved target object to be detected as the new position to be detected, and return to the step of setting the angle to be detected of the target object to be detected;
and the output unit is used for counting the result of the identification accuracy of each pose of the target object to be detected and outputting the counting result of the identification accuracy of the target object to be detected by a fusion algorithm.
Further, in an embodiment, the first setting unit is configured to:
and setting the position to be detected of the target object to be detected in the detection overlapping area according to the width of the target object to be detected and the width of the detection overlapping area.
Further, in an embodiment, the detection unit is configured to:
acquiring first pose information of the target object to be detected through the simulation test environment;
acquiring second pose information of the target object to be detected through the fusion algorithm;
and comparing the first pose information with the second pose information, and outputting the result of the fusion algorithm's recognition accuracy for the target object to be detected.
Further, in an embodiment, the detection module 40 further includes a fifth setting unit, configured to:
and setting the traversal step length according to the width of the detection overlapping area, the width of the target object to be detected and the number of the line groups of the plurality of laser radars.
The function of each module in the laser radar fusion algorithm detection device corresponds to a step in the embodiments of the laser radar fusion algorithm detection method; their functions and implementation are not described again here.
In a fourth aspect, the embodiment of the present invention further provides a readable storage medium.
A laser radar fusion algorithm detection program is stored on the readable storage medium of the invention; when executed by a processor, the program implements the steps of the laser radar fusion algorithm detection method described above.
The method implemented when the laser radar fusion algorithm detection program is executed may refer to each embodiment of the laser radar fusion algorithm detection method of the present invention, and details are not described herein again.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or system that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above and includes instructions for causing a terminal device to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A laser radar fusion algorithm detection method is characterized by comprising the following steps:
acquiring detection overlapping areas of a plurality of laser radars in a simulation test environment;
selecting the type of the target object to be detected from the identification object ranges of the laser radars;
selecting the target object under the type of the target object to be detected as the target object to be detected;
in the detection overlapping area, traversing detection is carried out on a target object to be detected, and a statistical result of the identification accuracy of the target object to be detected by a fusion algorithm is output;
if not all the target objects under the type of target object to be detected have been detected, selecting a new target object under that type as the target object to be detected, and returning to the step of performing traversal detection on the target object to be detected in the detection overlap region and outputting the statistical result of the fusion algorithm's recognition accuracy for the target object to be detected;
if not all target object types in the identification object ranges of the plurality of laser radars have been detected, selecting a new target object type from those ranges as the type of target object to be detected, and returning to the step of selecting a target object under the type to be detected as the target object to be detected;
and summarizing the statistical results of the identification accuracy of the target object to be detected, and outputting a summary report of the identification accuracy of the fusion algorithm.
2. The lidar fusion algorithm detection method of claim 1, wherein the traversing detection of the target object to be detected in the detection overlap region and the outputting of the statistical result of the identification accuracy of the fusion algorithm on the target object to be detected comprise:
setting a position to be detected of a target object to be detected in the detection overlapping area;
setting a to-be-detected angle of the to-be-detected target object;
detecting the target object to be detected, and outputting a result of the accuracy rate of the fusion algorithm for identifying the target object to be detected;
if not all angles to be detected of the target object have been detected at the current position, adjusting the angle of the target object to be detected, taking the adjusted angle as the new angle to be detected, and returning to the step of detecting the target object to be detected and outputting the result of the fusion algorithm's recognition accuracy for the target object; if all angles have been detected, moving the target object to be detected according to the traversal step length;
if the detection overlap region has not been fully traversed, taking the position of the moved target object to be detected as the new position to be detected, and returning to the step of setting the angle to be detected of the target object to be detected;
and counting the result of the identification accuracy of each pose of the target object to be detected, and outputting the counting result of the identification accuracy of the target object to be detected by a fusion algorithm.
3. The lidar fusion algorithm detection method of claim 2, wherein the setting of the position to be detected of the target to be detected within the detection overlap region comprises:
and in the detection overlapping area, setting the position to be detected of the target object to be detected according to the width of the target object to be detected and the width of the detection overlapping area.
4. The lidar fusion algorithm detection method of claim 2, wherein the detecting the target object to be detected and outputting the result of the accuracy rate of the fusion algorithm for identifying the target object to be detected comprises:
acquiring first pose information of the target object to be detected through the simulation test environment;
acquiring second pose information of the target object to be detected through the fusion algorithm;
and comparing the first pose information with the second pose information, and outputting the result of the fusion algorithm's recognition accuracy for the target object to be detected.
5. The lidar fusion algorithm detection method of claim 2, wherein before the step of, if not all angles to be detected of the target object have been detected, adjusting the angle of the target object to be detected, taking the adjusted angle as the new angle to be detected, and returning to the step of detecting the target object to be detected and outputting the result of the fusion algorithm's recognition accuracy for the target object, and, if all angles have been detected, moving the target object to be detected according to the traversal step length, the method further comprises:
and setting the traversal step length according to the width of the detection overlapping area, the width of the target object to be detected and the number of the line groups of the plurality of laser radars.
6. A laser radar fusion algorithm detection device, characterized in that, laser radar fusion algorithm detection device includes:
the acquisition module is used for acquiring detection overlapping areas of a plurality of laser radars in a simulation test environment;
the first selection module is used for selecting the type of the target object to be detected from the identification object ranges of the laser radars;
the second selection module is used for selecting the target object under the type of the target object to be detected as the target object to be detected;
the detection module is used for performing traversal detection on the target object to be detected in the detection overlapping area and outputting a statistical result of the identification accuracy of the fusion algorithm on the target object to be detected;
a third selection module, configured to, if not all target objects of the type of target object to be detected have been detected, select a new target object of that type as the target object to be detected, and return to the step of performing traversal detection on the target object to be detected in the detection overlap region and outputting the statistical result of the fusion algorithm's recognition accuracy for the target object to be detected;
a fourth selection module, configured to, if not all target object types in the identification object ranges of the plurality of lidar have been detected, select a new target object type from those ranges as the type of target object to be detected, and return to the step of selecting a target object under the type to be detected as the target object to be detected;
and the output module is used for summarizing the statistical result of the identification accuracy of the target object to be detected and outputting a summary report of the identification accuracy of the fusion algorithm.
7. The lidar fusion algorithm detection apparatus of claim 6, wherein the detection module comprises:
the first setting unit is used for setting the position to be detected of the target object to be detected in the detection overlapping area;
the second setting unit is used for setting the angle to be detected of the target object to be detected;
the detection unit is used for detecting the target object to be detected and outputting the result of the identification accuracy of the fusion algorithm on the target object to be detected;
a third setting unit, configured to, if not all angles to be detected of the target object have been detected at the current position, adjust the angle of the target object to be detected, take the adjusted angle as the new angle to be detected, and return to the step of detecting the target object to be detected and outputting the result of the fusion algorithm's recognition accuracy for the target object; and, if all angles have been detected, move the target object to be detected according to the traversal step length;
a fourth setting unit, configured to, if the detection overlap region has not been fully traversed, take the position of the moved target object to be detected as the new position to be detected, and return to the step of setting the angle to be detected of the target object to be detected;
and the output unit is used for counting the result of the identification accuracy of each pose of the target object to be detected and outputting the counting result of the identification accuracy of the target object to be detected by a fusion algorithm.
8. The lidar fusion algorithm detection device of claim 7, wherein the detection unit is configured to:
acquiring first pose information of the target object to be detected through the simulation test environment;
acquiring second pose information of the target object to be detected through the fusion algorithm;
and comparing the first pose information with the second pose information, and outputting the result of the fusion algorithm's recognition accuracy for the target object to be detected.
9. A lidar fusion algorithm detection device, characterized in that the lidar fusion algorithm detection device comprises a processor, a memory, and a lidar fusion algorithm detection program stored on the memory and executable by the processor, wherein the lidar fusion algorithm detection program, when executed by the processor, implements the steps of the lidar fusion algorithm detection method according to any one of claims 1 to 5.
10. A readable storage medium, having a lidar fusion algorithm detection program stored thereon, wherein the lidar fusion algorithm detection program, when executed by a processor, performs the steps of the lidar fusion algorithm detection method of any of claims 1 to 5.
CN202210513219.6A — Laser radar fusion algorithm detection method, device and equipment and readable storage medium — Pending

Priority application: CN202210513219.6A, priority date 2022-05-11, filing date 2022-05-11

Publication: CN114994646A (application), published 2022-09-02

Family ID: 83026406

Country status: CN — CN114994646A


Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination