CN112147632A - Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm


Info

Publication number
CN112147632A
Authority
CN
China
Prior art keywords: vehicle, target vehicle, laser radar, detected, lane line
Prior art date
Legal status: Pending
Application number
CN202011008488.4A
Other languages
Chinese (zh)
Inventor
孙建蕾
陈博
王栋梁
王秋
Current Assignee
FAW Group Corp
Original Assignee
FAW Group Corp
Priority date
Filing date
Publication date
Application filed by FAW Group Corp filed Critical FAW Group Corp
Priority to CN202011008488.4A
Publication of CN112147632A
Status: Pending

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 - Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/88 - Lidar systems specially adapted for specific applications

Abstract

The application discloses a method, a device, equipment and a medium for testing a vehicle-mounted laser radar perception algorithm, and relates to testing technology in the field of intelligent automobiles. The specific implementation scheme is as follows: acquiring first running information of at least one target vehicle sensed by a vehicle to be detected based on a laser radar sensing algorithm, wherein the vehicle to be detected and each target vehicle are provided with integrated navigation equipment; respectively acquiring positioning results of the vehicle to be detected and each target vehicle determined based on the integrated navigation equipment; calculating second running information of each target vehicle according to the positioning results of the vehicle to be detected and each target vehicle; and comparing the first running information with the second running information to obtain a test result, with respect to the target vehicle, of the laser radar perception algorithm on the vehicle to be detected. By using the integrated navigation equipment, the application realizes test verification of the vehicle-mounted laser radar perception algorithm in a real environment while reducing manpower and time cost.

Description

Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm
Technical Field
The application relates to the field of intelligent automobiles, and further relates to a testing technology, in particular to a method, a device, equipment and a medium for testing a vehicle-mounted laser radar perception algorithm.
Background
The implementation of autonomous driving relies on many factors, among which high-performance environmental perception is key. As a high-precision sensing component, the laser radar plays a vital role in automatic driving.
At present, the means of testing and verifying a laser radar perception algorithm mainly comprise three methods: generating a true value by manually labeling point clouds in the development stage, performing simulation verification with a scene simulator, and carrying a planning and decision algorithm for real-vehicle road tests.
However, the first testing method requires manual labeling, whose time and labor costs are high, so it is not suitable for large-scale product testing. The second testing method uses a scene simulator to simulate and render a virtual scene and a sensor so that the laser radar perception algorithm runs in a virtual environment; however, the gap between the virtual environment and the real environment cannot be bridged, so this method cannot completely replace test verification in the real environment. The third testing method performs real-vehicle road tests after a complete ADAS system is built and analyzes the strengths and weaknesses of the laser radar's sensing capability through the performance of the ADAS functions; however, in the bug-analysis stage, the influence of the planning and decision algorithm, the sensing fusion algorithm and the algorithms of other sensing sensors such as cameras must be comprehensively analyzed, which makes the cause of a bug difficult to locate, so the time and labor costs are likewise high.
Disclosure of Invention
The application provides a method, a device, equipment and a medium for testing a vehicle-mounted laser radar perception algorithm, so that test verification in a real environment is realized on the basis of reducing labor and time cost.
In a first aspect, the application provides a method for testing a vehicle-mounted lidar sensing algorithm, which includes:
acquiring first running information of at least one target vehicle sensed by a vehicle to be detected based on a laser radar sensing algorithm, wherein the vehicle to be detected and each target vehicle are provided with combined navigation equipment;
respectively acquiring positioning results of the vehicle to be detected and each target vehicle determined based on the integrated navigation equipment;
calculating second running information of each target vehicle according to the positioning results of the vehicle to be detected and each target vehicle;
and comparing the first running information with the second running information to obtain a test result of the laser radar perception algorithm on the vehicle to be tested on the target vehicle.
In a second aspect, the present application further provides a device for testing a vehicle-mounted lidar sensing algorithm, including:
the system comprises a first running information acquisition module, a second running information acquisition module and a navigation module, wherein the first running information acquisition module is used for acquiring first running information of at least one target vehicle sensed by a vehicle to be detected based on a laser radar sensing algorithm, and the vehicle to be detected and each target vehicle are provided with combined navigation equipment;
the positioning result acquisition module is used for respectively acquiring the positioning results of the vehicle to be detected and each target vehicle determined based on the integrated navigation equipment;
the second running information acquisition module is used for calculating second running information of each target vehicle according to the positioning results of the vehicle to be detected and each target vehicle;
and the test result acquisition module is used for comparing the first running information with the second running information to obtain a test result of the laser radar perception algorithm on the vehicle to be tested on the target vehicle.
In a third aspect, the present application further provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein:
the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the method for testing the vehicle-mounted lidar sensing algorithm according to any embodiment of the application.
In a fourth aspect, the present application further provides a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the method for testing a vehicle-mounted lidar sensing algorithm according to any of the embodiments of the present application.
It should be understood that the statements herein are not intended to identify key or critical features of the present application, nor to limit the scope of the present application. Other features of the present application will become readily apparent from the following description, and other effects of the above alternatives will be described hereinafter in conjunction with specific embodiments.
Drawings
The drawings are included to provide a better understanding of the present solution and are not intended to limit the present application. Wherein:
FIG. 1 is a schematic flow chart of a test method of a vehicle-mounted laser radar perception algorithm according to an embodiment of the application;
FIG. 2 is a schematic flow chart of a test method of a vehicle-mounted laser radar perception algorithm according to an embodiment of the application;
FIG. 3 is a schematic flow chart of a test method of a vehicle-mounted lidar sensing algorithm according to an embodiment of the application;
FIG. 4 is a schematic structural diagram of a testing device for a vehicle-mounted lidar sensing algorithm according to an embodiment of the application;
fig. 5 is a block diagram of an electronic device for implementing a test method of a vehicle-mounted lidar sensing algorithm according to an embodiment of the present disclosure.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of those embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a schematic flow chart of a method for testing a vehicle-mounted lidar sensing algorithm according to an embodiment of the present application; the method is applicable to testing a vehicle-mounted lidar sensing algorithm. The method can be executed by a testing device for the vehicle-mounted laser radar perception algorithm, which is implemented in software and/or hardware and is preferably configured in an electronic device such as a vehicle-mounted terminal, a server or a computer. As shown in fig. 1, the method specifically includes the following steps:
s101, first running information of at least one target vehicle sensed by the vehicle to be detected based on a laser radar sensing algorithm is obtained, wherein the vehicle to be detected and each target vehicle are provided with combined navigation equipment.
The vehicle-mounted laser radar sensing algorithm can sense target vehicles, pedestrians, lane lines and other environmental objects around a vehicle to be detected, for example, information such as the position, lanes or speed of the target vehicles is sensed, and data is provided for automatic driving. The vehicle to be tested is a vehicle carrying the laser radar sensing algorithm, the first running information is the relevant information of at least one target vehicle sensed based on the laser radar sensing algorithm, and the laser radar sensing algorithm can be tested by verifying the accuracy of the first running information. The first driving information may include any information that can be perceived by the lidar perception algorithm, and the specific contents of the perception algorithm and the first driving information are not limited in any way in the present application.
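For illustration only, the first running information discussed above could be held in a small data structure like the following Python sketch; the field names and types are assumptions of this description, not part of the application.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DrivingInfo:
    """Hypothetical container for the per-target running information
    (speed, distance, position and detection frame) described above."""
    timestamp: float                       # seconds, used to align frames
    speed: float                           # target vehicle speed, m/s
    distance: float                        # distance from the vehicle to be detected, m
    position: Tuple[float, float]          # (x, y) in the coordinate system of the vehicle to be detected, m
    bbox: Tuple[Tuple[float, float], ...]  # four detection-frame vertexes in the same coordinate system
```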
The vehicle to be detected and each target vehicle are also provided with integrated navigation equipment. Integrated navigation equipment fuses and jointly optimizes multi-sensor, multi-source navigation information, and may include, for example, inertial navigation equipment, a GPS navigation system, a Doppler navigation system, and the like. The key technology of integrated navigation is information fusion and processing; for example, the application of Kalman filtering is central to integrated navigation. Through a motion equation and a measurement equation, Kalman filtering not only considers the currently measured parameter value but also makes full use of previously measured values: it predicts the current parameter value from the past estimates and then corrects the prediction with the current measurement, thereby obtaining an optimal estimate of the current parameter value.
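As a rough illustration of the Kalman filtering idea mentioned above, the following Python sketch performs one linear predict/update cycle; a real integrated navigation device uses far more elaborate models, and the matrices F, H, Q and R here are assumptions of this description.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle: propagate the previous estimate with the motion
    equation, then correct it with the current measurement, as described above."""
    # Prediction from past estimates (motion equation)
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Correction with the current measurement (measurement equation)
    innovation = z - H @ x_pred
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x_pred + K @ innovation
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```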
The integrated navigation equipment can obtain more accurate positioning information, so the related information of the target vehicle can be calculated from the positioning information and used as a true value to be compared with the information sensed by the laser radar sensing algorithm, thereby checking the accuracy of the sensed information. The application does not limit the integrated navigation equipment or its information fusion and processing algorithm.
S102, respectively obtaining the positioning results of the vehicle to be detected and each target vehicle determined based on the integrated navigation equipment.
Specifically, an existing integrated navigation device and its information fusion and processing algorithm may be adopted to obtain the final positioning results of the vehicle to be detected and each target vehicle, which are not described again here.
S103, calculating second running information of each target vehicle according to the positioning results of the vehicle to be detected and each target vehicle.
The second driving information is information for comparison with the first driving information obtained by sensing, and is used as a true value for testing the lidar sensing algorithm. For example, if performance indexes such as speed measurement, distance measurement and target identification classification of the laser radar sensing algorithm need to be tested according to requirements, the corresponding performance index of each target vehicle is calculated according to the positioning result determined by the combined navigation equipment, and the performance indexes are compared to finish the test.
Specifically, for the speed-measurement and distance-measurement performance indexes, the relative distance between the vehicle to be detected and any target vehicle can be calculated from their positioning results, and the speed of the target vehicle can then be derived using the speed of the vehicle to be detected. For the target recognition and classification performance index, the detection frame (bounding box) of each target vehicle can be calculated from its positioning result combined with its vehicle body information, and the target recognition and classification result can be obtained by classifying the detection frames. Other performance indexes are not described in detail here.
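A minimal sketch of the distance and speed computation described above, assuming the post-processed positioning results are planar (x, y) coordinates in a shared global frame; the function names and the simple two-frame differencing are illustrative assumptions, not the application's prescribed method.

```python
import math

def relative_distance(ego_xy, target_xy):
    """Distance between the vehicle to be detected and a target vehicle,
    both given in the same global coordinate frame."""
    return math.hypot(target_xy[0] - ego_xy[0], target_xy[1] - ego_xy[1])

def target_speed(target_xy_prev, target_xy_curr, dt):
    """Target speed estimated from two consecutive positioning fixes dt seconds
    apart, a crude stand-in for the multi-frame tracking analysis mentioned later."""
    dx = target_xy_curr[0] - target_xy_prev[0]
    dy = target_xy_curr[1] - target_xy_prev[1]
    return math.hypot(dx, dy) / dt
```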
And S104, comparing the first running information with the second running information to obtain a test result of the laser radar perception algorithm on the vehicle to be tested on the target vehicle.
Because the positioning information with higher precision can be determined based on the combined navigation equipment, the performance index calculated according to the positioning information can be used as a true value to be compared with the performance index sensed by the vehicle-mounted laser radar sensing algorithm to be tested, and therefore the test result of the vehicle-mounted laser radar sensing algorithm is determined according to the comparison result.
According to the technical scheme, the performance index calculated by the positioning information determined by the combined navigation equipment is used as a true value and is compared with the performance index obtained by sensing of the vehicle-mounted laser radar sensing algorithm, so that the test under a real scene is realized, manual marking, a simulation scene and a complex algorithm are not needed, the labor cost and the time cost are not high, and the purpose of realizing the test verification in a real environment on the basis of reducing the labor cost and the time cost is achieved.
Fig. 2 is a schematic flow chart of a method for testing a vehicle-mounted lidar sensing algorithm according to an embodiment of the present application, which is further optimized based on the above embodiment, and describes in detail a method for testing the detection performance of a target vehicle by the lidar sensing algorithm. As shown in fig. 2, the method specifically includes the following steps:
s201, acquiring first running information of at least one target vehicle sensed by a vehicle to be detected based on a laser radar sensing algorithm, wherein the vehicle to be detected and each target vehicle are provided with combined navigation equipment.
Wherein the first travel information includes at least one of: vehicle speed, distance, location, and detection frame. It should be noted that, according to the test requirement, which performance indexes of the lidar sensing algorithm need to be tested, the corresponding first driving information may be obtained. Therefore, the present application does not set any limit to the specific content of the first travel information.
S202, respectively obtaining the original positioning data of the vehicle to be detected and each target vehicle determined based on the integrated navigation equipment.
The positioning original data is unprocessed original data obtained through the integrated navigation equipment.
S203, post-processing the positioning original data to obtain the positioning result.
For example, post-processing can be performed with post-processing software such as Inertial Explorer, whose main working principle is to perform a bidirectional (forward and backward) solution and to exclude satellites with poor signal quality, thereby further improving the accuracy of the positioning data. The post-processed positioning result is therefore more accurate than the raw positioning data and provides a data basis for the subsequent test.
S204, comparing the positioning results of the vehicle to be tested and each target vehicle, and determining the distance between each target vehicle and the vehicle to be tested.
Ranging is a commonly used performance index for lidar sensing algorithms. In the present application, after the positioning result of each target vehicle is obtained through the integrated navigation equipment, the relative distance between the target vehicle and the vehicle to be detected can be determined by combining the positioning information of the vehicle to be detected.
S205, determining the speed of each target vehicle by utilizing a multi-frame tracking analysis technology, and determining the detection frame of each target vehicle by combining the body information of each target vehicle.
Speed measurement and target recognition classification are also common performance indexes of the laser radar sensing algorithm. After the distance between a target vehicle and the vehicle to be detected is obtained, the speed of each target vehicle and its relative position with respect to the vehicle to be detected can be calculated by combining basic information such as the speed of the vehicle to be detected. Then, by combining the body information of each target vehicle, such as its length and width, the detection frame of the target vehicle can be determined, including the coordinates of the four vertexes of the detection frame in the coordinate system of the vehicle to be detected, so that the targets in the detection frames can be recognized and classified with an image detection technique.
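The following sketch illustrates how the four detection-frame vertexes could be expressed in the coordinate system of the vehicle to be detected, assuming planar poses (x, y, yaw) in a shared global frame and using only the target's length and width; it is an illustrative geometric construction under those assumptions, not the exact procedure of the application.

```python
import math

def target_bbox_in_ego_frame(ego_pose, target_pose, length, width):
    """Four corners of the target's footprint expressed in the coordinate system
    of the vehicle to be detected. Poses are (x, y, yaw) in a shared global frame;
    length and width come from the target's body information. Height is ignored."""
    ex, ey, eyaw = ego_pose
    tx, ty, tyaw = target_pose
    half_l, half_w = length / 2.0, width / 2.0
    # Corner offsets in the target's own frame: front-left, front-right, rear-right, rear-left
    corners = [( half_l,  half_w), ( half_l, -half_w),
               (-half_l, -half_w), (-half_l,  half_w)]
    out = []
    for cx, cy in corners:
        # Target frame -> global frame
        gx = tx + cx * math.cos(tyaw) - cy * math.sin(tyaw)
        gy = ty + cx * math.sin(tyaw) + cy * math.cos(tyaw)
        # Global frame -> frame of the vehicle to be detected
        dx, dy = gx - ex, gy - ey
        out.append((dx * math.cos(-eyaw) - dy * math.sin(-eyaw),
                    dx * math.sin(-eyaw) + dy * math.cos(-eyaw)))
    return out
```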
And S206, taking the distance, the vehicle speed and the detection frame as second driving information, and comparing the first driving information with the second driving information to obtain a test result of the laser radar perception algorithm on the vehicle to be tested on the target vehicle.
The second running information includes at least one of vehicle speed, distance, position and detection frame, but is not limited to these and may be determined according to the test requirements. Through comparison, if the first running information obtained by sensing differs greatly from the second running information serving as the true value, the vehicle-mounted laser radar sensing algorithm has a problem; otherwise, the sensing algorithm has a certain accuracy.
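As an illustration of the comparison step, the sketch below checks perceived values against the true values with simple tolerances; the threshold values, and the reuse of the hypothetical DrivingInfo fields from the earlier sketch, are assumptions of this description.

```python
def within_tolerance(perceived, truth, abs_tol=0.5, rel_tol=0.05):
    """Single-value pass/fail check; the tolerance values are illustrative only."""
    return abs(perceived - truth) <= max(abs_tol, rel_tol * abs(truth))

def evaluate(first_info, second_info):
    """Field-by-field comparison of perceived (first) and true (second) running information."""
    return {
        "distance_ok": within_tolerance(first_info.distance, second_info.distance),
        "speed_ok": within_tolerance(first_info.speed, second_info.speed),
    }
```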
According to the technical scheme, the performance index calculated by the positioning information determined by the combined navigation equipment is used as a true value and is compared with the performance index obtained by sensing of the vehicle-mounted laser radar sensing algorithm, so that the test under a real scene is realized, manual marking, a simulation scene and a complex algorithm are not needed, the labor cost and the time cost are not high, and the purpose of realizing the test verification in a real environment on the basis of reducing the labor cost and the time cost is achieved.
Fig. 3 is a schematic flow chart of a method for testing a vehicle-mounted lidar sensing algorithm according to an embodiment of the present application, and this embodiment further optimizes the method based on the above embodiment, and introduces a method for testing the detection performance of a lane line by the lidar sensing algorithm in detail. As shown in fig. 3, the method specifically includes the following steps:
s301, a lane line mapping result measured and drawn by the combined navigation equipment of any target vehicle is obtained.
Among them, the target vehicle is mounted with an integrated navigation device such as an inertial navigation device, a GPS navigation system, a doppler navigation system, or the like.
In one embodiment, obtaining the lane line mapping result measured and drawn with the integrated navigation device of any target vehicle includes: measuring the lane line by using carrier-phase differential technology and differential inertial navigation to obtain point coordinates on the lane line; and converting the point coordinates between coordinate systems to obtain the coordinate information and a curve equation of the lane line in a global coordinate system, which are taken as the lane line mapping result.
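A minimal sketch of the curve-equation part of the mapping result, assuming the surveyed lane-line points have already been converted to a global (x, y) frame; the cubic polynomial model is an assumption, since the application does not prescribe a particular curve form.

```python
import numpy as np

def lane_line_mapping(points_xy, degree=3):
    """Return the lane-line point coordinates together with fitted polynomial
    coefficients y = f(x), mirroring the coordinate information and curve
    equation that make up the lane line mapping result described above."""
    pts = np.asarray(points_xy, dtype=float)
    coeffs = np.polyfit(pts[:, 0], pts[:, 1], degree)  # highest-order term first
    return pts, coeffs
```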
S302, acquiring a lane line sensing result sensed based on a laser radar sensing algorithm in the process that the vehicle to be detected runs on the lane to which the lane line belongs.
The lane line sensing result may include coordinate information of the sensed lane line and a curve equation.
And S303, comparing the lane line mapping result with a lane line sensing result to obtain a test result of the laser radar sensing algorithm on the vehicle to be tested on the lane line.
The lane line mapping result measured and drawn by the integrated navigation equipment is taken as the true value and compared with the perceived lane line sensing result. If the difference between the sensing result and the mapping result is large, the sensing accuracy of the laser radar sensing algorithm for the lane line is low; otherwise, the sensing performance of the laser radar sensing algorithm for the lane line has a certain accuracy.
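For illustration, the lane-line comparison could be reduced to a lateral-offset metric like the sketch below; the choice of sample points and the use of mean absolute error are assumptions of this description, not requirements of the application.

```python
import numpy as np

def lane_line_error(map_coeffs, perceived_coeffs, x_samples):
    """Mean absolute lateral offset between the surveyed (true) lane-line curve
    and the perceived one, evaluated at the given longitudinal sample points."""
    y_map = np.polyval(map_coeffs, x_samples)
    y_perceived = np.polyval(perceived_coeffs, x_samples)
    return float(np.mean(np.abs(y_map - y_perceived)))
```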
According to the technical scheme, the lane line mapping result obtained by mapping through the combined navigation equipment is used as a true value, and is compared with the lane line sensing result obtained by sensing through the vehicle-mounted laser radar sensing algorithm, so that the test under a real scene is realized, manual marking, a simulation scene and a complex algorithm are not needed, labor and time costs are not high, and the purpose of testing and verifying in a real environment is achieved on the basis of reducing labor and time costs.
Fig. 4 is a schematic structural diagram of a testing device for a vehicle-mounted lidar sensing algorithm according to an embodiment of the application. The device can implement the method for testing the vehicle-mounted laser radar perception algorithm described in any embodiment of the application. As shown in fig. 4, the apparatus 400 specifically includes:
the first driving information acquiring module 401 is configured to acquire first driving information of at least one target vehicle, which is sensed by a vehicle to be detected based on a laser radar sensing algorithm, where the vehicle to be detected and each target vehicle are equipped with a combined navigation device;
a positioning result obtaining module 402, configured to obtain positioning results determined by the vehicle under test and each target vehicle based on the integrated navigation device, respectively;
a second driving information obtaining module 403, configured to calculate second driving information of each target vehicle according to the positioning result of the vehicle to be tested and each target vehicle;
and the test result obtaining module 404 is configured to obtain a test result of the lidar sensing algorithm on the vehicle to be tested on the target vehicle by comparing the first driving information with the second driving information.
Optionally, the positioning result obtaining module is specifically configured to:
respectively acquiring positioning original data of the vehicle to be detected and each target vehicle determined based on the integrated navigation equipment;
and post-processing the positioning original data to obtain the positioning result.
Optionally, the first driving information and the second driving information each include at least one of: vehicle speed, distance, location, and detection frame.
Optionally, the second driving information obtaining module is specifically configured to:
comparing the positioning results of the vehicle to be tested and each target vehicle, and determining the distance between each target vehicle and the vehicle to be tested;
determining the speed of each target vehicle by utilizing a multi-frame tracking analysis technology, and determining a detection frame of each target vehicle by combining the body information of each target vehicle;
and taking the distance, the vehicle speed and the detection frame as the second driving information.
Optionally, the apparatus further includes a lane line testing module, including:
the lane line mapping unit is used for acquiring a lane line mapping result measured and drawn by using combined navigation equipment of any target vehicle;
the lane line perception result acquisition unit is used for acquiring a lane line perception result perceived based on a laser radar perception algorithm in the process that the vehicle to be detected runs on the lane to which the lane line belongs;
and the comparison unit is used for obtaining a test result of the laser radar perception algorithm on the vehicle to be tested on the lane line by comparing the lane line mapping result with the lane line perception result.
Optionally, the lane line mapping unit is specifically configured to:
measuring a lane line by using a carrier phase differential technology and differential inertial navigation to obtain point coordinates on the lane line;
and converting the point coordinates by a coordinate system to obtain coordinate information and a curve equation of the lane line in a global coordinate system, and taking the coordinate information and the curve equation as the lane line mapping result.
The device 400 for testing the vehicle-mounted laser radar perception algorithm provided by the embodiment of the application can execute the method for testing the vehicle-mounted laser radar perception algorithm provided by any embodiment of the application, and has corresponding functional modules and beneficial effects of the execution method. Reference may be made to the description of any method embodiment of the present application for details not explicitly described in this embodiment.
According to an embodiment of the present application, an electronic device and a readable storage medium are also provided.
As shown in fig. 5, the block diagram is an electronic device of a test method for a vehicle-mounted lidar sensing algorithm according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the present application that are described and/or claimed herein.
As shown in fig. 5, the electronic apparatus includes: one or more processors 501, memory 502, and interfaces for connecting the various components, including high-speed interfaces and low-speed interfaces. The various components are interconnected using different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions for execution within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output apparatus (such as a display device coupled to the interface). In other embodiments, multiple processors and/or multiple buses may be used, along with multiple memories and multiple types of memory, as desired. Also, multiple electronic devices may be connected, with each device providing portions of the necessary operations (e.g., as a server array, a group of blade servers, or a multi-processor system). In fig. 5, one processor 501 is taken as an example.
Memory 502 is a non-transitory computer readable storage medium as provided herein. The memory stores instructions executable by at least one processor to cause the at least one processor to execute the method for testing the vehicle-mounted lidar sensing algorithm provided by the application. The non-transitory computer-readable storage medium of the present application stores computer instructions for causing a computer to perform the method of testing an in-vehicle lidar sensing algorithm provided herein.
The memory 502, which is a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as program instructions/modules corresponding to the test method of the on-vehicle lidar sensing algorithm in the embodiment of the present application (for example, the first driving information acquisition module 401, the positioning result acquisition module 402, the second driving information acquisition module 403, and the test result acquisition module 404 shown in fig. 4). The processor 501 executes various functional applications and data processing of the server by running non-transitory software programs, instructions and modules stored in the memory 502, that is, the test method of the vehicle-mounted lidar sensing algorithm in the above method embodiment is implemented.
The memory 502 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of an electronic device implementing the test method of the vehicle-mounted lidar sensing algorithm of the embodiment of the present application, and the like. Further, the memory 502 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, memory 502 optionally includes memory located remotely from processor 501, which may be connected via a network to an electronic device implementing the test method of the in-vehicle lidar sensing algorithm of embodiments of the present application. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device for implementing the method for testing the vehicle-mounted laser radar perception algorithm according to the embodiment of the application may further include: an input device 503 and an output device 504. The processor 501, the memory 502, the input device 503 and the output device 504 may be connected by a bus or other means, and fig. 5 illustrates the connection by a bus as an example.
The input device 503 may receive input numeric or character information and generate key signal inputs related to user settings and function control of an electronic device implementing the test method of the in-vehicle lidar sensing algorithm of the embodiment of the present application, such as a touch screen, a keypad, a mouse, a track pad, a touch pad, a pointing stick, one or more mouse buttons, a track ball, a joystick, or other input devices. The output devices 504 may include a display device, auxiliary lighting devices (e.g., LEDs), and haptic feedback devices (e.g., vibrating motors), among others. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device can be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, application specific ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
These computer programs (also known as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented using high-level procedural and/or object-oriented programming languages, and/or assembly/machine languages. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), the internet, and blockchain networks.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, and is a host product in a cloud computing service system, so that the defects of high management difficulty and weak service expansibility in the traditional physical host and VPS service are overcome.
According to the technical scheme of the embodiment of the application, the performance index calculated by utilizing the positioning information determined by the integrated navigation equipment is used as a true value and is compared with the performance index obtained by sensing the vehicle-mounted laser radar sensing algorithm, so that the test under a real scene is realized, manual marking, simulation scenes and complex algorithms are not needed, the labor cost and the time cost are not high, and the aim of realizing the test verification in a real environment on the basis of reducing the labor cost and the time cost is fulfilled.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present application may be executed in parallel, sequentially, or in different orders, and the present application is not limited in this respect as long as the desired results of the technical solutions disclosed in the present application can be achieved.
The above-described embodiments should not be construed as limiting the scope of the present application. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. A test method for a vehicle-mounted laser radar perception algorithm is characterized by comprising the following steps:
acquiring first running information of at least one target vehicle sensed by a vehicle to be detected based on a laser radar sensing algorithm, wherein the vehicle to be detected and each target vehicle are provided with combined navigation equipment;
respectively acquiring positioning results of the vehicle to be detected and each target vehicle determined based on the integrated navigation equipment;
calculating second running information of each target vehicle according to the positioning results of the vehicle to be detected and each target vehicle;
and comparing the first running information with the second running information to obtain a test result of the laser radar perception algorithm on the vehicle to be tested on the target vehicle.
2. The method according to claim 1, wherein the separately obtaining the positioning results of the vehicle under test and each target vehicle determined based on the integrated navigation device comprises:
respectively acquiring positioning original data of the vehicle to be detected and each target vehicle determined based on the integrated navigation equipment;
and post-processing the positioning original data to obtain the positioning result.
3. The method of claim 1, wherein the first travel information and the second travel information each comprise at least one of: vehicle speed, distance, location, and detection frame.
4. The method according to claim 1, wherein the calculating of the second driving information of each target vehicle according to the positioning result of the vehicle under test and each target vehicle comprises:
comparing the positioning results of the vehicle to be tested and each target vehicle, and determining the distance between each target vehicle and the vehicle to be tested;
determining the speed of each target vehicle by utilizing a multi-frame tracking analysis technology, and determining a detection frame of each target vehicle by combining the body information of each target vehicle;
and taking the distance, the vehicle speed and the detection frame as the second driving information.
5. The method of claim 1, further comprising:
acquiring a lane line mapping result measured and drawn by using combined navigation equipment of any target vehicle;
acquiring a lane line sensing result sensed based on a laser radar sensing algorithm in the process that the vehicle to be detected runs on the lane to which the lane line belongs;
and comparing the lane line mapping result with a lane line sensing result to obtain a test result of the laser radar sensing algorithm on the vehicle to be tested on the lane line.
6. The method of claim 5, wherein the obtaining of the lane line mapping result measured and drawn by using the integrated navigation device of any target vehicle comprises:
measuring a lane line by using a carrier phase differential technology and differential inertial navigation to obtain point coordinates on the lane line;
and converting the point coordinates by a coordinate system to obtain coordinate information and a curve equation of the lane line in a global coordinate system, and taking the coordinate information and the curve equation as the lane line mapping result.
7. A testing device for a vehicle-mounted laser radar perception algorithm, characterized by comprising:
the system comprises a first running information acquisition module, a second running information acquisition module and a navigation module, wherein the first running information acquisition module is used for acquiring first running information of at least one target vehicle sensed by a vehicle to be detected based on a laser radar sensing algorithm, and the vehicle to be detected and each target vehicle are provided with combined navigation equipment;
the positioning result acquisition module is used for respectively acquiring the positioning results of the vehicle to be detected and each target vehicle determined based on the integrated navigation equipment;
the second running information acquisition module is used for calculating second running information of each target vehicle according to the positioning results of the vehicle to be detected and each target vehicle;
and the test result acquisition module is used for comparing the first running information with the second running information to obtain a test result of the laser radar perception algorithm on the vehicle to be tested on the target vehicle.
8. The apparatus according to claim 7, wherein the positioning result obtaining module is specifically configured to:
respectively acquiring positioning original data of the vehicle to be detected and each target vehicle determined based on the integrated navigation equipment;
and post-processing the positioning original data to obtain the positioning result.
9. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein:
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a method of testing an on-board lidar sensing algorithm of any of claims 1-6.
10. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method of testing the on-board lidar sensing algorithm of any of claims 1-6.
CN202011008488.4A 2020-09-23 2020-09-23 Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm Pending CN112147632A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011008488.4A CN112147632A (en) 2020-09-23 2020-09-23 Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011008488.4A CN112147632A (en) 2020-09-23 2020-09-23 Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm

Publications (1)

Publication Number Publication Date
CN112147632A true CN112147632A (en) 2020-12-29

Family

ID=73897814

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011008488.4A Pending CN112147632A (en) 2020-09-23 2020-09-23 Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm

Country Status (1)

Country Link
CN (1) CN112147632A (en)

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6392586B1 (en) * 1997-10-21 2002-05-21 Celsiustech Electronics Ab Car radar testing
CN105319540A (en) * 2015-04-03 2016-02-10 上海通用汽车有限公司 Method and device for automatic test of vehicle reverse sensor
CN107341453A (en) * 2017-06-20 2017-11-10 北京建筑大学 A kind of lane line extracting method and device
CN108196260A (en) * 2017-12-13 2018-06-22 北京汽车集团有限公司 The test method and device of automatic driving vehicle multi-sensor fusion system
US20200041623A1 (en) * 2018-02-05 2020-02-06 Centre Interdisciplinaire De Developpement En Cartographie Des Oceans (Cidco) METHOD AND APPARATUS FOR AUTOMATIC CALIBRATION OF MOBILE LiDAR SYSTEMS
CN109102702A (en) * 2018-08-24 2018-12-28 南京理工大学 Vehicle speed measuring method based on video encoder server and Radar Signal Fusion
CN109556615A (en) * 2018-10-10 2019-04-02 吉林大学 The driving map generation method of Multi-sensor Fusion cognition based on automatic Pilot
CN109696691A (en) * 2018-12-26 2019-04-30 北醒(北京)光子科技有限公司 A kind of laser radar and its method measured, storage medium
CN110081880A (en) * 2019-04-12 2019-08-02 同济大学 A kind of sweeper local positioning system and method merging vision, wheel speed and inertial navigation
CN110673115A (en) * 2019-09-25 2020-01-10 杭州飞步科技有限公司 Combined calibration method, device, equipment and medium for radar and integrated navigation system
CN110633800A (en) * 2019-10-18 2019-12-31 北京邮电大学 Lane position determination method, apparatus, and storage medium based on autonomous vehicle
CN111077534A (en) * 2019-12-30 2020-04-28 清华大学苏州汽车研究院(吴江) Structural device based on fusion of monocular camera and laser radar
CN111427348A (en) * 2020-03-24 2020-07-17 江苏徐工工程机械研究院有限公司 Automatic drive mining dump truck environmental perception system and mining dump truck

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112799091A (en) * 2021-01-28 2021-05-14 知行汽车科技(苏州)有限公司 Algorithm evaluation method, device and storage medium
CN113155173A (en) * 2021-06-02 2021-07-23 福瑞泰克智能系统有限公司 Perception performance evaluation method and device, electronic device and storage medium
CN113155173B (en) * 2021-06-02 2022-08-30 福瑞泰克智能系统有限公司 Perception performance evaluation method and device, electronic device and storage medium
CN113419233A (en) * 2021-06-18 2021-09-21 阿波罗智能技术(北京)有限公司 Method, device and equipment for testing perception effect
CN115311761A (en) * 2022-07-15 2022-11-08 襄阳达安汽车检测中心有限公司 Non-real-time vehicle-mounted sensing system evaluation method and related equipment
CN115311761B (en) * 2022-07-15 2023-11-03 襄阳达安汽车检测中心有限公司 Non-real-time vehicle-mounted perception system evaluation method and related equipment
CN115825901A (en) * 2023-02-21 2023-03-21 南京楚航科技有限公司 Vehicle-mounted sensor perception performance evaluation truth value system

Similar Documents

Publication Publication Date Title
EP3505869B1 (en) Method, apparatus, and computer readable storage medium for updating electronic map
CN112147632A (en) Method, device, equipment and medium for testing vehicle-mounted laser radar perception algorithm
CN110595494B (en) Map error determination method and device
CN111998860B (en) Automatic driving positioning data verification method and device, electronic equipment and storage medium
CN110556012B (en) Lane positioning method and vehicle positioning system
CN109270545B (en) Positioning true value verification method, device, equipment and storage medium
CN111220154A (en) Vehicle positioning method, device, equipment and medium
KR20190082070A (en) Methods and apparatuses for map generation and moving entity localization
CN112132829A (en) Vehicle information detection method and device, electronic equipment and storage medium
CN111959495B (en) Vehicle control method and device and vehicle
CN110979346A (en) Method, device and equipment for determining lane where vehicle is located
JP2021119507A (en) Traffic lane determination method, traffic lane positioning accuracy evaluation method, traffic lane determination apparatus, traffic lane positioning accuracy evaluation apparatus, electronic device, computer readable storage medium, and program
CN111324945B (en) Sensor scheme determining method, device, equipment and storage medium
CN111310840B (en) Data fusion processing method, device, equipment and storage medium
CN111784835B (en) Drawing method, drawing device, electronic equipment and readable storage medium
CN111753765A (en) Detection method, device and equipment of sensing equipment and storage medium
CN111324115A (en) Obstacle position detection fusion method and device, electronic equipment and storage medium
CN111402326B (en) Obstacle detection method, obstacle detection device, unmanned vehicle and storage medium
CN112344855B (en) Obstacle detection method and device, storage medium and drive test equipment
CN111721281B (en) Position identification method and device and electronic equipment
CN111784837B (en) High-precision map generation method, apparatus, device, storage medium, and program product
CN110794844A (en) Automatic driving method, device, electronic equipment and readable storage medium
CN113091757A (en) Map generation method and device
CN114186007A (en) High-precision map generation method and device, electronic equipment and storage medium
CN111783611B (en) Unmanned vehicle positioning method and device, unmanned vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20201229)