CN114792469A - Method and device for testing sensing system and testing equipment - Google Patents

Method and device for testing sensing system and testing equipment

Info

Publication number
CN114792469A
CN114792469A (application CN202210354970.6A)
Authority
CN
China
Prior art keywords
information
target
sensing
detected
perception
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210354970.6A
Other languages
Chinese (zh)
Other versions
CN114792469B (en)
Inventor
张杰
闫丹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Datang Gaohong Zhilian Technology Chongqing Co ltd
Original Assignee
Datang Gaohong Zhilian Technology Chongqing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Datang Gaohong Zhilian Technology Chongqing Co ltd filed Critical Datang Gaohong Zhilian Technology Chongqing Co ltd
Priority to CN202210354970.6A priority Critical patent/CN114792469B/en
Publication of CN114792469A publication Critical patent/CN114792469A/en
Application granted granted Critical
Publication of CN114792469B publication Critical patent/CN114792469B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108 Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/0125 Traffic data processing
    • G08G1/042 Detecting movement of traffic to be counted or controlled using inductive or magnetic detectors

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a method and an apparatus for testing a perception system, and a testing device, wherein the method comprises the following steps: obtaining test data, the test data comprising: sensing data and corresponding truth value data thereof, wherein the sensing data are acquired by the sensing equipment of the sensing system to be tested through roadside sensing; determining target information to be detected of the sensing system to be tested according to the sensing data and a sensing algorithm corresponding to the sensing system to be tested; determining target perception information of the sensing system to be tested from the target information to be detected according to the truth value data; and determining the test result information of the sensing system to be tested according to the target perception information. The scheme of the invention solves the problems that the way sensing equipment is tested in the prior art consumes a large amount of labor and time, so that the test cost is high and the overall test period is long.

Description

Test method, device and test equipment for sensing system
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a method, an apparatus, and a device for testing a sensing system.
Background
Vehicle-road cooperation is a technical approach in which roads and vehicles are equipped with sufficient perception and computing capability and connected through low-latency, high-reliability communication to realize cooperative vehicle-road applications, which shows the importance of perception and computing capability in vehicle-road cooperation. Therefore, the perception capability of perception equipment needs to be tested before the equipment is put into use, so as to ensure the accuracy and reliability of its data output.
At present, the test method adopted by research and development units is mainly to drive vehicles manually along the driving routes of different test scenarios in a test field, to judge the data indexes manually by analyzing the logs output by each sensor algorithm, and finally to have a tester manually match the perception results of the roadside sensors against the truth values and manually produce a test report.
This test mode has no capability for large-scale testing, introduces many errors into the test process, and consumes a large amount of labor and time, so the test cost is high and the overall test period is long.
Disclosure of Invention
The invention provides a method and an apparatus for testing a perception system, and a testing device, which solve the problems in the prior art that testing perception equipment consumes a large amount of labor and time, resulting in high test cost and a long overall test period.
In a first aspect, an embodiment of the present invention provides a method for testing a sensing system, including:
obtaining test data, the test data comprising: sensing data and corresponding truth value data thereof, wherein the sensing data is acquired by sensing equipment of a sensing system to be tested through roadside sensing;
determining target information to be detected of the perception system to be detected according to the perception data and a perception algorithm corresponding to the perception system to be detected;
determining target perception information of the perception system to be detected from the target information to be detected according to the truth value data;
and determining the test result information of the perception system to be tested according to the target perception information.
Optionally, the acquiring test data includes:
driving a test vehicle carrying first equipment according to a preset path, acquiring the truth value data through the first equipment, and,
in the process that the test vehicle runs along the preset path, performing roadside sensing through the sensing equipment of the sensing system to be tested to obtain the sensing data;
wherein the first device comprises a high precision positioning device.
Optionally, in a case that the sensing device is a camera, the determining, according to the sensing data and the sensing algorithm corresponding to the sensing system to be detected, target information to be detected of the sensing system to be detected includes:
acquiring a conversion matrix corresponding to the camera according to the perception data;
and determining the information of the target to be detected of the sensing system to be detected according to the sensing data, the conversion matrix and the sensing algorithm corresponding to the sensing system to be detected.
Optionally, when the sensing device includes a camera and a second device, and the second device is at least one of a laser radar and a millimeter wave radar, determining, according to the sensing data and a sensing algorithm corresponding to the sensing system to be detected, target information to be detected of the sensing system to be detected includes:
acquiring a conversion matrix corresponding to the camera according to the perception data; the sensing data comprises first sensing data corresponding to the camera and second sensing data corresponding to the second equipment;
determining first target information to be detected according to the first sensing data, the conversion matrix and a sensing algorithm corresponding to the camera; determining second target information to be detected according to the second sensing data and a sensing algorithm corresponding to the second equipment;
and determining the information of the target to be detected of the sensing system to be detected according to the information of the first target to be detected and the information of the second target to be detected.
Optionally, the determining, according to the truth value data, the target sensing information of the sensing system to be tested from the target information to be tested includes:
acquiring first truth value information in the truth value data frame by frame according to the output frequency of first equipment for acquiring the truth value data;
determining first target perception information corresponding to each first truth value information according to the target information to be detected and the first truth value information;
and determining the first target perception information as the target perception information of the perception system to be detected.
Optionally, the target information to be measured includes: a timestamp, coordinate information, and a target ID; the first truth information includes: timestamp and coordinate information;
determining first target perception information corresponding to each first truth value information according to the to-be-detected target information and the first truth value information, including:
acquiring first target information to be detected from the target information to be detected according to the timestamp of the first true value information, wherein the timestamp of the first target information to be detected is the same as the timestamp of the first true value information;
judging whether ID information which is the same as the target ID of the first target information to be tested exists in a first identification list, wherein the first identification list is generated in the testing process;
determining, when the judgment result is that no such ID information exists, distance relationship information between the coordinate information of the first truth value information and the coordinate information of the first target information to be detected according to the first truth value information and the first target information to be detected, wherein the distance relationship information comprises: lateral and linear distances;
determining first target perception information corresponding to the first true value information according to the distance relation information and preconfigured threshold information;
wherein the threshold information is set according to the type of the sensing device, and the threshold information comprises: a preset distance interval, and a lateral distance threshold and a linear distance threshold corresponding to the distance interval.
Optionally, in a case that the first true value information is a first frame in the true value data, the determining, according to the distance relationship information and preconfigured threshold information, first target sensing information corresponding to the first true value information includes:
determining the first target information to be detected which meets a first preset condition and has the minimum straight line distance as the first target perception information according to the distance relation information and preconfigured threshold information;
wherein the first preset condition is as follows: the lateral distance is less than or equal to the lateral distance threshold, and the linear distance is less than or equal to the linear distance threshold.
Optionally, in a case that the first true value information is not a first frame in the true value data, the determining, according to the distance relationship information and preconfigured threshold information, first target sensing information corresponding to the first true value information includes:
determining whether a target ID of the first target information to be detected is the same as a first target ID of the first target perception information determined when the first true value information is a first frame in the true value data;
and determining first target perception information corresponding to the first true value information according to a judgment result, the distance relation information and pre-configured threshold information.
Optionally, the determining, according to the determination result, the distance relationship information, and preconfigured threshold information, first target perception information corresponding to the first true value information includes at least one of the following:
determining the first target information to be detected meeting a first preset condition as the first target perception information under the condition that the judgment results are the same, wherein the first preset condition is as follows: the lateral distance is less than or equal to the lateral distance threshold, and the linear distance is less than the linear distance threshold;
determining the first target information to be detected which meets a second preset condition and has the minimum straight-line distance as the first target perception information under the condition that the judgment results are different, wherein the second preset condition is as follows: the transverse distance is smaller than or equal to the transverse distance threshold, the linear distance is smaller than or equal to the linear distance threshold, and the absolute value of the time difference between the timestamp of the first to-be-measured target information and the timestamp of the first true value information is smaller than a preset time.
Optionally, after determining the first target-to-be-measured information meeting a first preset condition as the first target perception information, the method further includes:
and storing the target ID in the first target information to be detected which does not meet the first preset condition into the first identification list.
Optionally, the test result information includes at least one of: target positioning precision, target detection recall rate, accuracy rate and tracking success rate.
In a second aspect, an embodiment of the present invention provides a testing apparatus for a sensing system, including:
a data acquisition module for acquiring test data, the test data comprising: sensing data and corresponding truth value data thereof, wherein the sensing data is acquired by sensing equipment of a sensing system to be tested through roadside sensing;
the first processing module is used for determining the information of the target to be detected of the perception system to be detected according to the perception data and the perception algorithm corresponding to the perception system to be detected;
the second processing module is used for determining target perception information of the perception system to be detected from the target information to be detected according to the truth value data;
and the third processing module is used for determining the test result information of the perception system to be tested according to the target perception information.
In a third aspect, an embodiment of the present invention provides a test apparatus, including: a transceiver, a memory, a processor and a computer program stored on the memory and executable on the processor, the processor when executing the computer program implementing the steps of the method of testing of a perception system as described in the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, which computer program, when executed by a processor, implements the steps of the method for testing a perception system as described in the first aspect.
The technical scheme of the invention has the beneficial effects that:
according to the embodiment of the invention, the corresponding perception algorithm can be automatically called according to the test data to obtain the information of the target to be tested of the perception system to be tested, and then the target perception information is analyzed and obtained according to the information of the target to be tested and the truth value data, so that the automatic test of the perception capability of the perception system is realized, the test efficiency is improved, the labor cost and the time cost required by the test are saved, the errors caused by artificial analysis are reduced, and the test result information is more objective and credible.
Drawings
FIG. 1 is a flow chart of a method for testing a sensing system according to an embodiment of the present invention;
FIG. 2 illustrates a perceptual matching process of an embodiment of the present invention;
FIG. 3 illustrates a threshold acquisition process according to an embodiment of the present invention;
FIG. 4 is a block diagram showing a test apparatus for a sensing system according to an embodiment of the present invention;
fig. 5 shows a block diagram of a test apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantages of the present invention more apparent, the following detailed description is given with reference to the accompanying drawings and specific embodiments. In the following description, specific details such as specific configurations and components are provided only to help the full understanding of the embodiments of the present invention. It will therefore be apparent to those skilled in the art that various changes and modifications can be made in the embodiments described herein without departing from the scope and spirit of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In various embodiments of the present invention, it should be understood that the sequence numbers of the following processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Additionally, the terms "system" and "network" are often used interchangeably herein.
In the embodiments provided herein, it should be understood that "B corresponding to a" means that B is associated with a from which B can be determined. It should also be understood that determining B from a does not mean determining B from a alone, but may also be determined from a and/or other information.
In the embodiment of the present invention, the access network may be an access network including a Macro Base Station (Macro Base Station), a micro Base Station (Pico Base Station), a Node B (3G mobile Station), an enhanced Base Station (eNB), a Home enhanced Base Station (Femto eNB or Home eNode B or Home eNB or HeNB), a relay Station, an access point, a Remote Radio Unit (RRU), a Remote Radio Head (RRH), and the like. The user terminal may be a mobile phone (or handset), or other device capable of sending or receiving wireless signals, including user Equipment, a Personal Digital Assistant (PDA), a wireless modem, a wireless communicator, a handheld device, a laptop computer, a cordless phone, a Wireless Local Loop (WLL) station, a CPE (Customer Premise Equipment) or a mobile smart hotspot capable of converting mobile signals into WiFi signals, a smart appliance, or other devices capable of autonomously communicating with a mobile communication network without human operation, and so on.
Specifically, embodiments of the present invention provide a method and an apparatus for testing a sensing system, and a testing device, so as to solve the problems of high testing cost and long overall testing period caused by the fact that a large amount of labor cost and time cost are consumed in a testing mode of a sensing device in the prior art.
First embodiment
As shown in fig. 1, an embodiment of the present invention provides a method for testing a sensing system, which specifically includes the following steps:
step 11: obtaining test data, the test data comprising: the sensing data are acquired by sensing equipment of the sensing system to be tested through roadside sensing.
It should be noted that the test data may be obtained by data import or may be obtained by uploading by the sensing system to be tested.
As an alternative embodiment, step 11 may specifically include: driving a test vehicle carrying the first equipment along a preset path, acquiring the truth value data through the first equipment, and, while the test vehicle travels along the preset path, performing roadside sensing through the sensing equipment of the sensing system to be tested to obtain the sensing data; wherein the first equipment comprises high-precision positioning equipment.
It should be noted that the first device may be a device that uses a Real Time Kinematic (RTK) technique. For example, in the testing process, high-precision inertial navigation equipment is used as true value data acquisition equipment, and the positioning precision can reach the centimeter level. In the specific acquisition process, a target vehicle (namely a test vehicle) or a pedestrian can carry high-precision inertial navigation equipment to complete single-target and multi-target tests according to a planned route (namely a preset route) of a test case, wherein data output by the high-precision inertial navigation is truth value data, and the test vehicle can be one or more vehicles. Meanwhile, roadside sensing is performed through sensing equipment (such as roadside cameras, millimeter wave radar equipment, laser radar equipment and the like), and offline data (namely sensing data) are collected and stored in real time. Here, the offline data may include video, millimeter wave radar data, laser radar point cloud data, and the like, and it should be noted that the data format of such data needs to conform to a prescribed interface form.
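For illustration only, the following minimal sketch shows one way such test data might be organized in memory; the record types and field names are assumptions made for this sketch and are not prescribed by the embodiment.

```python
from dataclasses import dataclass

@dataclass
class TruthRecord:
    """One truth value sample output by the high-precision positioning equipment."""
    timestamp: float  # acquisition time, in seconds
    x: float          # position in a real-world plane coordinate system
    y: float

@dataclass
class PerceptionRecord:
    """One target output by the perception system under test."""
    timestamp: float   # sensing time, in seconds
    target_id: str     # tracking ID assigned by the perception algorithm
    x: float           # position converted to the same real-world frame
    y: float
    target_type: str = "vehicle"  # e.g. vehicle / pedestrian
```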
Step 12: and determining the information of the target to be detected of the sensing system to be detected according to the sensing data and the sensing algorithm corresponding to the sensing system to be detected.
In this step, according to the sensing data, the sensing device used by the sensing system to be tested can be determined, so as to determine which sensing algorithm is used to determine the information of the target to be tested.
Step 13: and determining target perception information of the perception system to be detected from the target information to be detected according to the truth value data.
The sensing points which are successfully matched with the truth value data, namely the target sensing information of the sensing system to be tested, can be obtained in the step.
As an alternative embodiment, step 13 may specifically include:
step 1301, acquiring first truth value information in the truth value data frame by frame according to the output frequency of the first device for acquiring the truth value data.
For example, suppose the output frequency of the first device is 10 Hz; as shown in fig. 2, the perception targets are matched frame by frame at this frequency. Specifically, the first truth value information in the truth value data is obtained frame by frame according to the output frequency and matched against the perception targets, where the truth value data can be represented as the sequence {R_1, R_2, R_3, …, R_n}, and the target position corresponding to the t-th time can be represented as R_t(x′_t, y′_t).
Step 1302, determining first target sensing information corresponding to each first true value information according to the target information to be detected and the first true value information;
and 1303, determining the first target perception information as the target perception information of the perception system to be detected.
Step 14: and determining the test result information of the perception system to be tested according to the target perception information.
In this step, after determining the test result information of the sensing system to be tested, the test result information may also be output, for example, the test result information is displayed to the user through a screen. Here, the test result information may include at least one of: target positioning precision, target detection recall rate, accuracy rate and tracking success rate.
In the embodiment, according to the test data, the corresponding perception algorithm can be automatically called to obtain the information of the target to be tested of the perception system to be tested, and then the target perception information is analyzed and obtained according to the information of the target to be tested and the truth data, so that the automatic test of the perception capability of the perception system is realized, the test efficiency is improved, the labor cost and the time cost required by the test are saved, the errors caused by artificial analysis are reduced, and the test result information is more objective and credible.
In the embodiment of the present invention, according to the difference of the sensing devices, the step 12 can be divided into the following two cases:
the first condition is as follows: as an alternative embodiment, in the case that the sensing device is a camera, step 12 includes: acquiring a conversion matrix corresponding to the camera according to the perception data; and determining the information of the target to be detected of the sensing system to be detected according to the sensing data, the conversion matrix and the sensing algorithm corresponding to the sensing system to be detected.
In this embodiment, according to the sensing data (for example, an offline video of each test scene), an RT matrix (that is, a conversion matrix) of a camera corresponding to the offline data may be obtained, and then a visual algorithm interface is called to obtain a sensing result (that is, target information to be detected) corresponding to the sensing data, such as parameters such as a timestamp, coordinates, a target ID, and a type.
It should be noted that, in the prior art, conversion among a plurality of coordinate systems needs to be realized in space matching, the process is complex, the calibration difficulty is high, and errors are prone to occur when the positioning accuracy is evaluated. When the embodiment of the invention tests the vision algorithms corresponding to different cameras, the same conversion matrix can be adopted to complete the conversion of the image to the real coordinate system, thereby reducing the error caused by the conversion matrix and reflecting the difference between different vision algorithms more truly.
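As an illustration of this idea, the sketch below applies a single shared 3x3 conversion matrix (here assumed to be a planar homography) to map pixel coordinates into the real coordinate system; the use of a homography and the function name are assumptions of this sketch, not the specific calibration of the embodiment.

```python
import numpy as np

def pixel_to_world(H: np.ndarray, u: float, v: float) -> tuple[float, float]:
    """Map a pixel (u, v) to plane coordinates with a shared 3x3 matrix H."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

Because every vision algorithm under test is converted with the same matrix, the conversion error is identical across algorithms, so the comparison reflects algorithmic differences rather than calibration differences.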
Case two: as an optional embodiment, in the case that the sensing device includes a camera and a second device, and the second device is at least one of a lidar or a millimeter-wave radar, step 12 includes: acquiring a conversion matrix corresponding to the camera according to the perception data; the sensing data comprise first sensing data corresponding to the camera and second sensing data corresponding to the second equipment; determining first target information to be detected according to the first sensing data, the conversion matrix and a sensing algorithm corresponding to the camera; determining second target information to be detected according to the second sensing data and a sensing algorithm corresponding to the second equipment; and determining the information of the target to be detected of the sensing system to be detected according to the information of the first target to be detected and the information of the second target to be detected.
For example, if the second device is a millimeter wave radar, the specific process of determining the target information to be detected of the sensing system to be detected is as follows: calling a visual algorithm interface, and acquiring a sensing result corresponding to an offline video according to the offline video (namely, first sensing data) of each test scene and an RT matrix (namely, a conversion matrix corresponding to a camera) corresponding to the offline video; and calling a radar vision fusion algorithm interface, inputting a vision perception result (namely first target information to be detected) and millimeter wave radar offline data (namely second perception data) in the same time period, and acquiring a radar vision fusion perception result (namely target information to be detected of a perception system to be detected), such as parameters of a timestamp, coordinates, a target ID, a type and the like.
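A minimal sketch of this two-stage call sequence is given below; `vision_algorithm` and `fusion_algorithm` stand in for the perception system's own interfaces, and their signatures are assumptions made for illustration.

```python
def run_radar_vision_perception(video_frames, radar_frames, H,
                                vision_algorithm, fusion_algorithm):
    """Stage 1: vision perception on the offline video using the shared matrix H.
    Stage 2: radar-vision fusion over the vision result and the radar data of
    the same time period. The returned targets carry timestamp, coordinates,
    target ID and type."""
    vision_targets = vision_algorithm(video_frames, H)   # first target information to be detected
    fused_targets = fusion_algorithm(vision_targets, radar_frames)
    return fused_targets
```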
In this embodiment, by calling an interface of the sensing system to be tested, a large number of test cases (that is, obtaining test data to determine target sensing information of the sensing system to be tested) can be automatically executed, and output information of the sensing system to be tested is collected, so that the sensing capability of the sensing device of the sensing system to be tested is evaluated in combination with preset evaluation criteria (for example, rules for calculating target positioning accuracy, target detection recall rate, accuracy rate, tracking success rate, and the like).
The invention can evaluate the perception capability of various sensors (i.e., perception equipment), and supports automated testing of roadside perception systems such as pure vision algorithms, millimeter-wave radar and vision fusion algorithms, millimeter-wave radar target detection, and lidar target detection. Single-point tests and multi-point fusion tests (such as vision + millimeter wave, vision + lidar, vision + millimeter wave + lidar) of vision, millimeter-wave radar and lidar are supported; the test data are rich and the test dimensions are comprehensive, covering single-target and multi-target detection as well as single-point, multi-point and global-point detection, so the application range is wide and the practicability is high.
As shown in fig. 2, optionally, the target information to be measured includes: a timestamp, coordinate information, and a target ID; the first truth information includes: timestamp and coordinate information; determining first target perception information corresponding to each first truth value information according to the to-be-detected target information and the first truth value information, including:
step one, according to the timestamp of the first true value information, first target information to be detected is obtained from the target information to be detected, and the timestamp of the first target information to be detected is the same as the timestamp of the first true value information.
In this step, the truth value data in the test data may be read, and the timestamp and coordinate-system information in the truth value data may be extracted. According to the timestamp of the first truth value information, all targets to be detected with the same timestamp (namely, the first target information to be detected) can be matched. Assuming the number of targets in each perception frame is M_t, all targets in each frame (i.e., the first target information to be detected) can be represented as {P_t^1, P_t^2, …, P_t^(M_t)}, and the position of the corresponding i-th target at time t can be expressed as P_t^i(x_t^i, y_t^i).
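The retrieval of all targets sharing a truth frame's timestamp (step one) can be sketched as follows; exact timestamp equality is assumed here, as in the embodiment described above.

```python
from collections import defaultdict

def group_by_timestamp(perception_records):
    """Index perception output so that all targets with a given timestamp
    can be looked up when a truth frame with that timestamp is processed."""
    by_ts = defaultdict(list)
    for rec in perception_records:
        by_ts[rec.timestamp].append(rec)
    return by_ts
```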
And step two, judging whether the ID information which is the same as the target ID of the first target information to be tested exists in a first identification list, wherein the first identification list is generated in the testing process.
Here, the first identification list may be understood as a "blacklist" generated during the test, i.e. ID information that has been excluded by the test.
In the embodiment, the ID attribute is adopted for screening in the process of target matching, and the noise is removed by using the timestamp in the ID attribute, so that the false detection condition is greatly reduced.
Step three, when the judgment result is that no such ID information exists, determining, according to the first truth value information and the first target information to be detected, distance relationship information between the coordinate information of the first truth value information and the coordinate information of the first target information to be detected, wherein the distance relationship information comprises: the lateral distance and the linear distance.
That is, the first target information to be detected that is not in the blacklist is retained, and the first target perception information corresponding to the first truth value information is screened from it. In this process, the lateral and longitudinal deviations (i.e., the lateral distance and the longitudinal distance) and the linear distance between the i-th target to be detected (i.e., the first target information to be detected) and the truth value target (i.e., the first truth value information) at the t-th time need to be calculated:
The linear distance may be calculated by a first formula, which may be expressed as:
d_t^i = √( (x_t^i − x′_t)² + (y_t^i − y′_t)² )
where d_t^i denotes the linear (straight-line) distance, the position of the i-th target at the t-th time in the first target information to be detected is P_t^i(x_t^i, y_t^i), and the target position corresponding to the t-th time in the truth value data is R_t(x′_t, y′_t).
The lateral distance may be calculated by a second formula, which may be expressed as:
d_lat,t^i = | (x_t^i − x′_t)·cos θ − (y_t^i − y′_t)·sin θ |
where d_lat,t^i denotes the lateral distance and θ denotes the included angle between the lane direction and due north.
The longitudinal distance may be calculated by a third formula, which may be expressed as:
d_lon,t^i = | (x_t^i − x′_t)·sin θ + (y_t^i − y′_t)·cos θ |
where d_lon,t^i denotes the longitudinal distance; the three quantities satisfy (d_t^i)² = (d_lat,t^i)² + (d_lon,t^i)².
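A sketch of these three distance calculations is given below; the east-north axis convention used for the projection along and across the lane direction is an assumption of this sketch.

```python
import math

def distances(x, y, x_true, y_true, theta):
    """Return (linear, lateral, longitudinal) distances between a perception
    point (x, y) and a truth point (x_true, y_true); theta is the angle
    between the lane direction and due north, in radians."""
    dx, dy = x - x_true, y - y_true
    linear = math.hypot(dx, dy)
    longitudinal = abs(dx * math.sin(theta) + dy * math.cos(theta))  # along the lane
    lateral = abs(dx * math.cos(theta) - dy * math.sin(theta))       # across the lane
    return linear, lateral, longitudinal
```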
Step four, determining the first target perception information corresponding to the first truth value information according to the distance relationship information and preconfigured threshold information, where the threshold information is set according to the type of the sensing device and includes: a preset distance interval and the lateral distance threshold and linear distance threshold corresponding to the distance interval. All matched sensing points are then saved into the sequence P (i.e., the first target perception information), together with their linear, lateral and longitudinal distances.
It should be noted that, before the automated test is performed, the threshold information may be written into the configuration file, and when the automated test is used, the configuration file is read, and the corresponding transverse distance threshold and linear distance threshold are obtained according to the device type and the distance interval attribute (i.e., the preset distance interval). Different threshold information can be configured according to the characteristics and the sensing range of different sensing equipment types (namely the sensing performance of the sensing equipment in different regions), for example, different transverse distance thresholds and linear distance thresholds are set by regions and distances, so that the noise point target during matching is reduced to the maximum extent, the blind area error of the sensing algorithm is reduced, and the phenomena of mismatching and missing matching of the sensing points are reduced.
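One possible shape for such a configuration file and its lookup is sketched below; the dictionary layout and the threshold values shown are placeholders chosen for illustration, not values prescribed by the embodiment.

```python
# Thresholds in metres, keyed by device type and preset distance interval.
THRESHOLDS = {
    "camera": {
        (0, 50): {"lateral": 1.0, "linear": 2.0},
        (50, 100): {"lateral": 1.5, "linear": 3.0},
        (100, 150): {"lateral": 2.0, "linear": 4.0},
        (150, 200): {"lateral": 2.5, "linear": 5.0},
    },
}

def get_thresholds(device_type, distance_from_start):
    """Return the (lateral, linear) thresholds of the interval containing the
    truth target's distance from the lane starting point."""
    for (low, high), values in THRESHOLDS[device_type].items():
        if low <= distance_from_start < high:
            return values["lateral"], values["linear"]
    raise ValueError("distance outside the configured intervals")
```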
Optionally, in a case that the first truth value information is the first frame in the truth value data, the determining, according to the distance relationship information and preconfigured threshold information, the first target perception information corresponding to the first truth value information includes: determining, according to the distance relationship information and the preconfigured threshold information, the first target information to be detected that meets a first preset condition and has the minimum linear distance as the first target perception information; wherein the first preset condition is as follows: the lateral distance is less than or equal to the lateral distance threshold, and the linear distance is less than or equal to the linear distance threshold.
In this embodiment, when the first target perception information corresponding to the first truth value information is determined and the first truth value information is the first frame in the truth value data, the sensing point that satisfies the thresholds and is closest to the truth value may be obtained directly, that is, the sensing point satisfying:
d_lat,t^i ≤ value_m and d_t^i ≤ value_dm
(where value_m and value_dm are the lateral and linear distance thresholds of the distance interval in which R_t lies, read from the configuration described below). The matched sensing point is then stored into the sequence P = {P_1, P_2, P_3, …, P_n}, and the ID of the matched sensing point, i.e., the first target ID, is recorded.
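A minimal sketch of this first-frame matching step follows; `candidates` are the perception targets sharing the truth frame's timestamp (outside the blacklist), and `distances_fn` is assumed to return the (linear, lateral, longitudinal) distances of a candidate to the truth point.

```python
def match_first_frame(truth, candidates, lateral_thr, linear_thr, distances_fn):
    """Keep candidates within both thresholds and return the one with the
    minimum linear distance (or None); its ID becomes the first target ID."""
    best, best_linear = None, float("inf")
    for c in candidates:
        linear, lateral, _ = distances_fn(c, truth)
        if lateral <= lateral_thr and linear <= linear_thr and linear < best_linear:
            best, best_linear = c, linear
    return best
```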
As shown in fig. 3, the configuration file is read, and the lateral distance thresholds are obtained according to the device type and the distance-interval attribute, represented as a sequence {value_1, value_2, value_3, …, value_m}; the linear distance thresholds are obtained according to the distance interval, represented as a sequence {value_d1, value_d2, value_d3, …, value_dm}. Here, value_m represents the lateral distance threshold parameter of the interval in which R_t is located, value_dm represents the linear distance threshold parameter of the interval in which R_t is located, and m represents the interval in which the current frame is located.
Optionally, in a case that the first true value information is not a first frame in the true value data, the determining, according to the distance relationship information and preconfigured threshold information, first target sensing information corresponding to the first true value information includes:
step 201: and judging whether the target ID of the first target information to be detected is the same as a first target ID, where the first target ID is the target ID of the first target perception information determined when the first true value information is the first frame in the true value data.
That is to say, in the process of obtaining the first truth value information in the truth value data frame by frame for matching, if the first frame has been matched successfully, then starting from the second frame it is first judged whether the target ID of the first target information to be detected (i.e., the ID of P_t^i) is the ID matched successfully in the first frame (i.e., the first target ID); the first truth value information at the current time t (denoted R_t) is obtained, and the linear distance threshold (value_dm) and the lateral distance threshold (value_m) are read according to the distance interval, so as to perform further screening according to step 202. For example, as shown in fig. 3, the distance range may be divided into four distance intervals, namely 0-50 m, 50-100 m, 100-150 m and 150-200 m, and different distance intervals may correspond to different lateral distance thresholds and linear distance thresholds.
Step 202: and determining first target perception information corresponding to the first true value information according to the judgment result, the distance relation information and preconfigured threshold information.
As an alternative embodiment, step 202 may specifically include at least one of the following:
firstly, under the condition that the judgment results are the same, determining the first target information to be detected meeting a first preset condition as the first target perception information, where the first preset condition is: the lateral distance is less than or equal to the lateral distance threshold, and the linear distance is less than the linear distance threshold.
Specifically, if the target ID of the first target information to be detected (the first target information to be detected being recorded as P_t^i) is the same as the first target ID, it is further judged whether the linear distance d_t^i and the lateral distance d_lat,t^i between P_t^i and R_t meet the thresholds. If they do (i.e., d_lat,t^i ≤ value_m and d_t^i ≤ value_dm), the sensing point P_t^i is saved and stored into the sequence P = {P_1, P_2, P_3, …, P_n}; if not, the target ID of P_t^i is added to the blacklist, the sequence P is cleared, and matching restarts from the first frame.
Secondly, under the condition that the judgment results are different, determining the first target information to be detected which meets a second preset condition and has the minimum straight-line distance as the first target perception information, wherein the second preset condition is as follows: the transverse distance is smaller than or equal to the transverse distance threshold, the linear distance is smaller than or equal to the linear distance threshold, and the absolute value of the time difference between the timestamp of the first to-be-detected target information and the timestamp of the first true value information is smaller than preset time.
Specifically, if no sensing point with the same ID as the first target ID is matched (that is, the target ID of the first target information to be detected is different from the first target ID), the linear distance and the lateral distance between each of the sensing points P_t^1 to P_t^(M_t) and R_t are calculated, and the sensing points P_t^i that meet the thresholds (i.e., d_lat,t^i ≤ value_m and d_t^i ≤ value_dm) are retained. Then, the timestamp carried in each retained sensing point's ID (the timestamp in the ID indicates the discovery time of the target) is extracted and recorded as T_t^i, and all sensing points whose timestamps satisfy |T_t^i − t| < the preset time (where t is the timestamp of the first truth value information) are kept. Finally, among these, the point closest to the truth value, i.e., the one with the minimum linear distance, is output and stored into the sequence P = {P_1, P_2, P_3, …, P_n}.
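The two branches described above for frames after the first can be sketched as follows; the attribute `discovery_time` stands for the timestamp carried in the target ID, and the names and signatures are assumptions of this sketch.

```python
def match_next_frame(truth, candidates, first_id, blacklist,
                     lateral_thr, linear_thr, max_dt, distances_fn):
    """Return (matched point or None, status) for one truth frame after the first."""
    candidates = [c for c in candidates if c.target_id not in blacklist]
    same_id = [c for c in candidates if c.target_id == first_id]
    if same_id:
        # Branch 1: a candidate carrying the first target ID exists.
        c = same_id[0]
        linear, lateral, _ = distances_fn(c, truth)
        if lateral <= lateral_thr and linear <= linear_thr:
            return c, "matched"
        blacklist.add(c.target_id)   # exclude this ID from later matching
        return None, "restart"       # caller clears P and re-matches from frame 1
    # Branch 2: no same-ID candidate; fall back to the closest point that meets
    # the thresholds and whose discovery time is close to the truth timestamp.
    best, best_linear = None, float("inf")
    for c in candidates:
        linear, lateral, _ = distances_fn(c, truth)
        if (lateral <= lateral_thr and linear <= linear_thr
                and abs(c.discovery_time - truth.timestamp) < max_dt
                and linear < best_linear):
            best, best_linear = c, linear
    return best, ("matched" if best else "unmatched")
```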
It should be noted that, when evaluating how well the trajectory output by each perception system under test matches the truth value data, time and space synchronization is required. In the embodiment of the invention, in terms of space, the different outputs of the perception system under test are converted into real coordinates, so that spatial matching between each sensor (i.e., piece of perception equipment) and the truth value coordinates is realized. In terms of time, synchronous acquisition cannot be realized because the sensors have different acquisition frequencies; therefore, according to the timestamps carried by the truth value data, the targets output by different perception equipment are matched to data at close times, so that time matching is realized. In addition, for the visual perception test, the embodiment of the invention generates real coordinates based on the same transformation matrix, thereby reducing matrix errors among different algorithms.
Optionally, after determining the first target-to-be-measured information meeting a first preset condition as the first target perception information, the method further includes: and storing the target ID in the first target information to be detected which does not meet the first preset condition into the first identification list.
Optionally, the test result information includes at least one of: target positioning precision, target detection recall rate, accuracy rate and tracking success rate.
In the embodiment of the invention, the target positioning precision, target detection recall rate, accuracy rate, tracking success rate and the like can be used as evaluation indexes of the perception system under test. Target matching is performed by acquiring test data, calling the perception algorithm and so on, and the evaluation indexes are finally output according to the execution results; for example, a complete test report can be generated and output automatically, so that the evaluation of the perception capability of the perception system under test is presented to the user, and test scenarios in which the perception capability of the system performs poorly can be highlighted.
In this embodiment, the target detection and tracking algorithms of the perception equipment of the system under test can be tested and analyzed and quantitative indexes output, so that the perception capability of the system under test can be evaluated systematically and objectively; the algorithm capability of the system under different test scenarios can be tested at scale, the operation is simple and convenient, and the test results are accurate.
Specifically, the test result information may include various indexes for each driving scenario, such as the number of ID switches, the accuracy rate, the target detection recall rate, the missed detection rate, the false detection rate, the lateral and longitudinal deviations (i.e., the lateral distance and the longitudinal distance), and the final test case coverage rate. A brief explanation of some of these indexes follows:
Number of ID switches: the number of distinct IDs among all targets (the sequence P) perceived and recalled by the perception system under test is counted; the smaller the number of IDs, the better the tracking effect of the perception algorithm.
Recall rate: the number of perception targets output by the perception system under test (the sequence P) is counted; the recall rate is the total number of recalled perception targets divided by the total number of truth values.
Accuracy rate: if the criterion for accurate recognition is that the type and position of a recalled target are consistent with those of the truth value target, the recognition is judged to be accurate; the accuracy rate is the total number of accurate recognitions divided by the total number of perception recalls of the perception system under test.
Target positioning precision: according to the different distance intervals between the truth value target and the lane starting point (such as 0-50 m, 50-100 m, 100-150 m, >150 m), the test data are extracted, the lateral and longitudinal deviations (i.e., the lateral distance and the longitudinal distance) at the same time t are calculated, and the mean values of the lateral and longitudinal deviations are counted separately for each distance interval.
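A sketch of how these indexes could be computed from the matched sequence P is given below; the structure assumed for `matches` (one tuple per matched truth frame) is an assumption of this sketch, and the per-distance-interval breakdown of the positioning precision is omitted for brevity.

```python
def evaluate(matches, truth_total):
    """matches: list of (target_id, lateral_dev, longitudinal_dev, type_and_position_correct)."""
    recalled = len(matches)
    recall = recalled / truth_total if truth_total else 0.0
    accuracy = (sum(1 for _, _, _, ok in matches if ok) / recalled) if recalled else 0.0
    id_count = len({tid for tid, _, _, _ in matches})  # fewer distinct IDs = better tracking
    mean_lat = sum(lat for _, lat, _, _ in matches) / recalled if recalled else 0.0
    mean_lon = sum(lon for _, _, lon, _ in matches) / recalled if recalled else 0.0
    return {"recall": recall, "accuracy": accuracy, "id_count": id_count,
            "mean_lateral_deviation": mean_lat, "mean_longitudinal_deviation": mean_lon}
```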
According to the embodiment of the invention, the corresponding perception algorithm can be automatically called according to the test data to obtain the information of the target to be tested of the perception system to be tested, and then the target perception information is analyzed and obtained according to the information of the target to be tested and the truth value data, so that the perception capability (namely target detection capability, target tracking capability and the like) of the perception system to be tested can be automatically evaluated, and the test efficiency and the test precision are greatly improved.
Second embodiment
As shown in fig. 4, an embodiment of the invention provides a testing apparatus 400 for a sensing system, including:
a data obtaining module 401, configured to obtain test data, where the test data includes: sensing data and corresponding truth value data thereof, wherein the sensing data is acquired by sensing equipment of a sensing system to be tested through roadside sensing;
a first processing module 402, configured to determine, according to the sensing data and a sensing algorithm corresponding to the sensing system to be tested, target information to be tested of the sensing system to be tested;
a second processing module 403, configured to determine, according to the truth value data, target sensing information of the sensing system to be tested from the target information to be tested;
a third processing module 404, configured to determine test result information of the to-be-tested sensing system according to the target sensing information.
In the embodiment, according to the test data, the corresponding perception algorithm can be automatically called to obtain the information of the target to be tested of the perception system to be tested, and then the target perception information is analyzed and obtained according to the information of the target to be tested and the truth value data, so that the automatic test of the perception capability of the perception system is realized, the test efficiency is improved, the labor cost and the time cost required by the test are saved, the errors caused by artificial analysis are reduced, and the test result information is more objective and credible.
Optionally, the data obtaining module 401 includes:
the acquisition submodule is used for utilizing a test vehicle carrying first equipment to run according to a preset path, acquiring and obtaining the true value data through the first equipment, and performing roadside sensing through the sensing equipment of the sensing system to be tested in the process that the test vehicle runs according to the preset path to obtain the sensing data;
wherein the first device comprises a high precision positioning device.
Optionally, in a case that the sensing device is a camera, the first processing module 402 includes:
the first processing submodule is used for acquiring a conversion matrix corresponding to the camera according to the perception data;
and the second processing submodule is used for determining the information of the target to be detected of the sensing system to be detected according to the sensing data, the conversion matrix and the sensing algorithm corresponding to the sensing system to be detected.
Optionally, in a case that the sensing device includes a camera and a second device, and the second device is at least one of a lidar or a millimeter wave radar, the first processing module 402 includes:
the third processing submodule is used for acquiring a conversion matrix corresponding to the camera according to the perception data; the sensing data comprise first sensing data corresponding to the camera and second sensing data corresponding to the second equipment;
the fourth processing submodule is used for determining first target information to be detected according to the first sensing data, the conversion matrix and the sensing algorithm corresponding to the camera; determining second target information to be detected according to the second sensing data and a sensing algorithm corresponding to the second equipment;
and the fifth processing submodule is used for determining the information of the target to be detected of the sensing system to be detected according to the first information of the target to be detected and the second information of the target to be detected.
Optionally, the second processing module 403 includes:
the sixth processing submodule is used for acquiring first truth value information in the truth value data frame by frame according to the output frequency of first equipment for acquiring the truth value data;
a seventh processing sub-module, configured to determine, according to the to-be-detected target information and the first true value information, first target sensing information corresponding to each first true value information;
and the eighth processing submodule is used for determining the first target perception information as the target perception information of the perception system to be detected.
Optionally, the target information to be measured includes: a timestamp, coordinate information, and a target ID; the first truth information includes: timestamp and coordinate information;
wherein the seventh processing sub-module comprises:
a first processing unit, configured to obtain first target information to be measured from the target information to be measured according to a timestamp of the first true value information, where the timestamp of the first target information to be measured is the same as the timestamp of the first true value information;
the second processing unit is used for judging whether the ID information which is the same as the target ID of the first target information to be tested exists in a first identification list, and the first identification list is generated in the testing process;
a third processing unit, configured to, when the judgment result is that no such ID information exists, determine, according to the first truth value information and the first target information to be detected, distance relationship information between the coordinate information of the first truth value information and the coordinate information of the first target information to be detected, where the distance relationship information includes: lateral and linear distances;
a fourth processing unit, configured to determine, according to the distance relationship information and preconfigured threshold information, first target sensing information corresponding to the first true value information;
wherein the threshold information is set according to the type of the sensing device, and the threshold information includes: and presetting the distance interval and a transverse distance threshold and a linear distance threshold corresponding to the distance interval.
Optionally, the fourth processing unit includes:
the first processing subunit is configured to determine, according to the distance relationship information and preconfigured threshold information, the first target information to be measured that meets a first preset condition and has a minimum linear distance as the first target sensing information;
wherein the first preset condition is as follows: the lateral distance is less than or equal to the lateral distance threshold, and the linear distance is less than or equal to the linear distance threshold.
Optionally, the fourth processing unit includes:
a second processing subunit, configured to determine whether a target ID of the first target information to be detected is the same as a first target ID of the first target perception information, where the first target ID is determined when the first true value information is a first frame in the true value data;
and the third processing subunit is configured to determine, according to the determination result, the distance relationship information, and preconfigured threshold information, first target sensing information corresponding to the first true value information.
Optionally, the third processing subunit includes:
a fourth processing subunit, configured to, in a case that the determination results are the same, determine the first to-be-measured target information that meets a first preset condition as the first target sensing information, where the first preset condition is: the lateral distance is less than or equal to the lateral distance threshold, and the linear distance is less than the linear distance threshold;
a fifth processing subunit, configured to, when the determination results are different, determine the first to-be-detected target information that meets a second preset condition and has a minimum linear distance as the first target sensing information, where the second preset condition is: the lateral distance is less than or equal to the lateral distance threshold, the linear distance is less than or equal to the linear distance threshold, and the absolute value of the time difference between the timestamp of the first to-be-detected target information and the timestamp of the first true value information is less than a preset time.
Optionally, the third processing subunit further includes:
and the sixth processing subunit is configured to store the target ID in the first target information to be detected, which does not satisfy the first preset condition, in the first identifier list.
Optionally, the test result information includes at least one of: target positioning precision, target detection recall rate, accuracy rate and tracking success rate.
The second embodiment of the present invention corresponds to the method of the first embodiment, and all of the implementations described in the first embodiment apply to this embodiment of the apparatus for testing a sensing system, so that the same technical effect is achieved.
Third embodiment
In order to better achieve the above object, as shown in fig. 5, a third embodiment of the present invention further provides a test apparatus, including:
a processor 500; and a memory 520 connected to the processor 500 through a bus interface, wherein the memory 520 is used for storing programs and data used by the processor 500 in executing operations, and the processor 500 calls and executes the programs and data stored in the memory 520.
The transceiver 510 is connected with the bus interface and is configured to receive and transmit data under the control of the processor 500; the processor 500 is configured to read the program in the memory 520 and execute the following steps:
obtaining test data, the test data comprising: sensing data and corresponding truth value data thereof, wherein the sensing data is acquired by sensing equipment of a sensing system to be tested through roadside sensing;
determining target information to be detected of the sensing system to be detected according to the sensing data and a sensing algorithm corresponding to the sensing system to be detected;
determining target perception information of the perception system to be detected from the target information to be detected according to the truth value data;
and determining the test result information of the perception system to be tested according to the target perception information.
In this embodiment, the corresponding perception algorithm can be automatically invoked according to the test data to obtain the target information to be tested of the perception system to be tested, and the target perception information is then obtained by analyzing the target information to be tested against the truth value data. The perception capability of the perception system is thereby tested automatically, which improves test efficiency, saves the labor cost and time cost required by the test, reduces errors caused by manual analysis, and makes the test result information more objective and credible.
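As an illustration of the four steps just listed, a minimal Python sketch of the overall test flow is given below. The perception algorithm, the matching step and the metric computation are passed in as callables, since the text above does not fix their concrete form; the function and parameter names are assumptions.

    from typing import Any, Callable, Dict, List

    def run_perception_test(
        sensing_data: List[Dict[str, Any]],
        truth_data: List[Dict[str, Any]],
        perception_algorithm: Callable[[List[Dict[str, Any]]], List[Dict[str, Any]]],
        match_to_truth: Callable[[List[Dict[str, Any]], List[Dict[str, Any]]], List[Dict[str, Any]]],
        compute_metrics: Callable[[List[Dict[str, Any]], List[Dict[str, Any]]], Dict[str, float]],
    ) -> Dict[str, float]:
        # Step 1: the test data (sensing data and its truth value data) is assumed to be collected already.
        # Step 2: call the perception algorithm of the system under test to obtain the
        # target information to be tested.
        targets_under_test = perception_algorithm(sensing_data)
        # Step 3: keep only the target perception information that corresponds to the truth data.
        target_perception_info = match_to_truth(targets_under_test, truth_data)
        # Step 4: derive the test result information from the matched targets.
        return compute_metrics(target_perception_info, truth_data)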
In fig. 5, the bus architecture may include any number of interconnected buses and bridges, with one or more processors, represented by the processor 500, and various circuits, represented by the memory 520, linked together. The bus architecture may also link together various other circuits, such as peripherals, voltage regulators and power management circuits, which are well known in the art and therefore are not described further herein. The bus interface provides an interface. The transceiver 510 may comprise a plurality of elements, namely a transmitter and a receiver, providing a means for communicating with various other apparatus over a transmission medium. For different terminals, the user interface 530 may also be an interface capable of connecting the required devices, including but not limited to a keypad, a display, a speaker, a microphone, a joystick, and the like. The processor 500 is responsible for managing the bus architecture and general processing, and the memory 520 may store data used by the processor 500 in performing operations.
Optionally, when obtaining the test data, the processor 500 is specifically configured to:
driving a test vehicle carrying first equipment according to a preset path, acquiring the truth value data through the first equipment, and,
in the process that the test vehicle runs along the preset path, performing roadside sensing through the sensing equipment of the sensing system to be tested to obtain the sensing data;
wherein the first device comprises a high precision positioning device.
Optionally, in a case that the sensing device is a camera, when determining the target information to be detected of the sensing system to be tested according to the sensing data and the sensing algorithm corresponding to the sensing system to be tested, the processor 500 is specifically configured to:
acquiring a conversion matrix corresponding to the camera according to the perception data;
and determining the information of the target to be detected of the perception system to be detected according to the perception data, the conversion matrix and the perception algorithm corresponding to the perception system to be detected.
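In the common roadside-camera setup, the conversion matrix mentioned here maps image pixels onto the road plane; the text does not spell out its form, so the sketch below simply assumes a 3 x 3 matrix (a homography) and hypothetical pixel detections.

    import numpy as np

    def pixels_to_road_plane(pixel_points: np.ndarray, conversion_matrix: np.ndarray) -> np.ndarray:
        """Map N x 2 pixel detections to road-plane coordinates with a 3 x 3 conversion matrix."""
        ones = np.ones((pixel_points.shape[0], 1))
        homogeneous = np.hstack([pixel_points, ones])      # N x 3 homogeneous pixel coordinates
        mapped = homogeneous @ conversion_matrix.T         # apply the conversion matrix
        return mapped[:, :2] / mapped[:, 2:3]              # normalise the homogeneous scale

    # Example: with the identity matrix the pixel coordinates pass through unchanged.
    # pixels_to_road_plane(np.array([[640.0, 360.0]]), np.eye(3))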
Optionally, in a case that the sensing device includes a camera and a second device, and the second device is at least one of a laser radar and a millimeter wave radar, when determining, according to the sensing data and a sensing algorithm corresponding to the sensing system to be detected, target information of the sensing system to be detected, the processor 500 is specifically configured to:
acquiring a conversion matrix corresponding to the camera according to the perception data; the sensing data comprises first sensing data corresponding to the camera and second sensing data corresponding to the second equipment;
determining first target information to be detected according to the first sensing data, the conversion matrix and a sensing algorithm corresponding to the camera; determining second target information to be detected according to the second sensing data and a sensing algorithm corresponding to the second equipment;
and determining the information of the target to be detected of the sensing system to be detected according to the information of the first target to be detected and the information of the second target to be detected.
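The text does not prescribe how the first and second target information are combined. The sketch below shows the simplest reading, grouping the per-device target lists by timestamp so that the downstream matching step can treat them uniformly; field names such as "timestamp" are assumptions about the data layout.

    from collections import defaultdict
    from typing import Any, Dict, List

    def merge_target_information(
        first_targets: List[Dict[str, Any]],
        second_targets: List[Dict[str, Any]],
    ) -> Dict[float, List[Dict[str, Any]]]:
        """Combine camera targets and second-device targets into one per-timestamp table."""
        merged: Dict[float, List[Dict[str, Any]]] = defaultdict(list)
        for target in list(first_targets) + list(second_targets):
            merged[target["timestamp"]].append(target)
        return dict(merged)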
Optionally, when determining the target sensing information of the sensing system to be tested from the target information to be tested according to the truth data, the processor 500 is specifically configured to:
acquiring first truth value information in the truth value data frame by frame according to the output frequency of first equipment for acquiring the truth value data;
determining first target perception information corresponding to each first truth value information according to the target information to be detected and the first truth value information;
and determining the first target perception information as the target perception information of the perception system to be detected.
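One way to step through the truth value data frame by frame at the output frequency of the first device is sketched below. It assumes the truth data is a time-sorted list of records with a "timestamp" field, which is an assumption about the data layout rather than something stated above.

    from typing import Any, Dict, Iterator, List

    def truth_frames(truth_data: List[Dict[str, Any]], output_frequency_hz: float) -> Iterator[Dict[str, Any]]:
        """Yield one first-truth-information record per output frame of the first device."""
        frame_period = 1.0 / output_frequency_hz
        next_frame_time = truth_data[0]["timestamp"]
        for record in truth_data:                  # records assumed sorted by timestamp
            if record["timestamp"] + 1e-9 >= next_frame_time:
                yield record
                next_frame_time = record["timestamp"] + frame_period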
Optionally, the target information to be measured includes: a timestamp, coordinate information, and a target ID; the first truth information includes: timestamp and coordinate information;
when determining first target perception information corresponding to each piece of first true value information according to the target information to be detected and the first true value information, the processor 500 is specifically configured to:
acquiring first target information to be detected from the target information to be detected according to the timestamp of the first true value information, wherein the timestamp of the first target information to be detected is the same as the timestamp of the first true value information;
judging whether ID information which is the same as the target ID of the first target information to be tested exists in a first identification list, wherein the first identification list is generated in the testing process;
determining, when the judgment result is that no such ID information exists in the first identification list, distance relationship information between the coordinate information of the first truth value information and the coordinate information of the first target information to be detected according to the first truth value information and the first target information to be detected, wherein the distance relationship information comprises: a lateral distance and a linear distance;
determining first target perception information corresponding to the first true value information according to the distance relation information and preconfigured threshold information;
wherein the threshold information is set according to the type of the sensing device, and the threshold information includes: a preset distance interval, and a lateral distance threshold and a linear distance threshold corresponding to the distance interval.
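A minimal sketch of the steps just listed is given below. It assumes dictionary records with "timestamp", "target_id", "x" and "y" fields and takes the y axis as the lateral direction; both are assumptions, since the text only names the quantities. The thresholds themselves would come from a configuration such as the one sketched earlier.

    import math
    from typing import Any, Dict, List, Set, Tuple

    def candidates_for_truth(
        truth: Dict[str, Any],
        targets_under_test: List[Dict[str, Any]],
        identification_list: Set[Any],
    ) -> List[Dict[str, Any]]:
        """First target information to be detected: same timestamp as the truth record,
        and a target ID that is not already in the first identification list."""
        return [t for t in targets_under_test
                if t["timestamp"] == truth["timestamp"]
                and t["target_id"] not in identification_list]

    def distance_relationship(truth: Dict[str, Any], candidate: Dict[str, Any]) -> Tuple[float, float]:
        """Return (lateral distance, linear distance) between truth and candidate coordinates."""
        lateral = abs(candidate["y"] - truth["y"])
        linear = math.hypot(candidate["x"] - truth["x"], candidate["y"] - truth["y"])
        return lateral, linear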
Optionally, in a case that the first true value information is a first frame in the true value data, when determining, according to the distance relationship information and preconfigured threshold information, first target sensing information corresponding to the first true value information, the processor 500 is specifically configured to:
determining the first target information to be detected which meets a first preset condition and has the minimum straight line distance as the first target perception information according to the distance relation information and preconfigured threshold information;
wherein the first preset condition is as follows: the lateral distance is greater than or equal to the lateral distance threshold, and the linear distance is greater than the linear distance threshold.
Optionally, in a case that the first true value information is not the first frame in the true value data, when determining the first target sensing information corresponding to the first true value information according to the distance relationship information and the preconfigured threshold information, the processor 500 is specifically configured to:
determining whether a target ID of the first target information to be detected is the same as a first target ID of the first target perception information determined when the first true value information is a first frame in the true value data;
and determining first target perception information corresponding to the first true value information according to the judgment result, the distance relation information and preconfigured threshold information.
Optionally, when determining the first target perception information corresponding to the first true value information according to the determination result, the distance relationship information, and preconfigured threshold information, the processor 500 is specifically configured to:
determining the first target information to be detected meeting a first preset condition as the first target perception information under the condition that the judgment results are the same, wherein the first preset condition is as follows: the lateral distance is less than or equal to the lateral distance threshold, and the linear distance is less than the linear distance threshold;
and under the condition that the judgment results are different, determining the first target information to be detected which meets a second preset condition and has the minimum linear distance as the first target perception information, wherein the second preset condition is as follows: the lateral distance is smaller than or equal to the lateral distance threshold, the linear distance is smaller than or equal to the linear distance threshold, and the absolute value of the time difference between the timestamp of the first target information to be detected and the timestamp of the first true value information is smaller than a preset time.
Optionally, the processor 500 is further configured to:
and storing the target ID in the first target information to be detected which does not meet the first preset condition into the first identification list.
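For the case where the first true value information is not the first frame, the decision just described can be sketched as follows. It reuses the same hypothetical record fields, and the first identification list is modelled as a set collecting the IDs that fail the first preset condition; this is a sketch of one reading, not the prescribed procedure.

    import math
    from typing import Any, Dict, List, Optional, Set, Tuple

    def select_target_perception_info(
        truth: Dict[str, Any],
        candidates: List[Dict[str, Any]],
        first_target_id: Any,
        lateral_threshold: float,
        linear_threshold: float,
        max_time_difference: float,
        identification_list: Set[Any],
    ) -> Optional[Dict[str, Any]]:
        def distances(candidate: Dict[str, Any]) -> Tuple[float, float]:
            lateral = abs(candidate["y"] - truth["y"])
            linear = math.hypot(candidate["x"] - truth["x"], candidate["y"] - truth["y"])
            return lateral, linear

        same_id = [c for c in candidates if c["target_id"] == first_target_id]
        if same_id:
            # Same ID as on the first frame: accept it if the first preset condition holds
            # (lateral distance <= threshold and linear distance < threshold).
            lateral, linear = distances(same_id[0])
            if lateral <= lateral_threshold and linear < linear_threshold:
                return same_id[0]
            identification_list.add(same_id[0]["target_id"])   # failed the first preset condition
            return None
        # Different ID: among candidates meeting the second preset condition,
        # keep the one with the smallest linear distance.
        best, best_linear = None, float("inf")
        for candidate in candidates:
            lateral, linear = distances(candidate)
            time_difference = abs(candidate["timestamp"] - truth["timestamp"])
            if (lateral <= lateral_threshold and linear <= linear_threshold
                    and time_difference < max_time_difference and linear < best_linear):
                best, best_linear = candidate, linear
        return best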
Optionally, the test result information includes at least one of: target positioning precision, target detection recall rate, accuracy rate and tracking success rate.
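The text lists these result figures without defining them, so the sketch below uses common readings: recall as matched truth frames over all truth frames, accuracy rate as matched detections over all detections, positioning precision as the mean position error of the matches, and tracking success rate as the share of matches that keep the ID assigned on the first matched frame. The field names and these definitions are assumptions.

    import math
    from typing import Any, Dict, List, Tuple

    def test_result_information(
        matches: List[Tuple[Dict[str, Any], Dict[str, Any]]],  # (truth record, perceived record) pairs
        total_truth_frames: int,
        total_detections: int,
    ) -> Dict[str, float]:
        true_positives = len(matches)
        recall = true_positives / total_truth_frames if total_truth_frames else 0.0
        accuracy_rate = true_positives / total_detections if total_detections else 0.0
        errors = [math.hypot(p["x"] - t["x"], p["y"] - t["y"]) for t, p in matches]
        positioning_error = sum(errors) / len(errors) if errors else float("nan")
        first_id = matches[0][1]["target_id"] if matches else None
        tracked = sum(1 for _, p in matches if p["target_id"] == first_id)
        tracking_success_rate = tracked / true_positives if true_positives else 0.0
        return {
            "target_positioning_error_m": positioning_error,
            "target_detection_recall_rate": recall,
            "accuracy_rate": accuracy_rate,
            "tracking_success_rate": tracking_success_rate,
        }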
The test equipment provided by the invention can automatically call the corresponding perception algorithm according to the test data to obtain the information of the target to be tested of the perception system to be tested, and further analyze and obtain the perception information of the target according to the information of the target to be tested and the truth data, so that the perception capability (namely target detection, target tracking capability and the like) of the perception system to be tested can be automatically evaluated, and the test efficiency and the test precision are greatly improved.
Those skilled in the art will appreciate that all or part of the steps for implementing the above embodiments may be performed by hardware, or by associated hardware instructed by a computer program, where the computer program includes instructions for performing some or all of the steps of the above methods and may be stored in a readable storage medium, which may be any form of storage medium.
In addition, a computer-readable storage medium is provided according to an embodiment of the present invention, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the method described in the first embodiment and achieves the same technical effect; to avoid repetition, the description is not repeated here.
Furthermore, it should be noted that in the apparatus and method of the present invention, it is obvious that each component or each step may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present invention. Also, the steps of performing the series of processes described above may naturally be performed chronologically in the order described, but need not necessarily be performed chronologically, and some steps may be performed in parallel or independently of each other. It will be understood by those skilled in the art that all or any of the steps or elements of the method and apparatus of the present invention may be implemented in any computing device (including processors, storage media, etc.) or network of computing devices, in hardware, firmware, software, or any combination thereof, which can be implemented by those skilled in the art using their basic programming skills after reading the description of the present invention.
The object of the invention can thus also be achieved by a program or a set of programs running on any computing device, which may be a well-known general-purpose device. The object of the invention can likewise be achieved merely by providing a program product containing program code for implementing the method or the apparatus; that is, such a program product also constitutes the present invention, and a storage medium storing such a program product also constitutes the present invention. It is to be understood that the storage medium may be any known storage medium or any storage medium developed in the future.
While the foregoing is directed to the preferred embodiment of the present invention, it will be appreciated by those skilled in the art that various changes and modifications may be made therein without departing from the principles of the invention as set forth in the appended claims.

Claims (14)

1. A method for testing a perception system, comprising:
obtaining test data, the test data comprising: sensing data and corresponding truth value data thereof, wherein the sensing data is acquired by sensing equipment of a sensing system to be tested through roadside sensing;
determining target information to be detected of the perception system to be detected according to the perception data and a perception algorithm corresponding to the perception system to be detected;
determining target perception information of the perception system to be detected from the target information to be detected according to the truth value data;
and determining the test result information of the perception system to be tested according to the target perception information.
2. The method of claim 1, wherein said obtaining test data comprises:
driving a test vehicle carrying first equipment according to a preset path, acquiring the truth value data through the first equipment, and,
in the process that the test vehicle runs along the preset path, roadside sensing is carried out through the sensing equipment of the sensing system to be tested, and the sensing data are obtained;
wherein the first device comprises a high precision positioning device.
3. The method according to claim 1, wherein in a case that the sensing device is a camera, the determining target information to be detected of the sensing system to be detected according to the sensing data and a sensing algorithm corresponding to the sensing system to be detected includes:
acquiring a conversion matrix corresponding to the camera according to the perception data;
and determining the information of the target to be detected of the sensing system to be detected according to the sensing data, the conversion matrix and the sensing algorithm corresponding to the sensing system to be detected.
4. The method according to claim 1, wherein in a case that the sensing device includes a camera and a second device, and the second device is at least one of a laser radar or a millimeter wave radar, the determining target information to be detected of the sensing system to be detected according to the sensing data and a sensing algorithm corresponding to the sensing system to be detected includes:
acquiring a conversion matrix corresponding to the camera according to the perception data; the sensing data comprise first sensing data corresponding to the camera and second sensing data corresponding to the second equipment;
determining first target information to be detected according to the first sensing data, the conversion matrix and a sensing algorithm corresponding to the camera; determining second target information to be detected according to the second sensing data and a sensing algorithm corresponding to the second equipment;
and determining the information of the target to be detected of the sensing system to be detected according to the information of the first target to be detected and the information of the second target to be detected.
5. The method according to claim 1, wherein the determining target sensing information of the sensing system under test from the target information under test according to the truth data comprises:
acquiring first truth value information in the truth value data frame by frame according to the output frequency of first equipment for acquiring the truth value data;
determining first target perception information corresponding to each first truth value information according to the target information to be detected and the first truth value information;
and determining the first target perception information as the target perception information of the perception system to be detected.
6. The method of claim 5, wherein the target information to be measured comprises: a timestamp, coordinate information, and a target ID; the first truth information includes: timestamp and coordinate information;
determining first target perception information corresponding to each first true value information according to the target information to be detected and the first true value information, including:
acquiring first target information to be detected from the target information to be detected according to the timestamp of the first true value information, wherein the timestamp of the first target information to be detected is the same as the timestamp of the first true value information;
judging whether ID information identical to the target ID of the first target information to be tested exists in a first identification list, wherein the first identification list is generated in the testing process;
determining distance relationship information between the coordinate information of the first truth value information and the coordinate information of the first target information to be detected according to the first truth value information and the first target information to be detected when the judgment result is that no such ID information exists in the first identification list, wherein the distance relationship information comprises: a lateral distance and a linear distance;
determining first target perception information corresponding to the first true value information according to the distance relation information and preconfigured threshold information;
wherein the threshold information is set according to the type of the sensing device, and the threshold information includes: a preset distance interval, and a lateral distance threshold and a linear distance threshold corresponding to the distance interval.
7. The method according to claim 6, wherein in a case that the first true value information is a first frame in the true value data, the determining first target perception information corresponding to the first true value information according to the distance relationship information and preconfigured threshold information comprises:
determining the first target information to be detected which meets a first preset condition and has the minimum straight line distance as the first target perception information according to the distance relation information and preconfigured threshold information;
wherein the first preset condition is as follows: the lateral distance is greater than or equal to the lateral distance threshold, and the linear distance is greater than the linear distance threshold.
8. The method according to claim 6, wherein in a case that the first true value information is not a first frame in the true value data, the determining first target perception information corresponding to the first true value information according to the distance relationship information and preconfigured threshold information comprises:
determining whether a target ID of the first target information to be detected is the same as a first target ID of the first target perception information determined when the first true value information is a first frame in the true value data;
and determining first target perception information corresponding to the first true value information according to the judgment result, the distance relation information and preconfigured threshold information.
9. The method according to claim 8, wherein the determining the first target perception information corresponding to the first true value information according to the determination result, the distance relation information and preconfigured threshold information includes at least one of:
determining the first target information to be detected meeting a first preset condition as the first target perception information under the condition that the judgment results are the same, wherein the first preset condition is as follows: the lateral distance is less than or equal to the lateral distance threshold, and the linear distance is less than the linear distance threshold;
determining the first target information to be detected which meets a second preset condition and has the minimum linear distance as the first target perception information under the condition that the judgment results are different, wherein the second preset condition is as follows: the lateral distance is smaller than or equal to the lateral distance threshold, the linear distance is smaller than or equal to the linear distance threshold, and the absolute value of the time difference between the timestamp of the first target information to be detected and the timestamp of the first true value information is smaller than a preset time.
10. The method according to claim 9, wherein after the determining of the first target information to be detected that satisfies a first preset condition as the first target perception information, the method further comprises:
and storing the target ID in the first target information to be detected which does not meet the first preset condition into the first identification list.
11. The method of claim 1, wherein the test result information comprises at least one of: target positioning precision, target detection recall rate, accuracy rate and tracking success rate.
12. A test apparatus, comprising: a transceiver, a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the method for testing a perception system according to any one of claims 1 to 11 when executing the computer program.
13. A device for testing a sensing system, comprising:
a data acquisition module for acquiring test data, the test data comprising: sensing data and corresponding truth value data thereof, wherein the sensing data is acquired by sensing equipment of a sensing system to be tested through roadside sensing;
the first processing module is used for determining the information of the target to be detected of the sensing system to be detected according to the sensing data and the sensing algorithm corresponding to the sensing system to be detected;
the second processing module is used for determining target perception information of the perception system to be detected from the target information to be detected according to the truth value data;
and the third processing module is used for determining the test result information of the perception system to be tested according to the target perception information.
14. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method of testing a perception system according to any one of claims 1 to 11.
CN202210354970.6A 2022-04-06 2022-04-06 Testing method and device for sensing system and testing equipment Active CN114792469B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210354970.6A CN114792469B (en) 2022-04-06 2022-04-06 Testing method and device for sensing system and testing equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210354970.6A CN114792469B (en) 2022-04-06 2022-04-06 Testing method and device for sensing system and testing equipment

Publications (2)

Publication Number Publication Date
CN114792469A true CN114792469A (en) 2022-07-26
CN114792469B CN114792469B (en) 2023-06-02

Family

ID=82462364

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210354970.6A Active CN114792469B (en) 2022-04-06 2022-04-06 Testing method and device for sensing system and testing equipment

Country Status (1)

Country Link
CN (1) CN114792469B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115825901A (en) * 2023-02-21 2023-03-21 南京楚航科技有限公司 Vehicle-mounted sensor perception performance evaluation truth value system
CN116030551A (en) * 2023-03-29 2023-04-28 小米汽车科技有限公司 Method, device, equipment and storage medium for testing vehicle autopilot software

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109407547A (en) * 2018-09-28 2019-03-01 合肥学院 Multi-cam assemblage on-orbit test method and system towards panoramic vision perception
CN111596644A (en) * 2020-05-12 2020-08-28 重庆车辆检测研究院有限公司 Vehicle-mounted evaluation system and method based on comprehensive tester for vehicle-road cooperative application
CN112540352A (en) * 2019-09-20 2021-03-23 初速度(苏州)科技有限公司 Method and device for evaluating target detection algorithm based on unmanned vehicle
CN112816954A (en) * 2021-02-09 2021-05-18 中国信息通信研究院 Road side perception system evaluation method and system based on truth value
CN113033029A (en) * 2021-05-24 2021-06-25 湖北亿咖通科技有限公司 Automatic driving simulation method and device, electronic equipment and storage medium
CN113093178A (en) * 2021-04-21 2021-07-09 中国第一汽车股份有限公司 Obstacle target detection method and device, domain controller and vehicle
CN113155173A (en) * 2021-06-02 2021-07-23 福瑞泰克智能系统有限公司 Perception performance evaluation method and device, electronic device and storage medium
US20210276571A1 (en) * 2020-03-03 2021-09-09 Horiba Instruments Incorporated Apparatus and method for testing automated vehicles
WO2021185537A1 (en) * 2020-03-20 2021-09-23 Horiba Europe Gmbh Test system for testing a lidar device
CN113505687A (en) * 2021-07-08 2021-10-15 北京星云互联科技有限公司 Equipment test method, device, electronic equipment, system and storage medium
CN113920729A (en) * 2021-10-11 2022-01-11 华录易云科技有限公司 Method for evaluating perception capability of traffic participants based on roadside perception system
CN114173307A (en) * 2021-12-17 2022-03-11 浙江海康智联科技有限公司 Roadside perception fusion system based on vehicle-road cooperation and optimization method

Also Published As

Publication number Publication date
CN114792469B (en) 2023-06-02

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 400040 No. 35, Jinghe Road, Huxi street, high tech Zone, Shapingba District, Chongqing
Applicant after: CITIC Technology Zhilian Technology Co.,Ltd.
Address before: 400040 No. 35, Jinghe Road, Huxi street, high tech Zone, Shapingba District, Chongqing
Applicant before: Datang Gaohong Zhilian Technology (Chongqing) Co.,Ltd.
GR01 Patent grant