CN116030551B - Method, device, equipment and storage medium for testing vehicle autopilot software - Google Patents


Info

Publication number: CN116030551B
Authority: CN (China)
Prior art keywords: speed, real-time, target, curve
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number: CN202310317474.8A
Other languages: Chinese (zh)
Other versions: CN116030551A (en)
Inventor: 张琼
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiaomi Automobile Technology Co Ltd
Original Assignee
Xiaomi Automobile Technology Co Ltd
Application filed by Xiaomi Automobile Technology Co Ltd
Priority to CN202310317474.8A
Publication of CN116030551A
Application granted
Publication of CN116030551B
Legal status: Active

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T — CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 — Road transport of goods or passengers
    • Y02T 10/10 — Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 — Engine management systems

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The disclosure relates to a method, a device, equipment and a storage medium for testing vehicle automatic driving software, wherein the method comprises the following steps: acquiring real-time environment sensing data collected by a vehicle environment sensing sensor; acquiring a speed truth value curve of a traffic target according to the real-time environment sensing data; acquiring a speed detection value curve of the traffic target according to the real-time environment sensing data and the automatic driving software; analyzing the direction and/or the error of the speed truth value curve against the speed detection value curve to obtain an analysis result; and determining whether the automatic driving software passes the test according to the analysis result. This technical scheme can perform automatic comparison on large-scale real-time environment sensing data, realizing detection of the perception capability of the automatic driving software and ensuring the safety and effectiveness of the automatic driving software.

Description

Method, device, equipment and storage medium for testing vehicle autopilot software
Technical Field
The disclosure relates to the technical field of automatic driving, and in particular relates to a method, a device, equipment and a storage medium for testing vehicle automatic driving software.
Background
The perception module of autopilot software provides the primary input to the other, downstream modules, and its output directly affects the final decisions of the autopilot software. Testing the perception capability is therefore particularly important for implementing autopilot software functions correctly. The test method in the related art mainly obtains test data through manual drive tests and compares the test data with actual data to evaluate the perception capability, which yields low test efficiency.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a method, an apparatus, an electronic device, and a storage medium for testing vehicle autopilot software.
According to a first aspect of an embodiment of the present disclosure, there is provided a method for testing vehicle autopilot software, including: acquiring real-time environment sensing data acquired by a vehicle environment sensing sensor; acquiring a speed truth value curve of a traffic target according to the real-time environment perception data; acquiring a speed detection value curve of the traffic target according to the real-time environment sensing data and the automatic driving software; analyzing the direction and/or the error of the speed truth curve and the speed detection value curve to obtain an analysis result; and determining whether the automatic driving software passes the test according to the analysis result.
In one implementation, the context aware sensor comprises at least a lidar, and the real-time context aware data comprises at least real-time laser point cloud data; the obtaining the speed truth curve of the traffic target according to the real-time environment sensing data comprises the following steps: acquiring a target identification result according to the real-time laser point cloud data and the trained first point cloud deep learning model; the target identification result comprises the real-time size and the real-time position of the target; the trained first point cloud deep learning model is a model obtained according to labeling information of a sample target, wherein the labeling information at least comprises the size and the position of the sample target; acquiring real-time position information of the traffic target according to the real-time size and the real-time position of the target in the target identification result; and determining time information corresponding to the real-time laser point cloud data, and acquiring a speed truth curve of the traffic target according to the real-time position information and the time information.
In one implementation, the context-aware sensor includes at least one of the following sensors: laser radar, camera and millimeter wave radar; the real-time context aware data includes at least one of the following: real-time laser point cloud data, real-time image data, and real-time millimeter wave radar data; the autopilot software includes at least one of the following perception models: the system comprises a second point cloud deep learning model, an image recognition model and a millimeter wave radar perception model.
In one implementation, the analyzing the direction and/or the error of the speed truth curve and the speed detection value curve to obtain an analysis result includes: carrying out point-by-point analysis on the direction and/or error on the speed truth value curve and the speed detection value curve; and obtaining a speed direction analysis result and/or a speed error analysis result of the same points of the speed truth curve and the speed detection value curve.
In an alternative implementation, the determining whether the autopilot software passes the test according to the analysis result includes: acquiring a target point with a speed reversal and/or a speed error meeting a preset condition in the speed truth curve and the speed detection value curve according to a speed direction analysis result and/or a speed error analysis result of the same point of the speed truth curve and the speed detection value curve; determining the total times of the vehicle driving problems caused by the automatic driving software according to the target points with the speed reversal and/or the speed error meeting the preset conditions; and determining whether the automatic driving software passes the test according to the total times of the vehicle driving problems.
Optionally, the determining the total number of times that the automatic driving software causes the vehicle driving problem according to the target point that the speed reversal exists and/or the speed error meets the preset condition includes: carrying out sliding window processing on the speed truth value curve and the speed detection value curve, and obtaining the number of sliding windows of target points with the speed reversal and/or the speed error meeting preset conditions; and determining the number of sliding windows of the target point with the speed reversal and/or the speed error meeting the preset condition as the total number of times of vehicle driving problems caused by the automatic driving software.
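The sliding-window counting described above can be sketched as follows. This is an illustration only; the window and stride sizes, the boolean-flag input, and the function name are assumptions, since the patent does not fix them:

```python
from typing import List

def count_problem_windows(problem_flags: List[bool],
                          window: int = 10, stride: int = 10) -> int:
    """Count sliding windows that contain at least one problem point.

    problem_flags[i] is True when point i of the compared curves shows a
    speed reversal or a speed error meeting the preset condition; each
    window containing such a point counts as one vehicle driving problem.
    """
    count = 0
    for start in range(0, len(problem_flags) - window + 1, stride):
        if any(problem_flags[start:start + window]):
            count += 1
    return count

flags = [False] * 25
flags[3], flags[14] = True, True
print(count_problem_windows(flags))  # 2: one problem point in each of two windows
```

Grouping nearby problem points into windows keeps a single transient glitch from being counted as many separate driving problems.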
In an alternative implementation, the method further comprises: and determining corresponding scene information according to the real-time environment sensing data and/or determining the category of the traffic target according to the real-time environment sensing data in response to the automatic driving software failing the test.
According to a second aspect of the embodiments of the present disclosure, there is provided a test device for vehicle autopilot software, comprising: the acquisition module is used for acquiring real-time environment sensing data acquired by the vehicle environment sensing sensor; the first processing module is used for acquiring a speed truth value curve of a traffic target according to the real-time environment perception data; the second processing module is used for acquiring a speed detection value curve of the traffic target according to the real-time environment sensing data and the automatic driving software; the third processing module is used for analyzing and processing the direction and/or the error of the speed true value curve and the speed detection value curve to obtain an analysis result; and the determining module is used for determining whether the automatic driving software passes the test according to the analysis result.
In one implementation, the context aware sensor comprises at least a lidar, and the real-time context aware data comprises at least real-time laser point cloud data; the first processing module is specifically configured to: acquiring a target identification result according to the real-time laser point cloud data and the trained first point cloud deep learning model; the target identification result comprises the real-time size and the real-time position of the target; the trained first point cloud deep learning model is a model obtained according to labeling information of a sample target, wherein the labeling information at least comprises the size and the position of the sample target; acquiring real-time position information of the traffic target according to the real-time size and the real-time position of the target in the target identification result; and determining time information corresponding to the real-time laser point cloud data, and acquiring a speed truth curve of the traffic target according to the real-time position information and the time information.
In one implementation, the context-aware sensor includes at least one of the following sensors: laser radar, camera and millimeter wave radar; the real-time context aware data includes at least one of the following: real-time laser point cloud data, real-time image data, and real-time millimeter wave radar data; the autopilot software includes at least one of the following perception models: the system comprises a second point cloud deep learning model, an image recognition model and a millimeter wave radar perception model.
In one implementation, the third processing module is specifically configured to: carrying out point-by-point analysis on the direction and/or error on the speed truth value curve and the speed detection value curve; and obtaining a speed direction analysis result and/or a speed error analysis result of the same points of the speed truth curve and the speed detection value curve.
In an alternative implementation, the determining module is specifically configured to: acquiring a target point with a speed reversal and/or a speed error meeting a preset condition in the speed truth curve and the speed detection value curve according to a speed direction analysis result and/or a speed error analysis result of the same point of the speed truth curve and the speed detection value curve; determining the total times of the vehicle driving problems caused by the automatic driving software according to the target points with the speed reversal and/or the speed error meeting the preset conditions; and determining whether the automatic driving software passes the test according to the total times of the vehicle driving problems.
Optionally, the determining module is specifically configured to: carrying out sliding window processing on the speed truth value curve and the speed detection value curve, and obtaining the number of sliding windows of target points with the speed reversal and/or the speed error meeting preset conditions; and determining the number of sliding windows of the target point with the speed reversal and/or the speed error meeting the preset condition as the total number of times of vehicle driving problems caused by the automatic driving software.
In one implementation, the apparatus further comprises: and the fourth processing module is used for responding to the failure of the test of the automatic driving software, determining corresponding scene information according to the real-time environment sensing data and/or determining the category of the traffic target according to the real-time environment sensing data.
According to a third aspect of embodiments of the present disclosure, there is provided an electronic device, comprising: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of the preceding first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer readable storage medium storing instructions that, when executed, cause the method according to the first aspect to be implemented.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the steps of the method of the first aspect.
The technical scheme provided by the embodiment of the disclosure can comprise the following beneficial effects: the speed truth value curve of the traffic target can be obtained based on the real-time environment sensing data acquired by the vehicle environment sensing sensor, and the speed detection value curve of the traffic target is obtained by combining the automatic driving software, so that whether the automatic driving software passes the test is determined according to the analysis results obtained by analyzing the speed truth value curve and the speed detection value curve. The automatic comparison can be performed based on large-scale real-time environment perception data, so that the detection of the perception capability of the automatic driving software is realized, and the safety and the effectiveness of the automatic driving software are ensured.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow chart illustrating a method of testing vehicle autopilot software according to one exemplary embodiment.
FIG. 2 is a flow chart illustrating another method of testing vehicle autopilot software according to one exemplary embodiment.
FIG. 3 is a flowchart illustrating yet another method of testing vehicle autopilot software according to one exemplary embodiment.
FIG. 4 is a test flow diagram illustrating a vehicle autopilot software in accordance with one exemplary embodiment.
Fig. 5 is a block diagram of a test apparatus for vehicle autopilot software according to one exemplary embodiment.
FIG. 6 is a block diagram of another test apparatus for vehicle autopilot software according to one exemplary embodiment.
Fig. 7 is a schematic diagram of an electronic device, according to an example embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the invention; rather, they are merely examples of apparatus and methods consistent with aspects of the invention as detailed in the appended claims.
In the description of the present disclosure, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The ordinal numbers "first", "second", etc. referred to in this disclosure are merely for ease of description and are not intended to limit the scope of embodiments of this disclosure, nor to indicate an ordering.
FIG. 1 is a flow chart illustrating a method of testing vehicle autopilot software according to one exemplary embodiment. As shown in fig. 1, the method may include, but is not limited to, the following steps.
Step S101: and acquiring real-time environment sensing data acquired by the vehicle environment sensing sensor.
Wherein, in embodiments of the present disclosure, the context-aware sensor comprises at least one of the following sensors: laser radar, camera and millimeter wave radar; the real-time context awareness data includes at least one of the following: real-time laser point cloud data, real-time image data, and real-time millimeter wave radar data.
Step S102: and acquiring a speed truth value curve of the traffic target according to the real-time environment perception data.
For example, for every two adjacent frames of the real-time environment sensing data, the distance the traffic target travels between the frames and the corresponding time difference are obtained; dividing the distance by the time difference yields the speed of the traffic target in the interval between the two frames. Repeating this over a period of time gives a plurality of speed values for the traffic target; these values are plotted in a coordinate system with time and speed as the two axes and connected to obtain the speed truth curve of the traffic target.
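The adjacent-frame computation above can be sketched as follows. This is a minimal illustration, not code from the patent; the function name, the 2-D position layout, and the use of speed magnitude (the patent's speed truth values are vectors with direction) are simplifying assumptions:

```python
from typing import List, Tuple

def speed_curve(positions: List[Tuple[float, float]],
                timestamps: List[float]) -> List[Tuple[float, float]]:
    """Build a (time, speed) truth curve from per-frame target positions.

    For every two adjacent frames, the distance the target travels divided
    by the elapsed time gives its speed over that interval; plotting these
    samples against time and connecting them yields the speed curve.
    """
    curve = []
    for i in range(1, len(positions)):
        (x0, y0), (x1, y1) = positions[i - 1], positions[i]
        dt = timestamps[i] - timestamps[i - 1]
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        # Attribute the speed sample to the midpoint of the two frame times.
        curve.append(((timestamps[i - 1] + timestamps[i]) / 2, dist / dt))
    return curve

# A target advancing 1 m per 0.1 s frame yields a flat 10 m/s curve.
curve = speed_curve([(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)], [0.0, 0.1, 0.2])
```

In practice the samples would also be smoothed or interpolated before comparison, but the frame-pair differencing is the core of the step.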
In embodiments of the present disclosure, among others, the traffic target may be a target that may have an influence on the running of the vehicle (e.g., other vehicles or pedestrians on the same road, etc.).
It should be noted that, in the embodiments of the present disclosure, each speed truth in the speed truth curve is a speed vector, i.e., includes the magnitude and direction of the speed.
In one possible implementation, the real-time laser point cloud data in the real-time environment sensing data may be used to obtain the speed truth curve of the traffic target. For example, the real-time laser point cloud data is input into a laser point cloud model to obtain the corresponding real-time speed at each moment; these real-time speeds are taken as speed truth values and connected to obtain the speed truth curve of the traffic target. The laser point cloud model here is a large, pre-trained model whose performance far exceeds that of the laser point cloud model inside the autopilot software under test.
Step S103: and acquiring a speed detection value curve of the traffic target according to the real-time environment sensing data and the automatic driving software.
For example, using autopilot software to be tested, frame-by-frame speed prediction is performed on a traffic target based on real-time environment sensing data to obtain a plurality of speed detection values of the traffic target, and the speed detection values of the traffic target in all frames of the real-time environment sensing data are connected to obtain a speed detection value curve of the traffic target.
In the embodiment of the present disclosure, the autopilot software is software capable of detecting related information (such as a relative position and a speed) of a traffic target based on environmental awareness data, where the autopilot software may include at least one of a laser point cloud model and a visual target detection model, and the autopilot software may further include any other type of awareness model, which is not limited and not repeated herein.
It should be noted that, in the embodiment of the present disclosure, each speed detection value in the speed detection value curve is a speed vector, that is, includes the magnitude and direction of the speed.
In some embodiments of the present disclosure, step S103 may be performed on the vehicle during the drive test.
In other embodiments of the present disclosure, to avoid reducing the available computing power of the drive test vehicle by executing step S103 on board, the environment sensing sensor may be used only to collect the real-time environment sensing data during the drive test; after the drive test ends, the real-time environment sensing data is input into a pre-built simulation platform running the autopilot software, where step S103 is executed to obtain the speed detection value curve of the traffic target.
Wherein, in embodiments of the present disclosure, the autopilot software includes at least one of the following perception models: the system comprises a second point cloud deep learning model, an image recognition model and a millimeter wave radar perception model.
In some embodiments of the present disclosure, the autopilot software may also include a laser point cloud model.
Step S104: and analyzing and processing the direction and/or the error of the speed truth curve and the speed detection value curve to obtain an analysis result.
As one example, the speed truth curve and the speed detection value curve are analyzed for direction, yielding an analysis result indicating whether the two curves have different speed directions at any same moment.

As another example, the speed truth curve and the speed detection value curve are analyzed for error, yielding an analysis result describing the magnitude of the error between the two curves.

As yet another example, the speed truth curve and the speed detection value curve are analyzed for both direction and error, yielding an analysis result indicating whether the two curves have different speed directions at any same moment together with the error between them.
Step S105: and determining whether the automatic driving software passes the test according to the analysis result.
As one example, taking the direction analysis of the speed truth curve and the speed detection value curve: in response to the analysis result showing that the two curves never differ in speed direction at the same moment, it is determined that the automatic driving software passes the test; conversely, in response to the analysis result showing that the two curves differ in speed direction at some same moment, it is determined that the automatic driving software fails the test.

As another example, taking the error analysis of the speed truth curve and the speed detection value curve: in response to the analysis result showing that the error between the two curves is less than or equal to a preset error threshold, it is determined that the automatic driving software passes the test; or, in response to the analysis result showing that the error exceeds the preset error threshold, it is determined that the automatic driving software fails the test.

As yet another example, taking the combined direction and error analysis: in response to the analysis result showing that the two curves never differ in speed direction at the same moment and that the error between them is less than or equal to the preset error threshold, it is determined that the automatic driving software passes the test; otherwise, it is determined that the automatic driving software fails the test.
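The three pass/fail rules above can be condensed into one hedged sketch. The 1-D signed-speed representation, the function name, and the default threshold value are illustrative assumptions, not values from the patent:

```python
from typing import List

def passes_test(truth: List[float], detected: List[float],
                error_threshold: float = 0.5) -> bool:
    """Point-by-point comparison of a speed truth curve and a detection curve.

    The software fails if, at any shared sample, the two speeds point in
    opposite directions (a speed reversal) or the absolute error exceeds
    the preset threshold; it passes otherwise.
    """
    for v_true, v_det in zip(truth, detected):
        if v_true * v_det < 0:  # different speed directions at the same moment
            return False
        if abs(v_true - v_det) > error_threshold:  # error beyond the threshold
            return False
    return True

print(passes_test([10.0, 10.2, 9.8], [10.1, 10.0, 9.9]))   # True: close match
print(passes_test([10.0, 10.2, 9.8], [10.1, -10.0, 9.9]))  # False: reversal
```

Real curves would be resampled onto shared timestamps before such a comparison; `zip` here simply assumes the points already align.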
By implementing the embodiment of the disclosure, the speed truth curve of the traffic target can be obtained based on the real-time environment sensing data acquired by the vehicle environment sensing sensor, and the speed detection value curve of the traffic target is obtained by combining the automatic driving software, so that whether the automatic driving software passes the test is determined according to the analysis result obtained by analyzing the speed truth curve and the speed detection value curve. The automatic comparison can be performed based on large-scale real-time environment perception data, so that the detection of the perception capability of the automatic driving software is realized, and the safety and the effectiveness of the automatic driving software are ensured.
In one implementation, the context-aware sensor includes at least a lidar such that a speed truth curve of the traffic target may be obtained based on data obtained by the lidar. As an example, referring to fig. 2, fig. 2 is a flowchart illustrating another method of testing vehicle autopilot software according to one exemplary embodiment. As shown in fig. 2, the method may include, but is not limited to, the following steps.
Step S201: and acquiring real-time environment sensing data acquired by the vehicle environment sensing sensor.
In the embodiment of the present disclosure, step S201 may be implemented in any manner of each embodiment of the present disclosure, which is not limited to this embodiment, and is not described in detail.
Step S202: and acquiring a target identification result according to the real-time laser point cloud data and the trained first point cloud deep learning model.
Wherein, in embodiments of the present disclosure, the target recognition result includes a real-time size and a real-time location of the target; the trained first point cloud deep learning model is obtained according to the labeling information of the sample target, wherein the labeling information at least comprises the size and the position of the sample target.
For example, the real-time laser point cloud data is input into a trained first point cloud deep learning model to process the real-time laser point cloud data and obtain a target recognition result.
In an embodiment of the present disclosure, the trained first point cloud deep learning model is obtained according to labeling information of a sample target, where the labeling information includes the size and position of the sample target. The labeling information is derived from prediction data output by the point cloud deep learning model, the prediction data including the predicted size and position of the sample target.
Step S203: and acquiring real-time position information of the traffic target according to the real-time size and the real-time position of the target in the target identification result.
For example, according to the real-time size and real-time position of the target in the target recognition result, it is determined whether the target is a traffic target; in response to determining that it is, the real-time position information of the traffic target is obtained from the real-time position.

In some embodiments of the present disclosure, a target whose real-time size is greater than a preset threshold may be determined to be a traffic target, and its real-time position may be taken as the position information of the traffic target. The preset threshold is used to judge whether a target is a traffic target.
Step S204: and determining time information corresponding to the real-time laser point cloud data, and acquiring a speed truth curve of the traffic target according to the real-time position information and the time information.
For example, time information corresponding to a real-time position of a target in real-time laser point cloud data is determined, a plurality of corresponding speed truth values of a traffic target at a plurality of time points are obtained according to the real-time position information and the time information, the plurality of speed truth values are connected in time sequence, and a speed truth value curve of the traffic target is obtained.
It may be appreciated that the real-time laser point cloud data consists of continuous multi-frame data. In some embodiments of the present disclosure, the real-time laser point cloud data may be input into a pre-trained model to obtain the position information of the traffic target in each frame; the speed truth value for each frame is then obtained from the per-frame position information and the corresponding time information, and the speed truth values are connected to obtain the speed truth curve of the traffic target.
Step S205: and acquiring a speed detection value curve of the traffic target according to the real-time environment sensing data and the automatic driving software.
In the embodiment of the present disclosure, step S205 may be implemented in any manner in each embodiment of the present disclosure, which is not limited to this embodiment, and is not described in detail.
Step S206: and analyzing and processing the direction and/or the error of the speed truth curve and the speed detection value curve to obtain an analysis result.
In the embodiment of the present disclosure, step S206 may be implemented in any manner of each embodiment of the present disclosure, which is not limited to this embodiment, and is not described in detail.
Step S207: and determining whether the automatic driving software passes the test according to the analysis result.
In the embodiment of the present disclosure, step S207 may be implemented in any manner of each embodiment of the present disclosure, which is not limited to this embodiment, and is not described in detail.
By implementing the embodiment of the disclosure, the speed truth curve of the traffic target can be obtained based on the data obtained by the laser radar, and the speed detection value curve of the traffic target is obtained by combining the automatic driving software, so that whether the automatic driving software passes the test is determined according to the analysis results obtained by analyzing the speed truth curve and the speed detection value curve. The automatic comparison can be performed based on large-scale real-time environment perception data, so that the detection of the perception capability of the automatic driving software is realized, and the safety and the effectiveness of the automatic driving software are ensured.
In one implementation, the speed truth curve and the speed detection value curve may be analyzed point-by-point to determine whether the autopilot software passes the test based on the analysis results. As an example, referring to fig. 3, fig. 3 is a flowchart illustrating a test method of still another vehicle autopilot software according to one exemplary embodiment. As shown in fig. 3, the method may include, but is not limited to, the following steps.
Step S301: and acquiring real-time environment sensing data acquired by the vehicle environment sensing sensor.
In the embodiment of the present disclosure, step S301 may be implemented in any manner described in any embodiment of the present disclosure, which is not limited here and is not described in detail.
Step S302: and acquiring a speed truth value curve of the traffic target according to the real-time environment perception data.
In the embodiment of the present disclosure, step S302 may be implemented in any manner described in any embodiment of the present disclosure, which is not limited here and is not described in detail.
Step S303: and acquiring a speed detection value curve of the traffic target according to the real-time environment sensing data and the automatic driving software.
In the embodiment of the present disclosure, step S303 may be implemented in any manner described in any embodiment of the present disclosure, which is not limited here and is not described in detail.
Step S304: and analyzing the speed truth curve and the speed detection value curve point by point for direction and/or error.
As one example, the speed truth curve and the speed detection value curve are analyzed point by point for speed direction.
As another example, the speed error analysis is performed point-by-point on a speed truth curve and a speed detection value curve.
As yet another example, the speed truth curve and the speed detection value curve are analyzed point-by-point for speed direction and speed error.
Step S305: and obtaining a speed direction analysis result and/or a speed error analysis result of the same points of the speed truth curve and the speed detection value curve.
As an example, taking the analysis of the direction of the speed truth curve and the speed detection value curve point by point as an example, the speed direction analysis result of the same point of the speed truth curve and the speed detection value curve is obtained.
As another example, taking the analysis of the error of the speed truth curve and the speed detection value curve point by point as an example, the speed error analysis result of the same point of the speed truth curve and the speed detection value curve is obtained.
As yet another example, taking an example of analyzing the direction and the error point by point of the speed truth curve and the speed detection value curve, a speed direction analysis result and a speed error analysis result of the same point of the speed truth curve and the speed detection value curve are obtained.
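The point-by-point direction and error analysis of steps S304 and S305 may be sketched as follows. As illustrative assumptions (not part of the disclosure), velocities are represented here as two-dimensional vectors sampled at the same time instants, and the function name is hypothetical.

```python
import numpy as np

def analyze_point_by_point(truth, detected):
    """Compare a speed truth curve and a speed detection value curve
    at the same sample points.

    truth, detected: (N, 2) arrays of velocity vectors at the same
    time instants.
    Returns (reversal, error):
      reversal: (N,) bool array, True where the detected direction
                opposes the true direction (negative dot product)
      error:    (N,) array of absolute speed-magnitude errors
    """
    truth = np.asarray(truth, dtype=float)
    detected = np.asarray(detected, dtype=float)
    # Speed direction analysis: a negative dot product means the
    # detected velocity points against the true velocity.
    reversal = np.einsum('ij,ij->i', truth, detected) < 0
    # Speed error analysis: magnitude difference at each point.
    error = np.abs(np.linalg.norm(truth, axis=1)
                   - np.linalg.norm(detected, axis=1))
    return reversal, error

rev, err = analyze_point_by_point([[1.0, 0.0], [2.0, 0.0]],
                                  [[-1.0, 0.0], [2.5, 0.0]])
# rev → [True, False]; err → [0.0, 0.5]
```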
Step S306: and determining whether the automatic driving software passes the test according to the analysis result.
In an alternative implementation, determining whether the autopilot software passes the test according to the analysis result may include the following steps: acquiring target points at which speed reversal exists and/or the speed error meets a preset condition in the speed truth curve and the speed detection value curve according to the speed direction analysis result and/or the speed error analysis result of the same points of the two curves; determining the total number of times of vehicle driving problems caused by the automatic driving software according to these target points; and determining whether the automatic driving software passes the test according to the total number of times of vehicle driving problems.
As an example, taking the analysis result including a speed direction analysis result as an example, target points with opposite speed directions at the same moment in the speed truth curve and the speed detection value curve are obtained according to the speed direction analysis result of the same time points of the two curves, and the total number of times of vehicle driving problems caused by the automatic driving software is determined according to the number of the target points with opposite speed directions; the automatic driving software is determined to fail the test in response to the total number of times being greater than a times threshold, or determined to pass the test in response to the total number of times being less than or equal to the times threshold.

As another example, taking the analysis result including a speed error analysis result as an example, target points whose speed error meets a preset condition (for example, is greater than a speed threshold) in the speed truth curve and the speed detection value curve are obtained according to the speed error analysis result of the same time points of the two curves, and the total number of times of vehicle driving problems caused by the automatic driving software is determined according to the number of such target points; the automatic driving software is determined to fail the test in response to the total number of times being greater than the times threshold, or determined to pass the test in response to the total number of times being less than or equal to the times threshold.

As yet another example, taking the analysis result including both a speed direction analysis result and a speed error analysis result as an example, target points with opposite speed directions at the same moment and target points whose speed error meets the preset condition are obtained from the speed truth curve and the speed detection value curve according to the two analysis results of the same time points; the total number of times of vehicle driving problems caused by the automatic driving software is determined according to the number of these target points; and the automatic driving software is determined to fail the test in response to the total number of times being greater than the times threshold, or determined to pass the test in response to the total number of times being less than or equal to the times threshold.
In the embodiment of the present application, the speed threshold is a speed value used for determining whether the speed error between the speed truth curve and the speed detection value curve at the same time instant is excessive.
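Given per-point reversal flags and speed errors together with the speed threshold described above, extracting the target points may be sketched as follows; the helper name and sample values are illustrative assumptions.

```python
def find_target_points(reversal, errors, speed_threshold):
    """Indices of sample points that count as target points: the
    detected speed direction is reversed relative to the truth, or
    the speed error exceeds the speed threshold."""
    return [i for i, (rev, err) in enumerate(zip(reversal, errors))
            if rev or err > speed_threshold]

points = find_target_points([False, True, False, False],
                            [0.1, 0.0, 2.3, 0.2],
                            speed_threshold=1.0)   # → [1, 2]
```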
Optionally, determining the total number of times of vehicle driving problems caused by the automatic driving software according to the target points at which speed reversal exists and/or the speed error meets the preset condition may include the following steps: performing sliding window processing on the speed truth curve and the speed detection value curve, and obtaining the number of sliding windows containing target points at which speed reversal exists and/or the speed error meets the preset condition; and determining the number of such sliding windows as the total number of times of vehicle driving problems caused by the automatic driving software.
As an example, the speed truth value curve and the speed detection value curve are subjected to sliding window processing, so that the number of sliding windows with the speed reversal target points is obtained; and determining the number of sliding windows with the speed reversal target points as the total number of times of vehicle driving problems caused by automatic driving software.
As another example, the speed truth value curve and the speed detection value curve are subjected to sliding window processing, so that the number of sliding windows containing target points whose speed error meets the preset condition is obtained; and the number of such sliding windows is determined as the total number of times of vehicle driving problems caused by the automatic driving software.
As yet another example, the speed truth value curve and the speed detection value curve are subjected to sliding window processing, the number of sliding windows of the target point with the speed reversal is obtained, and the number of sliding windows of the target point with the speed error meeting the preset condition is obtained; and determining the number of sliding windows of the target point with the speed reversal and the speed error meeting the preset conditions as the total number of times of vehicle driving problems caused by automatic driving software.
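The sliding-window counting described above, in which a run of consecutive problem points is counted as a single driving problem rather than once per point, may be sketched as follows. The window size and hit threshold are illustrative assumptions, not values taken from the disclosure.

```python
def count_problem_windows(is_problem, window=5, min_hits=3):
    """Slide a fixed-size window over per-point problem flags (speed
    reversal and/or excessive speed error) and count windows holding
    at least `min_hits` problem points; each counted window is skipped
    over so one run of bad points yields one problem occurrence."""
    total, i, n = 0, 0, len(is_problem)
    while i + window <= n:
        if sum(is_problem[i:i + window]) >= min_hits:
            total += 1
            i += window   # jump past the counted window
        else:
            i += 1
    return total

# Three consecutive problem points inside ten samples count once:
flags = [False, False, False, True, True, True, False, False, False, False]
count = count_problem_windows(flags)   # → 1
```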
By implementing the embodiment of the application, the speed truth value curve of the traffic target can be obtained based on the real-time environment sensing data acquired by the vehicle environment sensing sensor, the speed detection value curve of the traffic target is obtained by combining the automatic driving software, the speed truth value curve and the speed detection value curve are analyzed point by point, and therefore whether the automatic driving software passes the test is determined according to the analysis result. The automatic comparison can be performed based on large-scale real-time environment perception data, so that the detection of the perception capability of the automatic driving software is realized, and the safety and the effectiveness of the automatic driving software are ensured.
In some embodiments of the present disclosure, the method for testing vehicle autopilot software may further include: in response to the automatic driving software failing the test, determining corresponding scene information according to the real-time environment sensing data and/or determining the category of the traffic target according to the real-time environment sensing data.
As one example, responsive to the autopilot software failing the test, the context information corresponding to the target point is determined from the real-time context awareness data. For example, weather, illumination intensity, and road conditions (e.g., intersections, tunnels, or tight-turning roads) are acquired through real-time image data.
As another example, in response to the autopilot software failing the test, a category of a traffic target corresponding to the target point is determined from the real-time context awareness data. For example, when the traffic target is a vehicle, acquiring a vehicle type (e.g., a large truck or a car, etc.) of the traffic target through real-time environmental awareness data; when the traffic target includes a person, a specific action pattern (e.g., walking or riding, etc.) of the person is acquired.
As yet another example, responsive to the autopilot software failing the test, scene information corresponding to the aforementioned target point, and a category of traffic target corresponding to the aforementioned target point, are determined from the real-time context awareness data.
By implementing the embodiment of the application, corresponding scene information and/or the category of the traffic target can be determined according to the real-time environment perception data in response to the automatic driving software failing the test, so as to analyze the causes of the vehicle driving problems caused by the automatic driving software, which helps improve the safety and effectiveness of the automatic driving software.
Referring to fig. 4, fig. 4 is a test flow diagram illustrating vehicle autopilot software in accordance with one exemplary embodiment. As shown in fig. 4, in this scenario, a raw data packet (pack) acquired by a vehicle sensor is first acquired; a speed measurement curve (namely, a measured speed value curve) of a traffic target is then obtained according to the raw data packet, and a truth curve (namely, a real speed value curve) of the same traffic target is obtained based on a tracking algorithm; the speed measurement curve and the truth curve are compared point by point to obtain points in the curves with direction-reversal problems and large errors, a sliding window method is used for counting, and consecutive large-error points are counted as a single occurrence of a problem; the pass rate of the data packets is then counted, and the vehicle automatic driving software is determined to pass the test in response to the pass rate being greater than a pass rate threshold, or determined to fail the test in response to the pass rate being less than or equal to the pass rate threshold.
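The final pass-rate decision in the flow of fig. 4 may be sketched as follows; the default threshold values and function names here are illustrative assumptions, not values taken from the disclosure.

```python
def packet_pass_rate(problem_counts, times_threshold=0):
    """Fraction of raw data packets whose total problem count does not
    exceed the allowed times threshold (such packets pass)."""
    passed = sum(1 for c in problem_counts if c <= times_threshold)
    return passed / len(problem_counts)

def software_passes(problem_counts, times_threshold=0, rate_threshold=0.95):
    """Overall verdict: the autopilot software passes the test when the
    packet pass rate is greater than the pass-rate threshold."""
    return packet_pass_rate(problem_counts, times_threshold) > rate_threshold

rate = packet_pass_rate([0, 1, 0, 0])      # → 0.75
verdict = software_passes([0, 0, 0, 0])    # → True
```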
Referring to fig. 5, fig. 5 is a block diagram of a test apparatus for vehicle autopilot software according to one exemplary embodiment. As shown in fig. 5, the apparatus 500 includes: an acquisition module 501, configured to acquire real-time environmental awareness data acquired by a vehicle environmental awareness sensor; the first processing module 502 is configured to obtain a speed truth curve of a traffic target according to real-time environmental awareness data; a second processing module 503, configured to obtain a speed detection value curve of the traffic target according to the real-time environmental awareness data and the autopilot software; the third processing module 504 is configured to perform analysis processing on the direction and/or the error on the speed truth curve and the speed detection value curve to obtain an analysis result; a determining module 505, configured to determine whether the autopilot software passes the test according to the analysis result.
In one implementation, the context-aware sensor includes at least a lidar and the real-time context-aware data includes at least real-time laser point cloud data; the first processing module 502 is specifically configured to: acquire a target identification result according to the real-time laser point cloud data and the trained first point cloud deep learning model, the target identification result including the real-time size and the real-time position of the target, where the trained first point cloud deep learning model is a model obtained according to labeling information of a sample target, the labeling information including at least the size and the position of the sample target; acquire real-time position information of the traffic target according to the real-time size and the real-time position of the target in the target identification result; and determine time information corresponding to the real-time laser point cloud data, and acquire a speed truth curve of the traffic target according to the real-time position information and the time information.
In one implementation, the context-aware sensor includes at least one of the following sensors: laser radar, camera and millimeter wave radar; the real-time context awareness data includes at least one of the following: real-time laser point cloud data, real-time image data, and real-time millimeter wave radar data; the autopilot software includes at least one of the following perception models: the system comprises a second point cloud deep learning model, an image recognition model and a millimeter wave radar perception model.
In one implementation, the third processing module 504 is specifically configured to: analyzing the direction and/or the error point by point of the speed truth value curve and the speed detection value curve; and obtaining a speed direction analysis result and/or a speed error analysis result of the same points of the speed true value curve and the speed detection value curve.
In an alternative implementation, the determining module 505 is specifically configured to: acquiring a target point with speed reversal and/or speed error meeting preset conditions in the speed truth curve and the speed detection value curve according to the speed direction analysis result and/or the speed error analysis result of the same points of the speed truth curve and the speed detection value curve; determining the total times of vehicle driving problems caused by automatic driving software according to target points with speed reversal and/or speed errors meeting preset conditions; and determining whether the automatic driving software passes the test according to the total times of the vehicle driving problems.
Optionally, the determining module 505 is specifically configured to: carrying out sliding window processing on the speed truth value curve and the speed detection value curve, and obtaining the number of sliding windows of target points with speed reversal and/or speed errors meeting preset conditions; and determining the number of sliding windows of the target points with the speed reversal and/or the speed error meeting the preset conditions as the total number of times of vehicle driving problems caused by automatic driving software.
According to the device of the embodiment of the present disclosure, the speed truth curve of the traffic target can be obtained based on the real-time environment sensing data acquired by the vehicle environment sensing sensor, and the speed detection value curve of the traffic target can be obtained by combining the automatic driving software, so that whether the automatic driving software passes the test is determined according to the analysis result obtained by analyzing the speed truth curve and the speed detection value curve. Automatic comparison can be performed based on large-scale real-time environment perception data, thereby realizing detection of the perception capability of the automatic driving software and ensuring the safety and effectiveness of the automatic driving software.
In one implementation, the apparatus further includes a fourth processing module. As an example, referring to fig. 6, fig. 6 is a block diagram of a testing apparatus of another vehicle autopilot software shown according to an exemplary embodiment. As shown in fig. 6, the apparatus 600 further includes a fourth processing module 606 for determining corresponding context information from the real-time context awareness data and/or determining a category of traffic targets from the real-time context awareness data in response to the autopilot software failing the test. The modules 601 to 605 in fig. 6 have the same structure and function as the modules 501 to 505 in fig. 5.
The specific manner in which each module performs operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method, and is not described in detail here.
Referring to fig. 7, fig. 7 is a schematic diagram of an electronic device according to an exemplary embodiment. For example, the electronic device 700 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, wearable device, or the like.
Referring to fig. 7, an electronic device 700 may include one or more of the following components: a processing component 702, a memory 704, a power component 706, a multimedia component 708, an audio component 710, an input/output (I/O) interface 712, a sensor component 714, and a communication component 716.
The processing component 702 generally controls overall operation of the electronic device 700, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 702 may include one or more processors 720 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 702 can include one or more modules that facilitate interaction between the processing component 702 and other components. For example, the processing component 702 may include a multimedia module to facilitate interaction between the multimedia component 708 and the processing component 702.
The memory 704 is configured to store various types of data to support operations at the electronic device 700. Examples of such data include instructions for any application or method operating on the electronic device 700, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 704 may be implemented by any type or combination of volatile or nonvolatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk.
The power supply component 706 provides power to the various components of the electronic device 700. Power supply components 706 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for electronic device 700.
The multimedia component 708 includes a screen providing an output interface between the electronic device 700 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensor may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 708 includes a front-facing camera and/or a rear-facing camera. When the electronic device 700 is in an operational mode, such as a shooting mode or a video mode, the front camera and/or the rear camera may receive external multimedia data. Each front camera and rear camera may be a fixed optical lens system or have focal length and optical zoom capability.
The audio component 710 is configured to output and/or input audio signals. For example, the audio component 710 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 700 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 704 or transmitted via the communication component 716. In some embodiments, the audio component 710 further includes a speaker for outputting audio signals.
The input/output interface 712 provides an interface between the processing component 702 and peripheral interface modules, which may be a keyboard, click wheel, buttons, etc. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
The sensor assembly 714 includes one or more sensors for providing status assessment of various aspects of the electronic device 700. For example, the sensor assembly 714 may detect an on/off state of the electronic device 700, a relative positioning of the components, such as a display and keypad of the electronic device 700, a change in position of the electronic device 700 or a component of the electronic device 700, the presence or absence of a user's contact with the electronic device 700, an orientation or acceleration/deceleration of the electronic device 700, and a change in temperature of the electronic device 700. The sensor assembly 714 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 714 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 714 may also include an acceleration sensor, a gyroscopic sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 716 is configured to facilitate wired or wireless communication between the electronic device 700 and other devices. The electronic device 700 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 716 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 716 further includes a near field communication (NFC) module to facilitate short range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 700 may be implemented by one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described in any one of the embodiments above.
The present disclosure also provides a readable storage medium having instructions stored thereon which, when executed by a computer, perform the functions of any of the method embodiments described above.
The present disclosure also provides a computer program product which, when executed by a computer, performs the functions of any of the method embodiments described above.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer programs. When the computer program is loaded and executed on a computer, the flows or functions described in accordance with the embodiments of the present disclosure are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer program may be stored in a computer readable storage medium or transmitted from one computer readable storage medium to another; for example, it may be transmitted from one website, computer, server, or data center to another by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a high-density digital video disc (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
Those of ordinary skill in the art will appreciate that: the various numbers of first, second, etc. referred to in this disclosure are merely for ease of description and are not intended to limit the scope of embodiments of this disclosure, nor to indicate sequencing.
"At least one" in the present disclosure may also be described as one or more, and "a plurality" may be two, three, four, or more, which is not limited in the present disclosure. In the embodiments of the present disclosure, for a technical feature, the technical features therein are distinguished by "first", "second", "third", "A", "B", "C", and "D", and the technical features so described are not in any sequence or order of magnitude.
Predefined in this disclosure may be understood as defining, predefining, storing, pre-negotiating, pre-configuring, curing, or pre-sintering.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
The foregoing is merely specific embodiments of the present disclosure, but the protection scope of the present disclosure is not limited thereto; any changes or substitutions that can readily occur to a person skilled in the art within the technical scope of the present disclosure shall be covered by the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (9)

1. A method for testing vehicle autopilot software, comprising:
acquiring real-time environment sensing data acquired by a vehicle environment sensing sensor;
acquiring a speed truth value curve of a traffic target according to the real-time environment perception data;
acquiring a speed detection value curve of the traffic target according to the real-time environment sensing data and the automatic driving software;
analyzing the direction and/or the error of the speed truth curve and the speed detection value curve to obtain an analysis result;
Determining whether the autopilot software passes a test according to the analysis result;
the environment sensing sensor at least comprises a laser radar, and the real-time environment sensing data at least comprises real-time laser point cloud data; the obtaining the speed truth curve of the traffic target according to the real-time environment sensing data comprises the following steps:
acquiring a target identification result according to the real-time laser point cloud data and the trained first point cloud deep learning model; the target identification result comprises the real-time size and the real-time position of the target; the trained first point cloud deep learning model is a model obtained according to labeling information of a sample target, wherein the labeling information at least comprises the size and the position of the sample target;
acquiring real-time position information of the traffic target according to the real-time size and the real-time position of the target in the target identification result;
and determining time information corresponding to the real-time laser point cloud data, and acquiring a speed truth curve of the traffic target according to the real-time position information and the time information.
2. The method of claim 1, wherein,
the context-aware sensor includes at least one of the following sensors: laser radar, camera and millimeter wave radar;
The real-time context aware data includes at least one of the following: real-time laser point cloud data, real-time image data, and real-time millimeter wave radar data;
the autopilot software includes at least one of the following perception models: the system comprises a second point cloud deep learning model, an image recognition model and a millimeter wave radar perception model.
3. The method of claim 1, wherein analyzing the speed truth curve and the speed detection value curve for direction and/or error to obtain the analysis result comprises:
performing a point-by-point analysis of direction and/or error on the speed truth curve and the speed detection value curve; and
obtaining a speed direction analysis result and/or a speed error analysis result for corresponding points of the speed truth curve and the speed detection value curve.
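A minimal sketch of the point-by-point comparison in claim 3, assuming both curves are velocity vectors sampled at the same frames. The function name and the reversal criterion (a negative dot product between truth and detected velocity) are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def pointwise_direction_and_error(v_truth, v_det):
    """Compare two velocity curves point by point: flag direction
    reversals and compute the per-point speed error magnitude."""
    v_truth = np.asarray(v_truth, dtype=float)
    v_det = np.asarray(v_det, dtype=float)
    # a point is "reversed" when the detected velocity opposes the truth
    reversed_mask = np.sum(v_truth * v_det, axis=1) < 0
    # speed error: magnitude of the per-point velocity difference
    speed_err = np.linalg.norm(v_truth - v_det, axis=1)
    return reversed_mask, speed_err

# Truth is 10 m/s along +x at both points; the detection flips
# direction at point 0 and is slightly fast at point 1.
rev, err = pointwise_direction_and_error([[10.0, 0.0], [10.0, 0.0]],
                                         [[-9.0, 0.0], [10.5, 0.0]])
```

Thresholding `speed_err` against a preset value would then yield the "speed error satisfies a preset condition" test used in claim 4.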
4. The method of claim 3, wherein determining whether the automatic driving software passes the test according to the analysis result comprises:
acquiring, according to the speed direction analysis result and/or the speed error analysis result for corresponding points of the speed truth curve and the speed detection value curve, target points at which a speed reversal exists and/or a speed error satisfies a preset condition;
determining, according to the target points at which a speed reversal exists and/or the speed error satisfies the preset condition, a total number of times the automatic driving software causes a vehicle driving problem; and
determining whether the automatic driving software passes the test according to the total number of times of vehicle driving problems.
5. The method of claim 4, wherein determining the total number of times the automatic driving software causes a vehicle driving problem according to the target points at which a speed reversal exists and/or the speed error satisfies the preset condition comprises:
performing sliding-window processing on the speed truth curve and the speed detection value curve, and obtaining the number of sliding windows containing a target point at which a speed reversal exists and/or the speed error satisfies the preset condition; and
determining that number of sliding windows as the total number of times the automatic driving software causes a vehicle driving problem.
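The sliding-window counting in claim 5 can be sketched as follows. The window size, stride, and function name are hypothetical; the patent states only that the number of windows containing a flagged target point is taken as the total number of driving problems:

```python
def count_problem_windows(flags, window, stride):
    """Slide a fixed-size window over per-point problem flags (speed
    reversal or excessive speed error) and count windows that contain
    at least one flagged target point."""
    n = len(flags)
    count = 0
    for start in range(0, max(n - window + 1, 1), stride):
        if any(flags[start:start + window]):
            count += 1
    return count

# Two isolated flagged points fall into two separate windows.
total = count_problem_windows(
    [False, True, False, False, True, False], window=2, stride=2)
```

Grouping flagged points into windows keeps a burst of consecutive errors from being counted as many separate driving problems, which is a plausible motivation for the windowed count.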
6. The method of any one of claims 1 to 5, further comprising:
in response to the automatic driving software failing the test, determining corresponding scene information according to the real-time environment sensing data and/or determining a category of the traffic target according to the real-time environment sensing data.
7. A device for testing vehicle automatic driving software, comprising:
an acquisition module configured to acquire real-time environment sensing data collected by an environment sensing sensor of a vehicle;
a first processing module configured to acquire a speed truth curve of a traffic target according to the real-time environment sensing data;
a second processing module configured to acquire a speed detection value curve of the traffic target according to the real-time environment sensing data and the automatic driving software;
a third processing module configured to analyze the speed truth curve and the speed detection value curve for direction and/or error to obtain an analysis result; and
a determining module configured to determine whether the automatic driving software passes a test according to the analysis result;
wherein the environment sensing sensor comprises at least a laser radar, and the real-time environment sensing data comprises at least real-time laser point cloud data; and wherein acquiring the speed truth curve of the traffic target according to the real-time environment sensing data comprises:
acquiring a target recognition result according to the real-time laser point cloud data and a trained first point cloud deep learning model, the target recognition result comprising a real-time size and a real-time position of the target, wherein the trained first point cloud deep learning model is obtained according to labeling information of sample targets, the labeling information comprising at least the size and the position of each sample target;
acquiring real-time position information of the traffic target according to the real-time size and the real-time position of the target in the target recognition result; and
determining time information corresponding to the real-time laser point cloud data, and acquiring the speed truth curve of the traffic target according to the real-time position information and the time information.
8. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 6.
9. A computer-readable storage medium storing instructions which, when executed, cause the method of any one of claims 1 to 6 to be implemented.
CN202310317474.8A 2023-03-29 2023-03-29 Method, device, equipment and storage medium for testing vehicle autopilot software Active CN116030551B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310317474.8A CN116030551B (en) 2023-03-29 2023-03-29 Method, device, equipment and storage medium for testing vehicle autopilot software


Publications (2)

Publication Number Publication Date
CN116030551A CN116030551A (en) 2023-04-28
CN116030551B true CN116030551B (en) 2023-06-20

Family

ID=86089710

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310317474.8A Active CN116030551B (en) 2023-03-29 2023-03-29 Method, device, equipment and storage medium for testing vehicle autopilot software

Country Status (1)

Country Link
CN (1) CN116030551B (en)




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant