CN116046417A - Automatic driving perception limitation testing method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN116046417A
CN116046417A
Authority
CN
China
Prior art keywords
test
automatic driving
limitation
information
scene
Prior art date
Legal status
Granted
Application number
CN202310342476.2A
Other languages
Chinese (zh)
Other versions
CN116046417B (en)
Inventor
杨子江
张晓东
鲍杰
Current Assignee
Xi'an Xinxin Information Technology Co ltd
Original Assignee
Xi'an Xinxin Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Xi'an Xinxin Information Technology Co ltd filed Critical Xi'an Xinxin Information Technology Co ltd
Priority to CN202310342476.2A
Publication of CN116046417A
Application granted
Publication of CN116046417B
Status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00: Testing of vehicles
    • G01M17/007: Wheeled or endless-tracked vehicles
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00: Road transport of goods or passengers
    • Y02T10/10: Internal combustion engine [ICE] based vehicles
    • Y02T10/40: Engine management systems

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application provides an automatic driving perception limitation testing method and device, an electronic device, and a storage medium. The method comprises: acquiring test scene information and generating an automatic driving traffic scene from the test scene information through a simulation simulator; and performing a perception limitation test on an automatic driving vehicle using test information of the automatic driving traffic scene to obtain a perception limitation test result, wherein the test information comprises a preset route or a perception limitation test video. In this scheme, the automatic driving traffic scene is generated by the simulation simulator from user-defined test scene information, and the automatic driving vehicle is tested with the test information of that scene, so that the perception limitation test can be carried out according to scene information customized by the user, improving the flexibility of the perception limitation test.

Description

Automatic driving perception limitation testing method and device, electronic equipment and storage medium
Technical Field
The application relates to the technical field of automatic driving simulation, and in particular to an automatic driving perception limitation testing method and device, an electronic device, and a storage medium.
Background
Perception limitation (Perceptual Limitation) refers to the limits of an automatic driving vehicle's ability to perceive its external environment. Most of the information a human receives while driving comes from vision, such as traffic signs, road markings, and traffic signals, which are the main basis on which a human driver controls the vehicle. The cameras and sensors of an automatic driving vehicle, however, have their own perception limitations: a camera may have a sampling frequency that is too low, an extreme exposure time, or distortion, while a lidar sensor may have insufficient range, large ranging errors, or incomplete point clouds. These perception limitations leave the automatic driving vehicle with capability limits during driving, and such limits need to be examined through perception limitation tests.
Current traffic scene simulation software generally provides only a few fixed traffic scenes and cannot perform perception limitation tests according to user-defined scene information. In other words, perception limitation testing of automatic driving vehicles currently lacks flexibility.
Disclosure of Invention
Embodiments of the application aim to provide an automatic driving perception limitation testing method and device, an electronic device, and a storage medium, which are used to solve the problem that perception limitation testing of automatic driving vehicles lacks flexibility.
An embodiment of the application provides an automatic driving perception limitation testing method, comprising: acquiring test scene information and generating an automatic driving traffic scene from the test scene information through a simulation simulator; and performing a perception limitation test on an automatic driving vehicle using test information of the automatic driving traffic scene to obtain a perception limitation test result, wherein the test information comprises a preset route or a perception limitation test video. In this scheme, the automatic driving traffic scene is generated by the simulation simulator from user-defined test scene information, and the automatic driving vehicle is tested with the test information of that scene, so that the perception limitation test can be carried out according to scene information customized by the user, improving the flexibility of the perception limitation test.
Optionally, in an embodiment of the application, the test scene information includes multiple pieces of perception limitation information, and generating the automatic driving traffic scene from the test scene information through the simulation simulator comprises: judging whether any two pieces of the perception limitation information conform to a superposition rule in a preset rule base; if so, superposing the two pieces of perception limitation information to generate a single perception limitation test point in the automatic driving traffic scene, and otherwise generating two separate perception limitation test points from the two pieces of perception limitation information, thereby obtaining a plurality of perception limitation test points.
Optionally, in an embodiment of the application, the test information is a preset route, and performing the perception limitation test on the automatic driving vehicle using the test information of the automatic driving traffic scene comprises: acquiring a preset route in the automatic driving traffic scene, the preset route passing through a plurality of perception limitation test points; and sending the preset route to the automatic driving vehicle so that the vehicle drives along the preset route and completes the perception limitation test after passing the plurality of perception limitation test points.
Optionally, in an embodiment of the application, the test information is a perception limitation test video, and performing the perception limitation test on the automatic driving vehicle using the test information of the automatic driving traffic scene comprises: shooting at the perception limitation test points in the automatic driving traffic scene with a simulated light-sensing camera to obtain a perception limitation test video; and performing the perception limitation test on the automatic driving vehicle using the perception limitation test video.
Optionally, in an embodiment of the application, performing the perception limitation test on the automatic driving vehicle using the perception limitation test video comprises: acquiring the feedback behavior of the automatic driving vehicle on the perception limitation test video; judging whether the feedback behavior is the same as a preset behavior; if so, determining the perception limitation test result as passed, and otherwise determining it as failed. In this scheme, the feedback behavior of the automatic driving vehicle on the perception limitation test video is obtained and the test result is determined by whether that behavior matches the preset behavior, so that the perception limitation test can be carried out according to user-defined scene information, improving the flexibility of the perception limitation test.
Optionally, in an embodiment of the application, after generating the automatic driving traffic scene from the test scene information through the simulation simulator, the method further comprises: shooting in the automatic driving traffic scene with a simulated light-sensing camera to obtain an automatic driving simulation scene video; modifying the test scene information according to the simulation scene video to obtain modified scene information; generating a modified traffic scene from the modified scene information through the simulation simulator; and performing the perception limitation test on the automatic driving vehicle using the modified traffic scene. In this scheme, the simulation simulator generates the modified traffic scene from the modified scene information and the modified traffic scene is used to test the automatic driving vehicle again, so that the perception limitation test can be repeated according to scene information customized and modified by the user, improving the flexibility of the perception limitation test.
Optionally, in an embodiment of the application, modifying the test scene information according to the simulation scene video comprises: acquiring a real scene video of a real traffic driving scene; comparing the real scene video with the simulation scene video to obtain a comparison result; and modifying the test scene information according to the comparison result. In this scheme, comparing the real scene video with the simulation scene video and modifying the test scene information according to the comparison result effectively improves the efficiency of the automatic driving perception limitation test.
The embodiment of the application also provides an automatic driving perception limitation testing device, comprising: a traffic scene generation module for acquiring test scene information and generating an automatic driving traffic scene from the test scene information through a simulation simulator; and a test result obtaining module for performing a perception limitation test on an automatic driving vehicle using test information of the automatic driving traffic scene to obtain a perception limitation test result, the test information comprising a preset route or a perception limitation test video.
Optionally, in an embodiment of the application, the test scene information includes multiple pieces of perception limitation information, and the traffic scene generation module comprises: a superposition rule judging sub-module for judging whether any two pieces of the perception limitation information conform to a superposition rule in a preset rule base; and a perception limitation test point generation sub-module for superposing the two pieces of perception limitation information to generate a single perception limitation test point in the automatic driving traffic scene if they conform to the superposition rule, and otherwise generating two separate perception limitation test points from the two pieces of perception limitation information, thereby obtaining a plurality of perception limitation test points.
Optionally, in an embodiment of the application, the test information is a preset route, and the test result obtaining module comprises: a preset route acquisition sub-module for acquiring a preset route in the automatic driving traffic scene, the preset route passing through a plurality of perception limitation test points; the preset route is then sent to the automatic driving vehicle so that the vehicle drives along the preset route and completes the perception limitation test after passing the plurality of perception limitation test points.
Optionally, in an embodiment of the application, the test information is a perception limitation test video, and the test result obtaining module comprises: a test video acquisition sub-module for shooting at the perception limitation test points in the automatic driving traffic scene with a simulated light-sensing camera to obtain a perception limitation test video; and a perception limitation testing sub-module for performing the perception limitation test on the automatic driving vehicle using the perception limitation test video.
Optionally, in an embodiment of the application, the perception limitation testing sub-module comprises: a feedback behavior acquisition sub-module for acquiring the feedback behavior of the automatic driving vehicle on the perception limitation test video; a feedback behavior judging sub-module for judging whether the feedback behavior is the same as a preset behavior; and a test result determining sub-module for determining the perception limitation test result as passed if the feedback behavior is the same as the preset behavior, and as failed otherwise.
Optionally, in an embodiment of the application, the automatic driving perception limitation testing device further comprises: a scene video obtaining module for shooting in the automatic driving traffic scene with the simulated light-sensing camera to obtain an automatic driving simulation scene video; a scene information modification module for modifying the test scene information according to the simulation scene video to obtain modified scene information; a modified scene generation module for generating a modified traffic scene from the modified scene information through the simulation simulator; and a re-perception limitation testing module for performing the perception limitation test on the automatic driving vehicle using the modified traffic scene.
Optionally, in an embodiment of the application, the scene information modification module comprises: a real scene video acquisition sub-module for acquiring a real scene video of a real traffic driving scene; a comparison result obtaining sub-module for comparing the real scene video with the simulation scene video to obtain a comparison result; and a scene information modification sub-module for modifying the test scene information according to the comparison result.
The embodiment of the application also provides an electronic device, comprising: a processor and a memory storing machine-readable instructions executable by the processor, the machine-readable instructions, when executed by the processor, performing the method described above.
Embodiments of the present application also provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs a method as described above.
Additional features and advantages of embodiments of the application will be set forth in the description which follows, and in part will be apparent from the description, or may be learned by practice of embodiments of the application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application, and therefore should not be considered as limiting the scope, and other related drawings may be obtained according to these drawings without inventive effort to a person having ordinary skill in the art.
Fig. 1 is a schematic flow chart of an autopilot perception limitation testing method according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of generating a perception limitation test report according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of an autopilot perception limitation testing apparatus according to an embodiment of the present application;
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more clear, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it should be understood that the accompanying drawings in the embodiments of the present application are only for the purpose of illustration and description, and are not intended to limit the scope of protection of the embodiments of the present application. In addition, it should be understood that the schematic drawings are not drawn to scale. A flowchart, as used in embodiments of the present application, illustrates operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be implemented out of order and that steps without logical context may be performed in reverse order or concurrently. Moreover, one or more other operations may be added to or removed from the flowcharts within the scope of embodiments of the present application.
In addition, the described embodiments are only a portion of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Accordingly, the following detailed description of the embodiments of the present application, which is provided in the accompanying drawings, is not intended to limit the scope of the claimed embodiments of the present application, but is merely representative of selected embodiments of the present application.
It is understood that "first" and "second" in the embodiments of the present application are used to distinguish similar objects. It will be appreciated by those skilled in the art that the words "first," "second," etc. do not limit the number and order of execution, and that the words "first," "second," etc. do not necessarily differ. In the description of the embodiments of the present application, the term "and/or" is merely an association relationship describing an association object, which means that three relationships may exist, for example, a and/or B may mean: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship. The term "plurality" refers to two or more (including two), and similarly, "plurality" refers to two or more (including two).
Before introducing the method for testing the perception limitation of the automatic driving provided by the embodiment of the application, some concepts related in the embodiment of the application are introduced:
A perception limitation (Perceptual Limitation) test refers to testing the limitations that exist in the perception system of an autonomous vehicle. In practice, tests may be carried out for vision-based algorithms, whose perception systems are built mainly on data collected by cameras, and for point-cloud-based algorithms, whose perception systems are built mainly on points in 3D space (or distances to objects) measured by active sensors.
The perception system of an autonomous vehicle may be built on one of the following classes of perception algorithms: the first is mediated perception (Mediated Perception), in which the algorithm builds a detailed map of the vehicle's surroundings by analyzing distances to vehicles, pedestrians, trees, road markings, and so on; the second is behavior reflex perception (Behavior Reflex Perception), in which artificial intelligence techniques map sensor data (for example, images of the vehicle's environment) directly to driving operations; the third is direct perception (Direct Perception), which combines the first two classes of algorithms.
Perception limitation information refers to boundary information on the ability of the perception system of an automatic driving vehicle to perceive the outside world. For example, if the perception system relies on data collected by a camera, an exposure time that is too long (in strong ambient light the image becomes over-exposed and washed out, which hinders the perception algorithm from sensing objects) or too short (the image becomes too dark, which hinders recognition and detection) constitutes such a limitation. These limitations can prevent the perception system from perceiving all external information, for example under overly strong sunlight, when leaves block a traffic light, or when a pedestrian crosses a zebra crossing quickly.
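To make this concrete, the sketch below shows one possible way of representing such perception limitation information as structured data. It is an illustration only, written in Python with invented field names; the patent does not prescribe any data format.

```python
# Illustrative sketch only: one possible representation of perception
# limitation information. Field names are invented for illustration and
# do not come from the patent.
from dataclasses import dataclass

@dataclass
class CameraLimitation:
    exposure_time_ms: float       # too long -> over-exposed image, too short -> too dark
    sampling_frequency_hz: float  # too low a sampling frequency
    distortion: bool = False

@dataclass
class LidarLimitation:
    max_range_m: float            # insufficient range
    range_error_m: float          # large ranging error
    point_cloud_complete: bool = True

# Example: strong ambient light forcing a very short exposure time
strong_sunlight = CameraLimitation(exposure_time_ms=0.1, sampling_frequency_hz=10.0)
```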
It should be noted that the automatic driving perception limitation testing method provided in the embodiments of the present application may be executed by an electronic device, where the electronic device refers to a device terminal or a server capable of executing a computer program. The device terminal is, for example, a smart phone, a personal computer, a tablet computer, a personal digital assistant, or a mobile internet device. A server refers to a device that provides computing services over a network, such as an x86 server or a non-x86 server; non-x86 servers include mainframes, minicomputers, and UNIX servers.
The application scenarios to which the automatic driving perception limitation testing method applies are described below. These scenarios include, but are not limited to, using the method to perform perception limitation tests on automatic driving vehicles, such as safety testing, simulation scene testing, and interaction testing between automatic driving vehicles and traffic scenes. In practice, the method can also be used to improve or extend the functionality of a traffic simulator, for example traffic scene simulation software such as Carla and Deepdrive: the ability to perform perception limitation tests according to user-defined scene information can be added to such software so that it can generate dangerous traffic scenes related to the perception system.
Please refer to Fig. 1, which shows a schematic flow chart of the automatic driving perception limitation testing method provided by an embodiment of the present application. The main idea of the method is to generate an automatic driving traffic scene from user-defined test scene information through a simulation simulator and to perform a perception limitation test on the automatic driving vehicle using test information of that scene, so that the perception limitation test can be carried out according to scene information customized by the user, improving its flexibility. The method may include the following steps:
step S110: and acquiring the test scene information, and generating an automatic driving traffic scene according to the test scene information through a simulation simulator.
Test scene information refers to the parameter information used to construct the automatic driving traffic scene, where the automatic driving traffic scene may be a traffic scene under special conditions; the test scene information may include weather information, road condition information, and/or traffic participant information.
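For illustration, the test scene information described above could be grouped roughly as follows. This is a minimal sketch with assumed field names, not the patent's data format.

```python
# Minimal sketch of test scene information (weather, road condition and
# traffic participant information). All field names are assumptions made
# for illustration; the patent does not prescribe a data format.
from dataclasses import dataclass, field

@dataclass
class WeatherInfo:
    category: str = "sunny"        # e.g. "rain", "snow", "sunny"
    degree: str = "light"          # e.g. "light", "moderate", "heavy", "rainstorm"

@dataclass
class RoadConditionInfo:
    kind: str = "dry"              # e.g. "water_pit", "obstacle"
    area_m2: float = 0.0

@dataclass
class TrafficParticipantInfo:
    category: str = "pedestrian"   # e.g. "adult", "child", "wild_animal"
    density: str = "sparse"        # e.g. "dense", "sparse"
    path: str = "zebra_crossing"   # e.g. "zebra_crossing", "jaywalking"

@dataclass
class TestSceneInfo:
    weather: WeatherInfo = field(default_factory=WeatherInfo)
    road_condition: RoadConditionInfo = field(default_factory=RoadConditionInfo)
    participants: list = field(default_factory=list)  # list of TrafficParticipantInfo
```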
Step S120: performing a perception limitation test on the automatic driving vehicle by using test information of the automatic driving traffic scene to obtain a perception limitation test result, wherein the test information comprises: the route is preset or the limitation test video is perceived.
It is to be understood that the perception limitation test here includes, but is not limited to, generation of automatic driving traffic scenes, natural distribution sampling, and perception system capability boundary testing. An example of a perception system capability boundary test: the automatic driving perception limitation testing method generates automatic driving traffic scenes, such as a pedestrian walking on the road or a car waiting at a traffic light, and then determines whether the automatic driving vehicle makes the correct feedback behavior in those scenes, that is, whether the feedback behavior matches the preset behavior. If it does, the perception limitation test result can be determined as passed; if it does not, the perception limitation test result can be determined as failed.
In this implementation, the automatic driving traffic scene is generated by the simulation simulator from user-defined test scene information, and the automatic driving vehicle is tested with the test information of that scene, so that the perception limitation test can be carried out according to scene information customized by the user, improving the flexibility of the perception limitation test.
As an alternative embodiment of acquiring the test scene information in step S110, the test scene information may include weather information, road condition information, and/or traffic participant information, and may be acquired through an information input operation of a graphical user interface (Graphical User Interface, GUI). This embodiment may include:
step S111: and responding to the information input operation of the graphical user interface, and obtaining weather information, road condition information and/or traffic participant information.
Weather information refers to the weather category and weather degree of the automatic driving traffic scene. For example, the user can select weather categories such as rain, snow, or sunshine, and weather degrees such as light, moderate, or heavy, and can also select extreme weather such as a rainstorm or a blizzard.
Road condition information refers to the road surface condition information of the automatic driving traffic scene, including but not limited to water pit areas, obstacle areas, and the like; the user can also set degree information such as the area and type of these regions according to the specific road condition category.
Traffic participant information refers to information about the traffic participants taking part in the automatic driving traffic scene, where the traffic participants include, but are not limited to, wild animals, pedestrians, or other vehicles. The user may further select the specific class of wild animal (for example cattle, sheep, or horses), the class of pedestrian (for example children, adults, or elderly people), the degree of distribution (for example dense or sparse), and the path of travel (for example crossing at a zebra crossing or jaywalking).
The embodiment of step S111 described above is, for example: obtaining weather information, road condition information, and/or traffic participant information in response to an information input operation of a graphical user interface (Graphical User Interface, GUI). The information input operation here includes, but is not limited to, directly displaying default values of the weather information, road condition information, and/or traffic participant information on the GUI; a default value may also be a specific value previously chosen by the user, for example a default for the traffic participant information such as "children queuing at a kindergarten gate" or "elderly people dancing in the roadway". When the traffic participant information needs to be set to adults, a function or application programming interface (Application Programming Interface, API) such as SetPeople() may be called.
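A hedged sketch of such an information input operation is shown below. The set_people helper mirrors the SetPeople() example mentioned above; the dictionary layout, parameter names, and default values are assumptions made for illustration.

```python
# Hedged sketch of an information input operation. set_people mirrors the
# SetPeople() example above; the dictionary layout and default values are
# hypothetical.
def set_people(scene_info: dict, category: str, density: str, path: str) -> dict:
    """Set the traffic participant information of the test scene."""
    scene_info["traffic_participants"] = {
        "category": category,   # e.g. "adult", "child", "elderly"
        "density": density,     # e.g. "dense", "sparse"
        "path": path,           # e.g. "zebra_crossing", "jaywalking"
    }
    return scene_info

# Default values a GUI panel might display before the user changes them
scene_info = {
    "weather": {"category": "sunny", "degree": "strong_sunlight"},
    "road_condition": {"kind": "water_pit", "area_m2": 2.0},
}
scene_info = set_people(scene_info, category="adult", density="sparse", path="zebra_crossing")
```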
It can be understood that, after the traffic simulator is running, the user can set the test scene information in the UI setting panel of the traffic simulator, and the electronic device can obtain the test scene information set through this panel. If the traffic simulator runs on a terminal device, the electronic device can receive the test scene information sent by the terminal device and store it in a file system, a database, or a mobile storage device; it can likewise receive and store test scene information sent by other terminal devices.
As a first embodiment of generating the automatic driving traffic scene from the test scene information through the simulation simulator in step S110, the test scene information includes multiple pieces of perception limitation information, and perception limitation test points can be generated from this information. This embodiment specifically includes:
step S112: judging whether any two pieces of sensing limitation information in the plurality of pieces of sensing limitation information accord with superposition rules in a preset rule base.
A superposition rule specifies whether two pieces of perception limitation information can be superposed. Since there are countless kinds of perception limitation information in the real world, not all of them can be superposed; for example, strong sunlight and heavy rainfall conflict with each other and cannot occur together in the real world. The preset rule base is therefore used to keep mutually conflicting perception limitation information out of the same automatic driving traffic scene.
The embodiment of step S112 described above is, for example: using an executable program compiled or interpreted in a preset programming language to judge whether any two pieces of the perception limitation information conform to a superposition rule in the preset rule base. Usable programming languages include, for example, C, C++, Java, BASIC, JavaScript, LISP, Shell, Perl, Ruby, Python, and PHP, and the preset rule base may use a relational database such as MySQL, PostgreSQL, Oracle, or SQL Server.
Step S113: and if any two pieces of sensing limitation information in the plurality of pieces of sensing limitation information accord with the superposition rule in the preset rule base, generating sensing limitation test points in the automatic driving traffic scene according to superposition of the two pieces of sensing limitation information.
The embodiment of step S113 described above is, for example: assuming the first piece of perception limitation information is overly strong sunlight and the second is leaves blocking a traffic light, the two conform to a superposition rule in the preset rule base and can be superposed to generate a single perception limitation test point in the automatic driving traffic scene.
Step S114: if any two pieces of sensing limitation information in the plurality of pieces of sensing limitation information do not accord with the superposition rule in the preset rule base, respectively generating two sensing limitation test points in the automatic driving traffic scene according to the two pieces of sensing limitation information, and obtaining a plurality of pieces of sensing limitation test points.
The embodiment of step S114 described above is, for example: assuming the first piece of perception limitation information is overly strong sunlight and the second is heavy rainfall, the two do not conform to any superposition rule in the preset rule base, so two different perception limitation test points must be generated; the distance between them can be required to exceed a preset threshold (for example 200 km) so that the real automatic driving traffic scene is simulated as faithfully as possible.
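The following sketch illustrates the logic of steps S112 to S114 under the examples just given. The contents of the rule base and the helper names are assumptions; in practice the rule base could live in a relational database as described above.

```python
# Illustrative sketch of steps S112-S114: check whether two pieces of
# perception limitation information conform to a superposition rule and
# generate test points accordingly. The rule base below is a stand-in for
# the preset rule base, which in practice could be stored in a relational
# database such as MySQL or PostgreSQL.
from itertools import combinations

# Pairs of limitations that may be superposed at the same test point
SUPERPOSITION_RULES = {
    frozenset({"strong_sunlight", "leaves_blocking_traffic_light"}),
}
# Conflicting pairs such as {"strong_sunlight", "heavy_rainfall"} are absent,
# so they fall through to the "two separate test points" branch.

def generate_test_points(limitations: list) -> list:
    test_points = []
    for a, b in combinations(limitations, 2):
        if frozenset({a, b}) in SUPERPOSITION_RULES:
            test_points.append({a, b})   # one superposed test point
        else:
            test_points.append({a})      # two separate test points, placed
            test_points.append({b})      # far apart in the traffic scene
    return test_points

print(generate_test_points(["strong_sunlight", "leaves_blocking_traffic_light"]))
print(generate_test_points(["strong_sunlight", "heavy_rainfall"]))
```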
In this implementation, a single perception limitation test point is generated by superposition only when the perception limitation information is judged to conform to the superposition rules in the preset rule base, which makes the generation of perception limitation test points more reasonable and improves the rationality of the automatic driving perception limitation test.
As a second embodiment of generating the automatic driving traffic scene from the test scene information by the simulation simulator in the above-described step S110, specifically, for example: generating an automatic driving traffic scene of 'one old man running a red light and crossing a zebra crossing' at a front crossroad by a simulation simulator according to the test scene information; the simulation simulator can be simulator software continuously developed based on traffic scene simulation software such as Carla, deep drive and the like, can also be traffic scene simulation simulator software developed from the beginning, and can also be software developed by using a 3D engine technology.
In practice, a previously generated automatic driving traffic scene can be imported into the simulation simulator, or a user-defined traffic participant model can be imported into an automatic driving traffic scene that the simulation simulator has generated from preset test scene information. For example, the user can select, through the scene UI panel of the traffic simulator, an automatic driving traffic scene such as "an elderly person running a red light and crossing the zebra crossing at the intersection ahead" and then import a user-defined traffic participant model into it, thereby fully customizing the automatic driving traffic scene.
As a first embodiment of step S120, when the perception limitation test is performed on the automatic driving vehicle using test information of the automatic driving traffic scene, the test information may be a preset route, so the perception limitation test can be performed along a preset route in the automatic driving traffic scene. This embodiment may include:
step S121: the electronic equipment acquires a preset route in an automatic driving traffic scene, and the preset route passes through a plurality of perception limiting test points.
The embodiment of step S121 described above is, for example: the electronic device acquires the preset route in the automatic driving traffic scene using an executable program compiled or interpreted in a preset programming language; the preset route may be a manually pre-set route that passes through multiple perception limitation test points. Usable programming languages include, for example, C, C++, Java, BASIC, JavaScript, LISP, Shell, Perl, Ruby, Python, and PHP.
Step S122: the electronic equipment sends a preset route to the automatic driving vehicle so that the automatic driving vehicle can drive according to the preset route, and the perception limitation test is completed after the automatic driving vehicle passes through a plurality of perception limitation test points.
The embodiment of step S122 described above is, for example: the electronic device sends the preset route to the automatic driving vehicle via the Transmission Control Protocol (TCP) or the User Datagram Protocol (UDP), so that the vehicle drives in the automatic driving traffic scene along the preset route and, after passing the plurality of perception limitation test points, completes the perception limitation test and yields a perception limitation test result. The automatic driving traffic scene may be a scene running in the simulation simulator.
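A minimal sketch of step S122 is given below, assuming the automatic driving vehicle (or its simulation endpoint) listens on a known TCP address and accepts a JSON-encoded route; the host, port, and message format are assumptions for illustration only.

```python
# Minimal sketch of sending a preset route to the automatic driving vehicle
# over TCP. The host, port and JSON message format are assumptions made for
# illustration only.
import json
import socket

def send_preset_route(route: list, host: str = "127.0.0.1", port: int = 9000) -> None:
    # route: waypoints passing through the perception limitation test points
    payload = json.dumps({"type": "preset_route", "waypoints": route}).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload)

send_preset_route([
    {"x": 0.0,   "y": 0.0},    # start
    {"x": 120.5, "y": 30.2},   # perception limitation test point 1
    {"x": 410.0, "y": 55.8},   # perception limitation test point 2
])
```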
As a second embodiment of step S120, when the perception limitation test is performed on the automatic driving vehicle using test information of the automatic driving traffic scene, the test information may be a perception limitation test video, so the perception limitation test can be performed with a perception limitation test video shot in the automatic driving traffic scene. This embodiment may include:
step S123: shooting by using a simulated light sensing camera at a perception limitation test point in an automatic driving traffic scene to obtain a perception limitation test video.
The embodiment of step S123 described above is, for example: after the automatic driving traffic scene is generated or set, the user can operate an automatic driving vehicle equipped with a simulated light-sensing camera; the simulated light-sensing camera on the vehicle then shoots at the perception limitation test points in the automatic driving traffic scene to obtain the perception limitation test video. The scene UI panel in the traffic simulator can also be opened to modify the automatic driving traffic scene.
Step S124: and performing a perception limitation test on the automatic driving vehicle by using the perception limitation test video to obtain a perception limitation test result.
As an embodiment of the above step S124, when the perception limitation test video is used to perform the perception limitation test on the automatic driving vehicle, the perception limitation test result may be determined according to the feedback behavior of the automatic driving vehicle to the perception limitation test video, and the embodiment may include:
step S124a: and acquiring feedback behaviors of the automatic driving vehicle on the perception limitation test video.
Step S124b: and judging whether the feedback behavior of the automatic driving vehicle is the same as the preset behavior.
The embodiments of the above steps S124a to S124b are, for example: and acquiring feedback behaviors of the automatic driving vehicle on the perception limitation test video by using an executable program compiled or interpreted by a preset programming language, and judging whether the feedback behaviors of the automatic driving vehicle are the same as the preset behaviors or not by using the executable program. Among these, programming languages that can be used are, for example: C. c++, java, BASIC, javaScript, LISP, shell, perl, ruby, python, PHP, etc.
Step S124c: and if the feedback behavior of the automatic driving vehicle is the same as the preset behavior, determining the perception limitation test result as the passing of the test.
Step S124d: if the feedback behavior of the automatic driving vehicle is different from the preset behavior, the perception limitation test result is determined to be failed in the test.
The embodiments of the above steps S124c to S124d are, for example: and if the feedback behavior of the automatic driving vehicle to the perception limitation test video is the same as the preset behavior, determining the perception limitation test result as the passing of the test. If the feedback behavior of the automatic driving vehicle to the perception limitation test video is different from the preset behavior, the perception limitation test result is determined as that the test is not passed.
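Steps S124a to S124d amount to a simple comparison, as the sketch below illustrates; the behavior labels are hypothetical.

```python
# Sketch of steps S124a-S124d: compare the vehicle's feedback behavior on the
# perception limitation test video with the preset behavior. The behavior
# labels are hypothetical.
def evaluate_feedback(feedback_behavior: str, preset_behavior: str) -> str:
    """Return the perception limitation test result."""
    return "passed" if feedback_behavior == preset_behavior else "failed"

# e.g. the scene shows an elderly pedestrian crossing against a red light,
# so the preset behavior is to brake and yield
print(evaluate_feedback("brake_and_yield", preset_behavior="brake_and_yield"))  # passed
print(evaluate_feedback("keep_speed", preset_behavior="brake_and_yield"))       # failed
```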
As an alternative embodiment of the above-mentioned method for testing the perception limitation of the automatic driving vehicle, after obtaining the result of the perception limitation test, the operation parameters of the automatic driving vehicle may be modified, and the perception limitation test may be performed again, which may include:
step S130: if the sensing limitation test result is that the test is not passed, the operation parameters of the automatic driving vehicle are modified to obtain a modified automatic driving vehicle, and the automatic driving vehicle is subjected to the sensing limitation test by using the automatic driving traffic scene, so that the feedback behavior of the automatic driving traffic scene is identical to the preset behavior.
The embodiment of step S130 described above is, for example: if the perception limitation test result is failed, the operating parameters of the automatic driving vehicle are modified through the UI setting panel in the traffic simulator or through the API of an executable program to obtain a modified automatic driving vehicle, and the modified vehicle is subjected to the perception limitation test again using the automatic driving traffic scene, so that its feedback behavior to the automatic driving traffic scene becomes the same as the preset behavior.
As an optional implementation manner of the above-mentioned automatic driving perception limitation testing method, after the automatic driving traffic scene is generated according to the test scene information by the simulation simulator, the test scene information may be modified according to the simulation scene video, and this implementation manner may include:
step S140: and shooting in the automatic driving traffic scene by using the simulated light-sensitive camera to obtain a simulated scene video of the automatic driving.
The embodiment of step S140 described above is, for example: after the automatic driving traffic scene is generated or set, the user can operate an automatic driving vehicle equipped with a simulated light-sensing camera; the simulated light-sensing camera on the vehicle then shoots in the automatic driving traffic scene to obtain the automatic driving simulation scene video. The scene UI panel in the traffic simulator can also be opened to modify the automatic driving traffic scene.
Step S150: and modifying the test scene information according to the simulation scene video to obtain modified scene information.
The embodiment of step S150 described above is, for example: scene elements can be parsed from the simulation scene video, and then the test scene information is modified according to the parsed scene elements, so that modified scene information is obtained.
Step S160: and generating a modified traffic scene according to the modified scene information through a simulation simulator.
Step S170: and (5) carrying out perception limitation test on the automatic driving vehicle again by using the traffic scene modification to obtain a retest result.
The implementation principle and implementation of the steps S160 to S170 are similar to those of the steps S110 to S120, and thus, the implementation principle and implementation thereof will not be described herein, and reference may be made to the descriptions of the steps S110 to S120, if not clear.
As an optional implementation of step S150, when the test scene information is modified according to the simulation scene video, the modification may also be made according to the comparison result between a real scene video and the simulation scene video. This implementation may include:
step S151: and acquiring a real field video in a real traffic driving scene.
The real scene video in step S151 can be obtained in several ways. In the first way, a capture device such as a camera, video recorder, or color camera shoots the real traffic driving scene to obtain the real scene video and then sends it to the electronic device, which receives it. In the second way, the real scene video is obtained from a video server, for example from the server's file system, database, or mobile storage device. In the third way, the real scene video is obtained from, or accessed on, the Internet using software such as a browser or another application.
Step S152: and comparing the real field video with the simulation scene video to obtain a comparison result.
Step S153: and modifying the test scene information according to the comparison result.
The embodiments of steps S152 to S153 are, for example: simulation scene elements are parsed from the simulation scene video and real scene elements are parsed from the real scene video, and the two are compared to determine whether they are the same; the comparison result records whether the simulation scene elements match the real scene elements. If they differ, the test scene information is modified so that the simulation scene elements become the same as the real scene elements. It can be understood that, after the test scene information is modified according to the simulation scene video, modified scene information is obtained; a modified traffic scene can then be generated from the modified scene information through the simulation simulator, and the perception limitation test can be performed on the automatic driving vehicle again using the modified traffic scene to obtain a retest result.
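A sketch of steps S152 to S153 is shown below, assuming the scene elements have already been parsed from both videos into simple dictionaries; the element keys are assumptions, and the parsing itself (object detection and so on) is outside the sketch.

```python
# Sketch of steps S152-S153: compare scene elements parsed from the real scene
# video with those parsed from the simulation scene video, then modify the test
# scene information where they differ. The element keys are assumptions.
def compare_scene_elements(real: dict, simulated: dict) -> dict:
    """Return the elements whose real values differ from the simulated ones."""
    return {key: value for key, value in real.items() if simulated.get(key) != value}

def modify_test_scene_info(scene_info: dict, differences: dict) -> dict:
    # Overwrite the differing elements so the simulation matches the real scene
    scene_info.update(differences)
    return scene_info

real_elements = {"weather": "heavy_rain", "pedestrian_density": "dense"}
sim_elements  = {"weather": "light_rain", "pedestrian_density": "dense"}
scene_info    = {"weather": "light_rain", "pedestrian_density": "dense"}

differences = compare_scene_elements(real_elements, sim_elements)  # {"weather": "heavy_rain"}
scene_info  = modify_test_scene_info(scene_info, differences)
```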
Please refer to fig. 2, which is a schematic flow chart of generating a perception limitation test report according to an embodiment of the present application; as an alternative embodiment of the above-mentioned automatic driving perception limitation testing method, after obtaining the perception limitation testing result, a perception limitation testing report is generated, and the embodiment may include:
Step S210: the electronic equipment acquires the test scene information and generates an automatic driving traffic scene according to the test scene information through the simulation simulator.
Step S220: the electronic equipment performs a perception limitation test on the automatic driving vehicle by using test information of the automatic driving traffic scene to obtain a perception limitation test result, wherein the test information comprises: the route is preset or the limitation test video is perceived.
The implementation principle and implementation of the steps S210 to S220 are similar to those of the steps S110 to S120, and thus, the implementation principle and implementation thereof will not be described herein, and reference may be made to the descriptions of the steps S110 to S120, if not clear.
Step S230: and the electronic equipment generates a perception limitation test report according to the perception limitation test result.
The embodiment of step S230 described above is, for example: the electronic device generates the perception limitation test report from the perception limitation test result using an executable program compiled or interpreted in a preset programming language. Specifically, the score corresponding to a perception limitation test point can be added each time the automatic driving vehicle passes that test point; a total score for the perception limitation test is finally obtained, and the total score together with the score for each perception limitation test point is output in the perception limitation test report. Usable programming languages include, for example, C, C++, Java, BASIC, JavaScript, LISP, Shell, Perl, Ruby, Python, and PHP.
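The scoring described above could be accumulated as in the sketch below; the per-point score value and the report fields are assumptions for illustration.

```python
# Sketch of step S230: accumulate a score for each perception limitation test
# point the vehicle passes and emit a simple report. The score values and
# report fields are illustrative only.
def generate_test_report(results_per_point: dict, score_per_point: int = 10) -> dict:
    scores = {point: (score_per_point if passed else 0)
              for point, passed in results_per_point.items()}
    return {"per_point_scores": scores, "total_score": sum(scores.values())}

report = generate_test_report({
    "strong_sunlight+leaves_blocking_traffic_light": True,
    "heavy_rainfall": False,
})
print(report)  # {'per_point_scores': {...}, 'total_score': 10}
```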
Please refer to fig. 3, which illustrates a schematic structural diagram of an autopilot perception limitation testing apparatus according to an embodiment of the present application; the embodiment of the application provides an automatic driving perception limitation testing device 300, which comprises:
the traffic scene generation module 310 is configured to obtain the test scene information, and generate an automatic driving traffic scene according to the test scene information through the simulation simulator.
The test result obtaining module 320 is configured to perform a perception limitation test on the automatic driving vehicle using test information of the automatic driving traffic scene to obtain a perception limitation test result, where the test information includes a preset route or a perception limitation test video.
Optionally, in an embodiment of the present application, the test scenario information includes: multiple pieces of perception limitation information; a traffic scene generation module comprising:
a superposition rule judging sub-module, used for judging whether any two pieces of the perception limitation information conform to a superposition rule in a preset rule base;
and a perception limitation test point generation sub-module, used for superposing the two pieces of perception limitation information to generate a single perception limitation test point in the automatic driving traffic scene if any two pieces of the perception limitation information conform to a superposition rule in the preset rule base, and otherwise generating two separate perception limitation test points from the two pieces of perception limitation information, thereby obtaining a plurality of perception limitation test points.
Optionally, in an embodiment of the present application, the test information is a preset route; the test result obtaining module comprises:
a preset route acquisition sub-module, used for acquiring a preset route in the automatic driving traffic scene, the preset route passing through a plurality of perception limitation test points;
the preset route is then sent to the automatic driving vehicle so that the vehicle drives along the preset route and completes the perception limitation test after passing through the plurality of perception limitation test points.
Optionally, in an embodiment of the present application, the test information is a perception limitation test video; the test result obtaining module comprises:
a test video acquisition sub-module, used for shooting at the perception limitation test points in the automatic driving traffic scene with the simulated light-sensing camera to obtain a perception limitation test video;
and the perception limitation testing sub-module is used for carrying out perception limitation testing on the automatic driving vehicle by using the perception limitation testing video.
Optionally, in an embodiment of the present application, the sensing limitation testing sub-module includes:
and the feedback behavior acquisition sub-module is used for acquiring feedback behavior of the automatic driving vehicle on the perception limitation test video.
And the feedback behavior judging sub-module is used for judging whether the feedback behavior is the same as the preset behavior.
And a test result determining sub-module, used for determining the perception limitation test result as passed if the feedback behavior is the same as the preset behavior, and as failed if the feedback behavior is different from the preset behavior.
Optionally, in an embodiment of the present application, the autopilot perception limitation testing apparatus further includes:
and the scene video obtaining module is used for shooting in the automatic driving traffic scene by using the simulated light-sensitive camera to obtain the simulated scene video of the automatic driving.
The scene information modification module is used for modifying the test scene information according to the simulation scene video to obtain modified scene information.
And the modified scene generation module is used for generating a modified traffic scene according to the modified scene information through the simulation simulator.
And a re-perception limitation testing module, used for performing the perception limitation test on the automatic driving vehicle again using the modified traffic scene.
Optionally, in an embodiment of the present application, the scene information modification module includes:
the scene video acquisition sub-module is used for acquiring the real scene video in the real traffic driving scene.
And a comparison result obtaining sub-module, used for comparing the real scene video with the simulation scene video to obtain a comparison result.
And the scene information modification sub-module is used for modifying the test scene information according to the comparison result.
It should be understood that the apparatus corresponds to the above-described embodiment of the automatic driving perception limitation testing method, and is capable of executing the steps involved in the above-described embodiment of the method, and specific functions of the apparatus may be referred to the above description, and detailed descriptions thereof will be omitted herein as appropriate. The device includes at least one software functional module that can be stored in memory in the form of software or firmware (firmware) or cured in an Operating System (OS) of the device.
Please refer to fig. 4, which illustrates a schematic structural diagram of an electronic device provided in an embodiment of the present application. An electronic device 400 provided in an embodiment of the present application includes: a processor 410 and a memory 420, the memory 420 storing machine-readable instructions executable by the processor 410, and the machine-readable instructions, when executed by the processor 410, perform the method described above.
The present embodiment also provides a computer-readable storage medium 430, the computer-readable storage medium 430 having stored thereon a computer program which, when executed by the processor 410, performs the method described above. The computer-readable storage medium 430 may be implemented by any type of volatile or nonvolatile memory device, or a combination thereof, such as a static random access memory (SRAM), an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a read-only memory (ROM), a magnetic memory, a flash memory, a magnetic disk, or an optical disk.
It should be noted that, in this specification, the embodiments are described in a progressive manner, each embodiment focuses on its differences from the other embodiments, and identical or similar parts among the embodiments may be referred to each other. Since the apparatus embodiments are substantially similar to the method embodiments, they are described relatively simply, and reference may be made to the description of the method embodiments for relevant details.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may also be implemented in other manners. The apparatus embodiments described above are merely illustrative. For example, the flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatuses, methods, and computer program products according to various embodiments of the present application. In this regard, each block in the flowcharts or block diagrams may represent a module, a program segment, or a portion of code that comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur out of the order noted in the figures. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending on the functionality involved.
In addition, the functional modules in the embodiments of the present application may be integrated together to form an independent part, each module may exist separately, or two or more modules may be integrated to form an independent part. Furthermore, in the description of this specification, reference to the terms "one embodiment," "some embodiments," "an example," "a specific example," "some examples," and the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, schematic representations of these terms do not necessarily refer to the same embodiment or example. Moreover, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine the different embodiments or examples described in this specification, and the features of the different embodiments or examples, provided that they do not contradict each other.
The foregoing description is merely an optional implementation of the embodiments of the present application, but the protection scope of the embodiments of the present application is not limited thereto. Any person skilled in the art can readily conceive of changes or substitutions within the technical scope disclosed in the embodiments of the present application, and such changes or substitutions shall fall within the protection scope of the embodiments of the present application.

Claims (10)

1. An automatic driving perception limitation testing method is characterized by comprising the following steps:
acquiring test scene information, and generating an automatic driving traffic scene according to the test scene information through a simulation simulator;
and performing a perception limitation test on the automatic driving vehicle by using test information of the automatic driving traffic scene to obtain a perception limitation test result, wherein the test information comprises: a preset route or a perception limitation test video.
2. The method of claim 1, wherein the test scene information comprises a plurality of pieces of perception limitation information, and the generating an automatic driving traffic scene according to the test scene information through the simulation simulator comprises:
judging whether any two pieces of perception limitation information in the plurality of pieces of perception limitation information conform to a superposition rule in a preset rule base;
and if so, superimposing the two pieces of perception limitation information to generate one perception limitation test point in the automatic driving traffic scene, otherwise, generating two perception limitation test points in the automatic driving traffic scene according to the two pieces of perception limitation information respectively, so as to obtain a plurality of perception limitation test points.
3. The method of claim 1, wherein the test information is a preset route, and the performing a perception limitation test on the automatic driving vehicle by using the test information of the automatic driving traffic scene comprises:
acquiring a preset route in the automatic driving traffic scene, wherein the preset route passes through a plurality of perception limitation test points;
and sending the preset route to the automatic driving vehicle, so that the automatic driving vehicle drives according to the preset route and completes the perception limitation test after passing through the plurality of perception limitation test points.
4. The method of claim 1, wherein the test information is a perception limitation test video, and the performing a perception limitation test on the automatic driving vehicle by using the test information of the automatic driving traffic scene comprises:
shooting a perception limitation test point in the automatic driving traffic scene by using a simulated light-sensing camera to obtain a perception limitation test video;
and performing perception limitation testing on the automatic driving vehicle by using the perception limitation testing video.
5. The method of claim 4, wherein the performing a perception limitation test on the automatic driving vehicle by using the perception limitation test video comprises:
acquiring a feedback behavior of the automatic driving vehicle in response to the perception limitation test video;
judging whether the feedback behavior is the same as a preset behavior;
and if so, determining that the perception limitation test result is a test pass, otherwise, determining that the perception limitation test result is a test fail.
6. The method of any one of claims 1 to 5, further comprising, after the generating an automatic driving traffic scene according to the test scene information through the simulation simulator:
shooting in the automatic driving traffic scene by using a simulated light-sensing camera to obtain a simulation scene video of the automatic driving;
modifying the test scene information according to the simulation scene video to obtain modified scene information;
generating a modified traffic scene according to the modified scene information through the simulation simulator;
and performing a perception limitation test on the automatic driving vehicle by using the modified traffic scene.
7. The method of claim 6, wherein the modifying the test scene information according to the simulation scene video comprises:
acquiring a real scene video in a real traffic driving scene;
comparing the real scene video with the simulation scene video to obtain a comparison result;
and modifying the test scene information according to the comparison result.
8. An autopilot perception limitation testing apparatus, comprising:
the traffic scene generation module is used for acquiring the test scene information and generating an automatic driving traffic scene according to the test scene information through the simulation simulator;
the test result obtaining module is used for performing a perception limitation test on the automatic driving vehicle by using test information of the automatic driving traffic scene to obtain a perception limitation test result, wherein the test information comprises: a preset route or a perception limitation test video.
9. An electronic device, comprising: a processor and a memory, the memory storing machine-readable instructions executable by the processor, wherein the machine-readable instructions, when executed by the processor, perform the method of any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that it has stored thereon a computer program which, when executed by a processor, performs the method according to any of claims 1 to 7.
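As a purely illustrative aside on the superposition check recited in claim 2, the sketch below shows one possible shape of a preset rule base and of the decision to either superimpose two pieces of perception limitation information into a single test point or keep them as separate test points. The rule contents, the simple pairwise pairing strategy, and all identifiers are assumptions of this sketch, not part of the claimed method.

```python
from dataclasses import dataclass
from typing import FrozenSet, List, Tuple

@dataclass(frozen=True)
class PerceptionLimitation:
    kind: str          # e.g. "rain", "backlight", "fog"
    intensity: float   # 0.0 .. 1.0

# Preset rule base: which pairs of limitation kinds may be superimposed
# into a single test point. Purely illustrative contents.
SUPERPOSITION_RULES: FrozenSet[FrozenSet[str]] = frozenset({
    frozenset({"rain", "backlight"}),
    frozenset({"fog", "backlight"}),
})

def conforms_to_rule(a: PerceptionLimitation, b: PerceptionLimitation) -> bool:
    """Check whether two pieces of perception limitation information may be superimposed."""
    return frozenset({a.kind, b.kind}) in SUPERPOSITION_RULES

def build_test_points(limitations: List[PerceptionLimitation]) -> List[Tuple[PerceptionLimitation, ...]]:
    """Superimpose allowed pairs into one test point; otherwise keep separate test points."""
    points: List[Tuple[PerceptionLimitation, ...]] = []
    i = 0
    while i < len(limitations):
        if i + 1 < len(limitations) and conforms_to_rule(limitations[i], limitations[i + 1]):
            points.append((limitations[i], limitations[i + 1]))  # one superimposed test point
            i += 2
        else:
            points.append((limitations[i],))                      # one ordinary test point
            i += 1
    return points

if __name__ == "__main__":
    infos = [PerceptionLimitation("rain", 0.6),
             PerceptionLimitation("backlight", 0.8),
             PerceptionLimitation("fog", 0.4)]
    for point in build_test_points(infos):
        print("test point:", [f"{limit.kind}({limit.intensity})" for limit in point])
```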
CN202310342476.2A 2023-04-03 2023-04-03 Automatic driving perception limitation testing method and device, electronic equipment and storage medium Active CN116046417B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310342476.2A CN116046417B (en) 2023-04-03 2023-04-03 Automatic driving perception limitation testing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310342476.2A CN116046417B (en) 2023-04-03 2023-04-03 Automatic driving perception limitation testing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN116046417A true CN116046417A (en) 2023-05-02
CN116046417B CN116046417B (en) 2023-11-24

Family

ID=86120457

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310342476.2A Active CN116046417B (en) 2023-04-03 2023-04-03 Automatic driving perception limitation testing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116046417B (en)

Patent Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105956268A (en) * 2016-04-29 2016-09-21 百度在线网络技术(北京)有限公司 Construction method and device applied to test scene of pilotless automobile
EP3530521A1 (en) * 2018-02-22 2019-08-28 Continental Automotive GmbH Driver assistance method and apparatus
CN109446371A (en) * 2018-11-09 2019-03-08 苏州清研精准汽车科技有限公司 A kind of intelligent automobile emulation testing scene library generating method and test macro and method
CN112115761A (en) * 2020-05-12 2020-12-22 吉林大学 Countermeasure sample generation method for detecting vulnerability of visual perception system of automatic driving automobile
CN112631257A (en) * 2020-12-29 2021-04-09 清华大学苏州汽车研究院(相城) Expected function safety test evaluation method for misoperation of automatic driving vehicle
CN112711260A (en) * 2020-12-29 2021-04-27 清华大学苏州汽车研究院(相城) Expected function safety test evaluation method for error/omission recognition of automatic driving vehicle
WO2022141294A1 (en) * 2020-12-30 2022-07-07 深圳市大疆创新科技有限公司 Simulation test method and system, simulator, storage medium, and program product
CN113515105A (en) * 2021-04-09 2021-10-19 清华大学 Platform, method and storage medium for vehicle expected function safety simulation test
CN114021327A (en) * 2021-10-28 2022-02-08 同济大学 Quantitative evaluation method for performance of automatic driving automobile sensing system
CN114326667A (en) * 2021-12-23 2022-04-12 清华大学 Unmanned test method for fusion of on-line traffic flow simulation and real road environment
CN114564003A (en) * 2022-02-14 2022-05-31 东风汽车集团股份有限公司 Automatic driving expected function safety perception performance limitation modification method and vehicle
CN115016323A (en) * 2022-06-21 2022-09-06 际络科技(上海)有限公司 Automatic driving simulation test system and method
CN115017742A (en) * 2022-08-08 2022-09-06 西安深信科创信息技术有限公司 Automatic driving test scene generation method, device, equipment and storage medium
CN115688496A (en) * 2023-01-05 2023-02-03 西安深信科创信息技术有限公司 Method for obtaining automatic driving simulation test script and related device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
罗璎珞; 石娟: "Research on network security testing methods in automatic driving simulation systems" (自动驾驶仿真系统中网络安全测试方法研究), Motorcycle Technology (摩托车技术), no. 06 *

Also Published As

Publication number Publication date
CN116046417B (en) 2023-11-24

Similar Documents

Publication Publication Date Title
CN109086668B (en) Unmanned aerial vehicle remote sensing image road information extraction method based on multi-scale generation countermeasure network
CN111507210B (en) Traffic signal lamp identification method, system, computing equipment and intelligent vehicle
CN112868022A (en) Driving scenarios for autonomous vehicles
CN111507160B (en) Method and apparatus for integrating travel images acquired from vehicles performing cooperative driving
WO2022083259A1 (en) Substitute autonomous vehicle data
KR20200080402A (en) System and method for detecting abnormal situation
CN112101272A (en) Traffic light detection method and device, computer storage medium and road side equipment
CN113139446A (en) End-to-end automatic driving behavior decision method, system and terminal equipment
CN113192107A (en) Target identification tracking method and robot
CN111539268A (en) Road condition early warning method and device during vehicle running and electronic equipment
CN110795975A (en) Face false detection optimization method and device
CN115019060A (en) Target recognition method, and training method and device of target recognition model
CN113160272B (en) Target tracking method and device, electronic equipment and storage medium
CN116046417B (en) Automatic driving perception limitation testing method and device, electronic equipment and storage medium
CN112509321A (en) Unmanned aerial vehicle-based driving control method and system for urban complex traffic situation and readable storage medium
CN110969173A (en) Target classification method and device
CN111339834B (en) Method for identifying vehicle driving direction, computer device and storage medium
CN112528944A (en) Image identification method and device, electronic equipment and storage medium
CN116776288A (en) Optimization method and device of intelligent driving perception model and storage medium
CN112912892A (en) Automatic driving method and device and distance determining method and device
CN111160282A (en) Traffic light detection method based on binary Yolov3 network
CN114627443B (en) Target detection method, target detection device, storage medium, electronic equipment and vehicle
CN115909126A (en) Target detection method, apparatus and storage medium
CN116252813A (en) Vehicle driving track prediction method, device and storage medium
WO2022201276A1 (en) Reliability determination device and reliability determination method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 533, 5th Floor, Building A3A4, Phase I, Zhong'an Chuanggu Science and Technology Park, No. 900 Wangjiang West Road, High tech Zone, Hefei City, Anhui Province, 230031

Applicant after: Anhui Xinxin Science and Technology Innovation Information Technology Co.,Ltd.

Address before: 2nd Floor, Building B2, Yunhui Valley, No. 156, Tiangu 8th Road, Software New Town, Yuhua Street Office, High-tech Zone, Xi'an City, Shaanxi Province 710000

Applicant before: Xi'an Xinxin Information Technology Co.,Ltd.

CB02 Change of applicant information
GR01 Patent grant
GR01 Patent grant