CN116783461A - Vehicle testing method and system based on unmanned aerial vehicle

Info

Publication number
CN116783461A
CN116783461A (Application No. CN202180087983.5A)
Authority
CN
China
Prior art keywords
vehicle
model
unmanned aerial
aerial vehicle
target
Prior art date
Legal status
Pending
Application number
CN202180087983.5A
Other languages
Chinese (zh)
Inventor
张玉新
王璐瑶
赵福民
李鹏飞
杜昕一
苏泽文
Current Assignee
Jilin University
SZ DJI Technology Co Ltd
Original Assignee
Jilin University
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Jilin University and SZ DJI Technology Co Ltd
Publication of CN116783461A

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01M - TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00 - Testing of vehicles
    • G01M17/007 - Wheeled or endless-tracked vehicles

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

A vehicle testing method based on an unmanned aerial vehicle comprises: sending a preset test instruction to the unmanned aerial vehicle, the test instruction instructing the unmanned aerial vehicle to move in a traffic scene, wherein the unmanned aerial vehicle carries a model of a target object and the model of the target object moves with the unmanned aerial vehicle in the traffic scene (S110); acquiring a running state of a vehicle under test in the traffic scene (S120); and generating a test result of the vehicle under test according to the running state of the vehicle under test (S130). Because the model of the target object is carried by an unmanned aerial vehicle, it has better maneuverability and responsiveness.

Description

Vehicle testing method and system based on unmanned aerial vehicle
Technical Field
The application relates to the technical field of automatic driving, in particular to a vehicle testing method and system based on an unmanned aerial vehicle.
Background
Automatic driving means that the vehicle does not need to be operated by a driver: environmental information is collected automatically through sensors on the vehicle, and the vehicle drives itself according to that information. Test evaluation of the automatic driving function is an essential link in vehicle development, technical application and commercial popularization. Before automatic driving functions move from the laboratory to mass production, a large number of tests are required to prove the stability, robustness and reliability of each function and its performance. In a closed field test or a real road test, a driving scene can be created for the vehicle under test (an automatic driving vehicle) by simulating traffic participants with real vehicles or with dedicated devices, and the ability of the automatic driving function to cope with that scene is verified. However, under specific conditions, such as software or hardware failure or interference from the external environment, if the automatic driving function cannot control the vehicle successfully, the vehicle under test may collide with the real vehicle or the devices, especially at higher speeds. Such devices are generally limited in maneuverability and responsiveness, so the test scenes that can be created are relatively limited; for example, only lower-speed tests can be performed.
Disclosure of Invention
The application provides a vehicle testing method and system based on an unmanned aerial vehicle, and in particular an automatic driving test method, device and system, a movable target object and a storage medium, which can provide a device with better maneuverability and responsiveness to create driving scenes for a vehicle under test, so that the test scenes can be richer and safer.
In a first aspect, an embodiment of the present application provides an autopilot test method, where the test method includes:
transmitting a preset test instruction to an unmanned aerial vehicle, wherein the test instruction is used for indicating the unmanned aerial vehicle to move in a traffic scene, and the unmanned aerial vehicle carries a model of a target object which moves along with the unmanned aerial vehicle in the traffic scene;
acquiring a running state of a vehicle to be tested in the traffic scene, wherein the vehicle to be tested can autonomously move in the traffic scene based on observation data of a model of the target object and a preset automatic driving algorithm;
and generating a test result of the tested vehicle according to the running state of the tested vehicle.
In a second aspect, an embodiment of the present application provides an autopilot test method, for an unmanned aerial vehicle, where the unmanned aerial vehicle is capable of carrying a model of a target object, the test method including:
Receiving a test instruction;
and moving in a traffic scene according to the test instruction so that the model of the target object moves in the traffic scene along with the unmanned aerial vehicle.
In a third aspect, an embodiment of the present application provides an autopilot test method for a vehicle under test, where the test method includes:
autonomous movement is performed in a traffic scene based on observation data of a model of a target object in the traffic scene and a preset automatic driving algorithm, wherein the traffic scene comprises an unmanned aerial vehicle and the model of the target object carried by the unmanned aerial vehicle, and the model of the target object moves in the traffic scene along with the unmanned aerial vehicle;
acquiring the running state of the tested vehicle;
and sending the running state of the tested vehicle to an automatic driving testing device so that the automatic driving testing device generates a testing result of the tested vehicle according to the running state of the tested vehicle.
In a fourth aspect, embodiments of the present application provide an autopilot test apparatus including one or more processors, working individually or together, to perform the steps of the autopilot test method described above.
In a fifth aspect, an embodiment of the present application provides an unmanned aerial vehicle, including:
a flight platform for flying;
one or more processors, working individually or collectively, are used to perform the steps of the automated driving test method described previously.
In a sixth aspect, an embodiment of the present application provides a movable target, where the movable target includes:
the unmanned aerial vehicle;
the model of the target object can be connected to the unmanned aerial vehicle and moves in a traffic scene along with the unmanned aerial vehicle.
In a seventh aspect, an embodiment of the present application provides a vehicle, including:
a vehicle platform;
one or more processors, working individually or collectively, are used to perform the steps of the automated driving test method described previously.
In an eighth aspect, an embodiment of the present application provides an autopilot test system, including:
an unmanned aerial vehicle;
a model of a target object, which can be carried on the unmanned aerial vehicle and moves with the unmanned aerial vehicle in a traffic scene;
a vehicle under test, which can autonomously move in the traffic scene based on observation data of the model of the target object and a preset automatic driving algorithm; and
the automatic driving test device described above.
In a ninth aspect, embodiments of the present application provide a computer readable storage medium storing a computer program, which when executed by a processor causes the processor to implement the method described above.
The embodiments of the application provide a vehicle testing method and system based on an unmanned aerial vehicle, and in particular an automatic driving test method, device and system, a movable target object and a storage medium. The unmanned aerial vehicle carries a model of a target object, and the model of the target object moves with the unmanned aerial vehicle in a traffic scene to simulate traffic participants and create a test scene for the vehicle under test. The model of the target object carried by the unmanned aerial vehicle has better maneuverability and responsiveness, responds quickly and moves with high precision, and a collision with the vehicle under test travelling on the ground is less likely or causes lighter losses. Richer test scenes can therefore be created for the vehicle under test, for example higher-speed tests, while the testing remains safer.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure of embodiments of the application.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of an autopilot test method according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of an unmanned aerial vehicle carrying a model of a target object in an embodiment;
fig. 3 is a schematic structural diagram of an unmanned aerial vehicle carrying a model of a target object in another embodiment;
fig. 4 is a schematic structural diagram of an unmanned aerial vehicle carrying a model of a target object in yet another embodiment;
FIG. 5 is a schematic illustration of a vehicle under test moving autonomously based on observation of a model of a target object in one embodiment;
FIG. 6 is a schematic illustration of a vehicle under test moving autonomously based on observation of a model of a target object in another embodiment;
FIG. 7 is a schematic diagram of an unmanned aerial vehicle performing an avoidance task in one embodiment;
FIG. 8 is a schematic diagram of a model of an object in one embodiment;
fig. 9 is a schematic diagram of an unmanned aerial vehicle performing an avoidance task in another embodiment;
FIG. 10 is a schematic diagram of an unmanned aerial vehicle carrying an environmental condition simulation device in an embodiment;
FIG. 11 is a flow chart of an autopilot testing method according to another embodiment of the present application;
FIG. 12 is a flow chart of an autopilot testing method according to yet another embodiment of the present application;
FIG. 13 is a schematic block diagram of an autopilot test arrangement provided by an embodiment of the present application;
Fig. 14 is a schematic block diagram of an unmanned aerial vehicle provided by an embodiment of the present application;
FIG. 15 is a schematic block diagram of a movable target provided by an embodiment of the present application;
FIG. 16 is a schematic block diagram of a vehicle provided by an embodiment of the present application;
fig. 17 is a schematic block diagram of an autopilot test system provided by an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The flow diagrams depicted in the figures are merely illustrative and not necessarily all of the elements and operations/steps are included or performed in the order described. For example, some operations/steps may be further divided, combined, or partially combined, so that the order of actual execution may be changed according to actual situations.
Some embodiments of the present application are described in detail below with reference to the accompanying drawings. The following embodiments and features of the embodiments may be combined with each other without conflict.
With the development of science and technology and the application of artificial intelligence, automatic driving technology has developed rapidly and is widely applied. An automatic driving vehicle is equipped with advanced on-board sensors, controllers, actuators and other devices, and integrates modern communication, networking and artificial intelligence technologies. It realizes intelligent information exchange and sharing among vehicles, roads, people and the cloud, provides functions such as complex environment sensing, intelligent decision-making and cooperative control, and can achieve safe, efficient, comfortable and energy-saving driving, eventually producing a new generation of automobiles that operate in place of a human driver.
Based on the degree of driving automation, the existing SAE J3016 standard classifies driving automation into six levels, L0 to L5: No Automation (L0), Driver Assistance (L1), Partial Automation (L2), Conditional Automation (L3), High Automation (L4) and Full Automation (L5). As the level of driving automation increases, the degree of human participation in the driving task decreases. It is anticipated that more and more autonomous vehicles will travel on the road in the future, so autonomous vehicles and manually driven vehicles will share the road.
Test evaluation is an essential link in the development of automatic driving functions and in the technical application and commercial popularization of automatic driving vehicles. Unlike a conventional automobile, the object evaluated for an automatic driving vehicle is a strongly coupled human-vehicle-environment-task system. As the level of driving automation rises, the functions realized at each level increase, which makes testing and verifying these functions very challenging; the laws and regulations of some countries and regions already allow road testing of automatic driving vehicles in order to fully verify their safety. In addition to road testing, government agencies, scientific research institutions and related enterprises in various countries have carried out a great deal of research on the standard systems and assessment methods required for the testing and evaluation of automatic driving vehicles.
The automatic driving vehicle generally performs virtual simulation test, closed field test and real road test according to different requirements. The virtual simulation test can cover all the scenes which can be predicted within the range of the running design domain (Operational Design Domain, ODD), including corner scenes which are not easy to appear, and cover all the automatic driving functions within the range of the ODD; the closed field test can cover limit scenes in the ODD range, such as safety related accident scenes and dangerous scenes, cover typical functions of an automatic (auxiliary) driving system in a normal state, and verify simulation test results; the real road test can cover the roads of typical scene combinations in the ODD range, cover random scenes and random element combinations, and verify the capability of the automatic driving function to cope with the random scenes.
Virtual simulation tests cannot fully reproduce actual traffic scenes, and when a vehicle under test is subjected to a closed field test or a real road test, under specific conditions such as software or hardware faults or external interference, a failure of the automatic driving function to control the vehicle can lead to a collision. Using a real test vehicle driven by a real person carries high risk: if the vehicle under test cannot be controlled successfully, the vehicles may be destroyed and people injured or killed. At present, some test targets are realized by carrying a foam-panel vehicle body on a low-profile chassis; if a control error of the vehicle under test causes a collision with such a target, the foam-panel body is crushed and scattered and the vehicle under test rolls over the chassis, some of which can be compressed passively to reduce their height so that the vehicle under test passes over smoothly, reducing the damage in a dangerous scene. However, the test remains dangerous at higher vehicle speeds, and such a chassis is structurally complex, with retracting springs and similar structures, and has the disadvantages of high cost, high price and inconvenient operation.
Therefore, the embodiment of the application provides a vehicle testing method and system based on an unmanned aerial vehicle, and particularly provides an automatic driving testing method, device and system, a movable target object and a storage medium.
Referring to fig. 1, fig. 1 is a schematic flow chart of a vehicle testing method based on an unmanned aerial vehicle, that is, an autopilot test method, according to an embodiment of the present application. The autopilot test method can be applied to an autopilot test device and is used to instruct the unmanned aerial vehicle to move in a traffic scene so that a model of a target object carried by the unmanned aerial vehicle moves with the unmanned aerial vehicle in the traffic scene, thereby creating a test scene for the vehicle under test, and to generate a test result according to the running state of the vehicle under test in that scene. The vehicle under test may have any driving automation level, such as any of L0 to L5, and it can be understood that the vehicle under test may be a manned vehicle or an unmanned vehicle.
The unmanned aerial vehicle may be a rotary-wing unmanned aerial vehicle, such as a four-rotor unmanned aerial vehicle, a six-rotor unmanned aerial vehicle, an eight-rotor unmanned aerial vehicle, or a fixed-wing unmanned aerial vehicle.
The autopilot test device can be arranged on the unmanned aerial vehicle, on the vehicle under test, on road side equipment (Road Side Unit, RSU) or on a terminal device. Road side equipment is the core of an intelligent road system: it connects roadside facilities, transmits road information to vehicles and the cloud, and can provide a background communication function, an information broadcasting function and a high-precision positioning augmentation function. The terminal device may include at least one of a mobile phone, a tablet computer, a notebook computer, a desktop computer, a remote controller and the like. In some embodiments, the terminal device may further generate a corresponding test instruction according to an operation of the user and send the generated test instruction to the unmanned aerial vehicle, so that the unmanned aerial vehicle moves according to the test instruction, for example along a preset test trajectory.
As shown in fig. 1, the automatic driving test method according to the embodiment of the application includes steps S110 to S130.
S110, a preset test instruction is sent to the unmanned aerial vehicle, and the test instruction is used for indicating the unmanned aerial vehicle to move in a traffic scene, wherein the unmanned aerial vehicle carries a model of a target object, and the model of the target object moves along with the unmanned aerial vehicle in the traffic scene.
Referring to fig. 2 to fig. 4, fig. 2 to fig. 4 are schematic structural diagrams of a model 120 for carrying an object on a drone 110 according to various embodiments. In some embodiments, the drone 110 and the model 120 of the object together may be referred to as a moveable object 100, although the moveable object 100 is not limited to only including the drone 110 and the model 120 of the object, and may include a connector 130 for connecting the drone 110 and the model 120 of the object, for example.
Fig. 5 is a schematic diagram of autonomous movement of a vehicle under test 200 based on observation of a model of an object on a movable object 100 according to an embodiment.
The unmanned aerial vehicle in the movable target receives the test instruction and moves according to it. In some embodiments, the test instruction is used to instruct the unmanned aerial vehicle to move in the traffic scene along a preset test trajectory. Illustratively, the test instruction instructs the unmanned aerial vehicle to travel straight above a certain lane, which may be the same as or different from the lane of the vehicle under test and may be parallel to or intersect that lane; for example, the test instruction instructs the unmanned aerial vehicle to change from above one lane to above another lane; or the test instruction instructs the unmanned aerial vehicle to move from behind the vehicle under test, above the lane on its left or right side, to a position in front of it; but the instruction is not limited to these examples. The test instruction may also be used to indicate the speed of the unmanned aerial vehicle, for example its speed along the test trajectory, where the speeds at different positions on the trajectory may be the same or different.
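Purely as an illustration, such a test instruction could be represented as a message carrying a trajectory and per-leg speeds. This is a minimal sketch under assumed field names (waypoints, speeds_mps, scenario_id), not the message format disclosed in this application.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TestInstruction:
    """Hypothetical test-instruction payload sent to the unmanned aerial vehicle.

    The UAV is expected to fly the waypoints in order, holding the commanded
    speed on each leg, so that the carried target model traces the desired
    trajectory in the traffic scene (straight driving, lane change, cut-in, ...).
    """
    waypoints: List[Tuple[float, float, float]]  # (x, y, z) in a scene frame, metres
    speeds_mps: List[float]                      # commanded speed for each leg
    scenario_id: str = "cut_in"                  # e.g. "straight", "lane_change", "cut_in"

def make_cut_in_instruction(lane_y: float, target_lane_y: float) -> TestInstruction:
    """Build a simple cut-in trajectory: follow one lane, then merge into another."""
    return TestInstruction(
        waypoints=[(0.0, lane_y, 1.2), (40.0, lane_y, 1.2), (60.0, target_lane_y, 1.2)],
        speeds_mps=[10.0, 10.0, 8.0],
    )
```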
S120, acquiring a running state of a tested vehicle in the traffic scene, wherein the tested vehicle can autonomously move in the traffic scene based on the observation data of the model of the target object and a preset automatic driving algorithm.
The model of the target object in the movable target moves in the traffic scene with the unmanned aerial vehicle, and the vehicle under test can autonomously move in the traffic scene based on observation data of the model of the target object and a preset automatic driving algorithm. It should be noted that the observation data of the vehicle under test about the traffic scene is not limited to observation of the model of the target object; it may also include, for example, observation of other traffic participants, roadside buildings and facilities, and traffic markers, and the vehicle under test moves autonomously according to its observation of the traffic scene based on the preset automatic driving algorithm.
In some embodiments, the unmanned aerial vehicle includes an environmental sensor, and the acquiring the driving state of the vehicle under test in the traffic scene includes: and acquiring image information in the traffic scene through an environment sensor of the unmanned aerial vehicle, and determining the running state of the tested vehicle according to the image information. For example, the unmanned aerial vehicle-mounted environmental sensor comprises a radar and/or a vision sensor, wherein the radar comprises, for example, at least one of: laser radar, ultrasonic radar, millimeter wave radar, etc., the vision sensor includes, for example, at least one of: binocular cameras, wide angle cameras, infrared cameras, etc.
Illustratively, the unmanned aerial vehicle carries the environmental sensor through a gimbal, and the method further includes: controlling the gimbal to adjust its attitude according to the running state of the vehicle under test, so that the vehicle under test stays within the sensing range of the environmental sensor. Of course, this is not limiting; for example, parameters of the vision sensor, such as the focal length and field of view of the camera, may also be adjusted according to the running state of the vehicle under test.
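As a minimal sketch of keeping the vehicle under test in the sensor's field of view, the gimbal could be pointed from the relative position of the two platforms, assuming both positions are known in a common local frame. The function name and frame conventions below are assumptions for illustration only.

```python
import math

def gimbal_angles_to_target(uav_pos, vehicle_pos):
    """Return (yaw, pitch) in radians that point the gimbal camera at the vehicle.

    uav_pos / vehicle_pos: (x, y, z) in a shared local frame, z up, metres.
    """
    dx = vehicle_pos[0] - uav_pos[0]
    dy = vehicle_pos[1] - uav_pos[1]
    dz = vehicle_pos[2] - uav_pos[2]            # negative when the vehicle is below the UAV
    yaw = math.atan2(dy, dx)                    # rotation about the vertical axis
    pitch = math.atan2(dz, math.hypot(dx, dy))  # negative pitch tilts the camera downward
    return yaw, pitch
```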
In some embodiments, the vehicle under test comprises a first communication module, the unmanned aerial vehicle comprises a second communication module, and the unmanned aerial vehicle can be in communication connection with the vehicle under test through the first communication module and the second communication module; the obtaining the running state of the tested vehicle in the traffic scene comprises the following steps: and acquiring the running state of the tested vehicle sent by the tested vehicle through communication connection with the tested vehicle.
In some embodiments, as shown in fig. 5, the traffic scene is provided with a road side device 300, the road side device is used for acquiring a running state of the tested vehicle, the unmanned aerial vehicle comprises a second communication module, the road side device comprises a third communication module, and the unmanned aerial vehicle can be in communication connection with the road side device through the second communication module and the third communication module; the obtaining the running state of the tested vehicle in the traffic scene comprises the following steps: and acquiring the running state of the tested vehicle acquired by the road side equipment through communication connection with the road side equipment.
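The three acquisition paths described above (the unmanned aerial vehicle's own environmental sensor, a direct link to the vehicle under test, and the road side equipment) could be combined as in the sketch below. The interface names (`latest_state`, the link objects) are assumptions for illustration, not an API defined by this application.

```python
from typing import Optional

def acquire_running_state(vehicle_link, roadside_link, onboard_perception) -> Optional[dict]:
    """Hypothetical acquisition policy: prefer the vehicle's own report, then the
    road side unit, and fall back to the UAV's sensor-based estimate."""
    for source in (vehicle_link, roadside_link, onboard_perception):
        if source is None:
            continue
        state = source.latest_state()   # assumed to return None when no fresh data exists
        if state is not None:
            return state
    return None
```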
S130, generating a test result of the tested vehicle according to the running state of the tested vehicle.
In some embodiments, the test result of the tested vehicle includes a data set obtained by performing a preset process on the running state of the tested vehicle.
Illustratively, the running state of the vehicle under test is processed into data in a preset format, such as a table or a curve, to obtain the test result of the vehicle under test. This is not limiting; for example, a plurality of evaluation indexes may also be determined according to the running state of the vehicle under test.
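For example, the logged running states could be reduced to a tabular data set in a preset format, as in the hedged sketch below; the column names and sample structure are illustrative assumptions.

```python
import csv

def export_test_result(samples, path="test_result.csv"):
    """Write time-stamped running-state samples to a CSV table.

    Each sample is assumed to be a dict such as
    {"t": 12.3, "speed": 8.7, "accel": -1.2, "lane": 2, "ttc": 3.4}.
    """
    fields = ["t", "speed", "accel", "lane", "ttc"]
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for s in samples:
            writer.writerow({k: s.get(k) for k in fields})
```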
In some embodiments, the generating of the test result of the vehicle under test according to its running state includes: generating the test result according to the motion state of the unmanned aerial vehicle and the running state of the vehicle under test. Because the test scene is created by the motion of the model of the target object carried by the unmanned aerial vehicle, different motion states of the unmanned aerial vehicle are expected to cause corresponding adjustments of the motion state of the vehicle under test; a test result generated from both the motion state of the unmanned aerial vehicle and the running state of the vehicle under test can therefore more accurately reflect the environment-observation performance of the vehicle under test and the performance of its automatic driving algorithm.
Illustratively, the test result can be used to indicate at least one of: whether the vehicle under test and other traffic participants remain safe, whether the vehicle under test executes actions in time, and whether the vehicle under test executes driving actions accurately. For example, the test result can indicate whether the vehicle under test collides with the model of the target object in the test scene created by the unmanned aerial vehicle and the model of the target object, whether the deceleration of the vehicle under test exceeds a preset deceleration threshold, and whether a safe distance from the model of the target object is maintained when the vehicle under test drives around it, but the test result is not limited thereto.
In some embodiments, the performance of the vehicle under test, such as a driving-assist function or an automatic driving function, can be tested by observing how the vehicle under test responds to the model of the target object. Various real scenes, including scenes that can lead to dangerous events, can be simulated by the unmanned aerial vehicle carrying the model of the target object. For example, in an automatic emergency braking (Autonomous Emergency Braking, AEB) test, the model of the target object may simulate a pedestrian suddenly crossing from the roadside away from a crosswalk (or emerging from behind a truck stopped on the road), and the performance of the AEB system is evaluated from measured response parameters of the vehicle under test, such as its response time, time to collision, minimum braking distance and degree of collision damage. In an automatic driving test of a highway cut-in scene, the model of the target object may simulate a motor vehicle, and the critical safety boundary of the cut-in scene may be extracted from the time to collision (Time to Collision, TTC) between the vehicle under test and the model of the target object at the cut-in moment.
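As one possible post-processing step for the AEB example above, response parameters such as the response time and the minimum gap could be extracted from logged samples roughly as follows. The threshold value and field names are assumptions, not parameters given in this application.

```python
def extract_aeb_metrics(samples, brake_decel_threshold=-2.0):
    """Hypothetical extraction of AEB response parameters from logged samples.

    samples: list of dicts with keys "t" (s), "accel" (m/s^2) and
    "gap" (distance to the target model, m), ordered by time.
    Returns (response_time, min_gap), or (None, None) if no braking was observed.
    """
    t_appear = samples[0]["t"]   # the target model is assumed visible from the first sample
    t_brake = next((s["t"] for s in samples if s["accel"] <= brake_decel_threshold), None)
    if t_brake is None:
        return None, None
    min_gap = min(s["gap"] for s in samples)
    return t_brake - t_appear, min_gap
```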
Optionally, the model of the target object includes one or more of: a flexible board, a flexible membrane, an inflatable airbag. Illustratively, a plurality of flexible boards are combined to form the front, left, rear and right side plates of the model of the target object; illustratively, the model of the target object includes a frame and a flexible membrane supported by the frame to form a three-dimensional model; illustratively, an airbag is inflated to form a three-dimensional model of the target object; illustratively, part of the model of the target object is a frame assembled from flexible boards and another part is a flexible membrane supported by the frame. These examples are of course not limiting. Optionally, the flexible board, the flexible membrane and the side walls of the airbag may be metallic or non-metallic, and may have a single-layer or multi-layer structure.
Optionally, the maximum weight of the model of the target object is positively correlated with the area swept by the blades of the unmanned aerial vehicle, so that the model of the target object can move with the unmanned aerial vehicle in the traffic scene with high maneuverability, for example high acceleration. Optionally, the model of the target object may be carried by one or more unmanned aerial vehicles.
Optionally, the weight of the model of the target object is less than or equal to a preset value, for example in the range of 1-20 kg; for example, the weight of the model of the target object is 1 kg, 2 kg or 5 kg. Optionally, the model of the target object is made of a lightweight material, such as foam board, cardboard or porous board.
Optionally, the airbag can be inflated or deflated while mounted on the unmanned aerial vehicle. The airbag is small when deflated, which makes it convenient to store and transport, and the model of the target object formed when it is inflated is light, which helps it move with the unmanned aerial vehicle.
Optionally, the airbag is inflated in one or more of the following ways: by the airflow generated by the blades of the unmanned aerial vehicle during flight, by an inflation device on the airbag, or by an external inflation device connected to the airbag. The external inflation device connected to the airbag may or may not be arranged on the unmanned aerial vehicle. Inflating the airbag in multiple ways helps ensure that it holds enough air, so that it better represents a traffic participant.
Optionally, the model of the target object may represent a traffic participant, for example different types of vehicles, people of different sizes, different animals, a bicycle, a motorcycle or a balance scooter.
Illustratively, the model of the target object has an exterior shape similar to that of a traffic participant, and the traffic participant includes one or more of: a motor vehicle, a non-motor vehicle, a pedestrian, an animal and the like, but is not limited thereto; it may also be, for example, a plastic bag on a lane. The exterior shape of the target object model 120 shown in fig. 2 is similar to a motor vehicle, that of the model 120 shown in fig. 3 is similar to a non-motor vehicle, and that of the model 120 shown in fig. 4 is similar to a pedestrian, so the exterior characteristics of a traffic participant can be simulated by the model 120 of the target object, and the movement of that traffic participant can be simulated by the unmanned aerial vehicle 110 moving with the model 120. Models 120 representing different traffic participants may be used for different test scenarios.
Referring to fig. 6, the traffic scene contains a plurality of movable objects 100 simulating pedestrians, motor vehicles and non-motor vehicles, so that the vehicle under test 200 can be tested in a complex test scene. In other embodiments, a plurality of movable objects 100 in the traffic scene simulate motor vehicles: some of the movable objects 100 are located in the same lane as the vehicle under test 200, while the lanes of the remaining movable objects 100 are adjacent to or intersect the lane of the vehicle under test 200. The movable objects 100 can be used to test the vehicle under test in a single scene or in continuous scenes involving multiple traffic participants, enabling cooperative operation of a plurality of movable objects 100. For example, the plurality of movable objects 100 can follow preset paths and execute them precisely according to the test scene, or adjust the motion of their unmanned aerial vehicles in real time based on the running state of the vehicle under test (the vehicle whose ADAS/AD functions are being tested), and the movable objects 100 can also avoid obstacles in real time based on wireless communication or their own perception.
Optionally, the material of all or part of the outer surface of the model of the target object and/or of the unmanned aerial vehicle is selected according to the radar type of the vehicle under test, so that the radar detection characteristics of the model of the target object are similar to those of the corresponding traffic participant; the radar detection data obtained by the vehicle under test for the model of the target object can then better resemble the radar detection data of a real traffic participant. Optionally, the detection signal of the radar is any one of laser, millimeter wave and ultrasonic wave, but is not limited thereto.
Optionally, the outer surface is configured with one or more of the following characteristics for the radar's detection signal: diffuse reflection, refraction, absorption. Illustratively, the outer surface of the unmanned aerial vehicle can absorb the detection signal of the radar while the model of the target object reflects it, so that the radar of the vehicle under test or of other traffic participants can better detect the model of the target object, interference from signals reflected by the unmanned aerial vehicle is prevented, and the test scene better matches the real scene the vehicle under test would encounter.
Optionally, the unmanned aerial vehicle is rotatably connected with the model of the target object. In some embodiments, when the vehicle under test collides with the model of the target object, the model rotates relative to the unmanned aerial vehicle under the impact, which can reduce or avoid damage to the unmanned aerial vehicle and to the vehicle under test. In some embodiments, when the running state of the vehicle under test does not meet a preset running-state condition, for example when a collision with the model of the target object is likely, the unmanned aerial vehicle can be controlled to adjust the pose of the model of the target object, for example by rotating it relative to the unmanned aerial vehicle so that the bottom end of the model moves away from the ground and from the vehicle under test; the vehicle under test can then pass under the model of the target object, reducing or avoiding damage to the model, the unmanned aerial vehicle and the vehicle under test.
Optionally, referring to fig. 7, the unmanned aerial vehicle 110 is detachably connected to the model 120 of the target object. This makes it easy to connect the unmanned aerial vehicle 110 to the model 120 of another target object, or to connect the model 120 to another unmanned aerial vehicle 110, and facilitates storage and transportation. For example, when the model 120 of the target object is damaged, or when different test scenes need to be created, the model 120 mounted on the unmanned aerial vehicle 110 can be replaced conveniently.
Optionally, the connection between all or part of the model of the target object and the unmanned aerial vehicle can be automatically disconnected in a preset state.
In some embodiments, referring to fig. 7, the connection between the model 120 of the object and the unmanned aerial vehicle 110 can be automatically disconnected in a preset state, which is based on disconnection of at least two components of the connection 130 between the model 120 of the object and the unmanned aerial vehicle 110. Illustratively, the connector 130 connects the model 120 of the target with the drone 110 by at least one of: the clamping and buckling, interference fit, magnetic attraction and adhesive connection are not limited to the above.
In some embodiments, all or part of the model of the target object is easily detached from the unmanned aerial vehicle under the action of a tensile force, for example, when the tested vehicle collides with the model of the target object, the generated tensile force enables all or part of the model of the target object to be disconnected from the unmanned aerial vehicle, so that damage to the model of the target object and the unmanned aerial vehicle caused by collision is reduced or avoided, the unmanned aerial vehicle can be conveniently and quickly kept away from the tested vehicle, and damage to the tested vehicle caused by collision is reduced or avoided.
Optionally, the preset state includes: the tensile force between the model of the target object and the unmanned aerial vehicle is larger than a preset threshold value; and/or the tensile force direction between the model of the target object and the unmanned aerial vehicle is in a preset direction. By designing the connection structure between the model of the target object and the unmanned aerial vehicle, the reliable connection between the model of the target object and the unmanned aerial vehicle can be ensured, and the connection between all or part of the model of the target object and the unmanned aerial vehicle can be disconnected by the tensile force generated when the tested vehicle collides with the model of the target object. For example, when a component of a pull force between the model of the target and the unmanned aerial vehicle in a horizontal direction is greater than a specific value, the model of the target is disconnected from the unmanned aerial vehicle.
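A minimal sketch of such a release condition is given below, assuming the connector can read a pulling-force vector between the target model and the unmanned aerial vehicle. The threshold, the angle limit, and the particular combination of the two conditions are placeholders; the text allows the force-magnitude and force-direction conditions to be used separately or together.

```python
import math

def should_release(force_xyz, threshold_n=50.0, max_angle_from_horizontal_deg=30.0):
    """Decide whether the connector should release the target model.

    Example policy: release when the pulling force exceeds a threshold AND its
    direction is close to horizontal, as would happen if the vehicle under test
    dragged the model along the ground.
    """
    fx, fy, fz = force_xyz
    magnitude = math.sqrt(fx * fx + fy * fy + fz * fz)
    if magnitude < 1e-6:
        return False
    horizontal = math.hypot(fx, fy)
    angle_from_horizontal = math.degrees(math.atan2(abs(fz), horizontal))
    return magnitude > threshold_n and angle_from_horizontal < max_angle_from_horizontal_deg
```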
In some embodiments, the model of the target object is connected with the unmanned aerial vehicle by a connector, and the connector disconnects from the model of the target object in response to a trigger instruction when the preset state is entered. Illustratively, the unmanned aerial vehicle is provided with an electromagnetic latch that engages a corresponding structure on the model of the target object, and the latch can be controlled to release that engagement according to the trigger instruction, so that the model of the target object is disconnected from the unmanned aerial vehicle.
Optionally, the triggering instruction is generated by sensing a tensile force between the model of the target object and the unmanned aerial vehicle by a sensor, and the sensor is arranged on the model of the target object and/or the unmanned aerial vehicle. The sensor may include, for example, a hall sensor and/or a tension sensor, although not limited thereto. Illustratively, one end of the tension sensor is connected to the unmanned aerial vehicle, and the other end is connected to the model of the target object. The model of the target object is close to a preset position of the unmanned aerial vehicle under the action of tensile force, and the sensor at the preset position senses the approach of the model of the target object, so that the trigger instruction can be generated.
Optionally, as shown in fig. 8 and 9, the model 120 of the target object includes a plurality of components 121 that are detachably connected to one another, which facilitates assembly, storage and transportation of the model 120. Illustratively, at least two adjacent components 121 of the plurality of components 121 are connected by one or more of: snap-fit, interference fit, magnetic attraction and adhesive connection. For example, as shown in fig. 8 and 9, adjacent components 121 are detachably connected by a connector 122.
Optionally, as shown in fig. 9, at least two parts 121 of the plurality of parts 121 are connected to different unmanned aerial vehicles 110. Multiple drones 110 may provide the model 120 of the object with greater mobility, and higher speed, acceleration.
Alternatively, as shown in fig. 10, the unmanned aerial vehicle 110 may be equipped with an environmental condition simulation device 140, and the environmental condition simulation device 140 may simulate one or more of the following environments: rainfall, dense fog, dust, illumination.
In some embodiments, the method further comprises: controlling the operation of the environment working condition simulation device carried by the unmanned aerial vehicle, so that the environment working condition simulation device simulates one or more of the following environments: rainfall, dense fog, dust, illumination. As shown in fig. 10, the unmanned aerial vehicle 100 may receive a control instruction sent by the terminal device 400 through the communication device 111, and control the environmental condition simulation device 140 to simulate one or more of the following environments according to the control instruction: rainfall, dense fog, dust, illumination. Therefore, the performance of the tested vehicle under various environmental working conditions can be realized.
In some embodiments, the method further comprises: when the running state of the vehicle under test does not meet a preset running-state condition, controlling the unmanned aerial vehicle to execute a preset avoidance task so that the model of the target object avoids the vehicle under test. This prevents the vehicle under test from colliding with the model of the target object, prevents damage to the model from affecting the test progress, and reduces or avoids the time spent replacing, repairing and reassembling the model of the target object. The cost of the automatic driving test is reduced and more accurate tests can be performed; besides single-scene tests, in some embodiments the automatic driving function can be tested in continuously running scenes, supporting the test and verification of high-level automatic driving systems.
In some embodiments, the controlling the unmanned aerial vehicle to execute a preset avoidance task to cause the model of the target object to avoid the detected vehicle includes: and controlling the unmanned aerial vehicle to adjust the motion state in the horizontal direction and/or the vertical direction so as to enable the model of the target object to avoid the tested vehicle.
Illustratively, when the model of the target object moves in front of the vehicle under test in its lane and the vehicle under test is about to collide with the model, the unmanned aerial vehicle is controlled to accelerate so that the model of the target object moves away from the vehicle under test.
Illustratively, as shown in fig. 7, the unmanned aerial vehicle 110 moves upward in the vertical direction, lifting the model 120 of the target object off the ground, so that the vehicle under test can pass under the model 120, or at least the damage caused by a collision is reduced.
Illustratively, the acceleration of the unmanned aerial vehicle in the vertical direction when adjusting the motion state is determined according to the weight of the model of the target object. For example, the acceleration is lower when the weight of the model of the object is heavier, so that the drone can successfully bring the model of the object away from the vehicle under test.
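The relationship between the model's weight and the achievable vertical acceleration can be sketched with a simple thrust budget; the thrust and mass figures below are placeholders, not parameters from this application.

```python
G = 9.81  # gravitational acceleration, m/s^2

def max_climb_acceleration(max_thrust_n, uav_mass_kg, model_mass_kg):
    """Upper bound on the vertical acceleration of the UAV plus target model.

    A heavier target model leaves less spare thrust, so the climb acceleration
    commanded during the avoidance manoeuvre should be reduced accordingly.
    """
    total_mass = uav_mass_kg + model_mass_kg
    return max_thrust_n / total_mass - G

# Example with illustrative numbers: 120 N of thrust, 6 kg UAV, 2 kg model
# max_climb_acceleration(120.0, 6.0, 2.0) -> about 5.19 m/s^2
```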
In some embodiments, the controlling the unmanned aerial vehicle to execute a preset avoidance task to cause the model of the target object to avoid the detected vehicle includes: and controlling the unmanned aerial vehicle to adjust the pose of the model of the target object so as to enable the bottom end of the model of the target object to be far away from the ground. For example, by rotating the model of the target relative to the drone, the bottom end of the model of the target may be remote from the ground and the vehicle under test, for example, the vehicle under test may pass under the model of the target, reducing or avoiding damage to the model of the target and the drone from collisions, and reducing or avoiding damage to the vehicle under test from collisions.
In some embodiments, referring to fig. 8, the controlling the unmanned aerial vehicle to execute a preset avoidance task to cause the model of the target object to avoid the vehicle under test includes: and controlling the unmanned aerial vehicle to execute a preset avoidance task so as to separate a plurality of parts which are detachably connected with the model of the target object from each other. The components are separated from each other, so that the loss caused by collision with the tested vehicle can be reduced.
Illustratively, after the plurality of components are separated from each other, the number of components located on the travel path of the vehicle under test decreases. The tested vehicle runs along the running path to collide with fewer parts or passes through a gap after the parts are separated, so that damage caused by collision is reduced or avoided.
Illustratively, the plurality of components are separated from one another under the downwash wind field of the blades of the unmanned aerial vehicle when it performs the avoidance task. For example, the magnitude and/or direction of the power output of the unmanned aerial vehicle may be adjusted so that the downwash wind field of its blades is directed at the model of the target object, causing the detachably connected components of the model to separate from each other under the action of the downwash.
Illustratively, the plurality of components are separated from one another under the driving of a mechanical structure when the unmanned aerial vehicle performs the avoidance task. For example, the mechanical structure includes a driving device and a plurality of support arms connected to the driving device, at least two of the plurality of components are connected to different support arms, and the driving device can drive the support arms to act so as to separate the at least two components from each other. The drive means may comprise, for example, an electric drive means and/or an elastic drive means.
The components may also be separated from each other by the combined action of the wind field of the blades and the driving of the mechanical structure when the unmanned aerial vehicle performs the avoidance task; the farther the components separate, the wider the gap left for the vehicle under test to pass through.
In some embodiments, referring to fig. 9, the controlling of the unmanned aerial vehicles to perform a preset avoidance task to separate the detachably connected components of the model of the target object includes: controlling the plurality of unmanned aerial vehicles to move away from the vehicle under test in different directions, so that the unmanned aerial vehicles, each carrying part of the model of the target object, separate from each other and move away from the vehicle under test in different directions. As shown in fig. 9, the unmanned aerial vehicle on the left moves to the left with the left component and the unmanned aerial vehicle on the right moves to the right with the right component, away from the vehicle under test, leaving a gap for the vehicle under test to pass through.
In some embodiments, the connection between adjacent components breaks easily under tension, for example under the wind field, under the driving of the mechanical structure, or under the pulling force generated when different unmanned aerial vehicles move in different directions, so that the components can readily move away from the vehicle under test.
In other embodiments, the connection between adjacent components can be broken according to a control command. For example, when the unmanned aerial vehicle is controlled to execute the preset avoidance task, the connectors between adjacent components can be controlled to disconnect them. Once adjacent components are disconnected, they can move away from the vehicle under test more quickly under the wind field, under the driving of the mechanical structure, or under the pulling force generated when different unmanned aerial vehicles move in different directions.
Optionally, the method further comprises: and when the running state of the tested vehicle does not meet the preset running state condition, disconnecting all or part of the model of the target object from the unmanned aerial vehicle. For example, when the running state of the vehicle under test does not meet a preset running state condition, the connection of the model for connecting the target object with the unmanned aerial vehicle and the unmanned aerial vehicle is triggered. The unmanned aerial vehicle collision detection method can prevent the model of the target object from affecting the safety of the unmanned aerial vehicle when the model of the target object collides with the detected vehicle, and can reduce the damage of the collision to the detected vehicle. The cost of the model of the object is low or easy to assemble, so that the loss caused by collision and the time cost can be reduced.
In some embodiments, whether the running state of the vehicle under test satisfies the running-state condition is determined from the running state obtained in a single sampling period or from the trend of the running states obtained over a plurality of sampling periods. The single sampling period may be the most recent one, or one selected from several: for example, the running state with the highest confidence among those sampled over a plurality of periods may be taken as the running state of the vehicle under test. As another example, from the trend of the running state over a plurality of sampling periods it may be determined whether the automatic driving function of the vehicle under test has failed or become unreliable.
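One way to implement this sampling-period logic is sketched below: either take the highest-confidence sample from a window, or look at the trend across the window. The structure of a sample ("confidence", "gap") is an assumption for illustration.

```python
def pick_running_state(window):
    """Return the highest-confidence running state from a window of samples.

    Each sample is assumed to carry a "confidence" score produced by perception.
    """
    return max(window, key=lambda s: s.get("confidence", 0.0))

def distance_is_shrinking(window):
    """Trend check across several sampling periods: is the gap to the target
    model monotonically decreasing (a possible sign of a failing autopilot)?"""
    gaps = [s["gap"] for s in window]
    return all(b < a for a, b in zip(gaps, gaps[1:]))
```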
In some embodiments, the running state of the vehicle under test includes at least one of: motion parameters of the vehicle under test, observation information of the vehicle under test about the traffic scene, control information of the vehicle under test, and the relative motion relationship between the vehicle under test and other objects in the traffic scene, but is not limited thereto. When one or more running states of the vehicle under test do not meet the preset running-state condition, the unmanned aerial vehicle is controlled to adjust its motion state so that the model of the target object moves away from the vehicle under test.
Exemplary, the acquiring the driving state of the vehicle under test in the traffic scene includes: and acquiring the observation information of the traffic scene acquired by the environment sensor carried by the tested vehicle and/or acquiring the control information of the tested vehicle determined by the automatic driving module of the tested vehicle.
For example, the motion parameters of the vehicle under test include at least one of: speed, acceleration, position, for example, the lane in which the position is located.
For example, the vehicle under test may obtain the observation information of the traffic scene through one or more of environmental sensors, such as a camera, millimeter wave radar, and laser radar, where the observation information includes at least one of the following: target tracking information, lane line identification information, drivable region information, traffic flow information.
For example, the vehicle under test may determine control information of the vehicle under test by an autopilot module, the control information including, for example, at least one of: track planning information, behavior interpretation information, diagnosis information, braking signals, steering signals, acceleration signals and man-machine interaction warning information.
For example, the relative motion relationship between the vehicle under test and the rest of the objects in the traffic scene includes at least one of the following: the relative positional relationship, the relative velocity, and the relative acceleration are not limited to this, of course. It should be noted that the relative motion relationship may be acquired by the vehicle under test, or acquired by the unmanned aerial vehicle, or acquired by the roadside device, or may be determined according to a running state of the vehicle under test and a flight state of the unmanned aerial vehicle, for example, the relative motion relationship between the vehicle under test and the unmanned aerial vehicle is determined according to a position of the vehicle under test and a position of the unmanned aerial vehicle.
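The running-state items enumerated above could be grouped into a single record, for example as below; the field names are illustrative and not taken from this application.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class RunningState:
    """Hypothetical running-state record for the vehicle under test."""
    speed_mps: float                       # motion parameters
    accel_mps2: float
    lane_id: int
    perceived_targets: List[dict]          # observation information about the traffic scene
    planned_trajectory: Optional[List]     # control information from the autopilot module
    brake_signal: bool
    relative_distance_m: Optional[float]   # relative motion w.r.t. the target model
    relative_speed_mps: Optional[float]
```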
In some embodiments, whether the driving state of the vehicle under test satisfies the driving state condition is determined according to the relative positional relationship between the vehicle under test and the model of the target object. The relative positional relationship between the vehicle under test and the model of the target object may be determined from the relative positional relationship between the vehicle under test and the unmanned aerial vehicle; for example, the relative positional relationship between the vehicle under test and the unmanned aerial vehicle may be taken directly as the relative positional relationship between the vehicle under test and the model of the target object.
For example, when the relative distance between the vehicle under test and the model of the target object is less than or equal to a preset value, for example 1 meter, it is determined that the driving state of the vehicle under test does not satisfy the preset driving state condition, and the unmanned aerial vehicle may be controlled to execute a preset avoidance task so that the model of the target object avoids the vehicle under test. In this way, the unmanned aerial vehicle can be controlled to execute the preset avoidance task in time once the distance between the vehicle under test and the model of the target object falls below the safe distance, so that the model of the target object moves away from the vehicle under test and a collision is prevented.
Alternatively, when the relative distance between the vehicle under test and the model of the target object is less than or equal to the preset value and the two are moving in parallel, or are about to move in parallel, it is determined that the driving state of the vehicle under test does not satisfy the preset driving state condition, and the unmanned aerial vehicle may be controlled to execute the preset avoidance task so that the model of the target object avoids the vehicle under test. Collisions can thus be prevented more accurately. A minimal sketch of this distance-based trigger is given below.
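The sketch assumes that the relative distance is derived from the positions of the vehicle under test and the unmanned aerial vehicle as described above; the 1-meter threshold and all function names are illustrative only.

```python
import math

SAFE_DISTANCE_M = 1.0  # example preset value from the description

def relative_distance(vehicle_xy, drone_xy):
    """Horizontal distance between the vehicle under test and the UAV,
    used here as the distance to the carried model of the target object."""
    return math.hypot(vehicle_xy[0] - drone_xy[0], vehicle_xy[1] - drone_xy[1])

def should_trigger_avoidance(vehicle_xy, drone_xy,
                             require_parallel=False, moving_in_parallel=False):
    """Return True when the preset driving-state condition is violated
    and the UAV should execute the avoidance task."""
    too_close = relative_distance(vehicle_xy, drone_xy) <= SAFE_DISTANCE_M
    if require_parallel:
        # Stricter variant: also require (imminent) parallel motion.
        return too_close and moving_in_parallel
    return too_close
```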
In other embodiments, whether the driving state of the vehicle under test satisfies the driving state condition is determined according to a preset type of evaluation index, the evaluation index being determined from the driving state of the vehicle under test. For example, the value of the preset evaluation index may be determined from one or more driving states of the vehicle under test, and whether the driving state satisfies the driving state condition is determined from that value; for example, when the value of the evaluation index exceeds a preset range, it is determined that the driving state of the vehicle under test does not satisfy the preset driving state condition, and the unmanned aerial vehicle may be controlled to execute a preset avoidance task so that the model of the target object avoids the vehicle under test.
Illustratively, the evaluation index is determined from the relative positional relationship and the relative speed between the vehicle under test and the model of the target object. For example, the preset type of evaluation index includes a time to collision and/or a critical collision-avoidance deceleration determined from the relative positional relationship and the relative speed between the vehicle under test and the model of the target object.
The time to collision (Time to Collision, TTC) may be determined as the time from the current relative positional relationship until a collision occurs, assuming that the vehicle under test and the model of the target object maintain their relative speed; for example, it may be determined from the ratio of the relative distance between the vehicle under test and the model of the target object to their relative speed. If, at the current moment, the speed of the following vehicle is greater than that of the leading vehicle and both keep their current speed and trajectory unchanged (that is, neither the driver nor the automatic driving system takes any evasive action), a collision will occur at some moment; the period from the current moment until that collision is the time to collision. The smaller the time to collision, the greater the probability of an accident. For example, when the time to collision is less than or equal to a preset time threshold, it is determined that the driving state of the vehicle under test does not satisfy the preset driving state condition.
The critical collision-avoidance deceleration (Deceleration Rate to Avoid a Crash, DRAC) is the deceleration that the vehicle under test, or the model of the target object, would just need in order to avoid a collision; for example, it is determined from the ratio of the square of the relative speed between the vehicle under test and the model of the target object to their relative distance. For two closely spaced vehicles in which the following vehicle is faster than the leading vehicle, this quantity is called the critical collision deceleration, or critical collision-avoidance deceleration; when the critical deceleration of the following vehicle exceeds the maximum available deceleration rate (MADR) that the vehicle or its occupants can withstand, the probability of a collision is high. For example, when the critical collision-avoidance deceleration is greater than or equal to a preset deceleration threshold, it is determined that the driving state of the vehicle under test does not satisfy the preset driving state condition.
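Both surrogate indicators follow directly from the relative distance and the relative (closing) speed. The sketch below implements the ratios exactly as stated above (TTC as distance over closing speed, DRAC as squared closing speed over distance); the threshold values are placeholders, not values taken from the application.

```python
def time_to_collision(rel_distance, rel_speed):
    """TTC: time until collision if the current closing speed is maintained.
    rel_speed is the closing speed (following speed minus leading speed);
    an infinite TTC is returned when the gap is not closing."""
    if rel_speed <= 0.0:
        return float("inf")
    return rel_distance / rel_speed

def critical_collision_deceleration(rel_distance, rel_speed):
    """DRAC as described: ratio of the squared closing speed to the gap."""
    if rel_speed <= 0.0 or rel_distance <= 0.0:
        return 0.0
    return rel_speed ** 2 / rel_distance

def violates_condition(rel_distance, rel_speed,
                       ttc_threshold_s=2.0, drac_threshold_mps2=4.0):
    """The driving-state condition is violated when the TTC is too small
    or the required deceleration is too large (thresholds are examples)."""
    ttc = time_to_collision(rel_distance, rel_speed)
    drac = critical_collision_deceleration(rel_distance, rel_speed)
    return ttc <= ttc_threshold_s or drac >= drac_threshold_mps2
```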
Illustratively, the preset type of evaluation index includes the probability that the vehicle under test collides with the model of the target object if it maintains its current driving state. For example, the collision probability may be determined from the driving state of the vehicle under test, or from the driving state of the vehicle under test together with the flight state of the unmanned aerial vehicle, for example using a machine learning model. For example, if the probability that the vehicle under test collides with the model of the target object while maintaining its driving state is greater than or equal to a probability threshold, it is determined that the driving state of the vehicle under test does not satisfy the driving state condition.
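The description leaves open how the collision probability is obtained; it may, for example, come from a machine learning model fed with the driving state and the flight state. The sketch below therefore only illustrates the thresholding step and uses a hand-made logistic stand-in for the probability estimator; the curve and the threshold are assumptions.

```python
import math

def collision_probability(ttc_s):
    """Placeholder estimator: maps a small time to collision to a high probability.
    In practice this could be replaced by a trained machine learning model that
    consumes the driving state of the vehicle under test and the UAV flight state."""
    return 1.0 / (1.0 + math.exp(ttc_s - 2.0))  # illustrative logistic curve

def violates_condition_by_probability(ttc_s, probability_threshold=0.8):
    """Condition not satisfied when the estimated collision probability is too high."""
    return collision_probability(ttc_s) >= probability_threshold
```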
In some other embodiments, whether the driving state of the vehicle under test satisfies the driving state condition is determined from the acquired driving state of the vehicle under test and an expected driving state of the vehicle under test, where the expected driving state is determined from the test instruction.
The unmanned aerial vehicle moves in the traffic scene according to the test instruction and thereby creates a test scene for the vehicle under test; if the vehicle under test drives in the expected driving state corresponding to the test instruction, it will at least not collide with the model of the target object. It can be understood that the expected driving state is a driving state in which the vehicle under test can avoid colliding with the model of the target object while the unmanned aerial vehicle moves in the traffic scene according to the test instruction.
In some embodiments, the expected driving state may be determined, for example, according to a setting operation of a user. Of course, this is not limiting; it may also be determined, for example, through statistical analysis of a large number of traffic scene images.
For example, if the driving state of the vehicle under test obtained in step S120 does not match the expected driving state of the vehicle under test, it may be determined that the driving state of the vehicle under test does not satisfy the preset driving state condition and that there is a risk of collision, and the unmanned aerial vehicle may be controlled to execute a preset avoidance task so that the model of the target object avoids the vehicle under test.
For example, if, according to the test instruction, the unmanned aerial vehicle decelerates ahead of the vehicle under test in its lane, the expected driving state of the vehicle under test includes a negative longitudinal acceleration and/or a non-zero lateral acceleration. When the unmanned aerial vehicle decelerates ahead of the vehicle under test, a collision with the model of the target object can be avoided if the vehicle under test decelerates and/or changes lanes; if, however, the vehicle under test neither decelerates nor changes lanes but keeps its speed or accelerates, there is a risk of collision, it is determined that the driving state of the vehicle under test does not satisfy the preset driving state condition, and the unmanned aerial vehicle may be controlled to execute the preset avoidance task so that the model of the target object avoids the vehicle under test. Optionally, a range of expected acceleration may be determined according to the test instruction; if the actual acceleration of the vehicle under test falls outside the expected range, it may also be determined that the driving state of the vehicle under test does not satisfy the preset driving state condition.
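One possible implementation of this expected-state check is sketched below: each test instruction maps to an expected sign or magnitude of the longitudinal and lateral acceleration, and the measured acceleration of the vehicle under test is compared against it. The scenario names and numeric ranges are assumptions for illustration only.

```python
# Expected acceleration response per test scenario (illustrative values).
EXPECTED_RESPONSE = {
    # UAV decelerates ahead in the same lane, or cuts in from the front-left/right:
    # the vehicle under test is expected to brake and/or change lanes.
    "decelerate_ahead": {"long_accel_max": -0.5, "lat_accel_min": 0.3},
    "cut_in":           {"long_accel_max": -0.5, "lat_accel_min": 0.3},
}

def meets_expected_state(scenario, long_accel, lat_accel):
    """True if the vehicle shows either the expected braking response or the
    expected lane-change response; otherwise the driving-state condition is not met."""
    expected = EXPECTED_RESPONSE[scenario]
    braking = long_accel <= expected["long_accel_max"]          # negative longitudinal acceleration
    lane_change = abs(lat_accel) >= expected["lat_accel_min"]   # non-zero lateral acceleration
    return braking or lane_change
```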
For example, if the unmanned aerial vehicle decelerates ahead of the vehicle under test according to the test instruction, the expected driving state of the vehicle under test includes: the prompt module carried by the vehicle under test outputs first prompt information. When the unmanned aerial vehicle decelerates ahead of the vehicle under test and the prompt module of the vehicle under test outputs the first prompt information, the driver can be reminded to control the vehicle to decelerate and/or change lanes, so that a collision with the model of the target object is avoided. If the prompt module of the vehicle under test does not output the first prompt information, there is a risk of collision, and it is determined that the driving state of the vehicle under test does not satisfy the preset driving state condition. It can be understood that, in some embodiments, the driving state of the vehicle under test may further include the prompt information output by the vehicle under test.
For example, if, according to the test instruction, the unmanned aerial vehicle enters the current lane of the vehicle under test from its front-left or front-right, the expected driving state of the vehicle under test includes a negative longitudinal acceleration and/or a non-zero lateral acceleration. When the unmanned aerial vehicle pulls in close to the vehicle under test from the front-left or front-right, a collision with the model of the target object can be avoided if the vehicle under test decelerates and/or changes lanes; if, however, the vehicle under test neither decelerates nor changes lanes but keeps its speed or accelerates, there is a risk of collision, it is determined that the driving state of the vehicle under test does not satisfy the preset driving state condition, and the unmanned aerial vehicle may be controlled to execute the preset avoidance task so that the model of the target object avoids the vehicle under test. Optionally, a range of expected acceleration may be determined according to the test instruction; if the actual acceleration of the vehicle under test falls outside the expected range, it may also be determined that the driving state of the vehicle under test does not satisfy the preset driving state condition.
In some embodiments, the subsequent movement tendency of the vehicle under test may be determined from its observation information of the traffic scene, and whether there is a risk of collision, and thus whether the driving state of the vehicle under test satisfies the preset driving state condition, is determined from the flight state of the unmanned aerial vehicle corresponding to the test instruction together with this movement tendency.
In some embodiments, an expected value of the observation information of the vehicle under test for the traffic scene may be determined from the test instruction. When the actual observed value of the vehicle under test is the same as, or close to, the expected value, a collision with the model of the target object can be avoided; if the actual observed value differs from the expected value, there is a probability of collision, and it may be determined that the driving state of the vehicle under test does not satisfy the preset driving state condition.
For example, if the unmanned aerial vehicle decelerates ahead of the vehicle under test in its lane according to the test instruction, the expected driving state of the vehicle under test includes at least one of the following: the acceleration of the model of the target object observed by the vehicle under test is negative, and the relative distance between the model of the target object observed by the vehicle under test and the vehicle under test is decreasing.
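A minimal sketch of the corresponding observation check for the case in which the unmanned aerial vehicle decelerates ahead of the vehicle under test; the two expected values are those named above (negative observed acceleration of the model, decreasing relative distance), and the argument names are assumptions.

```python
def observation_matches_expectation(observed_target_accel,
                                    prev_relative_distance,
                                    curr_relative_distance):
    """The vehicle under test is expected to observe a decelerating target
    whose relative distance is decreasing; at least one of the two suffices."""
    target_decelerating = observed_target_accel < 0.0
    distance_decreasing = curr_relative_distance < prev_relative_distance
    return target_decelerating or distance_decreasing
```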
In some embodiments, the subsequent movement tendency of the vehicle under test may be determined from the control information output by its automatic driving module, and whether there is a risk of collision, and thus whether the driving state of the vehicle under test satisfies the preset driving state condition, is determined from the motion state of the model of the target object corresponding to the test instruction together with this movement tendency.
In some embodiments, an expected value of the control information of the vehicle under test may be determined from the test instruction. When the actual control information of the vehicle under test is the same as, or close to, the expected value, a collision with the model of the target object can be avoided; if the actual control information differs from the expected value, there is a probability of collision, and it may be determined that the driving state of the vehicle under test does not satisfy the preset driving state condition.
For example, if the unmanned aerial vehicle decelerates ahead of the vehicle under test in its lane according to the test instruction, the expected driving state of the vehicle under test includes control information for controlling the vehicle under test to decelerate and/or change lanes.
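The check on the planner output can be a simple inspection of the control information for a deceleration or lane-change intention; the field names below are assumptions about what the automatic driving module might report, not an interface defined in the application.

```python
def control_matches_expectation(control_info):
    """control_info is a dict such as
    {'brake_cmd': 0.4, 'throttle_cmd': 0.0, 'lane_change': 'left'}.
    For the 'UAV decelerates ahead' instruction, the expected control
    information is a deceleration command and/or a lane-change command."""
    decelerating = control_info.get("brake_cmd", 0.0) > 0.0
    lane_changing = control_info.get("lane_change") in ("left", "right")
    return decelerating or lane_changing
```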
Optionally, the driving state of the vehicle under test includes the state of an execution system of the vehicle under test, the execution system including at least one of the following: a braking system, a steering system, and a drive system. The driving trajectory of the vehicle under test can be determined from the state of its execution system, and whether the vehicle under test will collide with the model of the target object is determined from this trajectory; if a collision would occur, it can be determined that the driving state of the vehicle under test does not satisfy the preset driving state condition.
For example, if the unmanned aerial vehicle decelerates ahead of the vehicle under test in its lane according to the test instruction, the expected driving state of the vehicle under test includes at least one of the following: the braking system applies the brakes, the steering system steers, and the drive system reduces the throttle.
For example, if the unmanned aerial vehicle enters the current lane of the vehicle under test from its front-left or front-right according to the test instruction, the expected driving state of the vehicle under test likewise includes at least one of the following: the braking system applies the brakes, the steering system steers, and the drive system reduces the throttle.
For example, when the unmanned aerial vehicle decelerates ahead of the vehicle under test, or enters its current lane from the front-left or front-right, and the actual state of the execution system of the vehicle under test matches the expected value of that state corresponding to the test instruction, the vehicle under test decelerates, reduces its acceleration, or changes lanes, and a collision with the model of the target object can be avoided. If, however, the actual state of the execution system differs from the expected value, so that the vehicle under test neither decelerates nor changes lanes but keeps its speed or accelerates, there is a risk of collision; it can then be determined that the driving state of the vehicle under test does not satisfy the preset driving state condition, and the unmanned aerial vehicle may be controlled to execute the preset avoidance task so that the model of the target object avoids the vehicle under test.
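The same comparison can be carried out one level lower, on the states of the execution system; the sketch below checks the braking, steering and drive systems against the expected response for the two scenarios above. The field names and thresholds are illustrative assumptions.

```python
def actuators_match_expectation(exec_state):
    """exec_state is a dict such as
    {'brake_pressure': 25.0, 'steering_angle_deg': 4.0, 'throttle': 0.05},
    sampled from the braking, steering and drive systems of the vehicle under test.
    Any one of braking, steering or a reduced throttle counts as the expected response."""
    braking = exec_state.get("brake_pressure", 0.0) > 0.0
    steering = abs(exec_state.get("steering_angle_deg", 0.0)) > 2.0  # example threshold
    throttle_reduced = exec_state.get("throttle", 1.0) < 0.1         # example threshold
    return braking or steering or throttle_reduced
```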
In other embodiments, whether the driving state of the vehicle under test satisfies the driving state condition is determined according to whether the observation information of the vehicle under test for the traffic scene is consistent with the observation information of the unmanned aerial vehicle for the traffic scene. For example, if the two are inconsistent, there is a risk of collision, so it can be determined that the driving state of the vehicle under test does not satisfy the preset driving state condition, and the unmanned aerial vehicle may be controlled to execute a preset avoidance task so that the model of the target object avoids the vehicle under test; if the two are consistent, it can be determined that the observation information of the vehicle under test is accurate, that accurate decisions and control can be made based on it, and that the possibility of collision is low. For example, if, while driving ahead of the vehicle under test in its lane according to the test instruction, the unmanned aerial vehicle observes that a traffic light ahead shows red or that there is a speed limit sign nearby, but the vehicle under test does not observe the red light or the speed limit sign, then the unmanned aerial vehicle and the model of the target object decelerate and brake while the vehicle under test does not; there is then a risk of collision, so it can be determined that the driving state of the vehicle under test does not satisfy the preset driving state condition.
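The consistency check can be expressed as a comparison of two observation sets; the red-light / speed-limit example above then reduces to a set difference. Representing observations as semantic labels is an assumption made for this sketch.

```python
def observations_consistent(vehicle_observations, drone_observations):
    """Each argument is a set of semantic observations, e.g.
    {'red_light', 'speed_limit_60'}, as seen by the vehicle under test and by
    the UAV. Anything the UAV sees but the vehicle misses indicates that the
    perception of the vehicle under test may be inaccurate."""
    missed_by_vehicle = drone_observations - vehicle_observations
    return len(missed_by_vehicle) == 0

# Example from the description: the UAV observes a red light, the vehicle does not.
# observations_consistent({'lane_line'}, {'lane_line', 'red_light'}) -> False,
# so the driving-state condition is not satisfied and avoidance is triggered.
```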
According to the automatic driving test method provided by the embodiments of the application, the model of the target object is carried by the unmanned aerial vehicle and moves with it in the traffic scene to simulate a traffic participant, creating a test scene for the vehicle under test. The model of the target object carried by the unmanned aerial vehicle has better mobility and sensitivity, responds quickly and with high precision, and a collision between the vehicle under test and the unmanned aerial vehicle is less likely, or causes little damage, so richer test scenes can be created for the vehicle under test, for example tests at higher speeds, and the test is safer.
According to the automatic driving test method provided by the embodiments of the application, through the various avoidance modes, a collision can be avoided, or causes almost no damage, and continuous simulation tests of multiple groups of dangerous scenes can be performed.
Optionally, the model of the target object is formed from a small number of flexible boards or films or by inflation, and has advantages such as low cost and ease of assembly.
Optionally, the vehicle under test and the movable object can communicate with each other in real time, so that the movable object can simulate a real scene more accurately according to the driving state of the vehicle under test.
Optionally, complex traffic flow test scenes can be completed without involving other physical vehicles or expensive driver-assistance test equipment, and multi-vehicle cooperative scenes can be completed safely and efficiently.
Optionally, complex test scenes involving pedestrians, motor vehicles and non-motor vehicles can be completed, with real-time communication and perception among them, so that the road conditions of real, complex city blocks can be simulated.
Optionally, not only can key parameters and motion paths be set in advance, but the perception and communication functions required in the test can also be satisfied, with real-time adaptive adjustment, multi-vehicle cooperative changes, and mutual reference and mutual adjustment.
Optionally, complex environmental conditions such as rainfall, dense fog, dust and illumination can be simulated, and the expected functional safety under severe environments can be verified in the test work.
Optionally, the unmanned-aerial-vehicle-based test system is highly flexible: the responsiveness of the unmanned aerial vehicle is superior to that of a ground mobile platform, its maximum deceleration is large, and its speed is high, so extreme dangerous operating conditions can be simulated.
Referring to fig. 11 in combination with the above embodiment, fig. 11 is a schematic flow chart of a vehicle testing method based on an unmanned aerial vehicle, that is, an autopilot testing method according to another embodiment of the application.
The automatic driving test method is used for an unmanned aerial vehicle, and the unmanned aerial vehicle can carry a model of a target object. The test method includes steps S210 to S220.
S210, receiving a test instruction.
S220, moving in a traffic scene according to the test instruction, so that the model of the target object moves in the traffic scene along with the unmanned aerial vehicle.
The specific principle and implementation manner of the autopilot test method provided in the embodiment of the present application are similar to those of the autopilot test method in the foregoing embodiment, and are not repeated here.
Referring to fig. 12 in combination with the above embodiment, fig. 12 is a schematic flow chart of a vehicle testing method based on an unmanned aerial vehicle, that is, an autopilot testing method according to still another embodiment of the present application.
The automatic driving test method is used for the tested vehicle, and comprises steps S310 to S330.
S310, autonomously moving in a traffic scene based on observation data of a model of a target object in the traffic scene and a preset automatic driving algorithm, wherein the traffic scene comprises an unmanned aerial vehicle and the model of the target object carried by the unmanned aerial vehicle, and the model of the target object moves in the traffic scene along with the unmanned aerial vehicle;
S320, acquiring the running state of the tested vehicle;
S330, sending the running state of the tested vehicle to an automatic driving test device, so that the automatic driving test device generates a test result of the tested vehicle according to the running state of the tested vehicle.
The specific principle and implementation manner of the autopilot test method provided in the embodiment of the present application are similar to those of the autopilot test method in the foregoing embodiment, and are not repeated here.
Referring to fig. 13 in combination with the above embodiments, fig. 13 is a schematic block diagram of an automatic driving test device 500 according to an embodiment of the application.
The autopilot test apparatus 500 may be disposed on the unmanned aerial vehicle, on the vehicle under test, or on a Road Side Unit (RSU) or a terminal device. When the automatic driving test apparatus is disposed on the unmanned aerial vehicle, the trajectory adjustment of the unmanned aerial vehicle can be controlled faster, more in real time, and more accurately.
The autopilot test apparatus 500 includes one or more processors 501, the one or more processors 501 working individually or collectively to perform the autopilot test method described previously.
Illustratively, the autopilot test apparatus 500 further includes a memory 502.
The processor 501 and the memory 502 are illustratively connected by a bus 503, such as an I2C (Inter-integrated Circuit) bus.
Specifically, the processor 501 may be a micro-controller unit (MCU), a central processing unit (CPU), a digital signal processor (DSP), or the like.
Specifically, the memory 502 may be a flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
The processor 501 is configured to execute a computer program stored in the memory 502, and implement the foregoing autopilot test method when the computer program is executed.
The specific principle and implementation manner of the autopilot test apparatus 500 provided in the embodiment of the present application are similar to those of the autopilot test method in the foregoing embodiment, and will not be repeated here.
Referring to fig. 14 in combination with the above embodiments, fig. 14 is a schematic block diagram of an unmanned aerial vehicle 600 according to an embodiment of the present application.
The drone 600 includes a flight platform 610, the flight platform 610 being configured for flight. The drone 600 also includes one or more processors 601, the one or more processors 601 working individually or together to perform the steps of the automated driving test method described previously.
The specific principle and implementation manner of the unmanned aerial vehicle 600 provided in the embodiment of the present application are similar to those of the automatic driving test method in the foregoing embodiment, and will not be repeated here.
Referring to fig. 15 in combination with the above embodiments, fig. 15 is a schematic block diagram of a movable object 700 according to an embodiment of the present application.
The movable object 700 comprises the aforementioned unmanned aerial vehicle 600, and a model 710 of the object, the model 710 of the object being connectable to the unmanned aerial vehicle for movement with the unmanned aerial vehicle in a traffic scene.
The specific principle and implementation of the movable object 700 provided in the embodiment of the present application are similar to those of the unmanned aerial vehicle 600 in the previous embodiment, and will not be repeated here.
Referring to fig. 16 in combination with the above embodiments, fig. 16 is a schematic block diagram of a vehicle 800 according to an embodiment of the present application.
The vehicle 800 includes a vehicle platform 810, and one or more processors 801, the one or more processors 801 working individually or collectively to perform the steps of the automated driving test method described previously.
The specific principle and implementation of the vehicle 800 provided in the embodiment of the present application are similar to those of the automatic driving test method in the foregoing embodiment, and will not be repeated here.
Referring to fig. 17 in conjunction with the above embodiments, fig. 17 is a schematic block diagram of a vehicle testing system based on an unmanned aerial vehicle, also referred to as an autopilot test system 900, provided by an embodiment of the present application.
The autopilot test system 900 includes:
unmanned plane 910;
a model 920 of a target object, which can be mounted on the unmanned aerial vehicle 910, and move in a traffic scene along with the unmanned aerial vehicle 910;
a vehicle under test 930, wherein the vehicle under test 930 is capable of autonomously moving in the traffic scene based on the observation data of the model of the target object and a preset automatic driving algorithm;
the aforementioned automated driving test apparatus 500.
In some embodiments, the autopilot testing apparatus 500 may be disposed on the drone 910.
The specific principle and implementation of the autopilot test system 900 provided in the embodiment of the present application are similar to those of the autopilot test method in the foregoing embodiment, and will not be repeated here.
The embodiment of the present application also provides a computer readable storage medium storing a computer program, where the computer program when executed by a processor causes the processor to implement the steps of the automatic driving test method provided in the foregoing embodiment.
It is to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
It should also be understood that the term "and/or" as used in the present application and the appended claims refers to any and all possible combinations of one or more of the associated listed items, and includes such combinations.
While the application has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and equivalent substitutions may be made without departing from the scope of the application. Therefore, the protection scope of the application is subject to the protection scope of the claims.

Claims (76)

  1. An automated driving test method, the test method comprising:
    transmitting a preset test instruction to an unmanned aerial vehicle, wherein the test instruction is used for indicating the unmanned aerial vehicle to move in a traffic scene, and the unmanned aerial vehicle carries a model of a target object which moves along with the unmanned aerial vehicle in the traffic scene;
    Acquiring a running state of a vehicle to be tested in the traffic scene, wherein the vehicle to be tested can autonomously move in the traffic scene based on observation data of a model of the target object and a preset automatic driving algorithm;
    and generating a test result of the tested vehicle according to the running state of the tested vehicle.
  2. The automated driving test method of claim 1, wherein the model of the target object comprises one or more of: a flexible board, a flexible film, and an inflatable airbag.
  3. The automated driving test method of claim 2, wherein the airbag is capable of inflating or deflating when mounted on the drone.
  4. The autopilot test method of claim 3 wherein the airbag is inflated by one or more of: airflow generated by the blades of the unmanned aerial vehicle during flight, an inflation device on the airbag, and an external inflation device connected to the airbag.
  5. The autopilot test method of claim 1 wherein the maximum weight of the model of the target is positively correlated with the area covered by the unmanned aerial vehicle's propeller disk.
  6. The automated driving test method of claim 1, wherein the model of the target has an exterior shape similar to a traffic participant, the traffic participant comprising one or more of: motor vehicles, non-motor vehicles, pedestrians, animals.
  7. The automated driving test method of claim 1, wherein the material of all or part of the outer surface of the model of the target object and/or of the drone is set according to the radar type of the vehicle under test.
  8. The autopilot testing method of claim 7 wherein the outer surface has one or more of the following characteristics with respect to the detection signal of the radar: a diffuse reflection characteristic, a refraction characteristic, an absorption characteristic.
  9. The method for testing the automatic driving according to claim 8, wherein the detection signal of the radar is any one of laser, millimeter wave and ultrasonic wave.
  10. The automated driving test method of claim 1, wherein the drone and the model of the target object are detachably connected.
  11. The automated driving test method of claim 1, wherein the drone and the model of the target object are rotatably connected.
  12. The automated driving test method of claim 1, wherein all or part of the model of the object is automatically disconnectable from the unmanned aerial vehicle in a preset state.
  13. The autopilot test method of claim 12 wherein the preset state includes: the tensile force between the model of the target object and the unmanned aerial vehicle is larger than a preset threshold value; and/or the tensile force direction between the model of the target object and the unmanned aerial vehicle is in a preset direction.
  14. The automated driving test method of claim 12, wherein the automatically disconnecting the model of the object from the drone in a preset state is based on disconnecting at least two components of a connection between the model of the object and the drone.
  15. The automated driving test method of claim 12, wherein the model of the target is connected to the unmanned aerial vehicle based on a connection that disconnects the model of the target from the unmanned aerial vehicle in response to a trigger instruction to enter the preset state.
  16. The automated driving test method of claim 15, wherein the trigger instruction is generated by a sensor sensing a pull force between the model of the target object and the drone, the sensor being disposed on the model of the target object and/or the drone.
  17. The autopilot testing method of claim 1 wherein the model of the target includes a plurality of components that are detachably connected.
  18. The autopilot testing method of claim 17 wherein at least two of the plurality of components are connected to different drones.
  19. The autopilot testing method of any one of claims 1-18 wherein the method further comprises:
    and when the running state of the tested vehicle does not meet the preset running state condition, controlling the unmanned aerial vehicle to execute a preset avoidance task so as to enable the model of the target object to avoid the tested vehicle.
  20. The method for automatic driving test according to claim 19, wherein the controlling the unmanned aerial vehicle to perform a preset avoidance task to cause the model of the target object to avoid the vehicle under test includes:
    and controlling the unmanned aerial vehicle to adjust the motion state in the horizontal direction and/or the vertical direction so as to enable the model of the target object to avoid the tested vehicle.
  21. The automated driving test method of claim 20, wherein the acceleration of the unmanned aerial vehicle in adjusting the state of motion in the vertical direction is determined from the weight of the model of the target.
  22. The automated driving test method of claim 19, wherein the controlling the unmanned aerial vehicle to perform a preset avoidance task to cause the model of the target to avoid the vehicle under test comprises:
    and controlling the unmanned aerial vehicle to adjust the pose of the model of the target object so as to enable the bottom end of the model of the target object to be far away from the ground.
  23. The automated driving test method of claim 19, wherein the controlling the unmanned aerial vehicle to perform a preset avoidance task to cause the model of the target to avoid the vehicle under test comprises:
    and controlling the unmanned aerial vehicle to execute a preset avoidance task so as to separate a plurality of parts which are detachably connected with the model of the target object from each other.
  24. The automated driving test method of claim 23, wherein the number of parts located on the path of travel of the vehicle under test decreases after the plurality of parts are separated from one another.
  25. The autopilot testing method of claim 23 wherein the plurality of components are separated from one another under the influence of the downwash wind field of the blades of the unmanned aerial vehicle when the unmanned aerial vehicle performs the avoidance task, and/or under the driving action of a mechanical structure when the unmanned aerial vehicle performs the avoidance task.
  26. The automated driving test method of claim 23, wherein controlling the unmanned aerial vehicle to perform a preset avoidance task to separate the detachably connected components of the model of the target from each other comprises:
    and controlling the unmanned aerial vehicles to be far away from the tested vehicle in different directions, so that the unmanned aerial vehicles are mutually separated from each other and are far away from the tested vehicle in different directions with a plurality of parts of the model of the target object.
  27. The autopilot testing method of any one of claims 23-26 wherein at least two adjacent components of the plurality of components are connected by one or more of: a snap-fit connection, an interference fit, magnetic attraction, and adhesive bonding.
  28. The autopilot testing method of any one of claims 1-18 wherein the method further comprises:
    and when the running state of the tested vehicle does not meet the preset running state condition, disconnecting all or part of the model of the target object from the unmanned aerial vehicle.
  29. The autopilot testing method of any one of claims 1-28 wherein the method further comprises:
    Controlling the operation of the environment working condition simulation device carried by the unmanned aerial vehicle, so that the environment working condition simulation device simulates one or more of the following environments: rainfall, dense fog, dust, illumination.
  30. The automated driving test method of any one of claims 1-28, wherein the test instructions are for instructing the drone to move in a traffic scenario with a preset test trajectory.
  31. The automated driving test method of any one of claims 1-28, wherein the generating a test result of the vehicle under test based on the driving state of the vehicle under test comprises:
    and generating a test result of the tested vehicle according to the movement state of the unmanned aerial vehicle and the running state of the tested vehicle.
  32. The automated driving test method of any one of claims 1-28, wherein the test results of the vehicle under test comprise a data set that is a pre-set of the driving state of the vehicle under test.
  33. The automated driving test method of any one of claims 19 to 28, wherein whether the running state of the vehicle under test satisfies the running state condition is determined from the running state of the vehicle under test obtained by one sampling period or from a trend of change in the running state of the vehicle under test obtained by a plurality of sampling periods.
  34. The automated driving test method of claim 33, wherein the driving condition of the vehicle under test comprises at least one of:
    a motion parameter of the vehicle under test, observation information of the vehicle under test for the traffic scene, control information of the vehicle under test, and a relative motion relationship between the vehicle under test and other objects in the traffic scene.
  35. The automated driving test method of claim 33, wherein whether the driving state of the vehicle under test satisfies the driving state condition is determined based on a relative positional relationship of the vehicle under test and a model of the target object.
  36. The automated driving test method of claim 33, wherein whether the driving state of the vehicle under test satisfies the driving state condition is determined based on a preset type of evaluation index, the evaluation index being determined based on the driving state of the vehicle under test.
  37. The automated driving test method of claim 33, wherein whether the driving state of the vehicle under test satisfies the driving state condition is determined from the obtained driving state of the vehicle under test and an expected driving state of the vehicle under test, the expected driving state being determined from the test instruction.
  38. The automated driving test method of claim 33, wherein whether the driving state of the vehicle under test satisfies the driving state condition is determined based on whether the observed information of the vehicle under test for the traffic scene is consistent with the observed information of the model of the object for the traffic scene.
  39. An automated driving test method for an unmanned aerial vehicle, the unmanned aerial vehicle being capable of carrying a model of a target object, the test method comprising:
    receiving a test instruction;
    and moving in a traffic scene according to the test instruction so that the model of the target object moves in the traffic scene along with the unmanned aerial vehicle.
  40. The automated driving test method of claim 39, wherein the drone and the model of the target object are detachably connected.
  41. The automated driving test method of claim 39, wherein the drone and the model of the target object are rotatably connected.
  42. The automated driving test method of claim 39, wherein all or part of the model of the target object is automatically disconnected from the drone in a preset state.
  43. The automated driving test method of claim 39, wherein the model of the target comprises a plurality of components, the plurality of components being detachably connected.
  44. The automated driving test method of claim 43, wherein at least two of the plurality of components are connected to different drones.
  45. The automated driving test method of any one of claims 39-44, wherein the method further comprises:
    and when the running state of the tested vehicle in the traffic scene does not meet the preset running state condition, executing a preset avoidance task so as to enable the model of the target object to avoid the tested vehicle.
  46. The automated driving test method of claim 45, wherein performing a preset avoidance task to cause the model of the target to avoid the vehicle under test comprises:
    and adjusting the motion state in the horizontal direction and/or the vertical direction so as to enable the model of the target object to avoid the tested vehicle.
  47. The automated driving test method of claim 45, wherein performing a preset avoidance task to cause the model of the target to avoid the vehicle under test comprises:
    And adjusting the pose of the model of the target object so as to enable the bottom end of the model of the target object to be far away from the ground.
  48. The automated driving test method of claim 45, wherein performing a preset avoidance task to cause the model of the target to avoid the vehicle under test comprises:
    and executing a preset avoidance task so as to separate the detachably connected parts of the model of the target object from each other.
  49. The automated driving test method of any one of claims 39-44, wherein the method further comprises:
    and when the running state of the tested vehicle in the traffic scene does not meet the preset running state condition, disconnecting all or part of the model of the target object.
  50. An automated driving test method for a vehicle under test, the test method comprising:
    autonomous movement is performed in a traffic scene based on observation data of a model of a target object in the traffic scene and a preset automatic driving algorithm, wherein the traffic scene comprises an unmanned aerial vehicle and the model of the target object carried by the unmanned aerial vehicle, and the model of the target object moves in the traffic scene along with the unmanned aerial vehicle;
    Acquiring the running state of the tested vehicle;
    and sending the running state of the tested vehicle to an automatic driving testing device so that the automatic driving testing device generates a testing result of the tested vehicle according to the running state of the tested vehicle.
  51. The automated driving test method of claim 50, wherein the driving status of the vehicle under test is further for: and when the automatic driving testing device determines that the running state of the tested vehicle does not meet the preset running state condition, controlling the unmanned aerial vehicle to execute a preset avoidance task so as to enable the model of the target object to avoid the tested vehicle.
  52. The automated driving test method of claim 50, wherein the driving status of the vehicle under test is further for: and when the automatic driving testing device determines that the running state of the tested vehicle does not meet the preset running state condition, disconnecting all or part of the model of the target object from the unmanned aerial vehicle.
  53. An autopilot testing apparatus comprising one or more processors operable individually or collectively to perform the steps of the autopilot testing method of any one of claims 1-38.
  54. An unmanned aerial vehicle, comprising:
    the flight platform is used for flying;
    one or more processors, working individually or collectively, to perform the steps of the autopilot testing method of any one of claims 39-49.
  55. A movable target, the movable target comprising:
    the drone of claim 54;
    the model of the target object can be connected to the unmanned aerial vehicle and moves in a traffic scene along with the unmanned aerial vehicle.
  56. The movable object of claim 55, wherein the model of the object comprises one or more of: a flexible board, a flexible film, and an inflatable airbag.
  57. The movable target according to claim 56, wherein the airbag is capable of being inflated or deflated when mounted on the unmanned aerial vehicle.
  58. The movable target according to claim 57, wherein the airbag is inflated by one or more of: airflow generated by the blades of the unmanned aerial vehicle during flight, an inflation device on the airbag, and an external inflation device connected to the airbag.
  59. The movable object of claim 55 wherein the maximum weight of the model of the object is positively correlated with the area covered by the propeller disk of the unmanned aerial vehicle.
  60. The movable object of claim 55 wherein the model of the object has an exterior shape similar to a traffic participant, the traffic participant comprising one or more of: motor vehicles, non-motor vehicles, pedestrians, animals.
  61. The movable object of claim 55, wherein the material of all or part of the outer surface of the model of the object and/or of the drone is set according to the radar type of the vehicle under test in the traffic scene.
  62. The movable target of claim 61, wherein the outer surface has one or more of the following characteristics with respect to the detection signal of the radar: a diffuse reflection characteristic, a refraction characteristic, an absorption characteristic.
  63. The movable target of claim 62, wherein the radar detection signal is any one of laser, millimeter wave, and ultrasonic.
  64. The movable object of claim 55, wherein the model of the object is detachably connectable to the drone.
  65. The movable object of claim 55 wherein the model of the object is capable of being rotatably coupled to the unmanned aerial vehicle.
  66. The movable object of claim 55, wherein all or part of the model of the object is automatically disconnected from the drone in a preset state.
  67. The movable target of claim 66, wherein the predetermined state comprises: the tensile force between the model of the target object and the unmanned aerial vehicle is larger than a preset threshold value; and/or the tensile force direction between the model of the target object and the unmanned aerial vehicle is in a preset direction.
  68. The movable object according to claim 66, wherein the model of the object is automatically disconnected from the unmanned aerial vehicle in a predetermined state based on the disconnection of at least two parts of a connection between the model of the object and the unmanned aerial vehicle.
  69. The movable object according to claim 66, wherein the model of the object is connected to the unmanned aerial vehicle based on a connection that disconnects the model of the object from the unmanned aerial vehicle in response to a trigger instruction to enter the preset state.
  70. The movable target of claim 69, wherein the trigger instruction is generated by a sensor sensing a pulling force between a model of the target and the drone, the sensor being disposed on the model of the target and/or the drone.
  71. The movable object of claim 55 wherein the model of the object comprises a plurality of components, the plurality of components being detachably connected.
  72. The movable target of claim 71, wherein at least two of the plurality of components are connected to different drones.
  73. A vehicle, characterized by comprising:
    a vehicle platform;
    one or more processors, working individually or collectively, to perform the steps of the autopilot testing method of any one of claims 50-52.
  74. An autopilot test system comprising:
    unmanned plane;
    the model of the target object can be carried on the unmanned aerial vehicle and moves in a traffic scene along with the unmanned aerial vehicle;
    the detected vehicle can autonomously move in the traffic scene based on the observation data of the model of the target object and a preset automatic driving algorithm;
    the autopilot test unit of claim 53.
  75. The autopilot testing system of claim 74 wherein the autopilot testing arrangement is disposed on the drone.
  76. A computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement:
    An autopilot testing method as claimed in any one of claims 1 to 38; and/or
    An autopilot testing method as claimed in any one of claims 39 to 49; and/or
    An autopilot testing method as claimed in any one of claims 50 to 52.
CN202180087983.5A 2021-05-28 2021-05-28 Vehicle testing method and system based on unmanned aerial vehicle Pending CN116783461A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/096991 WO2022246849A1 (en) 2021-05-28 2021-05-28 Vehicle test method and system based on unmanned aircraft

Publications (1)

Publication Number Publication Date
CN116783461A true CN116783461A (en) 2023-09-19

Family

ID=84229466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180087983.5A Pending CN116783461A (en) 2021-05-28 2021-05-28 Vehicle testing method and system based on unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN116783461A (en)
WO (1) WO2022246849A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116183242A (en) * 2023-02-22 2023-05-30 吉林大学 Automatic driving test method, equipment and storage medium based on rainfall simulation environment

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007035474B4 (en) * 2007-07-26 2009-06-25 Volkswagen Ag System and method for functional testing of a driver assistance system
EP2781904B1 (en) * 2013-03-22 2015-09-16 Continental Safety Engineering International GmbH Test device for simulating driving situations
CN107218941B (en) * 2017-05-25 2021-05-28 百度在线网络技术(北京)有限公司 Test method and device applied to unmanned automobile
CN208012871U (en) * 2018-03-20 2018-10-26 浙江合众新能源汽车有限公司 A kind of auxiliary stage property people's equipment for pilotless automobile experiment
CN208443587U (en) * 2018-08-01 2019-01-29 上海汽车集团股份有限公司 A kind of collision protection mechanism
CN110197036B (en) * 2019-06-04 2023-02-28 云动(上海)汽车技术有限公司 Intelligent driving evaluation system and evaluation method
CN210912663U (en) * 2019-11-22 2020-07-03 上海汽车集团股份有限公司 Dummy simulating pedestrian

Also Published As

Publication number Publication date
WO2022246849A1 (en) 2022-12-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination