WO2022246849A1 - Vehicle testing method and system based on unmanned aerial vehicle - Google Patents

Vehicle testing method and system based on unmanned aerial vehicle

Info

Publication number
WO2022246849A1
WO2022246849A1 PCT/CN2021/096991 CN2021096991W
Authority
WO
WIPO (PCT)
Prior art keywords
model
target
automatic driving
vehicle
under test
Application number
PCT/CN2021/096991
Other languages
English (en)
French (fr)
Inventor
张玉新
王璐瑶
赵福民
李鹏飞
杜昕一
苏泽文
Original Assignee
吉林大学 (Jilin University)
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by 吉林大学 (Jilin University) and 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to PCT/CN2021/096991 priority Critical patent/WO2022246849A1/zh
Priority to CN202180087983.5A priority patent/CN116783461A/zh
Publication of WO2022246849A1 publication Critical patent/WO2022246849A1/zh

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00: Testing of vehicles
    • G01M17/007: Wheeled or endless-tracked vehicles

Definitions

  • the present application relates to the technical field of automatic driving, in particular to a vehicle testing method and system based on an unmanned aerial vehicle.
  • Autonomous driving means that the vehicle, rather than requiring operation by a driver, automatically collects environmental information through on-board sensors and drives automatically according to that information.
  • The testing and evaluation of automatic driving functions is an indispensable part of vehicle development, technology application, and commercial promotion. Before vehicles with automatic driving functions can move from the laboratory to mass production, a large number of tests are required to prove the stability, robustness, and reliability of their various application functions and performance.
  • Real vehicles or dedicated devices can be used to simulate traffic participants and create a driving scene for the vehicle under test (an autonomous vehicle), in order to verify the ability of its automatic driving function to cope with that scene.
  • However, such test scenarios are relatively limited; for example, only low-speed tests can be carried out.
  • This application provides a vehicle testing method and system based on an unmanned aerial vehicle. Specifically, it provides an automatic driving test method, device, system, movable target, and storage medium that use a device with better mobility and sensitivity to create driving scenarios for the vehicle under test, so that test scenarios can be richer and safer.
  • An embodiment of the present application provides an automatic driving test method, the test method comprising:
  • sending a test instruction to instruct the drone to move in a traffic scene, wherein the drone carries a model of a target object, and the model of the target object moves with the drone in the traffic scene;
  • acquiring the driving state of the vehicle under test in the traffic scene, and generating a test result of the vehicle under test according to that driving state.
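The three steps above (instruct the drone, sample the driving state of the vehicle under test, generate a test result) can be sketched in Python. This is a minimal illustration, not part of the application; `DrivingState`, `run_test`, and the instruction fields are all hypothetical names:

```python
from dataclasses import dataclass

@dataclass
class DrivingState:
    position_m: float   # longitudinal position of the vehicle under test
    speed_mps: float    # current speed
    collided: bool      # whether it has struck the model of the target

def run_test(send_test_instruction, get_driving_state, steps=10):
    """Instruct the drone, then repeatedly sample the vehicle's driving state."""
    send_test_instruction({"track": "lane_change", "speed_mps": 8.0})
    trace = [get_driving_state() for _ in range(steps)]
    # A minimal "test result": pass/fail plus the raw sampled trace.
    passed = not any(s.collided for s in trace)
    return {"passed": passed, "trace": trace}
```

In a real system the two callbacks would wrap the drone's uplink and the vehicle/roadside telemetry described later in the application.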
  • the embodiment of the present application provides an automatic driving test method for an unmanned aerial vehicle, the unmanned aerial vehicle can carry a model of a target object, and the test method includes:
  • an embodiment of the present application provides an automatic driving test method for a vehicle under test, the test method comprising:
  • the traffic scene includes the UAV and the model of the target object carried by the UAV.
  • the model of the target moves with the drone in the traffic scene;
  • an embodiment of the present application provides an automatic driving test device, including one or more processors working individually or jointly to execute the steps of the foregoing automatic driving test method.
  • An embodiment of the present application provides a drone, including one or more processors, working individually or jointly, configured to execute the steps of the aforementioned automatic driving test method.
  • the embodiment of the present application provides a movable target, and the movable target includes:
  • the model of the target can be connected to the UAV and move with the UAV in the traffic scene.
  • An embodiment of the present application provides a vehicle, including one or more processors, working individually or jointly, configured to execute the steps of the aforementioned automatic driving test method.
  • an automatic driving test system including:
  • the model of the target can be carried on the drone and move with the drone in the traffic scene;
  • the vehicle under test can move autonomously in the traffic scene based on the observation data of the model of the target object and a preset automatic driving algorithm;
  • the embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the processor implements the above method.
  • the embodiment of the present application provides a vehicle testing method and system based on an unmanned aerial vehicle.
  • an automatic driving test method, device, system, movable target and storage medium are provided.
  • the model of the target object moves with the drone in the traffic scene to simulate traffic participants, and creates a test scene for the vehicle under test.
  • The model of the target object carried by the drone has better mobility and sensitivity, fast response, and high precision; a vehicle under test driving on the ground is unlikely to collide with the drone, and any collision causes relatively light damage. Richer test scenes can therefore be created for the vehicle under test; for example, higher-speed tests can be performed, and testing is safer.
  • Fig. 1 is a schematic flow chart of an automatic driving test method provided in an embodiment of the present application
  • Fig. 2 is a schematic structural view of a model of an unmanned aerial vehicle carrying a target in an embodiment
  • Fig. 3 is a structural schematic diagram of a model of an unmanned aerial vehicle carrying a target in another embodiment
  • Fig. 4 is a schematic structural diagram of a model of a target object carried by an unmanned aerial vehicle in yet another embodiment;
  • Fig. 5 is a schematic diagram of the vehicle under test moving autonomously based on its observation of the model of the target object in an embodiment;
  • Fig. 6 is a schematic diagram of the vehicle under test moving autonomously based on its observation of the model of the target object in another embodiment;
  • FIG. 7 is a schematic diagram of an unmanned aerial vehicle performing an avoidance task in an embodiment
  • Fig. 8 is a schematic structural view of a model of a target in an embodiment
  • Fig. 9 is a schematic diagram of an unmanned aerial vehicle performing an avoidance task in another embodiment
  • Fig. 10 is a schematic diagram of an unmanned aerial vehicle carrying an environmental working condition simulation device in an embodiment
  • Fig. 11 is a schematic flowchart of an automatic driving test method provided by another embodiment of the present application.
  • Fig. 12 is a schematic flow chart of an automatic driving test method provided in another embodiment of the present application.
  • FIG. 13 is a schematic block diagram of an automatic driving test device provided in an embodiment of the present application.
  • Fig. 14 is a schematic block diagram of an unmanned aerial vehicle provided by an embodiment of the present application.
  • Fig. 15 is a schematic block diagram of a movable target provided by an embodiment of the present application.
  • Fig. 16 is a schematic block diagram of a vehicle provided by an embodiment of the present application.
  • Fig. 17 is a schematic block diagram of an automatic driving test system provided by an embodiment of the present application.
  • Self-driving vehicles are equipped with advanced on-board sensors, controllers, actuators and other devices, and integrate modern communication and network, artificial intelligence and other technologies to realize intelligent information exchange and sharing between vehicles, roads, people, and the cloud.
  • Intelligent decision-making, collaborative control, and other functions can realize safe, efficient, comfortable, and energy-saving driving, ultimately producing a new generation of vehicles that can replace human operation.
  • The existing SAE J3016 standard divides driving automation into six levels, L0-L5: No Automation (L0), Driver Assistance (L1), Partial Automation (L2), Conditional Automation (L3), High Automation (L4), and Full Automation (L5).
  • Test evaluation is an indispensable and important link in the development of automatic driving functions, technology applications and commercial promotion of autonomous vehicles. Unlike traditional cars, the test and evaluation object of autonomous vehicles becomes a human-vehicle-environment-task strong coupling system. With the improvement of the level of driving automation, the functions realized by different levels of automation are increasing step by step, which makes it extremely challenging to test and verify.
  • Some countries and regions have issued corresponding laws and regulations allowing autonomous vehicles to conduct road tests, in order to fully validate the safety of autonomous vehicles.
  • government agencies, scientific research institutes, and related companies in various countries have carried out a lot of research work on the standard system and related evaluation methods required for the test and evaluation of autonomous vehicles.
  • Self-driving vehicles generally conduct virtual simulation tests, closed field tests, and real road tests according to different needs.
  • the virtual simulation test can cover all predictable scenarios within the scope of the Operational Design Domain (ODD), including corner scenes that are not easy to appear, covering all automatic driving functions within the ODD range
  • The closed field test can cover all scenarios within the ODD range, including extreme scenarios such as safety-related accident scenarios and dangerous scenarios; it covers the typical functions of the automatic (assisted) driving system under normal conditions and verifies the simulation test results.
  • Real road tests can cover roads with typical scenarios within the ODD range, covering random scenarios and combinations of random elements, and verify the ability of the automatic driving function to deal with random scenarios.
  • the virtual simulation test cannot simulate the actual traffic scene very well.
  • When the closed field test or the real road test is carried out on the vehicle under test, under some specific circumstances, such as software or hardware failures and external environment interference, the automatic driving function may fail to keep the vehicle under test under control, and the vehicle under test may collide with the test vehicle.
  • When the test is carried out with a real test vehicle driven by a real person, the risk is relatively high; if the vehicle under test cannot be successfully controlled, it may cause a crash with injuries or fatalities.
  • Some test vehicles are realized by mounting a foam panel body on a low-profile chassis. During the test, if the vehicle under test collides with the test vehicle due to control errors, the foam panel body is knocked apart and the vehicle under test rolls over the chassis; some chassis can be passively compressed and lowered under the wheels of the vehicle under test, letting it pass smoothly and reducing damage in dangerous scenarios. However, a high risk remains when testing at higher speeds, and the chassis structure is complex, with retractable springs and similar mechanisms, making it costly and inconvenient to operate.
  • the embodiment of the present application provides a vehicle testing method and system based on an unmanned aerial vehicle.
  • an automatic driving test method, device, system, movable target and storage medium are provided.
  • The drone carries a model of the target object, which moves with the drone in the traffic scene to simulate traffic participants and create a test scene for the vehicle under test.
  • The model of the target object carried by the drone has better mobility and sensitivity, fast response, and high precision; the vehicle under test driving on the ground is unlikely to collide with the drone, and any collision causes relatively light damage. Richer test scenes can therefore be created for the vehicle under test; for example, higher-speed tests can be performed, and testing is safer.
  • FIG. 1 is a schematic flowchart of a vehicle testing method based on an unmanned aerial vehicle, ie, an automatic driving testing method, provided in an embodiment of the present application.
  • The automatic driving test method can be applied in an automatic driving test device for instructing the UAV to move in the traffic scene, so that the model of the target object carried by the UAV moves with the UAV in the traffic scene, creating a test scenario for the vehicle under test, and for generating test results according to the driving state of the vehicle under test in the test scenario.
  • the tested vehicle may include vehicles of different automatic driving levels, such as vehicles of any level in L0-L5. It is understandable that the tested vehicle may be a manned vehicle or an unmanned vehicle.
  • the UAV can be a rotor UAV, such as a quadrotor UAV, a hexacopter UAV, an octorotor UAV, or a fixed-wing UAV.
  • The automatic driving test device can be set on the drone; of course, it can also be set on the vehicle under test, on roadside equipment (Road Side Unit, RSU), or on terminal equipment. The roadside equipment is the core of an intelligent road system: it connects roadside facilities, transmits road information to vehicles and the cloud, and can realize backhaul communication, information broadcasting, and high-precision positioning ground-enhancement functions. Terminal devices can include at least one of mobile phones, tablet computers, laptops, desktop computers, remote controls, etc.
  • The terminal device can also generate corresponding test instructions according to user operations and send them to the drone, so that the drone moves according to the test instructions, for example, along a preset test track.
  • the automatic driving testing method of the embodiment of the present application includes steps S110 to S130.
  • The test instruction is used to instruct the UAV to move in the traffic scene, wherein the UAV carries a model of the target object, and the model of the target object moves with the UAV in the traffic scene.
  • FIG. 2 to FIG. 4 are schematic structural diagrams of a model 120 carrying a target on the drone 110 in different embodiments.
  • The UAV 110 and the model 120 of the target object may together be referred to as a movable target 100; of course, the movable target 100 is not limited to the UAV 110 and the model 120, and may also include, for example, a connector 130 for connecting the drone 110 to the model 120 of the target object.
  • FIG. 5 is a schematic diagram of the autonomous movement of the vehicle under test 200 based on the observation of the model of the target on the movable target 100 in an embodiment.
  • the drone in the movable object receives the test instruction and moves according to the test instruction.
  • the test instruction is used to instruct the UAV to move on a preset test track in a traffic scene.
  • Exemplarily, the test instruction is used to instruct the UAV to drive straight above a certain lane; that lane may be the same as or different from the lane of the vehicle under test, and the two lanes may be parallel or intersecting. Exemplarily, the test instruction is used to instruct the UAV to change from above one lane to above another lane. Exemplarily, the test instruction is used to instruct the UAV to move from behind the vehicle under test to in front of it via the lane to its left or right, though of course it is not limited thereto.
  • the test instruction can also be used to indicate the speed of the UAV, for example, to indicate the speed of the UAV on the test track, and the speeds of different positions on the test track may be the same or different.
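A preset test track with per-position speeds, as described above, might be encoded as a simple waypoint list. This is an illustrative sketch only; the field names and the cut-in track below are not from the application:

```python
def make_test_instruction(waypoints):
    """Encode a preset test track as (x_m, y_m, speed_mps) waypoints.

    Speeds may differ between waypoints, matching the note above that
    speeds at different positions on the test track can vary.
    """
    return [{"x_m": x, "y_m": y, "speed_mps": v} for (x, y, v) in waypoints]

# A cut-in style track: approach in the adjacent lane, then merge ahead.
cut_in = make_test_instruction([
    (0.0, 3.5, 12.0),    # start in the adjacent lane, faster than ego
    (30.0, 3.5, 12.0),   # draw level with the vehicle under test
    (45.0, 0.0, 10.0),   # cut in to the ego lane and slow slightly
])
```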
  • The model of the target object in the movable target moves with the UAV in the traffic scene, and the vehicle under test can move autonomously in the traffic scene based on its observation data of the model and a preset automatic driving algorithm.
  • The observation data of the traffic scene by the vehicle under test is not limited to observation data of the model of the target object; for example, it may also include observation data of other traffic participants, roadside buildings and facilities, traffic markers, etc. The vehicle under test can move autonomously according to its observation data of the traffic scene based on the preset automatic driving algorithm.
  • In some embodiments, the drone includes an environmental sensor, and acquiring the driving state of the vehicle under test in the traffic scene includes: acquiring image information in the traffic scene through the environmental sensor of the drone, and determining the driving state of the vehicle under test according to the image information.
  • Exemplarily, the environmental sensors carried by the drone include radar and/or visual sensors, where the radar includes at least one of the following: lidar, ultrasonic radar, millimeter-wave radar, etc., and the visual sensor includes at least one of the following: a binocular camera, a wide-angle camera, an infrared camera, etc.
  • In some embodiments, the UAV carries the environmental sensor through a gimbal (pan-tilt) device, and the method further includes: controlling the gimbal to adjust its attitude according to the driving state of the vehicle under test, so that the vehicle under test stays within the sensing range of the environmental sensor.
  • Parameters of the visual sensor, such as the focal length and field of view of the camera, can also be adjusted according to the driving state of the vehicle under test.
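As a rough sketch of such gimbal control, the yaw and pitch needed to point the camera at the vehicle under test can be computed from the relative positions. The coordinate conventions here (x/y in metres on the ground plane, z up) and the function name are assumptions for illustration:

```python
import math

def gimbal_angles(drone_pos, vehicle_pos):
    """Yaw/pitch (degrees) pointing the gimbal camera at the vehicle under test.

    Positions are (x_m, y_m, z_m) tuples; pitch is negative when the
    vehicle is below the drone, which is the usual case here.
    """
    dx = vehicle_pos[0] - drone_pos[0]
    dy = vehicle_pos[1] - drone_pos[1]
    dz = vehicle_pos[2] - drone_pos[2]   # negative: vehicle below the drone
    yaw = math.degrees(math.atan2(dy, dx))
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return yaw, pitch
```

A controller would feed these angles to the gimbal each time a new driving-state sample arrives, keeping the vehicle in the sensor's field of view.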
  • In some embodiments, the vehicle under test includes a first communication module and the drone includes a second communication module; through the first and second communication modules, the drone can establish a communication connection with the vehicle under test.
  • Acquiring the driving state of the vehicle under test in the traffic scene then includes: obtaining, through the communication connection with the vehicle under test, the driving state sent by the vehicle under test.
  • In some embodiments, the traffic scene is provided with a roadside device 300, which is used to obtain the driving state of the vehicle under test.
  • The UAV includes a second communication module and the roadside device includes a third communication module, through which the UAV can establish a communication connection with the roadside device.
  • Acquiring the driving state of the vehicle under test then includes: obtaining, through the communication connection with the roadside device, the driving state acquired by the roadside device.
  • the test result of the tested vehicle includes a data set obtained by performing preset processing on the driving state of the tested vehicle.
  • For example, the driving state of the vehicle under test is processed into data in a preset format, such as tables or curves, to obtain the test result of the vehicle under test; of course, it is not limited thereto. For example, a number of evaluation indicators may be determined based on the driving state of the vehicle under test.
  • In some embodiments, generating the test result of the vehicle under test according to its driving state includes: generating the test result according to both the motion state of the drone and the driving state of the vehicle under test. A test scene is created for the vehicle under test by moving the model of the target object with the drone, and different motion states of the drone are expected to make the vehicle under test adjust its own motion state accordingly.
  • Test results generated from the motion state of the drone together with the driving state of the vehicle under test can therefore more accurately reflect the vehicle's observation of the environment and the performance of its automatic driving algorithm.
  • the test result can be used to indicate at least one of the following: whether the tested vehicle and other traffic participants are safe, whether the tested vehicle performs actions in a timely manner, and whether the tested vehicle performs driving behavior accurately .
  • Exemplarily, the test result can be used to indicate: whether the vehicle under test collides with the model of the target object in the test scenario created by the drone and the model; whether the deceleration of the vehicle under test when braking exceeds a preset deceleration threshold; whether the vehicle under test keeps a safe distance from the model when detouring around it; and so on, though of course it is not limited thereto.
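A minimal check of these three indicators over a sampled trace might look as follows. The field names (`collided`, `decel_mps2`, `gap_m`) and the thresholds are placeholders, not values from the application:

```python
def evaluate(trace, decel_limit_mps2=9.0, min_gap_m=1.0):
    """Check the three indicators above over a sampled trace.

    `trace` is a list of dicts with hypothetical keys 'collided',
    'decel_mps2' (braking deceleration), and 'gap_m' (distance to the
    model of the target object).
    """
    return {
        "no_collision": not any(s["collided"] for s in trace),
        "decel_within_limit": all(s["decel_mps2"] <= decel_limit_mps2 for s in trace),
        "safe_gap_kept": all(s["gap_m"] >= min_gap_m for s in trace),
    }
```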
  • The performance of the vehicle under test may be tested by observing how the vehicle under test responds to the model of the target object.
  • A wide variety of real-world scenarios that could lead to dangerous incidents can be simulated by drones carrying models of target objects.
  • Exemplarily, the model of the target object can simulate a pedestrian suddenly darting out from the roadside away from a pedestrian crossing (or from behind a truck stopped on the road); response parameters of the vehicle under test, such as its response time, critical collision time, minimum braking distance, and degree of collision damage, are then used to evaluate the performance of its AEB (Automatic Emergency Braking) system.
  • Exemplarily, the model can simulate a motor vehicle, and the critical safety boundary of the cut-in scene can be extracted by observing, for example, the time to collision (Time to Collision, TTC) between the vehicle under test and the model of the target object at the moment of cut-in.
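TTC itself is simply the current gap divided by the closing speed; a minimal sketch (illustrative, not from the application):

```python
def time_to_collision(gap_m, ego_speed_mps, target_speed_mps):
    """TTC = gap / closing speed; infinite when the gap is opening."""
    closing = ego_speed_mps - target_speed_mps
    return gap_m / closing if closing > 0 else float("inf")
```

At the cut-in moment, a small TTC marks a scenario near the critical safety boundary mentioned above.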
  • the model of the target includes one or more of the following: a flexible board, a flexible film, and an inflatable air bag.
  • a plurality of flexible boards are combined to form the front side board, left side board, rear side board, and right side board of the model of the target object;
  • the model of the target object includes a frame and a flexible film, The flexible film is supported by a frame to form a three-dimensional model of the target;
  • the airbag is inflated to form a three-dimensional model of the target;
  • Exemplarily, part of the model of the target object is composed of flexible boards, and the other part is a flexible film supported by a frame.
  • the side walls of the flexible board, flexible film, and airbag can be metal or non-metal, and can be a single-layer structure or a multi-layer structure.
  • In some embodiments, the maximum weight of the model of the target object is positively correlated with the rotor disc area of the drone, so that the model of the target object can move with the drone in the traffic scene with high maneuverability, such as high acceleration.
  • the weight of the model of the target object is less than or equal to a preset value, for example, the range of the preset value is, for example, 1-20 kg.
  • the weight of the model of the target object is 1 kg, 2 kg, or 5 kg.
  • the model of the target object is made of lightweight materials, such as foam board, cardboard, porous board and the like.
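The positive correlation between allowable model weight and rotor disc area can be illustrated with a rough thrust budget. The disc-loading figure, airframe mass, and thrust margin below are illustrative assumptions, not values from the application:

```python
import math

def max_model_weight_kg(rotor_radius_m, n_rotors, disc_loading_n_per_m2=250.0,
                        airframe_mass_kg=4.0, margin=0.5):
    """Rough payload bound from total rotor disc area.

    Assumes the rotors sustain at most `disc_loading_n_per_m2` of thrust
    per square metre of disc area, and that only `margin` of the maximum
    thrust should be spent on hover so maneuvering headroom remains.
    """
    area_m2 = n_rotors * math.pi * rotor_radius_m ** 2
    max_thrust_n = disc_loading_n_per_m2 * area_m2
    payload_n = margin * max_thrust_n - airframe_mass_kg * 9.81
    return max(payload_n / 9.81, 0.0)
```

Larger rotor discs allow a heavier model, matching the stated positive correlation; the 1-20 kg range above would then constrain the model regardless of the airframe.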
  • Exemplarily, when the airbag is mounted on the drone, it can be inflated or deflated.
  • When the airbag is deflated, its volume is small, which is convenient for storage and transportation.
  • the model of the target object formed is light in weight, which is conducive to moving with the drone.
  • Exemplarily, the airbag is inflated by one or more of the following methods: by the airflow generated by the blades when the drone is flying, by an inflation device on the airbag, or by an external inflation device connected to the airbag.
  • the external inflation device connected to the airbag may or may not be arranged on the drone.
  • the air volume of the airbag can be better ensured, so that the airbag can better protect the traffic participants.
  • The target object modeled may be a traffic participant, for example, different types of vehicles, people of different sizes, different animals, bicycles, motorcycles, or balance scooters.
  • The appearance of the model of the target object is similar to a traffic participant, where the traffic participants include one or more of the following: motor vehicles, bicycles, pedestrians, and animals, and of course are not limited thereto; for example, the target can also be a plastic bag or other debris.
  • The external shape of the model 120 of the target object shown in Figure 2 is similar to a motor vehicle; that shown in Figure 3 is similar to a non-motor vehicle; and that shown in Figure 4 is similar to a pedestrian. The appearance of a traffic participant can thus be simulated by the model 120 of the target object, and its motion can be simulated by moving the model 120 carried by the drone 110.
  • Models 120 of different traffic participants can be used for different test scenarios.
  • As shown in FIG. 6, there are a plurality of movable targets 100 in the traffic scene simulating pedestrians, motor vehicles, and non-motor vehicles, making it possible to test the vehicle under test 200 in a complex test scene.
  • When there are multiple movable targets 100 in the traffic scene simulating motor vehicles, some of them may be in the same lane as the vehicle under test 200, while the lanes of the rest are adjacent to or intersect the lane of the vehicle under test 200. This allows the vehicle under test to perform single-scene or continuous-scene tests in an environment with multiple traffic participants, and realizes cooperative operation of the multiple movable targets 100. For example, each movable target 100 can follow a path preset for the test scene and execute it accurately, and can also adjust the movement of its drone in real time based on the driving state of the vehicle under test (the vehicle equipped with the ADAS/AD system to be tested).
  • The movable target 100 can also perform real-time obstacle avoidance based on wireless communication or its own sensing.
  • In some embodiments, the material of all or part of the outer surface of the model of the target object and/or of the UAV is chosen according to the radar type of the vehicle under test.
  • the radar detection data of the target model by the tested vehicle can be closer to the radar detection data of real traffic participants.
  • the detection signal of the radar is any of the following: laser, millimeter wave, ultrasonic, and certainly not limited thereto.
  • the outer surface is configured to have one or more of the following for detection signals of the radar: diffuse reflection characteristics, refraction characteristics, and absorption characteristics.
  • Exemplarily, the outer surface of the drone can absorb radar detection signals while the model of the target object reflects them, so that the radar of the vehicle under test or of other traffic participants can better detect the model; this also prevents reflections from the UAV from interfering with the radar, making the test scene closer to the real motion scene of the vehicle under test.
  • the drone and the model of the target are rotatably connected.
  • Under an impact, the model of the target object rotates relative to the UAV, which can reduce or avoid damage to the UAV from the collision, and can also reduce or avoid collision damage to the vehicle under test.
  • the UAV can be controlled to adjust the position of the model of the target object.
  • Exemplarily, the bottom of the model of the target object can be kept away from the ground and the vehicle under test, for example so that the vehicle under test can pass under the model, reducing or avoiding impact damage to the model and the drone, and reducing or avoiding collision damage to the vehicle under test.
  • The UAV 110 is detachably connected to the model 120 of the target object. This makes it easy to connect the drone 110 to another model 120, or the model 120 to another drone 110, and also facilitates storage and transportation. For example, when the model 120 is damaged or a different test scene needs to be created, the model 120 carried by the UAV 110 can easily be replaced.
  • all or part of the model of the target can be automatically disconnected from the UAV in a preset state.
  • Exemplarily, the connection between the model 120 of the target object and the drone 110 can be automatically disconnected in a preset state; this is achieved by separating at least two parts of the connecting piece 130 between the model 120 and the drone 110.
  • Exemplarily, the connecting member 130 connects the model 120 of the target object and the drone 110 through at least one of the following: snap fit, interference fit, magnetic attraction, or adhesive connection, and of course it is not limited thereto.
  • Exemplarily, all or part of the connection between the model of the target object and the UAV detaches easily under a pulling force. For example, when the vehicle under test collides with the model, the pulling force disconnects all or part of the model from the UAV, reducing or avoiding collision damage to the model and the UAV, making it easier for the UAV to move away from the vehicle under test quickly, and reducing or avoiding collision damage to the vehicle under test.
  • In some embodiments, the preset state includes: the pulling force between the model of the target object and the drone being greater than a preset threshold; and/or the direction of that pulling force lying in a preset direction.
  • through the design of the connection structure between the model of the target and the UAV, it is possible to ensure a reliable connection between them while still allowing the pulling force generated when the vehicle under test collides with the model to disconnect all or part of the model from the UAV. For example, when the horizontal component of the pulling force between the model of the target and the UAV is greater than a certain value, the model is disconnected from the UAV.
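The pull-force disconnection condition described above can be sketched as a simple check. This is a minimal illustration, not from the patent: the threshold values, the tolerance on the preset direction, and the function name are all assumed, and the sensor is assumed to report a force magnitude and a direction angle.

```python
def should_disconnect(force_n, direction_deg,
                      force_threshold_n=50.0,
                      preset_direction_deg=0.0,
                      direction_tolerance_deg=30.0):
    """Disconnect when the pulling force between the model and the UAV
    exceeds the preset threshold AND its direction lies within a
    tolerance of the preset direction (all values illustrative)."""
    magnitude_ok = force_n > force_threshold_n
    # Smallest angular difference between measured and preset direction.
    diff = abs((direction_deg - preset_direction_deg + 180.0) % 360.0 - 180.0)
    direction_ok = diff <= direction_tolerance_deg
    return magnitude_ok and direction_ok
```

Under these assumptions, a strong pull in the preset direction releases the model, while a strong pull in another direction (for example, normal flight loads) does not.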
  • the model of the target is connected to the UAV based on a connecting piece, and the connecting piece disconnects the model of the target from the UAV in response to a trigger instruction generated upon entering the preset state.
  • for example, the UAV is equipped with an electromagnetic lock that can engage with a corresponding structure on the model of the target, and the electromagnetic lock can be controlled to release according to the trigger instruction so as to disconnect the model of the target from the UAV.
  • the trigger instruction is generated by a sensor sensing the pulling force between the model of the target and the UAV, and the sensor is set on the model of the target and/or the UAV.
  • the sensor includes, for example, a Hall sensor and/or a tension sensor, but of course it is not limited thereto.
  • one end of the tension sensor is connected to the drone, and the other end is connected to the model of the target.
  • part of the model of the target moves close to a preset position on the UAV under the action of the pulling force, and a sensor at the preset position senses the approach of the model and can generate the trigger instruction.
  • the model 120 of the target includes a plurality of parts 121, and the plurality of parts 121 are detachably connected. This facilitates assembly, storage and transportation of the model 120 of the target.
  • at least two adjacent parts 121 among the plurality of parts 121 are connected by one or more of the following methods: snap fit, interference fit, magnetic attraction, and adhesive connection.
  • adjacent components 121 are detachably connected by connecting pieces 122 .
  • At least two of the multiple components 121 are connected to different drones 110 .
  • a plurality of drones 110 can provide the target model 120 with stronger maneuverability, and higher speed and acceleration.
  • the drone 110 can be equipped with an environmental working condition simulation device 140, and the environmental working condition simulation device 140 can simulate one or more of the following environments: rainfall, dense fog, dust, and light.
  • the method further includes: controlling the operation of the environmental condition simulation device carried by the drone, so that the environmental condition simulation device simulates one or more of the following environments: rainfall, dense fog, Dust, light.
  • the UAV 100 can receive the control instruction sent by the terminal device 400 through the communication device 111, and control the environmental working condition simulation device 140 according to the control instruction to simulate one or more of the following environments: rainfall, dense fog, dust, light. The vehicle under test can therefore be tested under various environmental working conditions.
  • the method further includes: when the driving state of the vehicle under test does not meet the preset driving state conditions, controlling the UAV to perform a preset avoidance task so that the model of the target avoids the vehicle under test. This prevents the vehicle under test from colliding with the model of the target, and prevents damage to the model from affecting the test progress. For example, it can reduce or avoid the time spent on replacing, repairing and assembling the model, reduce the cost of automatic driving tests, and enable more accurate testing. Not only can single-scenario tests be performed; in some implementations it is also possible to test automatic driving functions in continuous operation scenarios, supporting verification of high-level automatic driving systems.
  • controlling the UAV to perform a preset avoidance task so that the model of the target avoids the vehicle under test includes: controlling the UAV to adjust its motion state in the horizontal direction and/or the vertical direction so that the model of the target avoids the vehicle under test.
  • the model of the target moves in front of the lane where the vehicle under test is located, and when the vehicle under test is about to collide with the model of the target, the UAV is controlled to accelerate forward so that the model of the target moves away from the vehicle under test.
  • the UAV 110 moves upward in the vertical direction and leaves the ground with the model 120 of the target, so that the vehicle under test can pass below the model 120, or the damage caused by a collision can be reduced.
  • the acceleration when the UAV adjusts its motion state in the vertical direction is determined according to the weight of the model of the target. For example, when the model is heavy, a lower acceleration is used so that the UAV can successfully carry the model away from the vehicle under test.
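The weight dependence above follows from basic thrust kinematics, and can be sketched as follows. The masses and the thrust figure are assumed example values, not taken from the patent.

```python
G = 9.81  # gravitational acceleration, m/s^2

def max_vertical_acceleration(thrust_n, uav_mass_kg, model_mass_kg):
    """Net upward acceleration = (thrust - total weight) / total mass."""
    total_mass = uav_mass_kg + model_mass_kg
    return (thrust_n - total_mass * G) / total_mass

# With the same thrust, a heavier model leaves less margin, so the UAV
# must climb away from the vehicle under test with a lower acceleration.
a_light = max_vertical_acceleration(thrust_n=60.0, uav_mass_kg=2.0, model_mass_kg=1.0)
a_heavy = max_vertical_acceleration(thrust_n=60.0, uav_mass_kg=2.0, model_mass_kg=3.0)
```

The same relation gives the planner an upper bound on the vertical acceleration it may command for a given payload.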
  • the controlling the UAV to perform a preset avoidance task so that the model of the target avoids the vehicle under test includes: controlling the UAV to adjust the pose of the model of the target so that the bottom end of the model is away from the ground. For example, by rotating the model of the target relative to the UAV, the bottom of the model can be kept away from the ground and from the vehicle under test.
  • in this way, the vehicle under test can pass under the model of the target, reducing or avoiding damage to the model and the UAV, and reducing or avoiding collision damage to the vehicle under test.
  • the controlling the UAV to perform a preset avoidance task so that the model of the target avoids the vehicle under test includes: controlling the UAV to perform a preset avoidance task so that the detachably connected parts of the model of the target separate from each other. Separating the multiple components reduces damage in the event of a collision with the vehicle under test.
  • in this way, the number of components located on the driving path of the vehicle under test is reduced, so that the vehicle under test collides with fewer components while traveling along the path, or passes through the gap left after the components separate, reducing or avoiding collision damage.
  • the plurality of components are separated from each other under the action of the downwash wind field of the UAV's blades when the UAV performs the avoidance task.
  • the magnitude and/or direction of the power output of the UAV can be adjusted so that the downwash wind field of the UAV's blades is directed at the model of the target, so that the detachably connected parts of the model separate from each other under the action of the downwash.
  • the plurality of components are separated from each other under the drive of the mechanical structure when the UAV performs the avoidance task.
  • the mechanical structure includes a driving device and a plurality of support arms connected to the driving device; at least two of the plurality of parts are connected to different support arms, and the driving device can actuate the support arms to separate said at least two parts from each other.
  • the drive can comprise, for example, an electric drive and/or an elastic drive.
  • the plurality of components may be separated from each other under the combined action of the downwash wind field of the UAV's blades and the drive of the mechanical structure when the UAV performs the avoidance task. The greater the separation distance, the wider the gap left for the vehicle under test to pass through.
  • controlling the UAV to perform a preset avoidance task so that the detachably connected parts of the model of the target separate from each other includes: controlling a plurality of UAVs to move away from the vehicle under test in different directions, so that the parts of the model carried by them separate from each other and move away from the vehicle under test in different directions.
  • for example, the UAV on the left moves to the left with the parts on the left, and the UAV on the right moves to the right with the parts on the right, away from the vehicle under test, leaving a gap for the vehicle under test to pass through.
  • the connection between adjacent parts is designed to break away easily under tension, for example under the action of the wind field, the drive of the mechanical structure, or the pulling force when different UAVs move in different directions; the connection between adjacent parts is then disconnected so that the parts move away from the vehicle under test.
  • connection between adjacent components can be disconnected according to control instructions.
  • the connection between adjacent parts may be controlled to disconnect. After the connection is disconnected, the parts can move away from the vehicle under test faster under the action of the wind field, the drive of the mechanical structure, or the pulling force of different UAVs moving in different directions.
  • the method further includes: disconnecting all or part of the target model from the UAV when the driving state of the vehicle under test does not meet a preset driving state condition.
  • for example, when the driving state of the vehicle under test does not meet a preset driving state condition, the connecting piece connecting the model of the target and the UAV is triggered to disconnect the model from the UAV. This can prevent the safety of the UAV from being affected when the model of the target collides with the vehicle under test, and can also reduce the damage of the collision to the vehicle under test.
  • the cost of the model of the target is low and it is easy to assemble, so the loss and time cost caused by a collision can be reduced.
  • whether the driving state of the vehicle under test satisfies the driving state condition is determined according to the driving state of the vehicle under test obtained in one sampling period, or according to the change trend of the driving state of the vehicle under test across multiple sampling periods.
  • the one sampling period may be the latest sampling period, or one of multiple sampling periods; for example, the driving state with the highest confidence among the driving states sampled in the multiple sampling periods is determined as the driving state of the vehicle under test.
  • the driving state of the vehicle under test includes at least one of the following: motion parameters of the vehicle under test, observation information of the traffic scene by the vehicle under test, control information of the vehicle under test, and the relative motion relationship between the vehicle under test and other objects in the traffic scene; of course, it is not limited thereto.
  • the UAV is controlled to adjust its motion state so that the model of the target moves away from the vehicle under test.
  • the acquiring the driving state of the vehicle under test in the traffic scene includes: acquiring the observation information of the traffic scene collected by the environment sensor mounted on the vehicle under test, and/or acquiring the control information of the vehicle under test determined by its automatic driving module.
  • the motion parameters of the measured vehicle include at least one of the following: speed, acceleration, and position, where the position is, for example, a lane.
  • the vehicle under test can obtain observation information of the traffic scene through one or more environmental sensors, such as cameras, millimeter-wave radars, and lidars, and the observation information includes at least one of the following: target tracking information, lane line recognition information, drivable area information, traffic flow information.
  • the vehicle under test may determine control information of the vehicle under test through an automatic driving module, and the control information includes, for example, at least one of the following: trajectory planning information, behavior interpretation information, diagnostic information, braking signals, turn signals, acceleration signals, human-computer interaction warning information.
  • the relative motion relationship between the measured vehicle and other objects in the traffic scene includes at least one of the following: relative position relationship, relative speed, and relative acceleration, and of course it is not limited thereto.
  • the relative motion relationship can be obtained by the vehicle under test, by the UAV, or by roadside equipment, or it can be determined based on the driving state of the vehicle under test and the flight state of the UAV; for example, the relative motion relationship between the vehicle under test and the UAV is determined according to the position of the vehicle under test and the position of the UAV.
  • whether the driving state of the vehicle under test satisfies the driving state condition is determined according to the relative positional relationship between the vehicle under test and the model of the target object.
  • the relative positional relationship between the vehicle under test and the model of the target can be determined according to the relative positional relationship between the vehicle under test and the UAV, for example, the vehicle under test and the UAV can be The relative positional relationship is determined as the relative positional relationship between the vehicle under test and the model of the target object.
  • when the relative distance between the vehicle under test and the model of the target is less than or equal to a preset value, such as 1 meter, it is determined that the driving state of the vehicle under test does not meet the preset driving state conditions, and the UAV can be controlled to perform a preset avoidance task so that the model of the target avoids the vehicle under test.
  • in this way, the UAV can be controlled in time to perform a preset avoidance task, so that the model of the target moves away from the vehicle under test to prevent a collision.
  • the drone may be controlled to perform a preset avoidance task, so that the model of the target object avoids the vehicle under test. Collisions can be prevented more precisely.
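The distance-based condition can be sketched briefly. This is an illustrative sketch: positions are treated as 2D coordinates, the 1-meter threshold is the example value from the text, and the function names are invented. Per the text, the vehicle-to-UAV distance can stand in for the vehicle-to-model distance, since the UAV carries the model.

```python
import math

def relative_distance(vehicle_pos, uav_pos):
    """Euclidean distance between the vehicle under test and the UAV,
    used here as the vehicle-to-model distance."""
    return math.dist(vehicle_pos, uav_pos)

def violates_distance_condition(vehicle_pos, uav_pos, threshold_m=1.0):
    """Driving state fails the preset condition when the vehicle under
    test is within the preset distance; the avoidance task follows."""
    return relative_distance(vehicle_pos, uav_pos) <= threshold_m
```

In practice the positions would come from the localization of the vehicle under test and the flight controller of the UAV, as the surrounding text describes.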
  • whether the driving state of the tested vehicle satisfies the driving state condition is determined according to a preset type of evaluation index, and the evaluation index is determined according to the driving state of the tested vehicle .
  • the value of a preset evaluation index can be determined according to one or more driving states of the vehicle under test, and whether the driving state satisfies the driving state condition is determined according to the value of the evaluation index; for example, when the evaluation index indicates that the condition is not satisfied, the UAV can be controlled to perform a preset avoidance task so that the model of the target avoids the vehicle under test.
  • the evaluation index is determined according to the relative position relationship and relative speed of the vehicle under test and the model of the target object.
  • the preset type of evaluation index includes: collision time and/or collision critical deceleration determined according to the relative positional relationship and relative speed of the vehicle under test and the model of the target object.
  • the time to collision can be determined as the time it would take, starting from the current relative position, for a collision to occur if the vehicle under test and the model of the target maintain their relative speed; for example, the time to collision may be determined according to the ratio of the relative distance between the vehicle under test and the model of the target to their relative speed.
  • for example, the speed of the vehicle behind is greater than that of the vehicle in front. If the two vehicles keep their original speeds and trajectories unchanged (that is, assuming that neither the driver nor the automatic driving system takes risk-avoiding action), a collision will occur at some moment according to the current speeds and trajectories; the time period from the current moment to that collision is the time to collision.
  • the smaller the time to collision, the greater the likelihood of an accident.
  • when the time to collision is less than or equal to a preset time threshold, it is determined that the driving state of the vehicle under test does not satisfy the preset driving state condition.
  • the collision critical deceleration (Deceleration Rate to Avoid a Crash, DRAC) is determined according to the deceleration required by the vehicle under test to just avoid colliding with the model of the target; for example, the collision critical deceleration is determined according to the ratio of the square of the relative speed between the vehicle under test and the model of the target to the relative distance.
  • for example, the deceleration required by the rear vehicle to avoid rear-ending the front vehicle is the collision critical deceleration; it can also be called the rear-end-avoidance deceleration.
  • when the collision critical deceleration of the rear vehicle exceeds the vehicle's performance limit or the maximum available deceleration rate (MADR) that the passengers can bear, there is a high probability of collision.
  • when the collision critical deceleration is greater than or equal to a preset deceleration threshold, it is determined that the driving state of the vehicle under test does not satisfy the preset driving state condition.
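The DRAC check can be sketched as follows. Note the factor of 2 in the denominator comes from uniform-deceleration kinematics (the text describes the index more loosely as the ratio of squared relative speed to relative distance); the MADR figure is an assumed example value.

```python
def drac(rel_distance_m, closing_speed_mps):
    """Deceleration Rate to Avoid a Crash: the constant deceleration the
    following vehicle needs to shed the closing speed just as the gap
    closes; from uniform-deceleration kinematics, DRAC = v_rel^2 / (2*d)."""
    if closing_speed_mps <= 0.0:
        return 0.0  # not closing in: no deceleration required
    return closing_speed_mps ** 2 / (2.0 * rel_distance_m)

def drac_condition_violated(rel_distance_m, closing_speed_mps, madr_mps2=7.5):
    """Collision is likely when the required deceleration meets or
    exceeds the maximum available deceleration rate (MADR)."""
    return drac(rel_distance_m, closing_speed_mps) >= madr_mps2
```

A closing speed of 10 m/s over a 5 m gap demands 10 m/s² of braking, beyond the assumed 7.5 m/s² MADR, so the avoidance task would be triggered.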
  • the evaluation index of the preset type includes: when the vehicle under test is in the driving state, the probability of the vehicle under test colliding with the model of the target object.
  • the probability of collision may be determined according to the driving state of the vehicle under test, or according to the driving state of the vehicle under test and the flight state of the UAV, for example based on a machine learning model. For example, if the probability of the vehicle under test colliding with the model of the target while maintaining its driving state is greater than or equal to a probability threshold, it is determined that the driving state of the vehicle under test does not satisfy the driving state condition.
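The probability threshold check can be sketched with a toy stand-in for the learned model — here a logistic curve over time to collision, so the probability rises as TTC shrinks. The curve shape, its parameters, and the 0.5 threshold are invented for illustration and are not from the patent.

```python
import math

def collision_probability(ttc_s, midpoint_s=2.0, scale_s=1.0):
    """Toy stand-in for a machine learning model: maps time to collision
    to a collision probability that rises as TTC shrinks."""
    return 1.0 / (1.0 + math.exp((ttc_s - midpoint_s) / scale_s))

def probability_condition_violated(ttc_s, prob_threshold=0.5):
    """Driving state fails the condition when the estimated collision
    probability meets or exceeds the threshold."""
    return collision_probability(ttc_s) >= prob_threshold
```

A real implementation would replace `collision_probability` with a trained model over the full driving state and UAV flight state, as the text suggests.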
  • whether the driving state of the vehicle under test satisfies the driving state condition is determined according to the acquired driving state of the vehicle under test and the expected driving state of the vehicle under test, where the expected driving state is determined according to the test instruction.
  • the UAV moves in the traffic scene according to the test instruction and creates a test scene for the vehicle under test. If the vehicle under test drives in the expected driving state corresponding to the test instruction, then at least no collision with the model of the target will occur. It can be understood that the expected driving state is a driving state in which the vehicle under test can avoid colliding with the model of the target while the UAV moves in the traffic scene according to the test instruction.
  • the expected driving state can be determined, for example, according to a user's setting operation. Of course, it is not limited thereto, for example, it may be determined through statistical analysis of a large number of traffic scene images.
  • if the driving state of the vehicle under test obtained in step S120 does not match the expected driving state of the vehicle under test, it may be determined that the driving state does not meet the preset driving state conditions and there is a risk of collision, and the UAV can be controlled to perform a preset avoidance task so that the model of the target avoids the vehicle under test.
  • the expected driving state of the tested vehicle includes negative longitudinal acceleration and/or non-zero lateral acceleration.
  • for example, when the UAV decelerates directly in front of the vehicle under test, if the vehicle under test can slow down and/or change lanes, it can avoid colliding with the model of the target; but if the vehicle under test does not actually slow down or change lanes and instead maintains a constant speed or accelerates, there is a risk of collision, it is determined that the driving state of the vehicle under test does not meet the preset driving state conditions, and the UAV can be controlled to perform a preset avoidance task so that the model of the target avoids the vehicle under test.
  • the range of the expected acceleration can also be determined according to the test instruction, and if the actual acceleration of the vehicle under test exceeds the range of the expected acceleration, it can also be determined that the driving state of the vehicle under test does not meet the preset driving state condition.
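The comparison of actual against expected driving state can be sketched for the braking/lane-change example above. The acceleration range bounds and the lateral-acceleration epsilon are assumed example values that a test instruction might supply; none appear in the patent.

```python
def matches_expected_state(longitudinal_acc_mps2, lateral_acc_mps2,
                           expected_long_range=(-8.0, -0.5),
                           lateral_epsilon=1e-3):
    """Expected state here: negative longitudinal acceleration (braking)
    and/or non-zero lateral acceleration (lane change)."""
    braking = expected_long_range[0] <= longitudinal_acc_mps2 <= expected_long_range[1]
    changing_lane = abs(lateral_acc_mps2) > lateral_epsilon
    return braking or changing_lane
```

When this returns False — the vehicle neither brakes nor changes lanes — the driving state fails the preset condition and the avoidance task would follow.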
  • the expected driving state of the vehicle under test includes: a prompt module mounted on the vehicle under test outputs first prompt information.
  • if the prompt module of the vehicle under test outputs the first prompt information, it can remind the driver to control the vehicle under test to slow down and/or change lanes, so as to avoid colliding with the model of the target.
  • if the prompt module of the vehicle under test does not output the first prompt information, there is a risk of collision, and it is determined that the driving state of the vehicle under test does not meet the preset driving state condition. It can be understood that, in some implementations, the driving state of the vehicle under test may also include prompt information output by the vehicle under test.
  • the expected driving state of the vehicle under test includes: negative longitudinal acceleration and/or non-zero lateral acceleration.
  • when the UAV cuts into the lane of the vehicle under test from a position close to it at the left or right front, if the vehicle under test can slow down and/or change lanes, it can avoid colliding with the model of the target; but if the vehicle under test does not actually decelerate or change lanes and instead maintains a constant speed or accelerates, there is a risk of collision, it is then determined that the driving state of the vehicle under test does not meet the preset driving state conditions, and the UAV can be controlled to perform a preset avoidance task so that the model of the target avoids the vehicle under test.
  • the range of the expected acceleration can also be determined according to the test instruction, and if the actual acceleration of the vehicle under test exceeds the range of the expected acceleration, it can also be determined that the driving state of the vehicle under test does not meet the preset driving state condition.
  • the subsequent action trend of the vehicle under test can be determined according to its observation information of the traffic scene; whether there is a risk of collision, and thus whether the driving state of the vehicle under test satisfies the preset driving state condition, is then determined according to the flight state of the UAV corresponding to the test instruction and the subsequent action trend of the vehicle.
  • the expected value of the vehicle under test's observation information of the traffic scene may be determined according to the test instruction; when the actual observed value of the vehicle under test is the same as or close to the expected value, a collision with the model of the target can be avoided, and if the actual observed value differs from the expected value, there is a probability of collision and it can be determined that the driving state of the vehicle under test does not meet the preset driving state condition.
  • for example, the expected driving state of the vehicle under test includes at least one of the following: the acceleration of the model of the target observed by the vehicle under test is negative, and the relative distance between the model of the target observed by the vehicle under test and the vehicle under test decreases.
  • the subsequent action trend of the vehicle under test can be determined according to the control information output by its automatic driving module; whether there is a risk of collision, and thus whether the driving state of the vehicle under test satisfies the preset driving state condition, is then determined according to the driving state of the model of the target corresponding to the test instruction and the subsequent action trend of the vehicle.
  • the expected value of the control information of the vehicle under test can be determined according to the test instruction; when the actual control information is the same as or close to the expected value, a collision with the model of the target can be avoided, and if the actual control information differs from the expected value, there is a probability of collision and it can be determined that the driving state of the vehicle under test does not meet the preset driving state conditions.
  • for example, the expected driving state of the vehicle under test includes: control information for controlling the vehicle under test to decelerate and/or change lanes.
  • the driving state of the tested vehicle includes the state of the executive system of the tested vehicle; the executive system of the tested vehicle includes at least one of the following: a braking system, a steering system, and a driving system.
  • the driving track of the tested vehicle can be determined according to the state of the execution system of the tested vehicle, and whether it will collide with the model of the target object is determined according to the driving track, and if it is determined that it will collide with the model of the target object , it can be determined that the driving state of the vehicle under test does not meet the preset driving state condition.
  • for example, the expected driving state of the vehicle under test includes at least one of the following: the braking system brakes, the steering system turns, and the drive system decelerates.
  • if the actual state of the execution system of the vehicle under test matches the expected state, the vehicle can decelerate or change lanes and thus avoid colliding with the model of the target; but if the actual state of the execution system differs from the expected state, so that the vehicle under test does not actually slow down or change lanes and instead maintains a constant speed or accelerates, there is a risk of collision, it can be determined that the driving state of the vehicle under test does not meet the preset driving state conditions, and the UAV can be controlled to perform a preset avoidance task so that the model of the target avoids the vehicle under test.
  • whether the driving state of the vehicle under test satisfies the driving state condition is determined according to whether the observation information of the traffic scene by the vehicle under test is consistent with the observation information of the traffic scene by the UAV. Exemplarily, if the observation information of the vehicle under test is inconsistent with that of the UAV, there is a risk of collision; it can then be determined that the driving state of the vehicle under test does not meet the preset driving state conditions, and the UAV can be controlled to perform a preset avoidance task so that the model of the target avoids the vehicle under test. If the observation information of the vehicle under test is consistent with that of the UAV, it can be determined that the observation information of the vehicle under test is accurate, accurate decision-making and control can be made according to it, and the possibility of collision is relatively low.
  • for example, while driving in front of the lane where the vehicle under test is located according to the test instructions, the UAV observes a traffic light showing red or a speed limit sign not far ahead, but the vehicle under test does not observe the red light or speed limit sign; the UAV and the model of the target decelerate and brake while the vehicle under test does not, and there is a risk of collision, so it can be determined that the driving state of the vehicle under test does not meet the preset driving state condition.
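The consistency check between the two observers can be sketched as a set comparison. The observation labels and function names are invented for illustration; a real system would compare structured perception outputs rather than string tags.

```python
def missed_by_vehicle(vehicle_observations, uav_observations):
    """Scene elements the UAV observed but the vehicle under test did
    not (e.g. a red light or a speed limit sign). A non-empty result
    suggests the vehicle's perception is failing."""
    return set(uav_observations) - set(vehicle_observations)

def observation_condition_violated(vehicle_observations, uav_observations):
    """Driving state fails the condition when the two observers disagree."""
    return bool(missed_by_vehicle(vehicle_observations, uav_observations))
```

In the red-light example above, `missed_by_vehicle` would contain the red light, and the avoidance task would be triggered.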
  • the model of the target object is carried by the drone, and the model of the target object moves with the drone in the traffic scene to simulate traffic participants, and creates a test scene for the vehicle under test.
  • the model of the target carried by the UAV has better mobility and sensitivity, fast response and high precision, enabling richer test scenarios, such as higher-speed testing, and is safer.
  • the automatic driving test method provided in the embodiments of the present application can avoid collisions through the design of various escape methods, or suffer almost no loss after a collision, and can carry out continuous simulation tests of multiple groups of dangerous scenes, offering advantages such as high efficiency, continuous scene simulation, and safety.
  • the model of the target is formed from a small number of soft boards or films, or by inflation, which has the advantages of low cost and easy splicing.
  • the vehicle under test can communicate with the movable object in real time, so that the movable object can more accurately simulate a real scene according to the driving state of the vehicle under test.
  • pedestrians, motor vehicles, and non-motor vehicles can communicate and perceive each other in real time, simulating real, complex street conditions.
  • not only can key parameters and motion paths be set in advance, but the perception and communication functions during the test can also be satisfied, with real-time self-adjustment, coordinated changes across multiple vehicles, and mutual reference and adjustment.
  • it can simulate complex environmental conditions, such as rainfall, dense fog, dust, light, etc., and can verify the expected functional safety in harsh environments during the test work.
  • the maneuverability sensitivity of the UAV-based test system is relatively high, and the response maneuverability of the UAV is better than that of the ground mobile platform, and the maximum deceleration and speed are sufficient (extreme dangerous conditions can be simulated).
  • FIG. 11 is a schematic flowchart of a vehicle testing method based on an unmanned aerial vehicle, i.e., an automatic driving testing method, provided by another embodiment of the present application.
  • the automatic driving test method is used for an unmanned aerial vehicle capable of carrying a model of a target object.
  • the testing method includes steps S210 to S220.
  • FIG. 12 is a schematic flowchart of a vehicle testing method based on an unmanned aerial vehicle, i.e., an automatic driving testing method, provided by yet another embodiment of the present application.
  • the automatic driving test method is used for a vehicle under test, and the test method includes steps S310 to S330.
  • FIG. 13 is a schematic block diagram of an automatic driving test device 500 provided in an embodiment of the present application.
  • the automatic driving test device 500 can be set on the drone; of course, it can also be set on the vehicle under test, or on a roadside unit (Road Side Unit, RSU) or terminal device.
  • when set on the drone, the automatic driving test device can control the UAV to adjust its trajectory faster, in closer to real time, and more accurately.
  • the automatic driving test device 500 includes one or more processors 501 , and the one or more processors 501 work individually or jointly to execute the aforementioned automatic driving test method.
  • the automatic driving test device 500 further includes a memory 502 .
  • the processor 501 and the memory 502 are connected through a bus 503, such as an I2C (Inter-Integrated Circuit) bus.
  • the processor 501 may be a micro-controller unit (Micro-controller Unit, MCU), a central processing unit (Central Processing Unit, CPU), or a digital signal processor (Digital Signal Processor, DSP), etc.
  • the memory 502 may be a Flash chip, a read-only memory (ROM, Read-Only Memory), a magnetic disk, an optical disk, a USB flash drive, or a removable hard disk.
  • the processor 501 is configured to run a computer program stored in the memory 502, and implement the aforementioned automatic driving test method when executing the computer program.
  • FIG. 14 is a schematic block diagram of an unmanned aerial vehicle provided by an embodiment of the present application, that is, a drone 600 .
  • the drone 600 includes a flying platform 610 for flying.
  • the UAV 600 also includes one or more processors 601 , and the one or more processors 601 work individually or jointly to execute the steps of the aforementioned automatic driving test method.
  • FIG. 15 is a schematic block diagram of a movable object 700 provided by an embodiment of the present application.
  • the movable target 700 includes the aforementioned drone 600 and a model 710 of the target; the model 710 can be connected to the drone and moves with it in the traffic scene.
  • FIG. 16 is a schematic block diagram of a vehicle 800 provided by an embodiment of the present application.
  • the vehicle 800 includes a vehicle platform 810 and one or more processors 801 , and the one or more processors 801 work individually or jointly to execute the steps of the aforementioned automatic driving test method.
  • FIG. 17 is a schematic block diagram of an unmanned aerial vehicle-based vehicle test system, or called an automatic driving test system 900 , provided by an embodiment of the present application.
  • the automatic driving test system 900 includes:
  • the model 920 of the target can be carried on the UAV 910 and move with the UAV 910 in the traffic scene;
  • the vehicle under test 930, which can move autonomously in the traffic scene based on observation data of the model of the target object and a preset automatic driving algorithm;
  • the aforementioned automatic driving test device 500.
  • the automatic driving test device 500 can be set on the drone 910 .
  • An embodiment of the present application also provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the processor implements the steps of the automatic driving test method provided in the above-mentioned embodiments.
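The test flow summarized in the bullets above — send a preset test instruction, observe the driving state of the vehicle under test, and trigger the preset avoidance task when that state violates the preset driving-state condition — can be sketched as follows. This is a minimal illustration in Python; the class names, the 1 m minimum-gap threshold, and the avoidance stub are assumptions for illustration only, not part of the application.

```python
from dataclasses import dataclass

@dataclass
class DrivingState:
    speed_mps: float          # speed of the vehicle under test
    gap_to_model_m: float     # distance to the target model carried by the UAV
    closing_speed_mps: float  # rate at which that gap shrinks (positive = closing)

class UavStub:
    """Stand-in for the UAV: records whether the avoidance task ran."""
    def __init__(self) -> None:
        self.avoided = False

    def execute_avoidance(self) -> None:
        # e.g. climb vertically so the vehicle under test passes under the model
        self.avoided = True

def run_test_step(uav: UavStub, state: DrivingState, min_gap_m: float = 1.0) -> str:
    """One test cycle: if the driving state violates the preset condition
    (here: gap at or below a safety threshold), command the preset
    avoidance task so the target model avoids the vehicle under test."""
    if state.gap_to_model_m <= min_gap_m:
        uav.execute_avoidance()
        return "avoidance"
    return "continue"
```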

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

一种基于无人飞行器的车辆测试方法及系统,方法包括:将预设的测试指令发送给无人机,测试指令用于指示无人机在交通场景中运动,其中,无人机搭载目标物的模型,目标物的模型随无人机在交通场景中运动(S110);获取交通场景中被测车辆的行驶状态(S120);根据被测车辆的行驶状态,生成被测车辆的测试结果(S130)。目标物的模型的机动性、灵敏度更好。

Description

基于无人飞行器的车辆测试方法及系统
技术领域
本申请涉及自动驾驶技术领域,尤其涉及一种基于无人飞行器的车辆测试方法及系统。
背景技术
自动驾驶是指无需驾驶员对车辆进行操作,而是通过车辆上的传感器自动采集环境信息,并根据环境信息进行自动行驶。自动驾驶功能的测试评价是车辆开发、技术应用和商业推广不可或缺的重要环节,自动驾驶功能的车辆从实验室走向量产,需要大量的测试来证明其各项应用功能和性能的稳定性、鲁棒性、可靠性等。在进行封闭场地测试或者实际道路测试时,可以通过真车或者通过一些装置模拟交通参与者给被测车辆(自动驾驶车辆)创设行驶场景,验证被测车辆的自动驾驶功能应对行驶该场景的能力。但是在一些特定情况下,比如软硬件出现故障、外界环境干扰等情况,被测车辆的自动驾驶功能如果不能进行成功控制时,被测车辆会与真车或者这些装置发生碰撞,尤其是在较高速度下碰撞时具有较大的危险性,而且这些装置通常机动性、灵敏度不够好,因此测试的场景比较限定,例如只能进行较低速度的测试。
发明内容
本申请提供了基于无人飞行器的车辆测试方法及系统,具体的,提供自动驾驶测试方法、装置、系统、可移动目标物及存储介质,能够提供机动性、灵敏度更好的装置给被测车辆创设行驶场景,测试的场景可以更丰富且更安全。
第一方面,本申请实施例提供了一种自动驾驶测试方法,所述测试方法包括:
将预设的测试指令发送给无人机,所述测试指令用于指示所述无人机在交通场景中运动,其中,所述无人机搭载目标物的模型,所述目标物的模型随所述无人机在所述交通场景中运动;
获取所述交通场景中被测车辆的行驶状态,其中,所述被测车辆能够基于对所述目标物的模型的观测数据和预设的自动驾驶算法在所述交通场景中自主运动;
根据所述被测车辆的行驶状态,生成所述被测车辆的测试结果。
第二方面,本申请实施例提供了一种自动驾驶测试方法,用于无人机,所述无人机能够搭载目标物的模型,所述测试方法包括:
接收测试指令;
根据所述测试指令在交通场景中运动,以使所述目标物的模型随所述无人机在所述交通场景中运动。
第三方面,本申请实施例提供了一种自动驾驶测试方法,用于被测车辆,所述测试方法包括:
基于对交通场景中目标物的模型的观测数据和预设的自动驾驶算法在交通场景中自主运动,所述交通场景中包括无人机和所述无人机搭载的目标物的模型,所述目标物的模型随所述无人机在所述交通场景中运动;
获取所述被测车辆的行驶状态;
将所述被测车辆的行驶状态发送给自动驾驶测试装置,以使所述自动驾驶测试装置根据所述被测车辆的行驶状态,生成所述被测车辆的测试结果。
第四方面,本申请实施例提供了一种自动驾驶测试装置,包括一个或多个处理器,单独地或共同地工作,用于执行前述的自动驾驶测试方法的步骤。
第五方面,本申请实施例提供了一种无人机,包括:
飞行平台,用于飞行;
一个或多个处理器,单独地或共同地工作,用于执行前述的自动驾驶测试方法的步骤。
第六方面,本申请实施例提供了一种可移动目标物,所述可移动目标物包括:
前述的无人机;
目标物的模型,能够连接在所述无人机上,随所述无人机在交通场景中运动。
第七方面,本申请实施例提供了一种车辆,包括:
车辆平台;
一个或多个处理器,单独地或共同地工作,用于执行前述的自动驾驶测试方法的步骤。
第八方面,本申请实施例提供了一种自动驾驶测试系统,包括:
无人机;
目标物的模型,能够搭载于所述无人机,随所述无人机在交通场景中运动;
被测车辆,所述被测车辆能够基于对所述目标物的模型的观测数据和预设的自动驾驶算法在所述交通场景中自主运动;
前述的自动驾驶测试装置。
第九方面,本申请实施例提供了一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现上述的方法。
本申请实施例提供了基于无人飞行器的车辆测试方法及系统,具体的,提供自动驾驶测试方法、装置、系统、可移动目标物及存储介质,通过无人机,即无人飞行器搭载目标物的模型,目标物的模型随所述无人机在所述交通场景中运动模拟交通参与者,给被测车辆创设测试场景,由无人机搭载的目标物的模型机动性、灵敏度更好,响应快,精度高,地面行驶的被测车辆不易与无人机碰撞或者碰撞的损失较轻,因此可以给被测车辆创设更丰富的测试场景,例如可以进行更高速的测试,而且更安全
应当理解的是,以上的一般描述和后文的细节描述仅是示例性和解释性的,并不能限制本申请实施例的公开内容。
附图说明
为了更清楚地说明本申请实施例的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可 以根据这些附图获得其他的附图。
图1是本申请实施例提供的一种自动驾驶测试方法的流程示意图;
图2是一实施方式中无人机搭载目标物的模型的结构示意图;
图3是另一实施方式中无人机搭载目标物的模型的结构示意图;
图4是又一实施方式中无人机搭载目标物的模型的结构示意图;
图5是一实施方式中被测车辆基于对目标物的模型的观测自主运动的示意图;
图6是另一实施方式中被测车辆基于对目标物的模型的观测自主运动的示意图;
图7是一实施方式中无人机执行避让任务的示意图;
图8是一实施方式中目标物的模型的结构示意图;
图9是另一实施方式中无人机执行避让任务的示意图;
图10是一实施方式中无人机搭载环境工况模拟装置的示意图;
图11是本申请另一实施例提供的一种自动驾驶测试方法的流程示意图;
图12是本申请又一实施例提供的一种自动驾驶测试方法的流程示意图;
图13是本申请实施例提供的一种自动驾驶测试装置的示意性框图;
图14是本申请实施例提供的一种无人机的示意性框图;
图15是本申请实施例提供的一种可移动目标物的示意性框图;
图16是本申请实施例提供的一种车辆的示意性框图;
图17是本申请实施例提供的一种自动驾驶测试系统的示意性框图。
具体实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
附图中所示的流程图仅是示例说明,不是必须包括所有的内容和操作/步骤,也不是必须按所描述的顺序执行。例如,有的操作/步骤还可以分解、组合或部分合并,因此实际执行的顺序有可能根据实际情况改变。
下面结合附图,对本申请的一些实施方式作详细说明。在不冲突的情况下,下述的实施例及实施例中的特征可以相互组合。
随着科学技术的发展以及人工智能技术的应用，自动驾驶技术得到了快速的发展和广泛的应用。自动驾驶车辆是指搭载先进的车载传感器、控制器、执行器等装置，并融合现代通信与网络、人工智能等技术，实现车与车、路、人、云端等之间的智能信息交换、共享，具备复杂环境感知、智能决策、协同控制等功能，可实现“安全、高效、舒适、节能”行驶，并最终可实现替代人来操作的新一代汽车。
基于车辆的驾驶自动化水平,现有的SAE J3016标准将驾驶自动化划分为6个等级,也即是L0-L5等级,分别为无驾驶自动化(No Automation,L0),驾驶辅助(Driver Assistance,L1),部分驾驶自动化(Partial Automation,L2),有条件驾驶自动化(Conditional Automation,L3),高度驾驶自动化(High Automation,L4)和完全驾驶自动化(Full Automation,L5)。随着驾驶自动化等级的不断提高,在驾驶活动中,人的参与程度越来越低。可以预见的是,未来将会有更多自动驾驶车辆行驶在道路上,从而出现自动驾驶车辆和人工驾驶车辆并行在道路上的局面。
测试评价是自动驾驶车辆自动驾驶功能开发、技术应用和商业推广不可或缺的重要环节。不同于传统汽车,自动驾驶车辆的测试评价对象变为人-车-环境-任务强耦合系统。随着驾驶自动化等级的提高,不同等级自动化水平所实现的功能逐级递增,导致对其进行测试验证极具挑战性,部分国家和区域已出台相应的法律法规允许自动驾驶车辆进行公路测试,以充分验证自动驾驶车辆的安全性。除了道路测试,围绕自动驾驶车辆测试评价环节所需的标准体系和相关测评方法,各国的政府机构、科研院所、相关企业开展了大量研究工作。
自动驾驶车辆一般会根据不同需要进行虚拟仿真测试、封闭场地测试、真实道路测试。其中虚拟仿真测试可以覆盖运行设计域(Operational Design Domain,ODD)范围内可预测的全部场景,包括不易出现的边角场景,覆盖ODD范围内全部自动驾驶功能;封闭场地测试可以覆盖ODD范围内的极限场景,如安全相关的事故场景和危险场景,覆盖自动(辅助)驾驶系统正常状态下的典型功能,验证仿真测试结果;真实道路测试可以覆盖ODD范围内典型场景组合 的道路,覆盖随机场景及随机要素组合,验证自动驾驶功能应对随机场景的能力。
虚拟仿真测试不能很好的模拟实际交通场景,在对被测车辆进行封闭场地测试、真实道路测试时,一些特定情况下,比如软硬件出现故障、外界环境干扰等情况,被测车辆的自动驾驶功能如果不能进行成功控制,可能与测试用车碰撞。采用真人驾驶的真实的测试用车进行测试,危险性较高,被测车辆如果不能进行成功控制,可能造成车毁人亡。目前有测试用车通过外扩比较低的底盘搭载泡沫面板车身实现,当进行测试时,如果被测车辆控制失误与该测试用车发生碰撞,泡沫面板车身将被撞散,被测车辆可以从底盘上碾压过去,一些底盘可以在被测车辆的碾压下被动压缩降低高度,让被测车辆能够顺利通过,减少被测车辆在危险场景下的危害损失,但是在较高车速下测试时,仍有较高的危险性;而且底盘结构复杂,带有收缩弹簧等结构,具有成本高、价格昂贵,操作不便等缺点。
为此,本申请实施例提供基于无人飞行器的车辆测试方法及系统,具体的,提供自动驾驶测试方法、装置、系统、可移动目标物及存储介质,通过无人机,即无人飞行器搭载目标物的模型,目标物的模型随所述无人机在所述交通场景中运动模拟交通参与者,给被测车辆创设测试场景,由无人机搭载的目标物的模型机动性、灵敏度更好,响应快,精度高,地面行驶的被测车辆不易与无人机碰撞或者碰撞的损失较轻,因此可以给被测车辆创设更丰富的测试场景,例如可以进行更高速的测试,而且更安全。
请参阅图1,图1是本申请实施例提供的一种基于无人飞行器的车辆测试方法,即自动驾驶测试方法的流程示意图。所述自动驾驶测试方法可以应用在自动驾驶测试装置中,用于指示无人机在交通场景中运动,以便无人机搭载的目标物的模型随所述无人机在所述交通场景中运动,给被测车辆创设测试场景,以及根据被测车辆在所述测试场景中的行驶状态生成测试结果等过程。其中,被测车辆可以包括不同自动驾驶等级的车辆,如L0-L5中任一等级的车辆,可以理解的,被测车辆可以是有人驾驶的车辆,也可以是无人驾驶的车辆。
无人机可以为旋翼型无人机,例如四旋翼无人机、六旋翼无人机、八旋翼无人机,也可以是固定翼无人机。
其中,自动驾驶测试装置可以设置在无人机上,当然也可以设置在被测车辆上,或者也可以设置在路侧设备(Road Side Unit,RSU)或者终端设备上,其中路侧设备是智能道路系统的核心,起到连接路侧设施,传递道路信息给车、云端的作用,可以实现后台通信功能、信息广播功能、高精定位地基增强功能;终端设备可以包括手机、平板电脑、笔记本电脑、台式电脑、遥控器等中的至少一项。在一些实施方式中,终端设备还可以根据用户的操作生成相应的测试指令,以及将生成的测试指令发送给无人机,以使无人机根据所述测试指令运动,例如使无人机根据预设的测试轨迹运动。
如图1所示,本申请实施例的自动驾驶测试方法包括步骤S110至步骤S130。
S110、将预设的测试指令发送给无人机,所述测试指令用于指示无人机在交通场景中运动,其中,所述无人机搭载目标物的模型,所述目标物的模型随所述无人机在所述交通场景中运动。
请参阅图2至图4,图2至图4所示为不同实施方式中无人机110搭载目标物的模型120的结构示意图。在一些实施方式中,无人机110和目标物的模型120一起可以称为可移动目标物100,当然所述可移动目标物100也不仅限于包括无人机110和目标物的模型120,例如还可以包括用于连接无人机110和目标物的模型120的连接件130。
如图5所示为一实施方式中被测车辆200基于对可移动目标物100上目标物的模型的观测自主运动的示意图。
可移动目标物中的无人机接收所述测试指令,以及根据所述测试指令运动。在一些实施方式中,所述测试指令用于指示无人机在交通场景中以预设的测试轨迹运动。示例性的,所述测试指令用于指示无人机在某一条车道上方直线行驶,所述某一条车道与所述被测车辆的车道为同一个或者不为同一个,所述某一条车道与所述被测车辆的车道可以平行或者相交;示例性的,所述测试指令用于指示无人机从一条车道上方变换至另一条车道上方;示例性的,所述测试指令用于指示无人机在所述被测车辆的后方经所述被测车辆左侧或右侧的车道上方运动至所述被测车辆的前方,当然也不限于此。例如所述测试指令还可以用于指示无人机的速度,例如指示无人机在所述测试轨迹上的速度,所述测试轨迹上不同位置的速度可以相同也可以不相同。
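The paragraph above describes test instructions that specify a preset trajectory and the UAV's speed at different positions along it. A minimal sketch of such an instruction follows; the `TestInstruction` structure and the lane-change waypoints are hypothetical, chosen only to illustrate that speeds at different positions on the trajectory may be equal or differ.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TestInstruction:
    """Hypothetical preset test instruction for the UAV: (x, y) waypoints
    over the traffic scene, with a commanded speed at each waypoint."""
    waypoints: List[Tuple[float, float]]   # metres, scene coordinates
    speeds_mps: List[float]                # one commanded speed per waypoint

    def __post_init__(self) -> None:
        if len(self.waypoints) != len(self.speeds_mps):
            raise ValueError("one speed per waypoint")

# Example: change from above one lane (y = 0.0) to the adjacent lane (y = 3.5 m)
lane_change = TestInstruction(
    waypoints=[(0.0, 0.0), (20.0, 1.75), (40.0, 3.5)],
    speeds_mps=[15.0, 15.0, 15.0],
)
```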
S120、获取所述交通场景中被测车辆的行驶状态,其中,所述被测车辆能够基于对所述目标物的模型的观测数据和预设的自动驾驶算法在所述交通场景中自主运动。
可移动目标物中目标物的模型随无人机在交通场景中运动,被测车辆能够基于对所述目标物的模型的观测数据和预设的自动驾驶算法在所述交通场景中自主运动,需要说明的是,被测车辆对交通场景的观测数据不仅限于目标物的模型的观测数据,例如还可以包括对其他交通参与者、路旁建筑设施、交通标志物等的观测数据,被测车辆能够基于预设的自动算法,根据对交通场景的观测数据自主运动。
在一些实施方式中,所述无人机包括环境传感器,所述获取所述交通场景中被测车辆的行驶状态,包括:通过所述无人机的环境传感器获取所述交通场景中的影像信息,根据所述影像信息确定所述被测车辆的行驶状态。举例而言,所述无人机搭载的环境传感器包括雷达和/或视觉传感器,其中雷达例如包括以下至少一种:激光雷达、超声波雷达、毫米波雷达等,视觉传感器例如包括以下至少一种:双目摄像头、广角摄像头、红外摄像头等。
示例性的,所述无人机通过云台设备搭载所述环境传感器,所述方法还包括:根据所述被测车辆的行驶状态控制所述云台调整姿态,以使所述被测车辆处于所述环境传感器的感知范围内。当然也不限于此,例如还可以根据被测车辆的行驶状态调整视觉传感器,如摄像头的焦距和视野。
在一些实施方式中,所述被测车辆包括第一通信模块,所述无人机包括第二通信模块,所述无人机能够通过所述第一通信模块、所述第二通信模块与所述被测车辆通信连接;所述获取所述交通场景中被测车辆的行驶状态,包括:通过与所述被测车辆的通信连接,获取所述被测车辆发送的所述被测车辆的行驶状态。
在一些实施方式中,如图5所示,所述交通场景设置有路侧设备300,所述路侧设备用于获取所述被测车辆的行驶状态,所述无人机包括第二通信模块,所述路侧设备包括第三通信模块,所述无人机能够通过所述第二通信模块、所述第三通信模块与所述路侧设备通信连接;所述获取所述交通场景中被测车辆的行驶状态,包括:通过与所述路侧设备的通信连接,获取所述路侧设备获取 的所述被测车辆的行驶状态。
S130、根据所述被测车辆的行驶状态,生成所述被测车辆的测试结果。
在一些实施方式中,所述被测车辆的测试结果包括对所述被测车辆的所述行驶状态进行预设处理得到的数据集。
示例性的,将所述被测车辆的行驶状态处理为预设格式的数据,如表格、曲线等,得到所述被测车辆的测试结果,当然也不限于此,例如还可以根据所述被测车辆的行驶状态确定若干评价指标。
在一些实施方式中,所述根据所述被测车辆的行驶状态,生成所述被测车辆的测试结果,包括:根据所述无人机的运动状态和所述被测车辆的行驶状态,生成所述被测车辆的测试结果。基于所述无人机带着所述目标物的模型运动给所述被测车辆创设测试场景,期望无人机的不同运动状态能使所述被测车辆相应的调整运动状态,根据所述无人机的运动状态和所述被测车辆的行驶状态生成的所述测试结果,可以更准确的体现被测车辆对环境的观测和自动驾驶算法的性能。
示例性的,所述测试结果能够用于指示以下至少一种:所述被测车辆和其他交通参与者是否安全、所述被测车辆执行动作是否及时、所述被测车辆执行驾驶行为是否精准。举例而言,所述测试结果能够用于指示所述被测车辆在所述无人机和目标物的模型创设的测试场景下是否与所述目标物的模型发生碰撞、所述被测车辆刹车时的减速度是否超过预设的减速度阈值、所述被测车辆绕行所述目标物的模型时能够与所述目标物的模型保持安全距离等,当然也不限于此。
在一些实施方式中,可通过观察被测车辆如何响应目标物的模型来测试被测车辆的性能,如辅助驾驶功能或者自动驾驶功能。可以通过无人机搭载目标物的模型模拟各种各样的真实场景,这些场景可能会导致危险事件的发生。举例而言,在汽车自动紧急制动(Autonomous Emergency Braking,AEB)测试中,目标物的模型可以模拟行人突然从非人行横道的路边窜出(或路上停止的卡车后面窜出),根据测量的被测车辆的响应参数,如被测车辆的响应时间、临界碰撞时间、最小制动距离、碰撞伤害程度等来评价AEB系统的性能;如在高速公路车辆切入场景的自动驾驶测试中,目标物的模型可以模拟机动车,可以通 过关注切入时刻被测车辆如与目标物的模型的碰撞时间(Time to Collision,TTC)去提取切入场景的临界安全边界。
可选的,所述目标物的模型包括以下一种或者多种:柔性板、柔性膜、能够充气的气囊。示例性的,多个柔性板组合,形成所述目标物的模型的前侧板、左侧板、后侧板、右侧板;示例性的,所述目标物的模型包括框架和柔性膜,柔性膜由框架支撑,形成立体的目标物的模型;示例性的,气囊充气后形成立体的目标物的模型;示例性的,目标物的模型的部分为柔性板组合的框架,其他部分为框架支撑的柔性膜。当然也不限于此。可选的,柔性板、柔性膜、气囊的侧壁可以是金属或非金属,可以为单层结构或者为多层结构。
可选的,所述目标物的模型的最大重量与所述无人机的桨盘覆盖的面积正相关,以便目标物的模型能够随所述无人机在所述交通场景中运动,且具有较高的机动性,如加速度。可选的,搭载所述目标物的模型的无人机可以为一个,也可以为多个。
可选的,所述目标物的模型的重量小于或等于预设值,举例而言,所述预设值的范围例如为1-20千克。例如,所述目标物的模型的重量为1千克、2千克、或者5千克。可选的,所述目标物的模型由轻质材料制成,如泡沫板、纸板、多孔板等。
可选的,所述气囊搭载在所述无人机上时,能够充气或放气。气囊放气时体积较小,方便收纳和运输,充气时形成的目标物的模型重量较轻,利于随无人机运动。
可选的,所述气囊由以下一种或者多种方式充气:由所述无人机飞行时桨叶产生的气流充气、由所述气囊上的充气装置充气、由所述气囊连接的外部充气装置充气。其中气囊连接的外部充气装置可以设置在所述无人机上,也可以不设置在所述无人机上。通过多种方式充气时,可以更好的保证气囊的气量,以便气囊更好的交通参与者。
可选的,目标物的模型可以是交通参与者,例如,不同类型的车辆,不同身材的人,不同的动物,自行车,摩托车或者平衡车等等。
所述目标物的模型的外表形状与交通参与者类似,所述交通参与者包括以下一种或者多种:机动车、非机动车、行人、动物,当然也不限于此,例如可 以为车道上的塑料袋等杂物。如图2所示目标物的模型120的外表形状与机动车类似,如图3所示目标物的模型120的外表形状与非机动车类似,如图4所示目标物的模型120的外表形状与行人类似,从而可以通过目标物的模型120模拟交通参与者的外观特性,由无人机110带着目标物的模型120运动模拟交通参与者的运动。不同交通参与者的目标物的模型120可以用于不同的测试场景。
请参阅图6,所述交通场景中有多个可移动目标物100,用于模拟行人、机动车和非机动车,可以进行复杂测试场景下对被测车辆200的测试。在另一些实施方式中,交通场景中有多个可移动目标物100用于模拟机动车,部分可移动目标物100与被测车辆200处于同一车道,其余部分可移动目标物100的车道与被测车辆200的车道相邻或相交,可以用于被测车辆在具有多个交通参与者的环境中进行单一场景或连续场景测试,实现多个可移动目标物100协同工作,例如多个可移动目标物100可以基于测试场景预先设定路径并精确执行,也可以基于被测车辆(装有待测ADAS/AD的车辆)的行驶状态实时调整无人机的运动,可移动目标物100还可以基于无线通讯或自身感知实时避障。
可选的,所述目标物的模型和/或所述无人机的全部或者部分外表面的材料根据所述被测车辆的雷达类型设置。以使目标物的模型的雷达探测特性与对应的交通参与者类似,被测车辆对目标物的模型的雷达探测数据可以更接近对真实交通参与者的雷达探测数据。可选的,所述雷达的探测信号为以下任一种:激光、毫米波、超声波,当然也不限于此。
可选的,所述外表面被设置为针对所述雷达的探测信号有以下一种或者多种:漫反射特性、折射特性、吸收特性。示例性的,所述无人机的外表面能够吸收雷达的探测信号,目标物的模型能够反射雷达的探测信号,被测车辆或者其他交通参与者的雷达可以更好的探测目标物的模型,还可以防止无人机反射探测信号对雷达的干扰,测试场景更符合被测车辆真实的运动场景。
可选的,所述无人机和所述目标物的模型可转动连接。在一些实施方式中,在被测车辆与所述目标物的模型碰撞时,目标物的模型在撞击的作用下相对于无人机转动,可以降低或避免撞击对无人机造成的损伤,以及降低或避免碰撞对被测车辆的损害。在一些实施方式中,当所述被测车辆的行驶状态不满足预 设的行驶状态条件,例如可能与目标物的模型碰撞时,可以控制所述无人机调整所述目标物的模型的位姿,例如使目标物的模型相对于无人机转动,目标物的模型的底端可以远离地面和被测车辆,例如被测车辆可以从目标物的模型下方通过,降低或避免撞击对目标物的模型和无人机造成的损伤,以及降低或避免碰撞对被测车辆的损害。
可选的,请参阅图7,所述无人机110和所述目标物的模型120可拆卸的连接。便于将无人机110连接到另一目标物的模型120,或者将所述目标物的模型120连接到另一无人机110上,以及便于收纳和运输。举例而言,目标物的模型120损坏时或者需要创设不同的测试场景时可以很方便的更换无人机110搭载的目标物的模型120。
可选的,所述目标物的模型的全部或者部分与所述无人机的连接能够在预设状态下自动断开。
在一些实施方式中,请参阅图7,所述目标物的模型120与所述无人机110的连接能够在预设状态下自动断开,是基于所述目标物的模型120与所述无人机110之间的连接件130的至少两个部件断开连接实现的。示例性的,所述连接件130通过以下至少一种方式连接所述目标物的模型120与所述无人机110:卡接扣合、过盈配合、磁性吸合、黏性连接,当然也不限于此。
在一些实施方式中,所述目标物的模型的全部或者部分与所述无人机的连接在拉力作用下是易脱落的,例如在被测车辆与所述目标物的模型碰撞时,产生的拉力使所述目标物的模型的全部或者部分与所述无人机的连接断开,降低或避免撞击对目标物的模型和无人机造成的损伤,还可以便于无人机较快的远离被测车辆,以及降低或避免碰撞对被测车辆的损害。
可选的,所述预设状态包括:所述目标物的模型与所述无人机之间的拉力大于预设阈值;和/或,所述目标物的模型与所述无人机之间的拉力方向处于预设方向。示例性的,可以通过对所述目标物的模型与所述无人机的连接结构的设计,既保证目标物的模型与所述无人机的可靠连接,也可以使被测车辆与所述目标物的模型碰撞时产生的拉力使所述目标物的模型的全部或者部分与所述无人机的连接断开。举例而言,所述目标物的模型与所述无人机之间的拉力在水平方向上的分量大于特定值时,所述目标物的模型与所述无人机的连接断开。
在一些实施方式中,所述目标物的模型与所述无人机基于连接件进行连接,所述连接件响应于进入所述预设状态的触发指令,断开所述目标物的模型与所述无人机的所述连接。示例性的,所述无人机上设有电磁锁扣,所述电磁锁扣能够与所述目标物的模型上的相应结构卡接,电磁锁扣可以根据所述触发指令受控解锁所述卡接,以使所述目标物的模型与所述无人机断开连接。
可选的,所述触发指令由传感器感测所述目标物的模型与所述无人机之间的拉力生成,所述传感器设置在所述目标物的模型和/或所述无人机上。所述传感器例如包括霍尔传感器和/或拉力传感器,当然也不限于此。示例性的,所述拉力传感器的一端连接在所述无人机上,另一端连接在所述目标物的模型上。示例性的,所述目标物的模型在拉力作用下部分部位靠近所述无人机的预设位置,所述预设位置的传感器感测到所述目标物的模型的靠近,可以生成所述触发指令。
可选的,如图8和图9所示,所述目标物的模型120包括多个部件121,所述多个部件121可拆卸的连接。便于所述目标物的模型120的组装和收纳、运输。示例性的,所述多个部件121中至少两个相邻的部件121通过以下一种或多种方式连接:卡接扣合、过盈配合、磁性吸合、黏性连接。举例而言,如图8和图9所示,相邻的部件121通过连接件122可拆卸的连接。
可选的,如图9所示,所述多个部件121中的至少两个部件121连接不同的无人机110。多个无人机110可以给所述目标物的模型120提供更强的机动性能,和更高的速度、加速度。
可选的,如图10所示,无人机110能够搭载环境工况模拟装置140,环境工况模拟装置140能够模拟以下一种或多种环境:降雨,浓雾,灰尘、光照。
在一些实施方式中,所述方法还包括:控制所述无人机搭载的环境工况模拟装置运行,以使所述环境工况模拟装置模拟以下一种或多种环境:降雨,浓雾,灰尘、光照。如图10所示,无人机100可以通过通信装置111接收终端设备400发送的控制指令,以及根据所述控制指令控制环境工况模拟装置140模拟以下一种或多种环境:降雨,浓雾,灰尘、光照。从而可以实现各种环境工况下所述被测车辆的性能。
在一些实施方式中，所述方法还包括：当所述被测车辆的行驶状态不满足预设的行驶状态条件时，控制所述无人机执行预设的避让任务，以使所述目标物的模型避让所述被测车辆。防止被测车辆与目标物的模型碰撞，防止目标物的模型损坏影响测试进度，例如可以减少或者避免花费在更换、维修、组装目标物的模型的时间。降低自动驾驶测试的代价，能够进行更准确的测试，而且不仅能进行单一场景的测试，在一些实施方式中还可以实现在连续运行场景中针对自动驾驶功能进行测试，支持高级别自动驾驶系统测试验证。
在一些实施方式中,所述控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆,包括:控制所述无人机在水平方向和/或竖直方向上调整运动状态,以使所述目标物的模型避让所述被测车辆。
示例性的,所述目标物的模型在所述被测车辆所在车道的前方运动,所述被测车辆将与所述目标物的模型碰撞时,控制所述无人机加速前进,以使所述目标物的模型远离所述被测车辆。
示例性的,如图7所示,所述无人机110在竖直方向上向上运动,带着所述目标物的模型120离开地面,以便所述被测车辆从所述目标物的模型120下方通过,或者可以降低碰撞时产生的损坏。
示例性的,所述无人机在竖直方向上调整运动状态时的加速度,根据所述目标物的模型的重量确定。举例而言,所述目标物的模型的重量较重时,所述加速度较低,以便所述无人机能够成功带着所述目标物的模型远离所述被测车辆。
在一些实施方式中,所述控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆,包括:控制所述无人机调整所述目标物的模型的位姿,以使所述目标物的模型的底端远离地面。例如通过使目标物的模型相对于无人机转动,目标物的模型的底端可以远离地面和被测车辆,例如被测车辆可以从目标物的模型下方通过,降低或避免撞击对目标物的模型和无人机造成的损伤,以及降低或避免碰撞对被测车辆的损害。
在一些实施方式中,请参阅图8,所述控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆,包括:控制所述无人机执行预设的避让任务,以使所述目标物的模型的可拆卸连接的多个部件之间互相分离。多个部件之间互相分离可以降低与被测车辆发生碰撞时的损失。
示例性的,所述多个部件之间互相分离后,位于所述被测车辆的行驶路径上的部件数量减少。所述被测车辆沿所述行驶路径行驶与较少的部件碰撞,或者从部件分离后的空隙中经过,降低或避免碰撞引起的损害。
示例性的，所述多个部件，在所述无人机执行所述避让任务时无人机桨叶的下压风场的作用下互相分离。举例而言，可以调整所述无人机的动力输出的大小和/或方向，以使所述无人机的桨叶的下压风场朝向所述目标物的模型，以使所述目标物的模型的可拆卸连接的多个部件，在所述下压风场的作用下互相分离。
示例性的,所述多个部件,在所述无人机执行所述避让任务时机械结构的驱动作用下互相分离。举例而言,所述机械结构包括驱动装置和连接所述驱动装置的多个支撑臂,所述多个部件中的至少两个部件连接于不同的支撑臂,所述驱动装置可以驱动所述支撑臂动作,以使所述的至少两个部件互相分离。所述驱动装置例如可以包括电驱动装置和/或弹性驱动装置。
示例性的,所述多个部件,可以在所述无人机执行所述避让任务时无人机桨叶的下压风场的作用和所述无人机执行所述避让任务时机械结构的驱动作用下互相分离。分离的距离可以更远,留给所述被测车辆通行的空隙越宽。
在一些实施方式中,请参阅图9,所述控制所述无人机执行预设的避让任务,以使所述目标物的模型的可拆卸连接的多个部件之间互相分离,包括:控制多个无人机向不同方向远离所述被测车辆,以使所述多个无人机带着所述目标物的模型的多个部件之间互相分离且向不同方向远离所述被测车辆。如图9所示,左侧的无人机带着左侧的部件向左运动,右侧的无人机带着右侧的部件向右运动,远离所述被测车辆,留给所述被测车辆通行的空隙。
在一些实施方式中，相邻部件之间的连接在拉力作用下是易脱落的，例如在风场作用下或者在机械结构的驱动作用下，或者在不同无人机向不同方向运动时的拉力作用下断开相邻部件之间的连接，便于部件远离所述被测车辆。
在另一些实施方式中,相邻部件之间的连接能够根据控制指令断开。示例性的,控制所述无人机执行预设的避让任务时,可以控制相邻部件之间的连接件断开所述相邻部件的连接。相邻部件的连接断开后,在风场作用下或者在机械结构的驱动作用下,或者在不同无人机向不同方向运动时的拉力作用下可以 更快的远离被测车辆。
可选的,所述方法还包括:当所述被测车辆的行驶状态不满足预设的行驶状态条件时,断开所述目标物的模型的全部或者部分与所述无人机的连接。示例性的,当所述被测车辆的行驶状态不满足预设的行驶状态条件时,触发用于连接所述目标物的模型与所述无人机的连接件断开所述目标物的模型与所述无人机的所述连接。可以防止目标物的模型与被测车辆碰撞时影响无人机的安全,还可以降低碰撞对被测车辆的损害。目标物的模型的成本较低或者容易组装,因此可以降低碰撞造成的损失和时间成本。
在一些实施方式中,所述被测车辆的行驶状态是否满足所述行驶状态条件,是根据一个采样周期得到的所述被测车辆的行驶状态确定的,或者是根据多个采样周期得到的所述被测车辆的行驶状态的变化趋势确定的。其中,所述一个采样周期可以是最近一次采样周期,或者是所述多个采样周期中的一个,例如确定所述多个采样周期采样的行驶状态中置信度最高的行驶状态,确定所述置信度最高的行驶状态为所述被测车辆的行驶状态。示例性的,根据多个采样周期得到的所述被测车辆的行驶状态的变化趋势,可以确定所述被测车辆的自动驾驶功能是否失效或者不可靠。
在一些实施方式中,所述被测车辆的行驶状态包括以下至少一种:所述被测车辆的运动参数、所述被测车辆对所述交通场景的观测信息、所述被测车辆的控制信息、所述被测车辆与所述交通场景中其余物体的相对运动关系,当然也不限于此。当所述被测车辆的一种或多种行驶状态不满足预设的行驶状态条件时,控制所述无人机调整行驶状态,以使所述目标物的模型远离所述被测车辆。
示例性的,所述获取所述交通场景中被测车辆的行驶状态,包括:获取所述被测车辆搭载的环境传感器采集的对所述交通场景的观测信息,和/或获取所述被测车辆的自动驾驶模块确定的所述被测车辆的控制信息。
举例而言,所述被测车辆的运动参数包括以下至少一种:速度、加速度、位置,位置例如为所在的车道。
举例而言,所述被测车辆可以通过环境传感器,如摄像头、毫米波雷达、激光雷达中的一种或多种获取所述交通场景的观测信息,所述观测信息包括以 下至少一种:目标跟踪信息、车道线识别信息、可行驶区域信息、交通流信息。
举例而言,所述被测车辆可以通过自动驾驶模块确定所述被测车辆的控制信息,所述控制信息例如包括以下至少一种:轨迹规划信息、行为解释信息、诊断信息、制动信号、转向信号、加速信号、人机交互警示信息。
举例而言,所述被测车辆与所述交通场景中其余物体的相对运动关系,包括以下至少一种:相对位置关系、相对速度、相对加速度,当然也不限于此。需要说明的是所述相对运动关系可以是所述被测车辆获取的,或者是所述无人机获取的,或者是所述路侧设备获取的,当然也可以是根据所述被测车辆的行驶状态和所述无人机的飞行状态确定的,例如根据被测车辆的位置和无人机的位置确定被测车辆与所述无人机的相对运动关系。
在一些实施方式中,所述被测车辆的行驶状态是否满足所述行驶状态条件,是根据所述被测车辆与所述目标物的模型的相对位置关系确定的。所述被测车辆与所述目标物的模型的相对位置关系可以根据所述被测车辆与所述无人机的相对位置关系确定,例如可以将所述被测车辆与所述无人机的相对位置关系确定为所述被测车辆与所述目标物的模型的相对位置关系。
示例性的，当所述被测车辆与所述目标物的模型之间的相对距离小于或等于预设值，如1米时，确定所述被测车辆的行驶状态不满足预设的行驶状态条件，可以控制所述无人机执行预设的避让任务，以使所述目标物的模型避让所述被测车辆。可以在被测车辆与所述目标物的模型之间的距离小于安全距离时及时控制所述无人机执行预设的避让任务，以使所述目标物的模型远离被测车辆，防止发生碰撞。
示例性的,当所述被测车辆与所述目标物的模型之间的相对距离小于或等于预设值,且所述被测车辆与所述目标物的模型已经或即将并线运动时,确定所述被测车辆的行驶状态不满足预设的行驶状态条件,可以控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆。可以更精准的防止发生碰撞。
在另一些实施方式中,所述被测车辆的行驶状态是否满足所述行驶状态条件,是根据预设类型的评价指标确定的,所述评价指标是根据所述被测车辆的行驶状态确定的。例如可以根据所述被测车辆的一种或多种行驶状态确定预设 的评价指标的值,根据所述评价指标的值确定所述行驶状态是否满足所述行驶状态条件,例如所述评价指标的值超出预设范围时,确定所述被测车辆的行驶状态不满足预设的行驶状态条件,可以控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆。
示例性的,所述评价指标是根据所述被测车辆与所述目标物的模型的相对位置关系和相对速度确定的。举例而言,所述预设类型的评价指标包括:根据所述被测车辆与所述目标物的模型的相对位置关系和相对速度确定的碰撞时间和/或碰撞临界减速度。
其中,所述碰撞时间(Time to Collision,TTC)可以根据以下时间确定:所述被测车辆与所述目标物的模型维持所述相对速度时,从所述相对位置关系起至发生碰撞的时间,举例而言,所述碰撞时间可以根据所述被测车辆与所述目标物的模型的相对距离与相对速度的比值确定。当前时刻下,后车速度大于前车,若两车保持原有的速度和行驶轨迹不变(即假定驾驶人或自动驾驶系统不采取避险行为),根据当前速度和轨迹,将会在某个时刻发生碰撞,那么从当前时刻至碰撞发生的时间段就为碰撞时间。碰撞时间越小,发生事故的可能性就越大。示例性的,当所述碰撞时间小于或等于预设的时间阈值时,确定所述被测车辆的行驶状态不满足预设的行驶状态条件。
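The collision time (TTC) defined in the preceding paragraph — the time from the current relative position until collision if the relative speed is maintained, computed from the ratio of relative distance to relative speed — can be written as a small helper. The function name and the convention of reporting an infinite TTC for a non-closing gap are illustrative assumptions.

```python
def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """TTC = relative distance / relative (closing) speed.

    closing_speed_mps is positive when the following vehicle is faster
    than the leader, i.e. the gap is shrinking. A gap that is not
    closing never leads to a collision, so TTC is reported as infinity;
    the smaller the TTC, the higher the accident risk.
    """
    if closing_speed_mps <= 0.0:
        return float("inf")
    return gap_m / closing_speed_mps

# A 30 m gap closing at 10 m/s leaves 3 s before impact.
```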
其中,所述碰撞临界减速度(Deceleration Rate to Avoid a Crash,DRAC)根据以下减速度确定:所述被测车辆与所述目标物的模型刚好避免碰撞时,所述被测车辆所需的减速度或所述目标物的模型所需的减速度,举例而言,所述碰撞临界减速度根据所述被测车辆与所述目标物的模型的相对速度的平方与相对距离的比值确定。跟驰间距较近的两辆车,若后车速度大于前车,后车为了不与前车追尾所需要的减速度,即为所述碰撞临界减速度,或者可以称为避免追尾碰撞的减速度,当后车的碰撞临界减速度超过车辆性能或者乘客能承受的最大减速度(maximum available decelerate,MADR)时,就有较大概率发生碰撞。示例性的,当所述碰撞临界减速度大于与或等于预设的减速度阈值时,确定所述被测车辆的行驶状态不满足预设的行驶状态条件。
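The collision-critical deceleration (DRAC) of the preceding paragraph is determined by the ratio of the square of the relative speed to the relative distance; a common formulation adds a factor of 1/2, i.e. DRAC = Δv² / (2·Δx). The sketch below uses that common form together with an illustrative MADR value — both are assumptions for illustration, not values fixed by the application.

```python
def drac(gap_m: float, closing_speed_mps: float) -> float:
    """Deceleration Rate to Avoid a Crash, common form: Δv² / (2·Δx).

    Zero when the gap is not closing, since no braking is needed then.
    """
    if closing_speed_mps <= 0.0:
        return 0.0
    return closing_speed_mps ** 2 / (2.0 * gap_m)

def state_violates_condition(gap_m: float, closing_speed_mps: float,
                             madr_mps2: float = 7.5) -> bool:
    """Per the description: when DRAC reaches or exceeds the maximum
    available deceleration (MADR), a collision is likely, so the driving
    state fails the preset condition. 7.5 m/s² is an illustrative MADR."""
    return drac(gap_m, closing_speed_mps) >= madr_mps2
```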
示例性的，所述预设类型的评价指标包括：所述被测车辆处于所述行驶状态时，所述被测车辆与所述目标物的模型碰撞的概率。示例性的，可以根据所述被测车辆的行驶状态，或者根据所述被测车辆的行驶状态和所述无人机的飞行状态确定碰撞的概率，例如基于机器学习模型确定所述碰撞的概率。举例而言，若所述被测车辆维持所述行驶状态时所述被测车辆与所述目标物的模型碰撞的概率大于或等于概率阈值，确定所述被测车辆的行驶状态不满足所述行驶状态条件。
在其他一些实施方式中,所述被测车辆的行驶状态是否满足所述行驶状态条件,是根据获取到的所述被测车辆的行驶状态和所述被测车辆的预期行驶状态确定的,所述预期行驶状态根据所述测试指令确定。
示例性的,所述无人机根据所述测试指令在交通场景中运动,给所述被测车辆创设测试场景,所述被测车辆的如果能够以所述测试指令对应的预期行驶状态行驶,则至少不会与所述目标物的模型发生碰撞。可以理解的,所述预期行驶状态为所述无人机根据所述测试指令在交通场景中运动时,所述被测车辆能够避免与所述目标物的模型碰撞的行驶状态。
在一些实施方式中,所述预期行驶状态例如可以根据用户的设置操作确定。当然也不限于此,例如可以通过对大量交通场景影像的分析统计确定。
示例性的,若步骤S120获取到的所述被测车辆的行驶状态和所述被测车辆的预期行驶状态不符,则可以确定所述被测车辆的行驶状态不满足预设的行驶状态条件,有发生碰撞的风险,可以控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆。
示例性的,若所述无人机按照所述测试指令在所述被测车辆所在车道的前方减速,所述被测车辆的预期行驶状态包括负的纵向加速度和/或非零的横向加速度。当无人机在被测车辆正前方减速时,如果被测车辆能够减速和/或变道,则可以避免与目标物的模型发生碰撞;但若被测车辆实际没有减速和变道,而是维持车速不变或者加速,则有发生碰撞的风险,则确定所述被测车辆的行驶状态不满足预设的行驶状态条件,可以控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆。可选的,还可以根据所述测试指令确定预期的加速度的范围,如果被测车辆实际加速度超出所述预期的加速度的范围,也可以确定所述被测车辆的行驶状态不满足预设的行驶状态条件。
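For the scenario just described — the UAV decelerates ahead in the lane of the vehicle under test — the expected driving state is a negative longitudinal acceleration (braking) and/or a non-zero lateral acceleration (lane change), optionally within an expected acceleration range derived from the test instruction. A sketch of that check follows; the numeric range and threshold are illustrative assumptions, not values from the application.

```python
def meets_expected_state(longitudinal_acc_mps2: float,
                         lateral_acc_mps2: float,
                         expected_long_range=(-8.0, -0.5),
                         lateral_threshold_mps2: float = 0.3) -> bool:
    """True when the vehicle under test reacts as expected: it brakes
    within the expected (negative) longitudinal-acceleration range, or
    it steers with a clearly non-zero lateral acceleration. A state
    meeting neither fails the preset condition, so the avoidance task
    should be triggered."""
    lo, hi = expected_long_range
    braking = lo <= longitudinal_acc_mps2 <= hi
    steering = abs(lateral_acc_mps2) > lateral_threshold_mps2
    return braking or steering
```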
示例性的,若所述无人机按照所述测试指令在所述被测车辆的前方减速, 所述被测车辆的预期行驶状态包括:所述被测车辆搭载的提示模块输出第一提示信息。所述无人机在被测车辆正前方减速时,如果被测车辆的提示模块输出第一提示信息,可以提醒驾驶员控制被测车辆减速和/或变道,避免与目标物的模型发生碰撞。如果被测车辆的提示模块未输出所述第一提示信息,则有发生碰撞的风险,则确定所述被测车辆的行驶状态不满足预设的行驶状态条件。可以理解的,在一些实施方式中,所述被测车辆的行驶状态还可以包括所述被测车辆输出的提示信息。
示例性的,若所述无人机按照所述测试指令在所述被测车辆的左前方或右前方驶入所述被测车辆当前的车道,所述被测车辆的预期行驶状态包括:负的纵向加速度和/或非零的横向加速度。当无人机在被测车辆左前方或右前方与所述被测车辆较近的位置并线时,如果被测车辆能够减速和/或变道,则可以避免与目标物的模型发生碰撞;但若被测车辆实际没有减速和变道,而是维持车速不变或者加速,则有发生碰撞的风险,则确定所述被测车辆的行驶状态不满足预设的行驶状态条件,可以控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆。可选的,还可以根据所述测试指令确定预期的加速度的范围,如果被测车辆实际加速度超出所述预期的加速度的范围,也可以确定所述被测车辆的行驶状态不满足预设的行驶状态条件。
在一些实施方式中,可以根据所述被测车辆对所述交通场景的观测信息确定所述被测车辆后续的动作趋势,以及根据所述测试指令对应的无人机的飞行状态和所述车辆后续的动作趋势确定是否有发生碰撞的风险,以及确定所述被测车辆的行驶状态是否满足预设的行驶状态条件。
在一些实施方式中,可以根据所述测试指令确定所述被测车辆对所述交通场景的观测信息的预期值,当所述被测车辆实际的观测值与所述预期值相同或接近时,可以避免与所述目标物的模型发生碰撞,如果被测车辆实际的观测值与所述预期值不同,则有发生碰撞的概率,可以确定所述被测车辆的行驶状态不满足预设的行驶状态条件。
示例性的,若所述无人机按照所述测试指令在所述被测车辆所在车道的前方减速,所述被测车辆的预期行驶状态包括以下至少一种:所述被测车辆观测的所述目标物的模型的加速度为负、所述被测车辆观测的所述目标物的模型与 所述被测车辆的相对距离减小。
在一些实施方式中,可以根据所述被测车辆的自动驾驶模块输出的控制信息确定所述被测车辆后续的动作趋势,以及根据所述测试指令对应的目标物的模型的行驶状态和所述车辆后续的动作趋势确定是否有发生碰撞的风险,以及确定所述被测车辆的行驶状态是否满足预设的行驶状态条件。
在一些实施方式中,可以根据所述测试指令确定所述被测车辆的控制信息的预期值,当所述被测车辆实际的控制信息与所述预期值相同或接近时,可以避免与所述目标物的模型发生碰撞,如果被测车辆实际的控制信息与所述预期值不同,则有发生碰撞的概率,可以确定所述被测车辆的行驶状态不满足预设的行驶状态条件。
示例性的,若所述无人机按照所述测试指令在所述被测车辆所在车道的前方减速,所述被测车辆的预期行驶状态包括:用于控制所述被测车辆减速和/或变道的控制信息。
可选的,所述被测车辆的行驶状态包括所述被测车辆的执行系统的状态;所述被测车辆的执行系统包括以下至少一种:制动系统、转向系统、驱动系统。根据所述被测车辆的执行系统的状态可以确定所述被测车辆的行驶轨迹,根据所述行驶轨迹确定是否会与所述目标物的模型碰撞,若确定会与所述目标物的模型碰撞,则可以确定所述被测车辆的行驶状态不满足预设的行驶状态条件。
示例性的,若所述无人机按照所述测试指令在所述被测车辆所在车道的前方减速,所述被测车辆的预期行驶状态包括以下至少一种:所述制动系统制动、所述转向系统转向、所述驱动系统减油门。
示例性的,若所述无人机按照所述测试指令在所述被测车辆的左前方或右前方驶入所述被测车辆当前的车道,所述被测车辆的预期行驶状态包括以下至少一种:所述制动系统制动、所述转向系统转向、所述驱动系统减油门。
举例而言，当无人机在被测车辆正前方减速或者在所述被测车辆的左前方或右前方驶入所述被测车辆当前的车道时，如果被测车辆实际的执行系统的状态与所述测试指令对应的执行系统状态的预期值相同，使被测车辆减速或者减小加速度或者变道，则可以避免与目标物的模型发生碰撞；但如果被测车辆实际的执行系统的状态与状态的预期值不同，使被测车辆实际没有减速和变道，而是维持车速不变或者加速，则有发生碰撞的风险，从而可以确定所述被测车辆的行驶状态不满足预设的行驶状态条件，可以控制所述无人机执行预设的避让任务，以使所述目标物的模型避让所述被测车辆。
在其他一些实施方式中,所述被测车辆的行驶状态是否满足所述行驶状态条件,是根据所述被测车辆对所述交通场景的观测信息与所述无人机对所述交通场景的观测信息是否一致确定的。示例性的,若所述被测车辆对所述交通场景的观测信息与所述无人机对所述交通场景的观测信息不一致,则有发生碰撞的风险,从而可以确定所述被测车辆的行驶状态不满足预设的行驶状态条件,可以控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆;若所述被测车辆对所述交通场景的观测信息与所述无人机对所述交通场景的观测信息一致,则可以确定被测车辆的观测信息也是准确的,可以根据观测信息作出准确的决策和控制,发生碰撞的可能性较低。举例而言,若所述无人机按照所述测试指令在所述被测车辆所在车道的前方行驶时,观测到前方不远处有红绿灯显示红灯或者限速标识,而被测车辆没有观测到红灯或者限速标识,则无人机和目标物的模型减速刹车而被测车辆未减速刹车,则有发生碰撞的风险,从而可以确定所述被测车辆的行驶状态不满足预设的行驶状态条件。
本申请实施例提供的自动驾驶测试方法,通过无人机搭载目标物的模型,目标物的模型随所述无人机在所述交通场景中运动模拟交通参与者,给被测车辆创设测试场景,由无人机搭载的目标物的模型机动性、灵敏度更好,响应快,精度高,地面行驶的被测车辆不易与无人机碰撞或者碰撞的损失较轻,因此可以给被测车辆创设更丰富的测试场景,例如可以进行更高速的测试,而且更安全。
本申请实施例提供的自动驾驶测试方法,通过各种逃逸方式设计,可以避免遭受碰撞,或碰撞后几乎没有损失,可以进行多组危险场景连续模拟测试,具有效率高、可连续场景模拟、安全等优点。
可选的,目标物的模型由少量软板或薄膜充气而成,具有低成本、易拼接等优点。
可选的,被测车辆可以与可移动目标物实时相互通信,从而可移动目标物可以根据被测车辆的行驶状态可以更加精确的模拟真实场景。
可选的,对于复杂交通流测试场景,可以不需要结合其他实体车和昂贵辅助驾驶设备来完成,可以安全、效率高的完成多车协同场景。
可选的,可以完成行人、机动车、非机动车都存在的复杂测试场景,行人、机动车、非机动车之间可以实时通信、感知,模拟真实复杂街区路况。
可选的,不仅可以提前设置关键参数,运动路径,还能满足测试中的感知通信功能,实时变化自调整,多车协同变化,互相参考,互相调整。
可选的,能模拟复杂的环境工况,如降雨,浓雾,灰尘、光照等情况,在测试工作中可以验证恶劣环境下的预期功能安全。
可选的,基于无人机的测试系统机动性灵敏度相对高,无人机的反应机动性优于地面移动平台,最大减速度,速度足够(极限危险工况可以模拟)。
请结合上述实施例参阅图11,图11是本申请另一实施例提供的一种基于无人飞行器的车辆测试方法,即自动驾驶测试方法的流程示意图。
所述自动驾驶测试方法用于无人机,所述无人机能够搭载目标物的模型。所述测试方法包括步骤S210至步骤S220。
S210、接收测试指令。
S220、根据所述测试指令在交通场景中运动,以使所述目标物的模型随所述无人机在所述交通场景中运动。
本申请实施例提供的自动驾驶测试方法的具体原理和实现方式均与前述实施例的自动驾驶测试方法类似,此处不再赘述。
请结合上述实施例参阅图12,图12是本申请又一实施例提供的一种基于无人飞行器的车辆测试方法,即自动驾驶测试方法的流程示意图。
所述自动驾驶测试方法用于被测车辆,所述测试方法包括步骤S310至步骤S330。
S310、基于对交通场景中目标物的模型的观测数据和预设的自动驾驶算法在交通场景中自主运动,所述交通场景中包括无人机和所述无人机搭载的目标物的模型,所述目标物的模型随所述无人机在所述交通场景中运动;
S320、获取所述被测车辆的行驶状态;
S330、将所述被测车辆的行驶状态发送给自动驾驶测试装置,以使所述自动驾驶测试装置根据所述被测车辆的行驶状态,生成所述被测车辆的测试结果。
本申请实施例提供的自动驾驶测试方法的具体原理和实现方式均与前述实施例的自动驾驶测试方法类似,此处不再赘述。
请结合上述实施例参阅图13,图13是本申请实施例提供的自动驾驶测试装置500的示意性框图。
其中,自动驾驶测试装置500可以设置在无人机上,当然也可以设置在被测车辆上,或者也可以设置在路侧设备(Road Side Unit,RSU)或者终端设备上。自动驾驶测试装置设置在无人机上时,能够更快更实时更准确的控制无人机调整轨迹。
该自动驾驶测试装置500包括一个或多个处理器501,一个或多个处理器501单独地或共同地工作,用于执行前述的自动驾驶测试方法。
示例性的,自动驾驶测试装置500还包括存储器502。
示例性的,处理器501和存储器502通过总线503连接,该总线503比如为I2C(Inter-integrated Circuit)总线。
具体地,处理器501可以是微控制单元(Micro-controller Unit,MCU)、中央处理单元(Central Processing Unit,CPU)或数字信号处理器(Digital Signal Processor,DSP)等。
具体地,存储器502可以是Flash芯片、只读存储器(ROM,Read-Only Memory)磁盘、光盘、U盘或移动硬盘等。
其中,所述处理器501用于运行存储在存储器502中的计算机程序,并在执行所述计算机程序时实现前述的自动驾驶测试方法。
本申请实施例提供的自动驾驶测试装置500的具体原理和实现方式均与前述实施例的自动驾驶测试方法类似,此处不再赘述。
请结合上述实施例参阅图14,图14是本申请实施例提供的无人飞行器,即无人机600的示意性框图。
无人机600包括飞行平台610,飞行平台610用于飞行。该无人机600还包括一个或多个处理器601,一个或多个处理器601单独地或共同地工作,用于执行前述的自动驾驶测试方法的步骤。
本申请实施例提供的无人机600的具体原理和实现方式均与前述实施例的自动驾驶测试方法类似,此处不再赘述。
请结合上述实施例参阅图15,图15是本申请实施例提供的可移动目标物700的示意性框图。
可移动目标物700包括前述的无人机600,以及目标物的模型710,目标物的模型710能够连接在所述无人机上,随所述无人机在交通场景中运动。
本申请实施例提供的可移动目标物700的具体原理和实现方式均与前述实施例的无人机600类似,此处不再赘述。
请结合上述实施例参阅图16,图16是本申请实施例提供的车辆800的示意性框图。
车辆800包括车辆平台810,以及一个或多个处理器801,一个或多个处理器801单独地或共同地工作,用于执行前述的自动驾驶测试方法的步骤。
本申请实施例提供的车辆800的具体原理和实现方式均与前述实施例的自动驾驶测试方法类似,此处不再赘述。
请结合上述实施例参阅图17,图17是本申请实施例提供的基于无人飞行器的车辆测试系统,或者称为自动驾驶测试系统900的示意性框图。
自动驾驶测试系统900,包括:
无人机910;
目标物的模型920,能够搭载于所述无人机910,随所述无人机910在交通场景中运动;
被测车辆930,所述被测车辆930能够基于对所述目标物的模型的观测数据和预设的自动驾驶算法在所述交通场景中自主运动;
前述的自动驾驶测试装置500。
在一些实施方式中,自动驾驶测试装置500可以设置在无人机910上。
本申请实施例提供的自动驾驶测试系统900的具体原理和实现方式均与前述实施例的自动驾驶测试方法类似,此处不再赘述。
本申请实施例还提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序被处理器执行时使所述处理器实现上述实施例提供的自动驾驶测试方法的步骤。
应当理解,在此本申请中所使用的术语仅仅是出于描述特定实施例的目的而并不意在限制本申请。
还应当理解,在本申请和所附权利要求书中使用的术语“和/或”是指相关联列出的项中的一个或多个的任何组合以及所有可能组合,并且包括这些组合。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到各种等效的修改或替换,这些修改或替换都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以权利要求的保护范围为准。

Claims (76)

  1. 一种自动驾驶测试方法,其特征在于,所述测试方法包括:
    将预设的测试指令发送给无人机,所述测试指令用于指示所述无人机在交通场景中运动,其中,所述无人机搭载目标物的模型,所述目标物的模型随所述无人机在所述交通场景中运动;
    获取所述交通场景中被测车辆的行驶状态,其中,所述被测车辆能够基于对所述目标物的模型的观测数据和预设的自动驾驶算法在所述交通场景中自主运动;
    根据所述被测车辆的行驶状态,生成所述被测车辆的测试结果。
  2. 根据权利要求1所述的自动驾驶测试方法,其特征在于,所述目标物的模型包括以下一种或者多种:柔性板、柔性膜、能够充气的气囊。
  3. 根据权利要求2所述的自动驾驶测试方法,其特征在于,所述气囊搭载在所述无人机上时,能够充气或放气。
  4. 根据权利要求3所述的自动驾驶测试方法,其特征在于,所述气囊由以下一种或者多种方式充气:由所述无人机飞行时桨叶产生的气流充气、由所述气囊上的充气装置充气、由所述气囊连接的外部充气装置充气。
  5. 根据权利要求1所述的自动驾驶测试方法,其特征在于,所述目标物的模型的最大重量与所述无人机的桨盘覆盖的面积正相关。
  6. 根据权利要求1所述的自动驾驶测试方法,其特征在于,所述目标物的模型的外表形状与交通参与者类似,所述交通参与者包括以下一种或者多种:机动车、非机动车、行人、动物。
  7. 根据权利要求1所述的自动驾驶测试方法,其特征在于,所述目标物的模型和/或所述无人机的全部或者部分外表面的材料根据所述被测车辆的雷达类型设置。
  8. 根据权利要求7所述的自动驾驶测试方法,其特征在于,所述外表面被设置为针对所述雷达的探测信号有以下一种或者多种:漫反射特性、折射特性、吸收特性。
  9. 根据权利要求8所述的自动驾驶测试方法,其特征在于,所述雷达的探 测信号为以下任一种:激光、毫米波、超声波。
  10. 根据权利要求1所述的自动驾驶测试方法,其特征在于,所述无人机和所述目标物的模型可拆卸的连接。
  11. 根据权利要求1所述的自动驾驶测试方法,其特征在于,所述无人机和所述目标物的模型可转动连接。
  12. 根据权利要求1所述的自动驾驶测试方法,其特征在于,所述目标物的模型的全部或者部分与所述无人机的连接能够在预设状态下自动断开。
  13. 根据权利要求12所述的自动驾驶测试方法,其特征在于,所述预设状态包括:所述目标物的模型与所述无人机之间的拉力大于预设阈值;和/或,所述目标物的模型与所述无人机之间的拉力方向处于预设方向。
  14. 根据权利要求12所述的自动驾驶测试方法,其特征在于,所述目标物的模型与所述无人机的连接能够在预设状态下自动断开,是基于所述目标物的模型与所述无人机之间的连接件的至少两个部件断开连接实现的。
  15. 根据权利要求12所述的自动驾驶测试方法,其特征在于,所述目标物的模型与所述无人机基于连接件进行连接,所述连接件响应于进入所述预设状态的触发指令,断开所述目标物的模型与所述无人机的所述连接。
  16. 根据权利要求15所述的自动驾驶测试方法,其特征在于,所述触发指令由传感器感测所述目标物的模型与所述无人机之间的拉力生成,所述传感器设置在所述目标物的模型和/或所述无人机上。
  17. 根据权利要求1所述的自动驾驶测试方法,其特征在于,所述目标物的模型包括多个部件,所述多个部件可拆卸的连接。
  18. 根据权利要求17所述的自动驾驶测试方法,其特征在于,所述多个部件中的至少两个部件连接不同的无人机。
  19. 根据权利要求1-18中任一项所述的自动驾驶测试方法,其特征在于,所述方法还包括:
    当所述被测车辆的行驶状态不满足预设的行驶状态条件时,控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆。
  20. 根据权利要求19所述的自动驾驶测试方法,其特征在于,所述控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆,包 括:
    控制所述无人机在水平方向和/或竖直方向上调整运动状态,以使所述目标物的模型避让所述被测车辆。
  21. 根据权利要求20所述的自动驾驶测试方法,其特征在于,所述无人机在竖直方向上调整运动状态时的加速度,根据所述目标物的模型的重量确定。
  22. 根据权利要求19所述的自动驾驶测试方法,其特征在于,所述控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆,包括:
    控制所述无人机调整所述目标物的模型的位姿,以使所述目标物的模型的底端远离地面。
  23. 根据权利要求19所述的自动驾驶测试方法,其特征在于,所述控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆,包括:
    控制所述无人机执行预设的避让任务,以使所述目标物的模型的可拆卸连接的多个部件之间互相分离。
  24. 根据权利要求23所述的自动驾驶测试方法,其特征在于,所述多个部件之间互相分离后,位于所述被测车辆的行驶路径上的部件数量减少。
  25. 根据权利要求23所述的自动驾驶测试方法,其特征在于,所述多个部件,在所述无人机执行所述避让任务时无人机桨叶的下压风场的作用下,和/或在所述无人机执行所述避让任务时机械结构的驱动作用下互相分离。
  26. 根据权利要求23所述的自动驾驶测试方法,其特征在于,所述控制所述无人机执行预设的避让任务,以使所述目标物的模型的可拆卸连接的多个部件之间互相分离,包括:
    控制多个无人机向不同方向远离所述被测车辆,以使所述多个无人机带着所述目标物的模型的多个部件之间互相分离且向不同方向远离所述被测车辆。
  27. 根据权利要求23-26中任一项所述的自动驾驶测试方法,其特征在于,所述多个部件中至少两个相邻的部件通过以下一种或多种方式连接:卡接扣合、过盈配合、磁性吸合、黏性连接。
  28. 根据权利要求1-18中任一项所述的自动驾驶测试方法,其特征在于, 所述方法还包括:
    当所述被测车辆的行驶状态不满足预设的行驶状态条件时,断开所述目标物的模型的全部或者部分与所述无人机的连接。
  29. 根据权利要求1-28中任一项所述的自动驾驶测试方法,其特征在于,所述方法还包括:
    控制所述无人机搭载的环境工况模拟装置运行,以使所述环境工况模拟装置模拟以下一种或多种环境:降雨,浓雾,灰尘、光照。
  30. 根据权利要求1-28中任一项所述的自动驾驶测试方法,其特征在于,所述测试指令用于指示无人机在交通场景中以预设的测试轨迹运动。
  31. 根据权利要求1-28中任一项所述的自动驾驶测试方法,其特征在于,所述根据所述被测车辆的行驶状态,生成所述被测车辆的测试结果,包括:
    根据所述无人机的运动状态和所述被测车辆的行驶状态,生成所述被测车辆的测试结果。
  32. 根据权利要求1-28中任一项所述的自动驾驶测试方法,其特征在于,所述被测车辆的测试结果包括对所述被测车辆的所述行驶状态进行预设处理得到的数据集。
  33. 根据权利要求19-28中任一项所述的自动驾驶测试方法,其特征在于,所述被测车辆的行驶状态是否满足所述行驶状态条件,是根据一个采样周期得到的所述被测车辆的行驶状态确定的,或者是根据多个采样周期得到的所述被测车辆的行驶状态的变化趋势确定的。
  34. 根据权利要求33所述的自动驾驶测试方法,其特征在于,所述被测车辆的行驶状态包括以下至少一种:
    所述被测车辆的运动参数、所述被测车辆对所述交通场景的观测信息、所述被测车辆的控制信息、所述被测车辆与所述交通场景中其余物体的相对运动关系。
  35. 根据权利要求33所述的自动驾驶测试方法,其特征在于,所述被测车辆的行驶状态是否满足所述行驶状态条件,是根据所述被测车辆与所述目标物的模型的相对位置关系确定的。
  36. 根据权利要求33所述的自动驾驶测试方法,其特征在于,所述被测车 辆的行驶状态是否满足所述行驶状态条件,是根据预设类型的评价指标确定的,所述评价指标是根据所述被测车辆的行驶状态确定的。
  37. 根据权利要求33所述的自动驾驶测试方法,其特征在于,所述被测车辆的行驶状态是否满足所述行驶状态条件,是根据获取到的所述被测车辆的行驶状态和所述被测车辆的预期行驶状态确定的,所述预期行驶状态根据所述测试指令确定。
  38. 根据权利要求33所述的自动驾驶测试方法,其特征在于,所述被测车辆的行驶状态是否满足所述行驶状态条件,是根据所述被测车辆对所述交通场景的观测信息与所述目标物的模型对所述交通场景的观测信息是否一致确定的。
  39. 一种自动驾驶测试方法,其特征在于,用于无人机,所述无人机能够搭载目标物的模型,所述测试方法包括:
    接收测试指令;
    根据所述测试指令在交通场景中运动,以使所述目标物的模型随所述无人机在所述交通场景中运动。
  40. 根据权利要求39所述的自动驾驶测试方法,其特征在于,所述无人机和所述目标物的模型可拆卸的连接。
  41. 根据权利要求39所述的自动驾驶测试方法,其特征在于,所述无人机和所述目标物的模型可转动连接。
  42. 根据权利要求39所述的自动驾驶测试方法,其特征在于,所述目标物的模型的全部或者部分与所述无人机的连接能够在预设状态下自动断开。
  43. 根据权利要求39所述的自动驾驶测试方法,其特征在于,所述目标物的模型包括多个部件,所述多个部件可拆卸的连接。
  44. 根据权利要求43所述的自动驾驶测试方法,其特征在于,所述多个部件中的至少两个部件连接不同的无人机。
  45. 根据权利要求39-44中任一项所述的自动驾驶测试方法,其特征在于,所述方法还包括:
    当所述交通场景中的被测车辆的行驶状态不满足预设的行驶状态条件时,执行预设的避让任务,以使所述目标物的模型避让所述被测车辆。
  46. 根据权利要求45所述的自动驾驶测试方法,其特征在于,所述执行预 设的避让任务,以使所述目标物的模型避让所述被测车辆,包括:
    在水平方向和/或竖直方向上调整运动状态,以使所述目标物的模型避让所述被测车辆。
  47. 根据权利要求45所述的自动驾驶测试方法,其特征在于,所述执行预设的避让任务,以使所述目标物的模型避让所述被测车辆,包括:
    调整所述目标物的模型的位姿,以使所述目标物的模型的底端远离地面。
  48. 根据权利要求45所述的自动驾驶测试方法,其特征在于,所述执行预设的避让任务,以使所述目标物的模型避让所述被测车辆,包括:
    执行预设的避让任务,以使所述目标物的模型的可拆卸连接的多个部件之间互相分离。
  49. 根据权利要求39-44中任一项所述的自动驾驶测试方法,其特征在于,所述方法还包括:
    当所述交通场景中的被测车辆的行驶状态不满足预设的行驶状态条件时,断开与所述目标物的模型的全部或者部分的连接。
  50. 一种自动驾驶测试方法,其特征在于,用于被测车辆,所述测试方法包括:
    基于对交通场景中目标物的模型的观测数据和预设的自动驾驶算法在交通场景中自主运动,所述交通场景中包括无人机和所述无人机搭载的目标物的模型,所述目标物的模型随所述无人机在所述交通场景中运动;
    获取所述被测车辆的行驶状态;
    将所述被测车辆的行驶状态发送给自动驾驶测试装置,以使所述自动驾驶测试装置根据所述被测车辆的行驶状态,生成所述被测车辆的测试结果。
  51. 根据权利要求50所述的自动驾驶测试方法,其特征在于,所述被测车辆的行驶状态还用于:所述自动驾驶测试装置确定所述被测车辆的行驶状态不满足预设的行驶状态条件时,控制所述无人机执行预设的避让任务,以使所述目标物的模型避让所述被测车辆。
  52. 根据权利要求50所述的自动驾驶测试方法,其特征在于,所述被测车辆的行驶状态还用于:所述自动驾驶测试装置确定所述被测车辆的行驶状态不满足预设的行驶状态条件时,断开所述目标物的模型的全部或者部分与所述无 人机的连接。
  53. An automatic driving test apparatus, comprising one or more processors, operating individually or jointly, configured to execute the steps of the automatic driving test method according to any one of claims 1-38.
  54. An unmanned aerial vehicle, comprising:
    a flight platform for flying;
    one or more processors, operating individually or jointly, configured to execute the steps of the automatic driving test method according to any one of claims 39-49.
  55. A movable target, comprising:
    the unmanned aerial vehicle according to claim 54;
    a model of a target, capable of being connected to the unmanned aerial vehicle and moving with the unmanned aerial vehicle in a traffic scene.
  56. The movable target according to claim 55, wherein the model of the target comprises one or more of the following: a flexible plate, a flexible film, an inflatable airbag.
  57. The movable target according to claim 56, wherein the airbag can be inflated or deflated while carried on the unmanned aerial vehicle.
  58. The movable target according to claim 57, wherein the airbag is inflated in one or more of the following ways: by the airflow generated by the blades of the unmanned aerial vehicle during flight, by an inflation device on the airbag, or by an external inflation device connected to the airbag.
  59. The movable target according to claim 55, wherein the maximum weight of the model of the target is positively correlated with the area covered by the rotor disc of the unmanned aerial vehicle.
  60. The movable target according to claim 55, wherein the outer shape of the model of the target resembles a traffic participant, the traffic participant comprising one or more of the following: a motor vehicle, a non-motor vehicle, a pedestrian, an animal.
  61. The movable target according to claim 55, wherein the material of all or part of the outer surface of the model of the target and/or the unmanned aerial vehicle is set according to the type of radar of the vehicle under test in the traffic scene.
  62. The movable target according to claim 61, wherein the outer surface is configured to have one or more of the following with respect to the detection signal of the radar: a diffuse reflection characteristic, a refraction characteristic, an absorption characteristic.
  63. The movable target according to claim 62, wherein the detection signal of the radar is any one of the following: laser, millimeter wave, ultrasonic wave.
  64. The movable target according to claim 55, wherein the model of the target is detachably connectable to the unmanned aerial vehicle.
  65. The movable target according to claim 55, wherein the model of the target is rotatably connectable to the unmanned aerial vehicle.
  66. The movable target according to claim 55, wherein the connection between all or part of the model of the target and the unmanned aerial vehicle can be automatically disconnected in a preset state.
  67. The movable target according to claim 66, wherein the preset state comprises: the tension between the model of the target and the unmanned aerial vehicle being greater than a preset threshold; and/or the direction of the tension between the model of the target and the unmanned aerial vehicle being in a preset direction.
  68. The movable target according to claim 66, wherein the automatic disconnection of the connection between the model of the target and the unmanned aerial vehicle in the preset state is realized by disconnecting at least two components of a connector between the model of the target and the unmanned aerial vehicle.
  69. The movable target according to claim 66, wherein the model of the target and the unmanned aerial vehicle are connected via a connector, and the connector disconnects the connection between the model of the target and the unmanned aerial vehicle in response to a trigger instruction indicating entry into the preset state.
  70. The movable target according to claim 69, wherein the trigger instruction is generated by a sensor sensing the tension between the model of the target and the unmanned aerial vehicle, the sensor being disposed on the model of the target and/or the unmanned aerial vehicle.
  71. The movable target according to claim 55, wherein the model of the target comprises a plurality of components, and the plurality of components are detachably connected.
  72. The movable target according to claim 71, wherein at least two of the plurality of components are connected to different unmanned aerial vehicles.
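For orientation only (not part of the claims), claims 66-70 describe a connector that releases automatically when the tension between the model and the unmanned aerial vehicle exceeds a threshold or pulls in a preset direction. A minimal Python sketch of that trigger decision follows; the function name, the 2-D force representation, and every numeric value are illustrative assumptions.

```python
import math

# Hypothetical release trigger per claim 67: release when the measured pull
# force exceeds a threshold, or when its direction falls within a tolerance
# of a preset direction (e.g. straight down at 270 degrees). All numbers
# are invented for illustration.
def should_release(force_vec, threshold=50.0,
                   preset_dir_deg=270.0, tolerance_deg=30.0):
    fx, fy = force_vec
    magnitude = math.hypot(fx, fy)
    if magnitude > threshold:                 # force too large
        return True
    angle = math.degrees(math.atan2(fy, fx)) % 360.0
    # smallest angular difference to the preset direction
    diff = abs((angle - preset_dir_deg + 180.0) % 360.0 - 180.0)
    return diff <= tolerance_deg              # pull in the preset direction
```

Per claim 70, such a predicate would run on tension readings from a sensor on the model and/or the unmanned aerial vehicle.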
  73. A vehicle, comprising:
    a vehicle platform;
    one or more processors, operating individually or jointly, configured to execute the steps of the automatic driving test method according to any one of claims 50-52.
  74. An automatic driving test system, comprising:
    an unmanned aerial vehicle;
    a model of a target, capable of being carried by the unmanned aerial vehicle and moving with the unmanned aerial vehicle in a traffic scene;
    a vehicle under test, capable of moving autonomously in the traffic scene based on observation data of the model of the target and a preset automatic driving algorithm;
    the automatic driving test apparatus according to claim 53.
  75. The automatic driving test system according to claim 74, wherein the automatic driving test apparatus is disposed on the unmanned aerial vehicle.
  76. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement:
    the automatic driving test method according to any one of claims 1-38; and/or
    the automatic driving test method according to any one of claims 39-49; and/or
    the automatic driving test method according to any one of claims 50-52.
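For orientation only (not part of the claims), the test system of claim 74 couples the UAV-carried target model, the vehicle under test, and the test apparatus into one loop: the model moves through the scene, the vehicle's driving state is polled, and a violation of the driving-state condition triggers the preset avoidance task. The sketch below shows one such cycle; the `drone`/`vehicle` interfaces, the `condition_ok` predicate, and the step budget are hypothetical stand-ins, not the claimed system.

```python
# Hypothetical single test cycle for the system of claim 74. The drone moves
# the target model, the vehicle under test reports its driving state, and a
# condition violation triggers avoidance (claim 45) and ends the cycle.
def run_test_cycle(drone, vehicle, condition_ok, max_steps=100):
    results = []
    for _ in range(max_steps):
        drone.step()                        # target model moves with the UAV
        state = vehicle.get_driving_state()
        if condition_ok(state):
            results.append(("ok", state))
        else:
            drone.avoid()                   # preset avoidance task
            results.append(("avoided", state))
            break
    return results                          # raw material for the test result
```

The returned log corresponds to the information from which the test apparatus of claim 53 would generate a test result.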
PCT/CN2021/096991 2021-05-28 2021-05-28 Vehicle testing method and system based on unmanned aerial vehicle WO2022246849A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/096991 WO2022246849A1 (zh) 2021-05-28 2021-05-28 Vehicle testing method and system based on unmanned aerial vehicle
CN202180087983.5A CN116783461A (zh) 2021-05-28 2021-05-28 Vehicle testing method and system based on unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/096991 WO2022246849A1 (zh) 2021-05-28 2021-05-28 Vehicle testing method and system based on unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
WO2022246849A1 true WO2022246849A1 (zh) 2022-12-01

Family

ID=84229466

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/096991 WO2022246849A1 (zh) 2021-05-28 2021-05-28 基于无人飞行器的车辆测试方法及系统

Country Status (2)

Country Link
CN (1) CN116783461A (zh)
WO (1) WO2022246849A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116183242A * 2023-02-22 2023-05-30 Jilin University Automatic driving test method and device based on simulated rainfall environment, and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102007035474A1 * 2007-07-26 2009-02-19 Audi Ag System and method for functional testing of a driver assistance system
EP2781904A1 * 2013-03-22 2014-09-24 Continental Safety Engineering International GmbH Test device for simulating driving situations
CN107218941A * 2017-05-25 2017-09-29 Baidu Online Network Technology (Beijing) Co., Ltd. Test method and apparatus applied to driverless vehicles
CN208012871U * 2018-03-20 2018-10-26 Zhejiang Hozon New Energy Automobile Co., Ltd. Auxiliary prop dummy device for driverless vehicle testing
CN208443587U * 2018-08-01 2019-01-29 SAIC Motor Corporation Limited Collision protection mechanism
CN110197036A * 2019-06-04 2019-09-03 Yundong (Shanghai) Automobile Technology Co., Ltd. Intelligent driving evaluation system and evaluation method
CN210912663U * 2019-11-22 2020-07-03 SAIC Motor Corporation Limited Dummy simulating a pedestrian

Also Published As

Publication number Publication date
CN116783461A (zh) 2023-09-19

Similar Documents

Publication Publication Date Title
US11574089B2 (en) Synthetic scenario generator based on attributes
US11568100B2 (en) Synthetic scenario simulator based on events
US11150660B1 (en) Scenario editor and simulator
CN114930303A (zh) Virtual environment scenarios and observers for autonomous machine applications
CN113767389A (zh) Simulating realistic test data from transformed real-world sensor data for autonomous machine applications
CN111919225A (zh) Training, testing, and validating autonomous machines using simulated environments
US20170158175A1 (en) Collision mitigated braking for autonomous vehicles
KR20210050925A (ko) Vehicle collision avoidance apparatus and method
WO2022246852A1 (zh) Automatic driving system test method based on aerial survey data, test system, and storage medium
US10522041B2 (en) Display device control method and display device
US11496707B1 (en) Fleet dashcam system for event-based scenario generation
CN104992587A (zh) Simulation system
WO2022246860A1 (zh) Method for testing performance of an automatic driving system
CN205050405U (zh) Simulation system
WO2023102911A1 (zh) Data collection method, data display method, data processing method, aircraft landing method, data display system, and storage medium
US20220269836A1 (en) Agent conversions in driving simulations
WO2020264276A1 (en) Synthetic scenario generator based on attributes
WO2022246849A1 (zh) Vehicle testing method and system based on unmanned aerial vehicle
WO2023010043A1 (en) Complementary control system for an autonomous vehicle
JP2024513666A (ja) Instantiating objects in a simulated environment based on log data
US20220266859A1 (en) Simulated agents based on driving log data
WO2022246853A1 (zh) Safety test method for a vehicle system and test vehicle
CN114787894A (zh) Perception error model
KR20230003143A (ko) Method and apparatus for a vehicle to pass through a boom barrier
US11932242B1 (en) Fleet dashcam system for autonomous vehicle operation

Legal Events

Date Code Title Description
121 EP: the epo has been informed by wipo that ep was designated in this application
  Ref document number: 21942420
  Country of ref document: EP
  Kind code of ref document: A1
WWE WIPO information: entry into national phase
  Ref document number: 202180087983.5
  Country of ref document: CN
NENP Non-entry into the national phase
  Ref country code: DE
122 EP: PCT application non-entry in European phase
  Ref document number: 21942420
  Country of ref document: EP
  Kind code of ref document: A1