WO2022236754A1 - A multi-driver-in-the-loop driving test platform - Google Patents

A multi-driver-in-the-loop driving test platform

Info

Publication number
WO2022236754A1
WO2022236754A1 (PCT/CN2021/093471, CN2021093471W)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driving
traffic
scene
driver
Prior art date
Application number
PCT/CN2021/093471
Other languages
English (en)
French (fr)
Inventor
陈虹
蔡硕
丁海涛
胡云峰
宫洵
林佳眉
陈启军
王祝萍
Original Assignee
吉林大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 吉林大学
Priority to US17/778,897 priority Critical patent/US20240104008A1/en
Priority to PCT/CN2021/093471 priority patent/WO2022236754A1/zh
Publication of WO2022236754A1 publication Critical patent/WO2022236754A1/zh

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3664Environments for testing or debugging software
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/005Handover processes
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B17/00Systems involving the use of models or simulators of said systems
    • G05B17/02Systems involving the use of models or simulators of said systems electric
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B23/00Testing or monitoring of control systems or parts thereof
    • G05B23/02Electric testing or monitoring
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0001Details of the control system
    • B60W2050/0019Control system elements or transfer functions
    • B60W2050/0028Mathematical models, e.g. for simulation
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/10Accelerator pedal position
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/12Brake pedal position
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/18Steering angle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2710/00Output or target parameters relating to a particular sub-units
    • B60W2710/20Steering systems

Definitions

  • the invention relates to the technical field of development and testing of an automatic driving system, in particular to a multi-driver-in-the-loop driving test platform.
  • the present invention provides the following scheme:
  • a multi-driver-in-the-loop driving test platform at least including a sensing simulation system, a vehicle dynamics simulation system, a driving simulator and a scene simulation system;
  • the sensor simulation system is used to obtain target-level sensing information, and send the target-level sensing information to the vehicle control system;
  • the driving simulator is used to provide a driving environment and a driving scene for a real driver, and output driving instructions according to the driving intention of the real driver, and then send the driving instructions to the vehicle control system;
  • the vehicle dynamics simulation system is used to determine the vehicle pose information according to the control signal output by the vehicle control system;
  • the scene simulation system is used to update the driving scene displayed in the driving simulator in real time according to the vehicle pose information.
  • the sensor simulation system includes a host vehicle sensor simulation subsystem and multiple traffic vehicle sensor simulation subsystems, and the host vehicle sensor simulation subsystem and the traffic vehicle sensor simulation subsystems both include a sensing generation host, Ethernet, an Ethernet-to-CAN module and a CAN bus; wherein, the sensing generation host and the Ethernet-to-CAN module are arranged in the cabinet;
  • the sensing generation host is used to transmit the target-level sensing information to the Ethernet-to-CAN module in the form of Ethernet signals through the Ethernet;
  • the Ethernet-to-CAN module is used to convert the Ethernet signal into a vehicle-mounted CAN signal, and send the vehicle-mounted CAN signal to a vehicle-mounted control system through the CAN bus.
  • the target-level sensing information is traffic target information collected by a virtual sensor;
  • the traffic target information includes traffic vehicle information, pedestrian information, lane line information and traffic light information;
  • the virtual sensors are respectively installed on the main vehicle and the traffic vehicles, and the virtual sensors include cameras and millimeter-wave radars.
  • the driving simulator includes a main vehicle driving simulator and multiple traffic vehicle driving simulators, and both the main vehicle driving simulator and the traffic vehicle driving simulators include a seat, a steering wheel, an accelerator pedal, a brake pedal, a steering motor and a steering controller;
  • the driving modes loaded by the vehicle control system include unmanned driving mode, man-machine co-driving mode and manual driving mode;
  • the steering controller is used to collect the accelerator pedal opening signal, the brake pedal opening signal and the steering wheel angle signal when the driving mode of the vehicle control system is the manual driving mode, and to transmit the accelerator pedal opening signal, the brake pedal opening signal and the steering wheel angle signal to the vehicle control system through the CAN bus;
  • the steering controller is also used to control the steering motor to work in the steering angle control mode when the driving mode of the vehicle control system is the unmanned driving mode;
  • the steering controller is also used to control the steering motor to work in a torque control mode when the driving mode of the vehicle control system is a man-machine co-driving mode.
  • the main vehicle driving simulator is a dual-station driving simulator
  • the traffic vehicle driving simulator is a single-station driving simulator.
  • the on-board control system is used to load different automatic driving algorithms according to different development and testing requirements;
  • the on-board control system is also used to determine the control signals according to the target-level sensing information, the driving instructions and the automatic driving algorithm.
  • the vehicle dynamics system includes a host vehicle dynamics subsystem and a traffic vehicle dynamics subsystem, and both the host vehicle dynamics subsystem and the traffic vehicle dynamics subsystem include a vehicle dynamics host;
  • the vehicle dynamics host is used to set different vehicle dynamics models according to different development requirements; the vehicle dynamics host is arranged in the cabinet;
  • the vehicle dynamics host computer is also used to simulate the vehicle motion process and determine vehicle pose information in real time according to the vehicle dynamics model and the control signal.
  • the scene simulation system includes a host vehicle scene simulation subsystem and a traffic vehicle scene simulation subsystem;
  • the vehicle pose information includes the host vehicle pose information and the traffic vehicle pose information;
  • the main vehicle scene simulation subsystem includes a U-shaped projection screen, a projector, a main vehicle fusion machine and a main vehicle scene generation host; wherein, the U-shaped projection screen and the projector are arranged in the main vehicle cab; the main vehicle fusion machine and the main vehicle scene generation host are arranged in the cabinet; the main vehicle scene generation host has a built-in main vehicle driving scene simulation model; the main vehicle scene generation host is used to determine the current main vehicle driving scene according to the main vehicle driving scene simulation model and the main vehicle pose information, and to send it to the main vehicle fusion machine; the main vehicle fusion machine is used to fuse the current main vehicle driving scene, and the fused main vehicle driving scene is projected onto the U-shaped projection screen by the projector;
  • the traffic vehicle scene simulation subsystem includes a triple-screen display and a traffic vehicle scene generation host; wherein, the triple-screen display is arranged in the traffic vehicle cab; the traffic vehicle scene generation host is arranged in the cabinet; the traffic vehicle scene generation host has a built-in traffic vehicle driving scene simulation model; the traffic vehicle scene generation host is used to determine the current traffic vehicle driving scene according to the traffic vehicle driving scene simulation model and the traffic vehicle pose information, and to send the current traffic vehicle driving scene to the triple-screen display for display.
  • the U-shaped projection screen is a 270° U-shaped projection screen.
  • both the driving scene simulation model of the main vehicle and the driving scene simulation model of the traffic vehicle are constructed according to static environment elements, dynamic traffic elements and meteorological environment elements.
  • the invention provides a multi-driver-in-the-loop driving test platform, which at least includes a sensing simulation system, a vehicle dynamics simulation system, a driving simulator and a scene simulation system; by combining virtual simulation technology with real drivers controlling driving simulators, the invention overcomes the shortcomings of relying on traditional closed-field tests and traditional open-road tests to refine the automatic driving algorithm, and ensures the safety of real drivers. Therefore, the present invention has the advantages of saving research and development costs and shortening the research and development cycle.
  • Fig. 1 is a structural block diagram of a multi-driver-in-the-loop driving test platform of the present invention
  • Fig. 2 is a connection diagram of four systems inside a multi-driver-in-the-loop driving test platform of the present invention
  • Fig. 3 is a schematic structural diagram of a device applying a multi-driver-in-the-loop driving test platform according to the present invention.
  • the purpose of the present invention is to provide a multi-driver-in-the-loop driving test platform that uses a simulation platform to develop and test the automatic driving algorithm, which can address the non-reproducibility of closed-field tests and open-road tests, ensure the safety of test personnel, save research and development costs, shorten the research and development cycle, and accelerate the commercialization of autonomous driving technology.
  • the embodiment of the present invention provides a multi-driver-in-the-loop driving test platform, which is applied in the technical field of automatic driving system development and testing, as shown in Figure 1, at least including a sensing simulation system, a vehicle dynamics simulation system, a driving simulator and a scene simulation system.
  • the sensor simulation system is used to acquire target-level sensing information and send the target-level sensing information to the vehicle control system;
  • the driving simulator is used to provide a driving environment and driving scenes for real drivers, output driving instructions according to the real driver's driving intention, and then send the driving instructions to the on-board control system;
  • the vehicle dynamics simulation system is used to simulate the vehicle motion process according to the control signal output by the on-board control system, and determine the vehicle pose information;
  • the scene simulation system is used to update the driving scene displayed in the driving simulator in real time according to the vehicle pose information.
  • the sensor simulation system includes a host vehicle sensor simulation subsystem and a plurality of traffic vehicle sensor simulation subsystems, and both the host vehicle sensor simulation subsystem and the traffic vehicle sensor simulation subsystems include a sensing generation host, Ethernet, an Ethernet-to-CAN module and a CAN bus; wherein, the sensing generation host and the Ethernet-to-CAN module are arranged in a cabinet.
  • the main vehicle sensor simulation subsystem and the traffic vehicle sensor simulation subsystem can share a sensor generation host.
  • the sensing generation host is used to transmit the target-level sensing information, in the form of an Ethernet signal, to the Ethernet-to-CAN module through the Ethernet; the Ethernet-to-CAN module is used to convert the Ethernet signal into a standard vehicle-mounted CAN signal and send the vehicle-mounted CAN signal to the vehicle control system through the CAN bus for use.
  • the target-level sensing information refers to traffic target information detected by virtual sensors such as cameras and millimeter-wave radars.
  • the traffic target information includes traffic vehicle information (speed, pose, acceleration, etc.), pedestrian information (speed, pose, acceleration, etc.), lane line information (type, color, curvature, cubic-polynomial fitting parameters of the lane line) and traffic light information (position, light color); among them, the virtual sensors are respectively installed on the main vehicle and the traffic vehicles.
  • the driving simulator includes a main vehicle driving simulator and multiple traffic vehicle driving simulators.
  • the real drivers can control the traffic vehicles through the driving simulators to provide complex driving scenarios for the main vehicle, so that the automatic driving algorithm of the main vehicle can be developed and tested.
  • one main vehicle driving simulator and multiple traffic vehicle driving simulators are set up to realize the technical effect of multiple real drivers driving vehicles in the same driving scene.
  • the main vehicle driving simulator is a dual-station driving simulator
  • the traffic vehicle driving simulator is a single-station driving simulator
  • both the main vehicle driving simulator and the traffic vehicle driving simulator include a seat, a steering wheel, an accelerator pedal, a brake pedal, a steering motor and a steering controller.
  • the steering motor includes a rotation angle control mode and a torque control mode.
  • the driving modes that can be loaded by the vehicle-mounted control system include unmanned driving mode, man-machine co-driving mode and manual driving mode.
  • the steering controller is used to collect the accelerator pedal opening signal, the brake pedal opening signal and the steering wheel angle signal when the driving mode of the vehicle control system is the manual driving mode, send the accelerator pedal opening signal, the brake pedal opening signal and the steering wheel angle signal to the CAN bus, and then transmit them through the CAN bus to the vehicle control system to control the operation of the vehicle.
  • the steering controller is also used to control the steering motor to work in the steering angle control mode when the driving mode of the vehicle-mounted control system is the unmanned driving mode.
  • the steering controller is also used to control the steering motor to work in a torque control mode when the driving mode of the vehicle control system is a man-machine co-driving mode.
  • Angle control mode: closed-loop control is performed with the steering angle as the control target; the command given to the steering controller is an angle command, and the steering motor always tracks the set angle. This angle control mode can be used in unmanned driving, steering the vehicle in place of a human driver turning the steering wheel.
  • Torque control mode: closed-loop control is performed with the motor output torque as the control target; the command given to the steering controller is a torque command, and the steering motor always tracks the set torque, with the steering wheel angle depending on the steering wheel load.
  • This torque control mode can be used in human-machine co-driving: during the driving process of a real driver, the controller in the on-board control system can intervene to correct the steering operation of the real driver, so human-machine co-driving algorithms can be developed and tested; the human-machine co-driving algorithm can also run in the steering controller.
  • the human-machine co-driving algorithm here has no specific content; it merely illustrates that the multi-driver-in-the-loop driving test platform can not only develop and test automatic driving algorithms, but also support the development and testing of human-machine co-driving algorithms.
  • the on-board control system is used to load different automatic driving algorithms according to different development and testing requirements; the on-board control system is also used to determine the control signals according to the target-level sensing information, the driving instructions and the automatic driving algorithm. Since the main vehicle and the traffic vehicles are each provided with an on-board control system, the control signals include main vehicle control signals and traffic vehicle control signals.
  • the vehicle dynamics system includes the vehicle dynamics subsystem of the main vehicle and the vehicle dynamics subsystem of the traffic vehicle; both the vehicle dynamics subsystem of the main vehicle and the vehicle dynamics subsystem of the traffic vehicle include a vehicle dynamics host, and the vehicle dynamics host is arranged in the cabinet.
  • the vehicle dynamics host has a built-in vehicle dynamics model, and the vehicle dynamics host is used to set different vehicle dynamics models according to different development requirements, that is, the vehicle dynamics model can be customized to meet the development and testing requirements of different automatic driving algorithms.
  • the vehicle dynamics host is also used to simulate the vehicle motion process according to the vehicle dynamics model and the control signal, and to determine the vehicle pose information in real time; the vehicle pose information includes the host vehicle pose information and the traffic vehicle pose information; among them, the main vehicle dynamics subsystem and the traffic vehicle dynamics subsystem can share one vehicle dynamics host, in which case the vehicle dynamics host has a built-in vehicle dynamics model of the main vehicle and a built-in vehicle dynamics model of the traffic vehicle.
  • the vehicle dynamics host is an industrial computer; the vehicle dynamics model is used to simulate the dynamic behavior of the real vehicles (main vehicle and traffic vehicles) while driving, and it can be built with commercial software (such as CarSim or CarRealTime) or designed by the user and implemented with programming software (such as MATLAB or Visual Studio). Downloading the vehicle dynamics model to the industrial computer ensures real-time operation of the vehicle dynamics.
  • the scene simulation system provides a realistic driving scene for real drivers; the scene simulation system includes the main vehicle scene simulation subsystem and the traffic vehicle scene simulation subsystem.
  • the main vehicle scene simulation subsystem includes a U-shaped projection screen, a projector, a main vehicle fusion machine and a main vehicle scene generation host; wherein, the U-shaped projection screen and the projector are arranged in the main vehicle cab; the main vehicle fusion machine and the main vehicle scene generation host are arranged in the cabinet; the main vehicle scene generation host has a built-in main vehicle driving scene simulation model; the main vehicle scene generation host is used to determine the current main vehicle driving scene according to the main vehicle driving scene simulation model and the main vehicle pose information, and to send it to the main vehicle fusion machine; the main vehicle fusion machine is used to fuse the current main vehicle driving scene, and the fused main vehicle driving scene is projected onto the U-shaped projection screen by the projector, so as to provide a realistic driving view for the real driver of the main vehicle.
  • the U-shaped projection screen is a 270° U-shaped projection screen, which provides a wider driving field of view for the real driver of the main vehicle and gives the real driver of the main vehicle a stronger sense of immersion.
  • the real driver of the main car interacts with the main car through the main car driving simulator to generate a driving scene that is closer to the real traffic flow.
  • the traffic vehicle scene simulation subsystem includes a triple-screen display and a traffic vehicle scene generation host; wherein, the triple-screen display is arranged in the traffic vehicle cab; the traffic vehicle scene generation host is arranged in the cabinet; the traffic vehicle scene generation host has a built-in traffic vehicle driving scene simulation model; the traffic vehicle scene generation host is used to determine the current traffic vehicle driving scene according to the traffic vehicle driving scene simulation model and the traffic vehicle pose information, and to send the current traffic vehicle driving scene to the triple-screen display for display, so as to provide a realistic driving view for the real driver of the traffic vehicle.
  • the real driver of the traffic car can interact with the main car in real time through the driving view of the traffic car.
  • Both the driving scene simulation model of the main vehicle and the driving scene simulation model of the traffic vehicle are used to simulate the driving scene during the driving of the vehicle.
  • Both the main vehicle driving scene simulation model and the traffic vehicle driving scene simulation model are constructed from static environment elements (roads, traffic facilities, static obstacles, etc.), dynamic traffic elements (traffic vehicles, pedestrians, animals, etc.) and meteorological environment elements (light, rain, snow, fog, etc.).
  • static environment elements: roads, traffic facilities, static obstacles, etc.
  • dynamic traffic elements: traffic vehicles, pedestrians, animals, etc.
  • meteorological environment elements: light, rain, snow, fog, etc.
  • the scene simulation system is used to simulate and generate driving scenes based on static environment elements, dynamic traffic elements and meteorological environment elements, and to display the driving scenes in the form of images in real time; the real drivers observe the real-time driving views and, according to their own driving intentions, operate the steering wheel, accelerator pedal and brake pedal in the driving simulators; the sensor simulation system simulates virtual sensors such as cameras and millimeter-wave radars, and sends information such as traffic vehicles, pedestrians, roads and traffic signals detected by the virtual sensors to the automatic driving controller (i.e. the vehicle control system);
  • the automatic driving controller receives the driving signals (steering wheel signal, accelerator pedal signal and brake pedal signal) from the driving simulator and the sensing signals from the sensing simulation system, then carries out planning, decision-making and control according to the automatic driving algorithm, and sends the control signals to the vehicle dynamics simulation system; the vehicle dynamics simulation system simulates the real vehicle motion process: after receiving the control signals, it computes through the vehicle dynamics model and the vehicle pose changes;
  • the vehicle pose information is transmitted to the scene simulation system in real time; the scene simulation system updates the driving scene in real time according to the change of the vehicle pose, thus forming a driving closed loop.
  • the embodiment of the present invention also provides a device for applying the multi-driver-in-the-loop driving test platform described in Embodiment 1.
  • the device provided in this embodiment includes a main vehicle dual-station driving simulator 1, a first single-station traffic vehicle driving simulator 2, a second single-station traffic vehicle driving simulator 3, a cabinet 4 and a display 5.
  • the main car dual-station driving simulator 1 provides the driving environment and driving scenes for the real driver of the main car.
  • the dual-station driving simulator for the main vehicle includes a 270° U-shaped projection screen, 5 projectors, 2 driver seats, 1 accelerator pedal, 1 brake pedal, 1 steering controller, 1 steering motor and a dual-station frame built with aluminum profiles.
  • the first single-station traffic car driving simulator 2 and the second single-station traffic car driving simulator 3 provide the driving environment and driving scenes for the real driver of the traffic car.
  • the first single-station traffic car driving simulator 2 and the second single-station traffic car driving simulator 3 both include 3 monitors, 1 driver's seat, 1 accelerator pedal, 1 brake pedal, 1 steering controller, 1 steering motor and a single-station frame built with aluminum profiles.
  • Cabinet 4 is used to fixedly install the equipment required by the multi-driver-in-the-loop driving test platform.
  • Cabinet 4 includes 1 programmable power supply, 1 scene sensing host, 1 switch, 1 fusion machine, 1 Ethernet-to-CAN module, 1 vehicle dynamics host, and 1 controller (or MicroAutobox).
  • the scene sensing host is an integrated structure of the scene generating host and the sensing generating host.
  • the display 5 is used to display the control interface of the scene sensing host.
  • Commercial traffic simulation software is installed on the scene sensing host and simulates the entire driving scene; virtual sensors (cameras, millimeter-wave radars) can be installed on the vehicles, so that the virtual sensors can detect traffic targets.
  • A program written with the software's application programming interface can read the traffic target information and send it to the Ethernet.
  • the present invention has the following advantages:
  • the multi-driver-in-the-loop driving test platform can create more complex driving scenarios for the main vehicle.
  • in existing autonomous-driving simulation platforms the traffic vehicles are mostly program-controlled, whereas in the multi-driver-in-the-loop driving test platform provided by the present invention they are controlled by real drivers through driving simulators, which is closer to real road-test conditions.
  • the multi-driver-in-the-loop driving test platform provided by the present invention can support human-machine co-driving algorithms or automatic driving algorithms running in industrial computers, the MicroAutobox and controllers, satisfying the software-in-the-loop and hardware-in-the-loop testing requirements of the algorithm development and testing process, enabling rapid iteration of algorithms and accelerating the development of autonomous vehicle algorithms.
  • the multi-driver-in-the-loop driving test platform provided by the present invention can save research and development costs, shorten the research and development cycle, and accelerate the commercialization of automatic driving technology.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • General Engineering & Computer Science (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a multi-driver-in-the-loop driving test platform, relating to the technical field of development and testing of automatic driving systems, and at least including a sensing simulation system, a vehicle dynamics simulation system, a driving simulator and a scene simulation system. The sensing simulation system is used to acquire target-level sensing information and send the target-level sensing information to an on-board control system; the driving simulator is used to provide a driving environment and driving scenes for real drivers, output driving instructions according to the real driver's driving intention, and then send the driving instructions to the on-board control system; the vehicle dynamics simulation system is used to determine vehicle pose information according to the control signals output by the on-board control system; the scene simulation system is used to update, in real time, the driving scene displayed in the driving simulator according to the vehicle pose information. The invention achieves the purpose of saving research and development costs and shortening the research and development cycle.

Description

A multi-driver-in-the-loop driving test platform
Technical Field
The present invention relates to the technical field of development and testing of automatic driving systems, and in particular to a multi-driver-in-the-loop driving test platform.
Background Art
With the development of autonomous driving technology, the testing and evaluation of autonomous vehicles has become an important link in the development and testing of automatic driving systems. According to research by the RAND Corporation, at least 17.7 billion kilometers of accumulated driving data are needed to refine automatic driving algorithms to the level of a human driver. Such a huge test mileage cannot be covered by relying only on traditional closed-field tests and open-road tests, and the economic and time costs involved would be unbearable.
Building a simulation test platform for autonomous vehicles has therefore become one solution for the development and testing of automatic driving systems. However, related research is still lacking.
Summary of the Invention
Based on this, it is necessary to provide a multi-driver-in-the-loop driving test platform.
To achieve the above purpose, the present invention provides the following scheme:
A multi-driver-in-the-loop driving test platform, at least including a sensing simulation system, a vehicle dynamics simulation system, a driving simulator and a scene simulation system;
the sensing simulation system is used to acquire target-level sensing information and send the target-level sensing information to an on-board control system;
the driving simulator is used to provide a driving environment and driving scenes for real drivers, output driving instructions according to the real driver's driving intention, and then send the driving instructions to the on-board control system;
the vehicle dynamics simulation system is used to determine vehicle pose information according to the control signals output by the on-board control system;
the scene simulation system is used to update, in real time, the driving scene displayed in the driving simulator according to the vehicle pose information.
Optionally, the sensing simulation system includes one main vehicle sensing simulation subsystem and multiple traffic vehicle sensing simulation subsystems, and both the main vehicle sensing simulation subsystem and the traffic vehicle sensing simulation subsystems include a sensing generation host, Ethernet, an Ethernet-to-CAN module and a CAN bus; the sensing generation host and the Ethernet-to-CAN module are arranged in a cabinet;
the sensing generation host is used to transmit the target-level sensing information, in the form of Ethernet signals, to the Ethernet-to-CAN module over the Ethernet;
the Ethernet-to-CAN module is used to convert the Ethernet signals into vehicle-mounted CAN signals and send the vehicle-mounted CAN signals to the on-board control system through the CAN bus.
Optionally, the target-level sensing information is traffic target information collected by virtual sensors; the traffic target information includes traffic vehicle information, pedestrian information, lane line information and traffic light information; the virtual sensors are respectively installed on the main vehicle and the traffic vehicles, and the virtual sensors include cameras and millimeter-wave radars.
Optionally, the driving simulator includes one main vehicle driving simulator and multiple traffic vehicle driving simulators, and both the main vehicle driving simulator and the traffic vehicle driving simulators include a seat, a steering wheel, an accelerator pedal, a brake pedal, a steering motor and a steering controller;
the driving modes loaded by the on-board control system include an unmanned driving mode, a human-machine co-driving mode and a manual driving mode;
the steering controller is used to collect the accelerator pedal opening signal, the brake pedal opening signal and the steering wheel angle signal when the driving mode of the on-board control system is the manual driving mode, and to transmit the accelerator pedal opening signal, the brake pedal opening signal and the steering wheel angle signal to the on-board control system through the CAN bus;
the steering controller is also used to control the steering motor to work in an angle control mode when the driving mode of the on-board control system is the unmanned driving mode;
the steering controller is also used to control the steering motor to work in a torque control mode when the driving mode of the on-board control system is the human-machine co-driving mode.
Optionally, the main vehicle driving simulator is a dual-station driving simulator, and the traffic vehicle driving simulator is a single-station driving simulator.
Optionally, the on-board control system is used to load different automatic driving algorithms according to different development and testing requirements; the on-board control system is also used to determine the control signals according to the target-level sensing information, the driving instructions and the automatic driving algorithm.
Optionally, the vehicle dynamics system includes a main vehicle dynamics subsystem and a traffic vehicle dynamics subsystem, and both the main vehicle dynamics subsystem and the traffic vehicle dynamics subsystem include a vehicle dynamics host; the vehicle dynamics host is used to set different vehicle dynamics models according to different development requirements; the vehicle dynamics host is arranged in the cabinet;
the vehicle dynamics host is also used to simulate the vehicle motion process according to the vehicle dynamics model and the control signals, and to determine the vehicle pose information in real time.
Optionally, the scene simulation system includes a main vehicle scene simulation subsystem and a traffic vehicle scene simulation subsystem; the vehicle pose information includes main vehicle pose information and traffic vehicle pose information;
the main vehicle scene simulation subsystem includes a U-shaped projection screen, a projector, a main vehicle fusion machine and a main vehicle scene generation host; the U-shaped projection screen and the projector are arranged in the main vehicle cab; the main vehicle fusion machine and the main vehicle scene generation host are arranged in the cabinet; the main vehicle scene generation host has a built-in main vehicle driving scene simulation model; the main vehicle scene generation host is used to determine the current main vehicle driving scene according to the main vehicle driving scene simulation model and the main vehicle pose information, and to send it to the main vehicle fusion machine; the main vehicle fusion machine is used to fuse the current main vehicle driving scene, and the fused main vehicle driving scene is projected onto the U-shaped projection screen by the projector;
the traffic vehicle scene simulation subsystem includes a triple-screen display and a traffic vehicle scene generation host; the triple-screen display is arranged in the traffic vehicle cab; the traffic vehicle scene generation host is arranged in the cabinet; the traffic vehicle scene generation host has a built-in traffic vehicle driving scene simulation model; the traffic vehicle scene generation host is used to determine the current traffic vehicle driving scene according to the traffic vehicle driving scene simulation model and the traffic vehicle pose information, and to send the current traffic vehicle driving scene to the triple-screen display for display.
Optionally, the U-shaped projection screen is a 270° U-shaped projection screen.
Optionally, the main vehicle driving scene simulation model and the traffic vehicle driving scene simulation model are both constructed from static environment elements, dynamic traffic elements and meteorological environment elements.
Compared with the prior art, the beneficial effects of the present invention are as follows:
The present invention provides a multi-driver-in-the-loop driving test platform that at least includes a sensing simulation system, a vehicle dynamics simulation system, a driving simulator and a scene simulation system. By combining virtual simulation technology with real drivers controlling driving simulators, the invention overcomes the drawbacks of relying on traditional closed-field tests and traditional open-road tests to refine automatic driving algorithms, while ensuring the safety of the real drivers. The invention therefore has the advantages of saving research and development costs and shortening the research and development cycle.
Brief Description of the Drawings
To explain the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a structural block diagram of the multi-driver-in-the-loop driving test platform of the present invention;
Fig. 2 is a diagram of the connections among the four systems inside the multi-driver-in-the-loop driving test platform of the present invention;
Fig. 3 is a schematic structural diagram of a device applying the multi-driver-in-the-loop driving test platform of the present invention.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the scope of protection of the present invention.
The purpose of the present invention is to provide a multi-driver-in-the-loop driving test platform that uses a simulation platform to develop and test automatic driving algorithms. It can address the non-reproducibility of closed-field tests and open-road tests, ensure the safety of test personnel, save research and development costs, shorten the research and development cycle, and accelerate the commercialization of autonomous driving technology.
To make the above objects, features and advantages of the present invention more comprehensible, the present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Embodiment 1
An embodiment of the present invention provides a multi-driver-in-the-loop driving test platform applied in the technical field of development and testing of automatic driving systems. As shown in Fig. 1, it at least includes a sensing simulation system, a vehicle dynamics simulation system, a driving simulator and a scene simulation system.
The sensing simulation system is used to acquire target-level sensing information and send the target-level sensing information to an on-board control system; the driving simulator is used to provide a driving environment and driving scenes for real drivers, output driving instructions according to the real driver's driving intention, and then send the driving instructions to the on-board control system; the vehicle dynamics simulation system is used to simulate the vehicle motion process according to the control signals output by the on-board control system and determine vehicle pose information; the scene simulation system is used to update, in real time, the driving scene displayed in the driving simulator according to the vehicle pose information.
Each system is described in detail below.
The sensing simulation system includes one main vehicle sensing simulation subsystem and multiple traffic vehicle sensing simulation subsystems, and both the main vehicle sensing simulation subsystem and the traffic vehicle sensing simulation subsystems include a sensing generation host, Ethernet, an Ethernet-to-CAN module and a CAN bus; the sensing generation host and the Ethernet-to-CAN module are arranged in a cabinet. It should also be noted that the main vehicle sensing simulation subsystem and the traffic vehicle sensing simulation subsystems can share one sensing generation host.
The sensing generation host is used to transmit the target-level sensing information, in the form of Ethernet signals, to the Ethernet-to-CAN module over the Ethernet; the Ethernet-to-CAN module is used to convert the Ethernet signals into standard vehicle-mounted CAN signals and send the vehicle-mounted CAN signals to the on-board control system through the CAN bus for use.
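For illustration only, the sketch below shows one way the Ethernet-to-CAN path could be realized in software: a UDP datagram carrying one traffic-target record is received from the sensing generation host, repacked into an 8-byte payload and pushed onto a CAN bus with the python-can library. The port number, CAN identifier, field scaling and message layout are all assumptions made for the sketch; the patent does not specify a frame format.

```python
import socket
import struct

import can  # python-can; a real converter module would run equivalent embedded code

UDP_PORT = 5005        # assumed port used by the sensing generation host
CAN_ID_TARGET = 0x200  # assumed CAN identifier for target-level sensing frames

def pack_target_frame(rel_x_m, rel_y_m, speed_mps, accel_mps2):
    """Pack one traffic target into an 8-byte payload (assumed 0.01 resolution)."""
    return struct.pack(">hhhh",
                       int(rel_x_m * 100), int(rel_y_m * 100),
                       int(speed_mps * 100), int(accel_mps2 * 100))

udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.bind(("0.0.0.0", UDP_PORT))
bus = can.interface.Bus(channel="can0", bustype="socketcan")

while True:
    data, _ = udp.recvfrom(1024)                      # Ethernet signal from the sensing host
    if len(data) < 16:
        continue
    rel_x, rel_y, speed, accel = struct.unpack(">ffff", data[:16])
    msg = can.Message(arbitration_id=CAN_ID_TARGET,
                      data=pack_target_frame(rel_x, rel_y, speed, accel),
                      is_extended_id=False)
    bus.send(msg)                                     # vehicle-mounted CAN signal onto the CAN bus
```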
The target-level sensing information refers to the traffic target information detected by virtual sensors such as cameras and millimeter-wave radars. The traffic target information includes traffic vehicle information (speed, pose, acceleration, etc.), pedestrian information (speed, pose, acceleration, etc.), lane line information (type, color, curvature, cubic-polynomial fitting parameters of the lane line) and traffic light information (position, light color); the virtual sensors are respectively installed on the main vehicle and the traffic vehicles.
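The target-level sensing information listed above can be carried in plain records. The dataclasses below are one possible, assumed layout, including the cubic-polynomial lane-line parameters mentioned in the text; the field names and units are illustrative rather than taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TrafficTargetInfo:
    """Kinematic record used for both traffic vehicles and pedestrians."""
    speed_mps: float
    pose: Tuple[float, float, float]   # x [m], y [m], heading [rad]
    accel_mps2: float

@dataclass
class LaneLineInfo:
    line_type: str                     # e.g. "solid", "dashed"
    color: str
    curvature: float
    # lateral offset modelled as y(x) = c0 + c1*x + c2*x**2 + c3*x**3
    poly_coeffs: Tuple[float, float, float, float] = (0.0, 0.0, 0.0, 0.0)

@dataclass
class TrafficLightInfo:
    position: Tuple[float, float]
    light_color: str                   # "red", "yellow", "green"

@dataclass
class TargetLevelSensing:
    vehicles: List[TrafficTargetInfo] = field(default_factory=list)
    pedestrians: List[TrafficTargetInfo] = field(default_factory=list)
    lane_lines: List[LaneLineInfo] = field(default_factory=list)
    traffic_lights: List[TrafficLightInfo] = field(default_factory=list)
```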
The driving simulator includes one main vehicle driving simulator and multiple traffic vehicle driving simulators. Real drivers can control the traffic vehicles through the driving simulators to provide complex driving scenarios for the main vehicle, so that the automatic driving algorithm of the main vehicle can be developed and tested. By providing one main vehicle driving simulator and multiple traffic vehicle driving simulators, this embodiment achieves the technical effect of multiple real drivers driving vehicles in the same driving scene.
The main vehicle driving simulator is a dual-station driving simulator, the traffic vehicle driving simulator is a single-station driving simulator, and both the main vehicle driving simulator and the traffic vehicle driving simulators include a seat, a steering wheel, an accelerator pedal, a brake pedal, a steering motor and a steering controller. The steering motor supports an angle control mode and a torque control mode.
The driving modes that can be loaded by the on-board control system include an unmanned driving mode, a human-machine co-driving mode and a manual driving mode.
The steering controller is used to collect the accelerator pedal opening signal, the brake pedal opening signal and the steering wheel angle signal when the driving mode of the on-board control system is the manual driving mode, send these signals to the CAN bus, and transmit them through the CAN bus to the on-board control system to control the operation of the vehicle.
The steering controller is also used to control the steering motor to work in the angle control mode when the driving mode of the on-board control system is the unmanned driving mode.
The steering controller is also used to control the steering motor to work in the torque control mode when the driving mode of the on-board control system is the human-machine co-driving mode.
Angle control mode: closed-loop control is performed with the steering angle as the control target; the command given to the steering controller is an angle command, and the steering motor always tracks the set angle. This angle control mode can be used in unmanned driving, steering the vehicle in place of a human driver turning the steering wheel.
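A minimal sketch of such angle tracking is given below: the steering controller closes the loop on the measured steering angle with a PID law and outputs a motor torque request. The gains, torque limit and fixed time step are assumptions; the patent does not disclose the actual control law.

```python
class AngleController:
    """Closed-loop steering-angle tracking (angle control mode) -- illustrative only."""

    def __init__(self, kp=8.0, ki=1.5, kd=0.2, torque_limit_nm=10.0):
        self.kp, self.ki, self.kd = kp, ki, kd   # assumed gains
        self.torque_limit_nm = torque_limit_nm   # assumed motor torque limit
        self._integral = 0.0
        self._prev_err = 0.0

    def step(self, angle_cmd_rad, angle_meas_rad, dt):
        """Return the motor torque request that tracks the commanded angle."""
        err = angle_cmd_rad - angle_meas_rad
        self._integral += err * dt
        deriv = (err - self._prev_err) / dt
        self._prev_err = err
        u = self.kp * err + self.ki * self._integral + self.kd * deriv
        # saturate to the assumed motor limit
        return max(-self.torque_limit_nm, min(self.torque_limit_nm, u))
```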
Torque control mode: closed-loop control is performed with the motor output torque as the control target; the command given to the steering controller is a torque command, and the steering motor always tracks the set torque, with the steering wheel angle depending on the steering wheel load. This torque control mode can be used in human-machine co-driving: while a real driver is driving, the controller in the on-board control system can intervene to correct the real driver's steering operation, so human-machine co-driving algorithms can be developed and tested, and the human-machine co-driving algorithm can also run in the steering controller.
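Because the commanded quantity in this mode is torque, a co-driving intervention can be expressed as a blend of the driver's hand-wheel torque and a machine correction. The weighting scheme below is only a generic illustration of such blending; as the next paragraph notes, the patent does not prescribe a particular human-machine co-driving algorithm.

```python
def shared_steering_torque(driver_torque_nm, machine_torque_nm, authority):
    """Blend driver and controller torques; authority is clamped to [0, 1].

    authority = 0 -> pure manual steering feel
    authority = 1 -> the controller fully overrides the driver
    """
    authority = max(0.0, min(1.0, authority))
    return (1.0 - authority) * driver_torque_nm + authority * machine_torque_nm

# example: the controller gently counteracts a drift toward the lane edge
torque_cmd = shared_steering_torque(driver_torque_nm=1.2,
                                    machine_torque_nm=-0.8,
                                    authority=0.4)
```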
It should be noted that the human-machine co-driving algorithm here has no specific content; it is only intended to illustrate that the multi-driver-in-the-loop driving test platform can not only develop and test automatic driving algorithms but also support the development and testing of human-machine co-driving algorithms.
The on-board control system is used to load different automatic driving algorithms according to different development and testing requirements; the on-board control system is also used to determine the control signals according to the target-level sensing information, the driving instructions and the automatic driving algorithm. Since the main vehicle and the traffic vehicles are each provided with an on-board control system, the control signals include main vehicle control signals and traffic vehicle control signals.
The vehicle dynamics system includes a main vehicle dynamics subsystem and a traffic vehicle dynamics subsystem; both the main vehicle dynamics subsystem and the traffic vehicle dynamics subsystem include a vehicle dynamics host, and the vehicle dynamics host is arranged in the cabinet.
The vehicle dynamics host has a built-in vehicle dynamics model and is used to set different vehicle dynamics models according to different development requirements; that is, the vehicle dynamics model can be customized to meet the development and testing requirements of different automatic driving algorithms. The vehicle dynamics host is also used to simulate the vehicle motion process according to the vehicle dynamics model and the control signals and to determine the vehicle pose information in real time; the vehicle pose information includes main vehicle pose information and traffic vehicle pose information. The main vehicle dynamics subsystem and the traffic vehicle dynamics subsystem can share one vehicle dynamics host, in which case the vehicle dynamics host has a built-in main vehicle dynamics model and a built-in traffic vehicle dynamics model.
The vehicle dynamics host is an industrial computer. The vehicle dynamics model is used to simulate the dynamic behavior of the real vehicles (main vehicle and traffic vehicles) while driving; it can be built with commercial software (such as CarSim or CarRealTime), or designed by the user and implemented with programming software (such as MATLAB or Visual Studio). Downloading the vehicle dynamics model to the industrial computer ensures that the vehicle dynamics run in real time.
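For orientation, the snippet below integrates a kinematic bicycle model one fixed step at a time; it is a deliberately simple stand-in for the CarSim/CarRealTime-class models mentioned above and only shows how control signals are turned into updated pose information each cycle. The wheelbase and the 100 Hz step are assumed values.

```python
import math

def bicycle_step(state, accel_mps2, steer_rad, dt, wheelbase_m=2.7):
    """One step of a kinematic bicycle model; state = (x [m], y [m], yaw [rad], v [m/s])."""
    x, y, yaw, v = state
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    yaw += v / wheelbase_m * math.tan(steer_rad) * dt
    v += accel_mps2 * dt
    return (x, y, yaw, v)

# assumed 100 Hz real-time update on the dynamics host
state = (0.0, 0.0, 0.0, 10.0)
for _ in range(100):
    state = bicycle_step(state, accel_mps2=0.5, steer_rad=0.02, dt=0.01)
```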
The scene simulation system provides realistic driving views for the real drivers; it includes a main vehicle scene simulation subsystem and a traffic vehicle scene simulation subsystem.
The main vehicle scene simulation subsystem includes a U-shaped projection screen, projectors, a main vehicle fusion machine and a main vehicle scene generation host; the U-shaped projection screen and the projectors are arranged in the main vehicle cab; the main vehicle fusion machine and the main vehicle scene generation host are arranged in the cabinet; the main vehicle scene generation host has a built-in main vehicle driving scene simulation model; the main vehicle scene generation host is used to determine the current main vehicle driving scene according to the main vehicle driving scene simulation model and the main vehicle pose information and send it to the main vehicle fusion machine; the main vehicle fusion machine is used to fuse the current main vehicle driving scene, and the fused main vehicle driving scene is projected onto the U-shaped projection screen by the projectors, so as to provide a realistic driving view for the real driver of the main vehicle.
The U-shaped projection screen is a 270° U-shaped projection screen, which provides a wider driving field of view for the real driver of the main vehicle and gives a stronger sense of immersion. The real driver of the main vehicle interacts with the main vehicle through the main vehicle driving simulator, producing driving scenes that are closer to real traffic flow.
The traffic vehicle scene simulation subsystem includes a triple-screen display and a traffic vehicle scene generation host; the triple-screen display is arranged in the traffic vehicle cab; the traffic vehicle scene generation host is arranged in the cabinet; the traffic vehicle scene generation host has a built-in traffic vehicle driving scene simulation model; the traffic vehicle scene generation host is used to determine the current traffic vehicle driving scene according to the traffic vehicle driving scene simulation model and the traffic vehicle pose information and send the current traffic vehicle driving scene to the triple-screen display for display, so as to provide a realistic driving view for the real drivers of the traffic vehicles. The real drivers of the traffic vehicles can interact with the main vehicle in real time through the traffic vehicle driving views.
The main vehicle driving scene simulation model and the traffic vehicle driving scene simulation model are both used to simulate the driving scene during vehicle operation.
The main vehicle driving scene simulation model and the traffic vehicle driving scene simulation model are both constructed from static environment elements (roads, traffic facilities, static obstacles, etc.), dynamic traffic elements (traffic vehicles, pedestrians, animals, etc.) and meteorological environment elements (light, rain, snow, fog, etc.). While the vehicles are running in real time, the elements in the main vehicle driving scene simulation model and the traffic vehicle driving scene simulation model are updated so that the vehicle driving scenes are updated.
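One way to picture this split is as a scene description whose dynamic layer is refreshed every frame while the static and meteorological layers change only between tests; the toy structure below is purely illustrative, and its keys and values are assumptions rather than an interface defined by the patent.

```python
scene_description = {
    "static": {
        "road": "two_lane_urban",
        "traffic_facilities": ["guardrail", "stop_sign"],
        "static_obstacles": ["parked_car"],
    },
    "dynamic": {"traffic_vehicles": [], "pedestrians": [], "animals": []},
    "weather": {"light": "dusk", "rain": 0.0, "snow": 0.0, "fog_visibility_m": 1000.0},
}

def update_dynamic_elements(scene, vehicle_poses):
    """Refresh only the dynamic layer from the latest vehicle pose information."""
    scene["dynamic"]["traffic_vehicles"] = list(vehicle_poses)
    return scene
```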
The relationship among the sensing simulation system, the vehicle dynamics simulation system, the driving simulator and the scene simulation system is shown in Fig. 2.
The scene simulation system simulates and generates driving scenes from static environment elements, dynamic traffic elements and meteorological environment elements, and displays the driving scenes as images in real time. The real drivers observe the real-time driving views and, according to their own driving intentions, operate the steering wheel, accelerator pedal and brake pedal in the driving simulators. The sensing simulation system simulates virtual sensors such as cameras and millimeter-wave radars and sends the traffic vehicle, pedestrian, road and traffic signal information detected by the virtual sensors to the automatic driving controller (i.e. the on-board control system). The automatic driving controller receives the driving signals from the driving simulator (steering wheel signal, accelerator pedal signal and brake pedal signal) and the sensing signals from the sensing simulation system, performs planning, decision-making and control according to the automatic driving algorithm, and issues the control signals to the vehicle dynamics simulation system. The vehicle dynamics simulation system simulates the real vehicle motion process: after receiving the control signals, it computes through the vehicle dynamics model, the vehicle pose changes, and the vehicle pose information is passed to the scene simulation system in real time. The scene simulation system updates the driving scene in real time according to the change of the vehicle pose, thereby forming a closed driving loop.
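Scheduling-wise, the closed loop described above can be summarized as the pseudo-loop below. The object and method names merely stand for the four systems and the on-board controllers; they are assumptions used to show the order of the data flow, not an interface defined by the patent.

```python
def run_driving_loop(scene_sim, sensing_sim, controllers, dynamics_sim, simulators, dt=0.01):
    """One possible scheduling of the multi-driver closed loop (illustrative only)."""
    while True:
        # 1. real drivers act on the views they currently see
        driver_cmds = {sid: sim.read_pedals_and_steering() for sid, sim in simulators.items()}

        # 2. virtual sensors report target-level information, keyed per vehicle
        sensing = sensing_sim.detect_targets()

        # 3. each on-board controller plans, decides and controls
        controls = {vid: ctrl.step(sensing[vid], driver_cmds.get(vid))
                    for vid, ctrl in controllers.items()}

        # 4. the vehicle dynamics turn control signals into new poses
        poses = dynamics_sim.step(controls, dt)

        # 5. the scene simulation redraws the views shown in every driving simulator
        scene_sim.update(poses)
```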
Embodiment 2
An embodiment of the present invention further provides a device applying the multi-driver-in-the-loop driving test platform described in Embodiment 1. As shown in Fig. 3, the device provided in this embodiment includes a main vehicle dual-station driving simulator 1, a first single-station traffic vehicle driving simulator 2, a second single-station traffic vehicle driving simulator 3, a cabinet 4 and a display 5.
The main vehicle dual-station driving simulator 1 provides the driving environment and driving scenes for the real driver of the main vehicle. It includes one 270° U-shaped projection screen, 5 projectors, 2 driver seats, 1 accelerator pedal, 1 brake pedal, 1 steering controller, 1 steering motor and 1 dual-station frame built with aluminum profiles.
The first single-station traffic vehicle driving simulator 2 and the second single-station traffic vehicle driving simulator 3 provide the driving environment and driving scenes for the real drivers of the traffic vehicles. Each of them includes 3 displays, 1 driver seat, 1 accelerator pedal, 1 brake pedal, 1 steering controller, 1 steering motor and 1 single-station frame built with aluminum profiles.
The cabinet 4 is used for the fixed installation of the equipment required by the multi-driver-in-the-loop driving test platform. The cabinet 4 includes 1 programmable power supply, 1 scene-sensing host, 1 switch, 1 fusion machine, 1 Ethernet-to-CAN module, 1 vehicle dynamics host and 1 controller (or a MicroAutobox). The scene-sensing host is an integrated structure of the scene generation host and the sensing generation host.
The display 5 is used to display the control interface of the scene-sensing host.
Commercial traffic simulation software is installed on the scene-sensing host and simulates the entire driving scene. Virtual sensors (cameras, millimeter-wave radars) can be installed on the vehicles so that the virtual sensors can detect traffic targets; a program written with the software's application programming interface can read the traffic target information and send it onto the Ethernet.
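That last step, reading the detected targets through the simulation software's programming interface and pushing them onto the Ethernet, might look like the sketch below. `read_detected_targets()` is a placeholder for whatever call the commercial package actually exposes, and the address, update rate and JSON encoding are likewise assumptions.

```python
import json
import socket
import time

CONVERTER_ADDR = ("192.168.1.20", 5005)   # assumed address of the Ethernet-to-CAN side

def read_detected_targets():
    """Placeholder for the commercial traffic-simulation API call that returns
    the targets currently seen by the virtual camera/millimeter-wave radar."""
    return [{"id": 1, "type": "vehicle", "x": 35.2, "y": -1.8, "speed_mps": 12.4}]

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
while True:
    targets = read_detected_targets()
    sock.sendto(json.dumps(targets).encode("utf-8"), CONVERTER_ADDR)
    time.sleep(0.02)                       # assumed 50 Hz update rate
```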
Compared with the prior art, the present invention has the following advantages:
First, the multi-driver-in-the-loop driving test platform can create more complex driving scenarios for the main vehicle.
Second, in existing autonomous-driving simulation platforms the traffic vehicles are mostly program-controlled, whereas in the multi-driver-in-the-loop driving test platform provided by the present invention they are controlled by real drivers through driving simulators, which is closer to real road-test conditions.
Third, the multi-driver-in-the-loop driving test platform provided by the present invention supports running human-machine co-driving algorithms or automatic driving algorithms on the industrial computer, the MicroAutobox or the controller, satisfying the software-in-the-loop and hardware-in-the-loop testing requirements of the algorithm development and testing process, enabling rapid iteration of algorithms and accelerating the development of autonomous vehicle algorithms.
Fourth, the multi-driver-in-the-loop driving test platform provided by the present invention saves research and development costs, shortens the research and development cycle, and accelerates the commercialization of autonomous driving technology.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to one another.
Specific examples are used herein to explain the principles and implementations of the present invention. The description of the above embodiments is only intended to help understand the method of the present invention and its core idea; meanwhile, for those of ordinary skill in the art, changes can be made to the specific implementations and the scope of application in accordance with the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

  1. A multi-driver-in-the-loop driving test platform, characterized by at least including a sensing simulation system, a vehicle dynamics simulation system, a driving simulator and a scene simulation system;
    the sensing simulation system is used to acquire target-level sensing information and send the target-level sensing information to an on-board control system;
    the driving simulator is used to provide a driving environment and driving scenes for real drivers, output driving instructions according to the real driver's driving intention, and then send the driving instructions to the on-board control system;
    the vehicle dynamics simulation system is used to determine vehicle pose information according to the control signals output by the on-board control system;
    the scene simulation system is used to update, in real time, the driving scene displayed in the driving simulator according to the vehicle pose information.
  2. The multi-driver-in-the-loop driving test platform according to claim 1, characterized in that the sensing simulation system includes one main vehicle sensing simulation subsystem and multiple traffic vehicle sensing simulation subsystems, and both the main vehicle sensing simulation subsystem and the traffic vehicle sensing simulation subsystems include a sensing generation host, Ethernet, an Ethernet-to-CAN module and a CAN bus; the sensing generation host and the Ethernet-to-CAN module are arranged in a cabinet;
    the sensing generation host is used to transmit the target-level sensing information, in the form of Ethernet signals, to the Ethernet-to-CAN module over the Ethernet;
    the Ethernet-to-CAN module is used to convert the Ethernet signals into vehicle-mounted CAN signals and send the vehicle-mounted CAN signals to the on-board control system through the CAN bus.
  3. The multi-driver-in-the-loop driving test platform according to claim 1, characterized in that the target-level sensing information is traffic target information collected by virtual sensors; the traffic target information includes traffic vehicle information, pedestrian information, lane line information and traffic light information; the virtual sensors are respectively installed on the main vehicle and the traffic vehicles, and the virtual sensors include cameras and millimeter-wave radars.
  4. The multi-driver-in-the-loop driving test platform according to claim 1, characterized in that the driving simulator includes one main vehicle driving simulator and multiple traffic vehicle driving simulators, and both the main vehicle driving simulator and the traffic vehicle driving simulators include a seat, a steering wheel, an accelerator pedal, a brake pedal, a steering motor and a steering controller;
    the driving modes loaded by the on-board control system include an unmanned driving mode, a human-machine co-driving mode and a manual driving mode;
    the steering controller is used to collect the accelerator pedal opening signal, the brake pedal opening signal and the steering wheel angle signal when the driving mode of the on-board control system is the manual driving mode, and to transmit the accelerator pedal opening signal, the brake pedal opening signal and the steering wheel angle signal to the on-board control system through the CAN bus;
    the steering controller is also used to control the steering motor to work in an angle control mode when the driving mode of the on-board control system is the unmanned driving mode;
    the steering controller is also used to control the steering motor to work in a torque control mode when the driving mode of the on-board control system is the human-machine co-driving mode.
  5. The multi-driver-in-the-loop driving test platform according to claim 4, characterized in that the main vehicle driving simulator is a dual-station driving simulator and the traffic vehicle driving simulator is a single-station driving simulator.
  6. The multi-driver-in-the-loop driving test platform according to claim 1, characterized in that the on-board control system is used to load different automatic driving algorithms according to different development and testing requirements; the on-board control system is also used to determine the control signals according to the target-level sensing information, the driving instructions and the automatic driving algorithm.
  7. The multi-driver-in-the-loop driving test platform according to claim 1, characterized in that the vehicle dynamics system includes a main vehicle dynamics subsystem and a traffic vehicle dynamics subsystem, and both the main vehicle dynamics subsystem and the traffic vehicle dynamics subsystem include a vehicle dynamics host; the vehicle dynamics host is used to set different vehicle dynamics models according to different development requirements; the vehicle dynamics host is arranged in the cabinet;
    the vehicle dynamics host is also used to simulate the vehicle motion process according to the vehicle dynamics model and the control signals, and to determine the vehicle pose information in real time.
  8. The multi-driver-in-the-loop driving test platform according to claim 1, characterized in that the scene simulation system includes a main vehicle scene simulation subsystem and a traffic vehicle scene simulation subsystem; the vehicle pose information includes main vehicle pose information and traffic vehicle pose information;
    the main vehicle scene simulation subsystem includes a U-shaped projection screen, a projector, a main vehicle fusion machine and a main vehicle scene generation host; the U-shaped projection screen and the projector are arranged in the main vehicle cab; the main vehicle fusion machine and the main vehicle scene generation host are arranged in a cabinet; the main vehicle scene generation host has a built-in main vehicle driving scene simulation model; the main vehicle scene generation host is used to determine the current main vehicle driving scene according to the main vehicle driving scene simulation model and the main vehicle pose information and send it to the main vehicle fusion machine; the main vehicle fusion machine is used to fuse the current main vehicle driving scene, and the fused main vehicle driving scene is projected onto the U-shaped projection screen by the projector;
    the traffic vehicle scene simulation subsystem includes a triple-screen display and a traffic vehicle scene generation host; the triple-screen display is arranged in the traffic vehicle cab; the traffic vehicle scene generation host is arranged in the cabinet; the traffic vehicle scene generation host has a built-in traffic vehicle driving scene simulation model; the traffic vehicle scene generation host is used to determine the current traffic vehicle driving scene according to the traffic vehicle driving scene simulation model and the traffic vehicle pose information, and to send the current traffic vehicle driving scene to the triple-screen display for display.
  9. The multi-driver-in-the-loop driving test platform according to claim 8, characterized in that the U-shaped projection screen is a 270° U-shaped projection screen.
  10. The multi-driver-in-the-loop driving test platform according to claim 8, characterized in that the main vehicle driving scene simulation model and the traffic vehicle driving scene simulation model are both constructed from static environment elements, dynamic traffic elements and meteorological environment elements.
PCT/CN2021/093471 2021-05-13 2021-05-13 A multi-driver-in-the-loop driving test platform WO2022236754A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US17/778,897 US20240104008A1 (en) 2021-05-13 2021-05-13 A multi-drivers-in-the-loop driving testing platform
PCT/CN2021/093471 WO2022236754A1 (zh) 2021-05-13 2021-05-13 A multi-driver-in-the-loop driving test platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/093471 WO2022236754A1 (zh) 2021-05-13 2021-05-13 A multi-driver-in-the-loop driving test platform

Publications (1)

Publication Number Publication Date
WO2022236754A1 true WO2022236754A1 (zh) 2022-11-17

Family

ID=84027934

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/093471 WO2022236754A1 (zh) 2021-05-13 2021-05-13 A multi-driver-in-the-loop driving test platform

Country Status (2)

Country Link
US (1) US20240104008A1 (zh)
WO (1) WO2022236754A1 (zh)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103345548A (zh) * 2013-06-27 2013-10-09 同济大学 微观交通仿真器与多驾驶模拟器整合式一体化网络实验平台
CN105718065A (zh) * 2016-01-27 2016-06-29 北京交通大学 车路协同环境下的交互式视景仿真系统
CN109765803A (zh) * 2019-01-24 2019-05-17 同济大学 一种自动驾驶汽车多icu共时空的硬件仿真测试系统及方法
CN110046833A (zh) * 2019-05-13 2019-07-23 吉林大学 一种交通拥堵辅助系统虚拟测试系统
CN111688704A (zh) * 2020-06-24 2020-09-22 吉林大学 一种基于驾驶状态预测的人机力矩协同转向控制方法
US20200353943A1 (en) * 2019-05-07 2020-11-12 Foresight Ai Inc. Driving scenario machine learning network and driving environment simulation
CN112224211A (zh) * 2020-10-19 2021-01-15 中交第一公路勘察设计研究院有限公司 基于多自主体交通流的驾驶模拟仿真系统

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111353221A (zh) * 2020-02-24 2020-06-30 上海商汤临港智能科技有限公司 自动驾驶仿真方法和装置、电子设备及存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103345548A (zh) * 2013-06-27 2013-10-09 同济大学 微观交通仿真器与多驾驶模拟器整合式一体化网络实验平台
CN105718065A (zh) * 2016-01-27 2016-06-29 北京交通大学 车路协同环境下的交互式视景仿真系统
CN109765803A (zh) * 2019-01-24 2019-05-17 同济大学 一种自动驾驶汽车多icu共时空的硬件仿真测试系统及方法
US20200353943A1 (en) * 2019-05-07 2020-11-12 Foresight Ai Inc. Driving scenario machine learning network and driving environment simulation
CN110046833A (zh) * 2019-05-13 2019-07-23 吉林大学 一种交通拥堵辅助系统虚拟测试系统
CN111688704A (zh) * 2020-06-24 2020-09-22 吉林大学 一种基于驾驶状态预测的人机力矩协同转向控制方法
CN112224211A (zh) * 2020-10-19 2021-01-15 中交第一公路勘察设计研究院有限公司 基于多自主体交通流的驾驶模拟仿真系统

Also Published As

Publication number Publication date
US20240104008A1 (en) 2024-03-28

Similar Documents

Publication Publication Date Title
CN113219955B (zh) 一种多驾驶员在环驾驶试验平台
CN112987703B (zh) 一种实验室内整车在环自动驾驶开发测试系统及方法
CN108803607B (zh) 一种用于自动驾驶的多功能仿真系统
WO2023207016A1 (zh) 一种基于数字孪生云控平台的自动驾驶测试系统和方法
JP7428988B2 (ja) 自律自動車の制御ユニットを修正するための方法およびシステム
CN110160804B (zh) 一种自动驾驶车辆的测试方法、装置及系统
CN110456757B (zh) 一种无人驾驶车辆的整车测试方法及系统
CN112925291B (zh) 一种基于相机暗箱的数字孪生自动驾驶测试方法
CN109901546A (zh) 辅助驾驶车辆硬件在环仿真测试方法和系统
CN107479532A (zh) 一种智能汽车的域控制器测试系统及方法
CN108664013B (zh) 一种汽车车道保持策略验证平台及方法
CN109461342B (zh) 一种用于无人驾驶机动车的教学系统及其教学方法
CN110837697A (zh) 一种智能车的智能交通仿真系统及其仿真方法
CN105843071A (zh) 一种智能车辆运动控制实物仿真系统
CN113419518B (zh) 一种基于vts的vil测试平台
US11604908B2 (en) Hardware in loop testing and generation of latency profiles for use in simulation
CN112114580B (zh) 一种acc仿真测试系统及方法
CN114296424B (zh) 仿真测试系统和方法
CN110930811B (zh) 一种适用于无人驾驶决策学习和训练的系统
CN116755954A (zh) 基于数字孪生虚实结合的自动驾驶测试系统及其测试方法
Pechinger et al. Benefit of smart infrastructure on urban automated driving-using an av testing framework
WO2022236754A1 (zh) 一种多驾驶员在环驾驶试验平台
Zhang et al. Development and verification of traffic confrontation simulation test platform based on PanoSim
CN114740752A (zh) 一种自动驾驶车辆的仿真系统
von Neumann-Cosel et al. Testing of image processing algorithms on synthetic data

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 17778897

Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21941327

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21941327

Country of ref document: EP

Kind code of ref document: A1