GB2557252A - Physical environment simulator for vehicle testing - Google Patents
- Publication number: GB2557252A (application GB1620461.2A)
- Authority: GB (United Kingdom)
- Legal status: Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01M—TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
- G01M17/00—Testing of vehicles
- G01M17/007—Wheeled or endless-tracked vehicles
- G01M17/0072—Wheeled or endless-tracked vehicles the wheels of the vehicle co-operating with rotatable rolls
- G01M17/06—Steering behaviour; Rolling behaviour
- G01M17/065—Steering behaviour; Rolling behaviour the vehicle wheels co-operating with rotatable rolls
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/04—Monitoring the functioning of the control system
Abstract
A physical environment simulator for vehicle sensor stimulation for testing advanced driver assistance functions and automated driving vehicles. The physical environment simulator comprises: a test vehicle 4 to be positioned within the environment; one or more remote controlled autonomous ground vehicles 3 which can move in multiple directions relative to the test vehicle 4; a video projection system 1 with a freely moveable viewpoint to display images of a road and street markings on the ground 7; and a test automation 8 and remote control unit 9 to control the ground vehicles 3 and the video projection system 1. A car being tested may be positioned on a roll test stand in fixed position with steering being possible. The ground vehicles 3 may carry pedestrian dummies 6, traffic lights, road signs 2, trees, or represent other road vehicles. The remote control unit 9 may use wireless communications.
Description
(71) Applicant(s):
Christian Schwarzl, Annenstraße 61/31, Graz, 8020, Austria
Kompetenzzentrum - Das Virtuelle Fahrzeug Forschungsgesellschaft mbH (Incorporated in Austria), Inffeldgasse 21/A/1, Graz 8010, Austria
(72) Inventor(s):
Christian Schwarzl
(74) Agent and/or Address for Service:
Kompetenzzentrum - Das Virtuelle Fahrzeug Forschungsgesellschaft mbH, Inffeldgasse 21/A/1, Graz 8010, Austria
(51) INT CL:
G01M 17/06 (2006.01); B60W 50/04 (2006.01); G01M 17/007 (2006.01)
(56) Documents Cited:
EP 1998160 A1; WO 2002/001177 A1; JP 2009128395 A; JP 2007278951 A
(58) Field of Search:
INT CL B60K, B60W, B62D, G01M, G01R, G01S, G05B, G05D, G09B
Other: EPODOC & WPI; Patent Fulltext; Internet
(54) Title of the Invention: Physical environment simulator for vehicle testing
Abstract Title: Physical environment simulator for testing vehicle sensor systems
Drawings (3 sheets): Figure 1 — test setup with the Test Automation system and RCU exchanging measurements, UGV control and vehicle motion signals; Figure 2; Figure 3.
Physical environment simulator for vehicle testing
Background of the Invention
The constant improvement of comfort and safety in today's vehicles is realized by a high number of advanced driver assistance systems (ADAS), which perceive the environment, analyse the situation and support the driver by taking over parts of the driving task (e.g. maintaining a certain distance to the vehicle in front or emergency braking). These systems form the basis for new developments towards automated driving (AD), where the vehicle can perform all driving activities such as acceleration, braking and steering without constant supervision by the driver. Such systems require a large number of interdependent functions realized by a complex network of sensors, controllers and actuators, as well as a large number of software modules for sensing the environment, identifying objects and calculating driving manoeuvres. The dependencies between these functions impose new challenges on their verification and validation, because their correct interaction can only be shown when they are used together. As a consequence, all sensors of the vehicle under test (VUT) have to be coherently stimulated during tests, avoiding erroneous results due to unfeasible sensor inputs.
State of the Art
Advanced driver assistance systems and autonomously driving vehicles have in common that they support the driver in performing driving activities, while they differ in the degree of automation. An apparatus for controlling a highly automated driving vehicle is described in [Schnieders2015], where the driver may need to take over control of the vehicle if the autonomous driving function cannot operate safely. This can happen, for example, when line-of-sight obstruction caused by dense fog makes parts of the environment detection fail. An environment sensing and object detection technique is described in [Zhu2015], which considers multiple sensors of different types with varying sample rates. Another environment detection approach which might be affected by dense fog is described in [Sibiryakov2015], where kerbs are detected using camera-based sensors and image recognition.
In order to achieve reliable environment sensing, multiple sensors relying on different technologies are used to compensate the weaknesses of one sensor with the strengths of another. A subset of commonly used sensor types is given in [Herpel2008], where the quality and performance of sensors are compared. In particular, long-range and medium- to short-range sensors such as RADAR, LiDAR and ultrasonic based systems are considered.
The environment is sensed by gathering and merging all sensor measurement data, a process called sensor fusion, as described for example by [Duraisamy2015]. The fused data is then processed and analysed to detect objects around the vehicle and to calculate their positions, as described for example by [Lamon2006] and [Fuentes-Pacheco2015]. The data about the detected objects is then further processed to determine their kind, which can be for example street markings, traffic lights, other vehicles or pedestrians, but also trees, shadows and bushes. Given the position and kind of objects around the vehicle and their anticipated future movements at a given point in time, an ADAS can identify the current driving situation and react accordingly [Veres2011].
Testing advanced driver assistance functions requires checking the correct behaviour of the vehicle in different driving situations. This includes checking the correctness of environment sensing, driving situation recognition and decision making. The checks can be performed by analysing the sensed and processed data gathered by the vehicle and by observing the actuator movements of the vehicle.
In order to be able to observe the vehicle behaviour, its sensors have to be coherently stimulated while the vehicle performs a driving manoeuvre. A driving manoeuvre is the movement of a vehicle along a predefined path; coherent sensor stimulation means that each sensor produces the expected measurements for each environment object used during a test. Coherent sensor stimulation is achieved during a drive of the vehicle on the road, where all environment objects physically exist and can be recognized by all sensors. Another possibility is the use of an environment simulation in combination with, for example, a Hardware-in-the-Loop (HiL) test bed [Short2005]. In this case the environment objects and their positions are managed in the simulation, which provides the corresponding object shape and distance information. This information can be fed to the vehicle in two ways. One way is to calculate the expected sensor measurements and send them directly on the vehicle bus system, excluding the actual sensor from the test. The other way is to feed the object shape and distance information into special sensor stimuli devices, which are capable of providing physical stimuli to the sensor leading to the desired measurement results. However, the latter approach requires additional devices, which have to be developed and calibrated specifically for each sensor type, as for example for radar systems [Heuel2015]. Currently, such devices are not available for all sensor types used in automotive applications.
All three of these approaches have distinct challenges limiting their applicability. Performing real test drives on public roads includes all used sensors and tests the vehicle as a whole, ensuring that no failures are masked. However, these tests are expensive and, due to unpredictable traffic conditions, impossible to repeat exactly. Repeatability can be achieved by using an environment simulation, which abstracts the real environment and uses additional devices and mathematical models to provide sensor data. The reliance on an abstracted environment and on devices or models that are not part of the vehicle allows actual vehicle failures to be masked. As a consequence it cannot be guaranteed that the vehicle behaves the same way in the real world as it did in the simulation.
These limitations are partly solved by the VEHIL approach presented in [Helmer2015], which uses robot vehicles called moving bases to simulate traffic surrounding the VUT, which is operated on a dynamometer. Due to the controlled movement of physical objects, reproducible real-world sensor stimuli can be generated. In contrast to the presented invention, the dynamometer setup used does not allow steering of the VUT, which can for example be achieved by using a pair of rollers on a rotatable frame [Boess2001]. This limits the applicability of the VEHIL approach to driving scenarios where the VUT moves only straight ahead. In addition, street markings and the course of the road are neglected. For this reason the latest driver assistance functions relying on detection of the street layout, such as the lane keep assistant, cannot be sufficiently tested. Especially for ADAS and automated driving functions, which rely heavily on environment sensing in all driving situations, a more complete environment simulation is needed.
- [Schnieders2015] Schnieders, H. & Leimbach, J. Anordnung zur Steuerung eines hochautomatisierten Fahrens eines Fahrzeugs, 2015, Patent DE 102013215263
- [Zhu2015] Zhu, J. & Agarwal, P. Methods and systems for object detection using multiple sensors, 2015, Patent US 201414262269
- [Sibiryakov2015] Sibiryakov, A. & Büchner, J. Image recognition system for a vehicle and corresponding method, 2015, Patent US 2016010447 A1
- [Herpel2008] Herpel, T.; Lauer, C.; German, R. & Salzberger, J. Trade-off between coverage and robustness of automotive environment sensor systems. International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP 2008), 2008, 551-556
- [Duraisamy2015] Duraisamy, B.; Schwarz, T. & Wöhler, C. On track-to-track data association for automotive sensor fusion. 18th International Conference on Information Fusion (Fusion), 2015, 1213-1222
- [Lamon2006] Lamon, P.; Stachniss, C.; Triebel, R.; Pfaff, P.; Plagemann, C.; Grisetti, G.; Kolski, S.; Burgard, W. & Siegwart, R. Mapping with an autonomous car. IEEE/RSJ IROS Workshop: Safe Navigation in Open and Dynamic Environments, 2006, 26
- [Fuentes-Pacheco2015] Fuentes-Pacheco, J.; Ruiz-Ascencio, J. & Rendón-Mancha, J. M. Visual simultaneous localization and mapping: a survey. Artificial Intelligence Review, 2015, 43, 55-81
- [Veres2011] Veres, S. M.; Molnar, L.; Lincoln, N. K. & Morice, C. P. Autonomous vehicle control systems - a review of decision making. Proceedings of the Institution of Mechanical Engineers, Part I: Journal of Systems and Control Engineering, 2011, 225, 155-195
- [Short2005] Short, M. & Pont, M. J. Hardware in the loop simulation of embedded automotive control system. Proceedings of the 2005 IEEE Intelligent Transportation Systems Conference, 2005, 426-431
- [Heuel2015] Heuel, S. Radar target generation. 16th International Radar Symposium (IRS), IEEE, 2015
- [Helmer2015] Helmer, T. State of Scientific and Technical Knowledge on Pre-crash Evaluation. In: Development of a Methodology for the Evaluation of Active Safety using the Example of Preventive Pedestrian Protection, Springer International Publishing, 2015, 17-48
- [Boess2001] Boess, W. System for measuring a motor vehicle wheel position, 2001, Patent US 6195900 B1
Description of the Invention
The present invention is a physical environment simulator (PES) apparatus allowing coherent sensor stimulation for testing vehicles equipped with environment sensing systems, which are used for example by ADAS and for automated driving. The physical environment simulation performs a reproducible test drive within a controlled environment, where autonomous ground vehicles (AGVs) and video projection systems are used to simulate the movement of environment objects. This approach avoids the application of dedicated sensor stimuli devices or mathematical sensor models and includes all sensors, processing units and algorithms used in the vehicle, thus preventing error masking.
Although the invention is applicable to any ground-operated device or system equipped with environment sensing, it will be described in combination with vehicles.
The invention will be explained in detail using the following figures:
• Figure 1 shows an example setup of the physical environment simulator, where the vehicle under test is operated on a roll test stand and AGVs are moving around the vehicle.
• Figure 2 shows a street map and the viewport displayed by video projection devices.
• Figure 3 shows an example ground plot of a driving situation within the displayed viewport and positions and possible movement directions of the AGVs.
The setup and components of the physical environment simulator are shown in Figure 1, where 1 is a video projection device, 2 is a street sign, 3 are AGVs, 4 is the vehicle under test, 5 is a roll test stand allowing VUT steering, 6 is a pedestrian dummy, 7 is the ground floor, 8 is a test automation system and 9 is the remote control unit (RCU). In this setup the VUT stands on a roll test stand or dynamic test bed and has a fixed position. This allows the vehicle to perform driving activities such as accelerating, steering and braking without actually moving. The vehicle sensors are coherently stimulated by moving environment objects like street signs, pedestrians, traffic lights, other vehicles or trees and bushes around the vehicle. The movements are achieved by mounting the environment objects on AGVs, which perform movements relative to the vehicle. AGVs can move in any direction (omnidirectionally) while performing a rotation at the same time. In contrast, passenger cars can only move forward and backward while changing their direction; they cannot rotate on the spot or move sideways.
AGV movements are performed relative to the VUT, such that the VUT sensors perceive the environment as if the vehicle were driving on the road. For example, an AGV standing still in front of the VUT while the VUT is driving at 50 km/h is recognized by the VUT as a vehicle moving at the same speed. Correspondingly, a decreasing distance between VUT and AGV, realized by moving the AGV towards the VUT, is recognized as a slower vehicle being approached by the VUT, and an increasing distance as a faster vehicle driving away.
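The relative-motion mapping described above can be sketched in a few lines. The function name and the simple one-dimensional longitudinal model are illustrative assumptions, not part of the claimed apparatus:

```python
def agv_longitudinal_speed(object_speed_kmh: float, vut_speed_kmh: float) -> float:
    """Speed command (km/h) for an AGV so that the fixed VUT perceives an
    environment object moving at object_speed_kmh while the VUT 'drives'
    at vut_speed_kmh on the roll test stand.

    Positive values open the gap (object driving away); negative values
    close it (object being approached by the VUT).
    """
    # On the road the gap changes at (object - VUT) speed; on the test
    # stand the VUT is stationary, so the AGV must realise the whole
    # relative speed itself.
    return object_speed_kmh - vut_speed_kmh


# An AGV standing still in front of the VUT represents a lead vehicle
# travelling at the same 50 km/h as the VUT:
assert agv_longitudinal_speed(50.0, 50.0) == 0.0
# A slower lead vehicle (30 km/h) is simulated by moving the AGV towards
# the VUT at 20 km/h:
assert agv_longitudinal_speed(30.0, 50.0) == -20.0
```

Lateral motion works analogously with the AGVs' omnidirectional drives, which is what allows arbitrary relative trajectories around the fixed VUT.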
With environment objects moved around the VUT, its installed sensors, processing units and algorithms can be used during a test without modification. For this reason the same approach is applicable to different sensor technologies such as radio detection and ranging (RADAR), light detection and ranging (LiDAR), video cameras and ultrasonic sensors, and allows for coherent sensor stimulation. Video cameras are often used in vehicles, for example for the detection of street markings, which are distributed over an area and cannot be treated like the other environment objects described before. For this reason the PES uses video projection devices, for example installed above the VUT, to display street markings on the ground.
The position, colour and curvature of the street markings are defined in a street map as shown in Figure 2, where 10 is the overall area the street map covers, 11 defines the course and type of the road, and 12 is the viewport, a sub-area of the map displayed by the video projection devices of the PES. The viewport can be rotated and moved freely over the map to adapt the displayed street markings.
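A minimal sketch of mapping street-map coordinates into the rotated, translated viewport follows. The 2-D transform and all parameter names are assumptions for illustration; the patent only states that the viewport can be rotated and moved freely over the map:

```python
import math


def map_point_to_viewport(px: float, py: float,
                          viewport_x: float, viewport_y: float,
                          viewport_heading_rad: float) -> tuple:
    """Express a street-map point (px, py) in viewport coordinates, for a
    viewport whose origin lies at (viewport_x, viewport_y) on the map and
    which is rotated by viewport_heading_rad."""
    dx, dy = px - viewport_x, py - viewport_y
    # Rotate by the negative heading to move into the viewport frame.
    c, s = math.cos(-viewport_heading_rad), math.sin(-viewport_heading_rad)
    return (dx * c - dy * s, dx * s + dy * c)


# A point 1 m along the map x-axis from the viewport origin, with the
# viewport rotated by 90 degrees, appears 1 m to the side in viewport
# coordinates:
vx, vy = map_point_to_viewport(1.0, 0.0, 0.0, 0.0, math.pi / 2)
assert abs(vx) < 1e-9 and abs(vy + 1.0) < 1e-9
```

Each projection device would render the slice of the street map that falls inside its share of the transformed viewport.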
Since the VUT position is fixed, the positions of environment objects as well as the street markings have to be adapted during a simulated vehicle turn, which is initiated when a steering action is performed in the VUT. This is achieved by moving environment objects and street markings along a curve around the vehicle. The curve length and curvature depend on the vehicle speed and the angle of the performed steering action. The calculation of the curves is based on a steering geometry suitable for the VUT, such as the Ackermann steering geometry for cars or four-wheel vehicles in general.
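The curve computation can be sketched with the common single-track Ackermann approximation, where the turn radius is the wheelbase divided by the tangent of the steering angle. The function and parameter names are illustrative assumptions; the patent names only the Ackermann geometry itself:

```python
import math


def turn_curve(speed_mps: float, steering_angle_rad: float,
               wheelbase_m: float, dt_s: float) -> tuple:
    """Arc length and curvature that the environment must sweep around the
    fixed VUT during one time step dt_s, using the single-track Ackermann
    approximation (turn radius = wheelbase / tan(steering angle)).
    Returns (arc_length_m, curvature_per_m)."""
    arc_length = speed_mps * dt_s  # distance 'driven' during the step
    if abs(steering_angle_rad) < 1e-9:
        return arc_length, 0.0     # straight-ahead driving, no curvature
    radius = wheelbase_m / math.tan(steering_angle_rad)
    return arc_length, 1.0 / radius


# 10 m/s with the wheels turned so that tan(angle) = wheelbase / 10
# gives a 10 m turn radius, i.e. a curvature of 0.1 per metre:
arc, kappa = turn_curve(10.0, math.atan(2.7 / 10.0), 2.7, 0.1)
assert abs(arc - 1.0) < 1e-9 and abs(kappa - 0.1) < 1e-9
```

The resulting arc is then applied with opposite sign to every AGV and to the viewport, so that the world turns around the stationary VUT.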
Figure 3 shows an example ground plot of a driving situation, where the VUT follows the course of the road, including the required AGV movement directions. In this figure 4 is the VUT, 5 the roll test stand allowing VUT steering, 13 the displayed area within the viewport containing the street markings, 14 and 15 are AGVs simulating other vehicles, and 16 is an AGV maintaining the correct position of 17, a street sign.
The video projection system 1 allows testing of camera-based ADAS and safety systems, which need to detect the course of the road or street markings in order to function. The video projection system used to show the displayed area within the viewport is controlled by the remote control unit (RCU). The RCU is the central coordination instance for the VUT and the AGVs. In addition, it supports splitting the viewport into multiple display areas, allowing their distribution over multiple video projection devices. It is connected to the test automation system, which executes the driving actions of the VUT and performs measurements. A possible embodiment of the invention also comprises the use of unmanned aerial vehicles, capable of keeping well-defined positions, to carry the video projection system 1. With these unmanned aerial vehicles the setup of the video projection system can be accelerated, allowing for example quick installation on open-air test grounds.
The VUT driving actions are controlled using e.g. a steering robot or, if available, in-vehicle actuators. The test automation system receives the vehicle motion control values from the RCU, which are calculated from the VUT's global path. A global path defines a list of points on a route like the course of a road, which has to be passed by a vehicle.
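A global path, as defined above, can be represented as an ordered list of route points. The minimal structure below is an illustrative assumption, not a format prescribed by the patent:

```python
from dataclasses import dataclass


@dataclass
class RoutePoint:
    x_m: float
    y_m: float


# A global path is an ordered list of points that a vehicle (VUT or AGV)
# has to pass, e.g. following the course of a road.
global_path = [RoutePoint(0.0, 0.0), RoutePoint(10.0, 0.0), RoutePoint(20.0, 5.0)]


def next_point(path: list, current_index: int):
    """Return the next route point to head for, or None at the path end."""
    return path[current_index + 1] if current_index + 1 < len(path) else None


assert next_point(global_path, 0) == RoutePoint(10.0, 0.0)
assert next_point(global_path, 2) is None
```

The RCU would distribute one such list per vehicle before the simulation starts, and each AGV would follow its list autonomously.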
The AGVs are remotely controlled by the RCU, which can start and stop their operation and distribute global paths to them using e.g. wireless communication technologies. Starting operation means that each AGV begins to move autonomously along the received global path.
VUT properties like wheel rotation speed, steering angle, acoustic or optical driver warnings and bus communication are captured by the test automation system. The captured values are used to update the viewport orientation and its position on the map, and to adjust the movements of the AGVs. This allows immediate reaction to actions performed autonomously by the vehicle, which can be initiated by ADAS or automated driving functions, such as an emergency brake or a speed increase for adaptive cruise control.
Before the environment simulation can be started using the PES, the global paths of the VUT and of each AGV have to be defined and distributed to the vehicles. When all vehicles have received their global paths, the simulation can be executed: the AGVs automatically change their positions and the VUT performs driving actions, each following its global path. After the execution has finished, the gathered measurement values are analysed and used to decide whether the VUT behaved correctly with respect to its requirements, resulting in a test verdict which is either pass or fail. The measurements are gathered, for example, by measuring physical values like voltage or electrical current at predefined times using a real-time computer, or in reaction to e.g. the reception of a message sent over the vehicle network. After the measured values have been stored, for example on a hard drive, signal analysis techniques are used to check whether the measured values match predefined patterns. These patterns may include, among others, allowed value ranges, required frequencies or occurrence counts, or minimal and maximal rates of change.
Claims (5)
1. Apparatus to perform a virtual test drive within a physically controlled environment, wherein a vehicle under test 4 is positioned within the physically controlled environment; one or more controlled autonomous ground vehicles 3 with omnidirectional movement capability relative to the vehicle under test 4 carry moving environment objects 2, 6 to stimulate vehicle sensors; a video projection system 1 with a rotatable and freely movable viewport displays street markings on the ground 7 and defines the course and type of the test road; and a test automation 8 and remote control unit 9 control the autonomous ground vehicles 3 and the video projection system 1.
2. Apparatus according to claim 1, wherein the vehicle under test 4 is positioned on a roll test stand or dynamic test bed 5 in a fixed position, with steering possible, within the physically controlled environment.
3. Apparatus according to claim 1 or 2, wherein the autonomous ground vehicles 3 carry pedestrian dummies 6, traffic lights, road signs 2, trees, other vehicle dummies and other environment objects.
4. Apparatus according to any of claims 1 to 3, wherein the autonomous ground vehicles 3 are controlled by means of wireless communication between the remote control unit 9 and the autonomous ground vehicles 3.
5. Method to perform a virtual test drive within a physically controlled environment, wherein in a first step the global paths of the vehicle under test 4 and of the autonomous ground vehicles 3 are defined and provided to the vehicle under test 4 and the autonomous ground vehicles 3; in a second step the autonomous ground vehicles 3 are placed at their initial positions according to the defined paths; in a third step the simulation is executed by the remote control unit 9 and the autonomous ground vehicles begin to move autonomously along the predefined global paths; and in a fourth step the behaviour of the vehicle systems under test is recorded and analysed.
Intellectual Property Office. Application No: GB1620461.2. Examiner: Ms Becky Lander
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1620461.2A (GB2557252B) | 2016-12-02 | 2016-12-02 | Physical environment simulator for vehicle testing |
Publications (3)

| Publication Number | Publication Date |
|---|---|
| GB201620461D0 | 2017-01-18 |
| GB2557252A | 2018-06-20 |
| GB2557252B | 2020-03-25 |

Family

ID=58159787

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB1620461.2A (GB2557252B, Expired - Fee Related) | Physical environment simulator for vehicle testing | 2016-12-02 | 2016-12-02 |

Country Status (1)

| Country | Link |
|---|---|
| GB | GB2557252B (en) |
Cited By (15)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109084992A * | 2018-07-27 | 2018-12-25 | Chang'an University | Method based on engine bench test unmanned vehicle intelligence |
| CN110108505A * | 2019-06-06 | 2019-08-09 | China Automotive Engineering Research Institute Co., Ltd. | Automatic pilot assist lamp stand test fixture vehicle and its working method |
| CN110595798A * | 2019-09-19 | 2019-12-20 | Beijing Baidu Netcom Science and Technology Co., Ltd. | Test method and device |
| CN110823595A * | 2019-10-30 | 2020-02-21 | Shanghai Xinba Automation Technology Co., Ltd. | AGV vehicle comprehensive test bed and test method thereof |
| CN110867125A * | 2019-12-13 | 2020-03-06 | Tsinghua University | Intelligent train traffic system sand table demonstration device and control method thereof |
| EP3620770A1 * | 2018-09-10 | 2020-03-11 | EDAG Engineering GmbH | Environmental simulation system for a testbench and testbench |
| CN111882944A * | 2020-09-14 | 2020-11-03 | Beijing Zhiyang Beifang International Education Technology Co., Ltd. | Training platform for automotive steering combination |
| CN112444401A * | 2019-08-30 | 2021-03-05 | Dalian Minzu University | Unmanned vehicle testing device with controllable road surface moving target |
| CN112444403A * | 2019-08-30 | 2021-03-05 | Dalian Minzu University | Method for testing unmanned automobile by moving target |
| WO2022027304A1 * | 2020-08-05 | 2022-02-10 | Huawei Technologies Co., Ltd. | Testing method and apparatus for autonomous vehicle |
| US11415484B2 | 2019-07-11 | 2022-08-16 | Horiba Instruments Incorporated | Apparatus and method for testing automated vehicles via movable target body or electronic target simulator |
| DE102022207888A1 | 2022-07-29 | 2024-02-01 | Aip Gmbh & Co. Kg | Wheel drive device, test bench system and method for functional testing of a vehicle |
| DE102022005068A1 | 2022-07-29 | 2024-02-01 | Aip Gmbh & Co. Kg | Device and method for driving a vehicle on a vehicle test bench |
| US12031883B2 | 2021-01-29 | 2024-07-09 | Horiba Instruments Incorporated | Apparatus and method for testing automated vehicles |
| DE102023101889A1 | 2023-01-26 | 2024-08-01 | Bayerische Motoren Werke Aktiengesellschaft | Method and device for operating a motor vehicle test bench |
Families Citing this family (4)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| AU2019100368B4 * | 2019-01-25 | 2019-11-28 | Norman BOYLE | A driverless impact attenuating traffic management vehicle |
| CN110806742A * | 2019-11-27 | 2020-02-18 | Zhejiang Cexun Automotive Technology Co., Ltd. | Intelligent automobile ADAS function end-of-line detection device |
| CN113252365B * | 2021-06-16 | 2024-03-19 | Zhiji Automobile Technology Co., Ltd. | Testing device and testing method for lane assistance system |
| CN116539331B * | 2023-05-31 | 2024-04-12 | Yongyue Technology Co., Ltd. | Test platform for unmanned vehicles |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2002001177A1 (en) * | 2000-06-23 | 2002-01-03 | Nederlandse Organisatie Voor Toegepast-Natuurwetenschappelijk Onderzoek Tno | System for performing tests on intelligent road vehicles |
JP2007278951A (en) * | 2006-04-10 | 2007-10-25 | Alpine Electronics Inc | Car body behavior measuring device |
EP1998160A1 (en) * | 2007-05-31 | 2008-12-03 | Nederlandse Organisatie voor toegepast- natuurwetenschappelijk onderzoek TNO | System and method for testing a vehicle |
JP2009128395A (en) * | 2007-11-20 | 2009-06-11 | Toyota Central R&D Labs Inc | Driving simulation apparatus |
- 2016
- 2016-12-02 GB GB1620461.2A patent/GB2557252B/en not_active Expired - Fee Related
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109084992A (en) * | 2018-07-27 | 2018-12-25 | 长安大学 | Method for testing unmanned vehicle intelligence on an engine test bench |
EP3620770A1 (en) * | 2018-09-10 | 2020-03-11 | EDAG Engineering GmbH | Environmental simulation system for a testbench and testbench |
CN110108505A (en) * | 2019-06-06 | 2019-08-09 | 中国汽车工程研究院股份有限公司 | Driving-assistance lamp-rack test fixture vehicle and working method thereof |
US11415484B2 (en) | 2019-07-11 | 2022-08-16 | Horiba Instruments Incorporated | Apparatus and method for testing automated vehicles via movable target body or electronic target simulator |
CN112444403B (en) * | 2019-08-30 | 2022-08-30 | 大连民族大学 | Method for testing unmanned automobile by moving target |
CN112444401A (en) * | 2019-08-30 | 2021-03-05 | 大连民族大学 | Unmanned vehicle testing device with controllable road surface moving target |
CN112444403A (en) * | 2019-08-30 | 2021-03-05 | 大连民族大学 | Method for testing unmanned automobile by moving target |
CN110595798B (en) * | 2019-09-19 | 2022-04-05 | 北京百度网讯科技有限公司 | Test method and device |
CN110595798A (en) * | 2019-09-19 | 2019-12-20 | 北京百度网讯科技有限公司 | Test method and device |
CN110823595A (en) * | 2019-10-30 | 2020-02-21 | 上海欣巴自动化科技有限公司 | AGV vehicle comprehensive test bed and test method thereof |
CN110823595B (en) * | 2019-10-30 | 2021-06-01 | 上海欣巴自动化科技股份有限公司 | AGV vehicle comprehensive test bed and test method thereof |
CN110867125B (en) * | 2019-12-13 | 2021-07-02 | 清华大学 | Intelligent train traffic system sand table demonstration device and control method thereof |
CN110867125A (en) * | 2019-12-13 | 2020-03-06 | 清华大学 | Intelligent train traffic system sand table demonstration device and control method thereof |
WO2022027304A1 (en) * | 2020-08-05 | 2022-02-10 | 华为技术有限公司 | Testing method and apparatus for autonomous vehicle |
CN111882944A (en) * | 2020-09-14 | 2020-11-03 | 北京智扬北方国际教育科技有限公司 | Training platform for automotive steering assemblies |
US12031883B2 (en) | 2021-01-29 | 2024-07-09 | Horiba Instruments Incorporated | Apparatus and method for testing automated vehicles |
DE102022207888A1 (en) | 2022-07-29 | 2024-02-01 | Aip Gmbh & Co. Kg | Wheel drive device, test bench system and method for functional testing of a vehicle |
WO2024023243A1 (en) | 2022-07-29 | 2024-02-01 | Aip Gmbh & Co. Kg | Device and method for driving a vehicle on a chassis dynamometer |
DE102022005068A1 (en) | 2022-07-29 | 2024-02-01 | Aip Gmbh & Co. Kg | Device and method for driving a vehicle on a vehicle test bench |
DE102022207888B4 (en) | 2022-07-29 | 2024-03-21 | Aip Gmbh & Co. Kg | Wheel drive device, test stand system and method for functional testing of a vehicle |
DE102023101889A1 (en) | 2023-01-26 | 2024-08-01 | Bayerische Motoren Werke Aktiengesellschaft | METHOD AND DEVICE FOR OPERATING A MOTOR VEHICLE TEST BENCH |
Also Published As
Publication number | Publication date |
---|---|
GB201620461D0 (en) | 2017-01-18 |
GB2557252B (en) | 2020-03-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
GB2557252A (en) | Physical environment simulator for vehicle testing | |
US10902165B2 (en) | Deployable development platform for autonomous vehicle (DDPAV) | |
US12017663B2 (en) | Sensor aggregation framework for autonomous driving vehicles | |
CN108319259B (en) | Test system and test method | |
CN106991041B (en) | Method and apparatus for testing software for autonomous vehicles | |
US10852721B1 (en) | Autonomous vehicle hybrid simulation testing | |
JP2021012709A (en) | Navigating vehicle based on detected barrier | |
US20190278290A1 (en) | Simulation-based method to evaluate perception requirement for autonomous driving vehicles | |
CN104943684B (en) | Pilotless automobile control system and the automobile with it | |
CN111309600A (en) | Virtual scene injection automatic driving test method and electronic equipment | |
JP2021504812A (en) | Object Interaction Prediction Systems and Methods for Autonomous Vehicles | |
CN112526893A (en) | Test system of intelligent automobile | |
US20200272834A1 (en) | Information processing apparatus, information processing method, program, and mobile object | |
Galko et al. | Vehicle-Hardware-In-The-Loop system for ADAS prototyping and validation | |
CN106774287B (en) | A kind of real vehicle of active safety controller is in ring test system and method | |
WO2022086713A1 (en) | In-vehicle operation of simulation scenarios during autonomous vehicle runs | |
Miquet | New test method for reproducible real-time tests of ADAS ECUs:“Vehicle-in-the-Loop” connects real-world vehicles with the virtual world | |
EP4082856A2 (en) | E2e learning-based evaluator for an autonomous driving vehicle | |
Szalay et al. | Proof of concept for Scenario-in-the-Loop (SciL) testing for autonomous vehicle technology | |
CN112740009A (en) | Vehicle inspection system | |
CN115731531A (en) | Object trajectory prediction | |
CN117130298A (en) | Method, device and storage medium for evaluating an autopilot system | |
Schwab et al. | Consistent test method for assistance systems | |
CN112415910A (en) | Method for evaluating software components of a Sil environment | |
CN113533876B (en) | Method and system for performing electromagnetic compatibility testing of autonomous vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
S117 | Correction of errors in patents and applications (sect. 117/patents act 1977) |
Free format text: REQUEST FOR CORRECTION UNDER SECTION 117 FILED ON 5 MARCH 2021 NOT PROCEEDED WITH ON 22 MARCH 2021 |
|
732E | Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977) |
Free format text: REGISTERED BETWEEN 20210422 AND 20210428 |
|
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 20231202 |