CN113945224A - Automatic generation method and system for intelligent driving ADAS test scene - Google Patents
- Publication number
- CN113945224A (application CN202111210083.3A)
- Authority
- CN
- China
- Prior art keywords
- test
- model
- target
- vehicle
- adas
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3407—Route searching; Route guidance specially adapted for specific applications
- G01C21/3415—Dynamic re-routing, e.g. recalculating the route when the user deviates from calculated route or after detecting real-time traffic data or accidents
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3453—Special cost functions, i.e. other than distance or default speed limit of road segments
- G01C21/3492—Special cost functions, i.e. other than distance or default speed limit of road segments employing speed data or traffic data, e.g. real-time or historical
Abstract
The invention provides an automatic generation method and system for intelligent driving ADAS test scenes, comprising the following steps: an analysis step: analyzing the test requirement; a model selection step: selecting a model according to the test requirement; a target selection step: selecting a target according to the test requirement and placing the target on the model; a parameter selection step: selecting and formulating parameters according to the target; a scene generation step: generating a test scene according to the model, the target and the parameters. The invention improves staff understanding of the ADAS test standards and thereby improves actual test efficiency.
Description
Technical Field
The invention relates to the technical field of test scene generation, in particular to an automatic generation method and system for an intelligent driving ADAS test scene.
Background
Intelligent driving essentially involves the cognitive engineering of attracting and directing attention, and mainly comprises three links: network navigation, autonomous driving and manual intervention. The precondition of intelligent driving is that the selected vehicle meets the dynamic requirements of driving, and that the sensors on the vehicle can obtain the relevant visual and auditory signals and information and control the corresponding actuation systems through cognitive computation. Network navigation answers where the vehicle is, where it is going, and which lane of which road it should take; autonomous driving completes driving behaviors such as lane keeping, overtaking and merging, stopping at red lights and proceeding at green, and interaction by lights and horn, under the control of the intelligent system; manual intervention means that the driver reacts to the actual road conditions under a series of prompts from the intelligent system. With the development of intelligent driving technology, testing intelligent driving vehicles has become an important link in ensuring their safety; to improve the safety and efficiency of testing and reduce its cost, tests are generally performed in predefined scenarios, and therefore test scenes for intelligent driving need to be established.
Chinese patent publication CN111723458A discloses an automatic generation method for simulation test scenes of an automatic driving decision-planning system, which includes the following steps: step 1: keyword-based scene description: according to user requirements, describe the scene semantically using keywords; step 2: digitize the keywords in the semantic description; step 3: generate scene files in batches based on the semantic description after keyword digitization.
Regarding the related technologies, the inventor considers that, with such a method, ADAS specification testing remains difficult for staff, and actual test efficiency is low.
Disclosure of Invention
Aiming at the defects in the prior art, the invention aims to provide an automatic generation method and system for an intelligent driving ADAS test scene.
The invention provides an automatic generation method of an intelligent driving ADAS test scene, which comprises the following steps:
an analysis step: analyzing the test requirement;
a model selection step: selecting a model according to the test requirement;
a target selection step: selecting a target according to the test requirement, and placing the target on the model;
a parameter selection step: selecting and formulating parameters according to the target;
a scene generation step: generating a test scene according to the model, the target and the parameters, and performing the ADAS test using the generated test scene.
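The five steps above can be sketched as a small pipeline. This is a minimal illustration under assumptions: the names `TestScenario`, `generate_scenario`, the parameter keys and the default speeds are hypothetical, not from the patent.

```python
from dataclasses import dataclass, field

# Model and target vocabularies taken from the description.
MODELS = {"crossing", "parallel", "curve"}
TARGETS = {"child_dummy", "adult_dummy", "bicycle", "motorcycle", "car"}

@dataclass
class TestScenario:
    model: str
    target: str
    params: dict = field(default_factory=dict)

def generate_scenario(requirement: dict) -> TestScenario:
    """Analyse the requirement, pick model and target, formulate parameters,
    and generate the test scenario."""
    model = requirement["model"]      # model selection step
    target = requirement["target"]    # target selection step
    if model not in MODELS or target not in TARGETS:
        raise ValueError("unsupported model or target")
    params = {                        # parameter selection step (assumed keys)
        "vut_speed_kmh": requirement.get("vut_speed_kmh", 30),
        "target_speed_kmh": requirement.get("target_speed_kmh", 20),
    }
    return TestScenario(model=model, target=target, params=params)  # scene generation

scenario = generate_scenario({"model": "crossing", "target": "motorcycle",
                              "vut_speed_kmh": 40})
print(scenario.model, scenario.params["vut_speed_kmh"])  # crossing 40
```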
Preferably, in the parameter selection step, the parameters include, but are not limited to, the speed of the vehicle under test, the overlap (bias) rate of the vehicle under test, the speed of the test target, and the acceleration distance of the test target.
Preferably, in the model selection step, the models include a crossing model, a parallel model and a curve model;
the crossing model is used for road-crossing tests with different targets;
the parallel model is used for car-following emergency-braking tests and overtaking-related tests with different targets;
the curve model is used for lane-change tests and other curve-scenario tests with different targets.
Preferably, in the model selecting step, the model is a motion trajectory preset for the vehicle under test and the target.
Preferably, in the target selecting step, the target includes a child dummy, an adult dummy, a bicycle, a motorcycle, and a car; and classifying the test parameters according to different targets.
The invention provides an automatic generation system for an intelligent driving ADAS test scene, which comprises the following modules:
an analysis module: analyzing the test requirement;
a model selection module: selecting a model according to the test requirement;
a target selection module: selecting a target according to the test requirement, and placing the target on the model;
a parameter selection module: selecting and formulating parameters according to the target;
a scene generation module: and generating a test scene according to the model, the target and the parameters, and performing the ADAS test by using the generated test scene.
Preferably, in the parameter selection module, the parameters include, but are not limited to, the speed of the vehicle under test, the overlap (bias) rate of the vehicle under test, the speed of the test target, and the acceleration distance of the test target.
Preferably, in the model selection module, the models include a crossing model, a parallel model and a curve model;
the crossing model is used for road-crossing tests with different targets;
the parallel model is used for car-following emergency-braking tests and overtaking-related tests with different targets;
the curve model is used for lane-change tests and other curve-scenario tests with different targets.
Preferably, in the model selection module, the model is a motion track preset for the vehicle to be tested and the target.
Preferably, in the object selection module, the objects include a child dummy, an adult dummy, a bicycle, a motorcycle, and a car; and classifying the test parameters according to different targets.
Compared with the prior art, the invention has the following beneficial effects:
1. Through three computational models, the invention presents all intelligent-driving ADAS test scenes to the operator more clearly, solving the problem that operators unfamiliar with the ADAS test specification work with low actual test efficiency; with this method, even an operator who does not know the test specification can achieve the intended result;
2. The invention enables more accurate and efficient planning and generation of intelligent-driving ADAS test scenes, and is therefore applicable to different test scenarios;
3. According to the different generated scenes, the invention sends different paths to the dummy-vehicle test chassis and completes the vehicle ADAS scene test for the specified scene in a tightly coupled manner.
Drawings
Other features, objects and advantages of the invention will become more apparent upon reading of the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a diagram of a test scenario using a cross model according to an embodiment of the present invention;
FIG. 3 is a diagram of a test scenario using a parallel model according to an embodiment of the present invention;
FIG. 4 is a diagram of a test scenario using a curve model according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating the definition of the blind spot monitoring range according to the present invention.
Reference numerals: STA stands for Scooter Target Adult, i.e. a scooter-type motorcycle target; BTA stands for Bicyclist Target Adult, i.e. a bicycle target; AEB stands for Autonomous Emergency Braking; FCW stands for Forward Collision Warning. Point H represents the origin on the STA side, the contact point for the collision; XX represents the trajectory of the STA H point; YY represents the centerline of the test vehicle; C1 denotes the STA acceleration distance; D1 denotes the STA constant-speed distance; Z represents the 50% collision point of the far-side scene; TT represents the trajectory of the rear tire of the BTA; Q represents the BTA acceleration distance during the AEB test; R represents the BTA acceleration distance during the FCW test; S represents the BTA constant-speed distance; U represents the 50% collision point of the longitudinal scene; V represents the 25% collision point of the longitudinal scene; 1 denotes the test vehicle; 2 denotes the center of the ninety-fifth-percentile eye ellipse; 3 denotes the area enclosed by FCGB, the left blind-spot monitoring range of the vehicle under the straight-line condition; 4 denotes the area enclosed by KCLB, the right blind-spot monitoring range of the vehicle under the straight-line condition.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will assist those skilled in the art in further understanding the invention, but do not limit it in any way. It should be noted that those of ordinary skill in the art can make various changes and modifications without departing from the spirit of the invention, all of which fall within the scope of the present invention.
The embodiment of the invention discloses an automatic generation method for an intelligent driving ADAS test scene, which, as shown in FIG. 1, comprises the following steps. An analysis step: analyze the test requirement, that is, determine what the vehicle under test needs to be tested for, for example whether a sensor at a certain position has a blind area.
A model selection step: select the model according to the test requirement. The model is a preset motion trajectory for the vehicle under test and the target. The models include a crossing model, a parallel model and a curve model. The crossing model is used for road-crossing tests with different targets. The parallel model is used for car-following emergency-braking tests and overtaking-related tests with different targets. The curve model is used for lane-change tests and other curve-scenario tests with different targets. Multiple models cannot be selected simultaneously. The models are named after their path characteristics; the curve model also contains straight path segments and is named only for its characteristic curve.
A model is selected. Crossing model: mainly used for road-crossing tests with different targets. Parallel model: mainly used for car-following emergency braking with different targets and overtaking-related tests. Curve model: lane changes with different targets and other curve-scenario tests. The models are a generalized summary of existing ADAS test requirements.
A target selection step: select a target according to the test requirement and place it on the model. Targets include a child dummy, an adult dummy, a bicycle, a motorcycle and a car. The test parameters are classified according to the different targets; for example, different targets have different requirements on test speed.
A parameter selection step: select and formulate parameters according to the target. The parameters include, but are not limited to, the speed of the vehicle under test, the overlap (bias) rate of the vehicle under test, the speed of the test target, and the acceleration distance of the test target. The target itself has no detection function: only the vehicle under test needs to detect obstacles, and the target acts as a simulated obstacle. The test parameters are then selected; for example, the speed parameters include 10 km/h, 20 km/h, 30 km/h and 40 km/h, and the scene is generated with the required speed.
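The classification of test parameters by target can be sketched as a lookup table. Only the 10–40 km/h speed grid is named in the text; the per-target speeds below are assumed values for illustration.

```python
# Hypothetical parameter table: allowed target speeds per target type,
# following the idea that parameters are classified by target.
TARGET_SPEEDS_KMH = {
    "adult_dummy": [5, 8],      # assumed for illustration
    "bicycle": [15, 20],        # assumed for illustration
    "motorcycle": [20],         # STA speed named in the description
}
VUT_SPEEDS_KMH = [10, 20, 30, 40]  # speed grid named in the description

def speed_grid(target: str) -> list[tuple[int, int]]:
    """All (VUT speed, target speed) combinations for one target type."""
    return [(v, vt) for v in VUT_SPEEDS_KMH for vt in TARGET_SPEEDS_KMH[target]]

print(len(speed_grid("bicycle")))  # 8
```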
A scene generation step: generate the test scene according to the model, the target and the parameters, and perform the ADAS test with the generated scene. Generation is confirmed, and the test scene is generated automatically according to the selections made in the previous steps. After the scene is generated, the vehicle under test and the test target execute it.
The motion path of the characteristic scene is calculated from the parameters: the path is computed from speed and acceleration, and its direction is determined from the real-time navigation direction. The software controls the movement of the target along this path.
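A minimal sketch of computing the position along the preset path from speed and acceleration, as described above. The accelerate-then-cruise profile mirrors the acceleration distance (C1/Q/R) and constant-speed distance (D1/S) segments; the 2 m/s² acceleration value is an assumption for illustration.

```python
def position_at(t: float, v_target: float, accel: float) -> float:
    """Distance travelled along the preset track at time t (SI units):
    uniform acceleration from rest to v_target, then constant speed."""
    t_acc = v_target / accel                 # time to reach the test speed
    if t <= t_acc:
        return 0.5 * accel * t * t           # acceleration segment (C1/Q/R)
    d_acc = 0.5 * accel * t_acc * t_acc      # length of the acceleration segment
    return d_acc + v_target * (t - t_acc)    # constant-speed segment (D1/S)

# e.g. an STA reaching 20 km/h (about 5.56 m/s) at an assumed 2 m/s^2
v = 20 / 3.6
t_acc = v / 2.0
print(round(position_at(t_acc, v, 2.0), 2))  # distance covered while accelerating
```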
The invention has three models in total, which can be freely selected in the software; switching between them replaces the current selection. ADAS stands for Advanced Driver Assistance System.
As shown in FIG. 2, CSFA stands for Car-to-Scooter Farside Adult, i.e. the vehicle collides with a far-side crossing scooter-type motorcycle. In the CSFA-50 scenario, if no braking action is taken, the vehicle collides with the far-side crossing scooter, and the collision position is at 50% of the width of the vehicle front structure. The STA moves at 20 km/h perpendicular to the traveling direction of the vehicle. The VUT is tested at speeds of 30 km/h, 40 km/h, 50 km/h and 60 km/h. The impact position at 50% corresponds to point "Z" in the figure. VUT stands for Vehicle Under Test: the test vehicle is equipped with the relevant ADAS functional systems and is the vehicle being tested. Point H represents the origin on the STA side, the contact point for the collision. Axes: XX represents the trajectory of the STA H point; YY represents the centerline of the test vehicle. Distances: C1 denotes the STA acceleration distance; D1 denotes the STA constant-speed distance. Point: Z represents the 50% collision point of the far-side scene.
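One way to synchronize the crossing scenario is to match travel times so that the VUT and the STA reach collision point Z at the same instant. The sketch below assumes both are already at their constant test speeds (i.e. past the acceleration segments C1).

```python
def vut_start_distance(d_sta_m: float, v_sta_kmh: float, v_vut_kmh: float) -> float:
    """Distance the VUT must be from collision point Z, given that the STA is
    d_sta_m from Z, so that both arrive at Z simultaneously at constant speed."""
    t_to_impact = d_sta_m / (v_sta_kmh / 3.6)   # seconds until the STA reaches Z
    return (v_vut_kmh / 3.6) * t_to_impact      # metres the VUT covers in that time

# STA at 20 km/h, 10 m from Z; VUT tested at 40 km/h
print(round(vut_start_distance(10.0, 20.0, 40.0), 1))  # 20.0
```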
As shown in FIG. 3, CBLA stands for Car-to-Bicyclist Longitudinal Adult, i.e. the vehicle collides with a bicycle traveling longitudinally ahead. CBLA-25 represents the scenario in which, with no braking measures taken, the vehicle collides with the bicycle ahead and the collision position is at 25% of the width of the vehicle front-end structure; CBLA-50 is the corresponding scenario with the collision position at 50%. The left side of FIG. 3 shows the CBLA-50 scenario: the BTA moves at 15 km/h in the same direction as the vehicle, and the VUT is tested at 20 km/h, 30 km/h, 40 km/h, 50 km/h and 60 km/h; the impact position at 50% corresponds to point "U". The right side of FIG. 3 shows the CBLA-25 scenario: the BTA moves at 15 km/h in the same direction as the vehicle, and the VUT is tested at 50 km/h, 60 km/h, 70 km/h and 80 km/h; the impact position at 25% corresponds to point "V". Axes: TT represents the trajectory of the rear tire of the BTA; YY represents the centerline of the test vehicle. Distances: Q represents the BTA acceleration distance during the AEB test; R represents the BTA acceleration distance during the FCW test; S represents the BTA constant-speed distance. Points: U represents the 50% collision point of the longitudinal scene; V represents the 25% collision point of the longitudinal scene.
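The 25%/50% collision positions can be translated into a lateral offset of the target path from the VUT centerline YY. The convention below (0.5 means a centered hit) and the vehicle width are assumptions for illustration, not values from the patent.

```python
def target_lateral_offset(overlap_fraction: float, vut_width_m: float) -> float:
    """Lateral offset of the target path from the VUT centerline (YY) so the
    impact lands at the given fraction of the front-end width.
    Assumed convention: 0.5 -> centered hit (point U), 0.25 -> offset hit (point V)."""
    if not 0.0 <= overlap_fraction <= 1.0:
        raise ValueError("overlap fraction must be in [0, 1]")
    return (0.5 - overlap_fraction) * vut_width_m

print(target_lateral_offset(0.5, 1.8))   # 0.0  (CBLA-50, point U)
print(target_lateral_offset(0.25, 1.8))  # 0.45 (CBLA-25, point V)
```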
As shown in FIG. 4, the test vehicle and the target vehicle both travel straight at a constant 50 km/h, and during travel the lateral distance between the outermost edge of the test vehicle body (the side close to the target vehicle, excluding the exterior rear-view mirror) and the outermost edge of the target vehicle body (the side close to the test vehicle, excluding the exterior rear-view mirror) is kept at 6.5 m. When the target vehicle has crossed line B in FIG. 5 and is completely behind line C, it changes lane from the rear side of the test vehicle at a lateral speed of (0.5 ± 0.25) m/s until the lateral distance between the two vehicles is 1.5 m. After completing the lane change, with the target vehicle still past line B and completely behind line C, the target vehicle keeps traveling straight for at least 300 ms, then changes lane back to the initial lane, and the test ends. After completion, the test should be repeated on the other side of the test vehicle.
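The cut-in sequence above (close the gap from 6.5 m to 1.5 m at the nominal 0.5 m/s lateral speed, hold straight for at least 300 ms, then return) can be sketched as a piecewise lateral-gap profile. The function name and the symmetric return leg are illustrative assumptions.

```python
def lateral_gap(t: float, gap0: float = 6.5, gap1: float = 1.5,
                vy: float = 0.5, hold_s: float = 0.3) -> float:
    """Lateral gap (m) between target and test vehicle during one
    cut-in / hold / return cycle, using the nominal 0.5 m/s lateral speed."""
    t_move = (gap0 - gap1) / vy          # 10 s to close from 6.5 m to 1.5 m
    if t <= t_move:
        return gap0 - vy * t             # lane change toward the test vehicle
    if t <= t_move + hold_s:
        return gap1                      # hold straight for >= 300 ms
    return min(gap0, gap1 + vy * (t - t_move - hold_s))  # return to initial lane

print(lateral_gap(0.0))   # 6.5
print(lateral_gap(10.0))  # 1.5
print(lateral_gap(10.2))  # 1.5
```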
As shown in FIG. 5, the blind-spot monitoring range (detection coverage area) is defined as follows. The lines drawn in FIG. 5 illustrate the blind-spot monitoring warning requirements. Descriptions such as right side, left side and rear refer to the traveling direction of the test vehicle, and all dimensions are given relative to the test vehicle. Line A is parallel to the rear edge of the test vehicle and is located 30.0 m behind it. Line B is parallel to the rear edge of the test vehicle and is located 3.0 m behind it. Line C is parallel to the front edge of the test vehicle and passes through the center of the ninety-fifth-percentile eye ellipse. Line D is a bidirectional extension of the front edge of the test vehicle. Line E is parallel to the centerline of the test vehicle and lies along the outermost left edge of the test vehicle body (excluding the exterior rear-view mirror). Line F is parallel to the centerline and is located 0.5 m to the left of the outermost left edge of the body. Line G is parallel to the centerline and is located 3.0 m to the left of the outermost left edge. Line H is parallel to the centerline and is located 6.0 m to the left of the outermost left edge. Line J is parallel to the centerline and lies along the outermost right edge of the test vehicle body (excluding the exterior rear-view mirror). Line K is parallel to the centerline and is located 0.5 m to the right of the outermost right edge.
Line L is parallel to the centerline and is located 3.0 m to the right of the outermost right edge of the test vehicle body. Line M is parallel to the centerline and is located 6.0 m to the right of the outermost right edge. Line N is a bidirectional extension of the rear edge of the test vehicle. Line O is parallel to the rear edge of the test vehicle and is located 10.0 m behind it. Here 1 denotes the test vehicle; 2 denotes the center of the ninety-fifth-percentile eye ellipse, which must meet the requirements of GB/T 36606-2018 and is applied by reference to category N1 vehicles; 3 denotes the area enclosed by FCGB, the left blind-spot monitoring range of the vehicle under the straight-line condition; 4 denotes the area enclosed by KCLB, the right blind-spot monitoring range of the vehicle under the straight-line condition.
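The left blind-spot monitoring area FCGB can be checked with a simple rectangle test in vehicle coordinates. Only the bounds come from the description (lines F and G at 0.5 m and 3.0 m left of the body edge, line C at the eye ellipse, line B 3.0 m behind the rear edge); the example geometry in the usage is assumed.

```python
def in_left_blind_zone(x_m: float, y_m: float,
                       x_eye_m: float, x_rear_m: float,
                       y_body_left_m: float) -> bool:
    """Point-in-zone test for the left blind-spot monitoring range (area FCGB).
    Coordinates: x forward along the vehicle centerline, y positive to the left.
    x_eye_m       -- longitudinal position of line C (eye-ellipse center)
    x_rear_m      -- longitudinal position of the rear edge (line B is 3.0 m behind)
    y_body_left_m -- lateral position of the left body edge (line E)."""
    in_long = (x_rear_m - 3.0) <= x_m <= x_eye_m           # between lines B and C
    in_lat = (y_body_left_m + 0.5) <= y_m <= (y_body_left_m + 3.0)  # between F and G
    return in_long and in_lat

# Example geometry (assumed): rear edge at x = 0, eye ellipse 2.5 m ahead of it,
# left body edge at y = 0.9 m.
print(in_left_blind_zone(-1.0, 2.0, x_eye_m=2.5, x_rear_m=0.0, y_body_left_m=0.9))  # True
print(in_left_blind_zone(3.0, 2.0, x_eye_m=2.5, x_rear_m=0.0, y_body_left_m=0.9))   # False
```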
By analyzing and summarizing the required test scenes, the invention establishes a crossing model, a parallel model and a curve model, and then automatically generates test scenes according to the different models and different test parameters (including, but not limited to, the speed of the vehicle under test, the overlap (bias) rate of the vehicle under test, the speed of the test target, the acceleration distance of the test target, and so on).
The embodiment of the invention also discloses an automatic generation system for the intelligent driving ADAS test scene, which comprises the following modules: an analysis module: and analyzing the test requirements.
A model selection module: selects the model according to the test requirement. The models include a crossing model, a parallel model and a curve model. The crossing model is used for road-crossing tests with different targets. The parallel model is used for car-following emergency-braking tests and overtaking-related tests with different targets. The curve model is used for lane-change tests and other curve-scenario tests with different targets. The model is a preset motion trajectory for the vehicle under test and the target.
A target selection module: and selecting a target according to the test requirement, and placing the target on the model. Targets include child dummy, adult dummy, bicycle, motorcycle, and automobile; and classifying the test parameters according to different targets.
A parameter selection module: selects and formulates parameters according to the target. The parameters include, but are not limited to, the speed of the vehicle under test, the overlap (bias) rate of the vehicle under test, the speed of the test target, and the acceleration distance of the test target.
A scene generation module: and generating a test scene according to the model, the target and the parameters, and performing the ADAS test by using the generated test scene.
Those skilled in the art will appreciate that, in addition to implementing the system and its various devices, modules, units provided by the present invention as pure computer readable program code, the system and its various devices, modules, units provided by the present invention can be fully implemented by logically programming method steps in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Therefore, the system and various devices, modules and units thereof provided by the invention can be regarded as a hardware component, and the devices, modules and units included in the system for realizing various functions can also be regarded as structures in the hardware component; means, modules, units for performing the various functions may also be regarded as structures within both software modules and hardware components for performing the method.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes or modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention. The embodiments and features of the embodiments of the present application may be combined with each other arbitrarily without conflict.
Claims (10)
1. An automatic generation method for an intelligent driving ADAS test scene is characterized by comprising the following steps:
an analysis step: analyzing the test requirement;
a model selection step: selecting a model according to the test requirement;
a target selection step: selecting a target according to the test requirement, and placing the target on the model;
a parameter selection step: selecting and formulating parameters according to the target;
a scene generation step: and generating a test scene according to the model, the target and the parameters, and performing the ADAS test by using the generated test scene.
2. The automatic generation method for an intelligent driving ADAS test scene according to claim 1, wherein in the parameter selection step the parameters include, but are not limited to, the speed of the vehicle under test, the overlap (bias) rate of the vehicle under test, the speed of the test target, and the acceleration distance of the test target.
3. The automatic generation method for an intelligent driving ADAS test scene according to claim 1, wherein in the model selection step the models include a crossing model, a parallel model and a curve model;
the crossing model is used for road-crossing tests with different targets;
the parallel model is used for car-following emergency-braking tests and overtaking-related tests with different targets;
the curve model is used for lane-change tests and other curve-scenario tests with different targets.
4. The automated intelligent driving ADAS test scenario generation method of claim 3, wherein in the model selection step, the model is a preset motion trajectory for the vehicle under test and the target.
5. The intelligent driving ADAS test scenario automated generation method of claim 1, wherein in the target selection step, the targets include child dummy, adult dummy, bicycle, motorcycle, and car; and classifying the test parameters according to different targets.
6. An automatic generation system for an intelligent driving ADAS test scene, characterized by comprising the following modules:
an analysis module: analyzing the test requirement;
a model selection module: selecting a model according to the test requirement;
a target selection module: selecting a target according to the test requirement, and placing the target on the model;
a parameter selection module: selecting and formulating parameters according to the target;
a scene generation module: and generating a test scene according to the model, the target and the parameters, and performing the ADAS test by using the generated test scene.
7. The automatic generation system for an intelligent driving ADAS test scene according to claim 6, wherein in the parameter selection module the parameters include, but are not limited to, the speed of the vehicle under test, the overlap (bias) rate of the vehicle under test, the speed of the test target, and the acceleration distance of the test target.
8. The automatic generation system for an intelligent driving ADAS test scene according to claim 6, wherein in the model selection module the models include a crossing model, a parallel model and a curve model;
the crossing model is used for road-crossing tests with different targets;
the parallel model is used for car-following emergency-braking tests and overtaking-related tests with different targets;
the curve model is used for lane-change tests and other curve-scenario tests with different targets.
9. The automated intelligent driving ADAS test scenario generation system of claim 8, wherein in the model selection module, the model is a preset motion trajectory for the vehicle under test and the target.
10. The intelligent driving ADAS test scenario automated generation system of claim 6, wherein in the target selection module, the targets include a child dummy, an adult dummy, a bicycle, a motorcycle, and a car; and the test parameters are classified according to the different targets.
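The five claimed modules (analysis, model selection, target selection, parameter selection, scene generation) can be sketched as a minimal pipeline. All class names, dictionary keys, mappings, and numeric values below are assumptions for illustration only; the patent does not specify an implementation:

```python
from dataclasses import dataclass

# Illustrative sketch of the claimed five-module pipeline (claims 6-10).
# Names, mappings, and values are assumptions, not taken from the patent.

MODELS = {"cross", "parallel", "curve"}                                    # claim 8
TARGETS = {"child_dummy", "adult_dummy", "bicycle", "motorcycle", "car"}   # claim 10

@dataclass
class TestScenario:
    model: str    # road model with preset motion trajectories (claim 9)
    target: str
    params: dict  # speeds, offset rate, acceleration distance (claim 7)

def analyze_requirement(requirement: dict) -> dict:
    """Analysis module: extract the ADAS function, maneuver and target under test."""
    return {
        "function": requirement["function"],
        "maneuver": requirement["maneuver"],
        "target": requirement.get("target", "adult_dummy"),
    }

def select_model(req: dict) -> str:
    """Model selection module: map the maneuver to a road model (assumed mapping)."""
    maneuver_to_model = {
        "crossing": "cross",
        "following": "parallel",
        "overtaking": "parallel",
        "lane_change": "curve",
    }
    return maneuver_to_model[req["maneuver"]]

def select_target(req: dict) -> str:
    """Target selection module: pick the target to place on the model."""
    assert req["target"] in TARGETS
    return req["target"]

def select_parameters(target: str) -> dict:
    """Parameter selection module: parameters classified per target (claim 10)."""
    base = {"ego_speed_kph": 30, "ego_offset_pct": 0, "target_speed_kph": 5}
    if target in ("car", "motorcycle"):
        base["target_speed_kph"] = 20  # assumed higher default for faster targets
    return base

def generate_scenario(requirement: dict) -> TestScenario:
    """Scene generation module: combine model, target and parameters into a scenario."""
    req = analyze_requirement(requirement)
    model = select_model(req)
    target = select_target(req)
    return TestScenario(model=model, target=target, params=select_parameters(target))

scenario = generate_scenario(
    {"function": "AEB", "maneuver": "crossing", "target": "child_dummy"}
)
print(scenario.model, scenario.target)  # cross child_dummy
```

The design point carried by the claims is the decoupling: the road model, the target, and the parameter set are chosen independently from the test requirement and only combined in the final scene generation step.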
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111210083.3A CN113945224A (en) | 2021-10-18 | 2021-10-18 | Automatic generation method and system for intelligent driving ADAS test scene |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113945224A true CN113945224A (en) | 2022-01-18 |
Family
ID=79331065
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111210083.3A Pending CN113945224A (en) | 2021-10-18 | 2021-10-18 | Automatic generation method and system for intelligent driving ADAS test scene |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113945224A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115906273A (en) * | 2022-09-29 | 2023-04-04 | 苏州魔视智能科技有限公司 | AEB system calibration method and device and electronic equipment |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107436986A (en) * | 2017-08-07 | 2017-12-05 | 北京汽车研究总院有限公司 | The integrating device and Simulation Application method of active safety systems of vehicles |
CN108829087A (en) * | 2018-07-19 | 2018-11-16 | 山东省科学院自动化研究所 | A kind of intelligent test system and test method of autonomous driving vehicle |
CN109324539A (en) * | 2018-08-28 | 2019-02-12 | 山东省科学院自动化研究所 | The intelligent control platform and method of a kind of automatic Pilot closed test field |
CN109416257A (en) * | 2016-06-27 | 2019-03-01 | 御眼视觉技术有限公司 | Based on the main vehicle of the Characteristics Control that parks cars detected |
CN109993849A (en) * | 2019-03-22 | 2019-07-09 | 山东省科学院自动化研究所 | A kind of automatic Pilot test scene render analog method, apparatus and system |
CN111179585A (en) * | 2018-11-09 | 2020-05-19 | 上海汽车集团股份有限公司 | Site testing method and device for automatic driving vehicle |
CN112180892A (en) * | 2020-09-11 | 2021-01-05 | 苏州智行众维智能科技有限公司 | Intelligent driving vehicle testing method based on field-in-loop |
CN112506170A (en) * | 2020-11-20 | 2021-03-16 | 北京赛目科技有限公司 | Driver model based test method and device |
CN112526893A (en) * | 2020-10-30 | 2021-03-19 | 长安大学 | Test system of intelligent automobile |
CN113495005A (en) * | 2020-03-20 | 2021-10-12 | 安东尼百思特动力有限公司 | Target vehicle for ADAS test |
Non-Patent Citations (1)
Title |
---|
SUN Tao; DING Qinqin; LI Weibing; LI Juan: "Design and Implementation of an ADAS System Test Platform" (ADAS系统测试平台设计及实现), China Measurement & Test (中国测试), no. 04, 30 April 2019 (2019-04-30), pages 151 - 156 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108313054B (en) | The autonomous lane-change decision-making technique of automatic Pilot and device and automatic driving vehicle | |
CN110647056B (en) | Intelligent networking automobile environment simulation system based on whole automobile hardware-in-loop | |
Liu et al. | Crash comparison of autonomous and conventional vehicles using pre-crash scenario typology | |
US20190155291A1 (en) | Methods and systems for automated driving system simulation, validation, and implementation | |
CN110304074B (en) | Hybrid driving method based on layered state machine | |
US20190143992A1 (en) | Self-driving learning apparatus and method using driving experience information | |
CN111795832B (en) | Intelligent driving vehicle testing method, device and equipment | |
CN113050455A (en) | Digital twin test system for intelligent networked automobile and control method | |
CN108647437A (en) | A kind of autonomous driving vehicle evaluation method and evaluation system | |
CN112285740A (en) | Vehicle condition detection | |
CN112987711B (en) | Optimization method of automatic driving regulation algorithm and simulation testing device | |
CN111595597B (en) | Method for testing AEB VRU performance in complex environment | |
CN109910880B (en) | Vehicle behavior planning method and device, storage medium and terminal equipment | |
CN113635897A (en) | Safe driving early warning method based on risk field | |
Hyeon et al. | Influence of speed forecasting on the performance of ecological adaptive cruise control | |
CN113945224A (en) | Automatic generation method and system for intelligent driving ADAS test scene | |
Xu et al. | A vehicle model for micro-traffic simulation in dynamic urban scenarios | |
Zhang et al. | Aerial dataset for china congested highway & expressway and its potential applications in automated driving systems development | |
CN111169473B (en) | Vehicle body language interaction data fusion method and system based on GroudTruth | |
CN117008574A (en) | Intelligent network allies oneself with car advanced auxiliary driving system and autopilot system test platform | |
US11654938B1 (en) | Methods and apparatus for disengaging an autonomous mode based on lateral error of an autonomous vehicle | |
CN114261399A (en) | Decision planning method for intelligent driving of automobile under ice and snow road surface | |
Che et al. | A test method for self-driving vehicle based on mixed reality | |
US20240220233A1 (en) | Computer-implemented method for the automated testing and release of vehicle functions | |
Das et al. | Scenario-Based Validation Approach for Commercial Vehicles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
20220401 | TA01 | Transfer of patent application right | Applicant after: Shangzhilian Testing Technology (Shanghai) Co., Ltd., room 503, building 27, No. 6055, Jinhai Highway, Fengxian District, Shanghai 201499; Applicant before: Shanghai Intelligent Network Automobile Technology Center Co., Ltd., building 22, No. 6055, Jinhai Highway, Fengxian District, Shanghai 201499 |