CN110954341A - Test method and system for intelligent networking automobile test scene - Google Patents


Info

Publication number
CN110954341A
Authority
CN
China
Prior art keywords
scene
image
test
image acquisition
tested
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911282427.4A
Other languages
Chinese (zh)
Other versions
CN110954341B (en)
Inventor
罗浩轩
陈涛
张强
杨良义
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Automotive Engineering Research Institute Co Ltd
Original Assignee
China Automotive Engineering Research Institute Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Automotive Engineering Research Institute Co Ltd filed Critical China Automotive Engineering Research Institute Co Ltd
Priority to CN201911282427.4A priority Critical patent/CN110954341B/en
Publication of CN110954341A publication Critical patent/CN110954341A/en
Application granted granted Critical
Publication of CN110954341B publication Critical patent/CN110954341B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01MTESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00Testing of vehicles
    • G01M17/007Wheeled or endless-tracked vehicles

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention provides a test method and a test system for an intelligent networked automobile test scene. The system comprises a scene vehicle to be tested arranged on a scene test bench, a scene test display screen arranged right in front of the scene vehicle to be tested, and M image acquisition devices arranged on the scene vehicle to be tested. The scene vehicle controller to be tested controls the corresponding motion of the scene vehicle to be tested according to the test scene displayed on the scene test display screen and acquired by the image acquisition devices; the test data signal output end of the scene vehicle controller to be tested is connected with the test data signal input end of the test scene controller, and the motion state of the scene vehicle to be tested is displayed in simulated form on the scene test display screen. For the various complex and changeable test scenes, the invention can solve problems such as uneven distribution of scene test proportions and incomplete coverage or missed testing of key scenes, and promotes the development of intelligent networked automobiles.

Description

Test method and system for intelligent networking automobile test scene
Technical Field
The invention relates to the technical field of intelligent networked automobiles, in particular to a testing method and a testing system for an intelligent networked automobile testing scene.
Background
An intelligent networked automobile, or ICV (Intelligent Connected Vehicle), is the organic combination of the internet of vehicles and the intelligent automobile. It is a new generation of automobile that carries advanced on-board sensors, controllers, actuators and other devices, integrates modern communication and network technologies, realizes intelligent information exchange and sharing among vehicle, person, road and background, achieves safe, comfortable, energy-saving and efficient driving, and can ultimately operate in place of a human driver. Full realization of the intelligent networked automobile will require a lengthy process; from the angle of technical development it divides into two stages: the first stage, assisted driving, and the second stage, unmanned driving. During this development, every automobile manufacturer and parts supplier must repeatedly test the intelligent networked automobile for given systems and functions so that it meets the expected requirements. However, because the test scenes are complex and changeable, the test process suffers from problems such as uneven distribution of scene test proportions and incomplete coverage or missed testing of key scenes, which greatly hinders the development of intelligent networked automobiles.
Disclosure of Invention
The invention aims to solve at least the above technical problems in the prior art, and in particular provides a test method and a test system for an intelligent networked automobile test scene.
To achieve this purpose, the invention provides a test system for an intelligent networked automobile test scene, comprising a scene vehicle to be tested arranged on a scene test bench. A scene test display screen is arranged right in front of the scene vehicle to be tested, and the display signal input end of the scene test display screen is connected with the display signal output end of the test scene controller. The scene vehicle to be tested is provided with M image acquisition devices, M being a positive integer greater than or equal to 1, namely the 1st to M-th image acquisition devices; the image signal output end of the i-th image acquisition device is connected with the i-th image signal input end of the scene vehicle controller to be tested, i being a positive integer less than or equal to M. The scene vehicle controller to be tested controls the corresponding motion of the scene vehicle to be tested according to the test scene displayed on the scene test display screen and acquired by the image acquisition devices; the test data signal output end of the scene vehicle controller to be tested is connected with the test data signal input end of the test scene controller, and the motion state of the scene vehicle to be tested is displayed in simulated form on the scene test display screen.
In a preferred embodiment of the invention, four image acquisition devices are arranged on the scene vehicle to be tested, namely the 1st, 2nd, 3rd and 4th image acquisition devices; the image signal output end of the 1st image acquisition device is connected with the 1st image signal input end of the scene vehicle controller to be tested, the image signal output end of the 2nd image acquisition device is connected with the 2nd image signal input end of the scene vehicle controller to be tested, the image signal output end of the 3rd image acquisition device is connected with the 3rd image signal input end of the scene vehicle controller to be tested, and the image signal output end of the 4th image acquisition device is connected with the 4th image signal input end of the scene vehicle controller to be tested;
the 1st image acquisition device is arranged at the left side of the head of the scene vehicle to be tested and can be deflected leftward, rightward, upward and downward by α1°, β1°, γ1° and χ1° in turn, where α1, β1, γ1 and χ1 are each greater than or equal to 0 and less than or equal to 90; the 2nd image acquisition device is arranged at the right side of the head of the scene vehicle to be tested and can be deflected leftward, rightward, upward and downward by α2°, β2°, γ2° and χ2° in turn, where α2, β2, γ2 and χ2 are each greater than or equal to 0 and less than or equal to 90; the 3rd image acquisition device is arranged at the left side of the tail of the scene vehicle to be tested and can be deflected leftward, rightward, upward and downward by α3°, β3°, γ3° and χ3° in turn, where α3, β3, γ3 and χ3 are each greater than or equal to 0 and less than or equal to 90; the 4th image acquisition device is arranged at the right side of the tail of the scene vehicle to be tested and can be deflected leftward, rightward, upward and downward by α4°, β4°, γ4° and χ4° in turn, where α4, β4, γ4 and χ4 are each greater than or equal to 0 and less than or equal to 90;
the scene vehicle controller to be tested sends control signals to the 1st, 2nd, 3rd and 4th image acquisition devices respectively and adjusts their positions, so that the image information of the scene test display screen that it collects is the full-screen image information of the scene test display screen.
In a preferred embodiment of the present invention, the scene test display screen is an annular display screen;
the image acquisition device is a wide-angle camera or a telephoto camera.
The invention also discloses a test method for the intelligent networked automobile test scene, comprising the following steps:
S1, the test scene controller loads test scenes in sequence for the scene vehicle to be tested according to its type;
S2, the scene vehicle controller to be tested adjusts the acquisition angles of the image acquisition devices according to the test scene displayed on the scene test display screen and acquired by them;
and S3, counting the number of failures of the scene vehicle to be tested in the loaded test scenes, calculating the failure probability to obtain the test weight of each test scene, and loading the test scenes for the next scene vehicle to be tested according to the test weights.
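The S1 to S3 loop above can be sketched minimally in Python. The scenario records, the `run_scenario` hook, the `rounds` parameter and the `base_score` field are illustrative assumptions, not names from the patent:

```python
# Illustrative sketch (not the patent's implementation) of the S1-S3 loop:
# load scenarios in weight order, run them, count failures, and reweight.

def run_test_campaign(scenarios, run_scenario, rounds=3):
    """scenarios: list of dicts with 'id', 'weight' and 'base_score' keys.
    run_scenario(s) -> True on pass, False on failure (hypothetical hook)."""
    failures = {s["id"]: 0 for s in scenarios}
    total_runs = 0
    for _ in range(rounds):
        # S1: load scenarios for the vehicle under test, highest weight first
        for s in sorted(scenarios, key=lambda s: s["weight"], reverse=True):
            total_runs += 1
            # S2: the controller would adjust the camera angles here (omitted)
            if not run_scenario(s):
                failures[s["id"]] += 1
        # S3: failure probability C(p) reweights the scenarios for the next round
        for s in scenarios:
            c_p = failures[s["id"]] / total_runs
            s["weight"] = c_p * s.get("base_score", 1.0)
    return failures
```

Scenarios that never fail drop to weight zero, so later rounds concentrate on the scenes where the vehicle under test actually fails.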
In a preferred embodiment of the present invention, step S1 includes:
determining the three-dimensional coordinates of a corresponding test scene by taking the driving behavior of the scene vehicle to be tested as the first dimension, the behavior of the target object as the second dimension and the environment as the third dimension. The driving behavior of the scene vehicle to be tested is divided into two sub-dimensions: the first sub-dimension comprises one or any combination of acceleration, deceleration and uniform speed, and the second sub-dimension comprises one or any combination of straight-line driving, curve driving, steering, lane changing, reversing and parking. The target object comprises one or any combination of an automobile, a two-wheeled vehicle and a pedestrian, and the target object behavior is established according to the characteristics of the target object;
the environment comprises sub-dimensions established from one or any combination of road type, weather condition and illumination condition. If a test scene contains no target object, the scene is established from only two dimensions: the driving behavior of the scene vehicle to be tested and the environment. Accordingly, a test scene is established by the following steps:
S11, determining the driving behavior of the scene vehicle to be tested from its two sub-dimensions, so as to determine the first-dimension coordinate;
S12, determining the second-dimension coordinate: 1) first determine the target object type as one of an automobile, a two-wheeled vehicle or a pedestrian, or any combination thereof; 2) determine the target object behavior according to the type and characteristics of the target object;
S13, determining the third-dimension coordinate from one or any combination of road type, weather condition and illumination condition;
and S14, determining the proportion of each test scene, arranging the test scenes in descending order of proportion, and loading them into the test scene controller in that order.
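The scenario coordinates of steps S11 to S13 can be enumerated mechanically as a Cartesian product of the three dimensions. The value lists below are small illustrative examples, not the patent's exhaustive sets:

```python
from itertools import product

# Hypothetical value sets for each dimension (examples only)
dim1 = [(speed, path) for speed in ("accelerate", "decelerate", "uniform")
        for path in ("straight", "curve", "lane-change")]   # host driving behavior
dim2 = ["car", "two-wheeler", "pedestrian", None]           # target object (None = absent)
dim3 = [(road, weather) for road in ("urban", "highway")
        for weather in ("clear", "rain")]                   # environment

# A scenario coordinate is the tuple (x, y, z); scenes without a target
# object (y is None) reduce to the two-dimensional case described in the text.
scenarios = [(x, y, z) for x, y, z in product(dim1, dim2, dim3)]
```

With these example sets the enumeration yields 9 × 4 × 4 = 144 candidate scenes, which would then be ranked by the proportion computed in S14.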
In a preferred embodiment of the present invention, in step S14, the proportion of each test scene is determined by the following steps:
S141, from existing traffic accident big data, obtain the occurrence probability of each coordinate of the three dimensions, Px, Py and Pz, where Px represents the probability of occurrence of the first-dimension coordinate x, Py represents the probability of occurrence of the second-dimension coordinate y, and Pz represents the probability of occurrence of the third-dimension coordinate z;
S142, according to the occurred events, obtain the accident severity bk of each three-dimensional coordinate, where a minor accident b1 is recorded as 1 point, a general accident b2 as 2 points, a major accident b3 as 3 points, and an extraordinarily major accident b4 as 4 points; k is a positive integer less than or equal to 4;
S143, determining the weight score of a test scene from the two indexes, the occurrence probability and the accident severity, corresponding to its three-dimensional coordinates; the weight score of a test scene is calculated as:

Sx,y,z = Px × Py × Pz × bk

where Px represents the probability of occurrence of the first-dimension coordinate x, Py the probability of occurrence of the second-dimension coordinate y, Pz the probability of occurrence of the third-dimension coordinate z, and bk the severity of the occurred accident.
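Under the assumption that the weight score combines the three occurrence probabilities and the severity grade multiplicatively (the published formula image is not reproduced in this text, so the exact expression is reconstructed), the calculation can be sketched as:

```python
def scenario_weight(p_x, p_y, p_z, b_k):
    """Weight score S_{x,y,z} = P_x * P_y * P_z * b_k, where b_k in {1, 2, 3, 4}
    grades accident severity from minor (1) to extraordinarily major (4)."""
    if b_k not in (1, 2, 3, 4):
        raise ValueError("severity grade must be 1-4")
    return p_x * p_y * p_z * b_k
```

For example, a scene whose dimension probabilities are 0.3, 0.2 and 0.5 with a major accident (grade 3) would score 0.3 × 0.2 × 0.5 × 3 = 0.09.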
In a preferred embodiment of the present invention, in step S2, the acquisition angles of the image acquisition devices are adjusted by the following steps:
S21, initially, the scene vehicle controller to be tested sends a first control signal to each of the 1st, 2nd, 3rd and 4th image acquisition devices in turn so that αj = βj = γj = χj = 0, j = 1, 2, 3, 4;
S22, the 1st, 2nd, 3rd and 4th image acquisition devices respectively acquire the scene image displayed on the scene test display screen, and the difference between each acquired scene image and the displayed scene image is judged;
S23, according to the scene image differences of step S22, the scene vehicle controller to be tested sends a second control signal to each of the 1st, 2nd, 3rd and 4th image acquisition devices in turn, so that the scene images they acquire are consistent with the scene displayed on the scene test display screen.
In a preferred embodiment of the present invention, in step S23, the scene vehicle controller to be tested makes the scene images acquired by the 1st, 2nd, 3rd and 4th image acquisition devices consistent with the scene displayed on the scene test display screen as follows:
Adjustment angle of the 1st image acquisition device:
u1 = U1 ∩ F, where U1 is the scene image information acquired by the 1st image acquisition device, F is the scene image information displayed by the scene test display screen, and ∩ takes the set of identical parts of the two scene images;
if u1 = F, then α1 = β1 = γ1 = χ1 = 0;
if u1 ⊊ F, then the nonzero deflection angle is σ(1 − Su1/SF)·arctan(l1/l1′), where Su1 is the area of the identical partial scene image, SF is the area of the scene image displayed by the scene test display screen, l1 is the distance from the 1st image acquisition device to the scene test display screen, and l1′ is the focal length of the 1st image acquisition device; if u1 lies in the left half of the displayed scene image, α1 = γ1 = χ1 = 0; if u1 lies in the right half, β1 = γ1 = χ1 = 0; if u1 lies in the upper half, α1 = β1 = γ1 = 0; if u1 lies in the lower half, α1 = β1 = χ1 = 0; σ is the first angle-adjustment proportion factor;
Adjustment angle of the 2nd image acquisition device:
u2 = U2 ∩ F, where U2 is the scene image information acquired by the 2nd image acquisition device, F is the scene image information displayed by the scene test display screen, and ∩ takes the set of identical parts of the two scene images;
if u2 = F, then α2 = β2 = γ2 = χ2 = 0;
if u2 ⊊ F, then the nonzero deflection angle is δ(1 − Su2/SF)·arctan(l2/l2′), where Su2 is the area of the identical partial scene image, SF is the area of the scene image displayed by the scene test display screen, l2 is the distance from the 2nd image acquisition device to the scene test display screen, and l2′ is the focal length of the 2nd image acquisition device; if u2 lies in the left half of the displayed scene image, α2 = γ2 = χ2 = 0; if u2 lies in the right half, β2 = γ2 = χ2 = 0; if u2 lies in the upper half, α2 = β2 = γ2 = 0; if u2 lies in the lower half, α2 = β2 = χ2 = 0; δ is the second angle-adjustment proportion factor;
Adjustment angle of the 3rd image acquisition device:
u3 = U3 ∩ F, where U3 is the scene image information acquired by the 3rd image acquisition device, F is the scene image information displayed by the scene test display screen, and ∩ takes the set of identical parts of the two scene images;
if u3 = F, then α3 = β3 = γ3 = χ3 = 0;
if u3 ⊊ F, then the nonzero deflection angle is ρ(1 − Su3/SF)·arctan(l3/l3′), where Su3 is the area of the identical partial scene image, SF is the area of the scene image displayed by the scene test display screen, l3 is the distance from the 3rd image acquisition device to the scene test display screen, and l3′ is the focal length of the 3rd image acquisition device; if u3 lies in the left half of the displayed scene image, α3 = γ3 = χ3 = 0; if u3 lies in the right half, β3 = γ3 = χ3 = 0; if u3 lies in the upper half, α3 = β3 = γ3 = 0; if u3 lies in the lower half, α3 = β3 = χ3 = 0; ρ is the third angle-adjustment proportion factor;
Adjustment angle of the 4th image acquisition device:
u4 = U4 ∩ F, where U4 is the scene image information acquired by the 4th image acquisition device, F is the scene image information displayed by the scene test display screen, and ∩ takes the set of identical parts of the two scene images;
if u4 = F, then α4 = β4 = γ4 = χ4 = 0;
if u4 ⊊ F, then the nonzero deflection angle is ω(1 − Su4/SF)·arctan(l4/l4′), where Su4 is the area of the identical partial scene image, SF is the area of the scene image displayed by the scene test display screen, l4 is the distance from the 4th image acquisition device to the scene test display screen, and l4′ is the focal length of the 4th image acquisition device; if u4 lies in the left half of the displayed scene image, α4 = γ4 = χ4 = 0; if u4 lies in the right half, β4 = γ4 = χ4 = 0; if u4 lies in the upper half, α4 = β4 = γ4 = 0; if u4 lies in the lower half, α4 = β4 = χ4 = 0; ω is the fourth angle-adjustment proportion factor. Because the scene vehicle to be tested is not steady during the test, the test scene images acquired by the 1st to 4th image acquisition devices may be incomplete; the above adjustment ensures that the image displayed on the scene test display screen is acquired completely.
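The per-camera rule above can be sketched as follows. The deflection-magnitude formula and all function and parameter names are illustrative assumptions (the patent's exact expression is given only as an image), but the zeroing of the three non-deflecting angles per half follows the text:

```python
import math

def adjust_angle(overlap_area, screen_area, overlap_half, distance, focal, factor):
    """Sketch of one camera's adjustment: if the captured image already matches
    the full screen, no deflection; otherwise deflect toward the missing half by
    factor * (1 - S_u/S_F) * arctan(distance / focal), in degrees.
    overlap_half: which half of the screen the captured overlap lies in
    ('left' -> deflect right, 'right' -> left, 'upper' -> down, 'lower' -> up)."""
    left = right = up = down = 0.0
    if overlap_area >= screen_area:
        return left, right, up, down          # full-screen match: all angles zero
    theta = factor * (1 - overlap_area / screen_area) * math.degrees(
        math.atan(distance / focal))
    if overlap_half == "left":
        right = theta
    elif overlap_half == "right":
        left = theta
    elif overlap_half == "upper":
        down = theta
    else:
        up = theta
    return left, right, up, down
```

The same routine would be applied to each of the four cameras with its own proportion factor (σ, δ, ρ, ω), distance and focal length.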
In a preferred embodiment of the present invention, in step S3, the test weight of a test scene is calculated as follows:

C(p) = Pp / P′

where P′ is the total number of loaded scene tests, Pp is the number of failures of the scene test with serial number p, and C(p) is the failure probability of the scene test with serial number p;
V(p) = C(p) × Sx,y,z

where Sx,y,z is the weight score of the test scene and V(p) is the test weight of the test scene.
In conclusion, by adopting the above technical scheme, for the various complex and changeable test scenes, the problems of uneven distribution of scene test proportions and incomplete coverage or missed testing of key scenes can be solved, promoting the development of intelligent networked automobiles.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow diagram of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
The invention provides a test system for an intelligent networked automobile test scene, comprising a scene vehicle to be tested arranged on a scene test bench. A scene test display screen is arranged right in front of the scene vehicle to be tested, and the display signal input end of the scene test display screen is connected with the display signal output end of the test scene controller. The scene vehicle to be tested is provided with M image acquisition devices, M being a positive integer greater than or equal to 1, namely the 1st to M-th image acquisition devices; the image signal output end of the i-th image acquisition device is connected with the i-th image signal input end of the scene vehicle controller to be tested, i being a positive integer less than or equal to M. The scene vehicle controller to be tested controls the corresponding motion of the scene vehicle to be tested according to the test scene displayed on the scene test display screen and acquired by the image acquisition devices; the test data signal output end of the scene vehicle controller to be tested is connected with the test data signal input end of the test scene controller, and the motion state of the scene vehicle to be tested is displayed in simulated form on the scene test display screen.
In a preferred embodiment of the invention, four image acquisition devices are arranged on the scene vehicle to be tested, namely the 1st, 2nd, 3rd and 4th image acquisition devices; the image signal output end of the 1st image acquisition device is connected with the 1st image signal input end of the scene vehicle controller to be tested, the image signal output end of the 2nd image acquisition device is connected with the 2nd image signal input end of the scene vehicle controller to be tested, the image signal output end of the 3rd image acquisition device is connected with the 3rd image signal input end of the scene vehicle controller to be tested, and the image signal output end of the 4th image acquisition device is connected with the 4th image signal input end of the scene vehicle controller to be tested;
the 1st image acquisition device is arranged at the left side of the head of the scene vehicle to be tested and can be deflected leftward, rightward, upward and downward by α1°, β1°, γ1° and χ1° in turn, where α1, β1, γ1 and χ1 are each greater than or equal to 0 and less than or equal to 90; the 2nd image acquisition device is arranged at the right side of the head of the scene vehicle to be tested and can be deflected leftward, rightward, upward and downward by α2°, β2°, γ2° and χ2° in turn, where α2, β2, γ2 and χ2 are each greater than or equal to 0 and less than or equal to 90; the 3rd image acquisition device is arranged at the left side of the tail of the scene vehicle to be tested and can be deflected leftward, rightward, upward and downward by α3°, β3°, γ3° and χ3° in turn, where α3, β3, γ3 and χ3 are each greater than or equal to 0 and less than or equal to 90; the 4th image acquisition device is arranged at the right side of the tail of the scene vehicle to be tested and can be deflected leftward, rightward, upward and downward by α4°, β4°, γ4° and χ4° in turn, where α4, β4, γ4 and χ4 are each greater than or equal to 0 and less than or equal to 90;
the scene vehicle controller to be tested sends control signals to the 1st, 2nd, 3rd and 4th image acquisition devices respectively and adjusts their positions, so that the image information of the scene test display screen that it collects is the full-screen image information of the scene test display screen.
In a preferred embodiment of the present invention, the scene test display screen is an annular display screen;
the image acquisition device is a wide-angle camera or a telephoto camera.
The invention also discloses a test method for the intelligent networked automobile test scene which, as shown in Figure 1, comprises the following steps:
S1, the test scene controller loads test scenes in sequence for the scene vehicle to be tested according to its type;
S2, the scene vehicle controller to be tested adjusts the acquisition angles of the image acquisition devices according to the test scene displayed on the scene test display screen and acquired by them;
and S3, counting the number of failures of the scene vehicle to be tested in the loaded test scenes, calculating the failure probability to obtain the test weight of each test scene, and loading the test scenes for the next scene vehicle to be tested according to the test weights.
In a preferred embodiment of the present invention, step S1 includes:
determining the three-dimensional coordinates of a corresponding test scene by taking the driving behavior of the scene vehicle to be tested as the first dimension, the behavior of the target object as the second dimension and the environment as the third dimension. The driving behavior of the scene vehicle to be tested is divided into two sub-dimensions: the first sub-dimension comprises one or any combination of acceleration, deceleration and uniform speed, and the second sub-dimension comprises one or any combination of straight-line driving, curve driving, steering, lane changing, reversing and parking. The target object comprises one or any combination of an automobile, a two-wheeled vehicle and a pedestrian, and the target object behavior is established according to the characteristics of the target object;
the environment comprises sub-dimensions established from one or any combination of road type, weather condition and illumination condition. If a test scene contains no target object, the scene is established from only two dimensions: the driving behavior of the scene vehicle to be tested and the environment. Accordingly, a test scene is established by the following steps:
S11, determining the driving behavior of the scene vehicle to be tested from its two sub-dimensions, so as to determine the first-dimension coordinate;
S12, determining the second-dimension coordinate: 1) first determine the target object type as one of an automobile, a two-wheeled vehicle or a pedestrian, or any combination thereof; 2) determine the target object behavior according to the type and characteristics of the target object;
S13, determining the third-dimension coordinate from one or any combination of road type, weather condition and illumination condition;
and S14, determining the proportion of each test scene, arranging the test scenes in descending order of proportion, and loading them into the test scene controller in that order.
In a preferred embodiment of the present invention, in step S14, the proportion of each test scene is determined by the following steps:
S141, from existing traffic accident big data, obtain the occurrence probability of each coordinate (event) of the three dimensions, Px, Py and Pz, where Px represents the probability of occurrence of the first-dimension coordinate (event) x, Py represents the probability of occurrence of the second-dimension coordinate (event) y, and Pz represents the probability of occurrence of the third-dimension coordinate (event) z;
s142, obtaining the accident severity b when each three-dimensional coordinate occurs according to the occurrence eventkWherein minor accident b1Score 1, general Accident b2Record 2 points, major accident b3Record as 3 minutes, super major accident b4Recording as 4 points; k is a positive integer less than or equal to 4;
s142, determining the weight score of a certain test scene according to the two indexes of the occurrence probability and the accident severity corresponding to the three-dimensional coordinates of each test scene, wherein the calculation method of the weight score of the test scene comprises the following steps:
Figure BDA0002317126630000115
wherein the content of the first and second substances,
Figure BDA0002317126630000116
representing the probability of occurrence at a first dimension coordinate (event) x,
Figure BDA0002317126630000117
representing the probability of occurrence at the second dimension coordinate (event) y,
Figure BDA0002317126630000118
representing the probability of occurrence at the third dimension coordinate (event) z; bkIndicating the severity of the incident that occurred. And loading the test scenes into the test scene controller from high to low according to the weight scores.
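Taking the weight score of a scene as the product of the three coordinate occurrence probabilities and the severity score b_k — a reading consistent with the definitions given above — the ordering and loading step can be sketched as follows; the probability dictionaries and candidate list are made-up sample data:

```python
# Hypothetical per-coordinate occurrence probabilities mined from accident data (S141).
P_x = {"accelerating straight": 0.4, "decelerating curve": 0.1}   # first dimension
P_y = {"pedestrian crossing": 0.3, "car cut-in": 0.6}             # second dimension
P_z = {"urban/clear/day": 0.7, "highway/rain/night": 0.2}         # third dimension

SEVERITY = {1: 1, 2: 2, 3: 3, 4: 4}  # b_k: minor=1 ... extraordinarily serious=4

def weight_score(x, y, z, k):
    """Weight score S_{x,y,z} = P_x * P_y * P_z * b_k (reconstructed form)."""
    return P_x[x] * P_y[y] * P_z[z] * SEVERITY[k]

# Load scenes into the test scene controller from the highest weight score downward.
candidates = [("accelerating straight", "car cut-in", "urban/clear/day", 3),
              ("decelerating curve", "pedestrian crossing", "highway/rain/night", 4)]
ordered = sorted(candidates, key=lambda c: weight_score(*c), reverse=True)
```

With these sample numbers the cut-in scene (0.4 × 0.6 × 0.7 × 3) outweighs the rarer but more severe night scene (0.1 × 0.3 × 0.2 × 4), so it is loaded first.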
In a preferred embodiment of the present invention, in step S2, the method for adjusting the angle acquired by the image acquisition device comprises the following steps:
S21, initially, the scene vehicle controller to be tested sends the image acquisition 1st device control first signal, the image acquisition 2nd device control first signal, the image acquisition 3rd device control first signal and the image acquisition 4th device control first signal to the image acquisition 1st device, the image acquisition 2nd device, the image acquisition 3rd device and the image acquisition 4th device respectively, so that α_j = β_j = γ_j = χ_j = 0, j = 1, 2, 3, 4;
s22, respectively acquiring scene images displayed on a scene test display screen by an image acquisition 1 st device, an image acquisition 2 nd device, an image acquisition 3 rd device and an image acquisition 4 th device, and judging the difference between the scene images acquired by the image acquisition 1 st device, the image acquisition 2 nd device, the image acquisition 3 rd device and the image acquisition 4 th device and the scene images displayed on the scene test display screen;
s23, according to the scene image difference of the step S22, the scene vehicle controller to be tested respectively sends the image acquisition 1 st device control second signal, the image acquisition 2 nd device control second signal, the image acquisition 3 rd device control second signal and the image acquisition 4 th device control second signal to the image acquisition 1 st device, the image acquisition 2 nd device, the image acquisition 3 rd device and the image acquisition 4 th device in sequence, so that the scene images acquired by the image acquisition 1 st device, the image acquisition 2 nd device, the image acquisition 3 rd device and the image acquisition 4 th device are consistent with the scene displayed by the scene test display screen.
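The S21–S23 calibration loop can be sketched as follows; `capture`, `display_image`, `image_difference` and `send_angles` are hypothetical stand-ins for the hardware interfaces, not names from the disclosure:

```python
def calibrate(cameras, display_image, capture, image_difference, send_angles,
              tolerance=0.01, max_rounds=10):
    """S21: zero all deflection angles; S22: compare each captured image with the
    displayed scene; S23: send corrective second signals until they agree."""
    angles = {cam: (0.0, 0.0, 0.0, 0.0) for cam in cameras}  # S21: alpha=beta=gamma=chi=0
    for _ in range(max_rounds):
        done = True
        for cam in cameras:
            diff = image_difference(capture(cam), display_image)  # S22: judge the difference
            if diff > tolerance:
                angles[cam] = send_angles(cam, diff)              # S23: second control signal
                done = False
        if done:
            break
    return angles
```

The loop terminates either when every camera's image matches the display within `tolerance` or after a bounded number of correction rounds.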
In a preferred embodiment of the present invention, in step S23, the method by which the scene vehicle controller to be tested makes the scene images acquired by the image acquisition 1st device, the image acquisition 2nd device, the image acquisition 3rd device and the image acquisition 4th device consistent with the scene displayed on the scene test display screen is as follows:
Adjustment angle of the image acquisition 1st device:
u_1 = U_1 ∩ F, wherein U_1 is the scene image information acquired by the image acquisition 1st device, F is the scene image information displayed by the scene test display screen, and ∩ denotes taking the portion of the scene image common to both;
if u_1 = F, then α_1 = β_1 = γ_1 = χ_1 = 0;
if u_1 ⊊ F, then the deflection angle is
θ_1 = σ × (1 − S_{u_1}/S_F) × arctan(l_1/l_1′),
wherein S_{u_1} is the area of the common partial scene image, S_F is the area of the scene image displayed by the scene test display screen, l_1 is the distance from the image acquisition 1st device to the scene test display screen, and l_1′ is the focal length of the image acquisition 1st device; if u_1 is in the left half of the scene image displayed on the scene test display screen, α_1 = γ_1 = χ_1 = 0; if u_1 is in the right half, β_1 = γ_1 = χ_1 = 0; if u_1 is in the upper half, α_1 = β_1 = γ_1 = 0; if u_1 is in the lower half, α_1 = β_1 = χ_1 = 0; σ is the first angle-adjustment proportion factor;
Adjustment angle of the image acquisition 2nd device:
u_2 = U_2 ∩ F, wherein U_2 is the scene image information acquired by the image acquisition 2nd device, F is the scene image information displayed by the scene test display screen, and ∩ denotes taking the portion of the scene image common to both;
if u_2 = F, then α_2 = β_2 = γ_2 = χ_2 = 0;
if u_2 ⊊ F, then the deflection angle is
θ_2 = δ × (1 − S_{u_2}/S_F) × arctan(l_2/l_2′),
wherein S_{u_2} is the area of the common partial scene image, S_F is the area of the scene image displayed by the scene test display screen, l_2 is the distance from the image acquisition 2nd device to the scene test display screen, and l_2′ is the focal length of the image acquisition 2nd device; if u_2 is in the left half of the scene image displayed on the scene test display screen, α_2 = γ_2 = χ_2 = 0; if u_2 is in the right half, β_2 = γ_2 = χ_2 = 0; if u_2 is in the upper half, α_2 = β_2 = γ_2 = 0; if u_2 is in the lower half, α_2 = β_2 = χ_2 = 0; δ is the second angle-adjustment proportion factor;
Adjustment angle of the image acquisition 3rd device:
u_3 = U_3 ∩ F, wherein U_3 is the scene image information acquired by the image acquisition 3rd device, F is the scene image information displayed by the scene test display screen, and ∩ denotes taking the portion of the scene image common to both;
if u_3 = F, then α_3 = β_3 = γ_3 = χ_3 = 0;
if u_3 ⊊ F, then the deflection angle is
θ_3 = ρ × (1 − S_{u_3}/S_F) × arctan(l_3/l_3′),
wherein S_{u_3} is the area of the common partial scene image, S_F is the area of the scene image displayed by the scene test display screen, l_3 is the distance from the image acquisition 3rd device to the scene test display screen, and l_3′ is the focal length of the image acquisition 3rd device; if u_3 is in the left half of the scene image displayed on the scene test display screen, α_3 = γ_3 = χ_3 = 0; if u_3 is in the right half, β_3 = γ_3 = χ_3 = 0; if u_3 is in the upper half, α_3 = β_3 = γ_3 = 0; if u_3 is in the lower half, α_3 = β_3 = χ_3 = 0; ρ is the third angle-adjustment proportion factor;
Adjustment angle of the image acquisition 4th device:
u_4 = U_4 ∩ F, wherein U_4 is the scene image information acquired by the image acquisition 4th device, F is the scene image information displayed by the scene test display screen, and ∩ denotes taking the portion of the scene image common to both;
if u_4 = F, then α_4 = β_4 = γ_4 = χ_4 = 0;
if u_4 ⊊ F, then the deflection angle is
θ_4 = ω × (1 − S_{u_4}/S_F) × arctan(l_4/l_4′),
wherein S_{u_4} is the area of the common partial scene image, S_F is the area of the scene image displayed by the scene test display screen, l_4 is the distance from the image acquisition 4th device to the scene test display screen, and l_4′ is the focal length of the image acquisition 4th device; if u_4 is in the left half of the scene image displayed on the scene test display screen, α_4 = γ_4 = χ_4 = 0; if u_4 is in the right half, β_4 = γ_4 = χ_4 = 0; if u_4 is in the upper half, α_4 = β_4 = γ_4 = 0; if u_4 is in the lower half, α_4 = β_4 = χ_4 = 0; ω is the fourth angle-adjustment proportion factor.
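The per-device adjustment rule can be sketched numerically as follows. The deflection magnitude θ = σ × (1 − S_u/S_F) × arctan(l/l′) is a reconstruction from the quantities named above (overlap area, screen area, camera distance, focal length, proportion factor) and is an assumption, not the patent's verbatim formula; the half-screen region logic follows the text directly:

```python
import math

def adjust_angle(S_u, S_F, l, l_prime, sigma, region):
    """Return (alpha, beta, gamma, chi) deflections in degrees for one camera.

    S_u      -- area of the scene image common to camera and display screen
    S_F      -- area of the full scene image on the display screen
    l        -- distance from the camera to the display screen
    l_prime  -- focal length of the camera
    sigma    -- angle-adjustment proportion factor (sigma/delta/rho/omega per device)
    region   -- where the common image sits: 'full', 'left', 'right', 'upper', 'lower'
    """
    if region == "full" or S_u >= S_F:
        return (0.0, 0.0, 0.0, 0.0)  # captured image already matches: no deflection
    # Reconstructed magnitude: shrinks as the overlap approaches the full screen.
    theta = sigma * (1.0 - S_u / S_F) * math.degrees(math.atan(l / l_prime))
    # Which of (alpha=left, beta=right, gamma=up, chi=down) is nonzero depends on
    # where the common image u sits on the screen; the other three stay zero.
    index = {"left": 1, "right": 0, "upper": 3, "lower": 2}[region]
    angles = [0.0, 0.0, 0.0, 0.0]
    angles[index] = theta
    return tuple(angles)
```

For example, a camera whose captured image overlaps only the left half of the screen deflects rightward (β) by θ while α, γ and χ stay zero, matching the case analysis above.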
In a preferred embodiment of the present invention, in step S3, the test weight of a test scene is calculated as follows:
C(p) = P_p / P′,
wherein P′ is the total number of scene tests loaded, P_p is the number of failures of the scene test with serial number p, and C(p) is the failure probability of the scene test with serial number p;
V(p) = C(p) × S_{x,y,z},
wherein S_{x,y,z} is the weight score of the test scene and V(p) is the test weight of the test scene.
While embodiments of the invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (9)

1. An intelligent networked automobile test scene test system, comprising a scene vehicle to be tested, characterized in that the scene vehicle to be tested is arranged on a scene test bench, a scene test display screen is arranged directly in front of the scene vehicle to be tested, and a display signal input end of the scene test display screen is connected with a display signal output end of a test scene controller; the scene vehicle to be tested is provided with M image acquisition devices, M being a positive integer greater than or equal to 1, namely an image acquisition 1st device, an image acquisition 2nd device, an image acquisition 3rd device, … and an image acquisition Mth device; an image signal output end of the image acquisition i-th device is connected with an image i-th signal input end of a scene vehicle controller to be tested, i being a positive integer less than or equal to M; the scene vehicle controller to be tested controls the corresponding motion of the scene vehicle to be tested according to the test scene displayed on the scene test display screen as acquired by the image acquisition devices; a test data signal output end of the scene vehicle controller to be tested is connected with a test data signal input end of the test scene controller, and the motion state of the scene vehicle to be tested is displayed on the scene test display screen in a simulated manner.
2. The system of claim 1, wherein the number of the image capturing devices installed on the vehicle in the scene to be tested is four, and the four image capturing devices are the image capturing 1 st device, the image capturing 2 nd device, the image capturing 3 rd device and the image capturing 4 th device, the image signal output end of the image acquisition 1 st device is connected with the image 1 st signal input end of the scene vehicle controller to be tested, the image signal output end of the image acquisition 2 nd device is connected with the image 2 nd signal input end of the scene vehicle controller to be tested, the image signal output end of the image acquisition No. 3 device is connected with the image No. 3 signal input end of the scene vehicle controller to be tested, the image signal output end of the image acquisition 4 th device is connected with the image 4 th signal input end of the scene vehicle controller to be tested;
the image acquisition 1st device is arranged at the left side of the head of the scene vehicle to be tested and can be deflected leftwards, rightwards, upwards and downwards by α_1°, β_1°, γ_1° and χ_1° in sequence, wherein α_1, β_1, γ_1 and χ_1 are each greater than or equal to 0 and less than or equal to 90; the image acquisition 2nd device is arranged at the right side of the head of the scene vehicle to be tested and can be deflected leftwards, rightwards, upwards and downwards by α_2°, β_2°, γ_2° and χ_2° in sequence, wherein α_2, β_2, γ_2 and χ_2 are each greater than or equal to 0 and less than or equal to 90; the image acquisition 3rd device is arranged at the left side of the tail of the scene vehicle to be tested and can be deflected leftwards, rightwards, upwards and downwards by α_3°, β_3°, γ_3° and χ_3° in sequence, wherein α_3, β_3, γ_3 and χ_3 are each greater than or equal to 0 and less than or equal to 90; the image acquisition 4th device is arranged at the right side of the tail of the scene vehicle to be tested and can be deflected leftwards, rightwards, upwards and downwards by α_4°, β_4°, γ_4° and χ_4° in sequence, wherein α_4, β_4, γ_4 and χ_4 are each greater than or equal to 0 and less than or equal to 90;
the scene vehicle controller to be tested sends control signals to the image acquisition 1st device, the image acquisition 2nd device, the image acquisition 3rd device and the image acquisition 4th device respectively, and controls the adjustment positions of the image acquisition 1st device, the image acquisition 2nd device, the image acquisition 3rd device and the image acquisition 4th device, so that the image information of the scene test display screen collected by the scene vehicle controller to be tested is the full-screen image information of the scene test display screen.
3. The system for testing the test scene of the intelligent networked automobile according to claim 1, wherein the scene test display screen is an annular display screen;
the image acquisition device is a wide-angle camera or a telephoto camera.
4. A test method for an intelligent networking automobile test scene is characterized by comprising the following steps:
S1, the test scene controller loads test scenes for the scene vehicle to be tested in sequence according to the type of the scene vehicle to be tested;
S2, the scene vehicle controller to be tested adjusts the acquisition angle of the image acquisition device according to the test scene displayed by the scene test display screen as acquired by the image acquisition device;
S3, counting the number of failures of the scene vehicle to be tested in the loaded test scenes, calculating the failure probability to obtain the test weight of each test scene, and loading the test scenes for the next scene vehicle to be tested according to the test weights.
5. The method for testing the intelligent networked automobile test scenario according to claim 4, wherein step S1 includes:
determining three-dimensional coordinates of a corresponding test scene by taking the driving behavior of the scene vehicle to be tested as a first dimension, the behavior of the target object as a second dimension and the environment as a third dimension; the driving behavior of the scene vehicle to be tested is divided into two sub-dimensions: the first sub-dimension comprises one or any combination of acceleration, deceleration and uniform speed, and the second sub-dimension comprises one or any combination of straight-line driving, curve driving, steering, lane changing, reversing and parking; the target object comprises one or any combination of an automobile, a two-wheeled vehicle and a pedestrian, and the target object behavior is established according to the characteristics of the target object;
the environment comprises sub-dimensions established from one or any combination of road type, weather conditions and lighting conditions; if a test scene contains no target object, the scene is established from only two dimensions, namely the driving behavior of the scene vehicle to be tested and the environment. Accordingly, a test scene is established by the following steps:
S11, determining the driving behavior of the scene vehicle to be tested according to the two sub-dimensions of the driving behavior of the scene vehicle to be tested, so as to determine a first-dimension coordinate;
S12, determining the second-dimension coordinate: 1) first determining the type of the target object as one of an automobile, a two-wheeled vehicle or a pedestrian, or any combination thereof; 2) determining the behavior of the target object according to the type and characteristics of the target object;
S13, determining the third-dimension coordinate according to one or any combination of road type, weather condition and illumination condition;
S14, determining the specific gravity of each test scene, arranging the test scenes in descending order of specific gravity, and loading the test scenes into the test scene controller in that order.
6. The method for testing the test scenes of the intelligent networked automobile as claimed in claim 5, wherein in step S14, the method for determining the specific gravity of each test scene comprises the following steps:
S141, obtaining the occurrence probability of each coordinate of the three dimensions from existing traffic accident big data: P_x, P_y and P_z, wherein P_x denotes the probability of occurrence at the first-dimension coordinate x, P_y denotes the probability of occurrence at the second-dimension coordinate y, and P_z denotes the probability of occurrence at the third-dimension coordinate z;
S142, obtaining the accident severity b_k for each three-dimensional coordinate according to the event that occurred, wherein a minor accident b_1 scores 1 point, a general accident b_2 scores 2 points, a major accident b_3 scores 3 points and an extraordinarily serious accident b_4 scores 4 points; k is a positive integer less than or equal to 4;
S143, determining the weight score of a given test scene from the two indexes corresponding to its three-dimensional coordinates, namely occurrence probability and accident severity; the weight score of the test scene is calculated as
S_{x,y,z} = P_x × P_y × P_z × b_k,
wherein P_x, P_y and P_z are the occurrence probabilities defined above and b_k denotes the severity of the accident that occurred.
7. The method for testing the intelligent networked automobile test scene, wherein in step S2, the method for adjusting the angle acquired by the image acquisition device comprises the following steps:
S21, initially, the scene vehicle controller to be tested sends the image acquisition 1st device control first signal, the image acquisition 2nd device control first signal, the image acquisition 3rd device control first signal and the image acquisition 4th device control first signal to the image acquisition 1st device, the image acquisition 2nd device, the image acquisition 3rd device and the image acquisition 4th device respectively, so that α_j = β_j = γ_j = χ_j = 0, j = 1, 2, 3, 4;
s22, respectively acquiring scene images displayed on a scene test display screen by an image acquisition 1 st device, an image acquisition 2 nd device, an image acquisition 3 rd device and an image acquisition 4 th device, and judging the difference between the scene images acquired by the image acquisition 1 st device, the image acquisition 2 nd device, the image acquisition 3 rd device and the image acquisition 4 th device and the scene images displayed on the scene test display screen;
s23, according to the scene image difference of the step S22, the scene vehicle controller to be tested respectively sends the image acquisition 1 st device control second signal, the image acquisition 2 nd device control second signal, the image acquisition 3 rd device control second signal and the image acquisition 4 th device control second signal to the image acquisition 1 st device, the image acquisition 2 nd device, the image acquisition 3 rd device and the image acquisition 4 th device in sequence, so that the scene images acquired by the image acquisition 1 st device, the image acquisition 2 nd device, the image acquisition 3 rd device and the image acquisition 4 th device are consistent with the scene displayed by the scene test display screen.
8. The method for testing the intelligent networked automobile test scene according to claim 7, wherein in step S23, the method by which the scene vehicle controller to be tested makes the scene images acquired by the image acquisition 1st device, the image acquisition 2nd device, the image acquisition 3rd device and the image acquisition 4th device consistent with the scene displayed on the scene test display screen is as follows:
Adjustment angle of the image acquisition 1st device:
u_1 = U_1 ∩ F, wherein U_1 is the scene image information acquired by the image acquisition 1st device, F is the scene image information displayed by the scene test display screen, and ∩ denotes taking the portion of the scene image common to both;
if u_1 = F, then α_1 = β_1 = γ_1 = χ_1 = 0;
if u_1 ⊊ F, then the deflection angle is
θ_1 = σ × (1 − S_{u_1}/S_F) × arctan(l_1/l_1′),
wherein S_{u_1} is the area of the common partial scene image, S_F is the area of the scene image displayed by the scene test display screen, l_1 is the distance from the image acquisition 1st device to the scene test display screen, and l_1′ is the focal length of the image acquisition 1st device; if u_1 is in the left half of the scene image displayed on the scene test display screen, α_1 = γ_1 = χ_1 = 0; if u_1 is in the right half, β_1 = γ_1 = χ_1 = 0; if u_1 is in the upper half, α_1 = β_1 = γ_1 = 0; if u_1 is in the lower half, α_1 = β_1 = χ_1 = 0; σ is the first angle-adjustment proportion factor;
Adjustment angle of the image acquisition 2nd device:
u_2 = U_2 ∩ F, wherein U_2 is the scene image information acquired by the image acquisition 2nd device, F is the scene image information displayed by the scene test display screen, and ∩ denotes taking the portion of the scene image common to both;
if u_2 = F, then α_2 = β_2 = γ_2 = χ_2 = 0;
if u_2 ⊊ F, then the deflection angle is
θ_2 = δ × (1 − S_{u_2}/S_F) × arctan(l_2/l_2′),
wherein S_{u_2} is the area of the common partial scene image, S_F is the area of the scene image displayed by the scene test display screen, l_2 is the distance from the image acquisition 2nd device to the scene test display screen, and l_2′ is the focal length of the image acquisition 2nd device; if u_2 is in the left half of the scene image displayed on the scene test display screen, α_2 = γ_2 = χ_2 = 0; if u_2 is in the right half, β_2 = γ_2 = χ_2 = 0; if u_2 is in the upper half, α_2 = β_2 = γ_2 = 0; if u_2 is in the lower half, α_2 = β_2 = χ_2 = 0; δ is the second angle-adjustment proportion factor;
Adjustment angle of the image acquisition 3rd device:
u_3 = U_3 ∩ F, wherein U_3 is the scene image information acquired by the image acquisition 3rd device, F is the scene image information displayed by the scene test display screen, and ∩ denotes taking the portion of the scene image common to both;
if u_3 = F, then α_3 = β_3 = γ_3 = χ_3 = 0;
if u_3 ⊊ F, then the deflection angle is
θ_3 = ρ × (1 − S_{u_3}/S_F) × arctan(l_3/l_3′),
wherein S_{u_3} is the area of the common partial scene image, S_F is the area of the scene image displayed by the scene test display screen, l_3 is the distance from the image acquisition 3rd device to the scene test display screen, and l_3′ is the focal length of the image acquisition 3rd device; if u_3 is in the left half of the scene image displayed on the scene test display screen, α_3 = γ_3 = χ_3 = 0; if u_3 is in the right half, β_3 = γ_3 = χ_3 = 0; if u_3 is in the upper half, α_3 = β_3 = γ_3 = 0; if u_3 is in the lower half, α_3 = β_3 = χ_3 = 0; ρ is the third angle-adjustment proportion factor;
Adjustment angle of the image acquisition 4th device:
u_4 = U_4 ∩ F, wherein U_4 is the scene image information acquired by the image acquisition 4th device, F is the scene image information displayed by the scene test display screen, and ∩ denotes taking the portion of the scene image common to both;
if u_4 = F, then α_4 = β_4 = γ_4 = χ_4 = 0;
if u_4 ⊊ F, then the deflection angle is
θ_4 = ω × (1 − S_{u_4}/S_F) × arctan(l_4/l_4′),
wherein S_{u_4} is the area of the common partial scene image, S_F is the area of the scene image displayed by the scene test display screen, l_4 is the distance from the image acquisition 4th device to the scene test display screen, and l_4′ is the focal length of the image acquisition 4th device; if u_4 is in the left half of the scene image displayed on the scene test display screen, α_4 = γ_4 = χ_4 = 0; if u_4 is in the right half, β_4 = γ_4 = χ_4 = 0; if u_4 is in the upper half, α_4 = β_4 = γ_4 = 0; if u_4 is in the lower half, α_4 = β_4 = χ_4 = 0; ω is the fourth angle-adjustment proportion factor.
9. The method for testing the test scenario of the intelligent networked automobile according to claim 4, wherein in step S3, the test scenario test weight is calculated by:
C(p) = P_p / P′,
wherein P′ is the total number of scene tests loaded, P_p is the number of failures of the scene test with serial number p, and C(p) is the failure probability of the scene test with serial number p;
V(p) = C(p) × S_{x,y,z},
wherein S_{x,y,z} is the weight score of the test scene and V(p) is the test weight of the test scene.
CN201911282427.4A 2019-12-13 2019-12-13 Test method and system for intelligent networking automobile test scene Active CN110954341B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911282427.4A CN110954341B (en) 2019-12-13 2019-12-13 Test method and system for intelligent networking automobile test scene

Publications (2)

Publication Number Publication Date
CN110954341A true CN110954341A (en) 2020-04-03
CN110954341B CN110954341B (en) 2021-10-26

Family

ID=69981449

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911282427.4A Active CN110954341B (en) 2019-12-13 2019-12-13 Test method and system for intelligent networking automobile test scene

Country Status (1)

Country Link
CN (1) CN110954341B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112781887A (en) * 2020-12-21 2021-05-11 苏州挚途科技有限公司 Method, device and system for testing vehicle performance

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006060553A1 (en) * 2006-12-21 2008-06-26 Bayerische Motoren Werke Ag Test of a motor vehicle imaging system, to show the driver the conditions around the vehicle, uses a reference camera on a test drive to give stored images as reference images for projection and comparison
CN202033574U (en) * 2011-03-18 2011-11-09 浙江吉利汽车研究院有限公司 Camera installation frame
CN202281013U (en) * 2011-07-22 2012-06-20 浙江吉利汽车研究院有限公司 Roof camera head installation rack
CN105954040A (en) * 2016-04-22 2016-09-21 百度在线网络技术(北京)有限公司 Testing method and device for driverless automobiles
JP2018083601A (en) * 2016-11-25 2018-05-31 トヨタ自動車株式会社 Vehicular display apparatus
CN207624060U (en) * 2017-08-08 2018-07-17 中国汽车工程研究院股份有限公司 A kind of automated driving system scene floor data acquisition system
CN108332977A (en) * 2018-01-23 2018-07-27 常熟昆仑智能科技有限公司 A kind of classifying and analyzing method joining automotive test scene to intelligent network
CN109141929A (en) * 2018-10-19 2019-01-04 重庆西部汽车试验场管理有限公司 Intelligent network joins automobile emulation test system and method
CN109975035A (en) * 2019-04-22 2019-07-05 中国汽车工程研究院股份有限公司 A kind of L3 grades of autonomous driving vehicle vehicle grade is in ring test platform system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
ZHANG Qiang et al.: "Research on APS system test and evaluation scenarios based on natural driving data", Proceedings of the 2018 SAE-China Annual Congress *
LI Weibing et al.: "Research on accelerated testing of BSD systems based on CarMaker", Chinese Journal of Automotive Engineering *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112781887A (en) * 2020-12-21 2021-05-11 苏州挚途科技有限公司 Method, device and system for testing vehicle performance
CN112781887B (en) * 2020-12-21 2024-02-20 苏州挚途科技有限公司 Method, device and system for testing vehicle performance

Also Published As

Publication number Publication date
CN110954341B (en) 2021-10-26

Similar Documents

Publication Publication Date Title
CN112055666B (en) Method, system and medium for determining characteristics of a road
Che et al. D²-City: a large-scale dashcam video dataset of diverse traffic scenarios
CN110606093A (en) Vehicle performance evaluation method, device, equipment and storage medium
CN104756172B (en) Instruction device is moved rearwards by for vehicle
CN111324120A (en) Cut-in and cut-out scene extraction method for automatic driving front vehicle
Reway et al. Test methodology for vision-based adas algorithms with an automotive camera-in-the-loop
KR20210080459A (en) Lane detection method, apparatus, electronic device and readable storage medium
CN107543725A (en) A kind of unmanned vehicle manipulation method of testing and device
WO2015090300A1 (en) Method and device for monitoring an external dimension of a vehicle
CN202624199U (en) Control device for controlling speed of automobile by using traffic light
US11524695B2 (en) Interface for harmonizing performance of different autonomous vehicles in a fleet
CN102013170A (en) Vehicle counting-based traffic light control system and control method thereof
JP2015102898A (en) Road sign determination device
CN109326168B (en) Predictive driving assistance simulation system
DE102016218277A1 (en) Method for functional testing of a driver assistance system and control unit and reference device for a driver assistance system
CN107316463A (en) A kind of method and apparatus of vehicle monitoring
CN110954341B (en) Test method and system for intelligent networking automobile test scene
CN109949573A (en) A kind of vehicle violation monitoring method, apparatus and system
CN115777196A (en) Camera testing and verification under electromagnetic interference
WO2020250525A1 (en) On-vehicle information display device
CN104008518B (en) Body detection device
DE102020122137A1 (en) AUTOMATED DRIVING VEHICLE
CN111666825B (en) Vehicle load state identification method and device based on person-in-loop
JP7316620B2 (en) Systems and methods for image normalization
CN114882393B (en) Road reverse running and traffic accident event detection method based on target detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant