WO2023210132A1 - Evaluation data creation device, evaluation data creation method, and evaluation data creation program - Google Patents


Info

Publication number: WO2023210132A1
Authority: WIPO (PCT)
Prior art keywords: data, sensor, model, vehicle, scenario
Application number: PCT/JP2023/006272
Other languages: English (en), Japanese (ja)
Inventors: 健太 中尾, 義直 高桑, 裕量 清水, 稔晃 西森
Original Assignee: 三菱重工業株式会社
Application filed by 三菱重工業株式会社 filed Critical 三菱重工業株式会社
Publication of WO2023210132A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W 50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W 50/06: Improving the dynamic response of the control system, e.g. improving the speed of regulation or avoiding hunting or overshoot
    • B60W 60/00: Drive control systems specially adapted for autonomous road vehicles
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M 17/00: Testing of vehicles
    • G01M 17/007: Wheeled or endless-tracked vehicles

Definitions

  • The present disclosure relates to an evaluation data creation device, an evaluation data creation method, and an evaluation data creation program.
  • Driving support functions such as automatic driving and collision avoidance are being developed to support drivers who drive vehicles.
  • The driving support function determines which support to perform based on data obtained from various sensors installed in the vehicle.
  • The driving support function needs to be compatible with the various environments in which the vehicle runs.
  • Patent Document 1 describes that the driving environment of a plurality of vehicles is reproduced through simulation, the reproduced information is input to each vehicle, and a driving support function is evaluated based on the executed behavior.
  • The method of Patent Document 1 can analyze various situations by performing the analysis through simulation. However, simulated data deviates from the data that would be obtained when the vehicle actually drives, so there is a limit to the improvement in reliability when the simulation is used to evaluate driving support functions.
  • The present disclosure therefore aims to provide an evaluation data creation device, an evaluation data creation method, and an evaluation data creation program that can efficiently create evaluation data with reliability suitable for evaluating driving support performance.
  • The present disclosure provides an evaluation data creation device that creates evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the device comprising: a scenario setting unit that sets a scenario for reproducing the driving of the vehicle; a sensor data creation unit that creates, using a sensor model that models a sensor of the vehicle, sensor data that would be acquired when the vehicle runs the scenario set by the scenario setting unit; and an actual data conversion unit that associates the sensor data created by the sensor data creation unit with actual data, which is data acquired by the sensor of the vehicle, and converts the sensor data based on the associated actual data to create converted actual data serving as the evaluation data.
  • The present disclosure also provides an evaluation data creation method for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the method comprising: a step of setting a scenario for reproducing the driving of the vehicle; a step of creating, using a sensor model that models a sensor of the vehicle, sensor data that would be acquired when the vehicle runs the set scenario; and a step of associating the sensor data with actual data, which is data acquired by the sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data serving as the evaluation data.
  • The present disclosure also provides an evaluation data creation program for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the program causing a computer to execute: a step of setting a scenario for reproducing the driving of the vehicle; a step of creating, using a sensor model that models a sensor of the vehicle, sensor data that would be acquired when the vehicle runs the set scenario; and a step of associating the sensor data with actual data, which is data acquired by the sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data serving as the evaluation data.
  • FIG. 1 is a block diagram showing an example of an evaluation data creation device.
  • FIG. 2 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 3 is a flowchart showing an example of the operation of the machine learning section.
  • FIG. 4 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 5 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 6 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 7 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 8 is a block diagram showing an example of an evaluation data creation device.
  • FIG. 9 is a flowchart showing an example of the operation of the machine learning section.
  • FIG. 10 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 1 is a block diagram showing an example of an evaluation data creation device.
  • The evaluation data creation device 10 according to the present embodiment creates evaluation data used to evaluate the driving support function of an ECU (Electronic Control Unit) 6.
  • The ECU 6 is mounted on the vehicle and executes a driving support function based on information acquired by sensors mounted on the vehicle.
  • The driving support executed by the ECU 6 is not particularly limited. Examples of driving support include traveling speed control, brake control, steering control, automatic driving control, warning control to the driver, and warning control to other vehicles and surrounding passersby. In other words, at least one of the vehicle's automatic driving function and driving support function is targeted.
  • The sensors installed in the vehicle provide various information, including information acquired by distance sensors such as LiDAR, temperature sensors, cameras, and the like; information acquired via communication from surrounding vehicles, roadside devices, and the cloud; information on inputs to the vehicle such as the accelerator, brake, and steering; and information from sensors installed in the drive mechanism.
  • The evaluation data created by the evaluation data creation device 10 is input to the ECU 6.
  • The ECU 6 processes the input evaluation data and outputs the processing result to the evaluation device 8 as driving support instruction information.
  • The evaluation device 8 evaluates the driving support function of the ECU 6.
  • The evaluation device 8 is a device that has an arithmetic processing function such as a CPU and a storage function such as a ROM and RAM, and executes an evaluation function using a program.
  • The evaluation device 8 acquires the evaluation data input by the evaluation data creation device 10 together with the conditions of the evaluation data, has the ECU 6 process the evaluation data, and acquires the driving support instruction information that is output.
  • The evaluation device 8 determines whether the timing and content of the driving support executed by the ECU 6 are appropriate for the situation of the scenario of the evaluation data.
  • The driving support performance of the ECU 6 can thereby be evaluated.
  • The performance of driving support can be improved by evaluating the ECU 6 and adjusting the functions of the ECU 6 using the evaluation data.
  • The evaluation data can also be used as learning data for the ECU 6.
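As a sketch of how the evaluation device 8 might judge the ECU output described above, the following hypothetical Python snippet compares the timing of a driving support instruction against an expectation defined by the scenario. The function name, log format, and tolerance are assumptions for illustration and are not part of the disclosure.

```python
# Hypothetical check by the evaluation device: was a brake instruction
# issued close enough to the time the scenario expects one?
def evaluate_support(expected_brake_time, instruction_log, tolerance=0.2):
    """Return True if a 'brake' action appears within `tolerance` seconds
    of the expected time; the log is a list of (time_s, action) pairs."""
    for time_s, action in instruction_log:
        if action == "brake" and abs(time_s - expected_brake_time) <= tolerance:
            return True
    return False

# ECU output reproduced as (time in seconds, action) pairs.
log = [(0.5, "steer"), (1.1, "brake")]
print(evaluate_support(1.0, log))  # True: brake at 1.1 s is within 0.2 s of 1.0 s
```

A real evaluation would of course check both the timing and the content of the instruction against the scenario, as the passage above describes.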
  • The evaluation data creation device 10 reproduces the running state of the vehicle to be evaluated through simulation, and creates evaluation data that simulates the data input to the ECU 6 when the vehicle to be evaluated is running.
  • The evaluation data creation device 10 includes an input section 12, an output section 14, a communication section 15, a calculation section 16, and a storage section 18.
  • The input unit 12 includes an input device such as a keyboard and mouse, a touch panel, or a microphone that collects speech from an operator, and outputs a signal corresponding to an operation performed by the operator on the input device to the calculation unit 16.
  • The output unit 14 includes a display device such as a display, and displays a screen containing various information such as processing results and images to be processed based on the display signal output from the calculation unit 16. Further, the output unit 14 may include a recording device that outputs data to a recording medium.
  • The communication unit 15 uses a communication interface to transmit data.
  • The communication unit 15 communicates with the ECU 6 and the evaluation device 8, and sends and receives data.
  • The communication unit 15 stores various data and programs acquired through communication with external devices in the storage unit 18.
  • The communication unit 15 may be connected to an external device via a wired communication line or via a wireless communication line.
  • The calculation unit 16 includes an integrated circuit (processor) such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit) and a memory serving as a work area, and executes various processes using these hardware resources. Specifically, the calculation unit 16 reads a program stored in the storage unit 18, loads it into the memory, and causes the processor to execute the instructions included in the loaded program, thereby executing the various processes.
  • The calculation unit 16 includes a scenario setting unit 22, a sensor data creation unit 24, an actual data conversion unit 26, an association processing unit 28, and a machine learning unit 30. Before explaining each part of the calculation unit 16, the storage unit 18 will be explained.
  • The storage unit 18 is composed of a nonvolatile storage device such as a magnetic storage device or a semiconductor storage device, and stores various programs and data.
  • The storage unit 18 stores a data creation program 32, a learning program 34, a scenario model 36, an environmental object model 37, a weather model 38, a light source model 40, a vehicle model 42, a sensor model 44, an association table 46, actual data 48, and a trained model 50.
  • The data stored in the storage unit 18 includes the scenario model 36, the environmental object model 37, the weather model 38, the light source model 40, the vehicle model 42, the sensor model 44, the association table 46, the actual data 48, and the trained model 50.
  • Multiple instances of each of the scenario model 36, the environmental object model 37, the weather model 38, the light source model 40, the vehicle model 42, the sensor model 44, the association table 46, the actual data 48, and the trained model 50 may be stored, and the model, data, and table to be used can be selected.
  • The scenario model 36 is a model that sets conditions for simulating vehicle travel.
  • The scenario model 36 is a combination of models selected from the environmental object model 37, the weather model 38, the light source model 40, and the vehicle model 42.
  • The scenario model 36 may be a model of a single instant with no lapse of time, used to evaluate the processing of the ECU 6 at a certain moment, or a model with a lapse of time, used to evaluate the processing of the ECU 6 while the vehicle travels a predetermined section.
  • The environmental object model 37 is a model that includes information on the shapes of the surroundings of the vehicle.
  • The environmental object model 37 includes information on the shape of the road on which the vehicle travels (straight, curve, intersection, presence or absence of guardrails, presence or absence of traffic lights), information on surrounding vehicles, information on surrounding passersby, and the like.
  • The environmental object model 37 includes a model of the three-dimensional shapes of the surroundings of the vehicle.
  • The weather model 38 is a model that reproduces the weather around the vehicle, such as sunny, cloudy, rain, storm, snow, and sleet.
  • The weather model 38 is a model in which the reproduced situation changes depending on the amount of clouds, rainfall, snowfall, temperature, wind speed, and the like.
  • The light source model 40 is a model of objects that output light around the vehicle, such as the sun, a lighting tower, a building, or an oncoming vehicle.
  • The vehicle model 42 is a model that reproduces the vehicle to be analyzed and surrounding vehicles.
  • The vehicle model 42 includes information such as the shape of the vehicle, driving performance, and installed sensors.
  • The sensor model 44 is a model of a sensor installed in the vehicle to be analyzed.
  • The sensor model 44 is a model in which the information to be output is set according to the surrounding information.
  • The association table 46 includes information on the association between sensor data, which is data created through simulation, and actual data obtained in advance by actually driving the vehicle.
  • The association table 46 has information on the conditions for performing the association.
  • The association table 46 includes a table of conditions for associating actual data, used when actual data is associated with acquired sensor data based on distribution conditions preset from the scenario conditions. When the trained model 50 is used for the association, processing conditions for inputting the sensor data to the trained model 50 and the like are associated.
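A minimal sketch of how such a condition-based association (without the trained model 50) could work is shown below; the condition fields (rainfall, sun elevation), the record layout, and the distance weighting are all illustrative assumptions, not details of the disclosure.

```python
# Hypothetical association-table lookup: given the scenario conditions
# attached to simulated sensor data, pick the stored actual-data record
# whose conditions are closest.
ACTUAL_DATA = [
    {"id": "run_a", "rain_mm": 0.0, "sun_deg": 60.0},
    {"id": "run_b", "rain_mm": 5.0, "sun_deg": 30.0},
    {"id": "run_c", "rain_mm": 20.0, "sun_deg": 10.0},
]

def associate(conditions, records=ACTUAL_DATA):
    """Select the record minimizing a weighted condition distance."""
    def dist(rec):
        return (abs(rec["rain_mm"] - conditions["rain_mm"])
                + 0.1 * abs(rec["sun_deg"] - conditions["sun_deg"]))
    return min(records, key=dist)

print(associate({"rain_mm": 4.0, "sun_deg": 25.0})["id"])  # run_b
```

In the device described here, the equivalent of `dist` and the candidate records would be defined by the preset distribution conditions stored in the table.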
  • The actual data 48 is data obtained in advance by actually driving the vehicle.
  • The actual data 48 is data detected by sensors mounted on the vehicle.
  • The actual data 48 also includes information on the actual driving of the vehicle, information on environmental objects, information on the weather, information on light sources, and information on the vehicle.
  • The trained model 50 is a model created by the machine learning unit 30 through machine learning. When sensor data is input, the trained model 50 outputs actual data corresponding to the sensor data.
  • The programs stored in the storage unit 18 include the data creation program 32 and the learning program 34.
  • The data creation program 32 is a program for creating evaluation data (an evaluation data creation program).
  • The data creation program 32 is a program that implements the functions of the scenario setting unit 22, the sensor data creation unit 24, the actual data conversion unit 26, and the association processing unit 28.
  • The learning program 34 is a program that uses machine learning to create a trained model used for part of the processing of the data creation program 32.
  • The learning program 34 is a program that implements the functions of the machine learning unit 30.
  • The learning program 34 performs deep learning processing to create the trained model 50, using sensor data created by simulation based on a scenario as input and teacher data in which the actual data corresponding to the sensor data is the output.
  • For example, a GAN (Generative Adversarial Network) can be used as the learning model.
  • The learning model and the machine learning method are not particularly limited, as long as they can convert sensor data into actual data.
  • The data creation program 32 and the learning program 34 may be installed in the storage unit 18 by reading them from a recording medium on which they are recorded, or by acquiring them via a network.
  • Each part of the calculation unit 16 performs its functions by executing the programs stored in the storage unit 18.
  • The scenario setting unit 22 acquires the simulation conditions for executing the analysis, selects models matching those conditions from the environmental object model 37, the weather model 38, the light source model 40, and the vehicle model 42, and creates a scenario model, which is the simulation model.
  • The simulation conditions for executing the analysis are set based on the user's input. If a scenario model 36 corresponding to the simulation conditions is stored in the storage unit 18, the scenario setting unit 22 reads the stored scenario model 36.
  • The sensor data creation unit 24 creates, using a sensor model that models the vehicle's sensors, sensor data obtained when the vehicle runs the scenario set by the scenario setting unit 22.
  • The sensor data creation unit 24 executes a simulation based on the scenario model created by the scenario setting unit 22, and creates sensor data, which is the data detected by a sensor when the vehicle travels according to the scenario model.
  • The actual data conversion unit 26 causes the association processing unit 28 to associate the sensor data created by the sensor data creation unit 24 with actual data, and uses the result of the association to convert the sensor data into actual data.
  • The association processing unit 28 associates actual data with the sensor data input from the actual data conversion unit 26.
  • The association processing unit 28 uses the association table 46 to identify actual data that corresponds to the sensor data. When the trained model 50 is used, the association processing unit 28 inputs the sensor data to the machine learning unit 30 and acquires the actual data that is output.
  • The machine learning unit 30 performs deep learning processing to create the trained model 50, using sensor data created by simulation based on a scenario as input and teacher data in which the actual data corresponding to the sensor data is the output.
  • The machine learning unit 30 also executes processing using the trained model 50. Specifically, the machine learning unit 30 inputs the input sensor data to the trained model 50 and outputs the corresponding actual data.
  • FIG. 2 is a flowchart showing an example of processing of the evaluation data creation device.
  • The process shown in FIG. 2 is a process that does not use machine learning in the association process.
  • The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • The scenario setting unit 22 determines the scenario to be executed based on conditions input by the user, conditions set in advance, and the like.
  • The scenario setting unit 22 determines the environmental object model, weather model, light source model, and vehicle model to be used based on the scenario, and creates a scenario model (step S14).
  • The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • The calculation unit 16 uses the sensor data creation unit 24 to create a sensor model based on the vehicle model 42.
  • The sensor data creation unit 24 inputs the scenario model into the sensor model and executes a simulation to create sensor data detected by the sensor model, that is, the data that the sensor would detect under the scenario model.
  • The calculation unit 16 determines the actual data corresponding to the sensor data (step S18).
  • The actual data conversion unit 26 and the association processing unit 28 process the sensor data using the association table 46 to determine the actual data to be associated with the sensor data.
  • The actual data with the closest conditions is selected from the actual data registered in the association table 46.
  • The calculation unit 16 performs a conversion process into actual data (step S20).
  • The actual data conversion unit 26 converts the sensor data into actual data based on the association determined by the association processing unit 28.
  • The calculation unit 16 saves the created actual data (step S22). The actual data becomes the evaluation data.
  • The evaluation data can be made into real data by identifying the real data corresponding to the sensor data created in the simulation and converting the sensor data into that real data.
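The flow of steps S12 to S22 above can be sketched as follows, with toy stand-ins for the scenario model, the sensor model, and the conversion step; all names, values, and the blending rule are illustrative assumptions:

```python
# Toy end-to-end sketch of the FIG. 2 pipeline (steps S12-S22).
def build_scenario(conditions):            # S12/S14: assemble scenario model
    return {"weather": conditions["weather"], "road": "straight"}

def run_sensor_model(scenario):            # S16: simulate sensor output
    base = 50.0 if scenario["road"] == "straight" else 30.0
    return {"distance_m": base, "weather": scenario["weather"]}

def convert_to_actual(sensor, matched):    # S20: convert using the match
    # Blend the simulated value toward the associated real measurement.
    return {"distance_m": 0.5 * (sensor["distance_m"] + matched["distance_m"])}

matched_real = {"distance_m": 48.0}        # S18 result (looked up elsewhere)
scenario = build_scenario({"weather": "rain"})
sensor = run_sensor_model(scenario)
evaluation_data = convert_to_actual(sensor, matched_real)  # S20/S22
print(evaluation_data)  # {'distance_m': 49.0}
```

The actual conversion in the device is driven by the associated real data rather than a fixed blend; the sketch only shows where each flowchart step sits in the data flow.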
  • In the simulation, the shape of an object is formed by polygons, the characteristics of the object (reflectance, transmittance, texture, etc.) are set as materials, and these are combined to form a three-dimensional model.
  • This embodiment converts the sensor data into actual data and supplies it to the ECU 6.
  • The evaluation data can thus be actual data.
  • In the above description, the association processing unit 28 determines one piece of actual data 48 that corresponds to the sensor data and uses the determined actual data as the evaluation data, but the present invention is not limited to this.
  • The evaluation data creation device 10 may create a trained model in the machine learning unit and convert sensor data into actual data using the trained model.
  • FIG. 3 is a flowchart showing an example of the operation of the machine learning section.
  • The processing shown in FIG. 3 is executed by the machine learning unit.
  • The machine learning unit 30 acquires actual data (step S32).
  • The machine learning unit 30 may acquire actual data stored in the storage unit 18 or may acquire actual data via the communication unit 15.
  • The machine learning unit 30 creates sensor data (step S34).
  • The machine learning unit 30 executes the processing of the scenario setting unit 22 and the sensor data creation unit 24 to create sensor data corresponding to the actual data.
  • For example, the machine learning unit 30 acquires the driving conditions under which the actual data was obtained, and creates sensor data using those conditions.
  • The machine learning unit 30 associates the sensor data with the actual data (step S36). That is, the machine learning unit 30 associates the actual data acquired in step S32 with the sensor data created in step S34.
  • The machine learning unit 30 determines whether the creation of learning data is complete (step S38). For example, the machine learning unit 30 uses as a criterion whether there are at least a set number of combinations of sensor data and actual data.
  • If the machine learning unit 30 determines that the creation of learning data has not been completed (No in step S38), the process returns to step S32 to create further combinations of sensor data and actual data.
  • If the machine learning unit 30 determines that the creation of learning data is complete (Yes in step S38), it trains the learning model using the sensor data as input and the corresponding actual data as output (step S40). For example, the machine learning unit 30 uses part of the combinations of sensor data and actual data as learning data and the rest as verification data, performs learning with the learning data, and performs verification with the verification data.
  • In this way, the machine learning unit 30 creates a trained model that, when sensor data is input, outputs the corresponding actual data.
  • The above method for creating a trained model is one example, and the method is not limited to this.
  • Although the above embodiment has been described as supervised learning, it is also possible to prepare a learning model and actual data and perform unsupervised learning to identify the actual data corresponding to the sensor data.
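The loop of steps S32 to S40, which accumulates (sensor data, actual data) pairs until a set count is reached and then splits them into learning data and verification data, might be sketched as follows; the required pair count, holdout ratio, and pair generator are assumptions for illustration:

```python
import random

def create_learning_data(pair_source, required=100, holdout=0.2, seed=0):
    """Collect pairs until `required` is reached (S32-S38), then split
    them into (learning data, verification data) for step S40."""
    pairs = []
    while len(pairs) < required:          # S38: repeat until enough pairs
        pairs.append(next(pair_source))   # S32-S36: acquire and associate
    random.Random(seed).shuffle(pairs)    # deterministic shuffle for the sketch
    n_val = int(len(pairs) * holdout)
    return pairs[n_val:], pairs[:n_val]   # learn with one part, verify with the rest

def fake_pairs():
    """Stand-in generator yielding (sensor data, actual data) pairs."""
    i = 0
    while True:
        yield ({"sensor": i}, {"actual": i})
        i += 1

train, val = create_learning_data(fake_pairs(), required=10)
print(len(train), len(val))  # 8 2
```

The training step itself (fitting a GAN or other learning model on `train` and checking it on `val`) is omitted, since the passage leaves the model architecture open.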
  • FIG. 4 is a flowchart showing an example of processing of the evaluation data creation device. Among the processes shown in FIG. 4, processes that are the same as those shown in FIG. 2 are given the same reference numerals, and detailed explanations are omitted.
  • The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • The scenario setting unit 22 determines the environmental object model, weather model, light source model, and vehicle model to be used based on the scenario, and creates a scenario model (step S14).
  • The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • The calculation unit 16 inputs the sensor data to the trained model and converts it into actual data (step S52).
  • The calculation unit 16 saves the created actual data (step S22). The actual data becomes the evaluation data.
  • In this way, the evaluation data creation device 10 converts sensor data into actual data using a trained model created by machine learning (deep learning), so that data conversion can be performed without setting various conditions.
  • The evaluation data creation device 10 may use a plurality of trained models.
  • In this case, classification conditions are set, and learning data is created in which the sensor data satisfying each classification condition is associated with actual data.
  • The machine learning unit 30 creates a trained model for each classification condition by executing machine learning on each set of learning data.
  • FIG. 5 is a flowchart showing an example of the processing of the evaluation data creation device. Among the processes shown in FIG. 5, the same processes as those shown in FIG. 4 are given the same reference numerals, and detailed description thereof will be omitted. FIG. 5 shows a case where trained models are created for each weather and light source condition.
  • The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • The scenario setting unit 22 determines the environmental object model, weather model, light source model, and vehicle model to be used based on the scenario, and creates a scenario model (step S14).
  • The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • The calculation unit 16 uses the actual data conversion unit 26 to determine the trained model to be used based on the weather model and the light source model (step S54).
  • The actual data conversion unit 26 acquires the weather and light source conditions of the sensor data based on the information of the scenario model, and determines the trained model corresponding to the acquired conditions.
  • The calculation unit 16 inputs the sensor data to the trained model and converts it into actual data (step S56).
  • The calculation unit 16 saves the created actual data (step S22). The actual data becomes the evaluation data.
  • The evaluation data creation device may determine the trained model to be used based on user input.
  • FIG. 6 is a flowchart showing an example of processing of the evaluation data creation device. Among the processes shown in FIG. 6, the same processes as those shown in the preceding flowcharts are given the same reference numerals, and detailed description thereof will be omitted.
  • The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • The scenario setting unit 22 determines the environmental object model, weather model, light source model, and vehicle model to be used based on the scenario, and creates a scenario model (step S14).
  • The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • The calculation unit 16 uses the actual data conversion unit 26 to determine the trained model to be used based on the input (step S62).
  • The actual data conversion unit 26 determines the corresponding trained model based on the weather and light source information input to the input unit 12.
  • The calculation unit 16 inputs the sensor data to the trained model and converts it into actual data (step S56).
  • The calculation unit 16 saves the created actual data (step S22). The actual data becomes the evaluation data.
  • The evaluation data creation device 10 can thereby convert the sensor data into actual data of the weather and light source conditions required by the user.
  • In the above example, the information is input by the user, but the information may instead be input from another database.
  • The evaluation data creation device 10 may set the weather and light source of the scenario determined in step S12 as one standard setting. This makes it possible to reduce the burden of model creation. In addition, since the process selects a trained model based on the user's arbitrary settings, the weather and light source conditions of the actual data used as evaluation data can be converted into the weather and light source conditions required by the user.
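Selecting a trained model per weather and light source condition, as in steps S54 and S62 above, can be sketched as a simple lookup keyed on those conditions. The condition keys, the stand-in models (plain scaling functions), and the fallback to a standard setting are all illustrative assumptions:

```python
# Hypothetical per-condition model registry; each value stands in for a
# trained model that converts a sensor reading into an actual-data value.
MODELS = {
    ("sunny", "daylight"): lambda x: x * 1.00,
    ("rain",  "daylight"): lambda x: x * 0.90,
    ("rain",  "night"):    lambda x: x * 0.80,
}

def select_model(weather, light, default=("sunny", "daylight")):
    """Pick the trained model for the given conditions (S54/S62),
    falling back to the standard setting when no model matches."""
    return MODELS.get((weather, light), MODELS[default])

model = select_model("rain", "night")
print(round(model(100.0), 2))  # 80.0
```

Whether the `(weather, light)` key comes from the scenario model (FIG. 5) or from user input (FIG. 6) only changes where the arguments to `select_model` originate.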
  • The evaluation data creation device may classify the trained models by condition of the environmental object model.
  • FIG. 7 is a flowchart showing an example of processing of the evaluation data creation device. Among the processes shown in FIG. 7, the same processes as those shown in the preceding flowcharts are given the same reference numerals, and detailed description thereof will be omitted.
  • The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • The scenario setting unit 22 determines the environmental object model, weather model, light source model, and vehicle model to be used based on the scenario, and creates a scenario model (step S14).
  • The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • The calculation unit 16 uses the actual data conversion unit 26 to determine the trained model to be used based on the environmental object model (step S64).
  • The actual data conversion unit 26 acquires the environmental object conditions of the sensor data based on the information of the scenario model, and determines the trained model corresponding to the acquired conditions.
  • The calculation unit 16 inputs the sensor data to the trained model and converts it into actual data (step S56).
  • The calculation unit 16 saves the created actual data (step S22). The actual data becomes the evaluation data.
  • the evaluation data creation device 10 can improve the accuracy of the learned models by classifying the environmental object models by condition and performing learning separately for similar conditions, such as people, buildings, and trees. Further, it is possible to suppress divergence of calculations during learning and to reduce the model creation load.
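As a rough illustration of the condition-classified selection in steps S64 and S56, the bookkeeping can be organized as a registry keyed by scenario conditions. The following Python sketch is an assumption for illustration only: the `ConditionKey` layout, the condition strings, and the stand-in callable used in place of a real trained model are not names taken from this disclosure.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ConditionKey:
    """Illustrative classification key: environmental object + weather + light source."""
    environment: str   # e.g. "people", "buildings", "trees"
    weather: str       # e.g. "sunny", "rain", "snow"
    light_source: str  # e.g. "daylight", "evening", "indoor"


class TrainedModelRegistry:
    """Holds one conversion model per condition, mirroring the per-condition classification."""

    def __init__(self):
        self._models = {}

    def register(self, key, model):
        self._models[key] = model

    def select(self, key):
        # Step S64: determine the trained model to use from the scenario conditions.
        if key not in self._models:
            raise KeyError(f"no trained model registered for {key}")
        return self._models[key]


registry = TrainedModelRegistry()
key = ConditionKey("trees", "rain", "evening")
# Stand-in "trained model": darkens each sensor value slightly.
registry.register(key, lambda sensor_data: [round(v * 0.9, 3) for v in sensor_data])

model = registry.select(key)
actual_data = model([1.0, 2.0, 3.0])  # step S56: convert sensor data to actual data
```

Making the key a frozen dataclass keeps it hashable, so adding further classification conditions later only means extending the key type.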
  • in the above example, the evaluation data creation device 10 also inputs an environmental object model as input information during learning.
  • FIG. 8 is a block diagram showing an example of an evaluation data creation device.
  • the evaluation data creation device 10a shown in FIG. 8 includes a weather light source conversion unit 29 and a second trained model 52. The trained model of the evaluation data creation device 10 described above corresponds to the first trained model 50. The evaluation data creation device 10a executes the data conversion process twice.
  • the evaluation data creation device 10a has two trained models: a first trained model 50 and a second trained model 52.
  • the first trained model 50 is a trained model used in the process of converting sensor data into actual data as described above.
  • the first trained model 50 is created by performing learning using a sensor model and actual data using weather and light source conditions as standard conditions.
  • the second trained model 52 is a trained model used in the process of converting real data into real data with different weather and light source conditions. A method for creating the second trained model 52 will be described later.
  • the weather light source conversion unit 29 uses the second trained model 52 to create second actual data by changing the weather and light source conditions of the actual data (first actual data) created by the actual data conversion unit 26.
  • the weather light source conversion section 29 changes the weather and light source conditions based on the conditions input through the input section 12.
  • FIG. 9 is a flowchart showing an example of the operation of the machine learning section. The process shown in FIG. 9 is executed by the machine learning unit 30.
  • the machine learning unit 30 acquires actual data with standard light source and weather conditions (step S72).
  • the machine learning unit 30 may acquire actual data stored in the storage unit 18 or may acquire actual data via the communication unit 15.
  • the machine learning unit 30 acquires actual data in which the light source and weather are different from standard conditions (step S74).
  • the machine learning unit 30 may acquire actual data stored in the storage unit 18 or may acquire actual data via the communication unit 15.
  • the machine learning unit 30 associates real data whose conditions are the same except for the light source and weather (step S76). That is, the machine learning unit 30 associates the actual data acquired in step S72 with the actual data acquired in step S74.
  • the machine learning unit 30 determines whether the creation of learning data is completed (step S78). For example, the machine learning unit 30 uses whether there are more than a set number of combinations of actual data as a criterion.
  • if the machine learning unit 30 determines that the creation of learning data has not been completed (No in step S78), the process returns to step S72 and further combinations of actual data are created.
  • if the machine learning unit 30 determines that the creation of learning data has been completed (Yes in step S78), it executes learning of the second learning model, using the actual data under the standard conditions as input and the corresponding actual data as output (step S80).
  • the machine learning unit 30 creates a second learned model that converts actual data under standard conditions to actual data under different weather and light source conditions.
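The pairing-and-training loop of steps S72 through S80 can be sketched as follows. This is a minimal illustration under stated assumptions: `build_training_pairs` and `train_second_model` are hypothetical stand-ins (a real implementation would fit an image-to-image model rather than a per-element scale); only the flow is taken from the description: pair standard-condition data with varied-condition data until enough pairs exist, then learn an input-to-output conversion.

```python
def build_training_pairs(standard_data, varied_data, required_pairs):
    """Steps S72-S78: associate real data that share all conditions except weather/light."""
    pairs = []
    for std, varied in zip(standard_data, varied_data):
        pairs.append((std, varied))          # step S76: associate the two records
        if len(pairs) >= required_pairs:     # step S78: enough combinations yet?
            break
    return pairs


def train_second_model(pairs):
    """Step S80 stand-in: 'learn' an average output/input scale from the pairs.

    This keeps the sketch runnable end to end; it is not the actual learning
    method of the disclosure.
    """
    ratios = [o / i for inp, out in pairs for i, o in zip(inp, out) if i != 0]
    scale = sum(ratios) / len(ratios)
    return lambda first_real: [v * scale for v in first_real]


standard = [[2.0, 4.0], [1.0, 3.0]]   # e.g. scenes under reference weather/light
varied = [[1.0, 2.0], [0.5, 1.5]]     # e.g. the same scenes in rain at dusk
pairs = build_training_pairs(standard, varied, required_pairs=2)
second_model = train_second_model(pairs)
```

The returned callable plays the role of the second learned model: given first actual data under the standard conditions, it produces data shifted toward the varied weather and light source conditions.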
  • FIG. 10 is a flowchart showing an example of the processing of the evaluation data creation device.
  • the calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • the scenario setting unit 22 determines an environmental object model and a vehicle model to be used based on the scenario, and creates a scenario model (step S90).
  • the weather model and light source model are based on standard conditions.
  • the calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • the calculation unit 16 determines the first learned model (step S92).
  • the actual data conversion unit 26 determines a trained model corresponding to the acquired conditions, based on the information of the scenario model and the input.
  • the first trained model may be a single trained model.
  • the calculation unit 16 inputs the sensor data to the first trained model and converts it into first actual data (step S94).
  • the calculation unit 16 uses the weather light source conversion unit 29 to determine the second trained model to be used based on the input (step S96).
  • the weather light source conversion unit 29 determines a corresponding second learned model based on the weather and light source information input to the input unit 12.
  • the calculation unit 16 inputs the first real data to the second learned model and converts it into second real data (step S98).
  • the calculation unit 16 stores the created second actual data (step S99).
  • the second actual data becomes evaluation data.
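Chained together, the two-stage flow of FIG. 10 (steps S16, S94, S98, S99) amounts to simple function composition. The sketch below uses toy numeric stand-ins for the sensor model and both trained models; all names and transforms are illustrative assumptions, not the actual models of the disclosure.

```python
def create_evaluation_data(scenario_model, sensor_model, first_model, second_model):
    sensor_data = sensor_model(scenario_model)   # step S16: scenario -> sensor data
    first_real = first_model(sensor_data)        # step S94: sensor data -> first actual data
    second_real = second_model(first_real)       # step S98: change weather/light conditions
    return second_real                           # step S99: stored as evaluation data


# Toy stand-ins: each stage is just a simple numeric transform here.
scenario = [1.0, 2.0]
sensor_model = lambda s: [v + 0.1 for v in s]    # simulated sensing
first_model = lambda d: [v * 2 for v in d]       # realism conversion (reference weather)
second_model = lambda d: [v - 0.2 for v in d]    # weather/light source conversion

evaluation_data = create_evaluation_data(scenario, sensor_model, first_model, second_model)
```

The design point this composition reflects is that sensor data only ever needs to be generated under the reference weather and light source; variation is added afterwards by swapping in a different second model.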
  • the evaluation data creation device 10a can fix the weather and light source of the scenarios used by the sensor data creation unit 24 to the reference conditions. Thereby, the load of creating sensor data can be reduced. Moreover, the load of creating the first trained model can also be reduced.
  • by changing the weather and light source conditions in the weather light source converter 29, the evaluation data creation device 10a can convert the first actual data, which was converted from the sensor data, into second actual data with the desired weather and light source conditions.
  • as the second trained model, the evaluation data creation device 10a may use machine learning that selects corresponding real data, but it is preferable to use machine learning that learns, from teacher data, the process of converting the first real data into the second real data.
  • the evaluation data creation device 10a can convert and correct the influence of the weather and light source with higher accuracy by using the second learned model 52, which can convert from the reference state to the desired weather and light source. As a result, various kinds of second actual data can be created by changing the weather and light source, and a larger amount of evaluation data can be created.
  • the actual data is subjected to the process of changing the weather and the light source, but the process of changing is not limited to the weather and the light source.
  • the actual data may be changed, that is, the actual data may be corrected based on various conditions set in the scenario. This allows the actual data to be closer to the scenario. Furthermore, by being able to perform correction processing, it is possible to create more evaluation data with different conditions.
  • An evaluation data creation device that creates evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the device including: a scenario setting unit that sets a scenario that reproduces the driving of the vehicle; a sensor data creation unit that creates, using a sensor model that models a sensor of the vehicle, sensor data to be acquired when the vehicle runs the scenario set by the scenario setting unit; and an actual data conversion unit that associates the created sensor data with actual data, which is data acquired by a sensor of the vehicle, converts the sensor data based on the associated actual data, and creates converted actual data serving as the evaluation data.
  • the evaluation data creation device including an association processing section. By associating the actual data, it is possible to prevent the evaluation data from deviating from actual sensor detection.
  • the evaluation data creation device further comprising an association processing unit that determines actual data to be associated with the sensor data.
  • the association processing unit has a plurality of trained models classified according to scenario conditions, and determines the trained model to be used based on the scenario set by the scenario setting unit. This makes it possible to increase the accuracy of the trained models. Further, it is possible to suppress divergence of calculations during learning and to reduce the model creation load.
  • the evaluation data creation device wherein the association processing unit determines a trained model to be used based on at least one of the weather of the scenario and the surrounding light source.
  • by using the weather and the surrounding light source as classification conditions, it is possible to separate conditions that are difficult to distinguish in sensor data created from a scenario model. For example, in the case of images, it is possible to prevent learning results from being mixed between conditions such as rain, snow, dark cloudy weather, dark evenings, and dark indoor scenes. This makes it possible to further improve the accuracy of the conversion of sensor data into actual data.
  • a weather/light source conversion unit that performs, on the converted actual data created by the actual data conversion unit, a process of converting at least one of the weather and the surrounding light source of the scenario, and creates second converted actual data.
  • the evaluation data creation device according to any one of (1) to (7). Thereby, the load of creating sensor data can be reduced, and more evaluation data can be created.
  • the weather/light source conversion unit converts the converted actual data into the second converted actual data using a trained model that has been trained with, as input, actual data of the weather and the surrounding light source of the scenario created by the scenario creation unit, and, as output, actual data associated with it that differs in at least one of the weather and the surrounding light source. This allows the actual data to be closer to the scenario. Furthermore, since correction processing can be performed, more evaluation data with different conditions can be created.
  • An evaluation data creation method for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the method including: a step of setting a scenario that reproduces the driving of the vehicle; a step of creating, using a sensor model that models a sensor of the vehicle, sensor data to be acquired when the vehicle runs the set scenario; and a step of associating the created sensor data with actual data, which is data acquired by a sensor of the vehicle, converting the sensor data based on the associated actual data, and creating converted actual data serving as the evaluation data.
  • An evaluation data creation program for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the program causing a computer to execute a process including: a step of setting a scenario that reproduces the driving of the vehicle; a step of creating, using a sensor model that models a sensor of the vehicle, sensor data to be acquired when the vehicle runs the set scenario; and a step of associating the created sensor data with actual data, which is data acquired by a sensor of the vehicle, converting the sensor data based on the associated actual data, and creating converted actual data serving as the evaluation data.
  • Evaluation data creation device 10
  • Input section 12
  • Output section 14
  • Communication section 15
  • Arithmetic section 16
  • Storage section 18
  • Scenario setting section 22
  • Sensor data creation section 24
  • Actual data conversion section 26
  • Correspondence processing section 28
  • Machine learning section 30
  • Data creation program 32
  • Learning program 34
  • Scenario model 36
  • Environmental object model 37
  • Weather model 38
  • Vehicle model 40
  • Sensor model 42
  • Correspondence table 46
  • Actual data 48
  • Learned model 50

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Traffic Control Systems (AREA)

Abstract

An evaluation data creation device and the like with which a larger amount of evaluation data for use in evaluating driving support performance can be created with a small workload. This evaluation data creation device, which creates evaluation data for evaluating driving support functions of a vehicle, comprises: a scenario setting unit for setting a scenario that reproduces the travel of a vehicle; a sensor data creation unit for processing the scenario set by the scenario setting unit using a sensor model, which models sensors of the vehicle, and creating sensor data constituting information detected by the sensor model; and an actual data conversion unit for associating actual data, constituting data obtained by the vehicle sensors, with the sensor data created by the sensor data creation unit, and converting the sensor data on the basis of the associated actual data in order to create converted actual data constituting the evaluation data.
PCT/JP2023/006272 2022-04-28 2023-02-21 Dispositif de création de données d'évaluation, procédé de création de données d'évaluation et programme de création de données d'évaluation WO2023210132A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022075352A JP2023164047A (ja) 2022-04-28 2022-04-28 Evaluation data creation device, evaluation data creation method, and evaluation data creation program
JP2022-075352 2022-04-28

Publications (1)

Publication Number Publication Date
WO2023210132A1 true WO2023210132A1 (fr) 2023-11-02

Family

ID=88518561

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/006272 WO2023210132A1 (fr) 2022-04-28 2023-02-21 Dispositif de création de données d'évaluation, procédé de création de données d'évaluation et programme de création de données d'évaluation

Country Status (2)

Country Link
JP (1) JP2023164047A (fr)
WO (1) WO2023210132A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016001172A (ja) * 2014-05-19 2016-01-07 Horiba, Ltd. Vehicle test system, test management device, test management program, and vehicle test method
JP2017105453A (ja) * 2015-12-08 2017-06-15 Robert Bosch Gmbh Method for verifying a driving support function of a motor vehicle
JP2019043157A (ja) * 2017-08-29 2019-03-22 Toyota Motor Corp Automatic driving evaluation device and automatic driving evaluation method


Also Published As

Publication number Publication date
JP2023164047A (ja) 2023-11-10

Similar Documents

Publication Publication Date Title
CN111062413B (zh) 一种道路目标检测方法、装置、电子设备及存储介质
JP2021534484A (ja) 手続き的な世界の生成
KR102404791B1 (ko) 입력 영상에 포함된 객체를 인식하는 디바이스 및 방법
KR102068473B1 (ko) 차량 시뮬레이션 방법 및 장치
JP2019043495A (ja) 自動運転調整装置、自動運転調整システム、及び自動運転調整方法
JP2019093896A (ja) 情報処理装置、分類方法およびコンピュータ・プログラム
US11919530B2 (en) Method and system for validating an autonomous vehicle stack
JPWO2018173933A1 (ja) 情報処理装置、走行データ処理方法、車両およびプログラム
CN114625637A (zh) 一种基于动态虚拟场景的测试方法及评价方法
CN114077785A (zh) 车辆的仿真测试场景的构建方法和装置
Wang et al. Simulation and application of cooperative driving sense systems using prescan software
JP2023168244A (ja) 運転者支援システム(adas)及び/又は自動運転システム(ads)を較正及び検証するための方法、システム、及びコンピュータプログラム製品
CN109683491B (zh) 车载摄像头仿真系统
EP3786854A1 (fr) Procédés et systèmes pour déterminer le comportement de conduite
WO2023210132A1 (fr) Dispositif de création de données d'évaluation, procédé de création de données d'évaluation et programme de création de données d'évaluation
KR20210038792A (ko) 드라이빙 시뮬레이션 평가 시스템 및 그 방법
CN207965887U (zh) 一种新型的区分工况的驾驶风格辨识装置
Adam et al. Robustness and deployability of deep object detectors in autonomous driving
US11565711B2 (en) System and method for generating vehicle speed alerts
WO2023123130A1 (fr) Procédé et appareil pour système de conduite autonome, dispositif électronique et support
WO2023058178A1 (fr) Dispositif de simulation, procédé de simulation et programme de simulation
US20230177809A1 (en) Training method for a generator for generating realistic images
CN116449807B (zh) 一种物联网汽车操控系统仿真测试方法及系统
JPWO2020194589A1 (ja) 車両制御用演算装置、車両制御装置、及び、車両制御用演算方法
Sural et al. CoSim: A Co-Simulation Framework for Testing Autonomous Vehicles in Adverse Operating Conditions

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23795880

Country of ref document: EP

Kind code of ref document: A1