WO2023210132A1 - Evaluation data creation device, evaluation data creation method, and evaluation data creation program


Info

Publication number
WO2023210132A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
sensor
model
vehicle
scenario
Prior art date
Application number
PCT/JP2023/006272
Other languages
French (fr)
Japanese (ja)
Inventor
健太 中尾
義直 高桑
裕量 清水
稔晃 西森
Original Assignee
三菱重工業株式会社
Application filed by 三菱重工業株式会社
Publication of WO2023210132A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/06: Improving the dynamic response of the control system, e.g. improving the speed of regulation or avoiding hunting or overshoot
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00: Drive control systems specially adapted for autonomous road vehicles
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01M: TESTING STATIC OR DYNAMIC BALANCE OF MACHINES OR STRUCTURES; TESTING OF STRUCTURES OR APPARATUS, NOT OTHERWISE PROVIDED FOR
    • G01M17/00: Testing of vehicles
    • G01M17/007: Wheeled or endless-tracked vehicles

Definitions

  • The present disclosure relates to an evaluation data creation device, an evaluation data creation method, and an evaluation data creation program.
  • Driving support functions such as automatic driving and collision avoidance are being developed to support drivers of vehicles.
  • A driving support function determines which support to perform based on data obtained from various sensors installed in the vehicle.
  • A driving support function must therefore cope with the various environments in which the vehicle runs.
  • Patent Document 1 describes reproducing the driving environment of a plurality of vehicles through simulation, inputting the reproduced information to each vehicle, and evaluating a driving support function based on the resulting behavior.
  • The technique of Patent Document 1 can analyze various situations because the analysis is performed by simulation. However, simulated data deviates from the data obtained when a vehicle actually drives, which limits the gain in reliability when the data is used to evaluate driving support functions.
  • The present disclosure therefore aims to provide an evaluation data creation device, an evaluation data creation method, and an evaluation data creation program that can efficiently create evaluation data whose reliability is suitable for evaluating driving support performance.
  • The present disclosure provides an evaluation data creation device that creates evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle. The device includes: a scenario setting unit that sets a scenario for reproducing the driving of the vehicle; a sensor data creation unit that, using a sensor model that models a sensor of the vehicle, creates the sensor data that would be acquired when the vehicle runs the scenario set by the scenario setting unit; and an actual data conversion unit that associates the sensor data created by the sensor data creation unit with actual data, i.e., data acquired by the sensor of the vehicle, and converts the sensor data based on the associated actual data to create converted actual data that serves as the evaluation data.
  • The present disclosure also provides an evaluation data creation method for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle. The method includes: a step of setting a scenario that reproduces the driving of the vehicle; a step of creating, using a sensor model that models a sensor of the vehicle, the sensor data that would be acquired when the vehicle runs the set scenario; and a step of associating the sensor data with actual data, i.e., data acquired by the sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data that serves as the evaluation data.
  • The present disclosure also provides an evaluation data creation program for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle. The program causes a computer to execute: a step of setting a scenario that reproduces the driving of the vehicle; a step of creating, using a sensor model that models a sensor of the vehicle, the sensor data that would be acquired when the vehicle runs the set scenario; and a step of associating the sensor data with actual data, i.e., data acquired by the sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data that serves as the evaluation data.
  • FIG. 1 is a block diagram showing an example of an evaluation data creation device.
  • FIG. 2 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 3 is a flowchart showing an example of the operation of the machine learning section.
  • FIG. 4 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 5 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 6 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 7 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 8 is a block diagram showing an example of an evaluation data creation device.
  • FIG. 9 is a flowchart showing an example of the operation of the machine learning section.
  • FIG. 10 is a flowchart showing an example of processing of the evaluation data creation device.
  • FIG. 1 is a block diagram showing an example of an evaluation data creation device.
  • The evaluation data creation device 10 according to the present embodiment creates evaluation data used to evaluate the driving support function of the ECU (Electronic Control Unit) 6.
  • The ECU 6 is mounted on the vehicle and executes a driving support function based on information acquired by sensors mounted on the vehicle.
  • The driving support executed by the ECU 6 is not particularly limited. Examples include traveling speed control, brake control, steering control, automatic driving control, warnings to the driver, and warnings to other vehicles and surrounding passersby. In other words, at least one of the vehicle's automatic driving function and driving support function is targeted.
  • The sensors installed in the vehicle provide various kinds of information, including information acquired by distance sensors such as LiDAR, by temperature sensors, and by cameras; information acquired via communication from surrounding vehicles, roadside devices, and the cloud; information on driver inputs such as the accelerator, brake, and steering; and information from sensors installed in the drive mechanism.
  • The evaluation data created by the evaluation data creation device 10 is input to the ECU 6.
  • The ECU 6 processes the input evaluation data and outputs the processing result to the evaluation device 8 as driving support instruction information.
  • The evaluation device 8 evaluates the driving support function of the ECU 6.
  • The evaluation device 8 has an arithmetic processing function such as a CPU and a storage function such as ROM and RAM, and executes its evaluation function using a program.
  • The evaluation device 8 acquires the evaluation data input by the evaluation data creation device 10 together with the conditions of that evaluation data, and acquires the driving support instruction information that the ECU 6 outputs after processing the evaluation data.
  • The evaluation device 8 determines whether the timing and content of the driving support executed by the ECU 6 are appropriate for the situation in the scenario of the evaluation data.
  • In this way, the driving support performance of the ECU 6 can be evaluated.
  • The performance of driving support can be improved by evaluating the ECU 6 with the evaluation data and adjusting the functions of the ECU 6 accordingly.
  • The evaluation data can also be used as learning data for the ECU 6.
  • The evaluation data creation device 10 reproduces the running state of the vehicle to be evaluated through simulation, and creates evaluation data that simulates the data input to the ECU 6 while the vehicle to be evaluated is running.
  • The evaluation data creation device 10 includes an input unit 12, an output unit 14, a communication unit 15, a calculation unit 16, and a storage unit 18.
  • The input unit 12 includes an input device such as a keyboard and mouse, a touch panel, or a microphone that picks up the operator's speech, and outputs a signal corresponding to the operator's operation of the input device to the calculation unit 16.
  • The output unit 14 includes a display device such as a display, and shows a screen containing various information such as processing results and images to be processed, based on the display signal output from the calculation unit 16. The output unit 14 may also include a recording device that writes data to a recording medium.
  • The communication unit 15 transmits data using a communication interface.
  • The communication unit 15 communicates with the ECU 6 and the evaluation device 8 to send and receive data.
  • The communication unit 15 passes the various data and programs acquired through communication with external devices to the storage unit 18, which stores them.
  • The communication unit 15 may be connected to an external device via a wired communication line or via a wireless communication line.
  • The calculation unit 16 includes an integrated circuit (processor) such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit) and a memory serving as a work area, and executes various kinds of processing with these hardware resources. Specifically, the calculation unit 16 reads a program stored in the storage unit 18, loads it into the memory, and causes the processor to execute the instructions contained in the loaded program.
  • The calculation unit 16 includes a scenario setting unit 22, a sensor data creation unit 24, an actual data conversion unit 26, an association processing unit 28, and a machine learning unit 30. Before each part of the calculation unit 16 is explained, the storage unit 18 will be described.
  • The storage unit 18 is composed of a nonvolatile storage device such as a magnetic storage device or a semiconductor storage device, and stores various programs and data.
  • The programs stored in the storage unit 18 are a data creation program 32 and a learning program 34; the data stored in the storage unit 18 are a scenario model 36, an environmental object model 37, a weather model 38, a light source model 40, a vehicle model 42, a sensor model 44, an association table 46, actual data 48, and a trained model 50.
  • Multiple instances of each of the scenario model 36, the environmental object model 37, the weather model 38, the light source model 40, the vehicle model 42, the sensor model 44, the association table 46, the actual data 48, and the trained model 50 are available, so that the model, data, or table to be used can be selected.
  • The scenario model 36 is a model that sets the conditions for simulating vehicle travel.
  • The scenario model 36 combines models selected from the environmental object model 37, the weather model 38, the light source model 40, and the vehicle model 42.
  • The scenario model 36 can be a model of a single instant, with no elapsed time, in order to evaluate the processing of the ECU 6 at a certain moment, or a model with elapsed time, in order to evaluate the processing of the ECU 6 while driving through a predetermined section.
  • The environmental object model 37 is a model that includes information on the shapes of the vehicle's surroundings.
  • The environmental object model 37 includes information on the shape of the road on which the vehicle travels (straight sections, curves, intersections, the presence or absence of guardrails, the presence or absence of traffic lights), information on surrounding vehicles, information on surrounding passersby, and the like.
  • The environmental object model 37 includes a model of the three-dimensional shape of the vehicle's surroundings.
  • The weather model 38 is a model that reproduces the weather around the vehicle, such as sunny, cloudy, rain, storm, snow, and sleet.
  • The weather model 38 is a model in which the reproduced situation changes depending on the amount of cloud, rainfall, snowfall, temperature, wind speed, and the like.
  • The light source model 40 is a model of objects that emit light around the vehicle, such as the sun, a lighting tower, a building, or an oncoming vehicle.
  • The vehicle model 42 is a model that reproduces the vehicle to be analyzed and the surrounding vehicles.
  • The vehicle model 42 includes information such as the shape of the vehicle, its driving performance, and its installed sensors.
  • The sensor model 44 is a model of a sensor installed in the vehicle to be analyzed.
  • The sensor model 44 defines what information the sensor outputs in response to its surroundings.
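As a concrete illustration of such a sensor model, the following Python sketch maps surrounding conditions to a simulated sensor reading. The function name, the noise model, and the rainfall dependence are illustrative assumptions, not taken from the disclosure.

```python
import random

def lidar_sensor_model(true_distance_m, rain_mm_per_h, seed=0):
    """Toy LiDAR model: returns a simulated range reading.

    Assumed (illustrative) relation: measurement noise grows with
    rainfall, since rain scatters the laser pulse.
    """
    rng = random.Random(seed)
    noise_sigma = 0.02 + 0.005 * rain_mm_per_h  # metres (assumed relation)
    return true_distance_m + rng.gauss(0.0, noise_sigma)

# The same true distance yields different readings under different weather.
clear_reading = lidar_sensor_model(25.0, rain_mm_per_h=0.0)
rainy_reading = lidar_sensor_model(25.0, rain_mm_per_h=20.0)
```

A real sensor model would of course take the full scenario model (environment, weather, light sources, vehicle) as input; the point here is only the interface: conditions in, simulated sensor data out.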
  • The association table 46 contains information for associating sensor data, i.e., data created through simulation, with actual data obtained in advance by actually driving the vehicle.
  • The association table 46 holds the conditions under which the association is performed.
  • When actual data is associated on the basis of distribution conditions preset from the scenario conditions and the acquired sensor data, the association table 46 contains a table of the conditions for associating the actual data. When the trained model 50 is used for the association, the table holds the processing conditions for inputting sensor data to the trained model 50, and the like.
  • The actual data 48 is data obtained in advance by actually driving the vehicle.
  • The actual data 48 is data detected by a sensor mounted on the vehicle.
  • The actual data 48 also includes information on the actual driving of the vehicle, on environmental objects, on weather, on light sources, and on the vehicle.
  • The trained model 50 is a model created by the machine learning unit 30 through machine learning. When sensor data is input, the trained model 50 outputs actual data corresponding to that sensor data.
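The table-based association described above can be pictured as a nearest-conditions search over the stored actual-data records. The Python sketch below is a minimal illustration; the record fields and the distance measure are assumptions for illustration only.

```python
# Stored actual-data records; field names are illustrative assumptions.
actual_records = [
    {"weather": "sunny", "temperature_c": 25.0, "run": "run_001"},
    {"weather": "rain", "temperature_c": 12.0, "run": "run_002"},
    {"weather": "snow", "temperature_c": -3.0, "run": "run_003"},
]

def associate(conditions):
    """Return the stored record whose conditions are closest to the
    scenario conditions of the simulated sensor data."""
    def mismatch(rec):
        # Large fixed penalty for a weather mismatch, plus temperature gap.
        penalty = 0.0 if rec["weather"] == conditions["weather"] else 100.0
        return penalty + abs(rec["temperature_c"] - conditions["temperature_c"])
    return min(actual_records, key=mismatch)

best = associate({"weather": "rain", "temperature_c": 10.0})
```

In the disclosure, the conditions and their weighting come from the association table 46 rather than being hard-coded as here.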
  • The programs stored in the storage unit 18 include the data creation program 32 and the learning program 34.
  • The data creation program 32 is a program for creating evaluation data (an evaluation data creation program).
  • The data creation program 32 implements the functions of the scenario setting unit 22, the sensor data creation unit 24, the actual data conversion unit 26, and the association processing unit 28.
  • The learning program 34 uses machine learning to create the trained model used in part of the processing of the data creation program 32.
  • The learning program 34 implements the functions of the machine learning unit 30.
  • The learning program 34 creates the trained model 50 by deep learning, using teacher data in which the sensor data created by simulation based on a scenario is the input and the actual data corresponding to that sensor data is the output.
  • The trained model may be, for example, a GAN (Generative Adversarial Network).
  • The learning model and the machine learning method are not particularly limited, as long as they can convert sensor data into actual data.
  • The data creation program 32 and the learning program 34 may be installed in the storage unit 18 by reading them from a recording medium on which they are recorded, or by downloading them from a network.
  • Each part of the calculation unit 16 operates by executing a program stored in the storage unit 18.
  • The scenario setting unit 22 acquires the simulation conditions for executing the analysis, selects models matching those conditions from the environmental object model 37, the weather model 38, the light source model 40, and the vehicle model 42, and creates a scenario model, which is the simulation model.
  • The simulation conditions for executing the analysis are set based on the user's input. If a scenario model 36 corresponding to those conditions is already stored in the storage unit 18, the scenario setting unit 22 reads the stored scenario model 36.
  • The sensor data creation unit 24 creates, using a sensor model that models the vehicle's sensors, the sensor data obtained when the vehicle runs the scenario set by the scenario setting unit 22.
  • The sensor data creation unit 24 executes a simulation based on the scenario model created by the scenario setting unit 22, and creates sensor data, i.e., the data a sensor would detect when the vehicle travels according to the scenario model.
  • The actual data conversion unit 26 causes the association processing unit 28 to associate the sensor data created by the sensor data creation unit 24 with actual data, and uses the result of the association to convert the sensor data into actual data.
  • The association processing unit 28 associates actual data with the sensor data input from the actual data conversion unit 26.
  • The association processing unit 28 uses the association table 46 to identify the actual data that corresponds to the sensor data. When the trained model 50 is used, the association processing unit 28 inputs the sensor data to the machine learning unit 30 and acquires the actual data that is output.
  • The machine learning unit 30 creates the trained model 50 by deep learning, using teacher data in which the sensor data created by simulation based on a scenario is the input and the actual data corresponding to that sensor data is the output.
  • The machine learning unit 30 also executes processing using the trained model 50. Specifically, the machine learning unit 30 inputs the given sensor data to the trained model 50 and outputs the corresponding actual data.
  • FIG. 2 is a flowchart showing an example of processing of the evaluation data creation device.
  • The process shown in FIG. 2 does not use machine learning in the association step.
  • The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • The scenario setting unit 22 determines the scenario to be executed based on conditions input by the user, conditions set in advance, and the like.
  • The scenario setting unit 22 determines the environmental object model, weather model, light source model, and vehicle model to be used based on the scenario, and creates a scenario model (step S14).
  • The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • The calculation unit 16 uses the sensor data creation unit 24 to create a sensor model based on the vehicle model 42.
  • The sensor data creation unit 24 inputs the scenario model into the sensor model and executes a simulation, creating the sensor data detected by the sensor model, that is, the data a sensor would detect under the scenario model.
  • The calculation unit 16 determines the actual data corresponding to the sensor data (step S18).
  • The actual data conversion unit 26 and the association processing unit 28 process the sensor data using the association table 46 to determine the actual data to be associated with the sensor data.
  • The actual data whose conditions are closest is selected from the actual data registered in the association table 46.
  • The calculation unit 16 performs the conversion into actual data (step S20).
  • The actual data conversion unit 26 converts the sensor data into actual data based on the association determined by the association processing unit 28.
  • The calculation unit 16 saves the created actual data (step S22). This actual data serves as the evaluation data.
  • By identifying actual data for the sensor data created in simulation and converting the sensor data into that actual data, the evaluation data can be made real.
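The flow of steps S12 through S22 can be sketched end to end as follows. All values and the stand-in functions are illustrative assumptions; the actual device uses the scenario, sensor, and association components described above rather than these stubs.

```python
def create_evaluation_data(scenario_conditions):
    # S12/S14: a plain dict stands in for the combined scenario model
    scenario = {"true_range_m": scenario_conditions["true_range_m"]}

    # S16: the sensor model produces simulated sensor data
    # (a fixed +0.10 m bias stands in for the simulated sensing error)
    sensor_data = {"range_m": scenario["true_range_m"] + 0.10}

    # S18: the association step picks the matching actual data (stubbed:
    # an actual record with a smaller, realistic error)
    actual = {"range_m": scenario["true_range_m"] + 0.02}

    # S20: the conversion replaces the simulated value with the
    # associated actual one
    evaluation_data = {"range_m": actual["range_m"]}

    # S22: the result is saved as evaluation data
    return evaluation_data

result = create_evaluation_data({"true_range_m": 30.0})
```

The key structural point is the ordering: scenario first, then simulated sensor data, then association, then conversion, with the converted actual data being what finally reaches the ECU.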
  • In the simulation, the shape of an object is formed by polygons, the characteristics of the object (reflectance, transmittance, texture, etc.) are set as materials, and these are combined into a three-dimensional model.
  • The present embodiment converts the sensor data into actual data and supplies it to the ECU 6.
  • The evaluation data can therefore be actual data.
  • In the above, the association processing unit 28 determines one piece of actual data 48 that corresponds to the sensor data and uses that actual data as the evaluation data, but the present invention is not limited to this.
  • The evaluation data creation device 10 may create a trained model in the machine learning unit and convert the sensor data into actual data using that trained model.
  • FIG. 3 is a flowchart showing an example of the operation of the machine learning unit.
  • The processing shown in FIG. 3 is executed by the machine learning unit.
  • The machine learning unit 30 acquires actual data (step S32).
  • The machine learning unit 30 may acquire actual data stored in the storage unit 18, or may acquire it via the communication unit 15.
  • The machine learning unit 30 creates sensor data (step S34).
  • The machine learning unit 30 executes the processing of the scenario setting unit 22 and the sensor data creation unit 24 to create sensor data corresponding to the actual data.
  • For example, the machine learning unit 30 acquires the driving conditions under which the actual data was recorded and creates sensor data under those conditions.
  • The machine learning unit 30 associates the sensor data with the actual data (step S36). That is, the machine learning unit 30 associates the actual data acquired in step S32 with the sensor data created in step S34.
  • The machine learning unit 30 determines whether the creation of learning data is complete (step S38). For example, the machine learning unit 30 uses as the criterion whether a set number or more of combinations of sensor data and actual data exist.
  • If the machine learning unit 30 determines that the creation of learning data is not complete (No in step S38), the process returns to step S32 and further combinations of sensor data and actual data are created.
  • If the machine learning unit 30 determines that the creation of learning data is complete (Yes in step S38), it trains the learning model using the sensor data as input and the corresponding actual data as output (step S40). For example, the machine learning unit 30 uses part of the combinations of sensor data and actual data as training data and the rest as verification data, trains on the training data, and verifies with the verification data.
  • The machine learning unit 30 thereby creates a trained model that, when sensor data is input, outputs the corresponding actual data.
  • This method for creating a trained model is one example, and the method is not limited to it.
  • Although the above embodiment has been described as supervised learning, it is also possible to prepare a learning model and actual data and perform unsupervised learning to identify the actual data corresponding to the sensor data.
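The learning-data preparation of steps S32 through S40 can be sketched as a loop that pairs each actual run with re-simulated sensor data and then splits the pairs into training and verification sets. Everything here (the field names, the 80/20 split, the stand-in simulator) is an illustrative assumption, not a detail taken from the disclosure.

```python
def simulate(conditions):
    """Stand-in for steps S12-S16: recreate sensor data for the known
    driving conditions of an actual run (fixed +0.10 m bias assumed)."""
    return {"range_m": conditions["true_range_m"] + 0.10}

def build_learning_data(actual_runs, minimum_pairs=3):
    """Sketch of steps S32-S40."""
    pairs = []
    for run in actual_runs:
        sensor_data = simulate(run["conditions"])        # S34: re-simulate
        pairs.append((sensor_data, run["measurement"]))  # S36: associate
    if len(pairs) < minimum_pairs:                       # S38: enough data?
        raise ValueError("not enough learning data yet")
    split = int(len(pairs) * 0.8)                        # S40: train/verify split
    return pairs[:split], pairs[split:]

# Ten toy actual runs: recorded conditions plus the measured actual data.
runs = [{"conditions": {"true_range_m": float(d)},
         "measurement": {"range_m": d + 0.02}}
        for d in range(10, 20)]
train, verify = build_learning_data(runs)
```

The resulting (sensor data, actual data) pairs are exactly the teacher data described above: simulated input, real output.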
  • FIG. 4 is a flowchart showing an example of processing of the evaluation data creation device. Among the processes shown in FIG. 4, those that are the same as in FIG. 2 are given the same reference numerals, and detailed explanations are omitted.
  • The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • The scenario setting unit 22 determines the environmental object model, weather model, light source model, and vehicle model to be used based on the scenario, and creates a scenario model (step S14).
  • The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • The calculation unit 16 inputs the sensor data to the trained model and converts it into actual data (step S52).
  • The calculation unit 16 saves the created actual data (step S22). This actual data serves as the evaluation data.
  • Because the evaluation data creation device 10 converts sensor data into actual data using a trained model created by machine learning (deep learning), the conversion can be performed without setting various conditions.
  • The evaluation data creation device 10 may use a plurality of trained models.
  • In that case, classification conditions are set, and learning data is created in which the sensor data satisfying each classification condition is associated with actual data.
  • The machine learning unit 30 creates a trained model for each classification condition by executing machine learning on each set of learning data.
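Keeping one trained model per classification condition can be pictured as a lookup keyed by condition. In this sketch the "models" are reduced to simple callables, and the condition keys and scale factors are invented for illustration only.

```python
# Hypothetical registry: one trained model per (weather, light source)
# classification condition. Real models would be learned converters;
# simple callables stand in for them here.
models_by_condition = {
    ("sunny", "sun"): lambda r: r * 0.99,
    ("rain", "sun"): lambda r: r * 1.04,
    ("rain", "headlights"): lambda r: r * 1.10,
}

def convert(sensor_range_m, weather, light_source):
    model = models_by_condition[(weather, light_source)]  # cf. step S54
    return model(sensor_range_m)                          # cf. step S56

converted = convert(50.0, "rain", "sun")
```

Whether the (weather, light source) key comes from the scenario model (FIG. 5) or from user input (FIG. 6), the selection step is the same dictionary-style lookup.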
  • FIG. 5 is a flowchart showing an example of processing of the evaluation data creation device. Among the processes shown in FIG. 5, those that are the same as in FIG. 4 are given the same reference numerals, and detailed explanations are omitted. FIG. 5 shows a case where trained models are created for each combination of weather and light source conditions.
  • The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • The scenario setting unit 22 determines the environmental object model, weather model, light source model, and vehicle model to be used based on the scenario, and creates a scenario model (step S14).
  • The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • The calculation unit 16 uses the actual data conversion unit 26 to determine the trained model to be used, based on the weather model and the light source model (step S54).
  • The actual data conversion unit 26 acquires the weather and light source conditions of the sensor data from the information in the scenario model, and determines the trained model corresponding to those conditions.
  • The calculation unit 16 inputs the sensor data to that trained model and converts it into actual data (step S56).
  • The calculation unit 16 saves the created actual data (step S22). This actual data serves as the evaluation data.
  • The evaluation data creation device may also determine the trained model to be used based on user input.
  • FIG. 6 is a flowchart showing an example of processing of the evaluation data creation device. Among the processes shown in FIG. 6, those that are the same as in FIG. 5 are given the same reference numerals, and detailed explanations are omitted.
  • The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • The scenario setting unit 22 determines the environmental object model, weather model, light source model, and vehicle model to be used based on the scenario, and creates a scenario model (step S14).
  • The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • The calculation unit 16 uses the actual data conversion unit 26 to determine the trained model to be used, based on the input (step S62).
  • The actual data conversion unit 26 determines the corresponding trained model based on the weather and light source information entered at the input unit 12.
  • The calculation unit 16 inputs the sensor data to that trained model and converts it into actual data (step S56).
  • The calculation unit 16 saves the created actual data (step S22). This actual data serves as the evaluation data.
  • The evaluation data creation device 10 can thereby convert the sensor data into actual data under the weather and light source conditions required by the user.
  • In this example the information is input by the user, but it may instead be input from another database.
  • The evaluation data creation device 10 may set the weather and light source of the scenario determined in step S12 as one standard setting. This reduces the burden of model creation. In addition, because the process selects a trained model according to the user's chosen settings, the evaluation data can still be converted into actual data under the weather and light source conditions the user requires.
  • the evaluation data creation device may classify the learned models for each condition of the environmental object model.
  • FIG. 7 is a flowchart showing an example of processing of the evaluation data creation device. Among the processes shown in FIG. 7, processes that are the same as those shown in FIG.
  • the calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • the scenario setting unit 22 determines the environmental object model, weather model, light source model, and vehicle model to be used based on the scenario, and creates a scenario model (step S14).
  • the calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • the calculation unit 16 uses the actual data conversion unit 26 to determine a trained model to be used based on the environmental object model (step S64).
  • the actual data conversion unit 26 acquires the weather and light source conditions of the sensor data based on the information of the scenario model, and determines a trained model corresponding to the acquired conditions.
  • the calculation unit 16 inputs the sensor data to the trained model and converts it into actual data (step S56).
  • the calculation unit 16 saves the created actual data (step S22). Actual data becomes evaluation data.
  • the evaluation data creation device 10 can improve the accuracy of the learned model by classifying the environmental object models by condition and performing learning under similar conditions for people, buildings, trees, etc. Further, it is possible to suppress calculation divergence during learning, and reduce model creation load.
  • in this case, the evaluation data creation device 10 also uses the environmental object model as input information during learning.
  • FIG. 8 is a block diagram showing an example of an evaluation data creation device.
  • the evaluation data creation device 10a shown in FIG. 8 includes a weather light source conversion unit 29 and a second trained model 52. The trained model of the evaluation data creation device 10 corresponds to the first trained model 50. The evaluation data creation device 10a executes the data conversion process twice.
  • the evaluation data creation device 10a has two trained models: a first trained model 50 and a second trained model 52.
  • the first trained model 50 is a trained model used in the process of converting sensor data into actual data as described above.
  • the first trained model 50 is created by performing learning using a sensor model and actual data using weather and light source conditions as standard conditions.
  • the second trained model 52 is a trained model used in the process of converting actual data into actual data with different weather and light source conditions. A method for creating the second trained model 52 will be described later.
  • the weather light source conversion unit 29 uses the second trained model 52 to create second actual data by changing the weather and light source conditions of the actual data (first actual data) created by the actual data conversion unit 26.
  • the weather light source conversion unit 29 changes the weather and light source conditions based on the conditions input through the input unit 12.
  • FIG. 9 is a flowchart showing an example of the operation of the machine learning unit. The processing shown in FIG. 9 is executed by the machine learning unit 30.
  • the machine learning unit 30 acquires actual data with standard light source and weather conditions (step S72).
  • the machine learning unit 30 may acquire actual data stored in the storage unit 18 or may acquire actual data via the communication unit 15.
  • the machine learning unit 30 acquires actual data in which the light source and weather are different from standard conditions (step S74).
  • the machine learning unit 30 may acquire actual data stored in the storage unit 18 or may acquire actual data via the communication unit 15.
  • the machine learning unit 30 associates actual data whose conditions are the same except for the light source and weather (step S76). That is, the machine learning unit 30 associates the actual data acquired in step S72 with the actual data acquired in step S74.
  • the machine learning unit 30 determines whether the creation of learning data is complete (step S78). For example, the machine learning unit 30 determines completion based on whether the number of combinations of actual data exceeds a set number.
  • if the machine learning unit 30 determines that the creation of learning data has not been completed (No in step S78), the process returns to step S72, and further combinations of actual data are created.
  • if the machine learning unit 30 determines that the creation of learning data has been completed (Yes in step S78), it executes learning of the second trained model, using the actual data under the standard conditions as input and the corresponding actual data as output (step S80).
  • the machine learning unit 30 creates a second learned model that converts actual data under standard conditions to actual data under different weather and light source conditions.
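The learning-data preparation of FIG. 9 (steps S72 to S76), in which actual data under standard conditions is paired with actual data differing only in weather and light source, might be sketched as follows. The record fields and the pairing key are assumptions for illustration; the patent does not specify how records are matched.

```python
def build_training_pairs(standard_records, varied_records):
    """Pair records that share every condition except weather/light source.
    Each pair is (input, output) teacher data for the second trained model."""
    def key(rec):
        # hypothetical matching key: same route, vehicle, and time slot
        return (rec["route"], rec["vehicle"], rec["time_slot"])

    varied_by_key = {}
    for rec in varied_records:
        varied_by_key.setdefault(key(rec), []).append(rec)

    pairs = []
    for std in standard_records:
        for var in varied_by_key.get(key(std), []):
            pairs.append((std["data"], var["data"]))
    return pairs

standard = [{"route": "A", "vehicle": "v1", "time_slot": 1, "data": "clear_img"}]
varied = [{"route": "A", "vehicle": "v1", "time_slot": 1, "data": "rain_img"}]
print(build_training_pairs(standard, varied))  # [('clear_img', 'rain_img')]
```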
  • FIG. 10 is a flowchart showing an example of the processing of the evaluation data creation device.
  • the calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12).
  • the scenario setting unit 22 determines an environmental object model and a vehicle model to be used based on the scenario, and creates a scenario model (step S90).
  • the weather model and light source model are based on standard conditions.
  • the calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
  • the calculation unit 16 determines the first learned model (step S92).
  • the actual data conversion unit 26 determines the first trained model corresponding to the conditions based on the scenario model information and the input.
  • the first trained model may be one trained model.
  • the calculation unit 16 inputs the sensor data to the first trained model and converts it into first actual data (step S94).
  • the calculation unit 16 uses the weather light source conversion unit 29 to determine the second trained model to be used based on the input (step S96).
  • the weather light source conversion unit 29 determines a corresponding second learned model based on the weather and light source information input to the input unit 12.
  • the calculation unit 16 inputs the first real data to the second learned model and converts it into second real data (step S98).
  • the calculation unit 16 stores the created second actual data (step S99).
  • the second actual data becomes evaluation data.
  • the evaluation data creation device 10a can fix the weather and light source to the reference conditions in the scenarios from which the sensor data creation unit 24 creates sensor data. Thereby, the load of creating sensor data can be reduced. Moreover, the load of creating the first trained model can also be reduced.
  • by changing the weather and light source conditions in the weather light source conversion unit 29, the evaluation data creation device 10a can convert the first actual data, converted from the sensor data, into second actual data with the desired weather and light source conditions.
  • as the second trained model, the evaluation data creation device 10a may use machine learning that selects corresponding actual data, but it is preferable to use machine learning that learns, from the teacher data, the process of converting the first actual data into the second actual data.
  • the evaluation data creation device 10a can convert and correct the influence of the weather and light source with higher accuracy by using the second learned model 52 that can convert from the reference state to the desired weather and light source. As a result, various types of second actual data can be created by changing the weather and light source, and a larger amount of evaluation data can be created.
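The two-stage conversion of FIG. 10 — sensor data to first actual data via the first trained model (step S94), then to second actual data via a second trained model selected from the input weather and light source conditions (steps S96 to S98) — can be sketched as follows. All names and the toy lambda models are illustrative assumptions.

```python
def create_evaluation_data(sensor_data, first_model, second_models,
                           weather, light_source):
    """Two-stage conversion: sensor data -> first actual data (standard
    weather/light) -> second actual data under the requested conditions."""
    first_actual = [first_model(x) for x in sensor_data]       # step S94
    second_model = second_models[(weather, light_source)]      # step S96
    return [second_model(x) for x in first_actual]             # step S98

first_model = lambda x: x + 1                        # stand-in for first trained model 50
second_models = {("rain", "night"): lambda x: x * 2}  # stand-in for second trained model 52
print(create_evaluation_data([1, 2], first_model, second_models,
                             "rain", "night"))  # [4, 6]
```

Keeping the first stage fixed to standard conditions means only one first model is needed, while variety in weather and light source comes entirely from the cheaper second stage — matching the load-reduction argument above.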
  • in the above example, the actual data is subjected to a process of changing the weather and the light source, but the conditions to be changed are not limited to the weather and the light source.
  • the actual data may be changed, that is, the actual data may be corrected based on various conditions set in the scenario. This allows the actual data to be closer to the scenario. Furthermore, by being able to perform correction processing, it is possible to create more evaluation data with different conditions.
  • An evaluation data creation device that creates evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the device including: a scenario setting unit that sets a scenario that reproduces the driving of the vehicle; a sensor data creation unit that, using a sensor model that models a sensor of the vehicle, creates sensor data obtained when the vehicle runs the scenario set by the scenario setting unit; and an actual data conversion unit that associates the created sensor data with actual data, which is data acquired by the sensor of the vehicle, and converts the sensor data based on the associated actual data to create converted actual data, which is the evaluation data.
  • the evaluation data creation device including an association processing unit. By associating the actual data, it is possible to prevent the evaluation data from deviating from actual sensor detection.
  • the evaluation data creation device further comprising an association processing unit that determines actual data to be associated with the sensor data.
  • the association processing unit has a plurality of trained models classified according to scenario conditions, and determines the trained model to be used based on the scenario set by the scenario setting unit. Evaluation data creation device. This makes it possible to increase the accuracy of the trained model. Further, it is possible to suppress calculation divergence during learning, and reduce model creation load.
  • the evaluation data creation device wherein the association processing unit determines a trained model to be used based on at least one of the weather of the scenario and the surrounding light source.
  • by using weather and light sources as classification conditions, it is possible to distinguish conditions that are difficult to judge from sensor data created from a scenario model. For example, in the case of images, it is possible to prevent the learning results from being mixed between conditions such as rain, snow, dark cloudy weather, dark evenings, and dark indoor settings. This makes it possible to further improve the accuracy of the conversion of sensor data into actual data.
  • the evaluation data creation device according to any one of (1) to (7), further including a weather/light source conversion unit that performs a process of converting at least one of the weather and the surrounding light sources of the converted actual data created by the actual data conversion unit in accordance with the scenario, and creates second converted actual data. Thereby, the load of creating sensor data can be reduced, and more evaluation data can be created.
  • the evaluation data creation device, wherein the weather/light source conversion unit converts the converted actual data into the second converted actual data using a trained model. The trained model is trained using, as input, actual data of the weather and surrounding light sources of the scenario created by the scenario setting unit, and using, as output, associated actual data in which at least one of the weather and the surrounding light sources differs. This allows the actual data to be closer to the scenario. Furthermore, since correction processing can be performed, more evaluation data with different conditions can be created.
  • An evaluation data creation method for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the method including the steps of: setting a scenario that reproduces the driving of the vehicle; creating, using a sensor model that models a sensor of the vehicle, sensor data obtained when the vehicle runs the set scenario; and associating the created sensor data with actual data, which is data acquired by a sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data, which is the evaluation data.
  • An evaluation data creation program for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the program causing a computer to execute a process including the steps of: setting a scenario that reproduces the driving of the vehicle; creating, using a sensor model that models a sensor of the vehicle, sensor data obtained when the vehicle runs the set scenario; and associating the created sensor data with actual data, which is data acquired by a sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data, which is the evaluation data.
  • Evaluation data creation device 10
  • Input unit 12
  • Output unit 14
  • Communication unit 15
  • Calculation unit 16
  • Storage unit 18
  • Scenario setting unit 22
  • Sensor data creation unit 24
  • Actual data conversion unit 26
  • Association processing unit 28
  • Machine learning unit 30
  • Data creation program 32
  • Learning program 34
  • Scenario model 36
  • Environmental object model 37
  • Weather model 38
  • Light source model 40
  • Vehicle model 42
  • Sensor model 44
  • Correspondence table 46
  • Actual data 48
  • Trained model 50

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)

Abstract

Provided are an evaluation data creation device and the like with which a larger amount of evaluation data for use in evaluating driving assistance performance can be created with low throughput. This evaluation data creation device, which creates evaluation data for evaluating driving assistance functions of a vehicle, includes: a scenario setting unit for setting a scenario that reproduces travel of a vehicle; a sensor data creation unit for using a sensor model, which models sensors of the vehicle, to process the scenario set by the scenario setting unit, and creating sensor data, which is information detected by the sensor model; and a real data conversion unit for associating real data, which is data obtained with the vehicle sensors, with the sensor data created by the sensor data creation unit, and converting the sensor data on the basis of the associated real data to create converted real data, which is the evaluation data.

Description

Evaluation data creation device, evaluation data creation method, and evaluation data creation program
The present disclosure relates to an evaluation data creation device, an evaluation data creation method, and an evaluation data creation program.
Driving support functions such as automatic driving and collision avoidance have been developed to support drivers of vehicles. A driving support function determines the support to be executed based on data acquired by various sensors mounted on the vehicle. The driving support function needs to handle the various environments in which the vehicle travels.
Evaluation methods for evaluating the performance of driving support functions have also been proposed. Patent Document 1 describes reproducing the traveling environments of a plurality of vehicles by simulation, inputting the reproduced information to each vehicle, and evaluating a driving support function based on the resulting behavior.
JP 2017-105453 A
The device of Patent Document 1 can perform analysis in various situations through simulation. However, because it is a simulation, the results deviate from the data that would be obtained when a vehicle actually travels, so there is a limit to the improvement in reliability when the results are used to evaluate driving support functions.
With the device of Patent Document 1, analysis can be performed by simulating various conditions and situations. However, because the evaluation data obtained through simulation deviates from the data obtained when a vehicle actually travels, it is difficult to raise the reliability of the data to a level suitable for evaluating driving support performance.
On the other hand, by driving an actual vehicle, actual traveling data can be obtained instead of simulated data. In this case, to obtain reliability suitable for evaluating driving support performance, tests must be conducted under various conditions and situations. However, there are limits to conducting driving tests with actual vehicles under various conditions and situations. It is therefore difficult in practice to obtain data suitable for evaluating driving support performance from actual vehicle driving tests.
To solve the above problems, an object of the present disclosure is to provide an evaluation data creation device, an evaluation data creation method, and an evaluation data creation program that can efficiently create evaluation data with reliability suitable for evaluating driving support performance.
The present disclosure provides an evaluation data creation device that creates evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the device including: a scenario setting unit that sets a scenario that reproduces the driving of the vehicle; a sensor data creation unit that, using a sensor model that models a sensor of the vehicle, creates sensor data obtained when the vehicle runs the scenario set by the scenario setting unit; and an actual data conversion unit that associates the sensor data created by the sensor data creation unit with actual data, which is data acquired by the sensor of the vehicle, and converts the sensor data based on the associated actual data to create converted actual data, which is the evaluation data.
The present disclosure also provides an evaluation data creation method for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the method including the steps of: setting a scenario that reproduces the driving of the vehicle; creating, using a sensor model that models a sensor of the vehicle, sensor data obtained when the vehicle runs the set scenario; and associating the created sensor data with actual data, which is data acquired by a sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data, which is the evaluation data.
The present disclosure also provides an evaluation data creation program for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the program causing a computer to execute a process including the steps of: setting a scenario that reproduces the driving of the vehicle; creating, using a sensor model that models a sensor of the vehicle, sensor data obtained when the vehicle runs the set scenario; and associating the created sensor data with actual data, which is data acquired by a sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data, which is the evaluation data.
With the above configuration, a larger amount of evaluation data for use in evaluating driving support performance can be created with a smaller amount of processing.
FIG. 1 is a block diagram showing an example of an evaluation data creation device.
FIG. 2 is a flowchart showing an example of processing of the evaluation data creation device.
FIG. 3 is a flowchart showing an example of the operation of the machine learning unit.
FIG. 4 is a flowchart showing an example of processing of the evaluation data creation device.
FIG. 5 is a flowchart showing an example of processing of the evaluation data creation device.
FIG. 6 is a flowchart showing an example of processing of the evaluation data creation device.
FIG. 7 is a flowchart showing an example of processing of the evaluation data creation device.
FIG. 8 is a block diagram showing an example of an evaluation data creation device.
FIG. 9 is a flowchart showing an example of the operation of the machine learning unit.
FIG. 10 is a flowchart showing an example of processing of the evaluation data creation device.
Embodiments according to the present disclosure will be described in detail below with reference to the drawings. Note that the present invention is not limited to these embodiments. The constituent elements in the embodiments described below include those that can be easily replaced by those skilled in the art and those that are substantially the same. Furthermore, the constituent elements described below can be combined as appropriate, and when there are multiple embodiments, the embodiments can also be combined.
<Evaluation data creation device>
FIG. 1 is a block diagram showing an example of an evaluation data creation device. The evaluation data creation device 10 according to the present embodiment creates evaluation data used to evaluate the driving support function of an ECU (Electronic Control Unit) 6.
The ECU 6 is mounted on a vehicle and executes driving support functions based on information acquired by sensors mounted on the vehicle. The driving support executed by the ECU 6 is not particularly limited. Examples of driving support include traveling speed control, brake control, steering control, automatic driving control, warning control for the driver, and warning control for other vehicles and surrounding passersby. In other words, at least one of the vehicle's automatic driving function and driving support function is the evaluation target. The sensors mounted on the vehicle also provide various kinds of information, including information acquired by distance sensors such as LiDAR, temperature sensors, and cameras; information acquired via communication from surrounding vehicles, roadside units, and the cloud; operation information input to the vehicle, such as accelerator, brake, and steering operations; and information from sensors arranged in the drive mechanism.
When the ECU 6 is evaluated, the evaluation data created by the evaluation data creation device 10 is input to the ECU 6. When the evaluation data is input as sensor data, the ECU 6 processes it and outputs the processing result to the evaluation device 8 as driving support instruction information.
The evaluation device 8 evaluates the driving support function of the ECU 6. The evaluation device 8 has an arithmetic processing function such as a CPU and a storage function such as a ROM and a RAM, and executes the evaluation function by a program. The evaluation device 8 acquires the evaluation data input from the evaluation data creation device 10 and the conditions of that evaluation data, and acquires the driving support instruction information output by the ECU 6 after processing the evaluation data. The evaluation device 8 determines whether the timing and content of the driving support executed by the ECU 6 are appropriate for the situation in the scenario of the evaluation data. By evaluating the ECU 6 using the evaluation data, the driving support performance of the ECU 6 can be evaluated. By evaluating the ECU 6 and adjusting its functions using the evaluation data, the driving support performance can be improved. The evaluation data can also be used as learning data for the ECU 6.
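As a hypothetical illustration of the evaluation flow just described (evaluation data in, driving support instruction out, judged against the scenario), the loop could look like the following sketch. The ECU stand-in, the case format, and the instruction strings are assumptions for illustration, not the patent's interfaces.

```python
def evaluate_ecu(ecu, cases):
    """Feed each piece of evaluation data to the ECU and check whether the
    returned driving-support instruction matches the expectation attached to
    the scenario. cases: list of (evaluation_data, expected_instruction)."""
    results = []
    for data, expected in cases:
        instruction = ecu(data)                  # ECU processes the evaluation data
        results.append(instruction == expected)  # evaluation device judges the result
    return results

# toy ECU stand-in: brake when an obstacle is closer than 10 m
brake_ecu = lambda d: "brake" if d["obstacle_distance_m"] < 10 else "cruise"
cases = [({"obstacle_distance_m": 5}, "brake"),
         ({"obstacle_distance_m": 50}, "cruise")]
print(evaluate_ecu(brake_ecu, cases))  # [True, True]
```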
Next, the evaluation data creation device 10 will be described. The evaluation data creation device 10 reproduces the traveling state of the vehicle to be evaluated by simulation, and creates evaluation data that simulates the data input to the ECU 6 while the vehicle to be evaluated travels.
The evaluation data creation device 10 includes an input unit 12, an output unit 14, a communication unit 15, a calculation unit 16, and a storage unit 18. The input unit 12 includes input devices such as a keyboard and mouse, a touch panel, or a microphone that picks up the operator's speech, and outputs signals corresponding to operations performed by the operator to the calculation unit 16. The output unit 14 includes a display device such as a display and, based on display signals output from the calculation unit 16, displays screens containing various information such as processing results and images to be processed. The output unit 14 may also include a recording device that outputs data to a recording medium. The communication unit 15 transmits data using a communication interface. The communication unit 15 communicates with the ECU 6 and the evaluation device 8, and transmits and receives data. The communication unit 15 sends various data and programs acquired through communication with external devices to the storage unit 18 for storage. The communication unit 15 may be connected to external devices via a wired or a wireless communication line.
The calculation unit 16 includes an integrated circuit (processor) such as a CPU (Central Processing Unit) or a GPU (Graphics Processing Unit) and memory serving as a work area, and executes various kinds of processing by running various programs using these hardware resources. Specifically, the calculation unit 16 reads a program stored in the storage unit 18, loads it into the memory, and causes the processor to execute the instructions contained in the loaded program. The calculation unit 16 includes a scenario setting unit 22, a sensor data creation unit 24, an actual data conversion unit 26, an association processing unit 28, and a machine learning unit 30. Before describing each part of the calculation unit 16, the storage unit 18 will be described.
The storage unit 18 is composed of a nonvolatile storage device such as a magnetic storage device or a semiconductor storage device, and stores various programs and data. The storage unit 18 stores a data creation program 32, a learning program 34, a scenario model 36, an environmental object model 37, a weather model 38, a light source model 40, a vehicle model 42, a sensor model 44, a correspondence table 46, actual data 48, and a trained model 50.
The data stored in the storage unit 18 includes the scenario model 36, the environmental object model 37, the weather model 38, the light source model 40, the vehicle model 42, the sensor model 44, the correspondence table 46, the actual data 48, and the trained model 50. A plurality of each of these models, data, and tables may be provided, and the model, data, and table to be used can be selected.
The scenario model 36 is a model in which conditions for simulating the traveling of the vehicle are set. The scenario model 36 is a combination of models selected from the environmental object model 37, the weather model 38, the light source model 40, and the vehicle model 42. The scenario model 36 may be a model of a single instant with no passage of time, for evaluating the processing of the ECU 6 at a certain moment, or a model with a passage of time, for evaluating the processing of the ECU 6 while traveling through a predetermined section.
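The composition of a scenario model from the sub-models named here (environmental object, weather, light source, vehicle) can be illustrated with a minimal sketch. The dataclass, its field names, and the `duration_s` convention for instantaneous versus time-evolving scenarios are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class ScenarioModel:
    environment: str   # environmental object model 37 (road shape, pedestrians, ...)
    weather: str       # weather model 38
    light_source: str  # light source model 40
    vehicle: str       # vehicle model 42
    duration_s: float  # 0.0 models a single instant; > 0 models a driving section

scenario = ScenarioModel("intersection", "snow", "dusk", "sedan", duration_s=30.0)
print(scenario.duration_s > 0)  # True: a time-evolving scenario
```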
The environmental object model 37 is a model containing information on the shapes around the vehicle. The environmental object model 37 includes shape information of the road on which the vehicle travels (straight sections, curves, intersections, the presence or absence of guardrails, the presence or absence of traffic lights), information on surrounding vehicles, information on surrounding passersby, and the like. In other words, the environmental object model 37 includes a model of the three-dimensional shapes around the vehicle.
 天候モデル38は、晴れ、曇り、雨、暴風雨、雪、みぞれ、車両の天候を再現するモデルである。天候モデル38は、雲量、降雨量、降雪量、温度、風速等に応じて、再現される状況が変化するモデルである。 The weather model 38 is a model that reproduces sunny, cloudy, rain, storm, snow, sleet, and vehicle weather. The weather model 38 is a model in which the reproduced situation changes depending on the amount of clouds, rainfall, snowfall, temperature, wind speed, and the like.
 光源モデル40は、太陽、照明塔、建物、対向車など、車両の周囲にある光を出力する物体のモデルである。 The light source model 40 is a model of an object that outputs light around the vehicle, such as the sun, a lighting tower, a building, or an oncoming vehicle.
 車両モデル42は、解析対象の車両、周囲の車両を再現するモデルである。車両モデル42は、車両の形状、走行性能、搭載しているセンサの情報等を含む。 The vehicle model 42 is a model that reproduces the vehicle to be analyzed and surrounding vehicles. The vehicle model 42 includes information such as the shape of the vehicle, driving performance, and installed sensors.
 The sensor model 44 is a model of a sensor mounted on the vehicle to be analyzed. The sensor model 44 is a model in which the information to be output is set according to the surrounding information.
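To make the sensor model's role concrete — a mapping from surrounding information to a sensor output — the following minimal sketch models a range sensor whose detection is limited by weather-dependent visibility. The class and field names are hypothetical illustrations, not part of the embodiment:

```python
from dataclasses import dataclass

@dataclass
class Surroundings:
    # Hypothetical surrounding information fed to the sensor model.
    distance_to_object_m: float   # range to the nearest object ahead
    visibility_m: float           # visibility limit implied by the weather

class RangeSensorModel:
    """Minimal sensor model: reports a detection only within the
    sensor's range and within the current visibility."""

    def __init__(self, max_range_m: float):
        self.max_range_m = max_range_m

    def detect(self, s: Surroundings) -> dict:
        # The effective range is capped by visibility (e.g. fog, rain).
        effective = min(self.max_range_m, s.visibility_m)
        if s.distance_to_object_m <= effective:
            return {"detected": True, "range_m": s.distance_to_object_m}
        return {"detected": False, "range_m": None}
```

For example, a 100 m sensor in 60 m visibility detects an object at 40 m but misses one at 80 m.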
 The correspondence table 46 includes information associating sensor data, which is data created by simulation, with actual data acquired in advance by actually driving a vehicle. The correspondence table 46 holds information on the conditions for performing the association. When actual data is associated based on sorting conditions preset from the scenario conditions and the acquired sensor data, the correspondence table 46 includes a table of the conditions for associating the actual data. When the trained model 50 is used for the association, the correspondence table 46 associates the processing conditions for inputting the sensor data into the trained model 50, and the like.
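As an illustration of how such sorting conditions might associate actual data, the sketch below picks the stored actual-data entry whose recorded conditions are closest to the scenario conditions. The condition vector (cloud cover, rainfall, light level) and the entry names are invented for this example:

```python
# Hypothetical condition vectors: (cloud_cover, rainfall_mm, light_level).
ACTUAL_DATA_TABLE = [
    {"conditions": (0.1, 0.0, 0.9), "actual_data": "clear_day_run_01"},
    {"conditions": (0.8, 5.0, 0.4), "actual_data": "rainy_day_run_07"},
    {"conditions": (0.2, 0.0, 0.1), "actual_data": "clear_night_run_03"},
]

def match_actual_data(sensor_conditions, table=ACTUAL_DATA_TABLE):
    """Return the actual-data entry whose recorded conditions are
    closest (squared Euclidean distance) to the given conditions."""
    def dist(entry):
        return sum((a - b) ** 2
                   for a, b in zip(entry["conditions"], sensor_conditions))
    return min(table, key=dist)["actual_data"]
```

A scenario producing conditions (0.7, 4.0, 0.5) would thus be associated with the rainy-day drive.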
 The actual data 48 is data acquired in advance by actually driving a vehicle. The actual data 48 is data detected by sensors mounted on the vehicle. The actual data 48 also includes information on the conditions of the actual drive, information on environmental objects, information on the weather, information on light sources, and information on the vehicle.
 The trained model 50 is a model created by the machine learning unit 30 through machine learning. When sensor data is input, the trained model 50 outputs the actual data corresponding to that sensor data.
 The programs stored in the storage unit 18 include a data creation program 32 and a learning program 34.
 The data creation program 32 is a program for creating evaluation data (an evaluation data creation program). The data creation program 32 implements the functions of the scenario setting unit 22, the sensor data creation unit 24, the actual data conversion unit 26, and the association processing unit 28.
 The learning program 34 is a program that creates, by machine learning, the trained model used in part of the processing of the data creation program 32. The learning program 34 implements the functions of the machine learning unit 30. The learning program 34 performs deep learning using teacher data in which sensor data created by simulation based on a scenario is the input and the actual data corresponding to that sensor data is the output, thereby creating the trained model 50. A GAN (Generative Adversarial Network) or the like can be used as the deep learning model. Note that the learning model and the machine learning method are not particularly limited, as long as sensor data can be converted into actual data.
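In practice the learning program 34 would train a deep model such as a GAN on the (sensor data, actual data) pairs; purely to illustrate the arrangement of the teacher data, the sketch below fits a linear sensor-to-actual map by gradient descent on fabricated values:

```python
# Teacher data: simulated sensor values (input) paired with the
# corresponding measured actual values (output). Values are fabricated.
sensor_vals = [0.0, 1.0, 2.0, 3.0, 4.0]
actual_vals = [0.5, 1.6, 2.4, 3.5, 4.6]   # roughly actual = sensor + 0.5

def fit_linear(xs, ys, lr=0.01, epochs=2000):
    """Fit actual ~= w * sensor + b by plain gradient descent on MSE."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = sum((w * x + b - y) * x for x, y in zip(xs, ys)) * 2 / n
        grad_b = sum((w * x + b - y) for x, y in zip(xs, ys)) * 2 / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

w, b = fit_linear(sensor_vals, actual_vals)
convert = lambda s: w * s + b   # the "trained model": sensor -> actual
```

A GAN would replace `fit_linear` and `convert` with a generator network, but the pairing of simulated input to measured output is the same.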
 The data creation program 32 and the learning program 34 may be installed in the storage unit 18 by reading the data creation program 32 and the learning program 34 recorded on a recording medium, or by reading the data creation program 32 and the learning program 34 provided over a network.
 The functions of the units of the calculation unit 16 will now be described. Each unit of the calculation unit 16 operates by executing a program stored in the storage unit 18. The scenario setting unit 22 acquires the conditions of the simulation to be analyzed, selects a model based on those conditions from each of the environmental object model 37, the weather model 38, the light source model 40, and the vehicle model 42, and creates a scenario model, which is a simulation model. The conditions of the simulation to be analyzed are set based on input from the user. When a scenario model 36 corresponding to the conditions of the simulation to be analyzed is stored in the storage unit 18, the scenario setting unit 22 reads out the stored scenario model 36.
 The sensor data creation unit 24 uses a sensor model, which models the sensors of the vehicle, to create the sensor data for the case where the vehicle travels through the scenario set by the scenario setting unit 22. The sensor data creation unit 24 executes a simulation based on the scenario model created by the scenario setting unit 22 and creates sensor data, which is the data that the sensors would detect if the vehicle traveled according to the scenario model.
 The actual data conversion unit 26 has the association processing unit 28 perform association for the sensor data created by the sensor data creation unit 24, and converts the sensor data into actual data using the result of the association.
 The association processing unit 28 associates actual data with the sensor data input from the actual data conversion unit 26. The association processing unit 28 uses the correspondence table 46 to identify the actual data corresponding to the sensor data. When the trained model 50 is used, the association processing unit 28 inputs the sensor data to the machine learning unit 30 and acquires the actual data that is output.
 The machine learning unit 30 performs deep learning using teacher data in which sensor data created by simulation based on a scenario is the input and the actual data corresponding to that sensor data is the output, thereby creating the trained model 50. The machine learning unit 30 also executes processing using the trained model 50. Specifically, the machine learning unit 30 inputs the given sensor data to the trained model 50 and outputs the corresponding actual data.
<Method>
 Next, a method for creating evaluation data will be described with reference to FIG. 2. FIG. 2 is a flowchart showing an example of the processing of the evaluation data creation device. The processing shown in FIG. 2 does not use machine learning in the association processing.
 The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12). The scenario setting unit 22 determines the scenario to be executed based on conditions input by the user, conditions set in advance, and the like.
 The calculation unit 16 uses the scenario setting unit 22 to determine, based on the scenario, the environmental object model, weather model, light source model, and vehicle model to be used, and creates a scenario model (step S14).
 The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16). The calculation unit 16 uses the sensor data creation unit 24 to create the sensor model based on the vehicle model 42. The sensor data creation unit 24 inputs the scenario model into the sensor model, executes a simulation, and creates the sensor data detected by the sensor model, that is, the data that the sensor would detect when the vehicle travels according to the scenario model.
 The calculation unit 16 determines the actual data corresponding to the sensor data (step S18). In the calculation unit 16, the actual data conversion unit 26 and the association processing unit 28 process the sensor data using the correspondence table 46 to determine the actual data to be associated with the sensor data. In the present embodiment, based on the sensor data and the scenario model, the actual data with the closest conditions is selected from the actual data registered in the correspondence table 46.
 The calculation unit 16 performs the conversion processing into actual data (step S20). In the calculation unit 16, the actual data conversion unit 26 converts the sensor data into actual data based on the association determined by the association processing unit 28.
 The calculation unit 16 saves the created actual data (step S22). This actual data becomes the evaluation data.
 As described above, in the present embodiment, the evaluation data can be made actual data by identifying the actual data corresponding to the sensor data created by simulation and converting the sensor data into that actual data.
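The flow of steps S12 to S22 can be sketched as a pipeline in which each stage stands in for one unit of the device (all function names and data shapes here are hypothetical placeholders, not the patented implementation):

```python
# Placeholder stages; in the device these correspond to the scenario
# setting unit 22, sensor data creation unit 24, association
# processing unit 28, and actual data conversion unit 26.

def decide_scenario(user_conditions):                    # step S12
    return {"weather": user_conditions.get("weather", "clear")}

def build_scenario_model(scenario):                      # step S14
    return {"scenario": scenario,
            "models": ["env", "weather", "light", "vehicle"]}

def run_sensor_simulation(scenario_model):               # step S16
    return {"weather": scenario_model["scenario"]["weather"],
            "signal": 0.8}

def find_closest_actual(sensor_data, actual_store):      # step S18
    return next(d for d in actual_store
                if d["weather"] == sensor_data["weather"])

def convert_to_actual(sensor_data, match):               # step S20
    return {**match, "signal": sensor_data["signal"]}

saved = []
def save(evaluation_data):                               # step S22
    saved.append(evaluation_data)

def create_evaluation_data(user_conditions, actual_store):
    """Table-based flow of FIG. 2: scenario -> sensor data ->
    associated actual data -> evaluation data."""
    scenario = decide_scenario(user_conditions)
    model = build_scenario_model(scenario)
    sensor_data = run_sensor_simulation(model)
    match = find_closest_actual(sensor_data, actual_store)
    evaluation = convert_to_actual(sensor_data, match)
    save(evaluation)
    return evaluation
```

The matching in `find_closest_actual` is reduced to an exact weather lookup here; the device instead selects the entry with the closest conditions via the correspondence table 46.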
 Here, in the scenario model, for example, the shape of an object is formed by polygons, the characteristics of the object (reflectance, transmittance, texture, and the like) are set as materials, and these are combined into a three-dimensional model. Even when the scenario model is set up as a simple model, for example one created using only the default settings of model creation software, the present embodiment can make the evaluation data supplied to the ECU 6 actual data by converting the sensor data into actual data.
 This makes it possible to create evaluation data close to actual driving without creating an elaborate sensor model for outputting values close to actual data. Further, since evaluation data enabling highly accurate evaluation can be created while using a simple model as the scenario model, the time required to create the evaluation data can be reduced and the processing can be simplified. Furthermore, by associating actual data, deviation of the evaluation data from actual sensor detection can be suppressed.
 In the above embodiment, the association processing unit 28 determines one piece of actual data 48 corresponding to the sensor data, and the determined actual data is used as the evaluation data; however, the present invention is not limited to this. The evaluation data creation device 10 may create a trained model in the machine learning unit and perform the conversion from sensor data to actual data using the trained model.
<Method of Creating Trained Model>
 FIG. 3 is a flowchart showing an example of the operation of the machine learning unit. The processing shown in FIG. 3 is executed by the machine learning unit. The machine learning unit 30 acquires actual data (step S32). The machine learning unit 30 may acquire actual data stored in the storage unit 18 or may acquire actual data via the communication unit 15.
 The machine learning unit 30 creates sensor data (step S34). The machine learning unit 30 executes the processing of the scenario setting unit 22 and the sensor data creation unit 24 to create sensor data corresponding to the actual data. For example, the machine learning unit 30 acquires the driving conditions of the actual data and creates sensor data using those conditions.
 The machine learning unit 30 associates the sensor data with the actual data (step S36). That is, the machine learning unit 30 associates the actual data acquired in step S32 with the sensor data created in step S34.
 The machine learning unit 30 determines whether the creation of the learning data is complete (step S38). For example, the machine learning unit 30 uses as a criterion whether the number of combinations of sensor data and actual data is equal to or greater than a set number.
 When the machine learning unit 30 determines that the creation of the learning data is not complete (No in step S38), the process returns to step S32, and further combinations of sensor data and actual data are created.
 When the machine learning unit 30 determines that the creation of the learning data is complete (Yes in step S38), the machine learning unit 30 trains the learning model with the sensor data as input and the corresponding actual data as output (step S40). For example, the machine learning unit 30 uses part of the combinations of sensor data and actual data, which serve as the learning data, as training data and the rest as verification data, performs training with the training data, and performs verification with the verification data.
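The data-sufficiency check of step S38 and the training/verification split described for step S40 might look like the following (the split ratio and required count are assumptions for illustration):

```python
def enough_pairs(pairs, required=100):
    """Criterion of step S38: have enough (sensor, actual)
    combinations been collected?"""
    return len(pairs) >= required

def split_pairs(pairs, train_ratio=0.8):
    """Step S40 preparation: split the (sensor_data, actual_data)
    pairs into a training set and a verification set."""
    cut = int(len(pairs) * train_ratio)
    return pairs[:cut], pairs[cut:]
```

Training then runs on the first set, and the resulting model is checked against the held-out verification pairs.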
 Through the above processing, the machine learning unit 30 creates a trained model that outputs the corresponding actual data when sensor data is input. Note that this method of creating a trained model is only one example and is not limiting. Although the above embodiment is described in terms of supervised learning, a learning model and actual data may instead be prepared and the actual data corresponding to the sensor data identified by unsupervised learning.
<Other Example of Method for Creating Evaluation Data>
 FIG. 4 is a flowchart showing an example of the processing of the evaluation data creation device. Of the processing shown in FIG. 4, processing identical to that shown in FIG. 2 is given the same reference numerals, and detailed description thereof is omitted.
 The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12). The calculation unit 16 uses the scenario setting unit 22 to determine, based on the scenario, the environmental object model, weather model, light source model, and vehicle model to be used, and creates a scenario model (step S14). The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
 The calculation unit 16 inputs the sensor data into the trained model and converts it into actual data (step S52).
 The calculation unit 16 saves the created actual data (step S22). This actual data becomes the evaluation data.
 As shown in FIG. 4, the evaluation data creation device 10 converts sensor data into actual data using a trained model created by machine learning (deep learning), so that the data can be converted without setting various conditions and the like.
 Here, the evaluation data creation device 10 may use a plurality of trained models. When a plurality of trained models is used, classification conditions are set, and learning data is created in which sensor data satisfying each classification condition is associated with actual data. The machine learning unit 30 creates a trained model for each classification condition by executing machine learning on each set of learning data.
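Holding a plurality of trained models keyed by classification condition can be organized as a simple registry; the stand-in "models" below are plain functions used only to illustrate the selection step, and the condition keys are invented:

```python
# One stand-in "trained model" per (weather, light_source) condition.
# In the device these would be networks trained per classification
# condition (step S54 selects among them).
MODEL_REGISTRY = {
    ("clear", "day"):   lambda sensor: sensor * 1.00,
    ("rain", "day"):    lambda sensor: sensor * 0.80,
    ("clear", "night"): lambda sensor: sensor * 0.60,
}

def select_model(weather, light_source):
    """Choose the trained model matching the scenario's weather and
    light source conditions; fail loudly if none was trained."""
    try:
        return MODEL_REGISTRY[(weather, light_source)]
    except KeyError:
        raise ValueError(f"no trained model for ({weather}, {light_source})")
```

For a rainy daytime scenario, `select_model("rain", "day")` returns the model trained only on rainy-day pairs, which is then applied to the sensor data.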
 FIG. 5 is a flowchart showing an example of the processing of the evaluation data creation device. Of the processing shown in FIG. 5, processing identical to that shown in FIG. 4 is given the same reference numerals, and detailed description thereof is omitted. FIG. 5 shows a case where a trained model is created for each weather and light source condition.
 The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12). The calculation unit 16 uses the scenario setting unit 22 to determine, based on the scenario, the environmental object model, weather model, light source model, and vehicle model to be used, and creates a scenario model (step S14). The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
 The calculation unit 16 uses the actual data conversion unit 26 to determine the trained model to be used based on the weather model and the light source model (step S54). The actual data conversion unit 26 acquires the weather and light source conditions of the sensor data based on the information of the scenario model, and determines the trained model corresponding to the acquired conditions. The calculation unit 16 inputs the sensor data into the trained model and converts it into actual data (step S56).
 The calculation unit 16 saves the created actual data (step S22). This actual data becomes the evaluation data.
 By providing a trained model for each condition in this manner, the accuracy of identifying actual data can be increased. Actual data close to the actual data that would be obtained by actually driving under the conditions of the scenario model can be associated.
 By using the weather and light source as classification conditions, as in the present embodiment, conditions that are difficult to determine from the sensor data created from the scenario model can be distinguished. For example, in the case of images, it is possible to prevent the learning results from being mixed as to whether it is raining or snowing, or whether it is dark because of clouds, because it is evening, or because it is indoors. This makes it possible to further increase the accuracy of the conversion into actual data based on the sensor data.
 The evaluation data creation device may determine the trained model to be used based on user input. FIG. 6 is a flowchart showing an example of the processing of the evaluation data creation device. Of the processing shown in FIG. 6, processing identical to that shown in FIG. 5 is given the same reference numerals, and detailed description thereof is omitted.
 The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12). The calculation unit 16 uses the scenario setting unit 22 to determine, based on the scenario, the environmental object model, weather model, light source model, and vehicle model to be used, and creates a scenario model (step S14). The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
 The calculation unit 16 uses the actual data conversion unit 26 to determine the trained model to be used based on the input (step S62). The actual data conversion unit 26 determines the corresponding trained model based on the weather and light source information input to the input unit 12. The calculation unit 16 inputs the sensor data into the trained model and converts it into actual data (step S56).
 The calculation unit 16 saves the created actual data (step S22). This actual data becomes the evaluation data.
 This enables the evaluation data creation device 10 to convert the sensor data into actual data under the weather and light source conditions that the user requires. Note that although the present embodiment uses user input, the information may instead be input from another database.
 The evaluation data creation device 10 may use a single standard setting for the weather and light source of the scenario determined in step S12. This makes it possible to reduce the load of creating the models. Further, since the trained model is selected based on the user's arbitrary settings, the sensor data can be converted into actual data under the weather and light source conditions that the user requires as the weather and light source of the actual data serving as the evaluation data.
 The evaluation data creation device may classify the trained models by condition of the environmental object model. FIG. 7 is a flowchart showing an example of the processing of the evaluation data creation device. Of the processing shown in FIG. 7, processing identical to that shown in FIG. 5 is given the same reference numerals, and detailed description thereof is omitted.
 The calculation unit 16 uses the scenario setting unit 22 to determine the scenario of the simulation to be executed (step S12). The calculation unit 16 uses the scenario setting unit 22 to determine, based on the scenario, the environmental object model, weather model, light source model, and vehicle model to be used, and creates a scenario model (step S14). The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
 The calculation unit 16 uses the actual data conversion unit 26 to determine the trained model to be used based on the environmental object model (step S64). The actual data conversion unit 26 acquires the environmental object conditions of the sensor data based on the information of the scenario model, and determines the trained model corresponding to the acquired conditions. The calculation unit 16 inputs the sensor data into the trained model and converts it into actual data (step S56).
 The calculation unit 16 saves the created actual data (step S22). This actual data becomes the evaluation data.
 The evaluation data creation device 10 classifies the environmental object models by condition and performs learning under conditions in which people, buildings, trees, and the like are similar, so that the accuracy of the trained models can be increased. In addition, divergence of the calculation during learning can be suppressed, and the model creation load can also be reduced.
 Furthermore, it is preferable that the evaluation data creation device 10 also inputs the environmental object model as input information during learning. As a result, even when the detection state changes depending on the weather or the light source (day or night), the association can be performed while the conditions of the environmental object model are taken into account. This makes it possible to further increase the detection accuracy.
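Feeding the environmental object model in as additional input information amounts to concatenating an encoding of its condition onto the sensor features. A minimal sketch with an invented one-hot encoding (the class names are hypothetical):

```python
# Hypothetical environmental object classes and their one-hot encoding.
ENV_CLASSES = ["urban", "suburban", "forest"]

def encode_env(env_class):
    """One-hot encode the environmental object condition."""
    return [1.0 if c == env_class else 0.0 for c in ENV_CLASSES]

def build_training_input(sensor_features, env_class):
    """Concatenate sensor features with the environmental object
    condition so the model can separate, e.g., day/night detection
    differences per environment during learning."""
    return list(sensor_features) + encode_env(env_class)
```

Each training pair then carries both the simulated sensor values and the environment encoding as input.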
<Other Example of Evaluation Data Creation Device>
 FIG. 8 is a block diagram showing an example of the evaluation data creation device. In addition to the configuration of the evaluation data creation device 10, an evaluation data creation device 10a shown in FIG. 8 includes a weather light source conversion unit 29 and a second trained model 52. The trained model of the evaluation data creation device 10 serves as a first trained model 50. The evaluation data creation device 10a executes the data conversion processing twice.
 評価用データ作成装置10aは、第1学習済みモデル50と第2学習済みモデル52の2つの学習済みモデルを有する。第1学習済みモデル50は、上述したようにセンサデータを実データに変換する処理に用いる学習済みモデルである。第1学習済みモデル50は、天候、光源の条件を標準条件としたセンサモデルと実データとで学習を行い、作成される。 The evaluation data creation device 10a has two trained models: a first trained model 50 and a second trained model 52. The first trained model 50 is a trained model used in the process of converting sensor data into actual data as described above. The first trained model 50 is created by performing learning using a sensor model and actual data using weather and light source conditions as standard conditions.
 第2学習済みモデル52は、実データを天候、光源の条件が異なる実データに変換する処理に用いる学習済みモデルである。第2学習済みモデル52の作成方法は後述する。 The second trained model 52 is a trained model used in the process of converting real data into real data with different weather and light source conditions. A method for creating the second trained model 52 will be described later.
 天候光源変換部29は、第2学習済みモデル52を用いて、実データ変換部26で作成した実データ(第一実データ)の天候と光源の条件を変更した第二実データを作成する。天候光源変換部29は、入力部12で入力された条件に基づいて天候と光源の条件を変更する。 The weather light source conversion unit 29 uses the second trained model 52 to create second actual data by changing the weather and light source conditions of the actual data (first actual data) created by the actual data conversion unit 26. The weather light source conversion section 29 changes the weather and light source conditions based on the conditions input through the input section 12 .
 図9は、機械学習部の動作の一例を示すフローチャートである。図9を用いて、機械学習部で実行する。機械学習部30は、光源、天候が標準条件の実データを取得する(ステップS72)。機械学習部30は、記憶部18に記憶された実データを取得しても、通信部15を介して実データ取得してもよい。 FIG. 9 is a flowchart showing an example of the operation of the machine learning section. This is executed by the machine learning section using FIG. The machine learning unit 30 acquires actual data with standard light source and weather conditions (step S72). The machine learning unit 30 may acquire actual data stored in the storage unit 18 or may acquire actual data via the communication unit 15.
 機械学習部30は、光源、天候が標準条件とは異なる実データを取得する(ステップS74)。機械学習部30は、記憶部18に記憶された実データを取得しても、通信部15を介して実データ取得してもよい。 The machine learning unit 30 acquires actual data in which the light source and weather are different from standard conditions (step S74). The machine learning unit 30 may acquire actual data stored in the storage unit 18 or may acquire actual data via the communication unit 15.
 機械学習部30は、光源、天候以外が同じ条件の実データ同士の対応付けを実行する(ステップS76)。つまり、機械学習部30は、ステップS72で取得した実データとステップS74で作成したセンサデータとを対応付ける。 The machine learning unit 30 associates real data with the same conditions except for the light source and weather (step S76). That is, the machine learning unit 30 associates the actual data acquired in step S72 with the sensor data created in step S74.
 機械学習部30は、学習用のデータの作成が完了したかを判定する(ステップS78)。例えば、機械学習部30は、実データの組み合わせが、設定した数以上あるかを判定基準とする。 The machine learning unit 30 determines whether the creation of learning data is completed (step S78). For example, the machine learning unit 30 uses whether there are more than a set number of combinations of actual data as a criterion.
 機械学習部30は、学習用のデータの作成が完了していない(ステップS78でNo)と判定した場合、ステップS72に戻り、センサデータと実データとの組み合わせをさらに作成する。 If the machine learning unit 30 determines that the creation of learning data has not been completed (No in step S78), the process returns to step S72 and further creates a combination of sensor data and actual data.
 機械学習部30は、学習用のデータの作成が完了した(ステップS78でYes)と判定した場合、標準条件の実データを入力とし、対応する実データを出力として、第2学習モデルの学習を実行する(ステップS80)。 When the machine learning unit 30 determines that the creation of the learning data is complete (Yes in step S78), it executes training of the second learning model with the standard-condition actual data as input and the corresponding actual data as output (step S80).
 機械学習部30は、以上の処理で、標準条件の実データを異なる天候、光源の条件の実データに変換する第2学習済みモデルを作成する。 Through the above process, the machine learning unit 30 creates a second learned model that converts actual data under standard conditions to actual data under different weather and light source conditions.
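The pairing loop of steps S72 to S80 can be outlined in code. This is a minimal sketch under stated assumptions: every name here (`RealRecord`, `build_training_pairs`, `scene_id`, the condition strings) is hypothetical and not taken from the publication, and step S80 itself would pass the resulting pairs to an image-to-image learner, which is outside the scope of this sketch.

```python
# Illustrative sketch of steps S72-S80: pair real data captured under
# standard weather/light conditions with real data captured under other
# conditions, then (conceptually) train the second model on those pairs.
# All names are hypothetical; the publication does not specify an API.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass(frozen=True)
class RealRecord:
    scene_id: str   # identifies everything except weather/light (assumed key)
    weather: str    # e.g. "clear", "rain"
    light: str      # e.g. "day", "night"
    data: tuple     # stand-in for an image or point cloud

def build_training_pairs(records: List[RealRecord],
                         std_weather: str = "clear",
                         std_light: str = "day",
                         min_pairs: int = 2) -> List[Tuple[RealRecord, RealRecord]]:
    """Steps S72-S78: associate standard-condition data with data whose
    only differences are the weather and light, matched by scene_id."""
    standards = {r.scene_id: r for r in records
                 if r.weather == std_weather and r.light == std_light}
    pairs = [(standards[r.scene_id], r) for r in records
             if r.scene_id in standards
             and (r.weather, r.light) != (std_weather, std_light)]
    if len(pairs) < min_pairs:  # step S78 completion criterion
        raise ValueError("not enough pairs yet; acquire more data (back to S72)")
    return pairs                # step S80 would train on these pairs
```

In training, each pair would be used with the standard-condition record as the model input and the other-condition record as the target output, mirroring step S80.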
 図10は、評価用データ作成装置の処理の一例を示すフローチャートである。演算部16は、シナリオ設定部22で、実行するシミュレーションのシナリオを決定する(ステップS12)。演算部16は、シナリオ設定部22で、シナリオに基づいて、使用する環境物体モデル、車両モデルを決定し、シナリオモデルを作成する(ステップS90)。天候モデル、光源モデルは、標準条件のモデルとする。演算部16は、決定したシナリオモデルをセンサモデルに入力し、センサデータを作成する(ステップS16)。 FIG. 10 is a flowchart showing an example of the processing of the evaluation data creation device. The calculation unit 16 determines, in the scenario setting unit 22, the scenario of the simulation to be executed (step S12). The calculation unit 16 determines, in the scenario setting unit 22, the environmental object model and vehicle model to be used based on the scenario, and creates a scenario model (step S90). The weather model and light source model are set to the standard conditions. The calculation unit 16 inputs the determined scenario model into the sensor model and creates sensor data (step S16).
 演算部16は、第1学習済みモデルを決定する(ステップS92)。実データ変換部26は、シナリオモデルの情報や入力に基づいて、取得した条件に対応する学習済みモデルを決定する。なお、第1学習済みモデルは、1つの学習済みモデルとしてもよい。演算部16は、センサデータを第1学習済みモデルに入力し、第一実データに変換する(ステップS94)。 The calculation unit 16 determines the first trained model (step S92). The actual data conversion unit 26 determines the trained model corresponding to the acquired conditions based on the scenario model information and the input. Note that the first trained model may be a single trained model. The calculation unit 16 inputs the sensor data into the first trained model and converts it into first actual data (step S94).
 演算部16は、天候光源変換部29を用いて、入力に基づいて使用する第2学習済みモデルを決定する(ステップS96)。天候光源変換部29は、入力部12に入力された天候、光源の情報に基づいて、対応する第2学習済みモデルを決定する。演算部16は、第一実データを第2学習済みモデルに入力し、第二実データに変換する(ステップS98)。 The calculation unit 16 uses the weather light source conversion unit 29 to determine the second trained model to be used based on the input (step S96). The weather light source conversion unit 29 determines the corresponding second trained model based on the weather and light-source information input to the input unit 12. The calculation unit 16 inputs the first actual data into the second trained model and converts it into second actual data (step S98).
 演算部16は、作成した第二実データを保存する(ステップS99)。第二実データが評価用データとなる。 The calculation unit 16 stores the created second actual data (step S99). The second actual data becomes evaluation data.
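The two-stage conversion of FIG. 10 can be summarized in a short sketch. All names are hypothetical placeholders; the publication does not define an API, and the model objects here are stand-ins for whatever trained networks the device actually holds.

```python
# Illustrative sketch of the FIG. 10 flow (steps S12-S99): a scenario is
# rendered into sensor data, converted to first actual data by a first
# trained model selected from the scenario conditions, then to second
# actual data by a second trained model selected from the requested
# weather/light conditions. All names are hypothetical placeholders.

def create_evaluation_data(scenario, first_models, second_models,
                           render_sensor_data, weather, light):
    sensor_data = render_sensor_data(scenario)       # steps S12-S16
    model1 = first_models[scenario["environment"]]   # step S92: pick first model
    first_actual = model1(sensor_data)               # step S94
    model2 = second_models[(weather, light)]         # step S96: pick second model
    second_actual = model2(first_actual)             # step S98
    return second_actual                             # step S99: saved as evaluation data
```

Here the models are plain callables; in practice each would be a trained image-to-image converter, and the returned second actual data would be written to storage as the evaluation data.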
 評価用データ作成装置10aは、変換処理を2回行うことで、センサデータ作成部24で作成するセンサデータのシナリオのうち、天候、光源のデータを基準条件とすることができる。これにより、センサデータの作成の負荷を小さくすることができる。また、第1学習済みモデルの作成の負荷も小さくすることができる。 By performing the conversion process twice, the evaluation data creation device 10a can fix the weather and light-source data to the reference conditions in the scenarios of the sensor data created by the sensor data creation unit 24. This reduces the load of creating sensor data. It also reduces the load of creating the first trained model.
 評価用データ作成装置10aは、天候光源変換部29で、天候、光源の条件を変更することで、センサデータから変換した第一実データを、所望の天候、光源の条件の第二実データに変換することができる。 By changing the weather and light-source conditions in the weather light source conversion unit 29, the evaluation data creation device 10a can convert the first actual data, which was converted from the sensor data, into second actual data with the desired weather and light-source conditions.
 また、評価用データ作成装置10aは、第2学習済みモデルとして、対応する実データを選択する処理を実行する機械学習としてもよいが、教師データに基づいて、第一実データを第二実データに変換する処理を学習する機械学習とすることが好ましい。評価用データ作成装置10aは、基準状態から所望の天候、光源に変換できる第2学習済みモデル52を用いることで、天候、光源の影響をより高い精度で変換、補正することができる。これにより、天候、光源を変化させた種々の第二実データを作成することができ、評価用データをより多く作成することができる。 The second trained model of the evaluation data creation device 10a may be machine learning that executes a process of selecting corresponding actual data, but it is preferable to use machine learning that learns, based on teacher data, the process of converting the first actual data into the second actual data. By using the second trained model 52, which can convert the reference state to the desired weather and light source, the evaluation data creation device 10a can convert and correct the influence of the weather and light source with higher accuracy. As a result, various second actual data with changed weather and light sources can be created, and a larger amount of evaluation data can be created.
 また、上記実施形態では、実データに対して天候、光源を変更する処理を行ったが、変更する処理は、天候、光源に限定されない。シナリオで設定する種々の条件に基づいて、実データの変更、つまり、実データの補正処理を行ってもよい。これにより、シナリオにより近い実データとすることができる。また、補正処理ができることで、条件が異なる評価用データをより多く作成することができる。 In the above embodiment, the actual data is subjected to a process of changing the weather and the light source, but the changed conditions are not limited to the weather and the light source. The actual data may be changed, that is, corrected, based on the various conditions set in the scenario. This allows the actual data to be closer to the scenario. Furthermore, the availability of correction processing makes it possible to create more evaluation data under different conditions.
 本開示は、以下の発明を開示している。なお、下記に限定されない。
(1)車両の自動運転及び車両の運転支援機能の少なくとも一方を評価する評価用データを作成する評価用データ作成装置であって、前記車両の走行を再現するシナリオを設定するシナリオ設定部と、前記車両のセンサをモデル化したセンサモデルを用いて、前記シナリオ設定部で設定したシナリオを前記車両が走行した場合に取得されるセンサデータを作成するセンサデータ作成部と、前記センサデータ作成部で作成したセンサデータに、前記車両のセンサで取得したデータである実データを対応付け、対応付けた実データに基づいて前記センサデータを変換して、評価用データである変換実データを作成する実データ変換部と、を含む評価用データ作成装置。
This disclosure discloses the following inventions. Note that the examples are not limited to the following.
(1) An evaluation data creation device that creates evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the device including: a scenario setting unit that sets a scenario that reproduces the driving of the vehicle; a sensor data creation unit that creates, using a sensor model that models a sensor of the vehicle, sensor data that would be acquired if the vehicle ran the scenario set by the scenario setting unit; and an actual data conversion unit that associates the sensor data created by the sensor data creation unit with actual data, which is data acquired by the sensor of the vehicle, and converts the sensor data based on the associated actual data to create converted actual data serving as evaluation data.
 これにより、センサモデルから実データに近い値を出力するための精緻なモデルを作成せずに、実際の走行した場合に近い、評価用データを作成することができる。また、シナリオモデルとして簡素なモデルを用いつつ、高い精度の評価ができる評価用データを作成できるため、評価用データの作成にかかる時間を低減することができ、処理を簡単にすることができる。 As a result, it is possible to create evaluation data that is close to actual driving without creating a sophisticated model for outputting values close to actual data from the sensor model. Further, since it is possible to create evaluation data that allows highly accurate evaluation while using a simple model as a scenario model, the time required to create evaluation data can be reduced and processing can be simplified.
(2)前記実データを記憶する記憶部を有し、前記実データ変換部が取得したセンサデータと、前記記憶部に記憶した実データとを比較し、前記センサデータに対応付ける実データを決定する対応付け処理部を有する(1)に記載の評価用データ作成装置。実データを対応付けることで、評価用データが実際のセンサの検出と乖離することを抑制できる。 (2) having a storage unit for storing the actual data, comparing the sensor data acquired by the actual data conversion unit with the actual data stored in the storage unit, and determining actual data to be associated with the sensor data; The evaluation data creation device according to (1), including a correlation processing section. By correlating the actual data, it is possible to prevent the evaluation data from deviating from the actual sensor detection.
(3)前記センサデータと前記実データとが対応付けられた学習データを用いて、前記センサデータを入力とし、対応する実データを出力として学習した学習済モデルで、前記センサデータを処理し、前記センサデータに対応付ける実データを決定する対応付け処理部を有する(1)に記載の評価用データ作成装置。学習済みモデルを用いることで、対応付け処理の負荷を低減することができる。 (3) The evaluation data creation device according to (1), further including an association processing unit that processes the sensor data with a trained model, trained using learning data in which the sensor data and the actual data are associated, with the sensor data as input and the corresponding actual data as output, and determines the actual data to be associated with the sensor data. Using a trained model reduces the load of the association processing.
(4)前記対応付け処理部は、シナリオの条件で分類した複数の学習済モデルを有し、前記シナリオ設定部で設定したシナリオに基づいて使用する学習済みモデルを決定する(3)に記載の評価用データ作成装置。これにより、学習済みモデルの精度を高くすることができる。また、学習時に計算が発散することを抑制でき、モデルの作成負荷も低減できる。 (4) The evaluation data creation device according to (3), wherein the association processing unit has a plurality of trained models classified according to scenario conditions and determines the trained model to be used based on the scenario set by the scenario setting unit. This makes it possible to increase the accuracy of the trained models. It also suppresses divergence of the calculation during training and reduces the model creation load.
(5)前記対応付け処理部は、シナリオの天候及び周囲の光源の少なくとも一方に基づいて、使用する学習済みモデルを決定する(4)に記載の評価用データ作成装置。天候、光源を分類の条件とすることで、シナリオモデルから作成したセンサデータでの判定が難しい条件を識別することができる。例えば、画像の場合、雨なのか雪なのか、曇っていて暗いのか夕方で暗いのか室内で暗いのか等で、学習結果が混在することを抑制できる。これにより、センサデータに基づいた実データの変換の精度をより高くすることができる。 (5) The evaluation data creation device according to (4), wherein the association processing unit determines a trained model to be used based on at least one of the weather of the scenario and the surrounding light source. By using weather and light sources as classification conditions, it is possible to identify conditions that are difficult to judge using sensor data created from a scenario model. For example, in the case of images, it is possible to prevent the learning results from being mixed depending on whether it is raining or snowing, whether it is cloudy and dark, whether it is dark in the evening, whether it is dark indoors, etc. This makes it possible to further improve the accuracy of conversion of actual data based on sensor data.
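One straightforward realization of the classified models in items (4) and (5) is a lookup keyed by the weather and light-source conditions. This is a hypothetical sketch: the dictionary structure and the fallback to a standard-condition model are assumptions, not something the publication prescribes.

```python
# Hypothetical realization of items (4)-(5): hold one trained model per
# (weather, light-source) class and select by the scenario's conditions,
# falling back to a default class when no exact match exists (the
# fallback behavior is an assumption for illustration).

def select_model(models, weather, light, default_key=("clear", "day")):
    """models: dict mapping (weather, light) -> trained model."""
    return models.get((weather, light), models[default_key])
```

Keeping the classification key explicit in this way makes it easy to distinguish, for example, a rain model from a snow model, which is exactly the kind of confusion item (5) says the classification is meant to avoid.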
(6)入力を検出する入力部を有し、前記対応付け処理部は、前記入力部で検出した入力に基づいて、使用する学習済みモデルを決定する(4)または(5)に記載の評価用データ作成装置。これにより、ユーザが必要とする条件に対応する実データに変換することができる。 (6) The evaluation data creation device according to (4) or (5), further including an input unit that detects an input, wherein the association processing unit determines the trained model to be used based on the input detected by the input unit. This makes it possible to convert the data into actual data corresponding to the conditions required by the user.
(7)前記対応付け処理部は、車両の周囲環境に基づいて、使用する学習済みモデルを決定する(4)から(6)のいずれかに記載の評価用データ作成装置。これにより、学習済みモデルの精度を高くすることができる。また、学習時に計算が発散することを抑制でき、モデルの作成負荷も低減できる。 (7) The evaluation data creation device according to any one of (4) to (6), wherein the association processing unit determines a trained model to be used based on the surrounding environment of the vehicle. This makes it possible to increase the accuracy of the trained model. Further, it is possible to suppress calculation divergence during learning, and reduce model creation load.
(8)前記実データ変換部で作成した変換実データに対して、前記シナリオから天候及び周囲の光源の少なくとも一方を変換する処理を行い、第二変換実データを作成する天候・光源変換部を有する(1)から(7)のいずれか一つに記載の評価用データ作成装置。これにより、センサデータの作成の負荷を小さくすることができ、評価用データをより多く作成することができる。 (8) The evaluation data creation device according to any one of (1) to (7), further including a weather/light source conversion unit that performs, on the converted actual data created by the actual data conversion unit, a process of converting at least one of the weather and the surrounding light source according to the scenario, to create second converted actual data. This reduces the load of creating sensor data and allows more evaluation data to be created.
(9)天候・光源変換部は、天候及び周囲の光源の少なくとも一方が異なる実データが対応付けられ、前記シナリオ作成部で作成するシナリオの天候及び周囲の光源の実データを入力とし、天候及び周囲の光源の少なくとも一方が異なる対応付けられた実データを出力として学習した学習済みモデルで、変換実データを前記第二変換実データに変換する(8)に記載の評価用データ作成装置。これにより、シナリオにより近い実データとすることができる。また、補正処理ができることで、条件が異なる評価用データをより多く作成することができる。 (9) The evaluation data creation device according to (8), wherein the weather/light source conversion unit converts the converted actual data into the second converted actual data using a trained model in which pieces of actual data differing in at least one of weather and surrounding light source are associated, the model having been trained with the actual data of the weather and surrounding light source of the scenario created by the scenario creation unit as input and the associated actual data differing in at least one of the weather and the surrounding light source as output. This allows the actual data to be closer to the scenario, and the availability of correction processing makes it possible to create more evaluation data under different conditions.
(10)車両の自動運転及び車両の運転支援機能の少なくとも一方を評価する評価用データを作成する評価用データ作成方法であって、前記車両の走行を再現するシナリオを設定するステップと、前記車両のセンサをモデル化したセンサモデルを用いて、前記シナリオ設定部で設定したシナリオを前記車両が走行した場合に取得されるセンサデータを作成するステップと、作成したセンサデータに、前記車両のセンサで取得したデータである実データを対応付け、対応付けた実データに基づいて前記センサデータを変換して、評価用データである変換実データを作成するステップと、を含む評価用データ作成方法。 (10) An evaluation data creation method for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the method including: a step of setting a scenario that reproduces the driving of the vehicle; a step of creating, using a sensor model that models a sensor of the vehicle, sensor data that would be acquired if the vehicle ran the scenario set by the scenario setting unit; and a step of associating the created sensor data with actual data, which is data acquired by the sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data serving as evaluation data.
 これにより、センサモデルから実データに近い値を出力するための精緻なモデルを作成せずに、実際の走行した場合に近い、評価用データを作成することができる。また、シナリオモデルとして簡素なモデルを用いつつ、高い精度の評価ができる評価用データを作成できるため、評価用データの作成にかかる時間を低減することができ、処理を簡単にすることができる。 As a result, it is possible to create evaluation data that is close to actual driving without creating a sophisticated model for outputting values close to actual data from the sensor model. Further, since it is possible to create evaluation data that allows highly accurate evaluation while using a simple model as a scenario model, the time required to create evaluation data can be reduced and processing can be simplified.
(11)車両の自動運転及び車両の運転支援機能の少なくとも一方を評価する評価用データを作成する評価用データ作成プログラムであって、前記車両の走行を再現するシナリオを設定するステップと、前記車両のセンサをモデル化したセンサモデルを用いて、設定したシナリオを前記車両が走行した場合に取得されるセンサデータを作成するステップと、作成したセンサデータに、前記車両のセンサで取得したデータである実データを対応付け、対応付けた実データに基づいて前記センサデータを変換して、評価用データである変換実データを作成するステップと、を含む処理を実行させる評価用データ作成プログラム。 (11) An evaluation data creation program for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the program causing a computer to execute processing including: a step of setting a scenario that reproduces the driving of the vehicle; a step of creating, using a sensor model that models a sensor of the vehicle, sensor data that would be acquired if the vehicle ran the set scenario; and a step of associating the created sensor data with actual data, which is data acquired by the sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data serving as evaluation data.
 これにより、センサモデルから実データに近い値を出力するための精緻なモデルを作成せずに、実際の走行した場合に近い、評価用データを作成することができる。また、シナリオモデルとして簡素なモデルを用いつつ、高い精度の評価ができる評価用データを作成できるため、評価用データの作成にかかる時間を低減することができ、処理を簡単にすることができる。 As a result, it is possible to create evaluation data that is close to actual driving without creating a sophisticated model for outputting values close to actual data from the sensor model. Further, since it is possible to create evaluation data that allows highly accurate evaluation while using a simple model as a scenario model, the time required to create evaluation data can be reduced and processing can be simplified.
  6 ECU
  8 評価装置
 10 評価用データ作成装置
 12 入力部
 14 出力部
 15 通信部
 16 演算部
 18 記憶部
 22 シナリオ設定部
 24 センサデータ作成部
 26 実データ変換部
 28 対応付け処理部
 30 機械学習部
 32 データ作成プログラム
 34 学習プログラム
 36 シナリオモデル
 37 環境物体モデル
 38 天候モデル
 40 光源モデル
 42 車両モデル
 44 センサモデル
 46 対応付けテーブル
 48 実データ
 50 学習済みモデル
6 ECU
8 Evaluation device 10 Evaluation data creation device 12 Input section 14 Output section 15 Communication section 16 Arithmetic section 18 Storage section 22 Scenario setting section 24 Sensor data creation section 26 Actual data conversion section 28 Correspondence processing section 30 Machine learning section 32 Data creation section Program 34 Learning program 36 Scenario model 37 Environmental object model 38 Weather model 40 Light source model 42 Vehicle model 44 Sensor model 46 Correspondence table 48 Actual data 50 Learned model

Claims (11)

  1.  車両の自動運転及び車両の運転支援機能の少なくとも一方を評価する評価用データを作成する評価用データ作成装置であって、
     前記車両の走行を再現するシナリオを設定するシナリオ設定部と、
     前記車両のセンサをモデル化したセンサモデルを用いて、前記シナリオ設定部で設定したシナリオを前記車両が走行した場合に取得されるセンサデータを作成するセンサデータ作成部と、
     前記センサデータ作成部で作成したセンサデータに、前記車両のセンサで取得したデータである実データを対応付け、対応付けた実データに基づいて前記センサデータを変換して、評価用データである変換実データを作成する実データ変換部と、を含む評価用データ作成装置。
    An evaluation data creation device that creates evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of a vehicle, the device comprising:
    a scenario setting unit that sets a scenario for reproducing the driving of the vehicle;
    a sensor data creation unit that creates sensor data that is acquired when the vehicle runs a scenario set by the scenario setting unit, using a sensor model that models sensors of the vehicle;
    an actual data conversion unit that associates the sensor data created by the sensor data creation unit with actual data, which is data acquired by the sensor of the vehicle, and converts the sensor data based on the associated actual data to create converted actual data serving as evaluation data.
  2.  前記実データを記憶する記憶部を有し、
     前記実データ変換部が取得したセンサデータと、前記記憶部に記憶した実データとを比較し、前記センサデータに対応付ける実データを決定する対応付け処理部を有する請求項1に記載の評価用データ作成装置。
    The evaluation data creation device according to claim 1, further comprising:
    a storage unit that stores the actual data; and
    an association processing unit that compares the sensor data acquired by the actual data conversion unit with the actual data stored in the storage unit, and determines the actual data to be associated with the sensor data.
  3.  前記センサデータと前記実データとが対応付けられた学習データを用いて、前記センサデータを入力とし、対応する実データを出力として学習した学習済モデルで、前記センサデータを処理し、前記センサデータに対応付ける実データを決定する対応付け処理部を有する請求項1に記載の評価用データ作成装置。 The evaluation data creation device according to claim 1, further comprising an association processing unit that processes the sensor data with a trained model, trained using learning data in which the sensor data and the actual data are associated, with the sensor data as input and the corresponding actual data as output, and determines the actual data to be associated with the sensor data.
  4.  前記対応付け処理部は、シナリオの条件で分類した複数の学習済モデルを有し、前記シナリオ設定部で設定したシナリオに基づいて使用する学習済みモデルを決定する請求項3に記載の評価用データ作成装置。 The evaluation data creation device according to claim 3, wherein the association processing unit has a plurality of trained models classified according to scenario conditions and determines the trained model to be used based on the scenario set by the scenario setting unit.
  5.  前記対応付け処理部は、シナリオの天候及び周囲の光源の少なくとも一方に基づいて、使用する学習済みモデルを決定する請求項4に記載の評価用データ作成装置。 The evaluation data creation device according to claim 4, wherein the association processing unit determines the trained model to be used based on at least one of the weather of the scenario and the surrounding light source.
  6.  入力を検出する入力部を有し、
     前記対応付け処理部は、前記入力部で検出した入力に基づいて、使用する学習済みモデルを決定する請求項4に記載の評価用データ作成装置。
    an input unit that detects an input;
    The evaluation data creation device according to claim 4, wherein the association processing unit determines a trained model to be used based on the input detected by the input unit.
  7.  前記対応付け処理部は、車両の周囲環境に基づいて、使用する学習済みモデルを決定する請求項4に記載の評価用データ作成装置。 The evaluation data creation device according to claim 4, wherein the association processing unit determines the trained model to be used based on the surrounding environment of the vehicle.
  8.  前記実データ変換部で作成した変換実データに対して、前記シナリオから天候及び周囲の光源の少なくとも一方を変換する処理を行い、第二変換実データを作成する天候・光源変換部を有する請求項1から7のいずれか一項に記載の評価用データ作成装置。 The evaluation data creation device according to any one of claims 1 to 7, further comprising a weather/light source conversion unit that performs, on the converted actual data created by the actual data conversion unit, a process of converting at least one of the weather and the surrounding light source according to the scenario, to create second converted actual data.
  9.  天候・光源変換部は、天候及び周囲の光源の少なくとも一方が異なる実データが対応付けられ、前記シナリオ作成部で作成するシナリオの天候及び周囲の光源の実データを入力とし、天候及び周囲の光源の少なくとも一方が異なる対応付けられた実データを出力として学習した学習済みモデルで、変換実データを前記第二変換実データに変換する請求項8に記載の評価用データ作成装置。 The evaluation data creation device according to claim 8, wherein the weather/light source conversion unit converts the converted actual data into the second converted actual data using a trained model in which pieces of actual data differing in at least one of weather and surrounding light source are associated, the model having been trained with the actual data of the weather and surrounding light source of the scenario created by the scenario creation unit as input and the associated actual data differing in at least one of the weather and the surrounding light source as output.
  10.  車両の自動運転及び車両の運転支援機能の少なくとも一方を評価する評価用データを作成する評価用データ作成方法であって、
     前記車両の走行を再現するシナリオを設定するステップと、
    前記車両のセンサをモデル化したセンサモデルを用いて、前記シナリオ設定部で設定したシナリオを前記車両が走行した場合に取得されるセンサデータを作成するステップと、
     作成したセンサデータに、前記車両のセンサで取得したデータである実データを対応付け、対応付けた実データに基づいて前記センサデータを変換して、評価用データである変換実データを作成するステップと、を含む評価用データ作成方法。
    An evaluation data creation method for creating evaluation data for evaluating at least one of a vehicle's automatic driving and a vehicle's driving support function, the method comprising:
    setting a scenario that reproduces the driving of the vehicle;
    Using a sensor model that models the sensors of the vehicle, creating sensor data that will be acquired when the vehicle runs the scenario set by the scenario setting unit;
    a step of associating the created sensor data with actual data, which is data acquired by the sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data serving as evaluation data.
  11.  車両の自動運転及び車両の運転支援機能の少なくとも一方を評価する評価用データを作成する評価用データ作成プログラムであって、
     前記車両の走行を再現するシナリオを設定するステップと、
     前記車両のセンサをモデル化したセンサモデルを用いて、設定したシナリオを前記車両が走行した場合に取得されるセンサデータを作成するステップと、
     作成したセンサデータに、前記車両のセンサで取得したデータである実データを対応付け、対応付けた実データに基づいて前記センサデータを変換して、評価用データである変換実データを作成するステップと、を含む処理を実行させる評価用データ作成プログラム。
    An evaluation data creation program for creating evaluation data for evaluating at least one of automatic driving of a vehicle and a driving support function of the vehicle, the program causing a computer to execute processing comprising:
    setting a scenario that reproduces the driving of the vehicle;
    creating sensor data to be obtained when the vehicle travels through a set scenario using a sensor model that models the sensors of the vehicle;
    a step of associating the created sensor data with actual data, which is data acquired by the sensor of the vehicle, and converting the sensor data based on the associated actual data to create converted actual data serving as evaluation data.
PCT/JP2023/006272 2022-04-28 2023-02-21 Evaluation data creation device, evaluation data creation method, and evaluation data creation program WO2023210132A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022075352A JP2023164047A (en) 2022-04-28 2022-04-28 Evaluation data preparation device, evaluation data preparation method, and evaluation data preparation program
JP2022-075352 2022-04-28

Publications (1)

Publication Number Publication Date
WO2023210132A1 true WO2023210132A1 (en) 2023-11-02

Family

ID=88518561

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/006272 WO2023210132A1 (en) 2022-04-28 2023-02-21 Evaluation data creation device, evaluation data creation method, and evaluation data creation program

Country Status (2)

Country Link
JP (1) JP2023164047A (en)
WO (1) WO2023210132A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016001172A (en) * 2014-05-19 2016-01-07 株式会社堀場製作所 Vehicle test system, test management device, test management program, and vehicle test method
JP2017105453A (en) * 2015-12-08 2017-06-15 ローベルト ボツシユ ゲゼルシヤフト ミツト ベシユレンクテル ハフツングRobert Bosch Gmbh Method for validating drive assistance function of motor vehicle
JP2019043157A (en) * 2017-08-29 2019-03-22 トヨタ自動車株式会社 Autonomous driving evaluation device and autonomous driving evaluation method


Also Published As

Publication number Publication date
JP2023164047A (en) 2023-11-10


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23795880

Country of ref document: EP

Kind code of ref document: A1