EP4315069A1 - Verfahren zum bewerten einer software für ein steuergerät eines fahrzeugs - Google Patents

Verfahren zum bewerten einer software für ein steuergerät eines fahrzeugs

Info

Publication number
EP4315069A1
Authority
EP
European Patent Office
Prior art keywords
version
objects
data
vehicle
relevant
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22706756.8A
Other languages
German (de)
English (en)
French (fr)
Inventor
Dora WILMES
Marina Ullmann
Matthias Birk
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH filed Critical Robert Bosch GmbH
Publication of EP4315069A1 (legal status: Pending)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3696Methods or tools to render software testable
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0098Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/07Responding to the occurrence of a fault, e.g. fault tolerance
    • G06F11/16Error detection or correction of the data by redundancy in hardware
    • G06F11/1629Error detection by comparing the output of redundant processing systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3688Test management for test execution, e.g. scheduling of test suites
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F11/00Error detection; Error correction; Monitoring
    • G06F11/36Preventing errors by testing or debugging software
    • G06F11/3668Software testing
    • G06F11/3672Test management
    • G06F11/3692Test management for test results analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/45External transmission of data to or from the vehicle
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/217Validation; Performance evaluation; Active pattern learning techniques

Definitions

  • the invention relates to a method for evaluating software for a control unit of a vehicle. Furthermore, the invention relates to a control device, a computer program and a computer-readable medium for executing the aforementioned method.
  • a vehicle such as a passenger car or truck can be equipped with a driver assistance system that enables partially or fully automated control of the vehicle.
  • the driver assistance system can recognize positions, orientations and/or object types of objects in the vicinity of the vehicle, for example by means of a suitable sensor system, and steer, brake and/or accelerate the vehicle taking these objects into account.
  • such a driver assistance system is generally subject to strict safety requirements. Updating the driver assistance system, for example to improve or expand it, can be very time-consuming, since each time individual components are updated, the entire system must be released.
  • Embodiments of the present invention make it possible to run a not yet released version of object recognition software for a control unit of a vehicle in parallel with an already released version of the object recognition software on the control unit, to evaluate the not yet released version with regard to its recognition quality by comparing recognition results of the two versions and to send a corresponding evaluation result to a central data processing device for further evaluation.
  • since the evaluation results, i.e. real data, can thus be obtained from a correspondingly large number of production vehicles, a very fast and cost-effective validation of the not yet released version, for example a new version of an object recognition or sensor data fusion module, can be guaranteed.
  • a first aspect of the invention relates to a computer-implemented method for evaluating software for a control unit of a vehicle.
  • the control unit includes a memory in which a first version and a second version of the software are stored, and a processor for executing the first version and the second version.
  • the method comprises at least the following steps: receiving, in the control unit, sensor data which were generated by a sensor system for detecting an environment of the vehicle; entering the sensor data into the first version and the second version; generating first object data from the sensor data by the first version, the first object data comprising positions and/or orientations of first objects in the area surrounding the vehicle detected by the first version; generating second object data from the sensor data by the second version, the second object data comprising positions and/or orientations of second objects in the area surrounding the vehicle detected by the second version; assessing the second version with regard to a recognition quality by comparing the first object data with the second object data, wherein an evaluation result is generated; and sending the evaluation result from the control unit to a central data processing device.
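  • as an illustration only, the sequence of these steps could be sketched in Python roughly as follows; the object representation, the comparison metric and the backend interface are assumptions made for this sketch and are not part of the claimed method.

```python
import math
from dataclasses import dataclass

@dataclass
class DetectedObject:
    x: float            # position relative to the vehicle (assumed coordinates)
    y: float
    heading: float      # orientation in radians
    t_detect: float     # time at which the object was first detected

def same_object(a, b, max_distance=2.0):
    """Assume two object models describe the same real object if their
    positions are close enough together."""
    return math.hypot(a.x - b.x, a.y - b.y) <= max_distance

def compare_object_data(first_objects, second_objects):
    """Coarse comparison of the two object lists: count objects that only
    one of the versions recognized; this stands in for the assessment step."""
    missed = sum(1 for fo in first_objects
                 if not any(same_object(fo, so) for so in second_objects))
    extra = sum(1 for so in second_objects
                if not any(same_object(fo, so) for fo in first_objects))
    return {"missed_by_second": missed, "extra_in_second": extra}

def evaluate_shadow_version(sensor_data, released_version, shadow_version, backend):
    """Enter the same sensor data into both versions, compare the resulting
    object data and send the evaluation result to the central backend."""
    first_objects = released_version(sensor_data)    # first object data
    second_objects = shadow_version(sensor_data)     # second object data
    evaluation_result = compare_object_data(first_objects, second_objects)
    backend.send(evaluation_result)                  # e.g. via a cellular link
    return evaluation_result
```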
  • the method can be executed automatically by the processor of the control device, for example.
  • the method can be executed when a command generated by the central data processing device for executing the method is received in the control unit.
  • the vehicle may be an automobile, such as a car, truck, bus, or motorcycle.
  • a vehicle can also be understood as an autonomous, mobile robot.
  • the sensor system can include at least one environment sensor such as a camera, a radar, lidar or ultrasonic sensor.
  • the sensors can include a location sensor for determining the geographic coordinates of the vehicle using a global navigation satellite system such as GPS, GLONASS or similar.
  • the sensor system for detecting a driving condition of the vehicle can have at least one driving dynamics sensor, such as an acceleration, wheel speed, steering wheel angle or yaw rate sensor.
  • positions and/or orientations of the objects relative to the vehicle can be determined in several consecutive time steps and stored in the form of an object list in an environment model. It is possible that in each current time step future positions and/or orientations of the objects are estimated from their positions and/or orientations in one or more previous time steps.
  • the sensor data can be received in several consecutive time steps, whereby the sensor data in each time step can be entered both into the first version and into the second version.
  • the controller software can be configured to steer, accelerate and/or decelerate the vehicle based on the sensor data.
  • the vehicle can include a corresponding actuator, for example in the form of a steering actuator, a brake actuator, an engine control unit, an electric drive motor or a combination of at least two of the examples mentioned.
  • the control unit software can include one or more components of a driver assistance system.
  • the central data processing device can be a server, a PC, a laptop, a tablet or a smartphone, for example.
  • the control device and the central data processing device can be connected to one another via a wireless data communication connection such as WLAN, Bluetooth and/or cellular.
  • a wired data communication connection between the control device and the central data processing device is also possible.
  • the method can additionally include the following steps: receiving the second version in the control unit; storing the second version in the controller's memory.
  • the second version can be generated by the central data processing device and/or sent from the central data processing device to the control device.
  • the second version can be received in the control unit and stored there, for example in the form of a file that can be executed by the processor.
  • the first version can be an older version of the software that has already been released.
  • the second version may be a newer, unreleased version of the software.
  • the second version may include an updated version of one or more software modules of the first version, such as a detection or sensor data fusion module for detecting the objects in the sensor data or an interpretation module for interpreting the objects in terms of their relevance to the vehicle.
  • the first version includes a first detection module for converting the sensor data into the first object data and/or a first interpretation module for determining objects relevant to the vehicle based on the first object data and/or the sensor data.
  • the first recognition module or the first interpretation module can be an already released software module of the software.
  • the second version may include a second detection module for converting the sensor data into the second object data and/or a second interpretation module for determining objects relevant to the vehicle based on the second object data and/or the sensor data.
  • the second recognition module or the second interpretation module can be a software module of the software that has not yet been released.
  • the first version and the second version can, for example, be executed in parallel processes by the processor of the control device.
  • the second version can be executed in an isolated area within which the second version can be executed without safety-relevant effects on hardware and software components of the vehicle located outside this area, for example on the first version or an actuator of the vehicle.
  • Software running in such an isolated area may also be referred to as shadow mode software.
  • the first version and the second version can be executed in different operating environments.
  • the second version can be executed in an operating environment that is restricted compared to the operating environment of the first version.
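  • a minimal sketch, assuming a Python host environment, of how the two versions could run in parallel processes: the shadow process receives only sensor data and returns only object data, so it holds no actuator handle; the placeholder detection functions and the queue layout are purely illustrative.

```python
import multiprocessing as mp

def detect_objects_v1(sensor_data):
    """Placeholder for the released detection module (first version)."""
    return [obj for obj in sensor_data if obj.get("confidence", 0.0) > 0.5]

def detect_objects_v2(sensor_data):
    """Placeholder for the not yet released detection module (second version)."""
    return [obj for obj in sensor_data if obj.get("confidence", 0.0) > 0.4]

def run_shadow_version(sensor_queue, result_queue):
    """The second version runs in its own process; it only emits object data
    and holds no actuator handle, so it cannot influence vehicle control."""
    while True:
        sensor_data = sensor_queue.get()
        if sensor_data is None:              # sentinel to stop the shadow process
            break
        result_queue.put(detect_objects_v2(sensor_data))

if __name__ == "__main__":
    sensor_queue, result_queue = mp.Queue(), mp.Queue()
    shadow = mp.Process(target=run_shadow_version,
                        args=(sensor_queue, result_queue), daemon=True)
    shadow.start()

    frame = [{"id": 1, "confidence": 0.45}, {"id": 2, "confidence": 0.9}]
    first_objects = detect_objects_v1(frame)   # active path in the main process
    sensor_queue.put(frame)                    # same sensor data into the shadow path
    second_objects = result_queue.get()
    sensor_queue.put(None)                     # shut the shadow process down
    print(first_objects, second_objects)
```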
  • the first object data can include object classes of the first objects in addition to the positions and/or orientations of the first objects.
  • the second object data can include object classes of the second objects in addition to the positions and/or orientations of the second objects.
  • an object class can be an object type, such as “oncoming vehicle”, “vehicle in front”, “lane marking”, “pedestrian” or similar.
  • the object classes can be assigned to the first or second objects by evaluating the sensor data using one or more classifiers. More than one object class can also be assigned to one and the same object.
  • the first object data and the second object data can be compared with one another, for example, by comparing the positions and/or orientations of the first objects with the positions and/or orientations of the second objects and/or by comparing the recognition times at which the first objects were detected with the detection times at which the second objects were detected.
  • other comparison methods are also possible.
  • links between a first object and a corresponding second object can be determined.
  • the positions and/or orientations and/or the detection times of the objects linked to one another can be compared with one another.
  • the objects that are linked to one another can be object models that describe one and the same actual object in the area surrounding the vehicle.
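  • such a linking could, purely as an illustration, look like the following Python sketch, which greedily pairs each first object with the nearest second object and then compares the detection times of the linked pairs; the field names ("position", "t_detect") are assumptions.

```python
import math

def link_objects(first_objects, second_objects, max_distance=2.0):
    """Greedily link each first object to the nearest not yet used second
    object; a linked pair is assumed to model the same real object."""
    links, used = [], set()
    for fo in first_objects:
        best_idx, best_dist = None, max_distance
        for idx, so in enumerate(second_objects):
            if idx in used:
                continue
            dist = math.dist(fo["position"], so["position"])
            if dist <= best_dist:
                best_idx, best_dist = idx, dist
        if best_idx is not None:
            used.add(best_idx)
            links.append((fo, second_objects[best_idx]))
    return links

def detection_time_deltas(links):
    """Negative values mean the second version detected the object earlier."""
    return [so["t_detect"] - fo["t_detect"] for fo, so in links]
```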
  • by evaluating the second version, it can be determined whether its recognition quality, i.e. the recognition accuracy and reliability with which objects are recognized by the second version, is worse than or at least as good as the recognition quality of the first version.
  • the evaluation result can include, for example, statistical estimates for the recognition accuracy and reliability of the second version, both in absolute terms and in relation to the first version. Additionally or alternatively, the evaluation result can include data sequences from the first and/or second object data and/or the sensor data that are relevant for assessing the recognition quality of the second version.
  • since the evaluation result is sent to the central data processing device, for example via a wireless data communication link, the data sequences can be evaluated at a central location regardless of where the vehicle is located.
  • Such a data sequence can include, for example, an object list with objects relevant to the assessment of the second version with regard to the recognition quality and/or the (raw) sensor data on which these objects are based.
  • the second version can be rated worse than the first version in terms of recognition quality if it is determined that an object that was recognized by the first version was not recognized by the second version.
  • conversely, the second version can be rated better than the first version in terms of recognition quality if it is determined that an object that was recognized by the second version was not recognized by the first version.
  • an object can be recognized as relevant or not relevant for the vehicle depending on its distance and/or its relative speed relative to the vehicle and/or depending on its object class.
  • the approach described here and below is based on the fact that software can be executed in a so-called shadow mode in production vehicles to support the development of driver assistance systems.
  • the software or individual software modules run passively in an isolated area without affecting active components in the vehicle.
  • newly developed software versions can be uploaded and evaluated in quick iterations.
  • an evaluation framework can be used for this purpose, which triggers the recording of data sequences based on defined trigger logic and ensures wireless data transmission to a cloud.
  • newly developed software versions can be compared and evaluated very quickly with a large amount of data that corresponds to what is happening in the field.
  • such a shadow mode can now be used to determine whether an updated version of software for a control unit of a vehicle achieves certain target parameters just as well as or better than an already released version of this software that is active in the vehicle.
  • the results of the updated version running in shadow mode in the vehicle can be compared directly with the results of the version that has already been released.
  • the first version is executed by a first controller and the second version is executed by a second controller.
  • the second control device can run the (released) first version or another released version of the software.
  • the first control unit can be the control unit of a first vehicle, for example.
  • the second control unit can be the control unit of a second vehicle, for example.
  • the first control device and the second control device can be connected to one another for data communication, for example via a wireless data communication connection.
  • the first object data can be generated by the first control unit from first sensor data, for example, it being possible for the first sensor data to have been generated by a first sensor system for detecting an environment of the first vehicle.
  • the second object data can be generated by the second control unit from second sensor data, for example, it being possible for the second sensor data to have been generated by a second sensor system for detecting surroundings of the second vehicle.
  • the first control device preferably receives the second object data from the second control device.
  • the evaluation result can then be generated by the first control device by comparing the first object data with the second object data received from the second control unit.
  • the evaluation result can be sent from the first control unit to the central data processing device and, in addition, to the second control unit.
  • a second aspect of the invention relates to a control device, comprising a processor which is configured to carry out the method according to an embodiment of the first aspect of the invention.
  • a control device comprising a processor which is configured to carry out the method according to an embodiment of the first aspect of the invention.
  • Features of the method according to an embodiment of the first aspect of the invention can also be features of the control unit and vice versa.
  • the control unit can include hardware and/or software modules.
  • control unit can include a memory and data communication interfaces for data communication with peripheral devices.
  • a third aspect of the invention relates to a computer program.
  • the computer program comprises instructions which, when the computer program is executed by a processor, cause the processor to carry out the method according to an embodiment of the first aspect of the invention.
  • a fourth aspect of the invention relates to a computer-readable medium on which the computer program according to an embodiment of the third aspect of the invention is stored.
  • the computer-readable medium can be a volatile or non-volatile data memory.
  • the computer-readable medium can be a hard drive, USB storage device, RAM, ROM, EPROM, or flash memory.
  • the computer-readable medium can also be a data communication network that provides the program code for download, such as the Internet or a data cloud (cloud).
  • At least one first evaluation parameter is determined for each first object, which indicates how well the first object was recognized by the first version.
  • At least one second evaluation parameter is determined for each second object, which indicates how well the second object was recognized by the second version.
  • the second version is then scored by comparing the first scoring parameters to the second scoring parameters.
  • a first or second evaluation parameter can be, for example, a time of recognition or a statistical value, for example a confidence with regard to the recognized positions and/or orientations.
  • the second version can be evaluated by comparing the first evaluation parameters and second evaluation parameters of identical objects.
  • a first object can be evaluated with a second object that matches the first object by comparing the first evaluation parameter(s) assigned to the first object with the second evaluation parameter(s) assigned to the second object. For example, to compare the first version with the second version, a difference can be formed from the first evaluation parameter(s) and the corresponding second evaluation parameter(s), wherein the second version can be evaluated based on the difference. This enables the software to be evaluated based on individual detected objects.
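  • a hypothetical sketch of this difference formation follows; the evaluation parameters used here (detection time and a confidence value) are only examples of the parameters mentioned above.

```python
def score_pair(first_obj, second_obj):
    """Difference of corresponding evaluation parameters of a matched pair:
    a negative detection-time difference and a positive confidence difference
    both speak in favour of the second version."""
    return {
        "dt_detect": second_obj["t_detect"] - first_obj["t_detect"],
        "d_confidence": second_obj["confidence"] - first_obj["confidence"],
    }

def rate_second_version(matched_pairs):
    """Aggregate the per-object differences into a simple overall rating."""
    scores = [score_pair(fo, so) for fo, so in matched_pairs]
    return {
        "pairs": len(scores),
        "detected_earlier": sum(1 for s in scores if s["dt_detect"] < 0),
        "higher_confidence": sum(1 for s in scores if s["d_confidence"] > 0),
    }
```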
  • the first evaluation parameter is a detection time at which the first object was detected by the first version.
  • the second evaluation parameter can be a detection time at which the second object was detected by the second version.
  • the second version can be rated better than the first version in terms of recognition quality if the recognition time at which an object was recognized by the second version is earlier than the recognition time at which the same object was recognized by the first version, and vice versa. Such a comparison of the detection times is easy to carry out and provides a sufficiently accurate assessment result.
  • the first evaluation parameter is a probability regarding the position and/or orientation of the first object.
  • the second evaluation parameter can be a probability regarding the position and/or orientation of the second object.
  • the probability can specify the precision of the positions and/or orientations, more precisely the precision of a position parameter with regard to a probability distribution of the positions and/or orientations.
  • the first or second evaluation parameter can be a position and/or scatter parameter of a probability distribution, for example. It is also possible for the first or second evaluation parameter to indicate a confidence interval.
  • the accuracy and reliability of the method can thus be increased.
  • first objects relevant to the vehicle are selected from the first objects by the first version.
  • Objects that match one another are determined by comparing the relevant first objects with the second objects.
  • the evaluation parameters of the objects that match one another are then compared with one another.
  • a relevant first object can be selected from the first objects, for example depending on its distance and/or its relative speed relative to the vehicle and/or depending on its object class. This can be done using the first interpretation module of the first version, which can be configured to determine the relevance of the first objects depending on the situation and/or function, for example by dividing the first objects into different relevance categories, in the simplest case into the relevance categories “relevant” and “not relevant”.
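  • a minimal sketch of such a relevance selection, assuming objects with a distance, a relative speed and an object class; the thresholds, class names and the sign convention for the relative speed are purely illustrative.

```python
def select_relevant(objects, max_distance=60.0, min_closing_speed=0.5,
                    relevant_classes=("vehicle", "pedestrian", "cyclist")):
    """Mark an object as relevant based on its distance, its relative speed
    towards the ego vehicle and its object class; everything else is marked
    as not relevant."""
    relevant = []
    for obj in objects:
        close_enough = obj["distance"] <= max_distance
        approaching = obj["relative_speed"] <= -min_closing_speed  # negative = closing in
        right_class = obj["object_class"] in relevant_classes
        obj["relevance"] = ("relevant"
                            if close_enough and (approaching or right_class)
                            else "not relevant")
        if obj["relevance"] == "relevant":
            relevant.append(obj)
    return relevant
```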
  • second objects relevant to the vehicle are selected by the second version from the second objects that do not match any relevant first object.
  • an individual evaluation is generated for each relevant second object, which indicates whether the recognition of the relevant second object by the second version corresponds to an improvement or deterioration in the recognition quality of the second version compared to the first version.
  • the second version is then further evaluated based on the individual ratings. For this purpose, it can first be determined whether the second object data contain second objects that do not match any of the relevant first objects, i.e. that were not recognized or at least not recognized as relevant by the first version. It can also be determined whether the second objects that do not match any of the relevant first objects are relevant to the vehicle or not.
  • this can be done using the second interpretation module of the second version, which - analogous to the first interpretation module - can be configured to determine the relevance of the second objects depending on the situation and/or function, for example by dividing the second objects into different relevance categories, in the simplest case the relevance categories “relevant” and “not relevant”.
  • the individual assessments can, for example, be sent to the central data processing device as part of the assessment result.
  • the assessment result can include the object data and/or sensor data on which the respective individual assessments are based. It is possible for the object data and/or sensor data to be sent as part of the assessment result only if the individual assessments on which they are based indicate a deterioration in the recognition quality of the second version in relation to the first version.
  • the second version can be evaluated depending on whether the second version recognizes relevant objects that were not already recognized by the first version. For example, the rating of the second version can be recalculated with each individual rating.
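  • a hypothetical sketch of these individual ratings; the simple confidence threshold only stands in for the defined decision logic, for example an evaluation of the driver's reaction as described below.

```python
def individual_ratings(relevant_first, relevant_second, same_object):
    """For every relevant second object without a matching relevant first
    object, produce an individual rating stating whether its recognition is
    treated as an improvement or a deterioration."""
    ratings = []
    for so in relevant_second:
        if any(same_object(fo, so) for fo in relevant_first):
            continue          # already covered by the pairwise comparison
        # Placeholder decision; defined trigger logic (e.g. the driver's
        # reaction, see below) would be applied here.
        verdict = "improvement" if so.get("confidence", 0.0) >= 0.8 else "deterioration"
        ratings.append({"object_id": so["id"], "verdict": verdict})
    return ratings

def update_overall_rating(overall, ratings):
    """Recalculate the running rating of the second version with each
    individual rating."""
    for r in ratings:
        overall[r["verdict"]] = overall.get(r["verdict"], 0) + 1
    return overall
```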
  • changes in a driving state of the vehicle are determined which correlate in time with detection times at which the relevant second objects were detected by the second version.
  • each individual evaluation is generated by evaluating the change in the driving state that correlates in time with the respective relevant second object.
  • the sensor data and/or the driving dynamics data can be evaluated, for example, to determine a reaction of a driver of the vehicle at the time of detection of the object in question and to interpret this. If, for example, no reaction or at least no relevant reaction by the driver can be determined, this can be interpreted as a strong indication that the recognition quality was not appreciably improved with the recognition of the object in question, and vice versa.
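  • one possible, simplified realisation of this check; the time window, the thresholds and the field names of the driving dynamics samples are assumptions.

```python
def driver_reacted(dynamics_log, t_detect, window=1.5,
                   brake_threshold=-2.0, steer_rate_threshold=0.1):
    """Look for a change in the driving state that correlates in time with
    the detection: a strong deceleration or steering input shortly after
    t_detect is taken as a relevant driver reaction."""
    for sample in dynamics_log:
        if t_detect <= sample["t"] <= t_detect + window:
            if sample["longitudinal_accel"] <= brake_threshold:
                return True
            if abs(sample["steering_rate"]) >= steer_rate_threshold:
                return True
    return False

def rate_extra_detection(dynamics_log, t_detect):
    """No relevant driver reaction is interpreted as a strong indication that
    the extra detection did not appreciably improve the recognition quality."""
    return "improvement" if driver_reacted(dynamics_log, t_detect) else "no improvement"
```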
  • the second object data is generated in a number of consecutive time steps.
  • the second objects are checked for plausibility by comparing the second object data from different time steps.
  • the second version is then also evaluated as a function of the plausibility of the second objects. For example, the positions and/or orientations of one and the same object from different time steps can be compared with one another in order to identify inconsistencies, i.e. implausible changes in the position and/or orientation of the object. This makes it possible to determine whether the second version supplies chronologically consistent and plausible object data. It is possible for object data relating to individual, implausible objects, such as their positions and/or orientations over a number of consecutive time steps, to be sent to the central data processing device as part of the assessment result.
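  • a minimal sketch of such a plausibility check over consecutive time steps; the track representation and the jump threshold are assumptions made for illustration.

```python
def is_plausible(track, max_jump=3.0):
    """Compare the positions of one and the same second object over
    consecutive time steps; a large jump, or an object that exists for a
    single time step only, is treated as implausible."""
    if len(track) < 2:
        return False
    for prev, curr in zip(track, track[1:]):
        dx = curr["position"][0] - prev["position"][0]
        dy = curr["position"][1] - prev["position"][1]
        if (dx * dx + dy * dy) ** 0.5 > max_jump:
            return False
    return True

def implausible_objects(tracks_by_id):
    """tracks_by_id maps an object id to its object data per time step."""
    return [oid for oid, track in tracks_by_id.items() if not is_plausible(track)]
```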
  • the assessment result includes data sequences from the sensor data, the first object data and/or the second object data.
  • the evaluation of the second version can be based on the data sequences.
  • the data sequences can indicate an improvement or deterioration in the recognition quality of the second version compared to the first version.
  • the data sequences are only sent if the second version was rated worse than the first version in terms of recognition quality.
  • the efficiency of the data communication between the control device and the central data processing device can be improved.
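  • the corresponding trigger logic could be sketched as follows; the payload structure and the backend interface are hypothetical.

```python
def maybe_send_sequence(backend, rating, sensor_window, object_window):
    """Send the underlying data sequence only if the second version was rated
    worse than the first version; otherwise only the compact rating leaves
    the vehicle, which keeps the transmitted data volume small."""
    payload = {"rating": rating}
    if rating.get("verdict") == "deterioration":
        payload["data_sequence"] = {
            "sensor_data": sensor_window,   # excerpt of the raw sensor data
            "object_data": object_window,   # corresponding first/second object lists
        }
    backend.send(payload)
```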
  • FIG. 1 shows a vehicle with a control device according to an exemplary embodiment of the invention.
  • FIG. 2 shows various modules of software running on the control device from FIG. 1.
  • FIG. 1 shows a vehicle 100 that is equipped with a control unit 102 , a sensor system 104 for detecting surroundings of the vehicle 100 , and an actuator system 106 .
  • the sensor system 104 can include a camera, a radar, lidar and/or ultrasonic sensor, for example.
  • the sensor system 104 can include at least one vehicle dynamics sensor, such as an acceleration or yaw rate sensor.
  • the sensor system 104 generates sensor data 108 in several consecutive time steps, which are received in the control unit 102 and evaluated there as part of an object recognition.
  • Control unit 102 includes a memory 110 on which appropriate software 111 is stored, and a processor 112 for executing software 111.
  • it is possible for control unit 102 to generate a control signal 114 for automatically activating actuator system 106 based on sensor data 108, i.e. based on the results of the object recognition carried out therewith.
  • actuator system 106 can include, for example, one or more steering and/or braking actuators for steering or braking vehicle 100.
  • First objects 116 in the area surrounding vehicle 100 are detected by a first version 111a of software 111 evaluating sensor data 108.
  • second objects 118 in the area surrounding vehicle 100 are detected by evaluating sensor data 108 using a second version 111b of software 111.
  • the first objects 116 and the second objects 118, here for example vehicles driving ahead, can be identical or different objects.
  • the control device 102 can evaluate a recognition quality of the second version 111b compared to the first version 111a. This is described in more detail below with reference to FIG. 2.
  • the second version 111b can be an update of the first version 111a, for example an improved and/or expanded version of the software 111 compared to the first version 111a.
  • the second version 111b can also be referred to as shadow software.
  • the second version 111b can also be allocated to a separate control device in the same way.
  • the sensor data 108 enter both a first recognition module 202 of the first version 111a and a second recognition module 204 of the second version 111b.
  • the first detection module 202 uses the sensor data 108 to generate first object data 206, which include the positions and/or orientations of the first objects 116 relative to vehicle 100.
  • from sensor data 108, second detection module 204 generates second object data 208, which include positions and/or orientations of second objects 118 relative to vehicle 100.
  • Detected objects 116, 118 can be stored together with their respective positions and/or orientations, for example in the form of object lists, in an environment model of the environment of vehicle 100 and continuously updated there based on sensor data 108.
  • the first objects 116 and the second objects 118 can be understood as object models stored in the environment model of objects actually present in the environment of the vehicle 100 .
  • the first object data 206 and/or the second object data 208 can specify one or more object types for each detected object 116 or 118, such as “vehicle”, “pedestrian” or “lane marking”.
  • the first object data 206 and the second object data 208 are entered into an evaluation module 210 of the software 111, which in this example, like the second version 111b, runs in the restricted operating environment 200 for safety reasons.
  • the evaluation module 210 evaluates the recognition quality of the second version 111b in relation to the first version 111a by suitably comparing the first object data 206 with the second object data 208 .
  • the evaluation module 210 generates a corresponding evaluation result 212 and sends this, for example via a WLAN, Bluetooth and/or cellular connection, to a central data processing device 214 outside of the vehicle 100 for further evaluation.
  • the first objects 116 can be compared with the corresponding second objects 118 using one or more suitable assessment parameters, for example using the respective detection times or using one or more estimated values with regard to the accuracy and/or reliability of the respective positions and/or orientations.
  • the first object data 206 can be interpreted by a first interpretation module 216 of the first version 111a, i.e. evaluated with regard to their relevance for the vehicle 100.
  • the first interpretation module 216 can select one or more relevant first objects 116′ from the first objects 116, in FIG. 1 for example a vehicle driving ahead to the left of the vehicle 100, and send correspondingly filtered first object data 206′ to the evaluation module 210.
  • the evaluation module 210 can then link the relevant first objects 116' to the corresponding second objects 118, for example based on the respective positions and/or orientations and/or the respective detection times. Subsequently, the evaluation parameters of the linked objects can be compared with one another in order to evaluate the second version 111b.
  • the objects that are linked to one another can be objects that correspond to one another in that they are object models of one and the same object that actually exists in the environment of the vehicle 100 .
  • the second object data 208 can be interpreted by a second interpretation module 218 of the second version 111b, i.e. evaluated with regard to their relevance for the vehicle 100.
  • second interpretation module 218 can select one or more relevant second objects 118′ from second objects 118, in FIG. 1 for example a vehicle driving ahead to the right of vehicle 100, and send correspondingly filtered second object data 208′ to the evaluation module 210.
  • the evaluation module 210 can determine whether or not there are objects that clearly match one another among the relevant first objects 116′ and the relevant second objects 118′. If one of the relevant second objects 118′ does not match any of the relevant first objects 116′, the evaluation module 210 can generate an individual rating for this second object, which indicates whether the recognition of this second object corresponds to an improvement or deterioration in the recognition quality of the second version 111b compared to the first version 111a.
  • the evaluation result 212 can then be generated based on the individual evaluation.
  • the individual evaluation can be generated based on the sensor data 108 .
  • Sensor data 108 can include driving dynamics data generated by one or more driving dynamics sensors of vehicle 100 in addition to environmental data generated by one or more environmental sensors of vehicle 100 .
  • changes in the driving dynamics state of the vehicle 100 that correlate in time with the detection of the relevant second objects 118′, for example triggered by a corresponding reaction of its driver, can be determined. Based on this change, it can finally be determined whether the recognition of the relevant object is equivalent to an improvement or deterioration in the recognition quality of the second version 111b.
  • the second interpretation module 218 may be configured to first determine those of the second objects 118 that do not uniquely match any of the relevant first objects 116' and then select the relevant second objects 118' therefrom.
  • the evaluation module 210 can be configured to check the second objects 118 for plausibility. For this purpose, the evaluation module 210 can evaluate the second objects 118 based on the second object data 208 of a number of consecutive time steps. In this case, the assessment result 212 can also be determined taking into account the plausibility of the second objects 118 .
  • An example of an implausible or inconsistent second object 118 is indicated in FIG. 1 with a dashed frame.
  • the assessment of the second version 111b in terms of a validation of a recognition task that is to be solved by means of the second recognition module 204 and/or the second interpretation module 218 can include the following steps, for example.
  • first, it can be checked whether first objects 116′ which were recognized by the first version 111a were also recognized in the same or an improved manner by the second version 111b running in shadow mode.
  • the relevance of the first objects 116 is not determined by the first recognition module 202 itself, but by the first interpretation module 216, i.e. by one or more subsequent interpreting software elements in a kind of situation analysis.
  • a linking of mutually corresponding objects is then carried out in the evaluation module 210.
  • the recognition quality of the two versions 111a, 111b is then compared with one another on the basis of defined metrics, which can include, for example, the respective recognition times or a confidence with regard to the respective positions and/or orientations.
  • if a deterioration in the recognition quality is determined, a corresponding data sequence can be sent directly to the central data processing device 214.
  • the data sequence can be generated from the corresponding sensor data 108 and/or the corresponding object data 116 or 118 .
  • an improvement in the recognition quality can be confirmed by the non-arrival of such data sequences at the central data processing device 214 .
  • the improvement or deterioration in the recognition quality can be recorded by the control unit 102 regularly sending bundled statistics to the central data processing device 214 .
  • if the second interpretation module 218, in a type of situation analysis, determines a relevant second object 118′ that cannot be linked to any of the relevant first objects 116′, then it is first determined using defined logic, for example based on the driver's reaction to this object, whether the recognition of this object represents an improvement or deterioration in the recognition quality. Depending on the logic, the process can be recorded in a statistic. Additionally or alternatively, the direct transmission of a corresponding data sequence for external evaluation in the central data processing device 214 can be triggered.
  • using time curves from the sensor data 108 and/or the second object data 208 or 208′, the evaluation module 210 detects inconsistencies, such as second objects 118 suddenly appearing or suddenly disappearing in the immediate vicinity of vehicle 100.
  • Information about these objects can be sent from control unit 102 to data processing device 214 either directly in the form of a corresponding data sequence or bundled in the form of statistics.
EP22706756.8A 2021-03-24 2022-02-03 Verfahren zum bewerten einer software für ein steuergerät eines fahrzeugs Pending EP4315069A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102021202903.5A DE102021202903A1 (de) 2021-03-24 2021-03-24 Verfahren zum Bewerten einer Software für ein Steuergerät eines Fahrzeugs
PCT/EP2022/052522 WO2022199916A1 (de) 2021-03-24 2022-02-03 Verfahren zum bewerten einer software für ein steuergerät eines fahrzeugs

Publications (1)

Publication Number Publication Date
EP4315069A1 (de)

Family

ID=80595546

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22706756.8A Pending EP4315069A1 (de) 2021-03-24 2022-02-03 Verfahren zum bewerten einer software für ein steuergerät eines fahrzeugs

Country Status (6)

Country Link
EP (1) EP4315069A1 (de)
JP (1) JP2024512563A (ja)
KR (1) KR20230157514A (ko)
CN (1) CN117099085A (zh)
DE (1) DE102021202903A1 (de)
WO (1) WO2022199916A1 (de)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10884902B2 (en) * 2017-05-23 2021-01-05 Uatc, Llc Software version verification for autonomous vehicles

Also Published As

Publication number Publication date
JP2024512563A (ja) 2024-03-19
WO2022199916A1 (de) 2022-09-29
CN117099085A (zh) 2023-11-21
DE102021202903A1 (de) 2022-09-29
KR20230157514A (ko) 2023-11-16

Similar Documents

Publication Publication Date Title
DE102017215552A1 (de) Plausibilisierung der Objekterkennung für Fahrassistenzsysteme
DE102016212326A1 (de) Verfahren zur Verarbeitung von Sensordaten für eine Position und/oder Orientierung eines Fahrzeugs
DE102016212195A1 (de) Verfahren zum Durchführen eines automatischen Eingriffs in die Fahrzeugführung eines Fahrzeugs
EP3530537B1 (de) Kraftfahrzeug-steuervorrichtung und verfahren zum betreiben der steuervorrichtung zum autonomen führen eines kraftfahrzeugs
DE102021128041A1 (de) Verbesserung eines neuronalen fahrzeugnetzwerks
AT523834B1 (de) Verfahren und System zum Testen eines Fahrerassistenzsystems
DE102020211970A1 (de) Verfahren zum Steuern eines Fahrzeugs
DE102020209680B3 (de) Signalverarbeitungspfad, Vorrichtung zur Umfelderkennung und Verfahren zur Validierung eines automatisiert betreibbaren Fahrsystems
EP3857437A1 (de) Verfahren und vorrichtung zur analyse eines sensordatenstroms sowie verfahren zum führen eines fahrzeugs
DE102017206344A1 (de) Fahrerassistenzsystem für ein Fahrzeug
WO2021130066A1 (de) Training von neuronalen netzen durch ein neuronales netz
DE102017201796A1 (de) Steuervorrichtung zum Ermitteln einer Eigenbewegung eines Kraftfahrzeugs sowie Kraftfahrzeug und Verfahren zum Bereitstellen der Steuervorrichtung
DE102017223621A1 (de) Verfahren und Steuereinheit zur Steuerung einer Funktion eines zumindest teilweise automatisiert fahrenden Fahrzeugs
EP4315069A1 (de) Verfahren zum bewerten einer software für ein steuergerät eines fahrzeugs
DE102018211726A1 (de) Verfahren zum automatischen maschinellen Trainieren eines elektronischen Fahrzeugführungssystems, sowie Kraftfahrzeug
DE102020001309A1 (de) Verfahren zum Betreiben einer elektronischen Recheneinrichtung für ein Kraftfahrzeug, sowie elektronische Recheneinrichtung
DE102020206610A1 (de) Sicherheitsarchitektur zur steuerung eines autonomen fahrzeugs
DE102022200139A1 (de) Verfahren zur Optimierung der Umfeldwahrnehmung für ein Fahrunterstützungssystem mittels zusätzlicher Referenzsensorik
DE102021209623A1 (de) Verfahren zum infrastrukturgestützten Assistieren eines Kraftfahrzeugs
WO2022223085A1 (de) Verfahren zum testen eines assistenzsystems
DE112021007341T5 (de) Steuervorrichtung und Steuerverfahren
DE202021004237U1 (de) Computerlesbares Speichermedium und Rechenvorrichtung zum Evaluieren eines Softwarestands eines Fahrerassistenzsystems
DE102020213496A1 (de) Validierung von Modellen für Fahrbahn-Spuren basierend auf Schwarmdaten
DE102021112169A1 (de) Erkennen eines manipulierten Eingangssignals für einen Klassifikator eines Kraftfahrzeugs
WO2023222357A1 (de) Verfahren und vorrichtung zum erkennen einer fehlfunktion eines umfeldmodells einer automatisierten fahrfunktion

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20231024

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR