WO2014183948A2 - Capteur, simulateur et procédé de simulation de mesures de capteur, de fusion de mesures de capteur, de validation d'un modèle pour capteur et de conception d'un système d'aide à la conduite - Google Patents

Capteur, simulateur et procédé de simulation de mesures de capteur, de fusion de mesures de capteur, de validation d'un modèle pour capteur et de conception d'un système d'aide à la conduite Download PDF

Info

Publication number
WO2014183948A2
WO2014183948A2 (PCT/EP2014/057611)
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
model
environment
real
virtual
Prior art date
Application number
PCT/EP2014/057611
Other languages
German (de)
English (en)
Inventor
Michael Fiegert
Wendelin Feiten
Original Assignee
Siemens Aktiengesellschaft
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens Aktiengesellschaft
Publication of WO2014183948A2

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0205Diagnosing or detecting failures; Failure detection models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/15Vehicle, aircraft or watercraft design
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0205Diagnosing or detecting failures; Failure detection models
    • B60W2050/0215Sensor drifts or sensor failures

Definitions

  • the terms “model” and “representation” must be defined differently.
  • data structures are referred to as models which largely accurately model aspects of a real environment and can therefore be used as reliable input and/or specification for algorithms for sensor fusion as well as for their training.
  • models are usually created by experts with great care and a great deal of time.
  • an environment model denotes a largely exact model of a real environment, which an expert has created, for example, by exact manual measurement of a test environment and subsequent construction of a finely resolved polygon model.
  • an environment representation denotes an internal representation of an environment, which is generated, for example, by a sensor fusion algorithm as a grid map and which may fall short of the environment model in both its level of detail and its reliability.
  • the terms environment representation and internal environment representation are used synonymously with this.
  • the terms measured values, sensor data and sensor signals are also used synonymously.
  • a vehicle model refers to a largely exact model of a vehicle, for example in the form of a high-resolution 3D model or blueprint.
  • a vehicle representation denotes estimated states of the vehicle, for example, whether or not there is a skid.
  • a driver model designates a largely exact model of a driver, for example in the form of an animated humanoid 3D model, which shows realistic head, eye and eyelid movements.
  • a driver's representation denotes estimated states of the driver, for example, whether he is tired or not.
  • the environment will be an environment of a vehicle.
  • the environment can also be the vehicle itself or its driver.
  • the term environment model should therefore also include the special cases vehicle model and driver model below.
  • the term environment representation if not specified more precisely, should also include the special cases of vehicle representation and driver representation in the following.
  • the algorithms described below can also use an environment model, a vehicle model and / or a driver model and an environment representation, a vehicle representation and / or a driver representation side by side, which then exist in separate modules and work together in a suitable manner.
  • since the applications come from independent manufacturers and are developed independently of each other, the sensor signals used by one application cannot be used by another application.
  • the main reason for this is the vertical integration of the applications, i.e. the applications, including the necessary hardware and software, are developed independently of each other and installed in the vehicle.
  • the respective application has specially provided sensors, its own environment representation and its own output, namely the reaction of the respective driver assistance application.
  • this block comprises three levels, namely the detection of the environment by the sensor, the interpretation of the sensor values based on an environmental representation and the reaction by the driver assistance application.
  • the conversion rule according to which the sensor data are converted into the environment representation is conventionally integrated into a fusion algorithm. If a new type of sensor is to be integrated into the system, the entire fusion algorithm must be opened up again and re-formulated taking into account the new sensor type. Again, the developer must know the peculiarities of this new sensor type exactly.
  • Document DE 10 2004 052 242 A1 shows a sensor fusion system and vehicle control system with a sensor fusion system.
  • Each of a plurality of probability distribution output units calculates a probability distribution of a data value detected by a corresponding sensor or algorithm in image recognition processing.
  • the respective probability distributions of the plurality of probability distribution output units are output to a synthesis determination processing unit.
  • Data formats of the outputs of the synthesis determination processing unit can thereby be standardized. Therefore, the synthesis determination processing unit is relieved of considering on which type of sensor or algorithm each of the outputs is based. Even if a sensor or algorithm is added or changed, the same data-fusing algorithm can be used in the synthesis determination processing unit.
  • probability distribution output units and a synthesis determination processing unit are necessary.
  • the document WO 2007/017308 A1 shows a method for creating environmental hypotheses for driver assistance functions.
  • an essential step in the design of driver assistance systems is the derivation of internal representations of the environment, of the vehicle and of the driver from the measured values of one or more sensors. So far, monolithic models are formed, into which both in-depth knowledge of the physics of the sensor and the characteristics of the environment of use flow.
  • the object is to specify a sensor product and a simulator and methods for simulating sensor measurements, for merging sensor measurements, for validating a sensor model and for designing a driver assistance system, which increase the reusability of the corresponding products and methods and/or simplify the use of the corresponding products and methods.
  • a sensor product comprising a sensor configured to carry out real measurements which generate real sensor data which at least partially represent a real environment,
  • the real environment is in particular an environment of a vehicle, a vehicle with states and / or a driver of a vehicle,
  • the sensor is in particular a 2D or 3D camera, an ultrasonic sensor, a 2D or 3D laser scanner, a 2D or 3D radar, a lidar, a wheel rotation sensor, an inertial sensor, an acceleration sensor, a rotation rate sensor, a temperature sensor, an air humidity sensor, a position sensor for determining at least one parameter of the driving dynamics of a vehicle, a seat occupancy sensor or a distance sensor,
  • a sensor model which, in particular as program code or XML document, is stored on a data carrier, in particular on a memory of the sensor, on a mobile data carrier or on a mass memory of a server or in a cloud,
  • the simulator includes a computing unit which is programmed to simulate virtual measurements of at least one sensor of the sensor product in at least one environment model based on the sensor model of the sensor product.
  • the object is achieved according to the invention by a method for simulating sensor measurements
  • a computing unit reads in a sensor model from a data carrier
  • the sensor model describes and / or virtually models a hardware and / or physical properties of a sensor, wherein the sensor is set up to image a real environment by means of real measurements which generate real sensor data,
  • the real environment is in particular an environment of a vehicle, a vehicle with states or a driver of a vehicle,
  • the arithmetic unit simulates virtual measurements of the sensor in an environment model on the basis of the sensor model
  • the object is achieved according to the invention by a method for the fusion of sensor measurements
  • real measurements can be carried out in a real environment with at least one sensor product, whereby real sensor data are generated, or
  • a computing unit uses probability density functions contained in the sensor model of the sensor product to store the real sensor data or virtual sensor data as random variables and/or probability density functions in an environment representation of a predetermined type
  • the real sensor data or virtual sensor data are generated at different times and time-fused in the environment representation
  • the real sensor data or virtual sensor data are generated by sensors at different positions and locally fused in the environment representation, and/or
  • the real sensor data or virtual sensor data are generated for sensors of different types and fused in the environment representation, and
  • the predetermined type of environment representation is, in particular, a 2D or 3D grid map, an object list or a list of states.
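The fusion steps above (temporal, local, and cross-sensor-type fusion of data stored as probability density functions in an environment representation of a predetermined type) can be sketched for the simplest such type, a 2D grid map. The log-odds update below is a common textbook approach offered only as an illustration; the patent does not prescribe this particular algorithm, and the grid dimensions and probabilities are invented values.

```python
import math

def logit(p):
    # Log-odds form of a probability, convenient for additive fusion.
    return math.log(p / (1.0 - p))

class OccupancyGrid2D:
    """Minimal 2D grid map storing per-cell occupancy in log-odds form.

    Illustrative sketch only: a log-odds occupancy grid is one common
    choice for an environment representation, not the patent's
    prescribed one.
    """
    def __init__(self, width, height, cell_size=0.1, p_prior=0.5):
        self.cell_size = cell_size
        self.l_prior = logit(p_prior)
        self.cells = [[self.l_prior] * width for _ in range(height)]

    def fuse(self, ix, iy, p_occupied):
        # Bayesian update: add the measurement's log-odds, subtract the
        # prior. Repeated calls fuse measurements from different times,
        # sensor positions and sensor types into the same cell.
        self.cells[iy][ix] += logit(p_occupied) - self.l_prior

    def probability(self, ix, iy):
        # Convert the accumulated log-odds back to a probability.
        l = self.cells[iy][ix]
        return 1.0 - 1.0 / (1.0 + math.exp(l))

grid = OccupancyGrid2D(width=100, height=100, cell_size=0.1)
grid.fuse(10, 20, 0.7)   # first reading: cell probably occupied
grid.fuse(10, 20, 0.7)   # second reading reinforces the estimate
print(round(grid.probability(10, 20), 3))  # → 0.845
```

Repeated calls to `fuse` for the same cell thus combine measurements from different times, positions or sensor types into one occupancy estimate.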
  • the object is achieved according to the invention by a method for validating a sensor model
  • the sensor measurement simulation method simulates virtual sensor measurements in an environment model that mimics the real environment, thereby generating virtual sensor data
  • the object is achieved according to the invention by a method for designing a driver assistance system,
  • a suitable type for the environment representation 6 for the driver assistance system is selected, and in which the sensor product is integrated into the mechanical and electrical design, for which information in the sensor model can be used.
  • a computer program is stored, which executes one of the methods, when it is processed in a microprocessor.
  • the computer program is executed in a microprocessor and executes one of the methods.
  • the sensor product makes it possible to model the different areas of influence for the sensor model separately from one another.
  • An overall model results from simulation of the sensor measurements based on the separately set up models - in particular the sensor model and separately from an environmental model - and a suitable simulation system.
  • the environment model contains those physical properties of the environment that are needed to simulate or derive the “ground truth” of an environment representation.
  • by ground truth is meant the actual truth which underlies a measurement.
  • an object is first measured with a very high-quality sensor in order to obtain an exact measurement (“ground truth”) in advance. Thereafter, a large number of measurements are made with the final sensor, resulting in, for example, a Gaussian curve of measured values in relation to the previously determined exact measurement (“ground truth").
  • ground truth results directly from the environment model.
  • the environment model takes into account the physical properties of the environment. For a video camera, for example, these are the geometry and textures of the objects. For an ultrasonic sensor, an approximation of the geometry together with the acoustic reflection properties may suffice.
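The calibration procedure described above, obtaining an exact measurement (“ground truth”) once with a high-quality reference sensor and then collecting a large number of measurements with the final sensor, can be sketched as follows. The reference distance, noise level and sample count are illustrative assumptions, and `measure` merely stands in for the real production sensor.

```python
import random
import statistics

# Hypothetical calibration run: `true_distance` plays the role of the
# ground truth obtained with a high-quality reference sensor.
true_distance = 2.00  # metres, measured once with the reference sensor

def measure(rng):
    # Stand-in for the final sensor: ground truth plus random scatter.
    return true_distance + rng.gauss(0.0, 0.03)

rng = random.Random(42)
samples = [measure(rng) for _ in range(10_000)]

# Fitting a normal distribution to the scatter around the ground truth
# yields the Gaussian curve of measured values described above.
mu = statistics.fmean(samples)
sigma = statistics.stdev(samples)
print(f"mean error: {mu - true_distance:+.4f} m, std dev: {sigma:.4f} m")
```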
  • the sensor model can be developed independently of its specific use. This makes it possible, in particular, to use a formalized description for the respective sensor signal.
  • the division between sensor model and environment model is not common in this form; however, it means that the special knowledge of the developer of the sensor hardware and the knowledge of the application domain need not be united in one person or in a team, but can be recorded independently and used or reused. In addition to increasing reusability, this also simplifies the use of the respective products and processes.
  • the sensor models enable a modular and hybrid fusion of sensor signals from different sensors. In particular, a developer does not have to know the specific properties of the respective sensor, since he receives the sensor model either directly from the sensor or in another way, for example via a USB stick or as a downloadable file via the Internet.
  • the hardware required for the driver assistance system can be reduced since individual sensors are used multiple times. Also, the complexity of the overall system driver assistance system is easier to control.
  • the sensor models allow separate groups of developers - for example for the sensors, the sensor simulation, the sensor fusion and the design - to work together using defined interfaces such as the sensor models.
  • Another advantage is the ability to progressively develop driver assistance systems through the increased modularity created. Instead of equipping each application with its own sensors as usual, the existing sensors can be reused. The additional cost of a new driver assistance application is then oriented only to the optionally additionally installed sensors and the costs for new software.
  • the suitable type for the environment representation for the driver assistance system is selected, for example, from the following: 2D grid map, 3D grid or cube map, polygon model, object lists (list of center points of objects or rectangle envelopes with size specification) or simple lists of states.
  • the respective sensor can output the sensor model directly in order to fulfill a corresponding certification.
  • Alternatives to this are in those embodiments in which the sensor model is provided in a different way. Examples of such other ways include the use of a USB stick or a file downloadable from the Internet.
  • the USB stick can be coupled, for example, with an engineering system.
  • the sensor could provide a secure hash sum even in this case, through which the authenticity of the sensor model can be verified.
  • the arithmetic unit can be implemented in hardware and / or software technology.
  • the arithmetic unit can be designed as a device or as part of a device, for example as a computer or as a microprocessor or as a control computer of a vehicle.
  • the respective unit may be part of a computer program product, a function, a routine, part of a program code, or be designed as an executable object.
  • the computer-readable data carrier is, for example, a memory card, a USB stick, a CD-ROM, a DVD or even a data carrier of a server, from which a file with the computer program in a network is provided or delivered. This can be done, for example, in a wireless communication network by transmitting the appropriate file.
  • the sensor model has a predetermined format and / or an interface of a predetermined format.
  • the predetermined format of the sensor model is, for example, an XML schema.
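As a sketch of what such a predetermined format might look like, the snippet below parses a small sensor-model document with Python's standard `xml.etree` module. All element and attribute names (`sensorModel`, `range`, `noise`, `beam`, and so on) are invented for illustration, since the patent names an XML schema only as an example and does not publish a concrete one.

```python
import xml.etree.ElementTree as ET

# Hypothetical sensor-model document; the schema is invented here
# purely to illustrate the idea of a predetermined format.
SENSOR_MODEL_XML = """\
<sensorModel type="ultrasonic" version="1.0">
  <range min="0.2" max="5.0" unit="m"/>
  <noise distribution="gaussian" sigma="0.03"/>
  <beam openingAngleDeg="30"/>
</sensorModel>
"""

def load_sensor_model(xml_text):
    """Parse the XML into a plain dict a simulator could consume."""
    root = ET.fromstring(xml_text)
    return {
        "type": root.get("type"),
        "range_m": (float(root.find("range").get("min")),
                    float(root.find("range").get("max"))),
        "noise_sigma": float(root.find("noise").get("sigma")),
        "opening_angle_deg": float(root.find("beam").get("openingAngleDeg")),
    }

model = load_sensor_model(SENSOR_MODEL_XML)
print(model["type"], model["range_m"])
```

A standardized schema of this kind is what would let one simulator consume sensor models from different, independently developed sensors.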
  • the description and / or virtual modeling of the hardware and / or the physical properties of the sensor is at least partially a probabilistic description with probability density functions.
  • Probabilistic descriptions are particularly suitable for fusing sensor data from different sensors with different modalities, for example radar sensor and laser sensor, to an environmental representation or a fused sensor signal.
  • the sensor model models random deviations in the real sensor data whose causes lie in a hardware of the sensor as a probability density function.
  • the sensor model models random deviations in the real sensor data whose causes lie in fluctuations in a transmission medium or in different imaged materials as a probability density function.
  • the sensor is an ultrasonic distance sensor
  • a sensor product results, which additionally comprises at least one environment model which is stored on a data carrier, in particular on a memory of the sensor, on a mobile data carrier or on a mass memory of a server or in a cloud,
  • the environment model includes a description of a virtual environment that allows a simulator to simulate virtual measurements of the sensor in the virtual environment based on the sensor model
  • the virtual environment is in particular a virtual world, a driver in a vehicle cockpit or a vehicle with internal states,
  • environment model is modularly separate and independent of the sensor model
  • the environment model has a standardized format that allows the use of the environment model with different sensor models and / or simulators.
  • a sensor product results, in which the sensor is a camera and the environment model allows a simulator to simulate a camera image in which the environment model contains, in particular:
  • Turbidity of the transmission medium due to fog, rain or snow, and / or
  • the sensor is an ultrasonic sensor and the environment model allows a simulator to simulate an ultrasound image
  • the environment model contains a 3D model that approximates and simplifies a real environment, and the 3D model contains polygons and information about reflection properties and/or surface normals of the polygons.
  • a sensor product results, which additionally comprises a modular simulator software which is stored on a data medium, in particular on a memory of the sensor, on a mobile data carrier or on a mass memory of a server or in a cloud, and which is adapted to simulate virtual measurements of the sensor in the environment model after reading the sensor model.
  • a sensor simulation module which is set up to read in the sensor model from the data carrier, whereby program code for the sensor simulation module is provided, or
  • the sensor is simulated by the sensor simulation module
  • the sensor simulation module comprises its own hardware and / or is executed virtually in a computing unit as program code.
  • the sensor simulator module reads in a sensor model encoded in XML and then behaves like the real sensor.
  • a method results for the simulation of sensor measurements
  • a vehicle model indicates a geometric position of the sensor relative to a vehicle and in particular a supply voltage with which the sensor is powered by the vehicle
  • a method results in which the vehicle model is taken into account in the simulation.
  • the vehicle model describes where the sensors are relative to the vehicle. In the first place geometric relationships are relevant.
  • electrical conditions can also be used if, for example, according to the sensor model, fluctuations in the supply voltage can lead to increased noise in the measured values.
  • the supply voltage, possibly as a random variable, can be used as part of the vehicle model.
  • beam intersections are calculated with polygons in the environment model; in particular, normal vectors of the polygons are considered at the intersections in the determination of an ultrasonic echo.
  • FIGS. 2A-2D show typical components of a modeling of ultrasonic measurements,
  • FIG. 2A shows a probability density function p_hit of an undisturbed measurement on an object,
  • FIG. 2B shows actually determined distance measurements due to non-static objects in the environment,
  • FIG. 2C shows a case in which no echo is measured at all,
  • FIG. 3 shows a distribution resulting from FIGS. 2A-2D for the modeling of ultrasound measurements,
  • FIG. 4 shows a flow diagram of a simulation of virtual sensor measurements, and
  • FIG. 5 shows a flowchart for the validation of a sensor model.
  • the appropriate interpretation of the measured values of the sensors in different operating environments is a crucial prerequisite for the function of the driver assistance system. It is therefore important to understand as precisely as possible which phenomena lead to which measured values, so that conclusions can then be drawn from the measured values about the phenomena, i.e. so that an internal representation of the environment, the vehicle or the driver can be derived from the measured values in the vehicle or in the driver assistance system.
  • the properties of the real sensor data 4, beyond the actual measured values, i.e. on a meta-level, relate in particular to the scattering of the data, the occurrence of random values without a physical explanation that can be traced in detail, limitations of the measured values, etc.
  • a sensor data fusion 5 is basically always required, because from the second measurement onward a temporal fusion must already be made. Several sensors at different locations require a local fusion of the real sensor data 4. Furthermore, different sensor types, such as ultrasound and camera, under special
  • a first embodiment relates to a sensor model for ultrasonic sensors in a driver assistance function for automatic parking.
  • This example is based on the representations in Thrun, Burgard and Fox: “Probabilistic Robotics", MIT Press 2005, chap. 6.
  • the automatic parking application needs a map of the surroundings as an internal environment representation in which a parking space can be identified and then a path can be planned into this parking space.
  • This map will usually be an occupancy grid map.
  • the plane in which the vehicle is located is usually divided into square cells of the same size, for which it is recorded whether this cell is occupied or not.
  • the edge length of a cell is usually between 0.05 m and 1.0 m, but can also be smaller or larger.
  • Such a grid map is an example of an environmental representation m of the environment.
  • the environment representation may be static over a period of time, i.e. it should model the parts of the real environment e that do not change over a period of time.
  • a dynamic environment representation is available, in which objects are entered with their direction of movement and speed. The measurement depends on the real environment e.
  • this is often taken synonymously with the information in the environmental representation m, ie the map of the environment.
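The cell structure of such a grid map can be sketched as follows. The mapping from a point in the vehicle plane to a cell index depends only on the chosen edge length; the map origin used here is an assumption of the example, and the set of occupied indices is merely the simplest conceivable form of the representation.

```python
import math

def world_to_cell(x, y, cell_size=0.1, origin=(0.0, 0.0)):
    """Map a point in the vehicle plane to the index of the square
    grid cell containing it.

    Illustrative sketch of the grid map described above; cell_size is
    the edge length in metres (typically between 0.05 m and 1.0 m).
    """
    return (math.floor((x - origin[0]) / cell_size),
            math.floor((y - origin[1]) / cell_size))

# A static environment representation then records, per cell, whether
# it is occupied; a set of occupied cell indices is the simplest form.
occupied = set()
occupied.add(world_to_cell(1.23, -0.47, cell_size=0.5))

print(world_to_cell(1.23, -0.47, cell_size=0.5))  # → (2, -1)
```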
  • FIGS. 2A to 2D show four typical ingredients (based on Thrun, Burgard and Fox: “Probabilistic Robotics", MIT Press 2005, Chapter 6).
  • the vertical axis denotes p(z | x, m); the horizontal axis indicates the measured distance, with the actual distance lying halfway along the axis.
  • FIG. 2A shows the probability density function p_hit of the measured value with undisturbed measurement on an object.
  • the measured value is not always equal to the actual distance; it scatters depending on non-modeled and thus uncontrollable influencing variables (wind, humidity, temperature, reflection properties of the object, ...).
  • This scattering is modeled by a normal distribution. It mainly reflects the physical properties of the sensor.
  • FIG. 2B models distance measurements that are actually detected due to non-static objects in the environment (leaves passing by, pedestrians, etc.). Because multiple objects can occur simultaneously and independently of each other, and because of several objects the nearest is always seen and the others are not, this probability accumulates at shorter distance values. This probability p_short depends on the environment.
  • FIG. 2C captures the fact that many ultrasound sensor systems return a fixed maximum value in the event that no echo is measured at all. This contribution p_max depends both on the hardware of the sensor and on the environment, so in reality there are two components: e.g. which material is not perceived by the sensor, and how often this material occurs in the environment.
  • in FIG. 2D it is modeled that some sensors occasionally produce purely random measured values.
  • the cause could be anything: EMC problems in the vehicle, electromagnetic or acoustic interference from the environment or similar.
  • the expert does not necessarily investigate the causes of such data, which is why, in the absence of sufficiently stringent examination conditions, p_rand remains in the model as background noise.
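The four components of FIGS. 2A-2D combine into one mixture density over the measured distance, following the beam model in Thrun, Burgard and Fox, "Probabilistic Robotics", ch. 6, which this example is based on. The mixture weights and parameters below are illustrative values, not taken from the patent.

```python
import math

def beam_model_density(z, z_true, z_max,
                       w_hit=0.7, w_short=0.1, w_max=0.1, w_rand=0.1,
                       sigma_hit=0.05, lambda_short=0.5):
    """Mixture density for a single ultrasonic range reading z.

    Sketch of the four components of FIGS. 2A-2D; weights and
    parameters are illustrative assumptions.
    """
    # p_hit (FIG. 2A): Gaussian scatter around the true distance.
    p_hit = (math.exp(-0.5 * ((z - z_true) / sigma_hit) ** 2)
             / (sigma_hit * math.sqrt(2.0 * math.pi)))
    # p_short (FIG. 2B): dynamic objects in front of the target,
    # accumulating at shorter distances (truncated exponential).
    p_short = (lambda_short * math.exp(-lambda_short * z)
               if 0.0 <= z <= z_true else 0.0)
    # p_max (FIG. 2C): no echo at all -> fixed maximum reading,
    # treated here as a discrete spike at z_max.
    p_max = 1.0 if abs(z - z_max) < 1e-9 else 0.0
    # p_rand (FIG. 2D): purely random readings, uniform over the range.
    p_rand = 1.0 / z_max if 0.0 <= z <= z_max else 0.0
    return w_hit * p_hit + w_short * p_short + w_max * p_max + w_rand * p_rand

# Density is highest near the true distance, with background mass elsewhere.
print(beam_model_density(2.0, z_true=2.0, z_max=5.0) >
      beam_model_density(1.0, z_true=2.0, z_max=5.0))  # → True
```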
  • the aggregated forward sensor model, i.e. the estimation of the measurements given the environment and the state of the vehicle in the environment, can be made neither by the sensor hardware expert alone nor by the domain expert alone. Rather, this requires expertise from both areas.
  • the corresponding competencies are brought together in the development team for the respective module, and then the models are adapted by engineering until they correspond to the sensor behavior in the targeted environment.
  • a separate and explicit modeling of the sensor properties (sensor model), the environmental properties (environment model) and the type and content of the internal environment representation is now carried out. In the design process, the required expertise on the sensor hardware is decoupled from expertise to application areas.
  • FIG. 4 shows a sensor model 10, a vehicle model 15 and an environment model 20, which enable the simulation of a virtual measurement 30, resulting in virtual sensor data 40.
  • the above-mentioned decoupling occurs in that the sensor hardware is modeled physically so extensively in the sensor model 10 that the virtual measurements 30 for the sensor can be simulated by means of a suitable simulation and a correspondingly detailed environment model 20.
  • the virtual sensor data 40 are subject to the same random fluctuations as the real sensor data determined in experiments (see FIG. 3).
  • the sensor model 10 contains the physical properties of the sensor. Which these are naturally depends on the respective sensor type. In the following, the requirements for the sensor model 10 are illustrated using the example of an ultrasonic distance sensor in which only the first incoming echo is evaluated and returned as the corresponding distance value.
  • the first type of causes of random deviations in the measured values lies in the sensor hardware. Different physical dependencies of the measured values on either unmodeled or undetectable phenomena within the sensor can underlie this. These can be, e.g., fluctuations in the supply voltage, random variations in the behavior of analog circuit components such as amplifiers or filters, discretization noise due to random shifts of sampling instants, or similar. Such phenomena can produce pseudo-measurements that do not correspond to any external physical facts.
  • Another type of causes of random deviations may be in the transmission medium. In ultrasonic measurements, the speed of sound and thus the distance measurement depend on the air temperature, air pressure and humidity (and on the wavelength of the signal). Likewise, the wind conditions have a significant impact. These influences are often unknown. Often it is also not possible to estimate these quantities from other measurements since the phenomenon itself is random, e.g. in strongly changing wind conditions. These causes can lead to seemingly random deviations, but also to complete failure of measurements.
  • these include the transmission and reception frequency ranges or frequency responses, as well as the directional characteristic, the filter characteristics, the transmission energy, the reception sensitivity.
  • the sensor model must go far beyond the usual specifications.
  • the sensor model 10 describes which influence the different states of the transmission medium can have, whereby this influence is most suitably modeled as a probability density distribution over the measured value given the distance to a normalized object. Since many different influences are superimposed here, the measured value, given an object at a certain distance, is often approximately normally distributed.
  • the dimensions, the weight, the electrical connection values should be listed in the sensor model 10.
  • the environment model 20 contains the physical properties of the environment needed to simulate or derive the ground truth of the internal environment representation. For each sensor model 10 to be used, the environment model 20 must include the corresponding physical properties of the environment.
  • these properties include the geometry and textures (optical reflection properties) of the objects.
  • environment models 20 can be found in the areas of computer animation for movies, computer games, architectural simulation, etc. In these environment models 20, properties of the transmission medium (fog, rain, snow, ...) as well as properties of lighting are in some cases also simulated.
  • for the simulation of an ultrasonic sensor, an approximation of the geometry together with the acoustic reflection properties is sufficient as the environment model 20.
  • the objects in the environment model 20 are modeled so that sections with rays can be calculated or that the reflection of waves on the surfaces can be calculated.
  • the normals on the surfaces are needed.
  • the formats in which, on the one hand, the sensor model 10 and, on the other hand, the environment model 20 are set up, must match the corresponding simulators.
  • a standardization is recommended, but should leave a certain amount of freedom. The most promising candidate for this is again a probabilistic formulation, i.e. the corresponding parameters are considered as random variables.
  • the environmental model 20 basically represents the phenomena that lead to measurements.
  • the environment model 20 also represents the phenomena which lead to measurements but which should not contribute to entries in the internal environment representation. For example, planning a path for automatic parking may require an internal environment representation of the static components of the environment of the vehicle.
  • Dynamic objects would be, for example, leaves blowing past or pedestrians passing by (see Figure 2B). These phenomena are best selected and described by the domain expert.
  • With respect to the static components of the environment, the resulting measured values constitute noise. If the internal environment representation is also intended to capture the dynamic objects, then the resulting measured values are no longer noise, but part of the signal.
  • The vehicle model 15 describes where the sensors are located relative to the vehicle. First and foremost, the geometric relationships are relevant here. However, electrical conditions may also become relevant if, for example, according to the sensor model 10, fluctuations in the supply voltage can lead to increased noise in the measured values. In this case, the supply voltage (possibly as a random variable) is part of the vehicle model 15.
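By way of illustration only, such a vehicle model could be sketched as follows (the class and field names are assumptions for this sketch, not part of the disclosure): a mounting pose per sensor allows measurements to be transformed from the sensor frame into the vehicle frame, and the supply voltage is carried along as a random variable (mean and standard deviation).

```python
import math
from dataclasses import dataclass

@dataclass
class SensorMount:
    """Pose of a sensor relative to the vehicle frame (2D for brevity)."""
    x: float       # mounting position in the vehicle frame [m]
    y: float
    yaw: float     # mounting orientation [rad]

    def to_vehicle_frame(self, r: float, bearing: float) -> tuple:
        """Convert a range/bearing measurement (sensor frame) to vehicle coordinates."""
        a = self.yaw + bearing
        return (self.x + r * math.cos(a), self.y + r * math.sin(a))

@dataclass
class VehicleModel:
    mounts: dict                 # sensor id -> SensorMount
    supply_voltage_mean: float   # [V]; modeled as a random variable, since
    supply_voltage_std: float    # fluctuations may increase measurement noise

# A sensor mounted 2 m ahead of the vehicle origin, looking forward:
vm = VehicleModel({"us_front": SensorMount(2.0, 0.0, 0.0)}, 12.0, 0.2)
print(vm.mounts["us_front"].to_vehicle_frame(1.5, 0.0))  # → (3.5, 0.0)
```

A measurement of 1.5 m straight ahead of this sensor thus lands 3.5 m ahead of the vehicle origin.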
  • From the sensor model 10 and the environment model 20, the sensor simulation generates the same virtual sensor data 40 that the real sensors would generate in a real operating environment.
  • The simulation can process random variables as input values; the virtual sensor data 40 are then themselves random variables, and their probability density distribution is likewise simulated.
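Purely as a hedged sketch (the placeholder simulation and all parameter values are assumptions, not taken from the text), such a propagation of random variables through the simulation can be approximated by Monte Carlo sampling: each uncertain input is sampled, the simulation is run per sample, and the distribution of the virtual sensor data 40 is summarized empirically.

```python
import random
import statistics

def simulate_measurement(true_distance, bias, noise_std):
    """Placeholder sensor simulation: a distance reading with bias and noise.
    (Stands in for an arbitrary, possibly far more complex simulation.)"""
    return true_distance + bias + random.gauss(0.0, noise_std)

def monte_carlo(true_distance, n=10000):
    """Propagate an uncertain sensor-model parameter (treated as a random
    variable) through the simulation; return the empirical mean and
    standard deviation of the resulting virtual sensor data."""
    samples = []
    for _ in range(n):
        bias = random.gauss(0.02, 0.005)  # uncertain calibration bias [m]
        samples.append(simulate_measurement(true_distance, bias, noise_std=0.01))
    return statistics.mean(samples), statistics.stdev(samples)

random.seed(0)
mean, std = monte_carlo(2.0)
# mean ≈ 2.02 m; std ≈ sqrt(0.005² + 0.01²) ≈ 0.011 m
print(round(mean, 3), round(std, 3))
```

The output spread combines the parameter uncertainty and the measurement noise, which is exactly the "probability density distribution of the virtual sensor data" mentioned above.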
  • The simulation can be carried out at very different levels of accuracy; as a rule, the more accurate the simulation, the higher the required memory and computing time.
  • A very simple simulation forms, for a discrete and often very small number of rays emanating from the sensor, the intersections of these rays with objects in the environment. For this purpose, it must be possible to compute the intersection points of rays with objects in the environment model 20. This simulation neglects the reflection of acoustic waves at surfaces and thus yields more measured values than the physical sensor. In another simulation method, in addition to the intersections of the rays with the surfaces, the normal vectors on the surfaces at the intersection points are also taken into account. This makes it possible to judge whether a sound wave is reflected towards a receiver (which can also be the transmitter), so that an echo is measured, or whether no echo is measured.
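The two ray-based variants just described can be sketched as follows (2D geometry with wall segments; all function names and the 20° incidence threshold are illustrative assumptions): the simple variant reports a distance for every ray-object intersection, while the normal-aware variant additionally checks whether the wave would be reflected back towards the co-located transmitter/receiver.

```python
import math

def ray_segment_intersection(ox, oy, dx, dy, ax, ay, bx, by):
    """Distance along a ray (origin o, unit direction d) to segment A-B, or None."""
    ex, ey = bx - ax, by - ay
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-12:
        return None                                     # ray parallel to segment
    t = ((ax - ox) * ey - (ay - oy) * ex) / denom       # distance along the ray
    s = ((ax - ox) * dy - (ay - oy) * dx) / denom       # position along the segment
    return t if t > 0 and 0.0 <= s <= 1.0 else None

def simulate_ray(ox, oy, angle, walls, check_normals=False, max_incidence_deg=20.0):
    """Return the measured distance for one ray, or None (no echo).
    With check_normals=True, an echo is only returned if the ray hits the
    surface close enough to perpendicular for the sound to travel back
    to the co-located transmitter/receiver."""
    dx, dy = math.cos(angle), math.sin(angle)
    best = None
    for (ax, ay, bx, by) in walls:
        t = ray_segment_intersection(ox, oy, dx, dy, ax, ay, bx, by)
        if t is None:
            continue
        if check_normals:
            nx, ny = -(by - ay), (bx - ax)              # segment normal
            n = math.hypot(nx, ny)
            cos_inc = abs(dx * nx + dy * ny) / n        # |cos| of incidence angle
            if cos_inc < math.cos(math.radians(max_incidence_deg)):
                continue                                # reflected away: no echo
        best = t if best is None else min(best, t)
    return best

# A wall 2 m ahead of the sensor, seen head-on: echo in both variants.
walls = [(2.0, -3.0, 2.0, 3.0)]
print(simulate_ray(0, 0, 0.0, walls))                       # → 2.0
print(simulate_ray(0, 0, 0.0, walls, check_normals=True))   # → 2.0
# The same wall at a grazing 45°: the simple variant still reports a
# measurement, the normal-aware variant reports no echo.
print(simulate_ray(0, 0, math.radians(45), walls, check_normals=True))  # → None
```

The 45° case illustrates why the simple variant "yields more measured values than the physical sensor": it counts hits that a real ultrasonic sensor would never receive back.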
  • Another simulation method is shaped much more strongly by a particular type of internal environment representation.
  • The volume of the internal environment representation is decomposed into grid cells and the course of the pressure distribution over time is simulated.
  • This method (depending on the resolution, i.e. the size of the grid cells and the time intervals) simulates the echoes of even very jagged obstacles, and also simulates echoes that reach a receiver via several different reflectors, so-called multi-path echoes.
  • The ultrasonic sensor emits a cone-shaped signal. An echo can be described by a circle segment within this cone whose radius corresponds to the measured distance. Accordingly, the likelihood of an obstacle is increased in all cells intersected by the circle segment and lowered in all cells traversed by the beam on the way there.
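This update rule can be sketched on a 2D grid as follows (the log-odds increments, cell size and arc tolerance are illustrative assumptions, not values specified here): cells intersected by the echo arc have their obstacle likelihood raised, cells inside the cone but closer than the measured distance have it lowered.

```python
import math

def update_grid(grid, cell, sensor, heading, cone_half_angle, measured_r,
                hit=1.0, miss=-0.4, tol=None):
    """Log-odds style update of an occupancy grid from one ultrasonic echo.
    grid: dict (ix, iy) -> float; cell: cell edge length [m];
    sensor: (x, y); heading, cone_half_angle: [rad]; measured_r: [m]."""
    if tol is None:
        tol = cell                                   # arc thickness: one cell
    sx, sy = sensor
    n = int(measured_r / cell) + 2
    for ix in range(-n, n + 1):
        for iy in range(-n, n + 1):
            cx, cy = sx + ix * cell, sy + iy * cell  # cell center
            r = math.hypot(cx - sx, cy - sy)
            if r == 0.0 or r > measured_r + tol:
                continue
            bearing = math.atan2(cy - sy, cx - sx)
            dphi = (bearing - heading + math.pi) % (2 * math.pi) - math.pi
            if abs(dphi) > cone_half_angle:
                continue                             # outside the emission cone
            key = (ix, iy)
            if abs(r - measured_r) <= tol:
                grid[key] = grid.get(key, 0.0) + hit   # cell cut by the echo arc
            elif r < measured_r - tol:
                grid[key] = grid.get(key, 0.0) + miss  # traversed: likely free
    return grid

# One echo at 1.0 m straight ahead, 30° half-angle cone, 0.1 m cells:
g = update_grid({}, 0.1, (0.0, 0.0), 0.0, math.radians(30), 1.0)
print(g[(10, 0)])   # → 1.0   (cell on the echo arc: likelihood raised)
print(g[(5, 0)])    # → -0.4  (cell on the way there: likelihood lowered)
```

Repeating this update over many measurements accumulates evidence per cell, which is the essence of the grid-based internal environment representation described above.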
  • By agreeing on formats, the sensor model 10, the environment model 20 and the vehicle model 15 can come from different suppliers and still fit together. Similar de facto standards already exist in the area of CAD (for example the STEP format). In the context of the embodiments, standards are needed in which probability density functions can also be described.
  • The sensor model 10 can come from the sensor manufacturer.
  • The environment model 20 can come from a company that specializes in 3D modeling.
  • The simulator can come from a company specializing in simulation. In the future, the vehicle model will be contributed by the system integrator.
  • The models and the simulator are normally used by the system integrator, i.e. a car manufacturer or designer.
  • FIG. 5 shows a validation method for comparing the virtual sensor data 40 with experimentally determined real sensor data 4, with each of the involved parties indicating, for its own models, which other models or simulation tools underlie the comparison.
  • The sensor manufacturer produces a number of representative test environments, i.e. it actually builds them. Such a test environment is shown in FIG. 5 as the real environment 2.
  • These test environments are modeled according to the specifications of the environment model manufacturer (and ideally by an employee of an environment model manufacturer), with respect to the physical principles of the sensor 1 and the formal requirements of the simulation tools.
  • These formats should be standardized.
  • The sensor 1 measures the real sensor data 4. Independently of this, virtual measurements 30 are carried out in a simulator on the basis of a sensor model 10 and an environment model 20, yielding virtual sensor data 40.
  • The real sensor data 4 are compared with the virtual sensor data 40.
  • The result of this comparison is provided by the sensor manufacturer together with its sensor model 10. This allows the system integrator to assess how well the sensor model 10 (with the aid of the environment model 20 and the simulation) approximates the real conditions. This quality of the approximation is relevant in the further design process of a driver assistance system.
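By way of illustration, such a comparison of the real sensor data 4 with the virtual sensor data 40 could be reduced to simple distribution statistics (the particular metrics are assumptions; no specific comparison measure is prescribed here):

```python
import statistics

def compare(real, virtual):
    """Compare real and virtual sensor readings of the same test scene.
    Returns the mean offset, spread ratio and RMSE of paired readings --
    simple indicators of how well the sensor model approximates reality."""
    mean_offset = statistics.mean(virtual) - statistics.mean(real)
    spread_ratio = statistics.stdev(virtual) / statistics.stdev(real)
    rmse = (sum((r - v) ** 2 for r, v in zip(real, virtual)) / len(real)) ** 0.5
    return {"mean_offset": mean_offset, "spread_ratio": spread_ratio, "rmse": rmse}

# Repeated distance readings [m] of one obstacle, measured vs. simulated:
real = [1.98, 2.01, 2.00, 2.03, 1.99]
virtual = [2.00, 2.02, 2.01, 2.02, 2.00]
report = compare(real, virtual)
print({k: round(v, 3) for k, v in report.items()})
```

A spread ratio well below 1, as here, would indicate that the simulation underestimates the real measurement noise; such figures are exactly what the sensor manufacturer could hand over alongside the sensor model 10.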
  • The environment model 20 need not be limited to geometry, but may also include domain knowledge regarding the likelihood of events and/or the relevance of objects, such as how often people appear on a roadway, or that children are more relevant than parked cars.
  • The environment model can be modularly separated into the two aforementioned components, geometry and domain knowledge, which can be provided by different manufacturers.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Computer Hardware Design (AREA)
  • Computational Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)
  • Stored Programmes (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
PCT/EP2014/057611 2013-05-16 2014-04-15 Capteur, simulateur et procédé de simulation de mesures de capteur, de fusion de mesures de capteur, de validation d'un modèle pour capteur et de conception d'un système d'aide à la conduite WO2014183948A2 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
DE102013209148 2013-05-16
DE102013209148.6 2013-05-16
DE201310212710 DE102013212710A1 (de) 2013-05-16 2013-06-28 Sensorprodukt, Simulator und Verfahren zur Simulation von Sensormessungen, zur Fusion von Sensormessungen, zur Validierung eines Sensormodells und zum Entwurf eines Fahrerassistenzsystems
DE102013212710.3 2013-06-28

Publications (1)

Publication Number Publication Date
WO2014183948A2 true WO2014183948A2 (fr) 2014-11-20

Family

ID=51831426

Family Applications (5)

Application Number Title Priority Date Filing Date
PCT/EP2014/057600 WO2014183945A1 (fr) 2013-05-16 2014-04-15 Système de conception et procédé de conception d'un système d'assistance à la conduite
PCT/EP2014/057585 WO2014183944A1 (fr) 2013-05-16 2014-04-15 Système de conception et procédé de conception d'un système d'assistance à la conduite
PCT/EP2014/057611 WO2014183948A2 (fr) 2013-05-16 2014-04-15 Capteur, simulateur et procédé de simulation de mesures de capteur, de fusion de mesures de capteur, de validation d'un modèle pour capteur et de conception d'un système d'aide à la conduite
PCT/EP2014/057619 WO2014183949A1 (fr) 2013-05-16 2014-04-15 Dispositif et procédé afférents à un système d'assistance à la conduite d'un véhicule
PCT/EP2014/057867 WO2014183953A1 (fr) 2013-05-16 2014-04-17 Ensemble et procédé de fusion de capteurs ainsi que procédé de production pour établir un modèle de fusion

Family Applications Before (2)

Application Number Title Priority Date Filing Date
PCT/EP2014/057600 WO2014183945A1 (fr) 2013-05-16 2014-04-15 Système de conception et procédé de conception d'un système d'assistance à la conduite
PCT/EP2014/057585 WO2014183944A1 (fr) 2013-05-16 2014-04-15 Système de conception et procédé de conception d'un système d'assistance à la conduite

Family Applications After (2)

Application Number Title Priority Date Filing Date
PCT/EP2014/057619 WO2014183949A1 (fr) 2013-05-16 2014-04-15 Dispositif et procédé afférents à un système d'assistance à la conduite d'un véhicule
PCT/EP2014/057867 WO2014183953A1 (fr) 2013-05-16 2014-04-17 Ensemble et procédé de fusion de capteurs ainsi que procédé de production pour établir un modèle de fusion

Country Status (2)

Country Link
DE (5) DE102013212710A1 (fr)
WO (5) WO2014183945A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017167528A1 (fr) * 2016-03-31 2017-10-05 Siemens Aktiengesellschaft Procédé et système de validation d'un système de détection d'obstacles
US10229363B2 (en) 2015-10-19 2019-03-12 Ford Global Technologies, Llc Probabilistic inference using weighted-integrals-and-sums-by-hashing for object tracking
DE102022209080A1 (de) 2022-09-01 2024-03-07 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Kalibrieren eines Sensors, Recheneinheit und Sensorsystem

Families Citing this family (23)

Publication number Priority date Publication date Assignee Title
DE102014209340A1 (de) * 2014-05-16 2015-11-19 Siemens Aktiengesellschaft Anordnung und Verfahren zur Sensorfusion
DE102015010270B4 (de) 2015-08-08 2021-10-28 Audi Ag Verfahren zum Betrieb von Fahrerassistenzsystemen in einem Kraftfahrzeug und Kraftfahrzeug
DE102016202317A1 (de) * 2016-02-16 2017-08-17 Continental Teves Ag & Co. Ohg Verfahren zum steuern von fahrzeugfunktionen durch ein fahrerassistenzsystem, fahrerassistenzsystem und fahrzeug
CN107310550B (zh) * 2016-04-27 2019-09-17 腾讯科技(深圳)有限公司 道路交通工具行驶控制方法和装置
EP3260999B1 (fr) * 2016-06-24 2021-08-04 Sick Ag Systeme de simulation de capteurs
US10599150B2 (en) 2016-09-29 2020-03-24 The Charles Stark Kraper Laboratory, Inc. Autonomous vehicle: object-level fusion
US10377375B2 (en) 2016-09-29 2019-08-13 The Charles Stark Draper Laboratory, Inc. Autonomous vehicle: modular architecture
EP3515780A1 (fr) * 2016-09-29 2019-07-31 The Charles Stark Draper Laboratory, Inc. Véhicule autonome à architecture modulaire
US10101745B1 (en) 2017-04-26 2018-10-16 The Charles Stark Draper Laboratory, Inc. Enhancing autonomous vehicle perception with off-vehicle collected data
DE102017208692B4 (de) 2017-05-23 2023-02-02 Audi Ag Verfahren zum Bereitstellen von Trainingsdaten für eine Funktionsprüfung einer Erkennungseinrichtung sowie Datenbanksystem
DE102017116017A1 (de) * 2017-07-17 2019-01-17 Valeo Schalter Und Sensoren Gmbh Kraftfahrzeug-Sensorvorrichtung mit mehreren Sensoreinheiten und mehreren neuronalen Netzen zum Erzeugen einer kombinierten Repräsentation einer Umgebung
DE102017116016A1 (de) * 2017-07-17 2019-01-17 Valeo Schalter Und Sensoren Gmbh Kraftfahrzeug-Sensorvorrichtung mit mehreren Sensoreinheiten und einem neuronalen Netz zum Erzeugen einer integrierten Repräsentation einer Umgebung
DE102018205804A1 (de) * 2018-04-17 2019-10-17 Conti Temic Microelectronic Gmbh Steuergerätetesteinrichtung zum Testen, Absichern und Entwickeln von Funktionen
DE102018206326B4 (de) * 2018-04-24 2020-01-09 Zf Friedrichshafen Ag Verfahren zum Erweitern einer Datenbasis eines Bayesschen Netzes
DE102018123779A1 (de) 2018-09-26 2020-03-26 HELLA GmbH & Co. KGaA Verfahren und Vorrichtung zum Verbessern einer Objekterkennung eines Radargeräts
DE102018123735A1 (de) * 2018-09-26 2020-03-26 HELLA GmbH & Co. KGaA Verfahren und Vorrichtung zum Verbessern einer Objekterkennung eines Radargeräts
FR3088041B1 (fr) * 2018-11-02 2020-10-16 Renault Sas Procede d’elaboration d’une consigne de pilotage d’un vehicule automobile
CN109634426B (zh) * 2018-12-20 2020-08-14 南京钟山虚拟现实技术研究院有限公司 基于Unity3D的高自由度实验类三维虚拟仿真方法和系统
US11249184B2 (en) 2019-05-07 2022-02-15 The Charles Stark Draper Laboratory, Inc. Autonomous collision avoidance through physical layer tracking
DE102019210448A1 (de) * 2019-07-16 2021-01-21 Audi Ag Verfahren zur Ermittlung einer Verbauposition eines umfeldüberwachenden Umfeldsensors eines Kraftfahrzeugs, Recheneinrichtung, Computerprogramm und elektronisch lesbarer Datenträger
DE102019124504A1 (de) * 2019-09-12 2021-04-01 Bayerische Motoren Werke Aktiengesellschaft Verfahren und Vorrichtung zur Simulation und Bewertung eines Sensorsystems für ein Fahrzeug sowie Verfahren und Vorrichtung zum Entwurf eines Sensorsystems zur Umfelddetektion für ein Fahrzeug
DE102020130748A1 (de) 2020-11-20 2022-05-25 Bayerische Motoren Werke Aktiengesellschaft Verfahren, System sowie ein Computerprogramm zum Erzeugen einer virtuellen Umgebung eines Fahrzeugs
DE102020215657A1 (de) 2020-12-10 2022-06-15 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren und System zum Testen eines Steuergeräts eines Fahrzeugs

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
JP3931879B2 (ja) 2003-11-28 2007-06-20 株式会社デンソー センサフュージョンシステム及びそれを用いた車両制御装置
DE102005008714A1 (de) 2005-02-25 2006-09-07 Robert Bosch Gmbh Verfahren und System zur Bereitstellung von Sensor-Daten
DE102005036953A1 (de) * 2005-08-05 2007-02-08 Robert Bosch Gmbh Verfahren zum Erzeugen von Umwelthypothesen für Fahrerassistenzfunktionen
EP2107503A1 (fr) * 2008-03-31 2009-10-07 Harman Becker Automotive Systems GmbH Procédé et dispositif pour générer un modèle en temps réel pour véhicules
US8739049B2 (en) * 2010-05-24 2014-05-27 GM Global Technology Operations LLC Vehicle system modeling systems and methods
DE102011086342A1 (de) * 2011-11-15 2013-05-16 Robert Bosch Gmbh Vorrichtung und verfahren zum betreiben eines fahrzeugs

Cited By (5)

Publication number Priority date Publication date Assignee Title
US10229363B2 (en) 2015-10-19 2019-03-12 Ford Global Technologies, Llc Probabilistic inference using weighted-integrals-and-sums-by-hashing for object tracking
WO2017167528A1 (fr) * 2016-03-31 2017-10-05 Siemens Aktiengesellschaft Procédé et système de validation d'un système de détection d'obstacles
CN109311497A (zh) * 2016-03-31 2019-02-05 西门子移动有限公司 用于验证障碍物识别系统的方法和系统
US11427239B2 (en) 2016-03-31 2022-08-30 Siemens Mobility GmbH Method and system for validating an obstacle identification system
DE102022209080A1 (de) 2022-09-01 2024-03-07 Robert Bosch Gesellschaft mit beschränkter Haftung Verfahren zum Kalibrieren eines Sensors, Recheneinheit und Sensorsystem

Also Published As

Publication number Publication date
WO2014183953A1 (fr) 2014-11-20
WO2014183949A1 (fr) 2014-11-20
WO2014183945A1 (fr) 2014-11-20
DE102013218678A1 (de) 2014-11-20
DE102013212710A1 (de) 2014-11-20
DE102013213807A1 (de) 2014-11-20
DE102013215032A1 (de) 2014-11-20
DE102013215115A1 (de) 2014-11-20
WO2014183944A1 (fr) 2014-11-20

Similar Documents

Publication Publication Date Title
WO2014183948A2 (fr) Capteur, simulateur et procédé de simulation de mesures de capteur, de fusion de mesures de capteur, de validation d'un modèle pour capteur et de conception d'un système d'aide à la conduite
EP3438901A1 (fr) Système de base de données de scénario de conduite d'essai pour scénarios de conduite d'essais virtuels proches de la réalité
EP3200428B1 (fr) Procede mis en oeuvre par un ordinateur destine a executer une application v2x
DE102018100469A1 (de) Generierten von simulierten sensordaten zum trainieren und überprüfen von erkennungsmodellen
DE102019102205A1 (de) System und verfahren zur end-to-end-validierung von autonomen fahrzeugen
DE102018121018A1 (de) Erweitern von realen sensoraufnahmen mit simuliertem sensordatenhintergrund
DE102016220670A1 (de) Verfahren und System zum Testen von Software für autonome Fahrzeuge
EP3438856A1 (fr) Procédé de modélisation d'un capteur de véhicule automobile dans un environnement d'essai virtuel
DE102019215903A1 (de) Verfahren und Vorrichtung zum Erzeugen von Trainingsdaten für ein Erkennungsmodell zum Erkennen von Objekten in Sensordaten eines Sensors insbesondere eines Fahrzeugs, Verfahren zum Trainieren und Verfahren zum Ansteuern
DE102014118622A1 (de) Verfahren zum simulativen Bestimmen einer Interaktion zwischen einem Sensor eines Kraftfahrzeugs und einem virtuellen Objekt in einem virtuellen Umgebungsbereich des Kraftfahrzeugs sowie Recheneinrichtung
DE102019105337A1 (de) Ultraschallmesssystem im Fahrzeug zur Erkennung und Klassifizierung von Objekten im Umfeld des Fahrzeugs mit Hilfe eines Deep-Learning Verfahrens
DE102019213546A1 (de) Erzeugung synthetischer Lidarsignale
DE102011015094B4 (de) Verfahren zum simulativen Ermitteln von Messeigenschaften eines Sensors eines Kraftfahrzeugs und Rechensystem
DE102020128978A1 (de) Trainieren von tiefen neuronalen netzwerken mit synthetischen bildern
WO2019162317A1 (fr) Procédé de génération de données de capteur pour des appareils de commande d'automobile critiques pour la sécurité
DE102019130096A1 (de) Verfahren zur Ermittlung einer Radarinformation, Simulationsverfahren, Klassifizierungsverfahren, Computerprogramm und Berechnungssystem
DE102020214596A1 (de) Verfahren zum Erzeugen von Trainingsdaten für ein Erkennungsmodell zum Erkennen von Objekten in Sensordaten einer Umfeldsensorik eines Fahrzeugs, Verfahren zum Erzeugen eines solchen Erkennungsmodells und Verfahren zum Ansteuern einer Aktorik eines Fahrzeugs
DE102020215657A1 (de) Verfahren und System zum Testen eines Steuergeräts eines Fahrzeugs
DE102018207566A1 (de) System zum Durchführen von simulierten Kollisionsszenarios von einem Kraftfahrzeug mit einem nicht-motorisierten Verkehrsteilnehmer
DE102020101060B4 (de) Selbstlernendes Ultraschallmesssystem im Fahrzeug zur Erkennung und Klassifizierung von Objekten im Umfeld des Fahrzeugs mit einem Multiplanar-Reformatierer
DE102017201796A1 (de) Steuervorrichtung zum Ermitteln einer Eigenbewegung eines Kraftfahrzeugs sowie Kraftfahrzeug und Verfahren zum Bereitstellen der Steuervorrichtung
DE102014118624A1 (de) Verfahren zum simulativen Bestimmen einer Interaktion zwischen einem Sensor eines Kraftfahrzeugs und einem virtuellen Objekt in einem virtuellen Umgebungsbereich des Kraftfahrzeugs sowie Recheneinrichtung
DE102008055932A1 (de) Verfahren zur modellbasierten Simulation eines Verhaltens eines Sensors
DE102020101036B4 (de) Selbstlernendes Ultraschallmesssystem im Fahrzeug zur Erkennung und Klassifizierung von Objekten im Umfeld des Fahrzeugs mit einem Volumenrenderer
DE102021213538A1 (de) Simulation zur Validierung einer automatisierenden Fahrfunktion für ein Fahrzeug

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14718050

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14718050

Country of ref document: EP

Kind code of ref document: A2