EP3850536A1 - Analysis of dynamic spatial scenarios - Google Patents
Analysis of dynamic spatial scenarios
- Publication number
- EP3850536A1 (application EP19773727.3A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- ego
- vehicle
- sensor data
- representation
- scenario
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0088—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W10/00—Conjoint control of vehicle sub-units of different type or different function
- B60W10/20—Conjoint control of vehicle sub-units of different type or different function including control of steering systems
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/20—Drawing from basic elements, e.g. lines or circles
- G06T11/203—Drawing of straight lines or curves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/42—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
- G06V10/422—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation for representing the structure of the pattern or shape of an object therefor
- G06V10/426—Graphical representations
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/62—Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/08—Detecting or categorising vehicles
Definitions
- the present invention relates to a method and a system for data preparation of dynamic spatial scenarios for processing by an artificial neural network, a computer-aided method and a system for training an artificial neural network, as well as a computer-aided method and a system for analyzing sensor data.
- Advanced driver assistance systems (ADAS) support the driver in driving the vehicle.
- Support ranges from the mere display of possibly relevant information (e.g. issuing a warning by a lane change assistant) through semi-autonomous interventions (e.g. regulation of the torque applied to the wheel axles by an anti-lock braking system) to fully autonomous interventions in the control of the vehicle (e.g. adaptive cruise control (ACC)).
- The basis for such driver assistance systems is usually formed by sensor data, for example signals provided by ultrasound sensors, radar sensors or cameras, from which the current driving situation can be determined and in response to which the function of the respective driver assistance system can be carried out.
- the current driving situation must be classifiable very reliably on the basis of the sensor data.
- a traffic scenario in which a neighboring vehicle cuts into the lane in front of the ego vehicle equipped with the driver assistance system can, for example, be recognized by the fact that a sensor-detected transverse distance to the neighboring vehicle, perpendicular to the direction of travel, decreases and finally, at least essentially, assumes the value 0 when the neighboring vehicle is located directly in front of the ego vehicle.
- the programming of algorithms that are suitable for (reliably) recognizing such driving situations or scenarios is generally extremely complex and may even be impossible when a large number of driving situations has to be taken into account. Therefore, machine learning is increasingly being used to automatically find characteristics in sensor data that indicate a driving situation.
- the sensor data used are usually sensor data that have been classified, e.g. manually, with respect to different driving situations (labeled sensor data), from which, for example, an artificial neural network can extract the relevant criteria for the respective driving situation.
- the classified sensor data must usually be in a specified form, i.e. be prepared accordingly, so that they can be meaningfully processed by the artificial neural network.
- grid maps show the corresponding driving situations from a bird's eye view.
- the dynamics of an underlying driving situation are reproduced, for example, by showing in the corresponding grid map all the positions taken over time by the respective traffic participants.
- Such grid maps are described, for example, in Grüner et al., “Spatiotemporal Representation of Driving Scenarios and Classification using Neural Networks”, 2017 IEEE Intelligent Vehicles Symposium, pp. 1782-1788.
- This object is achieved by a method and a system for data processing, a method and a system for training an artificial neural network and a method and a system for analyzing sensor data in accordance with the independent claims.
- a first aspect of the invention relates to a method for data processing of dynamic spatial scenarios, in particular traffic scenarios, for processing by an artificial neural network.
- a representation of a temporal course of an angle sector, which is covered by another object, in particular another vehicle, from the perspective of an ego object, in particular an ego vehicle, is generated.
- the course over time is determined from sensor data, the sensor data being suitable for characterizing a dynamic spatial scenario in relation to the ego object and at least one other object.
- a spatial scenario in the sense of the invention is in particular formed from a chronological sequence of spatial, in particular static, scenes.
- the spatial scenes reproduce, for example, the spatial arrangement of the at least one other object relative to the ego object, e.g. the constellations of road users.
- a spatial scenario can in particular contain a driving situation in which a driver assistance system at least partially controls the vehicle equipped with it, referred to as the ego vehicle, e.g. carries out at least one vehicle function of the ego vehicle autonomously.
- Dynamic in the sense of the invention means, in particular, a temporal sequence or temporal course. Traffic scenarios are dynamic, for example, because the individual traffic participants move relative to each other over time, i.e. change their positions relative to each other.
- a representation in the sense of the invention is in particular a graphic representation, e.g. a graph or chart.
- the representation preferably contains a, in particular two- or three-dimensional, image.
- the representation can also contain a mathematical representation, e.g. an assignment rule or a function.
- the representation can, for example, show the course over time of an angle sector, which is covered by the other object from the perspective of the ego object, in a map-like illustration.
- Sensor data in the sense of the invention are, in particular, data generated by real or simulated sensors, in particular environmental sensors, in particular in the form of signals, i.e. real or simulated sensor data.
- the sensors are preferably set up to record an environment of the ego vehicle and to generate corresponding sensor data so that the sensor data characterize the environment of the ego vehicle.
- the sensor data are preferably sensor fusion data which are obtained from a combination of signals provided by the sensors and, if appropriate, have already been prepared, at least to a certain extent.
- An ego object in the sense of the invention is in particular the object from the perspective of which the dynamic spatial scenario is viewed.
- An ego object can, for example, be an (ego) vehicle from the point of view of which a traffic scenario is recorded and possibly analyzed and/or evaluated, for example in order to be able to adequately control a driver assistance system or to be able to react to the recorded traffic scenario.
- sensor data that are prepared, for example, for further processing are preferably generated by environmental sensors of the ego vehicle, the environmental sensors, such as ultrasound sensors, radar sensors, cameras and/or the like, being set up to detect the environment of the ego vehicle.
- An angular sector in the sense of the invention is in particular that area which is taken up by another object from the perspective of the ego object. From the perspective of an ego vehicle, the angle sector corresponds, for example, to the area that is covered by another vehicle.
- the angle sector is defined by the cross-sectional area of the object or its contour.
- the angle sector can be defined by a maximum extension of the cross-sectional area or contour along a predetermined direction, in particular in the horizontal direction.
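- As an illustration of this definition, the following minimal sketch (not part of the patent; Python with hypothetical names and a simplified planar geometry) computes the angle sector covered by the contour of another object from the perspective of the ego object, assuming the contour is given as corner points in a common coordinate frame and the object lies roughly ahead of the ego object:

```python
import math

def angle_sector(ego_xy, ego_heading, corners):
    """Return (phi_min, phi_max): the angle sector, in radians, that another
    object's contour covers in the field of vision of the ego object.

    ego_xy      -- (x, y) position of the ego object
    ego_heading -- heading of the ego object in radians (0 = direction of travel)
    corners     -- (x, y) corner points of the other object's contour

    Assumes the other object lies roughly ahead of the ego object, so the
    covered sector does not straddle the +/- pi wrap-around.
    """
    bearings = []
    for cx, cy in corners:
        b = math.atan2(cy - ego_xy[1], cx - ego_xy[0]) - ego_heading
        # wrap to (-pi, pi]; 0 corresponds to a position directly ahead
        bearings.append(math.atan2(math.sin(b), math.cos(b)))
    return min(bearings), max(bearings)

# Example: a vehicle roughly 20 m ahead and 2-4 m to the left of the ego object
corners = [(20.0, 2.0), (20.0, 4.0), (25.0, 2.0), (25.0, 4.0)]
phi_min, phi_max = angle_sector((0.0, 0.0), 0.0, corners)
width, position = phi_max - phi_min, 0.5 * (phi_min + phi_max)
```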
- the invention is based in particular on the approach of encoding information relating to a dynamic spatial scenario, for example a traffic situation, determined from sensor data in a representation.
- the representation can therefore also be understood as an abstraction of the scenario and serves in particular to summarize different types of information, for example positions, translations, number of objects and / or the like.
- the representation is preferably generated on the basis of the temporal development of an angular sector in which the field of vision of an ego object is covered by another object. With the help of the angle sector, in particular constellations of objects in spatial scenes can be described relative to each other.
- the representation is preferably a graphic, in particular two- or three-dimensional, image, which - in an abstract manner - shows both the position and the movement of the objects relative to one another.
- the movement of an object in the illustration can be represented by a shape, in particular a geometric shape.
- a characteristic shape can preferably be found on the basis of an analysis of such an image.
- the shape is particularly characteristic of the dynamic spatial scenario in relation to the ego object and at least one other object.
- the characteristic shape is a pattern that can be found, e.g. by means of an artificial neural network, by analyzing such a representation, possibly also several such representations generated on the basis of the sensor data.
- the corresponding dynamic spatial scenario can be determined on the basis of an evaluation of a representation.
- the representation of a dynamic spatial scenario based on the temporal development of the angle sector has the advantage that its information content does not depend, or at least depends only slightly, on variations in the scenario. Different variants of the scenario, for example a particularly fast or particularly slow driving maneuver, produce different images in the representation, but with essentially the same or at least a similar shape.
- plotting over time the angular sector which is covered by another vehicle in the field of vision of an ego vehicle, and which thus represents the position of the other vehicle relative to the ego vehicle, generates a pattern whose shape is characteristic of the driving maneuver performed by the other vehicle. If the other vehicle moves away from the ego vehicle, the covered angular sector becomes somewhat smaller, while it increases correspondingly when the other vehicle approaches. If the other vehicle moves sideways relative to the ego vehicle, e.g. with respect to the direction of travel, the covered angle sector shifts.
- the resulting pattern, which has, for example, a characteristic curvature, can then be reliably assigned to a driving maneuver, even if the image varies, e.g. is compressed or stretched, depending on how the driving maneuver is executed, for example aggressively or defensively.
- the representation of the temporal development of the angular sector thus enables various types of information to be summarized in a compact manner. Since the course over time is shown here, overlaps, e.g. of graphic elements, and an associated loss of information are avoided or at least reduced. At the same time, storage space can be saved, for example, and the evaluation of sensor data can be accelerated. In addition, the representation enables reliable identification of dynamic spatial scenarios.
- the representation of the temporal development of the angle sector has the advantage, for example, that the same information can be represented by a much smaller amount of data.
- individual pixels that are not occupied by another vehicle, i.e. “empty” pixels, do not contribute to the information content that can be used, for example, by an artificial neural network.
- the amount of data required to train an artificial neural network can, for example, be reduced by a factor of 27 by using the representation of the temporal development of the angular sector.
- the present invention makes it possible to further improve the processing of sensor data.
- the processing of sensor data for processing by an artificial neural network can be improved, for example by preventing or at least reducing information loss during processing.
- the representation is output to an artificial neural network or made available for processing by the artificial neural network.
- the representation can be stored as a data packet, in particular as a digital image, on a storage medium or transmitted directly to the artificial neural network via an interface.
- large amounts of sensor data can also be used to train the artificial neural network, in particular to determine patterns in the representation that are characteristic of a given scenario, such as the shape of a figure in the representation.
- a second aspect of the invention relates to a computer-based method for training an artificial neural network on the basis of sensor data which are suitable for characterizing a known dynamic spatial scenario in relation to an ego object and at least one other object.
- a representation of a time profile of an angle sector, which is covered from the perspective of an ego object, in particular an ego vehicle, by another object, in particular another vehicle, is generated on the basis of the sensor data.
- the generated representation together with information about the spatial scenario is fed to the artificial neural network.
- the artificial neural network can be trained particularly quickly and reliably to recognize patterns that are characteristic of different scenarios.
- the information about the spatial scenario preferably contains an indication of a classification of the scenario, with the aid of which the scenario can be identified, preferably uniquely.
- the information can already be contained in the sensor data, for example if it is simulated sensor data that was generated by a simulator when simulating a specific scenario.
- the information can also be supplied to the artificial neural network as a separate data stream, for example if the sensor data have already been analyzed and classified in relation to at least one scenario. This classification can in particular have been carried out manually, for example by analyzing an image stream corresponding to the sensor data.
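- To make this training step concrete, the following sketch feeds labeled representation images to a small convolutional classifier. PyTorch, the ScenarioNet architecture, the 64×64 single-channel image size and the full-batch loop are illustrative assumptions and not the network or training procedure specified in the patent:

```python
import torch
import torch.nn as nn

class ScenarioNet(nn.Module):
    """Tiny CNN mapping a representation image to scenario-class scores."""
    def __init__(self, n_scenarios):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_scenarios)  # assumes 64x64 input

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def train(representations, labels, n_scenarios, epochs=10):
    """representations: (N, 1, 64, 64) float tensor of generated representations;
    labels: (N,) long tensor with the known scenario class of each representation."""
    model = ScenarioNet(n_scenarios)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss = loss_fn(model(representations), labels)
        loss.backward()
        optimizer.step()
    return model
```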
- a third aspect of the invention relates to a computer-aided method for analyzing sensor data which are suitable for characterizing a dynamic spatial scenario in relation to an ego object and at least one other object.
- a representation of a time profile of an angle sector, which is covered by another object, in particular another vehicle, from the perspective of an ego object, in particular an ego vehicle, is generated on the basis of the sensor data.
- the generated representation is compared with at least one predefined template of a known dynamic spatial scenario. This means that a known scenario can be detected particularly quickly and reliably on the basis of a sensor data stream, such as is provided by environmental sensors of a vehicle.
- the predefined template is preferably a generic representation of the temporal development of the angle sector, which e.g. contains all essential characteristic patterns of a scenario.
- the predefined template can be a representation that contains at least one figure, the shape of which is characteristic of a driving maneuver of a vehicle that is carried out in the context of a traffic scenario.
- the predefined template is preferably determined on the basis of a number of representations which are characteristic of the known dynamic spatial scenario.
- the multiple representations that are characteristic of the known scenario can be averaged.
- These representations can be generated beforehand, for example from sensor data that have been classified in relation to the scenario. As a result, a high degree of reliability can be achieved when comparing the generated representation with the predefined template.
- the dynamic spatial scenario is classified on the basis of the comparison, for example by assigning the determined representation to the at least one predefined template. This enables the scenario to be reliably identified and, if necessary, reactions of a driver assistance system to be triggered.
- a measure of similarity is preferably determined, on the basis of which the scenario can be classified, for example, and the generated representation can be assigned to a scenario. It is conceivable, for example, to assign the generated representation to the scenario whose predefined template best represents the generated representation by means of an elastic matching or nonlinear template matching method, i.e. for which a similarity measure obtained by the method becomes maximal. This enables the scenario to be identified particularly reliably.
- a template is defined for a new dynamic spatial scenario if the generated representation cannot be assigned to the at least one predefined template of a known dynamic spatial scenario.
- a template can be defined for a new dynamic spatial scenario if a similarity measure determined when comparing the generated representation does not reach a predetermined threshold value.
- a catalog of predefined templates of known dynamic spatial scenarios can be generated in a simple manner, in particular in real time.
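- The following sketch illustrates this template handling under simplifying assumptions: templates are built by pixel-wise averaging of representations of the same known scenario, similarity is measured with plain normalized cross-correlation instead of the elastic matching or nonlinear template matching mentioned above, and a new template is registered whenever no similarity reaches a threshold; all names and the threshold value are hypothetical:

```python
import numpy as np

def make_template(representations):
    """Average several representations of the same known scenario into a template."""
    return np.mean(np.stack(representations), axis=0)

def similarity(representation, template):
    """Normalized cross-correlation as a simple stand-in for the similarity measure."""
    a = representation - representation.mean()
    b = template - template.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def classify(representation, templates, threshold=0.7):
    """Assign the representation to the best-matching template, or register it as
    a template for a new scenario if no similarity reaches the threshold."""
    scores = {name: similarity(representation, t) for name, t in templates.items()}
    best = max(scores, key=scores.get) if scores else None
    if best is None or scores[best] < threshold:
        new_name = f"scenario_{len(templates)}"
        templates[new_name] = representation.copy()
        return new_name, scores
    return best, scores
```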
- the generated representation furthermore reproduces a, in particular transverse, distance of the ego object from the other object and/or a, in particular transverse, speed of the ego object.
- the distance and/or the speed is or are likewise determined from the sensor data.
- the corresponding distance information or speed information is preferably encoded in the representation in such a way that it can be read independently of at least the spatial arrangement of the other object relative to the ego object.
- the speed of the ego object can in particular be a transverse speed, i.e. a velocity component essentially perpendicular to the dominant (longitudinal) direction of movement, e.g. a direction of travel, of the ego object.
- the speed of the ego object reproduced in the representation produced is preferably formed exclusively from the transverse speed.
- when driving straight ahead, for example, the transverse speed of the ego vehicle is zero, while when changing lanes it increases to a maximum value and then drops back to zero.
- the distance of the ego object to the other object can in particular have a transversal distance, ie a distance component essentially perpendicular to the dominant (longitudinal) direction of movement, for example a direction of travel, of the ego object.
- the distance of the ego object to the other object reproduced in the representation produced is preferably formed exclusively from the transverse distance. If the ego vehicle is overtaken by another vehicle in an adjacent lane, for example, the absolute distance between the ego vehicle and the other vehicle changes. However, the transverse distance remains constant as long as neither the ego vehicle nor any other vehicle leaves its respective lane. The change in the transverse distance therefore allows (further) conclusions to be drawn about the maneuver carried out, for example in the context of a scenario.
- the time profile of an angle sector is represented by a line, the width of which indicates a value of the respective angle sector.
- a distance of the ego object from the other object and/or a speed of the ego object is or are reproduced by a stored value or a coloring at the respective position of the line which corresponds to the time at which the distance and/or the speed is present.
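- A minimal rendering sketch along these lines (not part of the patent), assuming each object's time profile has already been reduced to (time step, φ_min, φ_max, value) tuples, where the value encodes e.g. the transverse distance and/or transverse speed; the image size, angle range and names are illustrative assumptions:

```python
import numpy as np

def render_representation(profiles, n_steps, phi_range=(-np.pi / 2, np.pi / 2), width=180):
    """Render time profiles of covered angle sectors into a 2D image.

    profiles -- list of per-object profiles, each a list of (t, phi_min, phi_max, value)
                tuples; the value encodes e.g. transverse distance and/or speed at time t
    n_steps  -- number of time steps (image rows)
    """
    img = np.zeros((n_steps, width))
    lo, hi = phi_range

    def col(phi):
        # map an angle to a pixel column, clipped to the image
        return int(np.clip((phi - lo) / (hi - lo) * (width - 1), 0, width - 1))

    for profile in profiles:
        for t, phi_min, phi_max, value in profile:
            img[t, col(phi_min):col(phi_max) + 1] = value  # line width = angle sector
    return img
```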
- the stored value or the coloring of the line can alternatively or additionally be determined by a, in particular generic, function into which, for example, the, in particular transverse, distance between the ego object and the other object and/or the, in particular transverse, speed of the ego object enter as input variables.
- the function can be used, for example, to add or multiply the distance and the speed, possibly taking a weighting into account.
- the speed can be weighted with the distance or the distance with the speed.
- the function is preferably used to manipulate the perception of an artificial neural network.
- the function can be selected so that the resulting representations influence the pattern recognized by the artificial neural network.
- different dynamic spatial scenarios can be distinguished particularly reliably.
- parameters other than the distance between the ego object and the other object and/or the speed of the ego object can also be selected as input variables of the function.
- the parameters can be selected, for example, depending on the, in particular known, dynamic spatial scenario. Parameters are preferably selected as input variables that at least partially characterize the dynamic spatial scenario.
- a fourth aspect of the invention relates to a system for data processing of dynamic spatial scenarios, in particular traffic scenarios, for processing by an artificial neural network.
- the system has a determination module which is set up to determine, from sensor data, a time profile of an angle sector which is covered by another object, in particular another vehicle, from the perspective of an ego object, in particular an ego vehicle.
- the sensor data are suitable for characterizing a dynamic spatial scenario in relation to the ego object and at least one other object.
- the system also has a generation module which is set up to generate a representation of the determined time profile.
- a sixth aspect of the invention relates to a system for analyzing sensor data which are suitable for characterizing a dynamic spatial scenario in relation to an ego object and at least one other object.
- the system has a generation module which is set up to generate, on the basis of the sensor data, a representation of a time profile of an angular sector which is covered by another object, in particular another vehicle, from the perspective of an ego object, in particular an ego vehicle.
- the system also has a comparison module which is set up to compare the generated representation with at least one predefined template of a known dynamic spatial scenario.
- FIG. 1 each shows a preferred exemplary embodiment of a system according to the invention for data processing and of a system according to the invention for training an artificial neural network;
- FIG. 2 shows a preferred exemplary embodiment of a system according to the invention for analyzing sensor data
- FIG. 5 shows a second example to explain the relationship between a representation according to the invention and a corresponding dynamic spatial scenario
- FIG. 6 each shows a preferred exemplary embodiment of a method according to the invention for data processing and of a method according to the invention for training an artificial neural network; and FIG. 7 shows a preferred exemplary embodiment of a method for data evaluation according to the invention.
- the system 100 for data processing has a determination module 2 and a generation module 3 , wherein the determination module 2 is set up to determine a temporal profile of an angle sector, which is covered by another object from the perspective of an ego object, from sensor data S, and the generation module 3 is set up to generate a representation of the determined course over time.
- the system 200 for training the artificial neural network 1 has the generation module 3 and an interface 4, the interface 4 being set up to supply the generated representation together with information about a dynamic spatial scenario to the artificial neural network 1.
- the sensor data S are generated, for example, by environmental sensors of an ego vehicle when recording a traffic scenario and characterize, for example, the number of neighboring vehicles, the relative arrangement, in particular the relative distances, of the other vehicles from the ego vehicle, the speed of the ego vehicle, and/or the like. From these sensor data S, the determination module 2 can preferably determine how wide at least one angle sector hidden by the other vehicles is in the field of vision of the ego vehicle and in what position this angle sector, for example relative to the direction of travel of the ego vehicle, is arranged.
- the resulting sensor data S can also be used to determine the time profile of the angle sector, in particular a change in its width and / or its position from the perspective of the ego vehicle.
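- A sketch of what such a determination module could compute, assuming the relevant object contours have already been extracted from the sensor data S per time step, and re-using the hypothetical angle_sector() helper from the earlier sketch (all names are illustrative assumptions, not the patent's interfaces):

```python
def determine_time_profile(frames, ego_heading=0.0):
    """Width and position of the angle sector covered by one other object over time.

    frames -- one entry per time step: (ego_xy, corners), i.e. the ego position and
              the corner points of the other object's contour in a common frame
    Returns a list of (t, position, width) tuples.
    """
    profile = []
    for t, (ego_xy, corners) in enumerate(frames):
        phi_min, phi_max = angle_sector(ego_xy, ego_heading, corners)
        profile.append((t, 0.5 * (phi_min + phi_max), phi_max - phi_min))
    return profile
```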
- the time course can be used by the generation module 3 to generate a graphical representation that depicts the traffic scenario in an abstract manner.
- the generation module 3 is set up to encode the information contained in the sensor data S regarding the traffic scenario in the representation, in particular in compressed form.
- the sensor data S are preferably sensor data classified in relation to dynamic spatial scenarios, ie the sensor data S are assigned, for example, to one of several classes of traffic scenarios, such as overtaking maneuvers or lane change maneuvers. This classification can be done manually, for example, by viewing an image data stream.
- the representation generated on the basis of the determined time profile of the angle sector can therefore be transmitted from the interface 4 to the artificial neural network 1 with information about the dynamic spatial scenario.
- the artificial neural network 1 is preferably set up to recognize at least one pattern in each case in all representations which are assigned to the same traffic scenario class. On the basis of such a recognized pattern, a template can be defined that characterizes a known traffic scenario. The templates defined in this way can be stored in a database 5 for further use, for example for evaluating sensor data generated in the normal operation of a vehicle.
- FIG. 2 shows a preferred embodiment of a system 300 for analyzing sensor data S which are suitable for characterizing a dynamic spatial scenario in relation to an ego object and at least one other object.
- the system 300 has a generation module 3, which is set up to generate, based on the sensor data S, a representation of a temporal profile of an angle sector, which is covered by another object from the perspective of the ego object.
- the system 300 also has a comparison module 6, which is set up to compare the generated representation with at least one predefined template of a known dynamic spatial scenario.
- the comparison module 6 preferably has access to a database 5, in which at least one predefined template is stored.
- the result of the comparison is preferably output by the comparison module 6 and can be used, for example, to control a driver assistance system with which an ego vehicle is equipped. If, for example, at least a predetermined degree of agreement between the generated representation and the at least one predefined template is ascertained, for example by evaluating a similarity measure generated in the comparison, it can be concluded that the known scenario is present, and the comparison module 6 can output, for instance, a scenario class as the output signal. Alternatively or additionally, the start and/or the end of an identified driving maneuver can also be output, in particular signaled.
- FIG. 3 shows an example of the determination of an angle sector Φ in a dynamic spatial scenario 10 in which an ego object 11 moves along a direction of movement, here along the x-axis of a coordinate system, and is surrounded by other objects 12.
- the spatial scenario 10 is a traffic scenario with an ego vehicle 11a, which is moving in the direction of travel in a middle lane and is surrounded by other vehicles 12a, 12b in adjacent lanes.
- FIG. 3A shows the traffic scenario from a bird's eye view, a temporal sequence of spatial scenes being summarized in the illustration shown.
- the other vehicles 12a, 12b are in different positions relative to the ego vehicle 11a, which are indicated by different fillings of the rectangles representing the vehicles 12a, 12b.
- a denser filling corresponds to a position further back in time.
- a first vehicle 12a performs a lane change from an outer lane to the lane used by the ego vehicle 11a and in doing so cuts in in front of the ego vehicle 11a.
- a second vehicle 12b, which is initially at approximately the same level as the ego vehicle 11a with respect to the direction of movement, falls behind over time. Because the change in position of the second vehicle 12b is relatively small, the positions assumed by the second vehicle 12b at different times overlap in this illustration.
- an angle sector can be determined for each of the two vehicles 12a, 12b, which indicates the area in the field of vision of the ego vehicle 11a that is covered by the respective vehicle 12a, 12b. This is shown in FIG. 3B as an example for the first vehicle 12a.
- the contour 13 of the first vehicle 12a that results from the perspective of the ego vehicle 11a is indicated as a solid line and spans the angle sector Φ.
- a position φ of the angle sector Φ, and thus also of the first vehicle 12a in the field of view of the ego vehicle 11a, can be specified relative to a predetermined direction, for example the direction of movement. If the position of the vehicle 12a shifts relative to the ego vehicle 11a, both the width of the angle sector Φ and its position φ can change.
- the bar diagram depicted in FIG. 3C, in which the inverse of the distance d between the ego vehicle 11a and the other vehicles 12a, 12b, indicated as a black arrow in FIG. 3A, is plotted against the position φ of the angle sector Φ, forms an abstract representation of the traffic scenario shown in FIG. 3A.
- the three bars on the right correspond to the angular range Φ covered by the first vehicle 12a, while the three bars on the left correspond to the angular range Φ covered by the second vehicle 12b.
- the different times are indicated by the corresponding filling of the bars.
- the angular range Φ covered by the first vehicle 12a moves in the direction of a 0° position when it cuts in in front of the ego vehicle 11a, the 0° position of the angle sector Φ corresponding to a position directly in front of the ego vehicle 11a.
- the width of the angular range Φ covered by the first vehicle 12a also increases.
- the distance d between the first vehicle 12a and the ego vehicle 11a is also encoded in the height of the bars, which increases with the passage of time.
- the angular range Φ covered by the second vehicle 12b moves away from the 0° position as the second vehicle 12b falls behind the ego vehicle 11a. Since the distance between the second vehicle 12b and the ego vehicle 11a increases, the height of the bars and their width decrease.
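- A sketch of how such a bar diagram could be assembled for a single scene (not part of the patent), assuming the covered angle sector and the distance of each vehicle are already known; the binning, angle range and names are illustrative assumptions:

```python
import numpy as np

def bar_diagram(observations, n_bins=36, phi_range=(-np.pi / 2, np.pi / 2)):
    """Bar heights (inverse distance 1/d) over angle-position bins for one scene.

    observations -- (phi_min, phi_max, distance) tuples, one per other vehicle
    """
    lo, hi = phi_range
    edges = np.linspace(lo, hi, n_bins + 1)
    heights = np.zeros(n_bins)
    for phi_min, phi_max, distance in observations:
        covered = (edges[:-1] < phi_max) & (edges[1:] > phi_min)  # bins overlapping the sector
        heights[covered] = np.maximum(heights[covered], 1.0 / distance)
    return edges, heights
```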
- FIG. 4 shows a first example to explain the relationship between a preferred exemplary embodiment of a representation 20 according to the invention and a corresponding dynamic spatial scenario 10.
- FIG. 4A shows the representation 20 of the time profiles of angle sectors Φa, Φb, Φc which are covered by other objects from the perspective of an ego object, such as an ego vehicle 11a.
- the time t is plotted against the position φ of the angle sectors Φa, Φb, Φc, the 0° position corresponding to a position in front of the ego object.
- the temporal courses of the angle sectors Φa, Φb, Φc are represented as lines, the width of which corresponds to the distance of the respective object from the ego object.
- a first object, for example a first vehicle 12a, covers the angle sector Φa, and a second object, for example a second vehicle 12b, covers the angle sector Φb.
- the second object changes its position and moves between the ego object and the first object. From this point in time, the time profile of the angle sector Φb, which is covered by the second object, overlaps the profile of the angle sector Φa, which is covered by the first object.
- another angle sector Φc is covered by a third object, such as a third vehicle 12c.
- the position φ of the angle sector Φc covered by the third object subsequently shifts in the direction of the 0° position.
- This temporal course can be caused, for example, by a movement of the third object parallel to the direction of movement of the ego object, the distance between the third object and the ego object increasing.
- Such temporal profiles of the angle sectors Φa, Φb, Φc can, for example, be characteristic of the spatial dynamic scenario 10 shown in FIG. 4B, here a traffic scenario.
- the development over time is indicated in FIG. 4B by trajectories of the vehicles on the carriageway marked with dots.
- the second vehicle 12b cuts in between the ego vehicle 11a and the first vehicle 12a in the middle lane.
- the ego vehicle 11a, the first vehicle 12a and the second vehicle 12b then continue to move together in the middle lane. Because the second vehicle 12b now obscures the view of the first vehicle 12a from the perspective of the ego vehicle 11a, only one line is visible at the 0° position in the corresponding representation 20 in FIG. 4A.
- the representation 20 in FIG. 4A created by the time profiles of the angle sectors has a pattern which is characteristic of the spatial dynamic scenario 10 described. If many such representations are generated on the basis of sensor data that were repeatedly collected when such a scenario was recorded, an artificial neural network can learn this pattern or can be trained to recognize this pattern. During regular operation of a vehicle, the temporal course of angle sectors can then be represented in real time from the generated sensor data and analyzed by the trained artificial neural network, in particular compared with the learned pattern. This preferably determines whether or at what point in time a known traffic scenario is present.
- FIG. 4C shows the result of such a comparison, for example between the illustration 20 shown in FIG. 4A and a corresponding template, an output signal a being plotted against time t.
- Around time t30, the output signal a jumps to the value 6, which signals the presence of the known traffic scenario, in this case the cutting-in of the second vehicle 12b.
- the value of the output signal a can also be used to output the classification of the traffic scenario.
- the comparison of the generated representation with different templates could, for example, show the greatest agreement with a template that is assigned to a cutting-out maneuver, and the output signal a could then assume a different value accordingly.
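- A sketch of how such an output signal could be derived by a sliding comparison of the representation with predefined templates (not part of the patent); normalized cross-correlation again stands in for the similarity measure, and the window length, threshold and integer class labels (e.g. 6 for the cutting-in scenario) are illustrative assumptions:

```python
import numpy as np

def output_signal(representation, templates, window=30, threshold=0.7):
    """Scenario-class signal a(t) from a sliding comparison of the representation
    (rows = time steps) with templates: a non-empty dict mapping integer class
    labels to template arrays of shape (window, representation.shape[1])."""
    def ncc(a, b):
        a, b = a - a.mean(), b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return (a * b).sum() / denom if denom > 0 else 0.0

    signal = np.zeros(representation.shape[0], dtype=int)
    for t in range(window, representation.shape[0] + 1):
        patch = representation[t - window:t]
        scores = {cls: ncc(patch, tmpl) for cls, tmpl in templates.items()}
        best = max(scores, key=scores.get)
        if scores[best] >= threshold:
            signal[t - 1] = best  # stays 0 otherwise: no known scenario detected
    return signal
```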
- FIG. 5 shows a second example to explain the relationship between a preferred exemplary embodiment of an illustration 20 according to the invention and a corresponding dynamic spatial scenario 10.
- FIG. 5A shows a scene from the traffic situation corresponding to the dynamic spatial scenario 10; in this case the ego vehicle 11a, indicated by the arrow, cuts into a lane in which another, first vehicle 12a is already driving.
- a further, second vehicle 12b is traveling in a further lane.
- the development over time of the dynamic spatial scenario 10 is shown in FIG. 5B by trajectories of the vehicles 11a, 12a, 12b, identified by points.
- the other two vehicles 12a, 12b continue to drive straight ahead at a somewhat higher speed, so that the ego vehicle 11a slowly falls further behind.
- FIG. 5C shows, in a representation 20, the temporal profiles of the angle sectors Φa, Φb, which are covered by the first vehicle 12a and the second vehicle 12b from the perspective of the ego vehicle 11a.
- the time t is plotted against a position φ of the angle sectors Φa, Φb.
- a 0° position shown in FIG. 5C corresponds to a position in front of the ego vehicle 11a.
- the angle sectors Φa, Φb shift in the direction of the 0° position, since the first and second vehicles 12a, 12b, as described above, move away from the ego vehicle 11a due to their higher speed.
- the time profiles of the angle sectors Φa, Φb thereby curve in the direction of the 0° position.
- the angle sector Φa covered by the first vehicle 12a subsequently runs along the 0° position since, as shown in FIGS. 5A and 5B, the first vehicle 12a drives ahead of the ego vehicle 11a in the same lane.
- the representation 20 shows, in addition to the angle sectors Φa, Φb, a value which is characteristic of the, in particular transverse, distance between the ego vehicle 11a and the other vehicles 12a, 12b and/or of the, in particular transverse, speed of the ego vehicle 11a at time t.
- the transverse distance or the transverse speed refer here to a transverse component of the distance or the speed, i.e. to the y components in the illustration shown in FIG. 5B.
- the value which is characteristic of the distance and/or the speed is shown in the representation 20 as a coloring of the time profile, indicated by the tint of the angle sectors shown.
- Since the ego vehicle 11a does not change its (transverse) speed when a lane change is carried out by one of the other vehicles 12a, 12b, it can be concluded in the present case that the ego vehicle 11a is cutting into the lane used by the first vehicle 12a.
- sensor data are generated, for example by sensory detection of the environment of an ego object, and classified, i.e. assigned to different dynamic spatial scenarios.
- the classification can be carried out manually, for example, by evaluating an image data stream.
- the sensor data can also be classified automatically, in particular if it is sensor data that was generated by a simulator when simulating various dynamic spatial scenarios.
- a temporal course of an angle sector, which is covered by another object from the perspective of the ego object, can be determined on the basis of the sensor data.
- the contour, in particular the cross-sectional area, of the other object can be determined and its width in, or its share of, the field of vision of the ego object can be determined.
- likewise, a geometric center of gravity of the contour or the cross-sectional area and its position in the field of view of the ego object, in particular relative to the direction of movement of the ego object, can be determined.
- a, in particular graphical, representation which depicts the course over time is generated from the time profile of the angle sector.
- the time profile of the angle sector can, for example, form a pattern, for example a figure.
- the width of the angle sector, in particular its share in the field of view of the ego object, and its position in the field of view of the ego object are preferably determined, in particular relative to the direction of movement of the ego object.
- a speed of the ego object, in particular a transverse speed, and / or a distance, in particular a transverse distance, of the ego object from the other object is also taken into account in generating the representation.
- a value can be determined and stored, for example using a function into which parameters characterizing the dynamic spatial scenario, such as the speed and/or the distance, are included, or the representation can be colored accordingly.
- the generated representation thus preferably provides information regarding the width and position of the angle sector covered by the other object in the field of view of the ego object and the speed of the ego object and / or the distance to the other object.
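- Putting these steps together, a sketch of the data-preparation pipeline for a single other object, re-using the hypothetical angle_sector() and render_representation() helpers from the earlier sketches; the frame and value formats are illustrative assumptions and not the patent's interfaces:

```python
def prepare_representation(frames, values, ego_heading=0.0):
    """frames -- per time step: (ego_xy, corners) for one other object
    values -- per time step: value encoding e.g. transverse distance and/or speed
    Returns one representation image suitable as input to an artificial neural network."""
    profile = []
    for t, (ego_xy, corners) in enumerate(frames):
        phi_min, phi_max = angle_sector(ego_xy, ego_heading, corners)
        profile.append((t, phi_min, phi_max, values[t]))
    return render_representation([profile], n_steps=len(frames))
```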
- the generated representation is fed, for example by means of an interface, to an artificial neural network, which is thereby trained in particular to recognize patterns in the generated representation.
- information about the dynamic spatial scenarios according to which the sensor data were classified in method step S1 is preferably supplied to the artificial neural network, so that the artificial neural network can correlate the generated representation or the recognized pattern with one of the dynamic spatial scenarios in each case.
- FIG. 7 shows a preferred exemplary embodiment of a method V3 according to the invention for analyzing sensor data which are suitable for characterizing a dynamic spatial scenario in relation to an ego object and at least one other object.
- the sensor data are preferably sensor data generated by environmental sensors of an ego vehicle.
- a representation of a time profile of an angle sector is generated from the sensor data.
- the angle sector corresponds to the area in the field of view of the ego object that is covered by the other object.
- Such a representation can be, for example, an illustration in which the temporal course of the angle sector forms a pattern, for example a figure.
- the time profile of the angle sector can also be determined in a separate, previous method step (not shown) based on the sensor data.
- in a method step S5, the generated representation is compared with at least one predefined template of a known dynamic spatial scenario. In this way, it can be determined in which dynamic spatial scenario the ego object is currently located and, if necessary, a driver assistance system can be controlled accordingly.
- the generated representation can be saved as a further predefined template, for example in a database, if no or at least insufficient correspondence of the generated representation with the at least one predefined template, which is assigned to a known dynamic spatial scenario, can be determined.
- in this way, a catalog of predefined templates can be generated which are suitable for identifying dynamic spatial scenarios, in particular essentially in real time.
- V2 Method for training an artificial neural network
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
ATA50788/2018A AT521647B1 (en) | 2018-09-14 | 2018-09-14 | Method and system for data processing, for training an artificial neural network and for analyzing sensor data |
PCT/AT2019/060301 WO2020051618A1 (en) | 2018-09-14 | 2019-09-13 | Analysis of dynamic spatial scenarios |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3850536A1 true EP3850536A1 (en) | 2021-07-21 |
Family
ID=68062777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19773727.3A Pending EP3850536A1 (en) | 2018-09-14 | 2019-09-13 | Analysis of dynamic spatial scenarios |
Country Status (7)
Country | Link |
---|---|
US (1) | US20220237889A1 (en) |
EP (1) | EP3850536A1 (en) |
JP (1) | JP7511544B2 (en) |
KR (1) | KR20210060535A (en) |
CN (1) | CN112673379A (en) |
AT (1) | AT521647B1 (en) |
WO (1) | WO2020051618A1 (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102592826B1 (en) * | 2018-10-10 | 2023-10-23 | 현대자동차주식회사 | Apparatus and method for identificating short cut in vehicle and vehicle including the same |
DE102021100395A1 (en) * | 2021-01-12 | 2022-07-14 | Dspace Gmbh | Computer-implemented method for determining similarity values of traffic scenarios |
WO2023087248A1 (en) * | 2021-11-19 | 2023-05-25 | 华为技术有限公司 | Information processing method and apparatus |
DE102022126747A1 (en) | 2022-10-13 | 2024-04-18 | Dr. Ing. H.C. F. Porsche Aktiengesellschaft | Method for generating scenario models for autonomous driving |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3153839B2 (en) * | 1993-08-23 | 2001-04-09 | 三菱電機株式会社 | Preventive safety devices for vehicles |
JP3252680B2 (en) * | 1995-11-24 | 2002-02-04 | トヨタ自動車株式会社 | In-vehicle scanning radar device |
JPH10338057A (en) * | 1997-06-10 | 1998-12-22 | Hitachi Ltd | Automatic travel controller and inter-vehicle distance warning device for automobile |
DE10141037C1 (en) * | 2001-08-20 | 2003-04-03 | Siemens Ag | Obstacle detection device |
CN102307774A (en) * | 2009-02-03 | 2012-01-04 | 大陆-特韦斯贸易合伙股份公司及两合公司 | Method and device for carrying out an avoidance manoeuvre |
JP5267588B2 (en) * | 2010-03-26 | 2013-08-21 | 株式会社デンソー | Marking line detection apparatus and marking line detection method |
CN103530605B (en) * | 2013-09-29 | 2017-01-25 | 数基科技(北京)有限公司 | Method for detecting abnormal behavior of robustness |
DE102014213359A1 (en) * | 2014-07-09 | 2016-01-14 | Robert Bosch Gmbh | Apparatus and method for the acoustic examination of environmental objects of a means of locomotion |
SE538908C2 (en) * | 2015-10-22 | 2017-02-07 | Uniquesec Ab | Testing method with virtual radar signatures for an automotive safety radar system |
US11144761B2 (en) * | 2016-04-04 | 2021-10-12 | Xerox Corporation | Deep data association for online multi-class multi-object tracking |
US10664750B2 (en) * | 2016-08-10 | 2020-05-26 | Google Llc | Deep machine learning to predict and prevent adverse conditions at structural assets |
US10339671B2 (en) * | 2016-11-14 | 2019-07-02 | Nec Corporation | Action recognition using accurate object proposals by tracking detections |
JP7069539B2 (en) * | 2016-12-07 | 2022-05-18 | スズキ株式会社 | Driving support device |
KR20180068511A (en) * | 2016-12-14 | 2018-06-22 | 삼성전자주식회사 | Apparatus and method for generating training data for training neural network determining information related to road included in an image |
US10228693B2 (en) * | 2017-01-13 | 2019-03-12 | Ford Global Technologies, Llc | Generating simulated sensor data for training and validation of detection models |
US10402163B2 (en) * | 2017-02-14 | 2019-09-03 | Accenture Global Solutions Limited | Intelligent data extraction |
WO2019241022A1 (en) * | 2018-06-13 | 2019-12-19 | Nvidia Corporation | Path detection for autonomous machines using deep neural networks |
-
2018
- 2018-09-14 AT ATA50788/2018A patent/AT521647B1/en active
-
2019
- 2019-09-13 EP EP19773727.3A patent/EP3850536A1/en active Pending
- 2019-09-13 KR KR1020217010955A patent/KR20210060535A/en unknown
- 2019-09-13 US US17/275,810 patent/US20220237889A1/en active Pending
- 2019-09-13 JP JP2021514045A patent/JP7511544B2/en active Active
- 2019-09-13 CN CN201980060057.1A patent/CN112673379A/en active Pending
- 2019-09-13 WO PCT/AT2019/060301 patent/WO2020051618A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
AT521647A1 (en) | 2020-03-15 |
CN112673379A (en) | 2021-04-16 |
JP7511544B2 (en) | 2024-07-05 |
JP2022500762A (en) | 2022-01-04 |
AT521647B1 (en) | 2020-09-15 |
US20220237889A1 (en) | 2022-07-28 |
KR20210060535A (en) | 2021-05-26 |
WO2020051618A1 (en) | 2020-03-19 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
 | PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
 | 17P | Request for examination filed | Effective date: 20210413 |
 | AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
 | DAV | Request for validation of the european patent (deleted) | |
 | DAX | Request for extension of the european patent (deleted) | |
 | STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
 | 17Q | First examination report despatched | Effective date: 20230222 |
 | P01 | Opt-out of the competence of the unified patent court (upc) registered | Effective date: 20230503 |