WO2022175405A1 - Verfahren zum erfassen von objekten in einer umgebung eines fahrzeugs mit bestimmung einer verdeckung der objekte, recheneinrichtung sowie sensorsystem - Google Patents
Verfahren zum Erfassen von Objekten in einer Umgebung eines Fahrzeugs mit Bestimmung einer Verdeckung der Objekte, Recheneinrichtung sowie Sensorsystem
- Publication number
- WO2022175405A1 (PCT/EP2022/053993)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- objects
- corners
- sensor
- surroundings
- vehicle
- Prior art date
- 2021-02-19
Links
- 238000000034 method Methods 0.000 title claims abstract description 45
- 238000001514 detection method Methods 0.000 claims description 38
- 238000004590 computer program Methods 0.000 claims description 6
- 238000005259 measurement Methods 0.000 description 9
- 238000004422 calculation algorithm Methods 0.000 description 6
- 238000011156 evaluation Methods 0.000 description 3
- 230000009466 transformation Effects 0.000 description 3
- 230000001174 ascending effect Effects 0.000 description 1
- 238000004364 calculation method Methods 0.000 description 1
- 230000007423 decrease Effects 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000018109 developmental process Effects 0.000 description 1
- 238000002474 experimental method Methods 0.000 description 1
- 238000011835 investigation Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems
- G01S13/72—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
- G01S13/723—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/66—Tracking systems using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/86—Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9323—Alternative operation using light waves
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
- G06T2207/30261—Obstacle
Definitions
- the present invention relates to a method for detecting objects in an area surrounding a vehicle.
- the present invention relates to a computing device and a sensor system for a vehicle.
- the present invention relates to a computer program and a computer-readable (storage) medium.
- Sensor systems for vehicles which have environment sensors are known from the prior art.
- Corresponding sensor data, which describe the surroundings of the vehicle, can be provided with these surroundings sensors.
- Objects and in particular other road users in the area can then be recognized on the basis of the sensor data.
- a reliable detection of other objects and in particular other road users in the area is essential for the automated or autonomous operation of vehicles.
- the existence or a probability of existence of the road users is usually estimated. If such a probability of existence is too low, the detection of the road user cannot be trusted.
- the situation can also arise in which a real existing object is temporarily and/or partially covered by another object. This can lead to the reduction of the object's existence probability.
- the object can be deleted during tracking, which in turn means that a driver assistance system of the vehicle no longer reacts to the object. In the worst case, this can lead to a collision with the object.
- a method is used to detect objects in the surroundings of a vehicle.
- the method includes receiving sensor data describing the objects in the environment from an environment sensor of the vehicle.
- the method includes determining respective corners of the objects, with the corners describing outer boundaries of the respective object.
- the method also includes determining a position of the respective corners relative to the surroundings sensor.
- the method includes sorting the determined corners in a predetermined angular direction.
- the method includes checking, for each of the corners along the angular direction, whether the corner is covered by another of the objects, based on the position of the respective corners relative to the environment sensor, and determining areas of the respective objects that can be detected by the environment sensor based on the checking of the corners.
- the method can be carried out with a corresponding computing device, which can be formed, for example, by an electronic control unit of the vehicle or a sensor system of the vehicle.
- the sensor data can be recorded with the surroundings sensor.
- the sensor data describe the surroundings of the vehicle or an area of the surroundings. Measuring cycles that follow one another in time can be carried out with the surroundings sensor, with the sensor data being provided in each measuring cycle. This sensor data can then be transferred from the environment sensor to the computing device for further evaluation.
- the method can also be carried out on the basis of the sensor data from a number of surroundings sensors.
- the respective objects in the area can be detected based on the sensor data.
- the position of the objects relative to the surroundings sensor can be determined based on the sensor data.
- the leftmost corner and the rightmost corner can be defined as the corners.
- two corners can be determined which describe the spatial extent of the object in the predetermined angular direction. This angular direction can in particular be the azimuth direction.
- An angle is assigned to the corners of the respective objects. After that, the corners are ordered according to their angle values in the angular direction or azimuth direction. In particular, the corners can be sorted in ascending or descending order according to their angle value.
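- purely as an illustration of this step, the following minimal Python sketch reduces a tracked 2D box to its two outer corners in polar coordinates and then sorts all corners in the azimuth direction; the box representation, the sign convention (rightmost corner = smallest azimuth) and all names are assumptions, not taken from the publication:

```python
import math

def outer_corners(box_xy, sensor_xy):
    """Reduce a 2D box (four x/y corners) to its two outer corners,
    returned as (azimuth, radius) pairs relative to the sensor."""
    polar = sorted(
        (math.atan2(y - sensor_xy[1], x - sensor_xy[0]),   # azimuth angle
         math.hypot(x - sensor_xy[0], y - sensor_xy[1]))   # radius
        for x, y in box_xy)
    return polar[0], polar[-1]   # rightmost and leftmost corner

# the outer corners of all objects are then sorted once in the azimuth direction:
# corners = sorted(c for obj in objects for c in outer_corners(obj.box, sensor))
```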
- each corner is checked sequentially in the angular direction.
- Each corner can be examined to see whether it is covered by one or more other objects or whether the corner is visible to the environment sensor.
- the investigation as to whether the corner is covered or visible is carried out based on the determined relative position of the objects or of their corners in relation to the surroundings sensor.
- the detectable area can also be determined for each of the objects. This detectable area describes that area or part of the object that can be detected or seen with the surroundings sensor.
- a list of corners is determined and the corners are examined in order or in angular direction. This results in the advantage that this list of corners is processed only once for a measuring cycle or evaluation step. In comparison to known tracking models, the computing effort can thus be significantly reduced. The occlusion of the objects in the area can thus be determined more efficiently and at the same time reliably.
- a portion of the respective objects that is located in a detection area of the surroundings sensor is preferably determined, and the areas that can be detected are determined as a function of the portion.
- the detection range, which is also referred to as the field of view, describes the area of the environment in which objects can be detected with the environment sensor. Based on the measurements with the environment sensor, the object can be traced or tracked. Thus, for example, the position and/or the spatial dimensions of the object are known from previous measurement cycles. Furthermore, the detection range of the surroundings sensor is known. On the basis of this information, it can then be determined which part of the object lies in the detection area or is arranged in this area. This part of the object in the detection area can then be taken into account when determining the detectable area. This can be used to determine which part of the object can currently actually be detected.
- an occlusion list is determined, in which the respective objects are entered depending on the position of their corners relative to the surroundings sensor, with the occlusion list being updated when the respective corners are checked along the angular direction.
- each corner is checked in order or in angular direction.
- the occlusion list can be updated.
- the objects or the corners can be entered in the occlusion list.
- An identification or ID can also be assigned to the detected objects. This identifier can be entered in the occlusion list.
- the first object in the occlusion list can be the object whose corner is visible.
- the second object in the occlusion list may be occluded by the first object. It can also be taken into account whether the corner examined in each case describes the beginning or the end of the object in the angular direction. If the corner describes the end of the object, the object can be deleted from the occlusion list.
- an angle list is determined which describes detectable angular ranges of the respective objects, the angle list being updated when the respective corners are checked along the angular direction. It can also be provided that an angle, which is assigned to a last change, is stored. On the basis of this angle and the previously described occlusion list, it can be determined for which angular range or azimuth range an object or part of an object is visible to the surroundings sensor. Overall, the detectable areas of the respective objects can thus be determined with little computing effort.
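- to make this bookkeeping concrete, the following is a minimal Python sketch of the single pass over the sorted corners that maintains the occlusion list and the visible angular span per object, as described above and illustrated in the detailed example further below. It assumes that every object has already been reduced to two corners at a simplified constant radius and that objects do not overlap; all names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Track:
    obj_id: str
    theta_r: float  # azimuth of the rightmost corner (object begins), degrees
    theta_l: float  # azimuth of the leftmost corner (object ends), degrees
    radius: float   # simplified constant distance from the sensor

def visible_angles(tracks):
    """Single sweep over all corners sorted in the azimuth direction.
    Returns the visible angular span per object identifier."""
    # every object contributes a begin event (right corner) and an end event
    # (left corner); ties are resolved so that begin events come first
    events = sorted(
        [(t.theta_r, 0, t) for t in tracks] + [(t.theta_l, 1, t) for t in tracks],
        key=lambda e: (e[0], e[1]))
    occlusion = []                              # currently open objects, nearest first
    visible = {t.obj_id: 0.0 for t in tracks}   # angle list: visible span per object
    last_change = 0.0                           # azimuth of the last visibility change
    for theta, is_end, track in events:
        if not is_end:                          # right corner: object begins
            if not occlusion or track.radius < occlusion[0].radius:
                if occlusion:                   # previous front object was visible so far
                    visible[occlusion[0].obj_id] += theta - last_change
                occlusion.insert(0, track)      # new visible front object
                last_change = theta
            else:                               # corner hidden: sort in by distance
                i = 0
                while i < len(occlusion) and occlusion[i].radius < track.radius:
                    i += 1
                occlusion.insert(i, track)
        else:                                   # left corner: object ends
            if occlusion[0] is track:           # object was visible until this corner
                visible[track.obj_id] += theta - last_change
                last_change = theta
            occlusion.remove(track)
    return visible
```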
- the respective corners of the objects are determined in polar coordinates.
- a tracking algorithm can be used to predict the respective position of the objects at the time of measurement by the environment sensor based on the sensor data. These tracks, which describe the objects or their position, can be transformed into the detection range of the environment sensor. The objects or tracks can then be defined in polar coordinates.
- the corners can each be specified by an angle and a distance or radius. The angle can in particular correspond to the azimuth angle.
- the relative position between the environment sensor and the corner can be determined by the radius. Provision can also be made for not only the radius but also the angle between the environment sensor and the corner to be taken into account when calculating the relative position. In this way, the concealment determination can also be carried out reliably in the case of corners which are at a small distance from the surroundings sensor but which are associated with a concealed object.
- an existence probability for the respective objects is determined as a function of the detectable areas of the objects.
- the tracking of the respective objects can preferably also be carried out on the basis of the specific detectable areas. It is thus possible to react to an object that is currently not visible to the environment sensor. When the object is visible again, there is already an estimate for position and speed that can be used directly. This offers an advantage over known methods in which the hidden objects are deleted. In addition, with known methods, a certain number of measurements are required before the object is considered to be confirmed again at all. In addition, a certain initialization time is needed to estimate the full state of motion.
- the calculation of the visibility, which is based on the detection area, is combined with the covering by other objects.
- the number of other objects in the area can range between 100 and 200.
- a mutual occlusion would have to be calculated for all possible combinations of objects.
- the computing effort can be significantly reduced by the method according to the invention. Experiments have shown that the computing time increases linearly with the number of objects in the area.
- a computing device for a sensor system of a vehicle is set up to carry out a method according to the invention and the advantageous refinements.
- the computing device can be formed in particular by an electronic control unit of the vehicle.
- a sensor system according to the invention includes a computing device according to the invention and at least one environment sensor.
- the surroundings sensor can be in the form of a radar sensor, lidar sensor or camera.
- the environment sensor can also have a detection range of 360° in relation to the azimuth direction.
- the sensor system can also have a plurality of surroundings sensors and also different types of surroundings sensors.
- the angular range can be defined here, for example, from 0° to 360° or from -180° to 180°. Problems can arise with objects that extend beyond the defined angular limit of 180° to -180° or from 360° to 0°. In the present case, it is provided for objects that extend beyond the angular boundary to be divided into two sub-objects. The division takes place at the angle limit. In other words, a total of four corners are then assigned to such an object. However, the same identifier can be assigned to the two sub-objects.
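- a short sketch of this splitting step, assuming an angle convention of -180° to 180° in which a wrap-around shows up as a right-corner azimuth larger than the left-corner azimuth (names are illustrative):

```python
def split_at_boundary(obj_id, theta_r, theta_l):
    """Split an object crossing the 180°/-180° boundary into two
    sub-objects that carry the same identifier (four corners in total)."""
    if theta_r <= theta_l:
        return [(obj_id, theta_r, theta_l)]   # no wrap-around
    return [(obj_id, theta_r, 180.0),         # part up to the angle limit
            (obj_id, -180.0, theta_l)]        # remaining part
```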
- the sensor system can be part of a driver assistance system of the vehicle.
- the vehicle can be maneuvered automatically or autonomously using the driver assistance system.
- a vehicle according to the invention includes a sensor system according to the invention.
- the vehicle is designed in particular as a passenger car.
- a further aspect of the invention relates to a computer program, comprising instructions which, when the program is executed by a computing device, cause the latter to carry out a method according to the invention and the advantageous configurations thereof. Furthermore, the invention relates to a computer-readable (storage) medium, comprising instructions which, when executed by a computing device, cause it to carry out a method according to the invention and the advantageous configurations thereof.
- FIG. 1 shows a schematic representation of a vehicle, which has a sensor system with surroundings sensors, and an object in the surroundings of the vehicle;
- FIGS. 2a to 2c show schematic representations of the vehicle in an inner-city traffic situation at successive points in time;
- FIG. 3 shows a schematic flow chart of a method for detecting the objects;
- FIG. 4 shows a schematic representation of an environment sensor and an object, with corners of the object being defined in polar coordinates;
- FIG. 5 shows a schematic representation of an environment sensor and of three objects, two of which are partially covered for the environment sensor;
- FIGS. 6a to 6d show occlusion lists, angle lists and areas with angles for different evaluation steps for determining the occlusion of the objects according to FIG. 5.
- Fig. 1 shows a schematic representation of a vehicle 1, which is presently designed as a passenger car, in a top view.
- the vehicle 1 includes a sensor system 2, by means of which objects Ob1, Ob2, Ob3 in an environment 5 of the vehicle 1 can be detected.
- an object Ob1 in the area 5 of the vehicle 1 is shown as an example.
- the sensor system 2 includes surroundings sensors 4, 4', with which measured values or sensor data which describe the object Ob1 in the environment 5 can be provided.
- the sensor system 2 includes a first environment sensor 4, which is designed as a radar sensor, and a second environment sensor 4', which is designed as a camera.
- the sensor system 2 includes a computing device 3, which can be formed, for example, by an electronic control unit.
- the sensor data which are provided by the environment sensors 4, 4', are transmitted to the computing device 3 and evaluated accordingly to identify the object Ob1.
- a corresponding computer program can be run on the computing device 3 for this purpose.
- FIGS. 2a to 2c show schematic representations of the vehicle 1, which is in an inner-city traffic situation. FIGS. 2a to 2c describe successive time steps.
- the vehicle 1 follows a first object Ob1 in the form of another vehicle.
- the vehicle 1 and the first object Ob1 are moving toward an intersection 6.
- a second object Ob2 in the form of another vehicle is moving towards this intersection 6 from the right.
- the second object Ob2 at the intersection 6 has the right of way.
- the second object Ob2 can be completely detected with the surroundings sensors 4, 4' of the vehicle 1 if the second object Ob2 is in the detection ranges of the respective surroundings sensors 4, 4'. In this way, the second object Ob2 can be tracked using a tracking algorithm.
- FIG. 2b shows the traffic situation at a later point in time.
- the first object Ob1 turns to the right, so that the first object Ob1 partially covers the second object Ob2 for the surroundings sensors 4, 4' of the vehicle 1.
- FIG. 2c shows the traffic situation at a later point in time.
- the first object Ob1 completely covers the second object Ob2 for the surroundings sensors 4, 4' of the vehicle 1. If the concealment is not taken into account, the surroundings sensors 4, 4' would not supply any new information about the second object Ob2. This would result in the tracking algorithm reducing the probability of the second object Ob2 existing, since the second object Ob2 is located directly in front of the vehicle 1 in the detection range of the surroundings sensors 4, 4' but is not detected. After a certain time, the second object Ob2 could also be deleted in the tracking algorithm.
- the probability of existence is usually determined using an assumption for the detection probability of the respective surroundings sensor 4, 4'. For example, it can be assumed that the camera is designed to see another vehicle at a distance of 15 m with a probability of 99% and a certain false-positive rate. If the second object Ob2 is recognized again after the occlusion, it would be recognized as a new object after a certain time.
- the concealment of objects Ob1, Ob2, Ob3 in the surroundings 5 of the vehicle 1 is to be determined.
- the probability of detection could be significantly reduced.
- the camera detects the half-covered object Ob2 with a probability of 30% and the radar sensor has a detection probability of 70%, for example. If the second object Ob2 is completely occluded, as shown in FIG. 2c, the detection probabilities are assumed to be in the range of 0% and the probability of existence is thus not reduced. In this way, the detected object Ob2 is not deleted in the tracking algorithm. This results in the advantage that the presence of the second object Ob2 is known and a braking maneuver can therefore be initiated by the vehicle 1 or a driver assistance system of the vehicle 1, for example. As soon as the second object Ob2 is no longer covered, it can be assigned to the current measurement by the surroundings sensors 4, 4'.
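- one simple way to realize this numerically is to scale the nominal detection probability of the sensor by the currently visible fraction of the object before updating the probability of existence. The sketch below is an illustrative heuristic under this assumption, not the exact update rule of the publication:

```python
def updated_existence(p_exist, detected, p_detect_nominal, visible_fraction):
    """Existence update with an occlusion-aware detection probability.
    A fully occluded object (visible_fraction == 0.0) has an effective
    detection probability of 0, so a missed detection does not reduce
    its probability of existence and the track is not deleted."""
    p_detect = p_detect_nominal * visible_fraction
    if detected:
        return p_exist + (1.0 - p_exist) * p_detect
    return p_exist * (1.0 - p_detect)

# half covered: effective detection probability is reduced accordingly;
# completely covered: p_detect == 0.0 and p_exist stays unchanged
```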
- FIG. 3 shows a schematic flow chart of a method for detecting objects Ob1, Ob2, Ob3 in the surroundings 5 of the vehicle 1.
- the method is explained by way of example for one surroundings sensor 4, 4', but can be carried out for all surroundings sensors 4, 4' of the vehicle 1.
- in a step S1, the sensor data are provided with the surroundings sensors 4, 4'. In this case, measuring cycles that follow one another in time are carried out with the surroundings sensor 4, 4'.
- the sensor data of the surroundings sensor 4, 4' are transmitted to the computing device 3 and in a step S2 the respective positions of so-called tracks at the time of measurement of the surroundings sensor 4, 4' are predicted by means of the tracking algorithm. These tracks describe the respective objects Ob1, Ob2, Ob3 in the surroundings 5.
- in a step S3, the tracks are then transformed into the detection area of the surroundings sensor 4, 4' or into the sensor coordinate system. A transformation into polar coordinates then takes place in a step S4.
- in a step S5, it is then checked whether the tracks or objects Ob1, Ob2, Ob3 are located in the detection range of the surroundings sensor 4, 4'.
- in a step S6, the covering of the respective objects Ob1, Ob2, Ob3 is determined.
- in a step S7, an assignment and an update take place on the basis of the sensor data.
- in a step S8, the existence probability for the objects Ob1, Ob2, Ob3 is updated. This is done based on the results from steps S5 and S6.
- in a step S9, a list of the detected tracks or objects Ob1, Ob2, Ob3 is created and updated in a step S10. The update can be carried out in each measuring cycle. The relevant steps of the method are explained in more detail below.
- FIG. 4 shows a schematic representation of surroundings sensor 4 and an object Ob1.
- the detected object Ob1 is transformed into polar coordinates after the transformation into the sensor coordinate system.
- the object Ob1 is assumed to be a rectangle or a two-dimensional box.
- the object Ob1 or the corners CR, CL of the object Ob1 can each be described using an angle θ and a radius r.
- the angle θ corresponds to the azimuth angle. It is assumed that the individual objects Ob1, Ob2, Ob3 or boxes do not overlap.
- each object Ob1, Ob2, Ob3 with four corners can be reduced to the two corners CR, CL and stored with the associated identifier.
- the coordinates of the corner CR at the right edge and/or the corner CL at the left edge can be stored.
- the objects Ob1, Ob2, Ob3 are identified, which are in the detection range of the surroundings sensor 4, 4'. It can be checked whether the two outer corners CR, CL of the respective objects Ob1, Ob2, Ob3 are in the detection area.
- the detection area describes the area in the surroundings 5 of the vehicle 1 in which the surroundings sensor 4, 4' can detect objects Ob1, Ob2, Ob3.
- the detection range can be defined by a maximum radius and an angular range that extends from -θS to +θS. First, it can be checked whether the distance or the radius r of one of the corners CR, CL is larger than the maximum radius of the detection area. If this is the case, it can be assumed that the object Ob1, Ob2, Ob3 is not in the detection area and is not visible.
- the proportion of the object within the detection area can be determined from the quotient (θL − θR)/Δθ.
- the new angles at the edges, θL and θR, can be stored. On the basis of these angles, it can first be checked whether the object Ob1, Ob2, Ob3 is in the detection area. For example, if a corner CR, CL of the object Ob1, Ob2, Ob3 is in the detection area, it can be assumed that the object Ob1, Ob2, Ob3 is in the detection area.
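- expressed as a small sketch (angles in degrees, symmetric detection area from -θS to +θS with maximum radius r_max; the function name and signature are assumptions, with Δθ taken as the full angular extent of the object):

```python
def fov_fraction(theta_r, theta_l, r_r, r_l, theta_s, r_max):
    """Proportion of an object's angular extent inside the detection area."""
    if r_r > r_max or r_l > r_max:          # a corner beyond the maximum radius:
        return 0.0                          # object assumed not to be visible
    d_theta = theta_l - theta_r             # full angular extent of the object
    if d_theta <= 0.0:
        return 0.0
    clipped_r = max(theta_r, -theta_s)      # new edge angles, clipped to the FOV
    clipped_l = min(theta_l, theta_s)
    return max(0.0, clipped_l - clipped_r) / d_theta
```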
- the respective outer corners CR, CL of the objects Ob1, Ob2, Ob3 are extracted and then sorted in the azimuth direction. Thereafter, the corners CR, CL are analyzed sequentially or in the azimuth direction.
- the right corner CR is assigned to the beginning of the object Ob1, Ob2, Ob3 and is visible
- the right corner CR is assigned to the beginning of the object Ob1, Ob2, Ob3 and is hidden
- the left corner CL is assigned to the end of the object Ob1, Ob2, Ob3 and is visible, or the left corner CL is assigned to the end of the object Ob1, Ob2, Ob3 and is hidden.
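- these four cases can be made explicit in code, for example (illustrative sketch, names are assumptions):

```python
from enum import Enum

class CornerCase(Enum):
    BEGIN_VISIBLE = "right corner, object begins, visible"
    BEGIN_HIDDEN = "right corner, object begins, hidden"
    END_VISIBLE = "left corner, object ends, visible"
    END_HIDDEN = "left corner, object ends, hidden"

def classify_corner(is_right_corner, is_covered):
    if is_right_corner:
        return CornerCase.BEGIN_HIDDEN if is_covered else CornerCase.BEGIN_VISIBLE
    return CornerCase.END_HIDDEN if is_covered else CornerCase.END_VISIBLE
```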
- a list can be provided in which the currently hidden objects Ob1, Ob2, Ob3 are listed.
- This list is referred to as the occlusion list 7 below.
- the order in this occlusion list 7 can indicate the order in which the objects Ob1, Ob2, Ob3 are arranged.
- the first object Ob1, Ob2, Ob3 in the occlusion list 7 can have the shortest distance to the vehicle 1 or the surroundings sensor 4, 4', and the next object Ob1, Ob2, Ob3 in the list can, viewed from the surroundings sensor 4, 4', be arranged behind the first object Ob1, Ob2, Ob3 in the occlusion list 7.
- a list can be provided in which the respective visible angle or azimuth angle of the objects is described. This list is referred to as angle list 8 below.
- the corner CR of the object Ob1, Ob2, Ob3 on the extreme right side is considered first.
- a check is made as to whether this corner CR is covered or whether another object Ob1, Ob2, Ob3 is located in front of this corner CR in relation to the surroundings sensor 4, 4'.
- this corner CR is compared with the objects Ob1, Ob2, Ob3 in the occlusion list.
- the radii or distances of the corners CR, CL can be compared with each other here.
- the scalar product of the corners can be determined in order to check whether there is a positive or negative angle between the currently visible corner and the newly defined extreme corner CR at the right edge.
- the object Ob1, Ob2, Ob3 assigned to the corner CR is included at the top of the occlusion list 7.
- the visible azimuth angle is determined. This is done on the basis of the angle θR of the object Ob1, Ob2, Ob3 newly added to the occlusion list 7 and the angle at which the last change took place.
- the determined visible azimuth angle is entered into the angle list 8 and the angle with the last change is updated. If the rightmost corner CR is not visible, it is compared with the next objects Ob1, Ob2, Ob3 in the occlusion list 7 as to whether it is visible. This is carried out until the corner is recognized as visible.
- the object Ob1, Ob2, Ob3 is then sorted into the occlusion list 7 at the correct location.
- the occlusion of the leftmost corner CL is examined. If the corner CL is visible or if the corner CL is in the first position in the occlusion list 7, the visible azimuth angle in the angle list 8 is updated.
- the corner CL is deleted from the current occlusion list 7 and the azimuth angle of the last change is updated. If the corner CL is not visible, the object Ob1, Ob2, Ob3 is deleted from the current occlusion list 7.
- the visible range of the object Ob1, Ob2, Ob3 is determined based on the visible azimuth angle and the previously calculated proportion of the object Ob1, Ob2, Ob3 in the detection range. For example, if half of the object Ob1, Ob2, Ob3 is in the detection area, the proportion of the object Ob1, Ob2, Ob3 in the detection area is 50%. If 60% of this portion is not covered in the detection area, a total of 30% of the object Ob1, Ob2, Ob3 is visible to the surroundings sensor 4, 4'. The procedure is illustrated below using an example.
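- before turning to that example, the combination of the two proportions just mentioned can be written out directly:

```python
fov_part = 0.5           # half of the object lies in the detection area
unoccluded_part = 0.6    # 60% of that part is not covered by other objects
visible_total = fov_part * unoccluded_part
print(visible_total)     # 0.3, i.e. 30% of the object is visible to the sensor
```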
- in the example, there are three objects Ob1, Ob2, Ob3 in the surroundings 5 of the vehicle 1. These objects Ob1, Ob2, Ob3 are shown in FIG. 5 by way of example. For the sake of clarity, only the surroundings sensor 4 of the vehicle 1 is shown.
- the third object Ob3 extends in an angular range between 0° and 50°. In simplified terms, it is assumed that the distance between the surroundings sensor 4 and the corners CR, CL of the third object Ob3 is 20 m in each case.
- the second object Ob2, which is partially located in front of the third object Ob3, extends in an angular range between 20° and 40°. In simplified terms, it is assumed that the distance between the surroundings sensor 4 and the corners CR, CL of the second object Ob2 is 15 m.
- the first object Ob1 which is partially in front of the second object Ob2 and partially in front of the third object Ob3, extends in an angular range between 10° and 30°.
- the distance between the surroundings sensor 4 and the corners CR, CL of the first object Ob1 is 10 m in each case.
- the respective corners CR, CL of the objects Ob1, Ob2, Ob3 are first sorted in the azimuth direction. First, the rightmost corner CR of the third object Ob3, which is associated with the angle of 0°, is considered. This corner CR is entered in the occlusion list 7 at the first position. Thereafter, the rightmost corner CR of the first object Ob1, which is associated with the angle of 10°, is considered. This corner CR is identified as a new visible corner and entered in the first position in the occlusion list 7. The corner of the third object Ob3 is moved to the second position in the occlusion list 7. In the angle list 8 with the visible azimuth angles, each object Ob1, Ob2, Ob3 is initially assigned the angle of 0°.
- the visible azimuth angle of the third object Ob3 is calculated to 10° and updated in the angle list 8. Furthermore, the last azimuth angle θL, or the angle at which the last change took place, is updated from 0° to 10°.
- FIG. 6a shows the occlusion list 7, the angle list 8 and an area 9 for the last azimuth angle θL.
- the changes in lists 7, 8, 9 are highlighted by underlining.
- Figure 6b shows the occlusion list 7, the angle list 8 and the area 9 for a subsequent step.
- the rightmost corner CR of the second object Ob2, which is associated with the angle of 20°, is examined next. This corner CR is covered by the first object Ob1, so the second object Ob2 is sorted into the occlusion list 7 behind the first object Ob1.
- thereafter, the leftmost corner CL of the first object Ob1, which is associated with the angle of 30°, is examined.
- the visible angle of 20° of the first object Ob1 is then calculated from the difference between this angle of 30° and the last azimuth angle θL of 10° and updated in the angle list 8.
- the first object Ob1 is deleted from the occlusion list 7.
- the last azimuth angle θL is updated from 10° to 30°.
- Figure 6c shows the occlusion list 7, the angle list 8 and the area 9 for a subsequent step.
- the leftmost corner CL of the second object Ob2, which is associated with the angle of 40°, is examined.
- the visible angle of 10° of the second object Ob2 is then calculated from the difference between this angle of 40° and the last azimuth angle θL of 30° and updated in the angle list 8.
- the second object Ob2 is deleted from the occlusion list 7.
- the last azimuth angle θL is updated from 30° to 40°.
- Figure 6d shows the occlusion list 7, the angle list 8 and the area 9 for a subsequent step.
- the leftmost corner CL of the third object Ob3, which is associated with the angle of 50°, is examined.
- the visible angle of 10° of the third object Ob3 is then calculated from the difference between this angle of 50° and the last azimuth angle θL of 40° and updated in the angle list 8.
- the third object Ob3 is deleted from the occlusion list 7.
- the last azimuth angle θL is updated from 40° to 50° and the procedure ends.
- the result of the method is that 20° of the first object Ob1 is visible in the angular range between 10° and 30°. Consequently, 100% of the first object Ob1 is visible. 10° of the second object Ob2 is visible in the angle range between 30° and 40°. Consequently, 50% of the second object Ob2 is visible. 20° of the third object Ob3 is visible in the angle ranges from 0° to 10° and from 40° to 50°. This means that 40% of the third object Ob3 is visible.
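- fed into the sweep sketch shown after the description of the angle list above (the assumed Track class and visible_angles function), the three objects reproduce exactly these numbers:

```python
tracks = [
    Track("Ob1", 10.0, 30.0, 10.0),   # first object: 10°..30°, 10 m
    Track("Ob2", 20.0, 40.0, 15.0),   # second object: 20°..40°, 15 m
    Track("Ob3",  0.0, 50.0, 20.0),   # third object: 0°..50°, 20 m
]
vis = visible_angles(tracks)
for t in tracks:
    extent = t.theta_l - t.theta_r
    print(t.obj_id, vis[t.obj_id], round(100.0 * vis[t.obj_id] / extent))
# Ob1 20.0 100, Ob2 10.0 50, Ob3 20.0 40
```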
- the method can also be used for environment sensors with a detection range of 360°.
- Surroundings sensors of this type, which are in the form of radar sensors or lidar sensors, can be arranged on the roof of the vehicle 1, for example.
- the angle can be defined, for example, from 0° to 360° or from -180° to 180°.
- Problems can occur with objects Ob1, Ob2, Ob3 that extend beyond the defined angular limit of 180° to -180° or from 360° to 0°. Errors can occur here in the previously described sorting of the angles in the angular direction. It is therefore intended for objects Ob1, Ob2, Ob3 which extend beyond the angle limit that these are divided into two sub-objects. The division takes place at the angle limit.
- in this case, a total of four outer corners are assigned to such a real object Ob1, Ob2, Ob3, but the two sub-objects receive the same identifier.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Radar Systems Or Details Thereof (AREA)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202280014141.1A CN116848432A (zh) | 2021-02-19 | 2022-02-17 | 通过确定物体遮挡以检测车辆环境中的物体的方法、计算装置和传感器系统 |
US18/277,805 US20240044649A1 (en) | 2021-02-19 | 2022-02-17 | Method for Detecting Objects in the Surroundings of a Vehicle by Determining Coverage of the Objects, Computing Device, and Sensor System |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102021104015.9A DE102021104015A1 (de) | 2021-02-19 | 2021-02-19 | Verfahren zum Erfassen von Objekten in einer Umgebung eines Fahrzeugs mit Bestimmung einer Verdeckung der Objekte, Recheneinrichtung sowie Sensorsystem |
DE102021104015.9 | 2021-02-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022175405A1 true WO2022175405A1 (de) | 2022-08-25 |
Family
ID=80785170
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2022/053993 WO2022175405A1 (de) | 2021-02-19 | 2022-02-17 | Verfahren zum erfassen von objekten in einer umgebung eines fahrzeugs mit bestimmung einer verdeckung der objekte, recheneinrichtung sowie sensorsystem |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240044649A1 (de) |
CN (1) | CN116848432A (de) |
DE (1) | DE102021104015A1 (de) |
WO (1) | WO2022175405A1 (de) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10148064A1 (de) * | 2001-09-28 | 2003-04-10 | Ibeo Automobile Sensor Gmbh | Verfahren zur Erkennung und Verfolgung von Objekten |
EP1995692A2 (de) * | 2004-04-19 | 2008-11-26 | IBEO Automobile Sensor GmbH | Verfahren zur Erkennung und Verfolgung von Objekten |
EP3432032A1 (de) * | 2017-07-19 | 2019-01-23 | Aptiv Technologies Limited | Automatisiertes fahrzeug-lidar-verfolgungssystem für verdeckte objekte |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10148068A1 (de) | 2001-09-28 | 2003-04-10 | Ibeo Automobile Sensor Gmbh | Verfahren zur Erkennung und Verfolgung von Objekten |
DE102006057277A1 (de) | 2006-12-05 | 2008-06-12 | Robert Bosch Gmbh | Verfahren zum Betrieb eines Radarsystems bei möglicher Zielobjektverdeckung sowie Radarsystem zur Durchführung des Verfahrens |
DE102015207318B4 (de) | 2015-04-22 | 2021-07-22 | Robert Bosch Gmbh | Verfahren und Vorrichtung zur Verdeckungserkennung für stationäre Radarsysteme |
- 2021-02-19 DE DE102021104015.9A patent/DE102021104015A1/de active Pending
- 2022-02-17 US US18/277,805 patent/US20240044649A1/en active Pending
- 2022-02-17 WO PCT/EP2022/053993 patent/WO2022175405A1/de active Application Filing
- 2022-02-17 CN CN202280014141.1A patent/CN116848432A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
DE102021104015A1 (de) | 2022-08-25 |
US20240044649A1 (en) | 2024-02-08 |
CN116848432A (zh) | 2023-10-03 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22710969 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202280014141.1 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18277805 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 22710969 Country of ref document: EP Kind code of ref document: A1 |