WO2018224610A1 - Method for detecting objects in a surrounding area of a motor vehicle, lidar sensor device, driver assistance system and motor vehicle - Google Patents
Method for detecting objects in a surrounding area of a motor vehicle, lidar sensor device, driver assistance system and motor vehicle
- Publication number
- WO2018224610A1 (PCT/EP2018/065072, EP2018065072W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- detection points
- cluster
- clusters
- determined
- detection
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/42—Simultaneous measurement of distance and other co-ordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/487—Extracting wanted echo signals, e.g. pulse detection
- G01S7/4873—Extracting wanted echo signals, e.g. pulse detection by deriving and controlling a threshold value
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/4865—Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
Definitions
- the invention relates to a method for detecting objects in a surrounding area of a motor vehicle.
- the invention also relates to a lidar sensor device, a driver assistance system and a motor vehicle.
- lidar sensor devices, in short lidar sensors, are known which can be arranged, for example, on a motor vehicle in order to monitor a surrounding area of the motor vehicle.
- lidar sensors can be used for object recognition by emitting a transmission signal in the form of a light pulse and receiving again, as a received signal, the light pulse reflected at an object in the surrounding area, for example a wall or another motor vehicle.
- the filtered detection points are grouped into clusters and the objects are detected based on the clusters.
- information about the object, for example the position of the object, can be made available to a driver assistance system of the motor vehicle which, for example based on the position information, can initiate a measure for avoiding a collision of the motor vehicle with the object.
- a collision avoiding measure may be, for example, the output of a warning signal and / or the automatic braking of the motor vehicle.
- the information about the object is updated on the basis of a multiplicity of reception signals recorded in chronological succession.
- the objects are therefore followed or "tracked". For this purpose, the objects must be reliably detected over time, even if the motor vehicle and/or the objects have moved; this means that detection points which correspond to reflection points of the same object in the plurality of received signals must be reliably identified and assigned to one another.
- if the lidar sensor device is, for example, a cost-effective ("low-cost") lidar sensor device which has a low angular resolution due to its design, this may result in the disadvantage that the object and its position cannot be reliably detected.
- it is an object of the invention to provide a solution by which the reliability of lidar sensor devices can be improved and positions of objects in a surrounding area of a motor vehicle can be detected particularly accurately.
- in the method according to the invention for detecting objects in a surrounding area of a motor vehicle by means of at least one received signal reflected in the surrounding area and received by a lidar sensor device, detection points corresponding to reflection points in the surrounding area are identified in the received signal, respective intensity values of the detection points are determined on the basis of the received signal, the detection points are filtered by comparing the intensity values with a distance-based intensity threshold value, the filtered detection points are grouped into clusters, and the objects are recognized based on the clusters.
- in addition, the respective detection points within a cluster are filtered by comparing the intensity values of the detection points within the cluster with a cluster-specific, relative intensity threshold value.
- the cluster-specific, relative intensity threshold can be determined in dependence on the intensity values of the detection points of the cluster.
- in the received signal, detection points corresponding to reflection points in the surrounding area are thus identified, respective intensity values of the detection points are determined based on the received signal, and the cluster-specific, relative intensity threshold value is determined in dependence on the intensity values of the detection points of the cluster.
- the lidar sensor device (lidar: "light detection and ranging") can in particular detect objects or features in the surrounding area and a respective spatial position of the objects relative to the lidar sensor device and thus to the motor vehicle.
- the lidar sensor device can emit transmission signals in the form of light pulses and receive the reflected light pulses in the surrounding area as received signals again.
- the lidar sensor device is designed in particular as a flash lidar sensor device, in which at least one transmission signal illuminating the surrounding region is emitted during a measuring cycle.
- the transmission signal light is thus emitted in a plurality of transmission directions, wherein the reception signal can be received during the measurement cycle from a plurality of reception directions or reflection directions.
- the transmission signal and the received signal are, in particular, cone-shaped light beams which each have a horizontal total opening angle, for example 60°, and a vertical total opening angle, for example 10°. Because the vertical total opening angle, which limits a detection range of the lidar sensor device along a vertical direction of the vehicle, is smaller than the horizontal total opening angle, ground echoes can be at least reduced.
- Such ground echoes are detection points which correspond to reflection points on a roadway of the motor vehicle.
- the received signal can be received in several light beams, for example in sixteen light beams, each light beam being a part of the received signal reflected from the surrounding area.
- the lidar sensor device, which is embodied, for example, as a solid-state lidar sensor device, can have a receiving device with several receiving elements, for example with sixteen receiving elements. The receiving elements may, for example, have photodetectors in the form of light-sensitive semiconductor components, such as photodiodes.
- the receiving elements can be arranged, for example, in a matrix, that is, column by column and/or row by row. Each receiving element can be assigned a reflection direction or a reflection angle. In particular, a receiving element receives only light beams which fall onto the receiving device from the reflection direction assigned to that receiving element.
- a light beam describes a profile of the intensity of the reflected light, i.e. the distribution of the intensity values over time. To identify the detection points, this profile can be searched for intensity peaks, which can be identified as local maxima of the intensity in the profile. The intensity peaks represent the detection points, so-called echoes, which correspond to reflection points in the surrounding area.
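For illustration, the peak search over a sampled intensity profile could look like the following minimal sketch; the profile values and the minimum peak height are hypothetical and serve only to show the idea.

```python
import numpy as np

def find_intensity_peaks(profile, min_height):
    """Return indices of local maxima in a sampled intensity profile whose
    value reaches min_height (simple echo / peak detection)."""
    peaks = []
    for i in range(1, len(profile) - 1):
        if (profile[i] >= min_height
                and profile[i] > profile[i - 1]
                and profile[i] >= profile[i + 1]):
            peaks.append(i)
    return peaks

# hypothetical sampled intensity profile of one light beam (arbitrary units)
profile = np.array([0.1, 0.2, 2.5, 0.4, 0.3, 1.1, 0.2, 0.1])
print(find_intensity_peaks(profile, min_height=0.5))  # -> [2, 5]
```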
- a reflection point may for example be an object point on a surface of an object.
- the value of the intensity is dependent on properties of the reflective object, for example its reflectivity, as well as its distance from the lidar sensor device.
- position values for the detection points are determined on the basis of the received signal in addition to the intensity values.
- the positions of the reflection points corresponding to the detection points are determined.
- a distance value and a reflection angle are respectively determined for the detection points.
- the distance value for a detection point can be determined by a time-of-flight measurement based on a time or timestamp assigned to the detection point within the intensity profile. Since, in particular, each receiving element is assigned a reflection direction, i.e. each receiving element corresponds to a specific reflection angle, the reflection angle for the detection point can be determined on the basis of the receiving element in addition to the distance value. Based on the distance value and the reflection angle, the spatial position of the reflection point corresponding to each detection point can therefore be determined.
- the surrounding area can be imaged based on detection points.
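A minimal sketch of how a distance value and a position could be derived from an echo timestamp and the reflection angle assigned to a receiving element; the round-trip convention and the example values are assumptions for illustration only.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def detection_position(echo_time_s, reflection_angle_rad):
    """Convert a round-trip echo time and the reflection angle assigned to a
    receiving element into a distance and a 2D position in the sensor frame."""
    distance = 0.5 * C * echo_time_s          # round trip -> one-way distance
    x = distance * math.cos(reflection_angle_rad)
    y = distance * math.sin(reflection_angle_rad)
    return distance, (x, y)

# hypothetical echo received 100 ns after emission at a 10 degree reflection angle
d, pos = detection_position(100e-9, math.radians(10.0))
print(round(d, 2), [round(v, 2) for v in pos])  # ~14.99 m from the sensor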
- the detection points are first filtered based on their absolute intensity.
- the intensity value of at least some detection points, in particular of each detection point can be compared with the predetermined distance-based intensity threshold value.
- the distance-based intensity threshold value is thus an intensity threshold which varies as a function of distance and which is constant within a specific distance range. All detection points within a specific distance range are thus compared with the associated intensity threshold and thereby filtered within that distance range based on their absolute intensity. All detection points whose intensity value lies below the distance-based intensity threshold are filtered out or removed.
- the detection points are additionally filtered by comparing the position values with an absolute position threshold.
- the detection points are thus filtered based on their absolute position.
- all detection points are filtered out or removed whose position values exceed the absolute position threshold value.
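As an illustration, a detection point could additionally be discarded when its position lies outside a predefined detection region; the region limits below are hypothetical.

```python
def filter_by_absolute_position(points, max_x=50.0, max_abs_y=10.0):
    """Keep only detection points (x, y) that lie inside a predefined
    detection region; points beyond the absolute position thresholds
    are removed. The threshold values are illustrative."""
    return [(x, y) for (x, y) in points if x <= max_x and abs(y) <= max_abs_y]

points = [(12.0, 1.5), (60.0, 0.0), (20.0, -12.0)]
print(filter_by_absolute_position(points))  # -> [(12.0, 1.5)]
```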
- the distance-based intensity threshold value and / or the absolute position threshold value are in this case in particular predetermined in such a way that interference signals, for example sensor noise or undesired detections, can be suppressed.
- in particular, in addition to interference signals, for example sensor noise or undesired detections, those detection points are filtered out which correspond to unwanted reflections in the surrounding area. Such unwanted reflections can be caused, for example, by splashing water or water vapor.
- a minimum intensity value which the detection points require in order not to be filtered out can be described by a curve in which the distance-based intensity threshold decreases with increasing distance of a detection point. In other words, the smaller the distance, the higher the distance-based intensity threshold: the closer a detection point lies, the higher its intensity value must be so as not to be filtered out.
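The distance-based filtering could be sketched as follows, assuming a piecewise-constant threshold per distance range; the bin limits and threshold values are hypothetical and would in practice be derived from the sensor's noise behaviour.

```python
def distance_based_threshold(distance_m):
    """Piecewise-constant intensity threshold that decreases with distance
    (hypothetical values, arbitrary intensity units)."""
    if distance_m < 10.0:
        return 0.80
    if distance_m < 30.0:
        return 0.40
    return 0.15

def filter_by_absolute_intensity(points):
    """Keep only detection points whose intensity reaches the threshold
    associated with their distance range. Each point is (distance, intensity)."""
    return [p for p in points if p[1] >= distance_based_threshold(p[0])]

points = [(5.0, 0.9), (5.0, 0.3), (20.0, 0.5), (40.0, 0.1)]
print(filter_by_absolute_intensity(points))  # -> [(5.0, 0.9), (20.0, 0.5)]
```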
- the detection points are grouped into clusters or point clouds.
- a collection of detection points whose associated reflection points originate from the same object or feature in the surrounding area is interpreted as a cluster.
- each cluster represents an object or a feature, for example a partial area of an object, in the surrounding area.
- the detection points can be grouped into the clusters by density-based spatial cluster analysis, in particular by DBSCAN (Density-Based Spatial Clustering of Applications with Noise), in which a group of detection points in an immediate neighborhood with a high density of detection points is identified as a cluster. The neighborhood is determined by the proximity of several detection points to another detection point, and the density is determined by the number of detection points within the neighborhood. If detection points lie correspondingly close to one another, they form a neighborhood, and if the number of detection points within a neighborhood is large, for example exceeds a predetermined threshold for the number, these detection points are considered to represent a cluster.
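A compact sketch of such density-based grouping; the neighbourhood radius eps and the minimum point count are hypothetical parameters. In practice a library implementation (for example scikit-learn's DBSCAN) could be used instead; the plain-Python version below only illustrates the principle.

```python
import math

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN sketch: groups 2D points into clusters of indices.
    A point with at least min_pts neighbours within radius eps is a core
    point; clusters are grown from core points, leftover points are noise."""
    def neighbours(i):
        return [j for j, q in enumerate(points) if math.dist(points[i], q) <= eps]

    UNVISITED, NOISE = -2, -1
    labels = [UNVISITED] * len(points)
    clusters = []

    for i in range(len(points)):
        if labels[i] != UNVISITED:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = NOISE
            continue
        cid = len(clusters)
        clusters.append([i])
        labels[i] = cid
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == NOISE:          # border point reachable from a core point
                labels[j] = cid
                clusters[cid].append(j)
            if labels[j] != UNVISITED:
                continue
            labels[j] = cid
            clusters[cid].append(j)
            if len(neighbours(j)) >= min_pts:   # j is itself a core point: expand
                seeds.extend(neighbours(j))
    return clusters

# hypothetical detection point positions (x, y) in metres
pts = [(0.0, 0.0), (0.3, 0.1), (0.2, 0.4),
       (5.0, 5.0), (5.2, 5.1), (5.1, 4.8),
       (9.0, 0.0)]
print(dbscan(pts, eps=0.6, min_pts=3))  # two clusters, the isolated point is noise
```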
- the detection points within a cluster, the cluster candidates, are then filtered based on their relative intensity. For this purpose, the intensity values of the individual cluster candidates are compared with the cluster-specific, relative intensity threshold value.
- the respective cluster-specific, relative intensity threshold value is determined as a function of the intensity values of the detection points of the associated cluster. For example, the mean value of the intensity values of the detection points within a cluster can be determined, and the cluster-specific, relative intensity threshold value can be specified as this mean value.
- the mean value can be weighted with a configurable parameter, in particular between 0 and 1, and the weighted mean value can be specified as the cluster-specific, relative intensity threshold value.
- the cluster-specific, relative intensity threshold value is thus determined from the average intensity of at least some, in particular all, detection points within the cluster. Each detection point within the cluster is then, in particular, compared with the mean or weighted mean intensity and is filtered out if its intensity value lies below the mean value or the weighted mean value.
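The relative, cluster-specific filtering could be sketched as follows; the weighting parameter and the example intensities are hypothetical.

```python
def filter_cluster_by_relative_intensity(intensities, weight=0.8):
    """Remove cluster members whose intensity lies below the cluster-specific,
    relative threshold, here the cluster mean intensity scaled by a
    configurable parameter between 0 and 1. Returns surviving indices."""
    threshold = weight * (sum(intensities) / len(intensities))
    return [i for i, v in enumerate(intensities) if v >= threshold]

# hypothetical intensities of the detection points of one cluster
cluster_intensities = [0.9, 0.85, 0.8, 0.2, 0.75]
print(filter_cluster_by_relative_intensity(cluster_intensities))
# mean = 0.7, threshold = 0.56 -> index 3 (intensity 0.2) is filtered out
```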
- the cluster is searched for at least one detection-point-free region which forms a gap between the detection points of the cluster that is greater than a predetermined threshold value, and the cluster is divided into at least two separate clusters if at least one detection-point-free region is identified.
- this embodiment is based on the finding that, by removing detection points from the cluster whose intensity values fall below the cluster-specific, relative intensity threshold value, gaps can arise within a cluster. When a gap caused by the filtering based on the relative intensity exceeds the predetermined threshold value for the gap, it separates at least two neighborhoods from one another, and the cluster is split along the gap into two separate clusters.
- for this purpose, in particular, distance values of adjacent detection points within a cluster relative to one another are determined, and the at least one detection-point-free region is identified if at least one distance value exceeds a predetermined distance threshold value. If the distance value between two adjacent detection points exceeds the predetermined distance threshold value, it can be assumed that the two adjacent detection points correspond to reflection points of different objects. Filtering based on the relative intensity and identifying the detection-point-free region can prevent multiple objects from being erroneously recognized as a single object. Thus, even objects that lie particularly close to one another in the surrounding area can be reliably detected.
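A sketch of the splitting step, under the assumption that the detection points of a cluster can be ordered (for example by scan angle) so that a gap shows up as a large spacing between consecutive points; the distance threshold is illustrative.

```python
import math

def split_cluster_on_gaps(points, gap_threshold):
    """Split an ordered cluster wherever two consecutive detection points are
    farther apart than gap_threshold; returns a list of sub-clusters."""
    if not points:
        return []
    sub_clusters = [[points[0]]]
    for prev, cur in zip(points, points[1:]):
        if math.dist(prev, cur) > gap_threshold:
            sub_clusters.append([])          # detection-point-free region found
        sub_clusters[-1].append(cur)
    return sub_clusters

# hypothetical cluster after relative-intensity filtering (ordered by angle)
cluster = [(0.0, 10.0), (0.2, 10.1), (0.4, 10.0), (2.5, 10.2), (2.7, 10.1)]
print(split_cluster_on_gaps(cluster, gap_threshold=1.0))
# -> two separate clusters, split at the gap between x=0.4 and x=2.5
```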
- centers and sizes of the respective clusters are determined. Based on the sizes and on the centers of the clusters, the corresponding objects in the surrounding area are determined such that the objects lie within the corresponding clusters.
- the size of the corresponding object is derived from the size and the center of a cluster.
- the size and in particular also the shape of the object are determined in such a way that the object lies within the cluster.
- the size and in particular also the shape of the object are determined in such a way that boundaries of the cluster surround or enclose the object.
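One simple way to realize this, shown here as an assumption-laden sketch, is to take the axis-aligned bounding box of the cluster's detection points as the region within which the object lies; the coordinates are hypothetical.

```python
def cluster_bounding_box(points):
    """Axis-aligned bounding box (min_x, min_y, max_x, max_y) of a cluster;
    its width and height serve as a simple estimate of the object extent."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

cluster = [(4.0, 1.0), (4.6, 1.2), (5.2, 0.9), (5.0, 1.4)]
x0, y0, x1, y1 = cluster_bounding_box(cluster)
print((round(x1 - x0, 2), round(y1 - y0, 2)))  # object extent, here (1.2, 0.5)
```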
- position values of the detection points within a cluster are determined, the position values of the detection points are weighted with the respective intensity values and the center point of a cluster is determined on the basis of the weighted position values of the cluster.
- the midpoint may be determined as the arithmetic mean of the weighted position values.
- the center of the object can be determined very accurately.
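The intensity-weighted center computation could be sketched as follows; the coordinates and intensity values are hypothetical.

```python
def weighted_cluster_center(positions, intensities):
    """Center of a cluster as the intensity-weighted mean of its detection
    point positions (stronger echoes pull the center towards them)."""
    total = sum(intensities)
    x = sum(w * p[0] for p, w in zip(positions, intensities)) / total
    y = sum(w * p[1] for p, w in zip(positions, intensities)) / total
    return (x, y)

positions = [(1.0, 4.0), (2.0, 4.2), (3.0, 4.1)]
intensities = [0.2, 0.2, 0.6]
print(weighted_cluster_center(positions, intensities))  # -> (2.4, 4.1)
```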
- the accuracy of the object detection at a fixed hardware-dependent angular resolution of the lidar sensor device can be increased by evaluating the intensities of the light beams of the received signal reflected on an object.
- the angular resolution of the sensor system is effectively improved without the lidar sensor device having to be modified on the hardware side.
- orientation lines are determined on the basis of the detection points of the respective clusters, and orientations of the objects with respect to the lidar sensor device are determined on the basis of angles of the orientation lines to a viewing direction of the lidar sensor device.
- the orientation lines of the clusters are determined on the basis of the detection points of the respective clusters, for example by means of the least squares method. The orientation line corresponds to a spatial position of reflection points on a surface of the object relative to the lidar sensor device.
- based on the orientation line, the spatial position of the object, for example an inclination angle of the object with respect to the viewing direction, can be detected. Based on the angle of the orientation line it can thus be detected whether the surface of the object is oriented frontally or obliquely to the lidar sensor device.
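A sketch of the orientation-line estimation: a straight line is fitted to the cluster's detection points by least squares and its angle to the sensor's viewing direction (assumed here to be the x-axis) is returned; numpy's polyfit does the fit, and all values are illustrative. For surfaces seen almost edge-on, a total-least-squares (PCA) fit would be more robust than this simple variant.

```python
import math
import numpy as np

def orientation_angle_deg(points, viewing_direction_deg=0.0):
    """Fit a straight orientation line through the cluster's detection points
    with a least-squares fit and return its angle to the viewing direction."""
    xs = np.array([p[0] for p in points])
    ys = np.array([p[1] for p in points])
    slope, _intercept = np.polyfit(xs, ys, 1)   # least-squares line fit y = a*x + b
    line_angle = math.degrees(math.atan(slope))
    return line_angle - viewing_direction_deg

# hypothetical cluster of detection points on an obliquely oriented surface
cluster = [(4.0, 1.0), (4.5, 1.5), (5.0, 2.05), (5.5, 2.5)]
print(round(orientation_angle_deg(cluster), 1))  # roughly 45 degrees to the viewing direction
```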
- the invention also relates to a lidar sensor device for a motor vehicle for detecting objects in an environmental region of the motor vehicle.
- the lidar sensor device comprises in particular a receiving device which is designed to receive a received signal reflected from the surrounding area, and an evaluation device which is designed to identify in the received signal detection points corresponding to reflection points in the surrounding area, to determine respective intensity values of the detection points based on the received signal, to filter the detection points by comparing the intensity values with a distance-based intensity threshold value, to group the filtered detection points into clusters, and to recognize the objects based on the clusters.
- in addition, the evaluation device is configured to filter the respective detection points within a cluster by comparing the intensity values of the detection points within the cluster with a cluster-specific, relative intensity threshold value, the cluster-specific, relative intensity threshold value being determined in dependence on the intensity values of the detection points of the cluster.
- the lidar sensor device may also comprise a transmitting device for transmitting a transmission signal in the form of a light pulse, which is reflected at objects in the surrounding area and which can be received again by the receiving device as the received signal.
- the lidar sensor device is in particular a cost-effective, "low-cost" lidar sensor device which, by means of the method performed by the evaluation device, is able to reliably detect and track objects in the surrounding area of the motor vehicle.
- the invention relates to a driver assistance system for a motor vehicle with at least one lidar sensor device according to the invention.
- the driver assistance system is designed in particular to trigger a predetermined measure for avoiding a collision between the motor vehicle and the object if it detects, based on the position of the object, that the object is located in a critical subarea of the surrounding area and a collision with the object is imminent.
- a collision avoiding measure may be, for example, the output of a warning signal and / or an automatic braking of the motor vehicle.
- a motor vehicle according to the invention comprises a driver assistance system according to the invention. The motor vehicle may, for example, be designed as a passenger car.
- the embodiments presented with reference to the method according to the invention, and their advantages, apply correspondingly to the lidar sensor device according to the invention, to the driver assistance system according to the invention and to the motor vehicle according to the invention.
- Fig. 1 is a schematic representation of an embodiment of a motor vehicle according to the invention
- FIG. 2 shows a schematic representation of an intensity profile of a light beam of the received signal
- Fig. 3 is a schematic representation of a first cluster
- Fig. 4 is a schematic representation of the first cluster with a detection-point-free region
- Fig. 5 is a schematic representation of second and third clusters formed from the first cluster.
- FIG. 1 shows a motor vehicle 1 according to an embodiment of the present invention
- the motor vehicle 1 is formed in the present case as a passenger car.
- the motor vehicle 1 has a driver assistance system 2 with at least one lidar sensor device 3, which is designed to monitor a surrounding area 4 of the motor vehicle 1.
- the lidar sensor device 3 is configured in particular as a flash lidar sensor, which illuminates the surrounding area 4 by means of a transmission signal in the form of a light pulse and receives the light reflected in the surrounding area 4 as a received signal. The lidar sensor device 3 has a transmitting device 5 for transmitting the transmission signal, a receiving device 6 for receiving the received signal reflected in the surrounding area 4, for example at the objects 7, 8, and an evaluation device 9.
- the lidar sensor device 3 is designed to recognize the objects 7, 8 and their sizes and spatial positions in the surrounding area 4. The spatial positions of the objects 7, 8 are determined by respective distances of the objects 7, 8 from the motor vehicle 1 and respective orientations of the objects 7, 8 relative to the motor vehicle 1.
- information about the position of the objects 7, 8 relative to the motor vehicle 1 can be provided to a control device 10 of the driver assistance system 2. The control device 10 can initiate, for example, a measure for avoiding a collision of the motor vehicle 1 with the objects 7, 8 based on the position information.
- the transmitting device 5 of the lidar sensor device 3 can emit a plurality of light pulses during a measuring cycle.
- the receiving device 6 can receive the reflected light pulse again as a received signal in a plurality of light beams or reflections.
- the receiving device 6 may comprise a plurality of receiving elements, each receiving element receiving in particular only light beams from a specific reflection direction. In other words, each receiving element receives only light beams which impinge on the receiving device 6 from the surrounding area 4 at an associated reflection angle.
- for each light beam, an intensity profile 11 of the intensities I of the light beam over time t, as shown in Fig. 2, can be searched for intensity peaks P1, P2. The intensity peaks P1, P2 with the intensity values I1, I2 represent the detection points D, D', so-called echoes.
- via the time-of-flight measurement, a first distance d1 of a reflection point in the surrounding area 4 corresponding to the detection point D and a second distance d2 of a reflection point in the surrounding area 4 corresponding to the detection point D' can be determined.
- a distance-based intensity threshold value IS is predetermined for this distance range and is compared with the intensity values I1, I2. The second intensity value I2 of the second detection point D' falls below the intensity threshold value IS and is therefore filtered out. The intensity value I1 of the first detection point D exceeds the distance-based intensity threshold value IS and is therefore not filtered out.
- the filtered detection points D, D'' of all the light beams are grouped into clusters, as shown in Fig. 3, in which only one cluster C1 is shown. For this purpose, positions can be determined for each detection point D, D'' based on the distance d obtained from the time-of-flight measurement as well as based on the reflection angle determined by the receiving element associated with the respective detection point D, D''. The grouping of the detection points D, D'' into clusters C1 can be carried out, for example, by means of density-based cluster analysis, in particular DBSCAN. If objects 7, 8 in the surrounding area 4 lie close to each other and the received signal additionally contains interference signals, it may happen that the cluster analysis is faulty. For example, the cluster C1 may then be mistaken for a single object, although the detection points D, D'' within the cluster C1 actually correspond to reflection points of several objects 7, 8.
- the detection points D, D'' within the cluster C1 are filtered based on their relative intensity. For this purpose, the intensity values of the detection points D, D'' are compared with a cluster-specific, relative intensity threshold value, which is calculated, for example, as an average of the intensity values of the detection points D, D'' within the cluster C1. If the intensity value of a detection point D, D'' falls below the cluster-specific, relative intensity threshold value, i.e. the average of the intensity values of the detection points D, D'', this detection point D'' is removed from the cluster C1.
- the cluster C1 without the detection point D'' is shown in Fig. 4. Removing the detection point D'' creates a gap 12 which describes a detection-point-free region within the cluster C1. If this gap 12 exceeds a predetermined size, it is assumed that the detection point regions within the cluster C1 that are separated by the gap 12 correspond to reflection points of different objects. In this case, the cluster C1 is split into several clusters. Here, as shown in Fig. 5, the cluster C1 is divided into two clusters C2, C3, the cluster C2 corresponding, for example, to the object 7 and the cluster C3 to the object 8.
- in order to determine a respective center point M of the clusters C2, C3, the positions of the detection points D within the clusters C2, C3 are determined, the positions are weighted with the intensity values of the associated detection points D, and the center point M is determined based on the weighted positions.
- a respective size of the clusters C2, C3 can be determined.
- based on the size of the clusters C2, C3, a size of the respective object 7, 8 can be determined. The size of the object 7, 8 is determined so that the object 7, 8 lies within the cluster C2, C3.
- a respective orientation line L for the clusters C2, C3 can be determined.
- the orientation line L describes a spatial position of reflection points on the surface of the respective object 7, 8 with respect to the lidar sensor device 3 and thus with respect to the motor vehicle 1.
- the orientation line L can be determined, for example, by means of the least squares method. Based on the orientation lines L and the sizes of the objects 7, 8, the surrounding area 4 around the motor vehicle 1 can be described object-based with high accuracy.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Electromagnetism (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The invention relates to a method for detecting objects (7, 8) in a surrounding area (4) of a motor vehicle (1) by means of at least one received signal reflected by the surrounding area (4) and received by a lidar sensor device (3). Detection points (D, D', D'') corresponding to reflection points in the surrounding area (4) are identified in the received signal, respective intensity values (I1, I2) of the detection points (D, D', D'') are determined from the received signal, the detection points (D, D', D'') are filtered by comparing the intensity values with a distance-based intensity threshold value (IS), the filtered detection points (D, D'') are grouped into clusters (C1), and the objects (7, 8) are detected on the basis of the clusters (C1). According to the invention, the respective detection points (D, D'') within a cluster (C1) are filtered by comparing the intensity values of the detection points (D, D'') within the cluster (C1) with a cluster-specific, relative intensity threshold value, the cluster-specific, relative intensity threshold value being determined as a function of the intensity values of the detection points (D, D'') of the cluster (C1). The invention also relates to a lidar sensor device (3), a driver assistance system (2) and a motor vehicle (1).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102017112784.4 | 2017-06-09 | ||
DE102017112784.4A DE102017112784A1 (de) | 2017-06-09 | 2017-06-09 | Verfahren zum Erfassen von Objekten in einem Umgebungsbereich eines Kraftfahrzeugs, Lidar-Sensorvorrichtung, Fahrerassistenzsystem sowie Kraftfahrzeug |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018224610A1 (fr) | 2018-12-13 |
Family
ID=62563154
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2018/065072 WO2018224610A1 (fr) | 2017-06-09 | 2018-06-07 | Procédé de détection d'objets dans une zone environnante d'un véhicule automobile, dispositif de détection lidar, système d'aide à la conduite et véhicule automobile |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102017112784A1 (fr) |
WO (1) | WO2018224610A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102019211100A1 (de) * | 2019-07-25 | 2021-01-28 | Volkswagen Aktiengesellschaft | Verfahren, Vorrichtung und Computerprogramm zum Aufbereiten von Daten über eine Ampelanlage |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102004003870A1 (de) * | 2004-01-26 | 2005-08-11 | Ibeo Automobile Sensor Gmbh | Verfahren zur Klassifizierung von Objekten |
-
2017
- 2017-06-09 DE DE102017112784.4A patent/DE102017112784A1/de not_active Withdrawn
-
2018
- 2018-06-07 WO PCT/EP2018/065072 patent/WO2018224610A1/fr active Application Filing
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20030218919A1 (en) * | 2002-02-08 | 2003-11-27 | Omron Corporation | Distance measuring apparatus |
DE102010062209A1 (de) * | 2009-12-02 | 2011-06-09 | Denso Corporation, Kariya-City | Objekterkennungsvorrichtung, die eine Strahlabtastung zur Erfassung von Breiten von Objekten verschiedener Grössen, die in verschiedenen Abständen angeordnet sind, verwendet |
US20110234450A1 (en) * | 2010-03-26 | 2011-09-29 | Denso Corporation | Apparatus and method for detecting division lines depicted on road |
EP2966473A1 (fr) * | 2014-07-11 | 2016-01-13 | Sick Ag | Procédé de mesure d'un objet |
Non-Patent Citations (1)
Title |
---|
SULIMAN GARGOUM ET AL.: "Automated Highway Sign Extraction using Lidar Data", TRANSPORTATION RESEARCH RECORD: JOURNAL OF THE TRANSPORTATION RESEARCH BOARD |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111563398A (zh) * | 2019-02-13 | 2020-08-21 | 北京京东尚科信息技术有限公司 | 用于确定目标物的信息的方法和装置 |
CN111722196A (zh) * | 2019-03-19 | 2020-09-29 | 富士通株式会社 | 雷达反射点提取方法和装置 |
CN113196092A (zh) * | 2019-11-29 | 2021-07-30 | 深圳市大疆创新科技有限公司 | 噪点过滤方法和设备、激光雷达 |
CN112698301A (zh) * | 2020-12-11 | 2021-04-23 | 中国科学院微电子研究所 | 基于距离强度关联的激光雷达目标识别方法、装置及设备 |
Also Published As
Publication number | Publication date |
---|---|
DE102017112784A1 (de) | 2018-12-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018224610A1 (fr) | Procédé de détection d'objets dans une zone environnante d'un véhicule automobile, dispositif de détection lidar, système d'aide à la conduite et véhicule automobile | |
EP2824478B1 (fr) | Capteur optoélectronique et procédé destiné à la détection d'objets et à la détermination de distance dans une zone de surveillance | |
EP1910866B1 (fr) | Procede pour determiner une limitation en profondeur d'un creneau de stationnement au moyen de capteurs a ultrasons et systeme associe | |
EP3304124B1 (fr) | Procédé de classification d'un objet fixe allongé de manière oblongue dans une zone d'environnement latérale d'un véhicule automobile, système d'aide à la conduite et véhicule automobile | |
DE102013104443A1 (de) | Verkehrsüberwachungssystem zur Geschwindigkeitsmessung und Zuordnung von bewegten Fahrzeugen bei einem Mehrziel-Aufnahmemodul | |
EP2698646A1 (fr) | Procédé de classification de véhicules roulants par le suivi de la taille de position du véhicule | |
DE102013018721A1 (de) | Verfahren zur Erkennung wenigstens einer Parklücke für einen Kraftwagen | |
DE102004032118B4 (de) | Objekterkennungsverfahren für Fahrzeuge | |
DE102006057277A1 (de) | Verfahren zum Betrieb eines Radarsystems bei möglicher Zielobjektverdeckung sowie Radarsystem zur Durchführung des Verfahrens | |
DE102015011020B3 (de) | Verfahren zur Zuordnung von vorgegebenen Einbaupositionen zu an den Einbaupositionen in einem Kraftfahrzeug verbauten Radarsensoren und Kraftfahrzeug | |
DE102015213558A1 (de) | Vorrichtung und Verfahren zur Fusion zweier Hinderniskarten zur Umfelddetektion | |
DE102015007040B4 (de) | Verfahren zur Detektion und Klassifikation von Fußgängern in einer Umgebung eines Kraftfahrzeugs und Kraftfahrzeug | |
DE10353347A1 (de) | Verfahren zur Erkennung von Fußgängern | |
EP3193276B1 (fr) | Dispositif de detection et procede destine a la reconnaissance d'essieux de vehicule | |
DE102020000803A1 (de) | Verfahren zum Plausibilisieren einer Detektion eines LIDAR-Systems eines Fahrzeugs, Vorrichtung eingerichtet zum Durchführen eines solchen Verfahrens, und Fahrzeug mit einer solchen Vorrichtung | |
DE102015122413B4 (de) | Verfahren zum Betreiben eines Ultraschallsensors eines Kraftfahrzeugs, Ultraschallsensorvorrichtung, Fahrerassistenzsystem sowie Kraftfahrzeug | |
DE102019207688A1 (de) | Verfahren und Fahrerassistenzsystem zur Klassifizierung von Objekten in der Umgebung eines Fahrzeugs | |
DE102010021053B3 (de) | Verfahren zur Detektion von Störungen des Messbetriebs einer Ultraschall-Messanordnung eines Kraftfahrzeugs und Kraftfahrzeug | |
EP3299844A1 (fr) | Procédé de fonctionnement pour un système détecteur à ultrasons, unité de commande, système détecteur à ultrasons et véhicule | |
WO2021001178A1 (fr) | Dispositif d'adaptation et dispositif de mesure lidar | |
EP3433632B1 (fr) | Procédé de détection d'au moins un objet, dispositif d'un appareil de détection, appareil de détection et système d'assistance à la conduite comprenant au moins un appareil de détection | |
DE102018121158A1 (de) | Verfahren zum Erfassen von Bodenabtastpunkten und Fahrerunterstützungssystem, das dafür konfiguriert ist, ein derartiges Verfahren auszuführen | |
WO2022089819A1 (fr) | Procédé et dispositif d'identification de candidats d'efflorescence dans une mesure lidar | |
WO2022033980A1 (fr) | Procédé de reconnaissance d'usagers de la route dans un environnement d'un véhicule sur la base de mesures d'un capteur radar en identifiant des détections défectueuses et dispositif informatique | |
DE112020007006T5 (de) | Objekterkennungsystem und Objekterkennungsverfahren |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18729957 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 18729957 Country of ref document: EP Kind code of ref document: A1 |