WO2021001171A1 - Method for operating a 3D distance sensor device
- Publication number
- WO2021001171A1 (PCT/EP2020/067110)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor device
- distance sensor
- threshold value
- measurement points
- distance
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
- G01S7/4004—Means for monitoring or calibrating of parts of a radar system
- G01S7/4039—Means for monitoring or calibrating of parts of a radar system of sensor or antenna obstruction, e.g. dirt- or ice-coating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/4808—Evaluating distance, position or velocity data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
- G01S2007/4975—Means for monitoring or calibrating of sensor obstruction by, e.g. dirt- or ice-coating, e.g. by reflection measurement on front-screen
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
- G01S2013/9323—Alternative operation using light waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
- G01S7/4052—Means for monitoring or calibrating by simulation of echoes
- G01S7/4082—Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder
- G01S7/4091—Means for monitoring or calibrating by simulation of echoes using externally generated reference signals, e.g. via remote reflector or transponder during normal radar operation
Definitions
- The invention relates to a method for operating a 3D distance sensor device.
- The invention also relates to a 3D distance sensor device.
- The invention also relates to a computer program.
- The invention also relates to a machine-readable storage medium.
- DE 199 31 825 A1 discloses a device for measuring visibility, especially for motor vehicles, which has at least one optical transmitting element, at least one optical receiving element and a measurement signal evaluation unit which determines a current visibility from the light reflected at one or more different spatial zones.
- DE 10 2016 014 549 A1 discloses an autonomous vehicle that contains both a LiDAR sensor and a camera.
- The LiDAR sensor can work in the range of visible light, in the IR range or in another wavelength range. If signs have an IR-reflective print, the IR range can be used.
- DE 10 2016 014 549 A1 discloses a method for determining the visual range from a vehicle by means of a LiDAR sensor.
- EP 2 101 193 A1 discloses a safety sensor and a method for the contactless measurement of positions, distances and/or speeds.
- US Pat. No. 6,362,773 B1 discloses a combination of several sensors or an adapted, complex evaluation of raw sensor data. However, these raw data are not available to the central sensor fusion and evaluation unit in many sensor architectures.
- The invention creates a method for operating a 3D distance sensor device, comprising the steps:
- detecting a defined number of measurement points;
- evaluating the detected measurement points using at least one defined criterion; and
- estimating a visual range of the 3D distance sensor device from the evaluated measurement points.
- The proposed method advantageously operates at the system level, i.e. it uses an already evaluated 3D point cloud of measurement points, so raw sensor data are not required.
- The object is achieved with a 3D distance sensor device which is designed to carry out the proposed method.
- The object is achieved with a computer program.
- The object is achieved with a machine-readable storage medium.
- An advantageous further development of the method is characterized in that measurement points of a geometrically lowest level of the 3D distance sensor device are analyzed, the number of detected measurement points being compared with a defined first threshold value. It is expected that all scan or measurement points of the geometrically lowest level are always available, because this level contains the road on which the vehicle with the 3D distance sensor device is located, and the road is usually only approx. 10 m away from the 3D distance sensor device. If this is not the case, appropriate conclusions can be drawn from it.
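This plausibility check can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name `cover_blocked` and the fractional form of the first threshold value are assumptions, and valid numerical thresholds depend on the sensor type.

```python
def cover_blocked(lowest_level_points, expected_points, sw1_fraction=0.9):
    """Return True if the sensor cover appears blocked.

    lowest_level_points: measurement points actually detected in the
    geometrically lowest scan plane. expected_points: points this plane
    would deliver with a clear cover; since the road is only about 10 m
    away, normally all of them are present. sw1_fraction: hypothetical
    first threshold value SW1, expressed as a fraction of the expected
    points.
    """
    return lowest_level_points < sw1_fraction * expected_points
```

If the check fires, a cleaning process of the sensor cover could, for example, be triggered.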
- A further advantageous development of the method is characterized in that, in the event that the proportion of detected measurement points falls below the defined first threshold value, it is concluded that a cover glass of the 3D distance sensor device is obstructed. In this case, a cleaning process of the 3D distance sensor device can be triggered, for example, in order to restore the full visual range.
- Another advantageous development of the method provides that, in the event that the proportion of detected measuring points lies above the defined first threshold value, an evaluation of a defined, calibrated distance histogram of the measuring points is carried out, a decay rate of the frequency of the measuring points over the distance being determined.
- The distance histogram is evaluated in this way in order to be able to make more precise statements about the visual range. For example, reduced visibility due to fog or rain can be determined in the distance histogram.
- Another advantageous development of the method provides that, in the event that the decay rate falls below a defined second threshold value, a defined high visual range of the 3D distance sensor device is detected. This is because the distance histogram declines slowly when the visual range is high.
- Another advantageous development of the method is characterized in that, in the event that the decay rate exceeds the defined second threshold value, an evaluation of the total number of the measurement points is carried out.
- In this way, the influence of fog on the visibility is determined; fog is largely irrelevant at short distances, which is why the total number of measurement points is evaluated.
- Another advantageous embodiment of the method is characterized in that, when the total number of measurement points falls below a third threshold value, a defined low visual range of the 3D distance sensor device is determined, and, when the third threshold value of the total number of measurement points is exceeded, a visual blockage of the 3D distance sensor device is detected.
- In this case, the distance histogram declines quickly, but there is a large number of measurement points in the 3D point cloud, which can be an indication of a blockage of the view of the 3D sensor device.
- A further advantageous embodiment of the method is characterized in that, based on the estimated visual range, a driving function of a vehicle is adapted and/or a cleaning of the 3D distance sensor device is initiated. As a result, suitable measures can be taken to adapt to the reduced visibility of the 3D distance sensor device.
- Another advantageous embodiment of the method is characterized in that the method is carried out in normal operation of the 3D distance sensor device. In this way, it is advantageously possible to determine the visual range of the 3D distance sensor device in quasi real time.
- Another advantageous embodiment of the method is characterized in that the method is carried out within the 3D distance sensor device or on an external computer device. In this way, an "intelligent 3D distance sensor" can be implemented, or the computing power required for this can be suitably outsourced.
- Fig. 1 is a basic flowchart of a proposed method for operating a 3D distance sensor device
- A core idea of the present invention is in particular to provide an improved method for operating or analyzing a 3D distance sensor device.
- One advantage of the proposed method is that, for example, a central evaluation unit of a 3D distance sensor can determine or estimate the visual range, with additional sensors advantageously not being required for this.
- The speed must be adjusted to the current visual range of the sensors.
- The visual range of a sensor can be severely impaired by various atmospheric weather influences such as rain, fog, snowfall, etc.
- A distinction must be made as to whether the performance of the sensor is impaired by such atmospheric influences or by statistically distributed blockages on the sensor cover, such as raindrops, dirt, etc.
- The proposed method also makes it possible to identify situations in which the view of the 3D distance sensor is blocked by one or more objects (e.g. a vehicle parked in a garage), so that no statement can be made about the current visibility of the 3D distance sensor.
- A 3D point cloud is understood to be a three-dimensional cluster of measurement points that are recorded using a three-dimensional distance sensor (e.g. a LiDAR or radar sensor).
- Primarily, the number of points in the environment detected by the 3D distance sensor is used.
- The method is described below by way of example for a 3D LiDAR distance sensor arranged on a vehicle, but is advantageously also suitable for improved operation of a radar sensor.
- The 3D distance sensor device provides a 3D point cloud with a defined number of measurement points in the environment.
- In a step 101, an analysis of the number of measurement points of the geometrically lowest level of the 3D distance sensor device is carried out.
- The geometrically lowest level of a 3D distance sensor device typically covers a road at a short distance from the vehicle; measurement data from this level are therefore often also used to identify open spaces.
- Because of the short distance, all measuring points of this plane should be present in the 3D point cloud even if the visibility is limited.
- This parameter therefore does not represent a measure of the visual range of the 3D distance sensor device, but it does allow, for example, a blockage of vision on the cover of the 3D distance sensor device to be recognized.
- In a step 102, the total number of measuring points of the geometrically lowest level is checked; in the event that this total number falls below the defined first threshold value SW1, it is determined that a blockage of a cover of the 3D distance sensor device is present. This can be caused, for example, by a deposit (e.g. dirt, snow, ice, etc.) on the sensor cover, which is why a cleaning process of the 3D distance sensor device is initiated in this case.
- The distance histogram represents a graph that shows the frequency of the measurement points of the 3D point cloud as a function of distance.
- A suitable mathematical function can be fitted to this data; in the present case, an exponentially decreasing function is selected, which is characterized by the two parameters amplitude and decay rate.
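Building the distance histogram and fitting the exponentially decreasing function can be sketched in plain Python. The binning parameters and the log-linear least-squares fit below are illustrative assumptions; a calibrated implementation for a specific sensor type would differ.

```python
import math

def distance_histogram(distances, bin_width=5.0, n_bins=20):
    """Count measurement points of the 3D point cloud per distance bin."""
    hist = [0] * n_bins
    for d in distances:
        i = int(d // bin_width)
        if 0 <= i < n_bins:
            hist[i] += 1
    return hist

def fit_exponential(hist, bin_width=5.0):
    """Fit h(d) = amplitude * exp(-decay_rate * d) to the histogram.

    A least-squares line is fitted to the logarithm of the non-empty
    bin counts; its slope gives the decay rate, its intercept the
    amplitude. Returns (amplitude, decay_rate).
    """
    pts = [(i * bin_width, math.log(h)) for i, h in enumerate(hist) if h > 0]
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return math.exp(intercept), -slope
```

On a synthetic histogram generated from a known exponential, the fit recovers amplitude and decay rate to within a few percent.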
- The amplitude is influenced by blockages on the sensor cover of the 3D distance sensor device.
- The decay rate is influenced by the atmospheric visibility, since low visibility only affects the detection of distant objects, not the detection of objects arranged close to the 3D distance sensor device.
- The decay rate is also influenced by the structure of the landscape or by a blockage of the 3D distance sensor device's view by other objects (e.g. a garage, a position behind a truck, etc.).
- The distance histogram thus allows a value for the visual range of the 3D distance sensor device to be specified based on the decay rate.
- In a step 111, it is checked whether the decay rate of the measuring points of the distance histogram exceeds a defined second threshold value SW2. In the event that the decay rate is less than the second threshold value SW2, a defined high visual range for the 3D distance sensor device is determined in a step 112.
- In the event that the decay rate of the distance histogram is greater than the second threshold value SW2, an analysis of the total number of measuring points of the 3D point cloud is carried out in a step 121.
- In a step 122, it is determined whether the total number of measurement points of the 3D point cloud is less than a defined third threshold value SW3; if so, a low visual range (e.g. due to fog, rain, snowfall, etc.) is determined for the 3D distance sensor device.
- This can be the case, for example, in fog: for short distances there is hardly any restriction of the view of the 3D sensor device, but for greater distances the total number of measuring points can be considerably reduced.
- In a step 123, if the named total number of measurement points of the 3D point cloud is greater than the defined third threshold value SW3, a visual blockage of the 3D distance sensor device is established.
- This situation can exist, for example, when the vehicle equipped with the 3D distance sensor device is parked in front of a wall.
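The threshold cascade of steps 101 to 123 described above can be summarized in a single sketch. The function name and the return labels are illustrative, and the threshold values SW1, SW2 and SW3 are placeholders, since no generally valid numerical values can be given.

```python
def classify_sensor_state(n_lowest, decay_rate, n_total, sw1, sw2, sw3):
    """Threshold cascade over an evaluated 3D point cloud.

    n_lowest: number of points in the geometrically lowest plane,
    decay_rate: fitted decay rate of the distance histogram,
    n_total: total number of points in the 3D point cloud.
    """
    # Steps 101/102: points of the lowest plane vs. first threshold SW1
    if n_lowest < sw1:
        return "cover_blocked"        # deposit on the cover glass
    # Steps 111/112: decay rate of the distance histogram vs. SW2
    if decay_rate < sw2:
        return "high_visual_range"    # histogram declines slowly
    # Steps 121-123: total number of points vs. third threshold SW3
    if n_total < sw3:
        return "low_visual_range"     # e.g. fog, rain, snowfall
    return "view_blocked"             # e.g. parked in front of a wall
```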
- The distance histograms are each suitably pre-calibrated for a specific type of 3D distance sensor device, whereby a specific decay rate is assigned to a visual range of the 3D distance sensor device. Due to the large number of different types of 3D sensor devices, it is not possible to specify generally valid numerical values for visibility, distances, numbers of measuring points of the 3D point cloud and threshold values SW1, SW2, SW3.
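Such a pre-calibrated assignment from decay rate to visual range can be represented as a simple interpolation table. The calibration pairs in the example below are purely illustrative; real values have to be measured for the specific type of 3D distance sensor device.

```python
def visual_range_from_decay(decay_rate, calibration):
    """Linearly interpolate a visual range (in metres) from a measured
    decay rate, using pre-calibrated (decay_rate, visual_range) pairs.
    Values outside the table are clamped to its end points.
    """
    pts = sorted(calibration)
    if decay_rate <= pts[0][0]:
        return pts[0][1]
    if decay_rate >= pts[-1][0]:
        return pts[-1][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= decay_rate <= x1:
            t = (decay_rate - x0) / (x1 - x0)
            return y0 + t * (y1 - y0)
```

A slow decay corresponds to a far visual range, a fast decay to a near one, matching the behaviour of the curves discussed above.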
- The second threshold value SW2 corresponds to the decay rate or the slope of the curves A, B.
- The proposed 3D distance sensor device can advantageously be used to detect the environment in highly and fully automated vehicles (Level 3-5).
- The proposed method can preferably be carried out in a defined time pattern of the 3D distance sensor device, quasi in real time (e.g. with a cycle time of approx. 10 frames/s, i.e. with approx. 10 Hz), in background operation. It can be provided that the proposed method is implemented as software running in the 3D distance sensor device. Alternatively, it can also be provided that the software for executing the proposed method runs on an external computer device, which can be arranged in the cloud, for example.
- Suitable measures can be derived from the determined visibility, e.g. an adaptation of one or more driver assistance functions of the vehicle.
- The proposed method is preferably carried out in normal operation of the 3D distance sensor device, in which the 3D distance sensor device is arranged in or on a highly or fully automated vehicle and is used to detect the surroundings when the vehicle is in operation.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Abstract
The invention relates to a method for operating a 3D distance sensor device, comprising the steps of: detecting a defined number of measurement points; evaluating the detected measurement points using at least one defined criterion; and estimating a visual range of the 3D distance sensor device from the evaluated measurement points.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102019209846.0A DE102019209846A1 (de) | 2019-07-04 | 2019-07-04 | Verfahren zum Betreiben einer 3D-Distanzsensorvorrichtung |
DE102019209846.0 | 2019-07-04 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021001171A1 true WO2021001171A1 (fr) | 2021-01-07 |
Family
ID=71111440
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2020/067110 WO2021001171A1 (fr) | 2019-07-04 | 2020-06-19 | Procédé pour faire fonctionner un dispositif de détection de distance 3d |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE102019209846A1 (fr) |
WO (1) | WO2021001171A1 (fr) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2022005358A1 (fr) * | 2020-06-29 | 2022-01-06 | Epiroc Rock Drills Aktiebolag | Procédé d'auto-test pour un agencement de capteur de mesure de distance d'une machine de travail |
WO2022005357A1 (fr) * | 2020-06-29 | 2022-01-06 | Epiroc Rock Drills Aktiebolag | Procédé d'auto-test pour un agencement de capteur de mesure de distance d'une machine de travail |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102021202878A1 (de) | 2021-03-24 | 2022-09-29 | Robert Bosch Gesellschaft mit beschränkter Haftung | Verfahren zur Reichweitenbestimmung für einen LiDAR-Sensor |
DE102021204904A1 (de) | 2021-05-14 | 2022-12-01 | Robert Bosch Gesellschaft mit beschränkter Haftung | LiDAR-System und Verfahren zum Erkennen einer Verschmutzung eines Strahlengangs eines LiDAR-Systems |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19931825A1 (de) | 1999-07-08 | 2001-01-25 | Bosch Gmbh Robert | Vorrichtung zur Sichtweitenmessung |
DE10055457A1 (de) * | 1999-11-10 | 2001-07-05 | Denso Corp | Verfahren zur Erkennung einer Radarcharakteristik, eine Vorrichtung zur Erkennung einer Radarcharakteristik, und ein Aufzeichnungsmedium, das Daten zur Erkennung einer Radarcharakteristik aufzeichnet |
US6362773B1 (en) | 1999-06-24 | 2002-03-26 | Robert Bosch Gmbh | Method for determining range of vision |
DE102005059902A1 (de) * | 2005-12-15 | 2007-06-28 | Robert Bosch Gmbh | Verfahren zur Sensorzustandserfassung sowie Abstandsmessvorrichtung und Einparkassistenzsystem |
EP2000826A1 (fr) * | 2007-06-04 | 2008-12-10 | Audi Ag | Procédé de détection de givre d'un capteur radar servant à la détection d'objets d'un système d'assistance du conducteur prévu dans un véhicule automobile |
EP2101193A1 (fr) | 2008-03-10 | 2009-09-16 | Sick Ag | Système de sécurité destiné à la mesure sans contact de positions, de voies et de vitesses |
DE102016014549A1 (de) | 2016-12-07 | 2017-07-06 | Daimler Ag | Verfahren zur Bestimmung der Sichtweite aus einem Fahrzeug |
EP3299839A1 (fr) * | 2016-09-26 | 2018-03-28 | Continental Automotive GmbH | Procédé, unité de surveillance et détecteur radar |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE4005919C2 (de) * | 1990-02-24 | 1996-12-19 | Eltro Gmbh | Verfahren und Anordnung zum Ermitteln der Sichtweite für Autofahrer beim Auftreten von Nebel |
DE10149768A1 (de) * | 2001-10-09 | 2003-04-17 | Ibeo Automobile Sensor Gmbh | Sichtweitenbestimmung |
EP2306217B1 (fr) * | 2009-09-30 | 2017-04-19 | Sick Ag | Détermination d'un environnement |
DE102017117162A1 (de) * | 2017-07-28 | 2019-01-31 | Sick Ag | Sensor und Verfahren zur Erfassung und Abstandsbestimmung von Objekten |
- 2019-07-04: DE application DE102019209846.0A (DE102019209846A1), status active, Pending
- 2020-06-19: WO application PCT/EP2020/067110 (WO2021001171A1), status active, Application Filing
Also Published As
Publication number | Publication date |
---|---|
DE102019209846A1 (de) | 2021-01-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021001171A1 (fr) | Procédé pour faire fonctionner un dispositif de détection de distance 3d | |
EP3121620B1 (fr) | Procede de segmentation de donnees d'un capteur 3d, fabrique en presence de nuages d'alcool, destine a l'augmentation de la reconnaissance de situations et la reconnaissance de positions d'obstacles | |
DE102009009815B4 (de) | Verfahren und Vorrichtung zur Erkennung von Parklücken | |
EP0897545B1 (fr) | Procede de determination de l'etat de la route a l'avance pour vehicules a moteur | |
DE102018104243B3 (de) | Verfahren und System zur Erkennung von für ein Fahrzeug geeigneten Parklücken | |
DE102013202915A1 (de) | Verfahren und Vorrichtung zum Vermessen einer Parklücke für ein Einparkassistenzsystem eines Kraftfahrzeugs | |
WO2019038174A1 (fr) | Évitement d'avertissements d'angle mort dûs à des éclaboussures | |
DE102019205565A1 (de) | Verfahren und Vorrichtung zum Bewerten einer Objekthöhe mittels von einem an einem Fahrzeug angebrachten Ultraschallsensor empfangenen Ultraschallsignalen | |
DE102017217072B4 (de) | Verfahren zum Erkennen eines Witterungsverhältnisses in einer Umgebung eines Kraftfahrzeugs sowie Steuervorrichtung und Kraftfahrzeug | |
WO2008074314A1 (fr) | Procédé d'évaluation de la fonctionnalité d'un système détecteur | |
DE10328814B3 (de) | Verfahren und Vorrichtung zur Verbesserung der Erkennung und/oder Wiedererkennung von Objekten in der Bildverarbeitung | |
EP3663881B1 (fr) | Procédé de commande d'un véhicule autonome en fonction des vecteurs de mouvement estimés | |
DE102014208272A1 (de) | Verfahren und Vorrichtung zur Tracking-basierten Sichtweitenschätzung | |
DE102018119632B4 (de) | Verfahren zur Bestimmung der Detektionsreichweite eines Umgebungssensors | |
DE102018121158A1 (de) | Verfahren zum Erfassen von Bodenabtastpunkten und Fahrerunterstützungssystem, das dafür konfiguriert ist, ein derartiges Verfahren auszuführen | |
DE102018109645A1 (de) | Parkplatzgrößenkorrektur | |
DE102009042476B4 (de) | Bestimmung von Zuständen in der Umgebung eines Kraftfahrzeugs mittels einer Stereokamera | |
EP0710927A2 (fr) | Méthode de reconnaissance orientée-objet d'objets mouvants | |
DE102019210128A1 (de) | Verfahren zum Bewerten von erfassten Sensordaten eines Sensors | |
EP3871006A1 (fr) | Système de détection de pluie comprenant un capteur d'environnement servant à détecter point par point un environnement d'un véhicule en particulier avec un capteur d'environnement lidar | |
EP3289520B1 (fr) | Reconnaissance gestuelle améliorée pour un véhicule | |
DE102019108998A1 (de) | Fahrunterstützungsverfahren | |
DE102021002910B3 (de) | Verfahren und Vorrichtung zur Bewertung einer Leistungsfähigkeit mindestens eines Umgebungssensors eines Fahrzeugs sowie Fahrzeug mit einer solchen Vorrichtung | |
DE102021210738A1 (de) | Verfahren zur radargestützten Überwachung eines rückwärtigen Raums | |
WO2023148053A1 (fr) | Procédé et dispositif de détermination de dégradation de visibilité d'un système lidar, programme d'ordinateur et support de stockage lisible par machine |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 20733984; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: PCT application non-entry in European phase | Ref document number: 20733984; Country of ref document: EP; Kind code of ref document: A1 |