EP3788404A1 - Method for determining a roll angle of an optoelectronic sensor by means of scanning points of a sensor image, and optoelectronic sensor - Google Patents

Method for determining a roll angle of an optoelectronic sensor by means of scanning points of a sensor image, and optoelectronic sensor

Info

Publication number
EP3788404A1
Authority
EP
European Patent Office
Prior art keywords
optoelectronic sensor
roll angle
space coordinate
motor vehicle
determined
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19719862.5A
Other languages
German (de)
English (en)
Inventor
Nikolai SERGEEV
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Valeo Schalter und Sensoren GmbH
Original Assignee
Valeo Schalter und Sensoren GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Valeo Schalter und Sensoren GmbH filed Critical Valeo Schalter und Sensoren GmbH
Publication of EP3788404A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G01S7/4972 Alignment of sensor
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931 Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • G06V20/582 Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads of traffic signs

Definitions

  • the invention relates to a method for determining at least one roll angle of an optoelectronic sensor of a motor vehicle.
  • the optoelectronic sensor comprises a transmitting device, a receiving unit with at least two receiving elements and an evaluation unit.
  • By means of the transmitting device, light beams are emitted into an environment of the motor vehicle.
  • Reflected light beams are received by the receiving unit and are represented by means of the evaluation unit as scanning points in a sensor image generated by the optoelectronic sensor. Furthermore, the invention relates to an optoelectronic sensor.
  • In a known method, a lidar sensor supplies a 3D point cloud which comprises scanning points reflected from the ground and from at least one motor vehicle. The 3D point cloud is compared with a 3D reference point cloud, and a misalignment of the lidar sensor is determined from the comparison.
  • It is an object of the present invention to provide a method and an optoelectronic sensor by means of which a roll angle of the optoelectronic sensor can be determined in an improved manner.
  • One aspect of the invention relates to a method for determining a roll angle of an optoelectronic sensor of a motor vehicle.
  • The optoelectronic sensor comprises at least one transmitting device, at least one receiving unit and at least one evaluation unit.
  • The method comprises the following steps:
  • Light beams are emitted into an environment of the motor vehicle by the transmitting device, and light beams reflected from an object are received by the receiving unit, wherein the received light beams are represented by the evaluation unit as scanning points in a sensor image of the environment of the motor vehicle generated by the optoelectronic sensor.
  • the objects may be, for example, other vehicles, traffic signs, or persons.
  • reflections from the road surface or the surface on which the vehicle is located can be received.
  • The roll angle is determined by the evaluation unit between at least one scanning axis and at least one reference axis, wherein the scanning axis is formed by at least one scanning point of a ground structure and a reference point of a reference axis of the optoelectronic sensor.
  • In other words, a scanning point of a ground structure is used as a starting point for the determination of the roll angle.
  • A scanning axis is determined, for example in the form of a straight line, which is defined by the scanning point and by the reference point.
  • the roll angle is determined as the angle between the scanning axis and the reference axis.
  • Alternatively, a scanning plane and a reference plane can also be spanned in order to determine the roll angle; for example, a scanning plane can be spanned by scanning points and the reference point of the optoelectronic sensor.
  • The angle between the scanning plane and the reference plane again corresponds to the roll angle.
  • the roll angle is in particular a rotation of the optoelectronic sensor about a vehicle longitudinal axis of the motor vehicle.
  • the reference axis is determined depending on a pitch angle and a yaw angle of the optoelectronic sensor.
  • In particular, the yaw angle, which describes a rotation of the optoelectronic sensor about a vehicle vertical axis, and the pitch angle, which describes a rotation of the optoelectronic sensor about a vehicle transverse axis, have already been determined before the determination of the roll angle.
  • The optoelectronic sensor has an evaluation unit by means of which the optoelectronic sensor has advantageously already been calibrated or corrected for the pitch angle and the yaw angle, so that an improved determination of the roll angle can be performed.
  • For example, the pitch angle and the yaw angle can be determined in such a way that a sensor coordinate system is determined by means of at least two scanning points of a first receiving element.
  • A reference coordinate system may be determined by at least one scanning point of the first receiving element together with at least one scanning point of a second receiving element, wherein, depending on a selection of the scanning point of the first receiving element, the scanning point of the second receiving element is predetermined.
  • In particular, the scanning points of the first and the second receiving element which are used for generating the reference coordinate system have the same horizontal angle, the horizontal angle being an angle within a plane that is spanned by a vehicle longitudinal axis and a vehicle transverse axis.
  • The sensor coordinate system can then be compared with the reference coordinate system, and the pitch angle and/or the yaw angle of the optoelectronic sensor can be determined depending on the comparison.
  • the optoelectronic sensor can be designed in particular as a lidar sensor or as a laser scanner.
  • In particular, a distance threshold value is predetermined, and the roll angle is determined only when the lateral distance between the scanning point and the reference point exceeds the distance threshold value. For example, the distance threshold value can correspond to at least half the vehicle width.
  • a mounting height of the optoelectronic sensor on the motor vehicle is used as the basis for the reference point.
  • The reference point is predetermined by a known installation position in the vehicle. If the installation height or the installation position is known, then, for example, the vertical distance of the sensor to a roadway or to a surface below the motor vehicle is known, and the roadway or the surface can be used as a basis for determining the reference axis or the reference plane.
  • the roll angle can be determined simply and yet reliably in many driving situations.
  • In an advantageous embodiment, the scanning point has a first space coordinate, a second space coordinate and a third space coordinate, and the roll angle is determined by an arctangent function of the ratio of the second space coordinate to the third space coordinate.
  • The first space coordinate lies in the vehicle longitudinal direction (X_A), the second space coordinate in the vehicle transverse direction (Y_A) and the third space coordinate in the vehicle vertical direction (Z_A):

    α = arctan(Y_A / Z_A)

  • Here, A is the scanning point and α is the roll angle (a code sketch illustrating this formula follows at the end of this description).
  • If a plurality of scanning points is used, the roll angle can be determined by averaging the respective roll angles.
  • a second received sampling point is used as the reference point.
  • the second received sampling point may originate from a second line on the roadway.
  • the two received sampling points have a sufficiently large lateral distance from each other, so that advantageously the roll angle can be determined.
  • The scanning axis is then placed between the first scanning point and the second scanning point. Between the reference axis passing through the second scanning point and the scanning axis, an angle corresponding to the roll angle can be determined. This makes it possible, in particular, for the roll angle to be reliably determined independently of a mounting height of the optoelectronic sensor.
  • The mounting height can vary, for example, due to the suspension of the motor vehicle and/or due to wear of the tire surface and/or due to a load of the motor vehicle.
  • In an advantageous embodiment, the scanning point has a first spatial coordinate (X_A), a second spatial coordinate (Y_A) and a third spatial coordinate (Z_A), and the second scanning point has a further first spatial coordinate (X_B), a further second spatial coordinate (Y_B) and a further third spatial coordinate (Z_B). The roll angle is determined by an arctangent function of the ratio of the difference of the further third spatial coordinate and the third spatial coordinate to the difference of the further second spatial coordinate and the second spatial coordinate.
  • The first and further first spatial coordinates lie in the vehicle longitudinal direction (X coordinate), the second and further second spatial coordinates in the vehicle transverse direction (Y coordinate) and the third and further third spatial coordinates in the vehicle vertical direction (Z coordinate):

    α = arctan((Z_B - Z_A) / (Y_B - Y_A))

  • Here, A is the first scanning point, B is the second scanning point and α is the roll angle (a code sketch illustrating this formula follows at the end of this description). This makes it possible for the roll angle to be reliably determined regardless of a mounting height of the optoelectronic sensor, which may vary, for example, due to the suspension of the motor vehicle and/or due to wear of the tire surface and/or due to a load of the motor vehicle.
  • In a further advantageous embodiment, the optoelectronic sensor is calibrated depending on the roll angle. With the evaluation unit of the optoelectronic sensor, the roll angle can be determined, and the optoelectronic sensor can then be calibrated depending on the roll angle.
  • In particular, the sensor image can be corrected by the evaluation unit so that, when the optoelectronic sensor is used, a corrected sensor image can be provided for evaluation by other assistance systems (a sketch of such a correction follows at the end of this description).
  • In this way, an improved optoelectronic sensor can be provided, in particular for at least semi-autonomous or autonomous driving operation of the motor vehicle, which can be used reliably and safely in road traffic.
  • In a further advantageous embodiment, the determination of the roll angle is performed during an operation, in particular a driving operation, of the motor vehicle. In this way, the determination of the current roll angle can be performed in the respective driving situation of the motor vehicle.
  • A driving situation here is, for example, driving on a highway, a parking process or standing still, e.g. at a traffic light or an intersection.
  • By determining the roll angle, the optoelectronic sensor can also be calibrated for different loading conditions of the motor vehicle.
  • the optoelectronic sensor can be improved in several driving situations, so that road safety can be improved for a large number of different driving situations.
  • An improved at least semi-autonomous operation, in particular an autonomous operation, of the motor vehicle can thereby be realized, which can lead to an increase in road safety.
  • In a further advantageous embodiment, at least one scanning point of a lane marking recognized in the sensor image is used. This exploits the fact that lane markings are present on many roads and may also contain highly reflective particles, so that these lane markings can be recognized well.
  • Furthermore, lane markings may have a sufficiently large lateral distance from each other and may also be aligned parallel to one another. The roll angle can thus be reliably determined with lane markings recognized in the sensor image.
  • Alternatively or additionally, a guard rail and/or a wall, in particular a tunnel wall, can also be used for the method.
  • The method can thereby be carried out very reliably, since guard rails and/or walls, in particular tunnel walls, are also aligned parallel to each other and can be used reliably for the method. In particular, curves can also be taken into account in the method.
  • Guard rails and walls, in particular tunnel walls, can therefore also be ground structures in the sense of the invention.
  • In a further advantageous embodiment, a respective roll angle is determined for a plurality of scanning points, and the roll angle of the optoelectronic sensor is determined as a statistical mean of the respective determined roll angles of the scanning points (a code sketch of such averaging follows at the end of this description).
  • The determined roll angle of a respective scanning point is thereby verified by further determinations of roll angles, and inaccuracies can be compensated in particular by determining the roll angle via the statistical mean. This allows the roll angle of the optoelectronic sensor to be determined reliably.
  • Another aspect of the invention relates to an optoelectronic sensor having at least one transmitting device for emitting light beams and having at least one receiving unit.
  • the receiving unit is configured to receive light beams reflected from an object.
  • Furthermore, the optoelectronic sensor has an evaluation unit which is adapted to represent the received light beams as scanning points in a sensor image of the environment of the motor vehicle. Furthermore, the evaluation unit is designed to determine a roll angle between at least one scanning axis and at least one reference axis.
  • The scanning axis is formed by at least one received scanning point of a ground structure and a reference point of the reference axis of the optoelectronic sensor.
  • For this purpose, the optoelectronic sensor has physical features which enable the method according to the invention or an advantageous embodiment thereof to be carried out.
  • Fig. 1 is a schematic plan view of a motor vehicle with an embodiment of an optoelectronic sensor;
  • Fig. 2 is a schematic plan view of a first receiving situation of the optoelectronic sensor;
  • Fig. 3 is a schematic transverse view of the first receiving situation from Fig. 2;
  • Fig. 4 is a schematic plan view of a second receiving situation of the optoelectronic sensor;
  • Fig. 5 is a schematic transverse view of the second receiving situation from Fig. 4.
  • Fig. 1 shows a motor vehicle 1 with a driver assistance system 2.
  • With the driver assistance system 2, for example, an object 3 which is located in an environment 4 of the motor vehicle 1 can be detected. In particular, a distance between the motor vehicle 1 and the object 3 can be determined by means of the driver assistance system 2.
  • the driver assistance system 2 comprises at least one optoelectronic sensor 5.
  • the optoelectronic sensor 5 may be designed as a lidar sensor or as a laser scanner.
  • the optoelectronic sensor 5 comprises a transmitting device 6, with which light beams 8 can be emitted or emitted.
  • the optoelectronic sensor 5 is arranged on a front region of the motor vehicle 1.
  • The optoelectronic sensor 5 can also be arranged in other areas, for example in a rear area or on a side area of the motor vehicle 1. The present example is therefore not limiting, but merely serves to illustrate the essential idea.
  • With the transmitting device 6, the light beams 8 can be emitted within a predetermined detection range E or a predetermined angular range. For example, the light beams 8 may be emitted in a predetermined horizontal angle range.
  • Furthermore, the optoelectronic sensor 5 has a deflection device, not shown, with which the light beams 8 can be deflected into the environment 4, so that the detection range E is scanned.
  • the optoelectronic sensor 5 comprises a receiving unit 7, which may have, for example, a photodiode. With the receiving unit 7, the light beams 9 reflected by the object 3 can be received as a received signal. Furthermore, the optoelectronic sensor 5 may have a control device, which may be formed for example by a microcontroller or a digital signal processor. The optoelectronic sensor 5 can have an evaluation unit 10, by means of which the reflected light beams 9 can be evaluated as sampling points 15, 16, 20, 21 (see FIGS. 3 and 4).
  • The driver assistance system 2 further comprises a control device 11, which may be formed, for example, by an electronic control unit (ECU, Electronic Control Unit) of the motor vehicle 1.
  • The control device 11 is connected to the optoelectronic sensor 5 for data transmission. The data transmission can take place, for example, via a data bus of the motor vehicle 1.
  • Fig. 2 shows a schematic plan view of a first receiving situation.
  • The optoelectronic sensor 5 is directed with its detection range E onto a roadway 12.
  • the motor vehicle 1 is not shown, but only the optoelectronic sensor 5, which is arranged on the motor vehicle 1.
  • On the roadway 12 there is a ground structure 13, which is in particular formed linearly and in the present case is designed as a first lane marking 14.
  • the first lane marking 14 has at least one first sampling point 15.
  • Fig. 3 shows a schematic transverse view of the first receiving situation from Fig. 2.
  • FIG. 3 shows a vehicle transverse axis Y and a vehicle vertical axis Z.
  • In particular, a roll angle α is determined, which is located, in particular, between the vehicle transverse axis Y and the vehicle vertical axis Z.
  • The roll angle α describes a rotation about the vehicle longitudinal axis X, which is presently indicated at the origin.
  • FIG. 3 shows the sampling point 15 and a reference point 17, which in the present case is likewise arranged at the origin.
  • A scanning axis 18 is placed between the reference point 17 and the scanning point 15.
  • A reference axis 19 is determined depending on the reference point 17 and on a pitch angle and a yaw angle of the optoelectronic sensor 5. In particular, the reference axis 19 is already corrected for the yaw angle and the pitch angle of the optoelectronic sensor 5.
  • The roll angle α is in particular the angle between the scanning axis 18 and the reference axis 19.
  • the scanning point 15 and the reference point 17 are in particular determined such that they have at least one predetermined lateral distance from one another.
  • For the reference point 17, in particular a mounting height of the optoelectronic sensor 5 on the motor vehicle 1 can be used as a basis.
  • the sampling point 15 has a first spatial coordinate X_A in the X direction, a second spatial coordinate Y_A in the Y direction, and a third spatial coordinate Z_A in the Z direction.
  • The roll angle α is determined by an arctangent function of the ratio of the second spatial coordinate Y_A to the third spatial coordinate Z_A:

    α = arctan(Y_A / Z_A)
  • Fig. 4 shows a schematic plan view of a second receiving situation. Again, the motor vehicle 1 is not shown, but only the optoelectronic sensor 5, which is arranged on the motor vehicle 1.
  • The roadway 12 has a ground structure 13, which in this example is designed as two lane markings 14, 22.
  • the first lane marking 14 has at least one first sampling point 15.
  • The second lane marking 22 has at least one second scanning point 20 in the sensor image.
  • In particular, both the first lane marking 14 and the second lane marking 22 may have a plurality of scanning points 15, 16, 20, 21. A respective roll angle α can be determined between the scanning points 15, 16 of the first lane marking 14 and the scanning points 20, 21 of the second lane marking 22, and the roll angle α of the optoelectronic sensor 5 can be determined as a statistical average of the respective determined roll angles α of the scanning points 15, 16, 20, 21.
  • In the present case, the second received scanning point 20 is used as the reference point 17.
  • Thereby, the roll angle α can be determined independently of the mounting height of the optoelectronic sensor 5, which may vary, for example, due to different loading conditions of the motor vehicle 1.
  • FIG. 5 shows a schematic transverse view of the second receiving situation from FIG. 4.
  • Here, the roll angle α between the scanning point 15 and the second scanning point 20 is determined.
  • The scanning axis 18 is placed between the scanning point 15 and the second scanning point 20.
  • The reference axis 19 is determined depending on the second scanning point 20, the pitch angle and the yaw angle.
  • the second sampling point 20 likewise has a further first spatial coordinate X_B in the X direction, a further second spatial coordinate Y_B in the Y direction and a further third spatial coordinate Z_B in the Z direction.
  • The roll angle α can then be determined by an arctangent function of the ratio of the difference of the further third spatial coordinate Z_B and the third spatial coordinate Z_A to the difference of the further second spatial coordinate Y_B and the second spatial coordinate Y_A:

    α = arctan((Z_B - Z_A) / (Y_B - Y_A))
  • For example, the optoelectronic sensor 5 is calibrated or corrected as a function of the determined roll angle α.
  • The evaluation unit 10 of the optoelectronic sensor 5 can determine the roll angle α and transmit correspondingly corrected data to the control device 11.
  • In this way, improved information from the optoelectronic sensor 5 can be provided to the control device 11, so that safety in road traffic can be increased.
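As an illustration of the single-point determination described above, a minimal Python sketch might look as follows; the function and parameter names are illustrative assumptions only, and atan2 is used merely so that the sign of the angle is preserved:

    import math

    def roll_angle_single_point(y_a: float, z_a: float) -> float:
        # Roll angle alpha = arctan(Y_A / Z_A), following the formula given above:
        # ratio of the second spatial coordinate (Y_A) to the third (Z_A).
        return math.atan2(y_a, z_a)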
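Similarly, the two-point determination using scanning points A and B (for example one point on each of two lane markings) could be sketched as follows; the lateral-distance threshold of 0.9 m, roughly half a typical vehicle width, is only an assumed example value:

    import math

    def roll_angle_two_points(a, b, min_lateral_separation: float = 0.9):
        # a, b: scanning points (x, y, z) in vehicle coordinates.
        # Returns alpha = arctan((Z_B - Z_A) / (Y_B - Y_A)) in radians,
        # or None if the lateral separation of the two points is too small.
        dy = b[1] - a[1]
        dz = b[2] - a[2]
        if abs(dy) < min_lateral_separation:
            return None
        return math.atan2(dz, dy)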
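The determination of the sensor's roll angle as a statistical mean over several scanning-point pairs could then be sketched as follows, reusing the hypothetical helper from the previous sketch:

    import statistics

    def mean_roll_angle(point_pairs, min_lateral_separation: float = 0.9):
        # point_pairs: iterable of (A, B) scanning-point pairs, e.g. points of the
        # first and second lane marking. Returns the mean roll angle in radians,
        # or None if no valid pair was found.
        angles = []
        for a, b in point_pairs:
            alpha = roll_angle_two_points(a, b, min_lateral_separation)
            if alpha is not None:
                angles.append(alpha)
        return statistics.mean(angles) if angles else None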
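Finally, the calibration of the sensor image using the determined roll angle could be sketched as a rotation of every scanning point about the vehicle longitudinal axis by the negative roll angle; this is only one possible way of applying the correction, not a definitive implementation:

    import math

    def correct_roll(points, alpha: float):
        # points: iterable of scanning points (x, y, z) in vehicle coordinates.
        # Rotates each point about the vehicle longitudinal axis (X axis) by -alpha,
        # yielding a roll-corrected sensor image for downstream assistance systems.
        c, s = math.cos(-alpha), math.sin(-alpha)
        return [(x, y * c - z * s, y * s + z * c) for (x, y, z) in points]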

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a method for determining a roll angle (α) of an optoelectronic sensor (5) of a motor vehicle (1), the optoelectronic sensor comprising at least one transmitting device (6), at least one receiving unit (7) and at least one evaluation unit (10), the method comprising the following steps: emitting light beams (8) into an environment (4) of the motor vehicle (1) by the transmitting device (6); receiving light beams (8) reflected from an object (3) by the receiving unit (7), the received light beams (8) being represented by the evaluation unit (10) as scanning points (17A, 17B, 17C) in an image of the environment of the motor vehicle (1) generated by the optoelectronic sensor (5); the roll angle (α) between at least one scanning axis (18) and at least one reference axis (19) being determined by the evaluation unit (10), the scanning axis (18) being formed by at least one scanning point (15, 16, 20, 21) of a ground structure (13) and a reference point (17) of the reference axis (19) of the optoelectronic sensor (5). The invention also relates to a corresponding optoelectronic sensor (5).
EP19719862.5A 2018-05-04 2019-04-25 Procédé permettant de déterminer un angle de roulis d'un capteur optoélectronique au moyen de points de balayage d'une image du capteur et capteur optoélectronique Pending EP3788404A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018110774.9A DE102018110774A1 (de) 2018-05-04 2018-05-04 Verfahren zum Bestimmen eines Rollwinkels eines optoelektronischen Sensors mittels Abtastpunkten eines Sensorbildes sowie optoelektronischer Sensor
PCT/EP2019/060562 WO2019211149A1 (fr) 2018-05-04 2019-04-25 Procédé permettant de déterminer un angle de roulis d'un capteur optoélectronique au moyen de points de balayage d'une image du capteur et capteur optoélectronique

Publications (1)

Publication Number Publication Date
EP3788404A1 true EP3788404A1 (fr) 2021-03-10

Family

ID=66290444

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19719862.5A Pending EP3788404A1 (fr) 2018-05-04 2019-04-25 Procédé permettant de déterminer un angle de roulis d'un capteur optoélectronique au moyen de points de balayage d'une image du capteur et capteur optoélectronique

Country Status (6)

Country Link
US (1) US12105227B2 (fr)
EP (1) EP3788404A1 (fr)
JP (1) JP6969020B2 (fr)
KR (1) KR102397360B1 (fr)
DE (1) DE102018110774A1 (fr)
WO (1) WO2019211149A1 (fr)

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9907688D0 (en) * 1999-04-06 1999-05-26 Univ Belfast Solid matrices for surface-enhanced Raman spectroscopy
US20040247157A1 (en) * 2001-06-15 2004-12-09 Ulrich Lages Method for preparing image information
US6810330B2 (en) * 2001-07-31 2004-10-26 Omron Corporation Apparatus for and method of detecting object on road
JP2003057345A (ja) * 2001-08-17 2003-02-26 Nissan Motor Co Ltd 車両用測距装置
DE102004003850A1 (de) 2004-01-26 2005-08-18 Ibeo Automobile Sensor Gmbh Verfahren zur Erkennung von Markierungen auf einer Fahrbahn
CA2678156C (fr) * 2007-02-16 2013-10-08 Mitsubishi Electric Corporation Appareil de mesure, methode de mesure et appareil d'identification des caracteristiques
DE102008016188A1 (de) 2008-03-26 2009-10-01 Robot Visual Systems Gmbh Verfahren zur parallelen Ausrichtung eines Laserscanners zu einer Fahrbahn
DE102008026876A1 (de) 2008-06-05 2009-12-10 Hella Kgaa Hueck & Co. Stereokamerasystem und Verfahren zum Ermitteln mindestens eines Kalibrierfehlers eines Stereokamerasystems
US8512975B2 (en) * 2008-07-24 2013-08-20 Biomerieux, Inc. Method for detection and characterization of a microorganism in a sample using time dependent spectroscopic measurements
JP4790045B2 (ja) * 2009-05-19 2011-10-12 本田技研工業株式会社 レーダの軸ずれを判定する装置
US20140038849A1 (en) * 2011-03-17 2014-02-06 Northwestern University Method of analyzing an analyte using combinatorial arrays and uniform patterns
DE102012101377B4 (de) * 2012-02-21 2017-02-09 Leica Biosystems Nussloch Gmbh Verfahren bei der Vorbereitung von Proben zum Mikroskopieren und Vorrichtung zum Überprüfen der Eindeckqualität von Proben
US9052721B1 (en) 2012-08-28 2015-06-09 Google Inc. Method for correcting alignment of vehicle mounted laser scans with an elevation map for obstacle detection
DE102012024983A1 (de) * 2012-12-20 2013-07-11 Daimler Ag Verfahren zum Auswerten eines mittels eines optischen Detektionssystems eines Kraftfahrzeugs erzeugten Bildes eines Vorfeldes des Kraftfahrzeugs
US9671368B2 (en) * 2013-05-10 2017-06-06 The Regents Of The University Of California Two-dimensional microfluidic devices and methods of using the same
JP6453701B2 (ja) * 2015-04-23 2019-01-16 株式会社デンソー 姿勢推定装置
FR3036180B1 (fr) * 2015-05-11 2018-08-10 Valeo Schalter Und Sensoren Gmbh Procede de determination de l'assiette d'un vehicule automobile.
KR101738750B1 (ko) * 2015-06-11 2017-05-24 한국과학기술원 실외 환경에서의 강인한 위치 인식 방법 및 장치
DE102015118085A1 (de) * 2015-10-23 2017-04-27 Valeo Schalter Und Sensoren Gmbh Verfahren zum Korrigieren einer fehlerhaften Ausrichtung eines optischen Sensors eines Kraftfahrzeugs, Recheneinrichtung, Fahrerassistenzsystem sowie Kraftfahrzeug
JP6801269B2 (ja) * 2016-07-06 2020-12-16 株式会社明電舎 自律移動装置

Also Published As

Publication number Publication date
US20210181324A1 (en) 2021-06-17
JP6969020B2 (ja) 2021-11-24
KR20200139233A (ko) 2020-12-11
KR102397360B1 (ko) 2022-05-12
US12105227B2 (en) 2024-10-01
WO2019211149A1 (fr) 2019-11-07
DE102018110774A1 (de) 2019-11-07
JP2021522509A (ja) 2021-08-30

Similar Documents

Publication Publication Date Title
EP3386825B1 (fr) Procédé de détection d'une collision possible entre un véhicule automobile et un objet en prenant en compte une incertitude spatiale, dispositif de commande, système d'aide à la conduite et véhicule automobile
EP1664838B1 (fr) Procede et programme informatique pour detecter le contour d'un obstacle dans l'environnement d'un vehicule
DE102015109160A1 (de) Fahrerassistenzsystem für ein Kraftfahrzeug, Kraftfahrzeug sowie Verfahren
EP3356203B1 (fr) Procédé de détermination d'une zone de stationnement pour stationner un véhicule automobile, système d'assistance à la conduite et véhicule automobile
DE102004021561A1 (de) Objekterkennungssystem für ein Kraftfahrzeug
DE102006008139A1 (de) Sensor mit einem dynamischen Erfassungsbereich
DE102016117712A1 (de) Verfahren zum zumindest semi-autonomen Manövrieren eines Kraftfahrzeugs unter Berücksichtigung eines Erfassungsbereichs eines Sensors, Fahrerassistenzsystem sowie Kraftfahrzeug
DE102016100718A1 (de) Verfahren zum Erkennen von Fahrspuren auf einer Fahrbahn anhand einer Häufigkeitsverteilung von Abstandswerten, Steuereinrichtung, Fahrerassistenzsystem sowie Kraftfahrzeug
DE102016109850B4 (de) Verfahren zum Erkennen einer Neigung in einer Fahrbahn eines Kraftfahrzeugs, Fahrerassistenzsystem sowie Kraftfahrzeug
DE102018207274A1 (de) Ultraschallsensorsystem und Verfahren zum Erkennen von Objekten im Umfeld eines Fahrzeugs, sowie Fahrzeug mit einem Ultraschallsensorsystem
WO2019211149A1 (fr) Procédé permettant de déterminer un angle de roulis d'un capteur optoélectronique au moyen de points de balayage d'une image du capteur et capteur optoélectronique
EP3788406B1 (fr) Procédé permettant de déterminer une orientation d'un capteur optoélectronique au moyen de points de balayage d'une image du capteur et capteur optoélectronique
DE102004033212A1 (de) Verfahren und Vorrichtung zur Kompensation von Einbautoleranzen eines Abstandssensors
DE102017124962B4 (de) Vermeidung von falschen Objekterfassungen bei Hindernissen oberhalb eines Fahrzeugs
EP4025929A1 (fr) Procédé pour faire fonctionner un système d'aide au virage, système d'aide au virage et véhicule automobile équipé d'un tel système d'aide au virage
EP3178727B1 (fr) Procédé de reconnaissance de créneau pour stationnement d'un véhicule automobile à l'aide de marquage au sol, système d'assistance au conducteur et véhicule automobile
DE10312611A1 (de) Verfahren und Vorrichtung zum Erfassen eines Objekts im Umfeld eines Kraftfahrzeugs
DE102018127714A1 (de) Verfahren zum Bestimmen einer aktuellen Erfassungsreichweite eines optoelektronischen Sensors durch Vergleich zweier Intensitätswerte, optoelektronischer Sensor sowie Kraftfahrzeug
DE102017115475A1 (de) Verfahren zum Erkennen eines Hindernisses in einem Umgebungsbereich eines Kraftfahrzeugs, Auswerteeinrichtung, Fahrerassistenzsystem sowie Kraftfahrzeug
EP3788405B1 (fr) Procédé pour déterminer une position angulaire d'un capteur optoélectronique et banc d'essai
DE102015121343A1 (de) Radarsensoreinrichtung zum Erfassen eines Objektes, Fahrerassistenzsystem, Kraftfahrzeug sowie Verfahren zum Erfassen eines Objektes
DE102020121785A1 (de) Verfahren zum Manövrieren eines Fahrzeugs in eine Schrägparklücke mit Vermeidung einer Kollision einer Frontschürze des Fahrzeugs mit einer Begrenzung der Schrägparklücke, Fahrerassistenzsystem sowie Fahrzeug
DE102018112407A1 (de) Bestimmung von Neigungswinkeln eines optischen Umgebungssensors
DE102018127712A1 (de) Verfahren zum Erkennen zumindest einer Reifenspur für ein Kraftfahrzeug mittels eines optoelektronischen Sensors, optoelektronischer Sensor sowie Kraftfahrzeug
EP2449343A1 (fr) Dispositif, détecteur et procédé pour déterminer visuellement la distance par rapport à un objet

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20201019

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230528

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240510