US20230194719A1 - Method for Measuring a Distance Between an Object and an Optical Sensor, Control Device for Carrying Out Such a Method, Distance Measuring Apparatus Comprising Such a Control Device, and Motor Vehicle Comprising Such a Distance Measuring Apparatus - Google Patents

Method for Measuring a Distance Between an Object and an Optical Sensor, Control Device for Carrying Out Such a Method, Distance Measuring Apparatus Comprising Such a Control Device, and Motor Vehicle Comprising Such a Distance Measuring Apparatus

Info

Publication number
US20230194719A1
Authority
US
United States
Prior art keywords
distance
optical sensor
image line
region
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/926,412
Other languages
English (en)
Inventor
Fridtjof Stein
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Daimler Truck Holding AG
Mercedes Benz Group AG
Original Assignee
Daimler Truck AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler Truck AG filed Critical Daimler Truck AG
Assigned to Daimler Truck AG (assignment of assignors interest; see document for details). Assignors: STEIN, FRIDTJOF
Publication of US20230194719A1 publication Critical patent/US20230194719A1/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/02Details
    • G01C3/06Use of electric means to obtain final indication
    • G01C3/08Use of electric radiation detectors
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/10Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/18Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein range gates are used
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/483Details of pulse systems
    • G01S7/486Receivers
    • G01S7/4865Time delay measurement, e.g. time-of-flight measurement, time of arrival measurement or determining the exact position of a peak
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/536Depth or shape recovery from perspective effects, e.g. by using vanishing points
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30248Vehicle exterior or interior
    • G06T2207/30252Vehicle exterior; Vicinity of vehicle
    • G06T2207/30261Obstacle

Definitions

  • the invention relates to a method for measuring a distance between an object and an optical sensor, a control device configured for carrying out such a method, a distance measuring apparatus comprising such a control device, and a motor vehicle comprising such a distance measuring apparatus.
  • the invention is therefore based on the object of providing a method for measuring a distance between an object and an optical sensor, a control device configured for carrying out such a method, a distance measuring apparatus comprising such a control device and a motor vehicle comprising such a distance measuring apparatus, wherein the stated disadvantages are at least partially redressed and preferably avoided.
  • the object is solved in particular by providing a method for measuring a distance between an object and an optical sensor by means of an illumination device and the optical sensor.
  • the illumination device and the optical sensor are here controlled in a manner temporally coordinated with one another.
  • a spatial position of a visible distance region in an observation region of the optical sensor is specified here by the temporally coordinated control of the illumination device and of the optical sensor.
  • a captured image of the visible distance region is captured by the optical sensor by means of the coordinated control.
  • a start image line for the beginning of the visible distance region is determined in the captured image.
  • an end image line for the end of the visible distance region is determined in the captured image.
  • a base point image line is determined in the captured image, wherein the base point image line is that image line in which firstly the object can be detected and which secondly has the shortest distance—in the image plane of the optical sensor—to the start image line.
  • the distance from the object is ascertained by evaluating the image position of the base point image line relative to the start image line and the end image line while taking account of the spatial position of the visible distance region.
  • the spatial position of the visible distance region in real space, i.e., on the object side, is known from the temporally coordinated control of the illumination device and of the optical sensor.
  • the image region of the distance region on the optical sensor is known by ascertaining the start image line, on the one hand, and the end image line, on the other hand. Therefore, when the base point image line is found, the spatial, object-side position of the object within the object-side visible distance region can now advantageously be deduced from the position of the base point image line in the image region, i.e., the image position relative to the start image line and the end image line.
  • the method can particularly advantageously be applied in self-driving vehicles, in particular self-driving trucks.
  • objects arranged in the vehicle's own lane which cannot be driven over can advantageously be detected by means of the method, in particular objects which are small compared to the size of the vehicle.
  • the method enables a prompt and appropriate reaction to the detection of such objects and in particular to the detection of a distance of the vehicle from these objects.
  • Such an appropriate reaction can be, for example, emergency braking or driving along an evasive trajectory, which may be determined ad hoc.
  • Such small objects which cannot be driven over are typically also referred to as “lost cargo”.
  • such objects might also be people or animals in the road, possibly as a result of an accident.
  • the method for generating captured images by means of temporally coordinated control of an illumination device and an optical sensor is in particular a method known as gated imaging; the optical sensor is in particular a camera that is triggered sensitively only in a specific, restricted time period, which is referred to as “gated control”.
  • the camera is therefore a gated camera.
  • the illumination device is also correspondingly temporally controlled only in a specific, selected time interval, in order to illuminate a scene on the object side.
  • a predefined number of light pulses are emitted by the illumination device, preferably lasting between 5 ns and 20 ns.
  • the beginning and the end of the exposure of the optical sensor is coupled to the number and duration of the light pulses given off.
  • the optical sensor can detect a specific visible distance region by temporally controlling the illumination device on the one hand and the optical sensor on the other hand, with a correspondingly defined spatial position, i.e., in particular specific distance of the beginning of the distance region from the optical sensor and specific distance region width.
  • the visible distance region is that—object-side—region in three-dimensional space which is imaged by means of the optical sensor in a two-dimensional captured image on an image plane of the optical sensor by the number and duration of the light pulses of the illumination device in conjunction with the start and the end of the exposure of the optical sensor.
  • the observation region is, by contrast, in particular the—object-side—region in three-dimensional space which could be imaged as a whole—in particular to the maximum extent—by means of the optical sensor in a two-dimensional captured image given sufficient illumination and exposure of the optical sensor.
  • the observation region corresponds to the entire exposable image region of the optical sensor that could theoretically be illuminated.
  • the visible distance region is thus a subset of the observation region in real space. Only a subset of the image plane of the optical sensor is accordingly exposed in the method provided here, wherein this subset of the image plane is given in particular between the start image line and the end image line.
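The timing relationship described above can be illustrated with a simplified model. This is an assumption for illustration, not the patent's exact formulation: a single rectangular light pulse and a single sensor gate are assumed, and the visible distance region follows from the overlap of the returning pulse with the gate.

```python
# Simplified gated-imaging timing model (illustrative assumption):
# a light pulse of duration t_pulse is emitted starting at t = 0 and
# the sensor is exposed ("gated") from t_open to t_close. An echo from
# distance x arrives between 2*x/C and 2*x/C + t_pulse, so the visible
# distance region is the set of distances whose echo overlaps the gate.

C = 299_792_458.0  # speed of light in m/s


def visible_distance_region(t_pulse, t_open, t_close):
    """Return (x_near, x_far) in metres of the visible distance region."""
    # The pulse tail must arrive no earlier than the gate opening:
    x_near = max(C * (t_open - t_pulse) / 2.0, 0.0)
    # The pulse front must arrive no later than the gate closing:
    x_far = C * t_close / 2.0
    return x_near, x_far
```

For example, a 10 ns pulse with a gate open from 1.00 µs to 1.10 µs would, in this model, yield a visible distance region of roughly 148 m to 165 m.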
  • object-side refers to a region in real space, i.e., on the sides of the object to be observed.
  • image-side refers to a region on the image plane of the optical sensor.
  • the observation region and the visible distance region are given here on the object side. They correspond to image-side areas on the image plane that are assigned by the laws of imaging and the temporal control of the illumination device and of the optical sensor.
  • the visible distance region can be specified, wherein the temporal coordination of the illumination device, on the one hand, and of the optical sensor, on the other hand, is determined therefrom and appropriately specified.
  • An image line is understood here to mean in particular the set of all pixels of a captured image in the image plane of the optical sensor which lie on a common horizontal line in the image plane.
  • the illumination device is a laser in a preferred configuration.
  • the optical sensor is a camera in a preferred configuration.
  • the base point image line is preferably ascertained as follows: object recognition is carried out in the captured image in particular by means of pattern recognition, preferably using a classification algorithm and/or by means of deep learning. If an object is recognised, all image lines in which the object is depicted are ascertained in the captured image on the basis of this recognition or classification. That image line which has the shortest distance to the start image line is then ascertained as base point image line.
  • the method advantageously enables in particular the distance between the object and the optical sensor to be determined from a single captured image.
  • the distance between the object and the optical sensor is preferably determined from a single captured image.
  • One development of the invention provides that, for the captured image of the distance region, a line histogram is created over all of the image lines associated with an evaluation region in the observation region on the optical sensor by means of summing the illumination intensities per image line of the optical sensor. The start image line and the end image line are then determined by means of the line histogram.
  • This advantageously enables the determination of the image position of that region on the optical sensor that is associated on the image side with the object-side visible distance region.
  • the temporal control of the illumination device on the one hand and of the optical sensor on the other hand results in a clear brightness transition at the beginning of the image-side distance region and at the end of the image-side distance region. This ultimately enables the object distance to be determined by interpolating the position of the base point image line relative to the start image line and the end image line.
  • a line histogram is understood here to mean in particular that the individual image lines of the optical sensor in the evaluation region are assigned the sum of the illumination intensities over all pixels of the respective image line that are lying in the evaluation region. In this way, the brightness transition that is correspondingly created by the temporal control can be detected very easily and reliably in the image plane of the optical sensor.
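As a sketch of how such a line histogram might be computed and evaluated (the function names, the rectangular evaluation region, and the fixed threshold are illustrative assumptions, not the patent's implementation):

```python
import numpy as np


def line_histogram(image, rows=None, cols=None):
    """Sum the pixel intensities of each image line inside an optional
    rectangular evaluation region of the captured image (2-D array)."""
    region = image
    if rows is not None:
        region = region[rows[0]:rows[1], :]
    if cols is not None:
        region = region[:, cols[0]:cols[1]]
    return region.sum(axis=1)


def find_region_lines(hist, threshold):
    """Return (first, last) image line whose summed intensity exceeds
    the threshold, i.e. the sharp brightness transitions bounding the
    image-side visible distance region; None if no line is bright."""
    bright = np.flatnonzero(hist > threshold)
    if bright.size == 0:
        return None
    return int(bright[0]), int(bright[-1])
```

With a synthetic captured image in which only lines 10 to 20 are illuminated, the two returned lines are exactly those brightness transitions.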
  • the evaluation region is identical to the observation region. This corresponds to one embodiment of the method that is particularly easy to implement.
  • the evaluation region is selected to be smaller than the observation region, in particular than a region of interest in which the objects to be detected can be found. This advantageously means that the method can be carried out more quickly and efficiently.
  • the evaluation region can also be limited horizontally in particular.
  • the evaluation region is preferably identified in the captured image prior to calculating the line histogram by GPS prediction, in particular by back-projecting the course of the road into the image plane, and/or by a method for optical lane tracking.
  • an object distance is determined as a distance between the object and the optical sensor, wherein a distance region width is determined as the difference between the end of the visible distance region and the beginning of the visible distance region.
  • a base point distance is determined as image line distance on the optical sensor between the base point image line and the start image line.
  • a distance region image width is ascertained as image line distance between the end image line and the start image line.
  • the object distance is then finally ascertained as the sum of the beginning of the visible distance region, that is in particular the spatial distance between the beginning of the visible distance region and the optical sensor, and the product of the distance region width with the ratio of the base point distance to the distance region image width.
  • the object distance is determined according to the following formula:

    x = x_near + (x_far − x_near) · (v − v_near) / (v_far − v_near)  (1)

  • x_near is the beginning of the visible distance region
  • x_far is the end of the visible distance region
  • correspondingly (x_far − x_near) is the distance region width
  • v_near is the start image line
  • v_far is the end image line
  • correspondingly (v_far − v_near) is the distance region image width
  • v is the base point image line
  • correspondingly (v − v_near) is the base point distance
  • x is the object distance.
  • This approach is ultimately based on the intercept theorem, which requires two assumptions for its application. Firstly, the course of the roadway surface within the visible distance region is assumed to be linear. Secondly, the intercept theorem strictly presupposes that the imaginary connecting lines between the start image line and the beginning of the visible distance region, on the one hand, and between the end image line and the end of the visible distance region, on the other hand, are parallel to each other, which is generally not the case. However, the distance between the optical sensor and the visible distance region is generally great enough that these lines can be assumed to be parallel to a good approximation, so that the intercept theorem can be applied with very good accuracy.
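The interpolation stated in words above can be written as a small helper function. This is a direct transcription of Equation (1); the function and parameter names are illustrative.

```python
def object_distance(x_near, x_far, v_near, v_far, v):
    """Equation (1): linearly interpolate the base point image line v
    between the start image line v_near and the end image line v_far
    to place the object inside the visible distance region
    [x_near, x_far] (distances in metres, lines as line indices)."""
    if v_far == v_near:
        raise ValueError("start and end image lines must not coincide")
    return x_near + (x_far - x_near) * (v - v_near) / (v_far - v_near)
```

For a visible distance region from 50 m to 80 m whose base point image line lies halfway between the start and end image lines, the object distance evaluates to 65 m.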
  • the illumination device and the optical sensor are each designed for operation in the near infrared range. This has the advantage that the eyes of people and/or animals into which light from the illumination device might fall are not adversely affected. It is particularly advantageous to use a wavelength of more than 1.4 μm, in particular 1.55 μm, since this is strongly absorbed in particular by the lens and the cornea of the eye, so that at most a low intensity falls on the retina. It is also advantageous that other road users are not dazzled by the illumination device, in particular when driving at night.
  • a temporal sequence of captured images is created, wherein the temporal coordination of the illumination device and of the optical sensor is altered so that a change in the distance of the object over time is determined.
  • the temporal coordination for at least two captured images in the temporal sequence of captured images is altered.
  • the temporal coordination for each captured image in the temporal sequence is altered.
  • the temporal coordination for the captured images of the temporal sequence is altered in particular such that the base point image line is retained approximately centrally between the start image line and the end image line.
  • the change in the distance of the object over time can in turn be deduced from the change in the temporal coordination of the control of the illumination device and of the optical sensor, which is necessary for this. In this way, advantageously, the distance of the object is measured dynamically.
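One way to realise the re-centring described above is to shift the visible distance region so that, under the linear mapping of Equation (1), the base point image line would land midway between the start and end image lines; the new region would then be converted back into gate timings. This is an illustrative controller sketch, not a prescribed implementation.

```python
def recentre_distance_region(x_near, x_far, v_near, v_far, v):
    """Return a new (x_near, x_far) of the same width, centred on the
    object distance implied by the base point image line v under the
    linear interpolation of Equation (1)."""
    width = x_far - x_near
    # Object distance according to Equation (1):
    x_object = x_near + width * (v - v_near) / (v_far - v_near)
    return x_object - width / 2.0, x_object + width / 2.0
```

If the object sits at 80% of the image-side region, e.g. `recentre_distance_region(50.0, 80.0, 0, 100, 80)`, the implied object distance is 74 m, and the region is shifted to 59 m to 89 m so the next capture centres the object.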
  • control device which is configured for carrying out a method according to the invention or a method according to one of the embodiments described above.
  • the control device is preferably in the form of a computing device, particularly preferably a computer, or a control unit, in particular a control unit of a vehicle.
  • the object is also solved by providing a distance measuring apparatus which has an illumination device, an optical sensor, and a control device according to the invention or a control device according to one of the exemplary embodiments described above.
  • the control device is preferably operatively connected to the illumination device, on the one hand, and to the optical sensor, on the other hand, and is configured for the control thereof.
  • the object is lastly also solved by providing a motor vehicle having a distance measuring apparatus according to the invention or a distance measuring apparatus according to one of the exemplary embodiments described above.
  • the advantages that have already been explained in connection with the method, the control device and the distance measuring apparatus apply in particular in connection with the motor vehicle.
  • the motor vehicle is designed as a truck.
  • however, it is also possible for the motor vehicle to be a passenger motor car, a utility vehicle or another motor vehicle.
  • FIG. 1 shows a schematic illustration of one exemplary embodiment of a motor vehicle with one exemplary embodiment of a distance measuring apparatus
  • FIG. 2 shows a schematic illustration of a captured image, captured in the context of one embodiment of the method using an optical sensor
  • FIG. 3 shows a schematic illustration of a line histogram which is used in one embodiment of the method.
  • FIG. 1 shows a schematic illustration of one exemplary embodiment of a motor vehicle 1 , with one exemplary embodiment of a distance measuring apparatus 3 .
  • the distance measuring apparatus 3 has an illumination device 5 and an optical sensor 7 .
  • the distance measuring apparatus 3 has a control device 9 which is only shown schematically here and, in a manner not shown explicitly, is operatively connected to the illumination device 5 and the optical sensor 7 for the respective control thereof.
  • An illumination frustum 11 of the illumination device 5 and an observation region 13 of the optical sensor 7 are shown in particular in FIG. 1 .
  • a visible distance region 15 which results as a subset of the observation region 13 of the optical sensor 7 is also shown in hatched lines.
  • an object 17 is arranged in the visible distance region 15 .
  • a beginning 19 and an end 21 of the visible distance region 15 are also drawn in FIG. 1 .
  • the control device 9 is configured in particular to carry out an embodiment, described in more detail below, of a method for measuring a distance x between the object 17 and the optical sensor 7 .
  • the illumination device 5 and the optical sensor 7 are controlled in a manner temporally coordinated with one another, wherein a spatial position of the visible distance region 15 in the observation region 13 is specified by the temporally coordinated control of the illumination device 5 and of the optical sensor 7 .
  • a captured image of the visible distance region 15 is captured by the optical sensor 7 using the coordinated control.
  • FIG. 2 shows a schematic illustration of such a captured image 23 in an image plane of the optical sensor 7 .
  • a start image line v_near for the beginning 19 and an end image line v_far for the end 21 of the visible distance region 15 in the captured image 23 are illustrated in FIG. 2 .
  • the position of this start image line v_near and of the end image line v_far is determined.
  • a base point image line v is also determined in the captured image 23 as that image line in which the object 17 can be detected and which has the shortest distance to the start image line v_near.
  • the distance of the object 17 is then ascertained by evaluating the image position of the base point image line v, i.e., the position thereof in the captured image 23 , relative to the start image line v_near and the end image line v_far while taking the object-side spatial position of the visible distance region 15 into account.
  • the image of the object 17 in the captured image 23 is denoted with 17 ′ in FIG. 2 .
  • an evaluation region 27 which can be determined in particular by a GPS prediction and/or by a method for optical lane tracking is drawn in FIG. 2 .
  • the evaluation region 27 is smaller here than the observation region 13 . However, it can also coincide with the latter.
  • a base point distance (v − v_near) is determined as the image line distance on the optical sensor 7 between the base point image line v and the start image line v_near.
  • a distance region image width (v_far − v_near) is ascertained as the image line distance between the end image line v_far and the start image line v_near.
  • the object distance x is then ascertained as the sum of the beginning 19 of the visible distance region 15 and the product of the distance region width (x_far − x_near) with the ratio of the base point distance (v − v_near) to the distance region image width (v_far − v_near).
  • the object distance x is ascertained according to Equation (1) given above.
  • FIG. 3 shows a schematic illustration of a line histogram 25 of the captured image 23 according to FIG. 2 or of the evaluation region 27 of the captured image 23 .
  • the individual image lines of the optical sensor 7 are plotted on the x axis in this line histogram 25 , with a sum of the illumination intensities per pixel over all pixels of the respective image line in the evaluation region 27 being plotted on the y axis for each image line.
  • This line histogram 25 is created over all image lines assigned to the evaluation region 27 on the optical sensor 7 by summing the illumination intensities per image line of the optical sensor 7 .
  • the start image line v_near and the end image line v_far are then ascertained by means of the line histogram 25 , wherein, in particular owing to the temporally coordinated control of the illumination device 5 and of the optical sensor 7 , significant jumps in intensity can be seen at the start image line v_near and at the end image line v_far.
  • the illumination device 5 and the optical sensor 7 are preferably designed for operation in the near infrared range, in particular at 1.55 ⁇ m.
  • a temporal sequence of captured images 23 is preferably created, wherein the temporal coordination of the illumination device 5 and of the optical sensor 7 is altered so that a change in the distance of the object 17 over time can be determined.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)
US17/926,412 2020-05-19 2021-04-01 Method for Measuring a Distance Between an Object and an Optical Sensor, Control Device for Carrying Out Such a Method, Distance Measuring Apparatus Comprising Such a Control Device, and Motor Vehicle Comprising Such a Distance Measuring Apparatus Pending US20230194719A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102020002994.9A DE102020002994B4 (de) 2020-05-19 2020-05-19 Verfahren zur Messung eines Abstandes zwischen einem Objekt und einem optischen Sensor, Steuereinrichtung zur Durchführung eines solchen Verfahrens, Abstandsmessvorrichtung mit einer solchen Steuereinrichtung und Kraftfahrzeug mit einer solchen Abstandsmessvorrichtung
DE102020002994.9 2020-05-19
PCT/EP2021/058733 WO2021233603A1 (de) 2020-05-19 2021-04-01 Verfahren zur messung eines abstandes zwischen einem objekt und einem optischen sensor, steuereinrichtung zur durchführung eines solchen verfahrens, abstandsmessvorrichtung mit einer solchen steuereinrichtung und kraftfahrzeug mit einer solchen abstandsmessvorrichtung

Publications (1)

Publication Number Publication Date
US20230194719A1 true US20230194719A1 (en) 2023-06-22

Family

ID=71079896

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/926,412 Pending US20230194719A1 (en) 2020-05-19 2021-04-01 Method for Measuring a Distance Between an Object and an Optical Sensor, Control Device for Carrying Out Such a Method, Distance Measuring Apparatus Comprising Such a Control Device, and Motor Vehicle Comprising Such a Distance Measuring Apparatus

Country Status (4)

Country Link
US (1) US20230194719A1 (de)
CN (1) CN115803656A (de)
DE (1) DE102020002994B4 (de)
WO (1) WO2021233603A1 (de)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102020004690A1 (de) * 2020-08-03 2021-05-27 Daimler Ag Method for detecting objects, control device for carrying out such a method, detection device with such a control device, and motor vehicle with such a detection device
DE102020005343A1 (de) 2020-08-31 2022-03-03 Daimler Ag Method for object tracking of at least one object, control device for carrying out such a method, object tracking device with such a control device, and motor vehicle with such an object tracking device
DE102020005762B4 (de) 2020-09-21 2023-10-12 Daimler Truck AG Method for calibrating a lighting device and an optical sensor, control device for carrying out such a method, calibration device with such a control device, and motor vehicle with such a calibration device
DE102020006880A1 (de) 2020-11-09 2021-01-14 Daimler Ag Method for detecting an object by means of a lighting device and an optical sensor, control device for carrying out such a method, detection device with such a control device, and motor vehicle with such a detection device
DE102020007061B4 (de) 2020-11-19 2022-08-11 Daimler Truck AG Method for operating a first lighting device, a second lighting device and an optical sensor, control device for carrying out such a method, gated camera device with such a control device, and motor vehicle with such a gated camera device
DE102021001175A1 (de) 2021-03-05 2022-09-08 Daimler Truck AG Method for operating a gated camera, control device for carrying out such a method, camera device with such a control device, and motor vehicle with such a camera device
DE102021002915A1 (de) 2021-06-07 2022-12-08 Daimler Truck AG Method for operating a gated camera, control device for carrying out such a method, visibility range measuring device with such a control device, and motor vehicle with such a visibility range measuring device
DE102021003153A1 (de) 2021-06-18 2022-12-22 Daimler Truck AG Method for operating a gated camera, control device for carrying out such a method, over-the-shoulder view device with such a control device, and motor vehicle with such an over-the-shoulder view device
DE102021003728B4 (de) 2021-07-20 2023-04-20 Daimler Truck AG Method for operating a gated camera, control device for carrying out such a method, gated camera device with such a control device, and motor vehicle with such a gated camera device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2542913B1 (de) 2010-03-02 2019-05-08 Elbit Systems Ltd. Image gated camera for detecting objects in a marine environment
US9810785B2 (en) 2012-05-29 2017-11-07 Brightway Vision Ltd. Gated imaging using an adaptive depth of field
IL239919A (en) 2015-07-14 2016-11-30 Brightway Vision Ltd Branded template lighting
US10708577B2 (en) 2015-12-16 2020-07-07 Facebook Technologies, Llc Range-gated depth camera assembly
EP3396410A4 (de) * 2015-12-21 2019-08-21 Koito Manufacturing Co., Ltd. Image acquisition device for vehicles, control device, vehicle equipped with the image acquisition device for vehicles and the control device, and image acquisition method for vehicles
US10909714B2 (en) * 2018-10-30 2021-02-02 Here Global B.V. Method, apparatus, and system for providing a distance marker in an image

Also Published As

Publication number Publication date
DE102020002994B4 (de) 2023-03-30
CN115803656A (zh) 2023-03-14
DE102020002994A1 (de) 2020-07-02
WO2021233603A1 (de) 2021-11-25

Similar Documents

Publication Publication Date Title
US20230194719A1 (en) Method for Measuring a Distance Between an Object and an Optical Sensor, Control Device for Carrying Out Such a Method, Distance Measuring Apparatus Comprising Such a Control Device, and Motor Vehicle Comprising Such a Distance Measuring Apparatus
JP6387407B2 (ja) Surroundings detection system
US10073462B2 (en) Autonomous vehicle with improved visual detection ability
CN103164708B (zh) Determining a pixel classification threshold for vehicle occupancy detection
KR101030763B1 (ko) Image acquisition unit, method and associated control unit
EP2150437B1 (de) Rear obstacle detection
US8422737B2 (en) Device and method for measuring a parking space
US8416397B2 (en) Device for a motor vehicle used for the three-dimensional detection of a scene inside or outside said motor vehicle
CN110651313A (zh) Control device and control method
US20060111841A1 (en) Method and apparatus for obstacle avoidance with camera vision
US9069075B2 (en) Coupled range and intensity imaging for motion estimation
JP2009041929A (ja) Distance measurement method and device, and vehicle equipped with a distance measurement device
EP3428677B1 (de) Vision system and vision method for a vehicle
CN110045389A (zh) Structured light illumination system for object detection
EP3705913A1 (de) Lidar imaging apparatus for a motor vehicle
US20240012144A1 (en) Method for Detecting an Object by Means of a Lighting Device and an Optical Sensor, Control Device for Carrying Out Such a Method, Detection Device With Such a Control Device and Motor Vehicle With Such a Detection Device
EP4204847A1 (de) Detection of retroreflectors in NIR images for controlling lidar scanning
WO2020064674A1 (de) Sensor device for capturing the surroundings of a device, method for operating a sensor device for capturing the surroundings of a device, and device having the sensor device
CN116685865A (zh) Method for operating a first lighting device, a second lighting device and an optical sensor, control device for carrying out such a method, gated camera device with such a control device, and motor vehicle with such a gated camera device
US20230302987A1 (en) Method for Object Tracking at Least One Object, Control Device for Carrying Out a Method of This Kind, Object Tracking Device Having a Control Device of This Kind and Motor Vehicle Having an Object Tracking Device of This Kind
US20240190349A1 (en) Method for Operating a Gated Camera, Control Device for Carrying Out Such a Method, Over-The-Shoulder View Device Having Such a Control Device, and Motor Vehicle Having Such an Over-The-Shoulder View Device
DE102018219420A1 (de) Method and device for determining a distance of at least one object in the surroundings of a vehicle using a laser unit and a camera unit
WO2023135952A1 (ja) Image processing device, image processing method and image processing system
WO2022269995A1 (ja) Distance measuring device and method, and program
US11924578B2 (en) Image processing apparatus, camera, moveable body, and image processing method

Legal Events

Date Code Title Description
AS Assignment

Owner name: DAIMLER TRUCK AG, GERMANY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:STEIN, FRIDTJOF;REEL/FRAME:061829/0142

Effective date: 20221117

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION