EP4241114A1 - Method for detecting an object by means of an illumination device and an optical sensor, control device for carrying out such a method, detection device having such a control device, and motor vehicle having such a detection device

Method for detecting an object by means of an illumination device and an optical sensor, control device for carrying out such a method, detection device having such a control device, and motor vehicle having such a detection device

Info

Publication number
EP4241114A1
Authority
EP
European Patent Office
Prior art keywords
image
distance range
visible distance
optical sensor
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21789679.4A
Other languages
German (de)
English (en)
French (fr)
Inventor
Fridtjof Stein
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Daimler Truck Holding AG
Original Assignee
Daimler Truck AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Daimler Truck AG filed Critical Daimler Truck AG
Publication of EP4241114A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/46Indirect determination of position data
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/02Details
    • G01C3/06Use of electric means to obtain final indication
    • G01C3/08Use of electric radiation detectors
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141Control of illumination
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/56Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/58Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/93Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/931Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V10/449Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • G06V10/451Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
    • G06V10/454Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]

Definitions

  • The invention relates to a method for detecting an object using an illumination device and an optical sensor, a control device for carrying out such a method, a detection device with such a control device, and a motor vehicle with such a detection device.
  • The object is achieved in particular by creating a method for detecting an object using an illumination device and an optical sensor, the activation of the illumination device and of the optical sensor being coordinated in time and a visible distance range being assigned to this coordinated activation. At least one image-side limit of the visible distance range is compared with a predetermined standard representation of the at least one limit of the visible distance range. Based on the comparison, an object is searched for at the at least one limit, in particular at the at least one image-side limit.
  • With the method it is advantageously possible to search for and detect an object early, in particular at a distant limit of the visible distance range.
  • Owing to the early object recognition, it is advantageously possible to plan a braking and/or avoidance strategy that is optimal in terms of traffic safety.
  • It is furthermore advantageously possible to use additional sensors, in particular radar sensors and/or lidar sensors, to measure the detected object precisely and to carry out object tracking.
  • The optical sensor and the illumination device are arranged at a spatial distance from one another, preferably at the greatest possible spatial distance.
  • Due to the spacing of the optical sensor and the lighting device, an object casts a shadow that is visible in the recording of the optical sensor. Objects that have little or no contrast with the road surface can thus also be detected by means of this shadow.
  • The detection of an object at the far limit of the visible distance range depends on the reflection properties and/or the brightness of the object.
  • An object that is bright and/or has good reflective properties can be detected as soon as the object enters the visible distance range at the distant boundary.
  • An object that is dark and/or has poor reflection properties can only be detected when a non-negligible part of the object and/or the shadow cast by the object is within the visible distance range.
  • The detection of an object at a near limit of the visible distance range is independent of the reflection properties and/or the brightness of the object.
  • The method of generating recordings by means of temporally coordinated activation of an illumination device and an optical sensor is known in particular as gated imaging. In particular, the optical sensor is a camera that is switched sensitive only in a specific, limited time range, which is referred to as gated activation; the camera is accordingly a gated camera.
  • Correspondingly, the lighting device is also activated only in a specific, selected time interval, in order to illuminate a scene on the object side.
  • In particular, the lighting device emits a predefined number of light pulses, preferably each with a duration between 5 ns and 20 ns.
  • The start and the end of the exposure of the optical sensor are linked to the number and duration of the emitted light pulses.
  • A specific visible distance range, with a correspondingly defined local position, i.e. in particular specific distances of the near and far limit of the visible distance range from the optical sensor, can thus be captured by the optical sensor through the temporal activation of the lighting device on the one hand and of the optical sensor on the other hand.
  • The visible distance range is that object-side range in three-dimensional space which, determined by the number and duration of the light pulses of the lighting device in conjunction with the start and end of the exposure of the optical sensor, is mapped by the optical sensor into a two-dimensional recording on the image plane of the optical sensor.
  • The term "object-side" refers to an area in real space.
  • The term "image-side" refers to an area on the image plane of the optical sensor.
  • The visible distance range is given in particular on the object side. It corresponds to an image-side area on the image plane, assigned to it by the imaging laws and by the temporal activation of the illumination device and of the optical sensor.
  • With the method it is possible, in particular by suitably selecting the temporal activation of the lighting device on the one hand and of the optical sensor on the other hand, to determine the position and the spatial width of the visible distance range, in particular to define the distance between the near limit and the far limit of the visible distance range.
  • Preferably, the visible distance range is specified, and from it the temporal activation of the lighting device on the one hand and of the optical sensor on the other hand is determined accordingly (see the sketch below).
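To make this timing relationship concrete, the following is a minimal sketch. It assumes an idealized model in which the pulse duration and the exposure ramp are negligible, so a single gate-open and gate-close time determine the range; the function name and the example values are illustrative and not part of the described method.

```python
# Minimal sketch: mapping a desired visible distance range to gate times,
# assuming an idealized delta-pulse model (pulse duration neglected).
C = 299_792_458.0  # speed of light in m/s

def gate_times_for_range(d_near: float, d_far: float) -> tuple[float, float]:
    """Return (t_open, t_close) in seconds, measured from the emission of
    the light pulse, so that only light returning from objects between
    d_near and d_far (in meters) reaches the sensor."""
    t_open = 2.0 * d_near / C   # round trip to the near limit
    t_close = 2.0 * d_far / C   # round trip to the far limit
    return t_open, t_close

# Example: a visible distance range from 100 m to 150 m.
t_open, t_close = gate_times_for_range(100.0, 150.0)
print(f"open after {t_open * 1e9:.1f} ns, close after {t_close * 1e9:.1f} ns")
# -> open after 667.1 ns, close after 1000.7 ns
```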
  • Preferably, the illumination device has at least one surface emitter, in particular a VCSEL (vertical-cavity surface-emitting laser).
  • The optical sensor is preferably a camera.
  • Preferably, the detected object is classified.
  • The classification is preferably carried out using a neural network or a deep learning method.
  • In a preferred embodiment, an image-side line which runs between an exposed area and an unexposed area in the recording is determined as the image-side limit of the visible distance range.
  • An object is detected based on a deviation of this image-side line from the horizontal course of the standard representation.
  • The predetermined standard representation of the at least one limit of the visible distance range is in particular a horizontal line between the exposed area and the unexposed area of the recording. If an object is located at the at least one limit of the visible distance range, this object is visible in the recording. An object that is bright and/or has good reflection properties enlarges the exposed area of the recording and reduces the unexposed area. An object that is dark and/or has poor reflection properties enlarges the unexposed area of the recording and reduces the exposed area. In both cases, the image-side limit of the visible distance range has a course that deviates from the horizontal. The object is advantageously detected based on this deviation (see the sketch below).
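To illustrate this detection principle, here is a minimal sketch. It assumes the recording is available as a grayscale NumPy array and that a simple intensity threshold separates exposed from unexposed pixels; these assumptions, like the function names, are ours and not prescribed by the method.

```python
import numpy as np

def image_side_line(recording: np.ndarray, thresh: float) -> np.ndarray:
    """For each image column, return the row index at which the unexposed
    area above meets the exposed area below, i.e. the image-side distant
    boundary of the visible distance range."""
    exposed = recording >= thresh
    # First exposed row per column; columns without exposed pixels get -1.
    return np.where(exposed.any(axis=0), np.argmax(exposed, axis=0), -1)

def deviates_from_horizontal(line: np.ndarray, tol: float = 3.0) -> bool:
    """Compare the extracted line with a horizontal standard representation
    (approximated here by its median row) and report whether any column
    bulges beyond the tolerance, which indicates an object."""
    valid = line[line >= 0]
    return bool(valid.size and np.any(np.abs(valid - np.median(valid)) > tol))
```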
  • In a further preferred embodiment, an image-side evaluation area is determined which contains the at least one image-side limit of the visible distance range.
  • A column histogram is created over all pixels of the optical sensor assigned to the evaluation area by summing, for each image column of the evaluation area, the illumination intensities of the assigned pixels.
  • An object is detected based on a deviation of the column histogram from a horizontal course.
  • A horizontal course of a column histogram is in particular a course in which all values lie within a predetermined interval; that is, the values are constant within a predetermined tolerance.
  • Alternatively, a horizontal course of a column histogram is a course that can be interpolated by a horizontal line with at most a predetermined maximum error. A sketch of this histogram criterion follows below.
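A minimal sketch of the column-histogram criterion, under the same assumptions as the previous sketch (grayscale NumPy recording; the evaluation area is a band of rows containing the image-side limit; the interval test is implemented as max minus min against a tolerance):

```python
import numpy as np

def column_histogram(recording: np.ndarray, row_lo: int, row_hi: int) -> np.ndarray:
    """Sum the illumination intensities of the pixels assigned to the
    evaluation area (rows row_lo..row_hi) for each image column."""
    return recording[row_lo:row_hi, :].astype(np.float64).sum(axis=0)

def is_horizontal(hist: np.ndarray, tol: float) -> bool:
    """A course counts as horizontal when all values lie within a
    predetermined interval, i.e. are constant within the tolerance."""
    return float(hist.max() - hist.min()) <= tol

# A bright object at the boundary pushes individual columns upward, a dark
# object pulls them downward; either way is_horizontal(...) fails and an
# object is detected.
```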
  • In a preferred embodiment, a distance of the detected object from the optical sensor is determined from the recording. Furthermore, an image-side vertical extent of the object is determined in the recording, and an approximate object-side height of the detected object is estimated based on the distance and the image-side vertical extent. It is advantageously possible in this way to estimate a maximum possible object-side height of a detected object.
  • A method for determining the distance between the detected object and the optical sensor is disclosed in the German patent application DE 10 2020 002 994 A1.
  • The image-side vertical extent of the object can be determined directly in the recording.
  • From the image-side vertical extent of the object, an object-side distance between the optical sensor and the maximum object-side extent of the shadow cast by the object can preferably be determined.
  • Preferably, the detected object is arranged on the object side in an x-y plane, and the illumination device is arranged on a z-axis at a height z_B above the x-y plane.
  • The distance x_O between the lighting device and/or the optical sensor and the detected object is measured in the x-direction.
  • The object-side height z_O of the detected object, in particular the maximum possible object-side height, is estimated under the assumption that the object has no extension in the x-direction.
  • The height z_B of the lighting device, the height z_O of the detected object, the distance x_O and the maximum extent x_S of the shadow cast by the object are related to one another by the intercept theorem.
  • The object-side height z_O is then estimated using formula (1), reconstructed below.
  • The estimation using formula (1) always yields an object-side height z_O that is smaller than the height z_B of the lighting device.
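Formula (1) is referenced but not reproduced in this text. The following reconstruction from the intercept theorem is therefore an assumption, chosen to be consistent with the quantities named above and with the statement that the estimate always stays below the height of the lighting device:

```latex
% Ray from the illumination device (height z_B) over the top of the object
% (height z_O at distance x_O) to the end of the shadow cast
% (distance x_O + x_S); similar triangles give
\frac{z_B}{x_O + x_S} = \frac{z_O}{x_S}
\qquad\Longrightarrow\qquad
z_O = z_B \cdot \frac{x_S}{x_O + x_S} \tag{1}
```

Since x_S / (x_O + x_S) < 1 for every positive distance x_O, the estimated height z_O is indeed always smaller than z_B.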
  • In a preferred embodiment, a series of recordings is created if an object is detected at the distant limit of the visible distance range, the image-side distant limit being a first image-side limit of the at least one image-side limit.
  • The near limit of the visible distance range is then also evaluated on the image side, as a second image-side limit of the at least one image-side limit, the image-side near limit of the visible distance range being compared with a predetermined standard representation of the near limit of the visible distance range.
  • Based on this comparison, the detected object is searched for at the near limit.
  • The series of recordings ends when the detected object is found at the near limit of the visible distance range.
  • An object is classified as not drivable over if it is detected in one recording both at the near limit of the visible distance range and at the far limit of the visible distance range; see the control-flow sketch below.
  • An object that cannot be driven over has an object-side height z_O that is greater than or equal to the height z_B of the lighting device.
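The recording series and the drivability decision can be summarized in a small control-flow sketch; the callables are placeholders for the boundary evaluations described above, not an API of the method:

```python
def track_until_near_limit(capture, object_at_far, object_at_near) -> str:
    """After an object has been detected at the distant limit, create a
    series of recordings and also evaluate the near limit. The series ends
    when the object is found at the near limit; an object seen at both
    limits in one recording cannot be driven over."""
    while True:
        recording = capture()
        if object_at_far(recording) and object_at_near(recording):
            return "not drivable over"   # object spans the whole range
        if object_at_near(recording):
            return "series complete: object reached the near limit"
```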
  • Preferably, object tracking is carried out for a detected object.
  • The object tracking is preferably carried out using the Kanade-Lucas-Tomasi method (KLT method); a sketch using OpenCV follows below.
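The KLT method is available in common vision libraries. The following minimal sketch uses OpenCV's pyramidal Lucas-Kanade implementation; the parameter values are illustrative assumptions:

```python
import cv2
import numpy as np

def klt_track(prev_recording: np.ndarray, next_recording: np.ndarray):
    """Track feature points of a detected object from one recording to the
    next using the Kanade-Lucas-Tomasi method. Both inputs are 8-bit
    grayscale images."""
    pts = cv2.goodFeaturesToTrack(prev_recording, maxCorners=50,
                                  qualityLevel=0.01, minDistance=5)
    if pts is None:
        return None
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_recording,
                                                 next_recording, pts, None)
    return nxt[status.ravel() == 1]  # keep successfully tracked points only
```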
  • Preferably, the comparison of the at least one image-side limit with the standard representation of the at least one limit is carried out using a deep learning method, preferably with a neural network.
  • The object is also achieved by creating a control device that is set up to carry out a method according to the invention or a method according to one or more of the embodiments described above.
  • The control device is preferably designed as a computing device, particularly preferably as a computer, or as a control unit, in particular as a control unit of a motor vehicle.
  • The object is also achieved by creating a detection device which has an illumination device, an optical sensor and a control device according to the invention or a control device according to one or more of the exemplary embodiments described above.
  • The control device is preferably operatively connected to the lighting device and the optical sensor and set up for their respective control.
  • The object is also achieved by creating a motor vehicle with a detection device according to the invention or a detection device according to one or more of the exemplary embodiments described above.
  • Preferably, the motor vehicle is designed as a truck.
  • It is, however, also possible for the motor vehicle to be a passenger car, a commercial vehicle, or another motor vehicle.
  • The invention is explained in more detail below with reference to the drawing.
  • FIG. 1 shows a schematic representation of an exemplary embodiment of a motor vehicle with an exemplary embodiment of a detection device
  • FIG. 2 shows a schematic representation of a first and a second example of a recording with a visualization of a distant boundary and a near boundary of a first visible distance range
  • FIG. 3 shows a schematic representation of a third example of a recording with a visualization of the distant boundary and the near boundary of a second visible distance range
  • FIG. 4 shows a schematic representation of a fourth example of a recording
  • FIG. 5 shows a schematic representation of a fifth example of a recording and an example of an associated column histogram
  • FIG. 6 shows a schematic representation of an example for estimating a height of a first object and a second object.
  • FIG. 1 shows a schematic representation of an exemplary embodiment of a motor vehicle 1 with an exemplary embodiment of a detection device 3.
  • The detection device 3 has an illumination device 5, an optical sensor 7, in particular a camera, and a control device 9.
  • The control device 9 is operatively connected, in a manner not explicitly shown, to the lighting device 5 and the optical sensor 7 and is set up to control each of them.
  • The illumination device 5 preferably has at least one surface emitter, in particular a VCSEL.
  • FIG. 1 shows an illumination frustum 11 of the illumination device 5 and an observation area 13 of the optical sensor 7.
  • An object 19 is arranged at a distant boundary 17.1 of a visible distance range 15.
  • The control device 9 is set up in particular to carry out an embodiment of a method, described in more detail below, for detecting the object 19 using the illumination device 5 and the optical sensor 7.
  • The activation of the lighting device 5 and of the optical sensor 7 is coordinated in time, and the visible distance range 15 is assigned to this coordinated activation.
  • At least one image-side boundary 17', in particular the image-side distant boundary 17.1' or the image-side near boundary 17.2', of the visible distance range 15 is compared with a predetermined standard representation of the at least one boundary 17 of the visible distance range 15.
  • The object 19 is searched for at the at least one boundary 17 based on the comparison.
  • Preferably, a series of recordings 21 is created if the object 19 is detected at the distant boundary 17.1 of the visible distance range 15. Furthermore, the near boundary 17.2 of the visible distance range 15 is evaluated on the image side, the image-side near boundary 17.2' of the visible distance range 15 being compared with a predetermined standard representation of the near boundary 17.2 of the visible distance range 15. Based on the comparison, the object 19 previously detected at the distant boundary 17.1 is searched for. The series of recordings 21 ends when the object 19 is also detected at the near boundary 17.2 of the visible distance range 15.
  • Object tracking is preferably carried out for the detected object 19.
  • In a recording 21 of the optical sensor 7, an image-side line 23 is determined as the image-side boundary 17' of the visible distance range 15.
  • The image-side line 23 runs between an exposed area 25 and an unexposed area 27.
  • The object 19 is detected based on a deviation of the image-side line 23 from the horizontal course of the standard representation.
  • FIG. 2a shows a schematic representation of a first example of the recording 21.
  • An image-side object 19', which is bright and/or has good reflection properties, is arranged at the distant boundary 17.1 of the visible distance range 15.
  • An image-side line 23.1 shows the image-side distant boundary 17.1' of the visible distance range 15.
  • An image-side line 23.2 shows the image-side near boundary 17.2' of the visible distance range 15.
  • The predetermined standard representation of both the distant boundary 17.1 and the near boundary 17.2 is a line with a horizontal course. Due to the bright color and/or the good reflection properties of the object 19, the image-side line 23.1 bulges upward in the area of the image-side object 19', as a result of which the exposed area 25 is enlarged. Based on this deviation of the image-side line 23.1, in particular the bulge, the object 19 is detected at the distant boundary 17.1 of the visible distance range 15.
  • FIG. 2b shows a schematic representation of a second example of the recording 21.
  • An image-side object 19', which is dark and/or has poor reflection properties, is arranged at the distant boundary 17.1 of the visible distance range 15. Due to the dark color and/or the poor reflection properties of the object 19, the image-side line 23.1 bulges downward in the area of the image-side object 19', as a result of which the unexposed area 27.1 is enlarged. Based on this deviation of the image-side line 23.1, in particular the bulge, the object 19 is detected at the distant boundary 17.1 of the visible distance range 15.
  • FIG. 3 shows a schematic representation of a third example of the recording 21.
  • An image-side object 19', which is dark and/or has poor reflection properties, is arranged at the near boundary 17.2 of the visible distance range 15. Due to the dark color and/or the poor reflection properties and the shadow cast by the object 19, the image-side line 23.2 bulges upward in the area of the image-side object 19', as a result of which the unexposed area 27.2 is enlarged. Based on this deviation of the image-side line 23.2, in particular the bulge, the object 19 is detected at the near boundary 17.2 of the visible distance range 15.
  • FIG. 4 shows a schematic representation of a fourth example of the recording 21.
  • The unexposed areas 27.1 and 27.2 are optically connected by the image-side object 19', which is dark and/or has poor reflection properties.
  • The object 19 is thus detected simultaneously both at the distant boundary 17.1 of the visible distance range 15 and at the near boundary 17.2 of the visible distance range 15, and it is inferred that the object 19 cannot be driven over.
  • FIG. 5 shows a schematic representation of a fifth example of the recording 21 and an example of an associated column histogram 29.
  • FIG. 5a shows a section of the recording 21 with the image-side distant boundary 17.1' of the visible distance range 15. Furthermore, the object 19 is arranged at the distant boundary 17.1 of the visible distance range 15 and is shown in the recording 21 as the image-side object 19'.
  • An image-side evaluation area 31 is determined for detecting the object 19.
  • The image-side evaluation area 31 is determined in such a way that the image-side distant boundary 17.1' is contained in the evaluation area 31.
  • The column histogram 29 shown in FIG. 5b is created over all pixels of the optical sensor 7 assigned to the evaluation area 31 by summing, for each image column of the evaluation area 31, the illumination intensities of the assigned pixels.
  • The object 19 is detected based on a deviation of the column histogram 29 from a horizontal course.
  • Here the object 19 is bright and/or has good reflection properties, so the column histogram 29 shows a clear upward deviation from a horizontal course.
  • The object 19 is detected on the basis of this deviation of the column histogram 29 from the horizontal.
  • FIG. 6 shows a schematic representation of an example for estimating the height of a first object 19.1 and of a second object 19.2. Both the first object 19.1 and the second object 19.2 are at an identical distance x_O from the illumination device 5. As illustrated by means of a light beam 35, both objects cast a shadow 37 of identical extent x_S. Using a suitable method, both the distance x_O and the image-side vertical extent are determined; the extent x_S of the shadow 37 is calculated from the image-side vertical extent.
  • A height z_O is estimated for both objects 19 using formula (1), the estimated height z_O being identical for the two objects.
  • The estimated height z_O is slightly greater than the actual height z_O2 of the second object 19.2.
  • The estimated height z_O is much greater than the actual height z_O1 of the first object 19.1. It is thus clearly evident from FIG. 6 that the height of an object 19 is never underestimated when formula (1) is used; a numeric illustration follows below.
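A small numeric illustration of this behavior, using formula (1) as reconstructed above; all values are hypothetical and chosen only to mirror FIG. 6:

```python
# Hypothetical values: illumination height, object distance, shadow extent.
z_B = 2.0    # m, height of the illumination device (assumed)
x_O = 100.0  # m, distance to both objects (assumed)
x_S = 25.0   # m, extent of the shadow cast by both objects (assumed)

# Formula (1), as reconstructed above.
z_O = z_B * x_S / (x_O + x_S)
print(f"estimated height: {z_O:.2f} m")  # -> estimated height: 0.40 m

# A flat first object of true height 0.10 m and a taller second object of
# true height 0.38 m would both produce this estimate of 0.40 m: the
# estimate never falls below the actual height.
```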

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Evolutionary Computation (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)
EP21789679.4A 2020-11-09 2021-10-07 Verfahren zum detektieren eines objekts mittels einer beleuchtungseinrichtung und eines optischen sensors, steuereinrichtung zur durchführung eines solchen verfahrens, detektionsvorrichtung mit einer solchen steuereinrichtung und kraftfahrzeug mit einer solchen detektionsvorrichtung Pending EP4241114A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102020006880.4A DE102020006880A1 (de) 2020-11-09 2020-11-09 Verfahren zum Detektieren eines Objekts mittels einer Beleuchtungseinrichtung und eines optischen Sensors, Steuereinrichtung zur Durchführung eines solchen Verfahrens, Detektionsvorrichtung mit einer solchen Steuereinrichtung und Kraftfahrzeug mit einer solchen Detektionsvorrichtung
PCT/EP2021/077661 WO2022096215A1 (de) 2020-11-09 2021-10-07 Verfahren zum detektieren eines objekts mittels einer beleuchtungseinrichtung und eines optischen sensors, steuereinrichtung zur durchführung eines solchen verfahrens, detektionsvorrichtung mit einer solchen steuereinrichtung und kraftfahrzeug mit einer solchen detektionsvorrichtung

Publications (1)

Publication Number Publication Date
EP4241114A1 true EP4241114A1 (de) 2023-09-13

Family

ID=74091972

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21789679.4A Pending EP4241114A1 (de) 2020-11-09 2021-10-07 Verfahren zum detektieren eines objekts mittels einer beleuchtungseinrichtung und eines optischen sensors, steuereinrichtung zur durchführung eines solchen verfahrens, detektionsvorrichtung mit einer solchen steuereinrichtung und kraftfahrzeug mit einer solchen detektionsvorrichtung

Country Status (5)

Country Link
US (1) US20240012144A1
EP (1) EP4241114A1
CN (1) CN116529633A
DE (1) DE102020006880A1
WO (1) WO2022096215A1

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021001175A1 (de) * 2021-03-05 2022-09-08 Daimler Truck AG Verfahren zum Betreiben einer Gated-Kamera, Steuereinrichtung zur Durchführung eines solchen Verfahrens, Kameravorrichtung mit einer solchen Steuereinrichtung und Kraftfahrzeug mit einer solchen Kameravorrichtung
DE102021002915A1 (de) * 2021-06-07 2022-12-08 Daimler Truck AG Verfahren zum Betreiben einer Gated-Kamera, Steuervorrichtung zur Durchführung eines solchen Verfahrens, Sichtweitenmessvorrichtung mit einer solchen Steuervorrichtung und Kraftfahrzeug mit einer solchen Sichtweitenmessvorrichtung

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6189116B2 (ja) * 2013-07-09 2017-08-30 ソニーセミコンダクタソリューションズ株式会社 画像処理装置、画像処理方法およびプログラム
IL239919A (en) 2015-07-14 2016-11-30 Brightway Vision Ltd Branded template lighting
DE102020002994B4 (de) 2020-05-19 2023-03-30 Daimler Truck AG Verfahren zur Messung eines Abstandes zwischen einem Objekt und einem optischen Sensor, Steuereinrichtung zur Durchführung eines solchen Verfahrens, Abstandsmessvorrichtung mit einer solchen Steuereinrichtung und Kraftfahrzeug mit einer solchen Abstandsmessvorrichtung

Also Published As

Publication number Publication date
DE102020006880A1 (de) 2021-01-14
WO2022096215A1 (de) 2022-05-12
CN116529633A (zh) 2023-08-01
US20240012144A1 (en) 2024-01-11

Similar Documents

Publication Publication Date Title
DE4110132C2 (de) Abstandsteuergerät für ein Fahrzeug
EP1298454B1 (de) Verfahren zur Erkennung und Verfolgung von Objekten
DE102020002994B4 (de) Verfahren zur Messung eines Abstandes zwischen einem Objekt und einem optischen Sensor, Steuereinrichtung zur Durchführung eines solchen Verfahrens, Abstandsmessvorrichtung mit einer solchen Steuereinrichtung und Kraftfahrzeug mit einer solchen Abstandsmessvorrichtung
EP1557693B1 (de) Verfahren zur Verfolgung von Objekten
WO2022096215A1 (de) Verfahren zum detektieren eines objekts mittels einer beleuchtungseinrichtung und eines optischen sensors, steuereinrichtung zur durchführung eines solchen verfahrens, detektionsvorrichtung mit einer solchen steuereinrichtung und kraftfahrzeug mit einer solchen detektionsvorrichtung
DE4107177C2 (de) Fahrzeug/Fahrzeug-Entfernungsmeßvorrichtung
EP1554606A1 (de) Verfahren zur erkennung und verfolgung von objekten
EP1589484A1 (de) Verfahren zur Erkennung und/oder Verfolgung von Objekten
WO2021239323A1 (de) Verfahren zur erkennung von bildartefakten, steuereinrichtung zur durchführung eines solchen verfahrens, erkennungsvorrichtung mit einer solchen steuereinrichtung und kraftfahrzeug mit einer solchen erkennungsvorrichtung
DE102020004690A1 (de) Verfahren zur Erkennung von Objekten, Steuereinrichtung zur Durchführung eines solchen Verfahrens, Erkennungsvorrichtung mit einer solchen Steuereinrichtung und Kraftfahrzeug mit einer solchen Erkennungsvorrichtung
EP3715779B1 (de) Verfahren und vorrichtung zur bestimmung von verformungen an einem objekt
EP3663881B1 (de) Verfahren zur steuerung eines autonomen fahrzeugs auf der grundlage von geschätzten bewegungsvektoren
DE102014204423B4 (de) Lichtlaufzeitkamerasystem
DE102020007061B4 (de) Verfahren zum Betreiben einer ersten Beleuchtungseinrichtung, einer zweiten Beleuchtungseinrichtung und eines optischen Sensors, Steuereinrichtung zur Durchführung eines solchen Verfahrens, Gated-Kamera-Vorrichtung mit einer solchen Steuereinrichtung und Kraftfahrzeug mit einer solchen Gated-Kamera-Vorrichtung
WO2022167343A1 (de) Verfahren zum kalibrieren einer gated-kamera, steuereinrichtung zur durchführung eines solchen verfahrens, kalibrierungsvorrichtung mit einer solchen steuereinrichtung und kraftfahrzeug mit einer solchen kalibrierungsvorrichtung
EP3663800B1 (de) Verfahren zur objekterfassung mit einer 3d-kamera
DE102021004521B4 (de) Gated-Kamera-Vorrichtung und Kraftfahrzeug mit einer solchen Gated-Kamera-Vorrichtung
DE102021003153A1 (de) Verfahren zum Betreiben einer Gated-Kamera, Steuervorrichtung zur Durchführung eines solchen Verfahrens, Schulterblickvorrichtung mit einer solchen Steuervorrichtung und Kraftfahrzeug mit einer solchen Schulterblickvorrichtung
EP4127767B1 (de) Objekterkennung durch ein aktives optisches sensorsystem
DE102020005762B4 (de) Verfahren zum Kalibrieren einer Beleuchtungseinrichtung und eines optischen Sensors, Steuereinrichtung zur Durchführung eines solchen Verfahrens, Kalibrierungsvorrichtung mit einer solchen Steuereinrichtung sowie Kraftfahrzeug mit einer solchen Kalibrierungsvorrichtung
DE10148063A1 (de) Verfahren zur Erkennung und Verfolgung von Objekten
WO2022258527A1 (de) Verfahren zum betreiben einer gated-kamera, steuervorrichtung zur durchführung eines solchen verfahrens, sichtweitenmessvorrichtung mit einer solchen steuervorrichtung und kraftfahrzeug mit einer solchen sichtweitenmessvorrichtung
DE102021003728A1 (de) Verfahren zum Betreiben einer Gated-Kamera, Steuervorrichtung zur Durchführung eines solchen Verfahrens, Gated-Kamera-Vorrichtung mit einer solchen Steuervorrichtung und Kraftfahrzeug mit einer solchen Gated-Kamera-Vorrichtung
DE102019134321A1 (de) Auf LIDAR-Datenwolken basierende Segmentierung und Klassifizierung von Fahrzeugen
WO2022184738A1 (de) Verfahren zum betreiben einer gated-kamera, steuereinrichtung zur durchführung eines solchen verfahrens, kameravorrichtung mit einer solchen steuereinrichtung und kraftfahrzeug mit einer solchen kameravorrichtung

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20230324

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)