EP3724677A1 - 3D sensor system with scene illumination dependent on a solid angle - Google Patents

3D sensor system with scene illumination dependent on a solid angle

Info

Publication number
EP3724677A1
EP3724677A1
Authority
EP
European Patent Office
Prior art keywords
light
scene
sensor system
illumination
illumination light
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP18826220.8A
Other languages
German (de)
English (en)
Inventor
Urs Hunziker
Johannes Eckstein
Beat Wyss
Christian Seiler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bircher Reglomat AG
Original Assignee
Bircher Reglomat AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bircher Reglomat AG filed Critical Bircher Reglomat AG
Publication of EP3724677A1
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to group G01S17/00
    • G01S7/481 Constructional features, e.g. arrangements of optical elements
    • G01S7/4811 Constructional features common to transmitter and receiver
    • G01S7/4813 Housing arrangements
    • G01S7/4815 Constructional features of transmitters alone, using multiple transmitters
    • G01S7/4802 Using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/483 Details of pulse systems
    • G01S7/486 Receivers
    • G01S7/4861 Circuits for detection, sampling, integration or read-out
    • G01S7/4863 Detector arrays, e.g. charge-transfer gates
    • G01S7/4868 Controlling received signal intensity or exposure of sensor
    • G01S7/491 Details of non-pulse systems
    • G01S7/4912 Receivers
    • G01S7/4913 Circuits for detection, sampling, integration or read-out
    • G01S7/4914 Circuits for detection, sampling, integration or read-out of detector arrays, e.g. charge-transfer gates
    • G01S7/4918 Controlling received signal intensity, gain or exposure of sensor
    • G01S7/499 Using polarisation effects
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/08 Systems determining position data of a target, for measuring distance only
    • G01S17/10 Systems for measuring distance only using transmission of interrupted, pulse-modulated waves
    • G01S17/32 Systems for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/88 Lidar systems specially adapted for specific applications

Definitions

  • The present invention relates to a sensor system and a method for the three-dimensional acquisition of a scene based on transit time measurements. Furthermore, the present invention relates to several uses of such a sensor system.
  • Actuator-operated closure bodies are used which facilitate the handling of the respective closure body for operators, or which are operated automatically without any action, for example when an object that is to pass through the opening enters the region of the opening.
  • Such an opening may, for example, be a passageway in a building.
  • a closure body may be, for example, a door or a gate.
  • Such a sensor system may cause an automatic opening of the closure body and thus of the opening.
  • A sensor system, or a data processing device connected downstream of such a sensor system, uses known methods of image processing for this purpose, for example face recognition.
  • A 3D sensor system for the field of application of monitoring automatically opening doors and/or gates is based on the principle of transit time measurement of light beams which are emitted by illumination sources and, after an at least partial reflection or 180° backscatter, are detected by a light receiver.
  • Such sensor systems are commonly referred to as "time-of-flight” (TOF) sensor systems.
  • TOF sensor systems have the disadvantage that the intensity of the (backscattered) measurement light to be detected by a light receiver of the TOF sensor is weakened in two respects. First, in the case of a point-like illumination light source without special focusing, the attenuation of the illumination light emitted by the illumination sources scales with 1/d², where d is the distance to the illumination light source. Second, the illuminated object, which scatters the illumination light isotropically, is itself perceived as a point light source, so that the backscattered measurement light is attenuated by a further factor of 1/d². Overall, this leads to a 1/d⁴ scaling of the intensity of the measurement light arriving at the light receiver.
  • With beam shaping, for example a focusing of the illumination light, the intensity attenuation is correspondingly lower, but it still contributes to a significant loss of light output. This in turn leads to a correspondingly poor energy efficiency of a TOF sensor system. The scaling is illustrated by the sketch below.
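A minimal numeric sketch of this scaling (not part of the patent text; the function names and values are illustrative only): for an unfocused point source and an isotropically scattering object, the received signal falls as 1/d⁴, so equalizing the received intensity requires per-solid-angle transmit power growing as d⁴.

```python
import numpy as np

def received_intensity(p_tx, d):
    """Relative received intensity for transmit power p_tx at distance d (a.u.)."""
    return p_tx / d**4

def equalizing_power(d, d_ref=1.0, p_ref=1.0):
    """Per-solid-angle transmit power that keeps the received intensity equal to
    that of a reference target at distance d_ref illuminated with power p_ref."""
    return p_ref * (d / d_ref) ** 4

distances = np.array([1.0, 2.0, 4.0])       # hypothetical target distances in m
p = equalizing_power(distances)             # required relative powers: 1, 16, 256
print(received_intensity(p, distances))     # -> [1. 1. 1.], i.e. equalized
```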
  • The sensor system includes (a) an illumination device for illuminating the scene with illumination light, (b) a measuring device (b1) for receiving measurement light, which is illumination light at least partially backscattered by at least one object contained in the scene, and (b2) for measuring distances between the sensor system and the at least one object based on a light transit time of the illumination light and the measurement light, and (c) a data processing device, connected downstream of the measuring device, for determining the three-dimensional characteristic of the scene from the measured distances.
  • the illumination device is configured such that an illumination intensity of the illumination light depends on the solid angle of the beam path of the illumination light, so that a distance-based intensity loss of the illumination light and the measurement light is at least partially compensated.
  • The described sensor system, which is a so-called time-of-flight (TOF) sensor system, is based on the insight that an illumination of the scene that is variable in its intensity makes an optimal lighting achievable, with the result that the measurement light received from all subregions of the scene is at least approximately equal in intensity. As a result, underexposed and/or overexposed subregions in an image of the detected scene can be avoided.
  • The characteristic of the illumination light can thus be adjusted so that the intensity is always exactly as high as is required for a reliable detection of the subregion of the scene assigned to the respective solid angle range. The lighting then requires only as much energy as necessary, so that the described sensor system is characterized by a good energy efficiency.
  • The criteria for a reliable detection may vary over time or with the solid angle. For example, a particularly reliable detection may be required in the region of the closing edge of a closing sliding door, and it can be ensured there by a suitable solid-angle-dependent illumination intensity.
  • illumination light in this document means those electromagnetic waves which are emitted by a light source of the illumination device and strike the relevant object of the scene.
  • The "measurement light" is the electromagnetic waves backscattered by or at the object, which are received by the measuring device or a light receiver of the measuring device.
  • the term "scene” may in particular be understood to mean that spatial area which is optically detected by the sensor system. Objects in the scene are recognized by a suitable image analysis.
  • the data processing device can make use of known methods for image evaluation and / or image analysis.
  • The data processing device can therefore be, or contain, a special processor for image processing.
  • An object can be understood as any spatial physical structure which has a surface texture that leads to an at least partial reflection or scattering of illumination light and is therefore visible to the measuring device through the resulting measurement light.
  • the object may be an object such as a motor vehicle or a living being such as a human.
  • The object may be static or at rest in relation to the sensor system. Furthermore, the object may also move within the scene, leave it, or enter it.
  • The characteristic of a scene can be understood as the entirety of all spatial structures which are detected by the sensor system. In this case, the data processing device can recognize some structures as relevant and other structures as less relevant or even irrelevant by means of image processing and/or image recognition.
  • The "distance-based intensity loss" can be understood as that intensity loss which is caused by the spatial propagation of the illumination light beams and of the measurement light beams. Without beam shaping, this loss scales with 1/d⁴, where d is the distance between the sensor system and the object. With beam shaping, for example a focusing of the illumination light and/or of the measurement light, and/or with a non-isotropic scattering of the illumination light with a preferred emission of the measurement light in the direction of the measuring device, the "distance-based intensity loss" is correspondingly lower, but in practice it still represents a significant loss which reduces the energy efficiency of a TOF sensor system. According to the invention, these losses are at least partially reduced or compensated by a suitable solid-angle-dependent intensity of the illumination light.
  • The terms "optical" and/or "light" may refer to electromagnetic waves having a particular wavelength or frequency or a particular spectrum of wavelengths or frequencies.
  • the electromagnetic waves used can be assigned to the spectral range visible to the human eye.
  • electromagnetic waves associated with the ultraviolet (UV) or infrared (IR) spectral regions may also be used.
  • The IR spectral range can extend into the long-wave IR range with wavelengths between 3.5 µm and 15 µm, which can be detected by means of the light receiver of the sensor system.
  • A solid-angle-dependent intensity distribution of the illumination light is not only possible for TOF sensor systems that illuminate the whole scene, or at least larger portions of the scene, simultaneously. The compensation according to the invention can also be used profitably in TOF sensor systems which scan the scene sequentially with an illumination light beam, for example a laser beam.
  • According to an embodiment, the illumination device is configured to provide the illumination light with a spatial intensity distribution which at least approximately compensates for an edge light falloff.
  • The edge light falloff is the natural vignetting, according to which the brightness in an image of a uniformly bright subject imaged through a lens decreases, relative to the brightness in the center of the image, by a factor of cos⁴(θ), where θ is the field angle of the lens that can be used in the measuring device.
  • The compensation of the natural edge light falloff described here contributes greatly to the improvement of the light intensity ratios.
  • The compensation of the natural edge light falloff by a suitable solid-angle-dependent distribution of the illumination intensity may be at least 30%, preferably 50%, more preferably 80%, and even more preferably 90% or even 95% (see the sketch below).
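A small sketch of the cos⁴ falloff and an illumination gain that counteracts it. Note that the linear blend used here to model "X% compensation" is an assumption made for illustration, not a definition taken from the patent.

```python
import numpy as np

theta = np.deg2rad([0.0, 15.0, 30.0, 45.0])   # field angles
falloff = np.cos(theta) ** 4                  # brightness relative to image center
gain = 1.0 / falloff                          # illumination gain for full compensation

for alpha in (0.3, 0.5, 0.8, 0.95):           # compensation levels named above
    residual = falloff * (1.0 + alpha * (gain - 1.0))
    print(alpha, np.round(residual, 3))       # approaches 1.0 as alpha -> 1
```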
  • According to a further embodiment, an illumination light control device is configured to control the illumination device such that a characteristic of the illumination light, which describes the dependence of the illumination intensity on the solid angle, can be changed dynamically during an operation of the sensor system.
  • With such dynamically changeable illumination light, one and the same scene can be recorded several times under different illumination conditions.
  • Different data sets of one and the same scene are then available to the data processing device, so that by means of a suitable method of image analysis (by the data processing device) that data set which reproduces the scene most accurately can be used for determining the three-dimensional characteristic of the scene.
  • An optimal lighting characteristic can also be determined according to the "trial-and-error" principle or by other statistical optimization procedures. This can be done dynamically during a real operation of the sensor system or as part of a calibration by means of a detection of suitable reference objects.
  • 3D images of the scene recorded with different lighting characteristics can also be processed together, so that a comprehensive data set is available for a final determination of the 3D characteristic of the scene.
  • In particular, different subregions of the scene can be handled such that, for a first subregion, a first partial data set recorded with a first illumination characteristic and, for a second subregion, a second partial data set recorded with a second illumination characteristic are used for the determination of the three-dimensional characteristic of the scene.
  • According to a further embodiment, the data processing device is coupled to the illumination light control device and configured to evaluate the determined three-dimensional characteristic of the scene and, based on a result of this evaluation, to change the characteristic of the illumination light. Put simply, the solid-angle-dependent way in which the scene is illuminated by the illumination device for a scene detection depends on measurement and evaluation results which have been determined from a previous scene detection. The characteristic of the lighting is thus adapted dynamically on the basis of the results of a preceding scene evaluation.
  • An appropriate control of the lighting device may also depend on current environmental conditions which influence the result of the scene evaluation.
  • Such environmental conditions may be weather conditions such as the presence of rain, snow, hail, fog, smoke, suspended particles, etc. in the scene.
  • According to a further embodiment, the result of the evaluation depends on the optical scattering behavior of at least one object contained in the scene. This has the advantage that, in addition to the distance-based compensation, an optionally existing different scattering behavior and/or reflection behavior of different objects in the scene is also taken into account, so that the measurement light impinges on a light receiver of the measuring device with an at least approximately spatially uniform intensity distribution.
  • a brightness which is as uniform as possible over the light-sensitive surface of the light receiver favors precise distance measurement by the described TOF sensor system.
  • According to a further embodiment, the illumination device is configured to control the characteristic of the illumination light in response to an external control signal.
  • The control signals can be received via a corresponding data input of the sensor system.
  • the control of the illumination device can then be carried out by the above-described illumination light control unit of the sensor system.
  • A control or adaptation of the illumination characteristic therefore need not (only) depend on the information generated in the context of the TOF measurement or on the results of the scene evaluation. According to the embodiment described here, a suitable adaptation of the illumination characteristic can also be triggered by external control signals.
  • External control signals can be indicative of all features and/or states which have an influence on the backscattering behavior of the scene to be detected.
  • Such a feature is, for example, a (color) change of an object to be detected and/or an object newly entering or leaving the scene, which changes the backscattering behavior of the scene.
  • According to a further embodiment, the characteristic of the illumination light is determined by at least one of the following features: (a) wavelength, (b) spectral intensity distribution, (c) polarization direction, and (d) intensity distribution for different polarization directions.
  • The wavelength, frequency or color of the illumination light can be varied, or adapted to the scene, by a suitable control of spectrally different light elements, in particular LEDs with different colors.
  • The polarization direction of the illumination light can be set in a known manner, for example by means of polarization filters.
  • According to a further embodiment, the illumination device comprises at least one of the following: an illumination light source for spatially scanning the scene with an illumination light beam; a plurality of illumination light sources which are in particular individually controllable and are each assigned to a specific solid angle region of the scene; and/or a planar illumination light source, in particular with a luminous intensity which is not homogeneous over its surface.
  • A laser beam scanning the scene can be directed in a known manner to each point of the scene to be illuminated via two rotatable mirrors whose axes of rotation are not parallel, and are preferably perpendicular, to one another.
  • For a (dynamically adaptive) deflection, non-mechanical optical elements such as diffractive optical elements (DOEs) can also be used.
  • The deflection can in particular be controlled by the illumination light control device described above.
  • The at least approximately point-like illumination light source may be a (sufficiently strong) semiconductor diode, for example a laser diode or a light-emitting diode.
  • For beam shaping, in particular lens systems can be used. Furthermore, suitable optical elements for beam deflection, beam splitting and/or beam merging can be used. DOEs can also be used advantageously.
  • The plurality of illumination light sources, which are likewise in particular laser diodes or light-emitting diodes, can be controlled (in particular individually) by the illumination light control device described above. This advantageously allows an adaptively controlled, or even regulated, adjustment of the characteristic of the illumination light.
  • A planar light source can also be the source for a spatially dependent, non-homogeneous intensity distribution. If it is a spatially homogeneously luminous surface, suitable optical elements for beam deflection, beam splitting, beam merging and/or beam shaping can be used to realize the described solid-angle-dependent, uneven illumination of the scene.
  • According to a further embodiment, the illumination device comprises at least one diffractive or refractive optical element which is configured to shape the solid-angle-dependent intensity distribution of the illumination light.
  • The term diffraction, as used in this context, generally designates the spatial deflection of an electromagnetic wave at structural obstacles. Such obstacles may be an edge, a hole, or a one-dimensional, two-dimensional or even three-dimensional grating.
  • The diffractive optical element can, for example and preferably, be a DOE, which advantageously allows a dynamic adaptation of the illumination characteristic during the operation of the sensor system.
  • Refraction refers to the change in the propagation direction of a wave due to a spatial change in its propagation velocity, which for light waves is described by the refractive index n of a medium.
  • According to a further embodiment, the illumination device is configured to provide the illumination light with a beam cross section deviating from a circular shape. This has the advantage that, for scenes which are "not round", an inadequate illumination of corner areas of the scene can be avoided.
  • According to a further embodiment, the illumination light has a rectangular beam cross section.
  • Preferably, the beam cross section is adapted to the shape of the scene to be detected in order to achieve an illumination that is as homogeneous as possible.
  • A suitable shape of the beam cross section can be realized not only by a corresponding shaping of the luminous area of the illumination device; the beam cross section can also be adjusted in a suitable manner by optical components such as mirrors and refractive optical elements (e.g. lens systems). Diffractive optical elements (DOEs) can also be used, which optionally even allow a dynamic and/or scene-dependent shaping of the beam cross section.
  • DOEs diffractive optical elements
  • DLP digital light processing
  • MEMS microelectromechanical systems
  • According to a further embodiment, the measuring device comprises a light receiver with a plurality of pixels for receiving the measurement light, and a light receiver control device coupled to the light receiver, wherein the light receiver control device and the light receiver are configured such that, in a modified operation of the sensor system, at least two pixels of the plurality of pixels are combined to form a higher-level pixel.
  • the plurality of pixels are grouped together so that a certain number of pixels are combined into a higher-level pixel.
  • The certain number may be, for example, two, three, four, six, eight or nine, with two, four, eight and nine being preferred. Of course, an even stronger aggregation of pixels is possible.
  • Such an aggregation of pixels, also referred to as "binning", has the effect that the number of photons of the measurement light collected by one pixel during a scene acquisition is increased in proportion to the number of pixels summed into the higher-level pixel, at the expense of spatial resolution.
  • The binning reduces the statistical photon noise, which is particularly relevant in the case of weak measurement light, and thereby improves the accuracy of the scene evaluation.
  • a binning is therefore particularly advantageous in the case of a weak measuring light when high spatial resolution is not required.
  • Binning can also be carried out locally, in only at least one subregion of the active area of the light receiver. Although this leads to an inhomogeneous spatial resolution, which is not necessarily desired, the disadvantage of such an inhomogeneous spatial resolution is overcompensated in many applications by the increased photon accumulation.
  • A local "binning" can be carried out, at least in some known light receivers, without special electronic or apparatus elements simply by a corresponding control of the light receiver, which control determines the "binning" and thus the operating mode of the sensor system.
  • A local "binning" is preferably performed in such a way that exactly those areas of the light receiver which, as measured by the measuring device and/or learned by the data processing device, have received too little light energy in at least one previous scene detection are combined into higher-level pixels in a suitable manner in subsequent scene captures by a suitable control of the light receiver by the light receiver control device.
  • Such a dynamically controlled or regulated "binning" can be carried out (or learned) during a normal operation of the sensor system and/or during the configuration of the sensor system, for example in the context of a (first) installation, a maintenance, or a cyclic or event-related recalibration.
  • In the case of an asymmetric aggregation of pixels, the spatial resolution of the light receiver will differ along different directions even if the individual pixels have a square shape. This can be exploited in some cases in an advantageous manner.
  • Such an application is present, for example, when a movement of an object of the scene along a previously known spatial direction is to be detected with high accuracy.
  • In that case, the number of individual pixels combined along this known spatial direction may be smaller than the number of pixels combined along a line perpendicular thereto. The spatial resolution along the direction of motion is then greater than the spatial resolution perpendicular to the direction of movement, and the motion profile of such a linearly moving object can be detected with a particularly high accuracy even at a low measurement light intensity.
  • The described binning can also be activated adaptively (automatically) in response to at least one previously acquired (and evaluated) scene characteristic. This means that the "binning" is not only controlled by the light receiver control device but is regulated depending on the results obtained by a scene evaluation. This allows a particularly reliable scene detection even with weak measurement light, so that the described sensor system can be operated with a correspondingly weak illumination light and thus in an energy-efficient manner. A minimal sketch of such a pixel aggregation follows below.
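A minimal sketch of k×k binning (not part of the patent; the array sizes and the Poisson photon model are assumptions made for illustration). Accumulating 4 pixels roughly doubles the shot-noise-limited SNR, since SNR scales with the square root of the photon count.

```python
import numpy as np

def bin_pixels(frame: np.ndarray, k: int) -> np.ndarray:
    """Sum k x k blocks of photon counts into one higher-level pixel.
    The frame dimensions are assumed to be divisible by k."""
    h, w = frame.shape
    return frame.reshape(h // k, k, w // k, k).sum(axis=(1, 3))

rng = np.random.default_rng(0)
counts = rng.poisson(lam=25.0, size=(8, 8))   # weak measurement light, shot noise
binned = bin_pixels(counts, 2)                # 4 x 4 image, ~100 photons per pixel

# For shot-noise-limited detection, SNR ~ sqrt(N): 2 x 2 binning accumulates
# roughly 4 times the photons and thus about doubles the per-pixel SNR.
print(counts.mean(), binned.mean())
```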
  • The light receiver has a light-sensitive surface which is subdivided into a multiplicity of pixels. On each pixel, those photons of the measurement light are accumulated which originate from a certain solid angle region or from the associated subregion of the scene.
  • The measuring unit is used to determine, for each pixel, the transit time of the associated light beams of the illumination light and of the measurement light.
  • According to a further embodiment, the sensor system comprises a holder to which the measuring device is mechanically coupled, wherein the holder is designed such that the sensor system can be attached to a holding structure which is stationary in relation to the scene to be detected.
  • the holder ensures that the described sensor system can be a stationary system which has a certain spatially fixed detection range and thus always monitors the same scene.
  • Spatially stationary objects that are present in the scene can be detected in an image analysis and masked out, with respect to movement profiles, in a further image analysis (see the sketch below).
  • In this way, computing power can be saved and the energy efficiency of the described sensor system can be improved.
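A hypothetical sketch of such a static-background mask (the variance threshold, array sizes and function name are illustrative, not from the patent): pixels whose measured distance barely changes over recent frames are treated as static and excluded from motion analysis.

```python
import numpy as np

def motion_mask(depth_frames: np.ndarray, threshold_m: float = 0.05) -> np.ndarray:
    """depth_frames: (T, H, W) stack of distance images in metres.
    Returns a boolean (H, W) mask that is True where motion analysis is needed."""
    return depth_frames.std(axis=0) > threshold_m

frames = np.stack([np.full((4, 4), 3.0)] * 10)   # a perfectly static 3 m scene
frames[5:, 1, 2] = 2.0                           # one pixel changes its distance
print(motion_mask(frames))                       # True only at pixel (1, 2)
```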
  • The stationary support structure may be directly or indirectly mechanically coupled to a device for controlling, by means of at least one closure body, the coverage characteristic of an opening to be passed through by the object.
  • In addition to a suitable guide or suspension for the closure body, this device comprises an actuator, in particular for moving the closure body between a closed position and an open position (and vice versa).
  • the opening may be an entrance, for example, for a person or a vehicle.
  • the closing body may be a door, for example a front door or a garage door.
  • the stationary support structure may be, for example, the stationary frame structure of an entrance, for example the frame of a door.
  • According to a further embodiment, the data processing device is further configured such that a coverage characteristic, provided by at least one closure body, of an opening to be passed by an object can be controlled.
  • The opening may be, for example, an entrance (or an exit) of a building.
  • the closing body can be moved automatically between an open position and a closed position.
  • According to a further aspect, a method of three-dimensionally capturing a scene comprises (a) illuminating the scene with illumination light by means of an illumination device; (b) receiving, by means of a measuring device, measurement light which is illumination light at least partially backscattered by at least one object contained in the scene; (c) measuring distances between the sensor system and the at least one object based on a light transit time of the illumination light and the measurement light; and (d) determining the three-dimensional characteristic of the scene from the measured distances. The illumination device is configured such that an illumination intensity of the illumination light depends on the solid angle of the beam path of the illumination light, so that a distance-based intensity loss of the illumination light and the measurement light is at least partially compensated.
  • The described method is also based on the insight that the intensity of the measurement light received from different, and preferably all, subregions of the scene can be made at least approximately equal by a suitable solid-angle-dependent illumination.
  • The spatial characteristic of the illumination light can be adjusted so that the intensity is always just as high as required for a reliable detection of the subarea of the scene assigned to the respective solid angle region. As a result, only as much energy as required is needed for the lighting, and an energy-efficient three-dimensional scene detection is made possible.
  • According to a further embodiment, the method further comprises (a) detecting an object in the scene and (b) identifying the detected object as an object approved for a particular action.
  • The approved action may, for example, be an authorized passage through an opening in a building, which opening is closed by a closure body before the object is identified as an approved object and is opened, only after a successful identification, by a corresponding movement of the closure body.
  • The objects to be identified may preferably be persons and/or vehicles. A successful identification may serve to control or activate a closure mechanism for a closure body, for example before opening an entrance of a building.
  • According to a further aspect, the described sensor system is used for controlling the coverage characteristic of an opening, to be passed by an object, by means of at least one closure body; the coverage characteristic is controlled, or at least co-controlled, by the described sensor system. With such a sensor system, larger distances can be monitored in an energy-efficient manner, which naturally leads to an earlier detection of an opening request for the closure body and can be a great advantage especially for fast-moving objects. Furthermore, the scene can be detected with a wider detection angle, which, for example, leads to an early detection of objects moving transversely toward the opening.
  • the opening is an entrance or an exit, in particular an emergency exit in a building.
  • the object is a person or a vehicle.
  • the building may in particular be a house or a garage.
  • According to a further aspect, a sensor system described above is used for detecting and/or controlling traffic flows of objects moving through a scene of the sensor system, the scene being determined by a spatial detection range of the sensor system.
  • This described use is based on the finding that traffic detection and/or traffic flow control depend on an energy-efficient sensor technology, since such sensors are typically constantly in operation and, moreover, a very large number of such sensor systems are typically in use, especially in larger traffic flows.
  • The objects relevant for the traffic flow can be, for example, people, vehicles, or products such as packages, suitcases, etc. Since a plurality or even a multiplicity of 3D sensors are usually used for such applications, energy savings have a particularly positive effect here.
  • TOF-based sensor systems can be subdivided into two fundamentally different classes both with regard to the illumination and with regard to the measurement; the respective alternatives can be combined as desired.
  • B1: The first alternative (B1) for the lighting is characterized by the fact that the scene is scanned sequentially by means of a single illumination light beam with high focusing and low divergence. For each position of the illumination light beam in the scene, a measurement of the transit time of the illumination light and the measurement light is carried out.
  • The scanning can be realized using movable optical components, in particular mirrors. Alternatively or in combination, a solid-state body which manages without mechanically moving parts and has integrated photonic structures or circuits can be used for the sequential scanning of the scene with the illumination light beam. With a suitable control of these structures, the illumination light beam is then directed to the desired location of the scene.
  • Such a solid-state body is known, for example, from US 2015/293224 A1.
  • B2: The second alternative (B2) for the lighting is characterized by the fact that the entire scene is illuminated all at once and areally. If necessary, the intensity of the illumination light can be (selectively) increased in selected subregions of the scene in order to enable an improved 3D object detection at these locations. Such a spatially uneven distribution of the intensity of the illumination light can be achieved without moving optical components, for example by means of a DOE.
  • DOE Diffractive optical element
  • M1: A first alternative (M1) for the measurement is based on pulsed illumination light. The "travel time" of a light pulse is determined on the receiver side for each pixel within a time window, and the distance is derived from it.
  • M2: The second alternative (M2) for the measurement is based on a temporal, preferably sinusoidal, modulation of the illumination light with a predetermined frequency, with appropriate values for this frequency depending on the expected transit time or the maximum detection distance. The phase difference between the emitted illumination light and the received measurement light is measured for each pixel, and the distance information is derived from it. Both relations are spelled out in the sketch below.
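A minimal sketch of the distance relations underlying M1 and M2 (not from the patent; the numeric values and function names are illustrative only):

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def distance_from_pulse(round_trip_time_s: float) -> float:
    """M1: pulsed operation; the light covers the distance twice."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

def distance_from_phase(phase_rad: float, mod_freq_hz: float) -> float:
    """M2: continuous-wave operation; distance from the measured phase shift.
    Unambiguous only up to c / (2 * f), the so-called ambiguity range."""
    return SPEED_OF_LIGHT * phase_rad / (4.0 * math.pi * mod_freq_hz)

print(distance_from_pulse(66.7e-9))            # ~10.0 m
print(distance_from_phase(math.pi / 2, 20e6))  # ~1.87 m at 20 MHz modulation
```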
  • Both measuring principles, M1 and M2, are based on an integration of the number of photons, or of the photoelectrons generated in the light receiver, which arrive at each pixel to be measured.
  • The ever-present light or photon noise depends on the number of photons accumulated in a pixel. The higher the number of accumulated photons, the more accurate the distance information obtained from the TOF measurement becomes.
  • Figure 1 shows the use of a sensor system for controlling a coverage characteristic of an opening depending on a scene monitored by the sensor system.
  • Figure 2 shows the use of a sensor system for detecting a traffic flow of objects.
  • Figures 3a and 3b illustrate an aggregation of single pixels of a light receiver.
  • Figures 4a to 4c show different beam cross sections of an illumination light.
  • The optical energy of the measurement light incident on the light receiver can be optimized by illuminating the scene differently (in intensity) depending on the characteristics of the scene.
  • In many applications, the scene to be captured is cuboid or cubic in nature.
  • Known TOF sensors, by contrast, typically have a detection range whose extent is at least approximately the same for all detected solid angles.
  • The TOF sensor system described in this document is now able to at least partially compensate for this "spherical shell boundary" of the detection range by selectively increasing or reducing the illumination intensity in selected subregions of the scene.
  • In this way, the adverse effect described by the above cos⁴ law can be counteracted and the energy used for the lighting optimally utilized.
  • Generally speaking, certain subregions of the scene, which depend on the spatial-geometric arrangement of the sensor system and the scene to be detected, can be "exposed" with an optimum illumination intensity.
  • A TOF sensor system is typically mounted above the average height of the objects to be observed (people, products, vehicles, etc.), so that unwanted object shadowing is less problematic when a variety of objects is present.
  • A previously known geometry of the scene to be captured can be used to further optimize the operation of the sensor system: a lighting dependent on the geometry of the scene can be realized by controlling the solid-angle-dependent illumination intensity.
  • Depending on the lighting principle, a different strategy can be used.
  • In an arrangement of several light elements, the brightness of a single element can be varied with respect to the other elements. This variation can be fixed constructively in the design (e.g. the arrangement of the laser or light-emitting diodes) or realized as a control via a variable current per laser or light-emitting diode by means of suitable electronics, for example set by measurement during the (initial) installation. Furthermore, a dynamic adjustment of the individual laser or light-emitting diodes during operation is possible. In this case, simply those laser or light-emitting diodes which are assigned to areas of the scene that provide little measurement light are energized correspondingly more strongly (see the control-loop sketch below). This is particularly well suited for the above lighting principle B2 in combination with the aforementioned measuring principle M1 or M2.
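A hypothetical control-loop sketch of this per-diode re-energizing (not from the patent; the responsivity values, gain and function names are assumptions). Each diode is assigned to one solid-angle zone, and zones returning too little measurement light get more drive current in the next frame:

```python
import numpy as np

def update_currents(currents, zone_signal, target, gain=0.5, i_max=1.0):
    """Proportional correction of the per-zone drive currents (arbitrary units)."""
    error = (target - zone_signal) / target
    return np.clip(currents * (1.0 + gain * error), 1e-3, i_max)

# Toy plant: received signal = responsivity * current, where the per-zone
# responsivity lumps together distance and scattering losses (made-up values).
responsivity = np.array([400.0, 250.0, 500.0, 125.0])
currents = np.full(4, 0.2)
for _ in range(20):
    zone_signal = responsivity * currents   # stand-in for an actual measurement
    currents = update_currents(currents, zone_signal, target=100.0)
print(np.round(responsivity * currents, 1))  # converges toward 100 in every zone
```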
  • In this way, the illumination intensity can be specifically controlled. For example, a static scene can be measured once, and the appropriate illumination light intensity for each solid angle can be taught in.
  • An embodiment of the invention achieves energy savings by means of a dynamic lighting energy optimization in which illumination energies of different wavelengths or frequencies are used. Depending on the color of the object, for example, the wavelengths that contribute to the most intense reflections or scattering can be given a higher intensity in the wavelength spectrum, while other wavelengths or wavelength regions with less reflection or scattering are reduced (see the sketch below).
  • For example, a red object can be illuminated primarily with a red light component, and the green and blue light components can be reduced (for the relevant solid angle), preferably to an at least approximately vanishing intensity.
  • the same principle can also be applied in the relationship between visible light and infrared (IR) light.
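A hypothetical sketch of such a spectral power allocation for one solid-angle zone (the reflectance values and the 0.1 cut-off are illustrative assumptions; the reflectances would come from a previous, spectrally varied capture):

```python
import numpy as np

reflectance = {"red": 0.70, "green": 0.05, "blue": 0.05}  # a mostly red object

weights = np.array(list(reflectance.values()))
weights = np.where(weights < 0.1, 0.0, weights)  # drop weakly reflected channels
weights = weights / weights.sum()                # renormalize the power budget

# Total power normalized to 1.0; here all of it goes into the red channel.
channel_power = {k: float(v) for k, v in zip(reflectance, weights)}
print(channel_power)  # {'red': 1.0, 'green': 0.0, 'blue': 0.0}
```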
  • The knowledge, gained with such a frequency- or wavelength-variable illumination, of the reflection and scattering properties together with the associated distance and solid angle can be of great advantage in a subsequent scene analysis of moving objects, because it allows objects to be detected and tracked more easily: the spectral signature provides an additional recognition feature.
  • The sensor system described in this document may be used, for example, in passageways, especially in passageways having a shutter characteristic that is automatically controlled (e.g. by means of doors, gates, barriers, traffic lights, etc.). Since the sensors for a passage control are usually supplied with energy by the existing closure systems, as much sensory effect as possible should be achieved with a given amount of energy.
  • Compared to known sensor systems, the sensor system described in this document allows (i) a data acquisition for longer distances (an earlier detection of an approaching object), (ii) an earlier detection of cross traffic, and/or (iii) a more reliable detection of objects in a security area of the closure system.
  • According to a further embodiment, a receiving chip whose pixels have different sensitivities is used for the light receiver. This can be achieved, e.g., by reducing the noise of individual pixels or zones of pixels. Since the noise is often correlated with the temperature of the sensor, a higher sensitivity can be achieved for a part of the receiving chip, e.g. by means of a heat pump (for example a Peltier element). The more locally this temperature change can be generated on the receiving chip, the higher the energy efficiency of the sensor system can be.
  • For the lighting principle B2 in particular, an (additional) spatial variation of the illumination light can be achieved by DOEs.
  • With DOEs in the context of laser systems, the maximum illumination energy can thus be utilized, because those portions of the illumination light beams which should strike the scene at a lower intensity are not simply blocked by a mask (and thereby lost) but are redirected by diffraction to other regions of the scene. In this way, pattern projections can also be generated, as are required for 3D sensors based on the known principle of structured illumination or so-called fringe projection.
  • The spatial variation of the illumination light described here can also be additionally optimized with lens systems, in particular freeform lenses.
  • The solid-angle-dependent illumination intensity described in this document provides a scene-dependent lighting, which results in a more uniform distribution of the intensity of the measurement light received from the different parts of the scene.
  • According to a further embodiment, the scene to be detected is first illuminated conventionally, in particular according to the lighting principle B2.
  • The sensor system can be operated at the lower limit of measurability as long as the scene is (still) static. However, if a change in the scene is (roughly) detected, or at least suspected, the system can react immediately with an increase of the illumination intensity, so that the scene or the scene changes can then be detected and evaluated with high accuracy.
  • This mechanism can be used both for IR sensor systems and for sensor systems with visible illumination light.
  • FIG. 1 shows the use of a sensor system 100 to control a coverage characteristic of an opening 184 depending on the characteristics of a scene 190 monitored by the sensor system 100.
  • The opening 184 is, for example, an entrance for persons into a building or a garage entrance for motor vehicles.
  • the corresponding input structure is provided with the reference numeral 180.
  • An object 195 in the scene is intended to symbolize such a person or a motor vehicle.
  • the input structure 180 comprises a stationary support structure 182, which constitutes a frame and a guide for two closing bodies 186 designed as sliding doors.
  • The sliding doors 186 can each be moved by means of a motor 187 along the directions indicated by the two thick double arrows.
  • the actuation of the motors 187 takes place, as explained below, by means of the sensor system 100 described in this document.
  • The sensor system 100 has a TOF measuring device 110, a data processing device 150 and a database 160.
  • The TOF measuring device 110 has a lighting device 130 and a light receiver 120. In the exemplary embodiment shown here, the TOF measuring device 110 comprises, or is associated with, (i) an illumination light control device 135 for controlling the operation of the illumination device 130, (ii) a measuring unit 125, connected downstream of the light receiver 120, for measuring the light transit time, and (iii) a light receiver control device 140 for controlling the operation of the light receiver 120.
  • According to an exemplary embodiment, the whole sensor system 100 (in contrast to the representation of FIG. 1) is constructed as a module which, in addition to the TOF measuring device 110, also contains the data processing device 150 and the database 160 within a compact design.
  • the light receiver control device 140 has an interface 142, via which an external control signal 142a can be received.
  • The external control signal may originate from an affiliated system (not shown), such as a monitoring system, which controls the operation of the sensor system 100 depending on external parameters.
  • An external parameter may be, for example, a previously known optical property of the object 195. A scene-dependent grouping of pixels of the light receiver 120, described below with reference to Figures 3a and 3b, may also be initiated by the control signal 142a.
  • Signaling data transmitted via the interface 142 may also include information about the detected and evaluated scene 190. Such information may be, for example, the information that an escape route is blocked, that a license plate flagged for search has been recognized, that a parking space has been illegally occupied, that a suspicious object is in the monitored scene 190, etc. In this case, a corresponding information flow takes place from the sensor system 100, or more precisely from the light receiver control device 140, to the affiliated system. Alternatively or in combination, such signaling data 152a may also be output from the data processing device 150 via an interface 152.
  • An external control signal 152a transferred to the data processing device 150 via the interface 152 can also be used to make the operation of the data processing device 150 at least partially dependent on external information.
  • In this way, "a priori knowledge" about an object 195 can be transmitted for an improved evaluation of the detected scene 190 and in particular for an improved object recognition.
  • The illumination device 130, which may be, for example, an array of individually controllable laser or light-emitting diodes, illuminates the scene 190, and thus also the object 195 located in the scene 190, with pulsed and thus temporally modulated illumination light 131.
  • The illumination light control device 135 is configured to control the illumination device 130 such that a characteristic of the illumination light 131, which describes the dependence of the illumination intensity on the solid angle (at which the illumination light 131 strikes the scene 190), can be changed dynamically during an operation of the sensor system 100.
  • The illumination light control device 135 may also cause the characteristic of the illumination light 131 to at least approximately compensate for the natural edge light falloff, according to which the brightness in an image of a uniformly bright subject, imaged through an objective (not shown for clarity), decreases by the factor cos⁴ with respect to the brightness in the center of the image picked up by the light receiver.
  • the space angle-dependent intensity distribution of the illumination light 131 is illustrated in FIG. 1 by arrows of different widths and dashed lines.
  • Solid angles of the scene 190 which are associated with a larger measuring distance are illuminated more strongly than solid angles which are associated with a smaller measuring distance.
  • The illumination light control device 135 may also adapt the characteristic of the illumination light 131 depending on an external control signal 137a. The control signal 137a may be indicative of the solid-angle-dependent intensity distribution or of further properties of the illumination light 131, for example its (a) wavelength, (b) spectral intensity distribution, (c) polarization direction, and (d) intensity distribution for different polarization directions. These further properties can be selected such that they contribute to the most reliable and accurate object recognition possible. Again, "a priori knowledge" about optical properties of the object 195 can be taken into account.
  • the light receiver 120 of the TOF measuring device 110 receives illuminating light 131 backscattered from the object 195. This backscattered light is referred to as measuring light 196 in this document.
  • The three-dimensional detection of the scene 190 takes place on the basis of the principles, explained in detail above, of a transit time measurement, which is also referred to as a time-of-flight (TOF) measurement.
  • TOF Time Of Flight
  • The corresponding TOF data are transferred to the data processing device 150. This can be done directly or indirectly via the light receiver control device 140. It should be noted that the illumination device 130 can also have, in addition to the illumination units shown in Figure 1, further illumination units that illuminate the scene 190 from a different angle. Likewise, the two illumination units can also be arranged outside the housing of the TOF measuring device 110 and can thus be spaced further from the light receiver 120. With regard to the principles of the transit time measurement, reference is made to the explanations given above.
  • The optically detected scene 190 is evaluated by means of suitable methods of image evaluation. For this purpose, a plurality of images of the scene 190, taken under different illumination conditions or with different illumination characteristics, may be used together.
  • 3D images of the scene 190 can be recorded at certain time intervals.
  • Based on corresponding position shifts of the object 195, the data processing device 150 is able to determine its speed not only as an absolute value but also as a motion vector (with a direction of movement). In this way, the sliding doors 186 are opened only when the object 195 actually moves in the direction of the opening 184.
  • If the object 195 is a vehicle of so-called cross traffic, which merely passes in front of the opening 184, the corresponding motion vector is detected by the data processing device 150 and there is no opening of the sliding doors 186. After the object 195 has passed through the opening 184, the opening can be closed again quickly.
  • The sensor system 100 is also capable of performing an object recognition. For this purpose, the data processing device 150 accesses a data set of reference objects, stored in the database 160, corresponding to selected objects that are authorized to pass through the opening 184. This means that, upon a suitable approach of the object 195 to the opening 184, the sliding doors 186 are opened only when the detected object 195 at least approximately coincides with one of the stored reference objects.
  • This clearly shows that, in the use of the sensor system 100 described here, the coverage characteristic of the opening 184 depends not only on the motion profile of the object 195, but that an object-based access control also takes place.
  • FIG. 2 shows a further use of the sensor system described in this document.
  • Here, the TOF measuring device 110 detects a traffic flow of (various) objects 295a, 295b, 295c, 295d and 295e, which are located on a conveyor belt 298 and move through a scene 290 along the direction of movement represented by an arrow.
  • A reliable knowledge of the number and/or the type of the objects 295a to 295e can be used in the field of logistics for a traffic flow control. Just one example of such a control of a traffic flow is the control of luggage transport in an airport.
  • Labels on the relevant objects 295a to 295e can also be used to determine the type of the respective object. It should be noted, however, that the use in an airport is merely one example of a variety of other uses in the field of logistics.
  • FIGS. 3a and 3b illustrate an aggregation of individual pixels of a light receiver designed as a semiconductor or CCD chip.
  • The light receiver 320a shown in FIG. 3a has a plurality of light-sensitive pixels 322a.
  • The pixels 322a correspond to the full spatial resolution of the light receiver 320a, which resolution is predetermined by the semiconductor architecture of the chip 320a.
  • In the light receiver 320b shown in FIG. 3b, four of the light-sensitive pixels (of the full resolution) are in each case combined into one higher-level pixel 322b (with a correspondingly reduced spatial resolution).
  • A pixel 322b thus collects a four-fold amount of light compared to a single pixel 322a. Such a "binning" reduces the required intensity of the illumination light and thus reduces the energy consumption of the sensor system.
  • the described "binning" can also be realized dynamically by a corresponding control of one and the same light receiver 320a or 320b.
  • The light receiver is then operated either in a first operating mode (with full resolution) or in a second operating mode (with combined pixels).
  • Switching between the different operating modes may be controlled by external control signals (see reference numeral 142a in Figure 1). Alternatively or in combination, such switching may also depend on the result of a scene evaluation, so that the "binning" operating mode for a next scene capture is regulated.
  • Of course, more than two operating modes, each with a differently strong aggregation of pixels, can also be used. Furthermore, it is possible to combine a different number of individual pixels into a higher-level pixel in different subregions of the light receiver. Individual portions of the scene can then be captured with a higher spatial resolution (and less photon accumulation) and other portions of the scene with a lower spatial resolution (and a higher photon accumulation).
  • Different levels of pixel aggregation can be applied dynamically or adaptively in exactly those subregions in which a particular object is currently located.
  • Figures 4a to 4c show different beam cross sections of an illumination light.
  • A first illumination light 431a, illustrated in FIG. 4a, has a substantially circular beam cross section and is preferably suitable for "round scenes". However, for most applications, which do not capture (and evaluate) a "round scene", a beam cross section deviating from a circular shape is appropriate.
  • FIG. 4b shows an illumination light 431b with an elliptical beam cross section.
  • FIG. 4c shows an illumination light 431c with a rectangular beam cross section.
  • The beam cross section can be suitably adapted to the scene to be detected by optical components such as mirrors and refractive optical elements (e.g. lens systems).
  • Diffractive optical elements (DOEs) can also be used, which optionally even allow a dynamic and/or scene-dependent shaping of the beam cross section.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a sensor system (100) and a method for the three-dimensional acquisition of a scene (190). The sensor system (100) comprises: (a) an illumination device (130) for illuminating the scene (190) with illumination light (131); (b) a measuring device (110) for receiving measurement light (196), which is illumination light (131) at least partially backscattered by at least one object (195) located in the scene, and for measuring distances between the sensor system (100) and the at least one object (195) based on the transit time of the illumination light (131) and the measurement light (196); and (c) a data processing device (150), connected downstream of the measuring device (110), for determining the three-dimensional characteristic of the scene (190) from the measured distances. The illumination device (130) is configured such that an illumination intensity of the illumination light (131) depends on the solid angle of the beam path of the illumination light (131), so that a distance-based intensity loss of the illumination light (131) and the measurement light (196) is at least partially compensated. The invention further relates to various uses of said sensor system (100).
EP18826220.8A 2017-12-12 2018-12-11 Système de détection 3d muni d'un éclairage de scène fonction d'un angle solide Withdrawn EP3724677A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102017129639.5A DE102017129639A1 (de) 2017-12-12 2017-12-12 3D Sensorsystem mit einer von einem Raumwinkel abhängigen Szenenbeleuchtung
PCT/EP2018/084422 WO2019115559A1 (fr) 2017-12-12 2018-12-11 Système de détection 3d muni d'un éclairage de scène fonction d'un angle solide

Publications (1)

Publication Number Publication Date
EP3724677A1 true EP3724677A1 (fr) 2020-10-21

Family

ID=64899260

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18826220.8A Withdrawn EP3724677A1 (fr) 2017-12-12 2018-12-11 Système de détection 3d muni d'un éclairage de scène fonction d'un angle solide

Country Status (3)

Country Link
EP (1) EP3724677A1 (fr)
DE (1) DE102017129639A1 (fr)
WO (1) WO2019115559A1 (fr)

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10351714B4 (de) * 2003-11-05 2005-09-29 Eads Astrium Gmbh Vorrichtung zur optischen Erfassung eines entfernten Objekts
JP2008286565A (ja) * 2007-05-16 2008-11-27 Omron Corp 物体検知装置
EP2558977B1 (fr) * 2010-04-15 2015-04-22 IEE International Electronics & Engineering S.A. Dispositif configurable de détection du contrôle d'accès
DE102010033818A1 (de) * 2010-08-09 2012-02-09 Dorma Gmbh + Co. Kg Sensor
EP2453252B1 (fr) 2010-11-15 2015-06-10 Cedes AG Capteur 3D à économie d'énergie
EP2469301A1 (fr) * 2010-12-23 2012-06-27 André Borowski Procédés et dispositifs pour générer une représentation d'une scène 3D à très haute vitesse
DE202012010014U1 (de) * 2012-10-19 2014-01-20 Sick Ag Laserscanner
US10132928B2 (en) 2013-05-09 2018-11-20 Quanergy Systems, Inc. Solid state optical phased array lidar and method of using same
US9635231B2 (en) * 2014-12-22 2017-04-25 Google Inc. Time-of-flight camera system and method to improve measurement quality of weak field-of-view signal regions
DE102015115101A1 (de) * 2015-09-08 2017-03-09 Huf Hülsbeck & Fürst Gmbh & Co. Kg Sensorsystem einer Sensoreinrichtung eines Kraftfahrzeugs
DE102015220798A1 (de) * 2015-10-23 2017-04-27 Designa Verkehrsleittechnik Gmbh Zugangskontrollsystem für einen Lagerbereich sowie Verfahren zur Zugangskontrolle
TWI570387B (zh) * 2015-11-09 2017-02-11 財團法人工業技術研究院 影像測距系統、光源模組及影像感測模組
DE102015226771A1 (de) * 2015-12-29 2017-06-29 Robert Bosch Gmbh Umlenkeinrichtung für einen Lidarsensor
DE102016202181A1 (de) * 2016-02-12 2017-08-17 pmdtechnologies ag Beleuchtung für eine 3D-Kamera
US10782393B2 (en) * 2016-02-18 2020-09-22 Aeye, Inc. Ladar receiver range measurement using distinct optical path for reference light
US11237251B2 (en) * 2016-05-11 2022-02-01 Texas Instruments Incorporated Lidar scanning with expanded scan angle
DE102016122712B3 (de) * 2016-11-24 2017-11-23 Sick Ag Optoelektronischer Sensor und Verfahren zur Erfassung von Objektinformationen

Also Published As

Publication number Publication date
WO2019115559A1 (fr) 2019-06-20
DE102017129639A1 (de) 2019-06-13

Similar Documents

Publication Publication Date Title
DE112017001112T5 (de) Verfahren und Vorrichtung für eine aktiv gepulste 4D-Kamera zur Bildaufnahme und -analyse
DE102018105301B4 (de) Kamera und Verfahren zur Erfassung von Bilddaten
EP1027235B1 (fr) Dispositif et procede pour la detection d'objets se trouvant sur un pare-brise
EP3014569B1 (fr) Inspection de la surface profilée du dessous de caisse d'un véhicule à moteur
EP1159636A1 (fr) Systeme de mesure de distance a resolution locale
EP2314427B1 (fr) Procédé et dispositif de commande pour la commande d'appareils électriques par détection de mouvements
DE102013108824A1 (de) Sensoranordnung zur Erfassung von Bediengesten an Fahrzeugen
WO2010076066A1 (fr) Système de caméra pour l'acquisition de l'état d'une vitre d'un véhicule
EP3526963A1 (fr) Surveillance d'objet par enregistrement d'image infrarouge et éclairage par impulsion infrarouge
WO2020229186A1 (fr) Système de détection 3d apte à fonctionner dans différents modes de fonctionnement en fonction d'un état de fonctionnement d'un corps de fermeture
DE102019003049B4 (de) Vorrichtung und Verfahren zum Erfassen von Objekten
DE102013100521A1 (de) Sensoranordnung zur Erfassung von Bediengesten an Fahrzeugen
EP2946226A1 (fr) Ensemble de détection universel conçu pour détecter des gestes de commande dans des véhicules
DE102007022523A1 (de) Kraftfahrzeug
WO2017041915A1 (fr) Systeme de capteur d'un dispositif de détection d'un véhicule automobile
EP3696537B1 (fr) Dispositif et procédé de détection des dommages sur un véhicule en déplacement
EP3724676A1 (fr) Système de capteur pour la capture tridimensionnelle d'une scène au moyen de plusieurs accumulations de photons
EP2562685B1 (fr) Procédé et dispositif de classification d'un objet lumineux situé à l'avant d'un véhicule
EP3724674A1 (fr) Système de capteur 3d avec optique de forme libre
EP3724601B1 (fr) Détermination de la distance basée sur différentes profondeurs de champ avec différents réglages de mise au point d'une lentille d'objectif
EP3724677A1 (fr) Système de détection 3d muni d'un éclairage de scène fonction d'un angle solide
WO2023247302A1 (fr) Procédé de détermination d'au moins une fonction de correction pour un système lidar, système lidar, véhicule comprenant au moins un système lidar et système de mesure
DE102017129654A1 (de) 3D Szenenerfassung basierend auf Lichtlaufzeiten und einem charakteristischen Merkmal von erfassten elektromagnetischen Wellen
DE102006010990B4 (de) Sicherheitssystem
WO2020229190A1 (fr) Identification d'un objet sur la base d'une reconnaissance d'une partie de l'objet et de données de description d'un objet de référence

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200604

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIN1 Information on inventor provided before grant (corrected)

Inventor name: WYSS, BEAT

Inventor name: HUNZIKER, URS

Inventor name: ECKSTEIN, JOHANNES

Inventor name: SEILER, CHRISTIAN

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210629

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20211110