EP3724676A1 - Sensor system for the three-dimensional capture of a scene by means of multiple photon accumulations

Sensor system for the three-dimensional capture of a scene by means of multiple photon accumulations

Info

Publication number
EP3724676A1
Authority
EP
European Patent Office
Prior art keywords
scene
sensor
sensor system
light
illumination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP18825924.6A
Other languages
German (de)
English (en)
Inventor
Urs Hunziker
Johannes Eckstein
Christian Seiler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bircher Reglomat AG
Original Assignee
Bircher Reglomat AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bircher Reglomat AG filed Critical Bircher Reglomat AG
Publication of EP3724676A1
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/4802 Details of systems according to group G01S17/00 using analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S7/4811 Constructional features, e.g. arrangements of optical elements, common to transmitter and receiver
    • G01S7/4813 Housing arrangements
    • G01S7/4814 Constructional features of transmitters alone
    • G01S7/4815 Constructional features of transmitters alone using multiple transmitters
    • G01S7/4861 Circuits for detection, sampling, integration or read-out (pulse systems)
    • G01S7/4863 Detector arrays, e.g. charge-transfer gates
    • G01S7/4868 Controlling received signal intensity or exposure of sensor
    • G01S7/4913 Circuits for detection, sampling, integration or read-out (non-pulse systems)
    • G01S7/4914 Circuits for detection, sampling, integration or read-out of detector arrays, e.g. charge-transfer gates
    • G01S7/4918 Controlling received signal intensity, gain or exposure of sensor
    • G01S7/499 Details using polarisation effects
    • G01S17/08 Systems determining position data of a target, for measuring distance only
    • G01S17/10 ... using transmission of interrupted, pulse-modulated waves
    • G01S17/32 ... using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • G01S17/88 Lidar systems specially adapted for specific applications

Definitions

  • The present invention relates to a sensor system and a method for the three-dimensional acquisition of a scene based on transit-time measurements. Furthermore, the present invention relates to several uses of such a sensor system.
  • For such openings, actuator-operated closing bodies are used, which facilitate the handling of the respective closing body for operators or which are operated fully automatically without any action, for example when an object to be passed through the opening enters the region of the opening.
  • Such an opening may, for example, be a passageway in a building.
  • A closing body may, for example, be a door or a gate.
  • Such a sensor system can cause an automatic opening of the closing body or of the opening.
  • Such a sensor system, or a data processing device connected downstream of it, uses known methods of image processing, for example a face recognition.
  • A 3D sensor system for the field of application of the monitoring of automatically opening doors and/or gates is based on the principle of transit-time measurement of light beams, which are emitted by illumination sources and, after an at least partial reflection or 180° backscattering, are detected by a light receiver.
  • Such sensor systems are commonly referred to as "time-of-flight" (TOF) sensor systems.
  • TOF sensor systems have the disadvantage that, with increasing distance d to the object, the intensity of the (backscattered) measurement light to be detected by the light receiver of the TOF sensor is weakened in two respects. First, in the case of a point-like illumination light source without special focusing, the intensity of the emitted illumination light is attenuated with 1/d², where d is the distance to the illumination light source. Second, the illuminated object, which scatters the illumination light isotropically, itself acts as a point light source, so that overall the intensity of the received measurement light scales with 1/d⁴. Any kind of beam shaping, for example a focusing of the illumination light or of the measurement light, and/or a non-isotropic scattering of the illumination light with a preferred beam direction, modifies this scaling.
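  • Expressed in formula form (a compact restatement of the attenuation just described, using standard inverse-square reasoning; not additional disclosure from the application):

```latex
% Illumination from a point-like source falls off with the square of the distance d:
I_{\mathrm{illum}}(d) \propto \frac{1}{d^{2}}
% The isotropically scattering object acts as a second point source, so the
% measurement light arriving back at the light receiver scales as:
I_{\mathrm{meas}}(d) \propto \frac{1}{d^{2}} \cdot \frac{1}{d^{2}} = \frac{1}{d^{4}}
```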
  • The described sensor system includes (a) a sensor for measuring distances based on a light transit time; (b) a control unit for controlling the operation of the sensor such that (b1) upon a first acquisition of the scene a first number of photons is accumulated per pixel of the sensor and (b2) upon a second acquisition of the scene a second number of photons is accumulated per pixel of the sensor, the second number being greater than the first number; and (c) a data processing device downstream of the sensor, configured to evaluate the scene based on a first result of the first acquisition of the scene and/or a second result of the second acquisition of the scene.
  • The described sensor system is based on the insight that, for an energy-efficient three-dimensional detection and evaluation of a scene, two different (image) data sets can be made available to a data processing device, wherein, at least in one subregion of the light receiver of the sensor, a different number of photons was accumulated or collected per pixel. Since the relative pixel error caused by the ever-present statistical (photon) noise is smaller for a larger number of photons accumulated per pixel, the statistical accuracy of the scene detection increases with the number of photons accumulated per pixel.
  • Depending on a respectively detected characteristic of the scene, the data processing device can decide whether the first result or the second result is used for a (sufficiently accurate) scene evaluation. A combination of the two results, aiming at the most accurate possible scene evaluation, is also possible.
  • the respectively required accuracy of the scene evaluation may depend on the current and / or on an earlier determined optical characteristic of the scene.
  • If an evaluation of the first result of the first acquisition of the scene does not have sufficient accuracy for at least a partial area of the scene, the control unit can be prompted to trigger the second acquisition of the scene with the second number of photons accumulated per pixel.
  • the "different number of photons accumulated per pixel” may refer to a fixed predetermined period of time, which is determined, for example, by the repetition rate of a repetitive reading of the pixels. This may in particular mean that the time voltage is determined by the time difference between two consecutive pixel readings. In this case, the absolute number of accumulated photons available for one pixel of an image or result is different for both acquisitions. The image with the higher number of photons accumulated per pixel is then the more accurate image, which in particular has a lower (photon) noise.
  • the periods of time during which the respective photons are accumulated may be different for the two acquisitions.
  • this "difference" may be chosen such that the absolute number of photons accumulated per pixel is the same in both acquisitions.
  • the image associated with the shorter amount of time may be used for scene detection with a higher temporal resolution, which may be particularly advantageous in fast moving objects.
  • The control unit may provide that, of a total number of first acquisitions and second acquisitions, the type of acquisition that requires more energy accounts for only a certain percentage. The energetically more demanding acquisitions can then be reserved for a more accurate scene capture; a sketch of such a duty-cycle cap follows below.
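  • A minimal sketch of such a duty-cycle cap (hypothetical names; the application does not prescribe an implementation):

```python
def schedule_acquisitions(n_total: int, max_second_share: float = 0.2):
    """Interleave cheap 'first' and energy-intensive 'second' acquisitions
    so that the running share of 'second' never exceeds the given cap."""
    seconds_done = 0
    for i in range(1, n_total + 1):
        if (seconds_done + 1) / i <= max_second_share:
            seconds_done += 1
            yield "second"   # high photon accumulation, more accurate
        else:
            yield "first"    # low photon accumulation, energy-saving

# With a 20 % cap, every fifth acquisition is a 'second' one:
print(list(schedule_acquisitions(10)))
```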
  • the term "scene” may in particular be understood to mean that spatial area which is optically detected by the sensor system. Objects in the scene are recognized by a suitable image analysis.
  • the data processing device can make use of known methods for image evaluation and / or image analysis.
  • The data processing device may accordingly be an image-processing processor configured to apply known methods of image evaluation and/or image processing.
  • The characteristic of a scene can be understood as the entirety of all spatial structures detected by the sensor. By means of image processing and/or image recognition, the data processing device can recognize some structures as relevant and other structures as less relevant or even irrelevant.
  • the term "result" may in particular be an analysis of an image captured by the sensor or of a part of an image.
  • the result may also be at least one piece of information about an object in the scene.
  • Such information may be, for example, the type of the object or its (optically detectable) properties. The "measurement light" is the portion of the illumination light scattered back from the object which is received by the light receiver of the sensor and used, together with the corresponding TOF distance information, for the three-dimensional evaluation of the scene.
  • The terms "optical" and/or "light" may refer to electromagnetic waves having a particular wavelength or frequency or a particular spectrum of wavelengths or frequencies.
  • the electromagnetic waves used can be assigned to the spectral range visible to the human eye.
  • electromagnetic waves associated with the ultraviolet (UV) or infrared (IR) spectral regions may also be used.
  • The IR spectral range can extend into the long-wave IR range with wavelengths between 3.5 µm and 15 µm, which can be detected by the light receiver of the sensor.
  • The sensor system further has a holder mechanically coupled to the sensor, the holder being designed such that the sensor system can be attached to a holding structure that is stationary with respect to the scene to be detected.
  • the holder ensures that the described sensor system can be a stationary system which has a certain spatially fixed detection range and thus always monitors the same scene.
  • Spatially stationary objects present in the scene can be detected in an image analysis and masked out in a further image analysis concerned with movement profiles.
  • computing power can be saved and the energy efficiency of the described sensor system can be improved.
  • The stationary support structure may be directly or indirectly mechanically coupled to a device for controlling the coverage characteristic of an opening to be passed through by the object, by means of at least one closure body. In addition to a suitable guide or mounting for the closure body, this device may comprise an actuator, for example a motor.
  • The opening may be an entrance, for example for a person or a vehicle.
  • The closure body may be a door, for example a front door or a garage door.
  • The holding structure may be, for example, the stationary frame structure of an entrance, such as the frame of a door.
  • The data processing device is further configured such that the covering characteristic of an opening to be passed by an object, effected by at least one closing body, is controllable.
  • The opening may, for example, be an entrance (or an exit) of a building. The closing body can be moved automatically between an open position and a closed position.
  • The control unit has an interface and is further configured such that the accumulation of the first number of photons and/or the accumulation of the second number of photons can be controlled by an external control signal.
  • the described interface allows external control of the operation of the sensor device.
  • the accuracy of at least one of the two scene captures can be adapted to the respective requirements.
  • For example, an external trigger signal can simply switch between the two types of acquisition with their different photon accumulations.
  • The external control signal may be indicative of the state of a system attached to the sensor system, for example a monitoring system or an automatic door opening system. Further, another sensor may, for example, detect that an object is in motion, which can be reliably detected even with "few accumulated photons". In this case it is sufficient if the energy-intensive, more complex second acquisition of the scene is carried out less frequently than the less energy-intensive first acquisition. Possibly, after a one-time scene capture of the second type, the scene can subsequently be captured only with acquisitions of the first type.
  • the interface is connected to a manually operated button for activating an automatic door.
  • The control signal transmitted via the interface may then cause the detection of the scene in a spatial region located near the relevant door opening to be performed with increased accuracy.
  • Such a different accuracy for different subregions of the scene can, for example, be achieved by a targeted combination of pixels by means of so-called binning, which is explained in more detail below.
  • The first acquisition of the scene takes place with a first exposure time and the second acquisition with a second exposure time, which is longer than the first exposure time.
  • From the acquisitions, configuration and/or measurement parameters of the sensor system can be determined. Measurement parameters are, for example, (a) the (expected) distance to an object to be detected, (b) the energy available for the operation of the sensor system, in particular for the (pulsed or temporally modulated) illumination of the scene required for a TOF measurement, and (c) the optical reflection and/or scattering behavior of the surface of an object to be detected.
  • The data processing device is coupled to the control unit and is further configured to control, via the control unit and depending on the first result and/or the second result, the photon accumulation of subsequent scene acquisitions.
  • the (different) photon accumulation can be adapted dynamically to the characteristic of the three-dimensional scene to be detected.
  • The characteristic of the scene depends on the actual (and not merely the expected) optical properties of at least one object of the scene. The scene-evaluation-dependent control of the photon accumulation therefore provides a feedback mechanism for the sensor system.
  • A suitable photon accumulation can also be found by a learning process in which one and the same scene is recorded and evaluated multiple times, or in which similar scenes are recorded and evaluated. It is further pointed out that the repetition rate or frequency with which the scene is subsequently detected and evaluated with first acquisitions and/or second acquisitions can likewise depend on the first result and/or the second result. For example, scene capture and scene evaluation can be performed at a comparatively high repetition rate. The high repetition rate may be based on first scene acquisitions with the first photon accumulation and/or on second scene acquisitions with the second photon accumulation.
  • The control unit and the sensor are configured such that the second number of accumulated photons, which is higher than the first number, is realized by combining adjacent individual pixels into one pixel.
  • Summing up pixels, also known as binning, has the advantage that it can be activated in an uncomplicated and also fast or dynamic manner. Binning can also be performed locally, in only at least a portion of the active surface of the light receiver. Although this leads to an inhomogeneous spatial resolution, which is not necessarily desired, this disadvantage is overcompensated in many applications by the advantages of the different photon accumulation described above.
  • a local "binning" can, at least in some known sensors without special electronic or apparatus elements simply by a
  • control determines the "binning" and thus the operating mode of the sensor.
  • The described binning can also be activated (automatically) in response to at least one previously detected and evaluated scene characteristic; a sketch of such a pixel combination follows below.
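  • A minimal numerical sketch of such a pixel combination (2×2 binning with NumPy; illustrative only, not the circuitry of the application):

```python
import numpy as np

def bin_pixels(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Sum adjacent factor x factor pixel blocks into super-pixels.

    Each super-pixel accumulates factor**2 times as many photons as a
    single pixel, at the cost of a correspondingly lower resolution.
    """
    h, w = frame.shape
    assert h % factor == 0 and w % factor == 0, "frame must tile evenly"
    return frame.reshape(h // factor, factor, w // factor, factor).sum(axis=(1, 3))

frame = np.random.poisson(lam=50, size=(480, 640))   # photon counts per pixel
binned = bin_pixels(frame)   # shape (240, 320), ~200 photons per super-pixel
```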
  • The sensor has an illumination device for emitting illumination light onto an object to be detected in the scene, and a light receiver for receiving measurement light, i.e. illumination light that has been at least partially scattered on an object of the scene and impinges on the light receiver.
  • The sensor is configured to measure the light transit time based on (a) a measurement of the time between emitting a pulse of the illumination light and receiving the measurement light pulse associated with that pulse, and/or (b) a measurement of the phase shift between a temporal modulation of the illumination light and an associated modulation of the received measurement light.
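  • The two measurement principles reduce to simple distance formulas (standard TOF relations, shown here as an illustrative sketch; the application itself gives no formulas):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_pulse(round_trip_time_s: float) -> float:
    """Pulse mode: the light pulse travels to the object and back."""
    return C * round_trip_time_s / 2.0

def distance_from_phase(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Phase mode: the transit time is encoded in the phase shift of the
    temporal modulation of the illumination light."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

print(distance_from_pulse(20e-9))              # ~3.0 m
print(distance_from_phase(math.pi / 2, 20e6))  # ~1.9 m
```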
  • the described sensor system can be realized in a suitable manner depending on the particular application.
  • The sensor is configured such that it is possible to switch flexibly, or as needed, between the two different measurement principles "pulse mode" and "phase measurement".
  • The lighting device is configured to provide the illumination light with a beam cross-section deviating from a circular shape. This has the advantage that, in scenes which are "not round", insufficient illumination of corner areas of the scene can be avoided. Although insufficient illumination of the corner regions could possibly be prevented by an overall increased intensity of the illumination light, the middle regions of the scene would then be overexposed, which would be very disadvantageous, at least from an energetic point of view.
  • the illumination light has a rectangular beam cross section.
  • The beam cross-section is adapted to the shape of the scene to be detected in order to achieve as homogeneous an illumination as possible.
  • A suitable shaping of the beam cross-section can be achieved not only by a corresponding shaping of the luminous area of the illumination device; the beam cross-section can also be suitably adapted by optical components such as mirrors and refractive optical elements (for example a lens system). Diffractive optical elements (DOEs) can also be used, which optionally even allow a dynamic and/or scene-dependent shaping of the beam cross-section.
  • The lighting device is configured to emit the illumination light with a first illumination intensity for the first acquisition of the scene and with a second illumination intensity for the second acquisition of the scene, the second intensity being greater than the first intensity.
  • the first illumination intensity and / or the second illumination intensity may be homogeneous or alternatively inhomogeneous over the entire scene.
  • With an inhomogeneous illumination intensity, preferably those portions of the scene are illuminated with a higher intensity which are particularly relevant for the object detection or in which relevant objects of the scene are located.
  • The characteristic of the illumination, in particular the illumination intensity, can be adjusted depending on the ambient light and/or the ambient conditions. For example, under intense solar irradiation, the sunlight has to be "outshone" by the illumination light.
  • An adaptive control or regulation of the operation of the illumination device with, if possible, a reduction of the intensity of the illumination light thus enables an energetically efficient operation of the sensor system.
  • Sunlight and/or external interference can significantly degrade the TOF measurement conditions. The described sensor system can then still work with a reduced range of functions, for example with a reduced frequency or repetition rate of the scene acquisitions.
  • The lighting device is (i) at least indirectly coupled to the data processing device and (ii) configured to control a lighting characteristic of the emitted illumination light in response to the first result and/or the second result.
  • At least one of the two results is a controlled variable for adjusting the characteristic of the lighting.
  • a dynamic adaptation of the illumination to a previously detected and expected scene characteristic of the scene can advantageously be carried out, and a scene-dependent, at least approximately optimal illumination can be realized.
  • The control of the lighting device can also depend on current ambient conditions. Ambient conditions may be weather conditions such as the presence of rain, snow, hail, fog, smoke, suspended particles, etc. in the scene.
  • The lighting device is configured to control the lighting characteristic of the emitted illumination light in response to an external control signal. The control signals can be received via a corresponding data input of the sensor system, in particular a data input of the control unit.
  • The control of the illumination device can be carried out by the control unit of the sensor system described above.
  • A control or adaptation of the lighting characteristic therefore need not depend (only) on the data generated in the context of the TOF measurement; a suitable adaptation of the illumination characteristic can also take place externally.
  • External control signals can be indicative of all features and/or states which have an influence on the backscattering behavior of the scene. Such a feature is, for example, a (color) change of an object to be detected and/or an object newly entering or leaving the scene.
  • The lighting characteristic is determined by at least one of the following features: (a) the illumination intensity for at least a subregion of the scene, (b) differences in illumination intensity between different subregions of the scene, further the wavelength, the spectral distribution and the polarization direction of the illumination light, and (f) the intensity distribution for different polarization directions of the illumination light.
  • the objects to be recognized can be illuminated particularly well and recognized as a result with particularly high accuracy and reliability.
  • a partial area can be assigned to exactly one pixel of the sensor.
  • The wavelength, frequency or color of the illumination light and the spectral distribution of the illumination light can be adjusted by a suitable control of spectrally different light-emitting elements, in particular LEDs of different colors.
  • the selected wavelength or spectral distribution of the illumination light may depend on the color and thus on the optical backscattering behavior of the object.
  • the polarization direction of the illumination light can be adjusted in a known manner, for example by the use of polarization filters.
  • All mentioned characteristic features can optionally be adapted dynamically to changing scenes, so that the best possible illumination of the scene can always be achieved. An optimal lighting characteristic can also be determined according to the principle of "trial and error" or by other statistical optimization procedures, as sketched below. This can be done dynamically during real operation of the sensor system or as part of a calibration by means of a detection of suitable reference objects.
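  • A minimal sketch of such a trial-and-error optimization (hypothetical quality callback `evaluate`; the application leaves the procedure open):

```python
import random

def optimize_illumination(evaluate, n_regions: int, iters: int = 200):
    """Random-search for per-subregion illumination intensities.

    `evaluate` is a hypothetical callback that captures the scene with the
    given intensity vector and returns a scene-evaluation quality score.
    """
    best = [1.0] * n_regions          # start from uniform illumination
    best_score = evaluate(best)
    for _ in range(iters):
        # Perturb the current best setting and keep only improvements.
        candidate = [max(0.0, v + random.gauss(0.0, 0.1)) for v in best]
        score = evaluate(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best

# Toy example: pretend quality peaks when every region gets intensity 2.0
best = optimize_illumination(lambda v: -sum((x - 2.0) ** 2 for x in v), n_regions=4)
print([round(x, 2) for x in best])
```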
  • The method includes (a) providing a sensor for measuring distances based on a light transit time; (b) controlling the operation of the sensor such that (b1) upon a first acquisition of the scene a first number of photons is accumulated per pixel of the sensor and (b2) upon a second acquisition of the scene a second number of photons is accumulated per pixel of the sensor, the second number being greater than the first number; and (c) evaluating the scene by a data processing device based on a first result of the first acquisition of the scene and/or a second result of the second acquisition of the scene.
  • The described method is based on the insight that a scene can be detected and evaluated three-dimensionally in an energy-efficient manner based on two different (image) data sets. To achieve a required degree of accuracy, it is not always necessary to carry out the scene detection and scene evaluation that is the more complex from an energy point of view.
  • The method further comprises (d) detecting an object in the scene; (e) comparing the detected object with at least one comparison object; and (f), if the detected object matches the comparison object, identifying the object as approved for a certain action.
  • the approved action may, for example, be an authorized passage through an opening in a building, which opening is closed by a closure body prior to identification as an approved object and is opened only after successful identification by a corresponding movement of the closure body.
  • The objects to be identified may preferably be persons and/or vehicles. A successful identification may be used to control, in particular to activate, a closure mechanism for a closure body of an opening of a building.
  • In some embodiments, a (very simple) object recognition merely determines whether a living object is a human, and not an animal such as a bird or a bat. An opening of the passage can then take place only if a human was recognized.
  • Based on the 3D TOF (depth) information of the sensor, a reliable face recognition can, for example, be realized for persons, which can be used to activate a closure mechanism.
  • Also described is a use of the sensor system described above for controlling the covering characteristic of an opening to be passed by an object, by means of at least one closing body, the covering characteristic being controlled, or at least co-controlled, by the described sensor system. Because such a sensor system can monitor the scene in an energetically efficient manner and over larger distances, an opening requirement of the closing body is naturally detected earlier, which can be of great advantage especially for fast-moving objects. Furthermore, the scene can be detected with a wider detection angle, which, for example, leads to an early detection of objects moving transversely to the opening.
  • the opening is an entrance or an exit, in particular an emergency exit in a building.
  • When monitoring an entrance or exit, in particular a blocked emergency exit can be detected, and the corresponding information can be transmitted to an affiliated system, for example a monitoring system.
  • the object is a person or a vehicle.
  • the building may in particular be a house or a garage.
  • Also described is a use of the sensor system described above for detecting and/or controlling traffic flows of objects moving through a scene of the sensor system, the scene being determined by a spatial detection range of the sensor system. This use is based on the finding that traffic detection and/or traffic flow control depends on energy-efficient sensor technology, since such sensors are typically in constant operation.
  • The objects relevant to the traffic flow in question can be, for example, persons, vehicles, or products such as packages, suitcases, etc. Since a plurality or even a multiplicity of 3D sensors is usually used for such applications, energy savings have a particularly positive effect here.
  • TOF-based sensor systems can be subdivided, both with regard to the illumination light and with regard to the measurement light, into two fundamentally different classes each, which can be combined as desired.
  • B1: The first alternative (B1) for the illumination is characterized in that the scene is scanned sequentially by means of a single illumination light beam with high focusing and low divergence, i.e. high collimation. For each position of the illumination light beam in the scene, a measurement of the transit time of the illumination light and of the measurement light is carried out.
  • The scanning can be realized using movable optical components, in particular mirrors. Alternatively or in combination, a solid body which manages without mechanically moving parts and has integrated photonic structures or circuits can be used for the sequential scanning of the scene with the illumination light beam. With a suitable control of these structures, the illumination light beam is then directed to the desired location of the scene.
  • Such a solid body is known, for example, from US 2015/293224 A1.
  • B2: The second alternative (B2) for the illumination is characterized in that the entire scene is illuminated all at once and over its full area. If necessary, the intensity of the illumination light can be (selectively) increased in selected subregions of the scene in order to enable an improved 3D object detection at these locations. Such a spatially uneven distribution of the illumination light can be achieved without moving optical components, for example by means of a diffractive optical element (DOE).
  • M1: The first alternative (M1) for the measurement is based on pulsed illumination light. In this case, the "travel time" of a light pulse is detected on the receiver side for each pixel within a time window, and the distance is derived therefrom.
  • M2: The second alternative (M2) for the measurement is based on a temporal, preferably sinusoidal, modulation of the illumination light with a predetermined frequency, appropriate values for this frequency depending on the expected transit time or the maximum detection distance. The phase difference is measured for each pixel and the distance information is derived therefrom.
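  • How "appropriate values for this frequency" follow from the maximum detection distance can be sketched with the standard continuous-wave TOF relation (illustrative only; the application gives no formula):

```python
C = 299_792_458.0  # speed of light in m/s

def unambiguous_range(mod_freq_hz: float) -> float:
    """A full 2*pi phase shift corresponds to a round trip of one modulation
    wavelength, so larger distances would wrap around ambiguously."""
    return C / (2.0 * mod_freq_hz)

def max_mod_freq(required_range_m: float) -> float:
    """Highest modulation frequency whose unambiguous range still covers
    the required maximum detection distance."""
    return C / (2.0 * required_range_m)

print(unambiguous_range(20e6))    # ~7.5 m at 20 MHz
print(max_mod_freq(15.0) / 1e6)   # ~10 MHz for a 15 m range
```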
  • Both measuring principles M1 and M2 are based on an integration of the number of photons, or of the photoelectrons generated in the light receiver, that arrive at each pixel to be measured. The ever-present light or photon noise depends on the number of photons accumulated in a pixel: the higher the number of accumulated photons, the more accurate the distance information obtained from the TOF measurement.
  • Figure 1 shows the use of a sensor system for controlling the coverage characteristic of an opening.
  • Figure 2 shows the use of a sensor system for detecting a traffic flow of objects on a conveyor belt.
  • FIGS. 3a and 3b illustrate a combination of individual pixels of a light receiver.
  • FIGS. 4a to 4c show different beam cross-sections of an illumination light for adapting the illumination to the shape of the scene to be detected.
  • Features or components of different embodiments which are the same or at least functionally identical are provided with the same reference numerals, or with reference numerals which are identical in their last two digits to the reference numerals of the corresponding same or at least functionally identical features or components.
  • Depending on the nature of the scene, the optical energy of the measurement light arriving at the light receiver can be optimized by illuminating the scene differently.
  • The monitoring of a zone can, for example, be done with a "wake-up delay", which allows monitoring with a less powerful illumination light. This can be realized by a function-dependent illumination in which an energy optimization is achieved by controlling the beam-dependent illumination energy over a time average. Different strategies can be used for this.
  • The illumination intensity of at least one individual light-emitting element can be varied with respect to the other elements. This variation can already be realized constructively in the construction of the corresponding lighting device, for example by the use of light-emitting diodes of different intensity. Alternatively or in combination, it can also be achieved by the type of control of the LEDs, each with an individually adjustable current.
  • Individual power control can also be performed dynamically during operation of the sensor system. If an illumination beam scans the entire scene sequentially, the current solid angle of the illumination beam is typically known at any time. By varying the intensity of the illumination light beam, the illumination can thus be controlled or regulated depending on the optical characteristic of the detected scene (and optionally with a dynamic control or regulation depending on the intensity of the measurement light scattered back from the respective solid angle).
  • A static scene can be measured once, and the appropriate illumination light intensity for each solid angle can be taught in.
  • An optional dynamic adjustment of the illumination light intensity can be carried out in real time, in that those solid angle ranges from which only a small intensity of measurement light is received are immediately, i.e. without delay, illuminated more strongly. Such a variation may also be applied adaptively from one scene capture to the next.
  • If necessary, the frequency or repetition rate of entire scene acquisitions can also be reduced (with the lighting switched off between the scene captures), which contributes to a (further) energy saving. Merely halving the frequency already results in a nearly 50% saving of the energy consumed by the sensor system.
  • The illumination intensity in a subregion of the scene can be selectively increased by an appropriate focusing of the illumination light. The lateral resolution of the scene detection then decreases, because with such a focusing one part of the subregion is supplied with less illumination light than another part of this subregion. If such a measure still provides a sufficient spatial resolution for the particular application, however, it allows the relevant subregion to be detected with less overall illumination energy.
  • FIG. 1 shows the use of a sensor system 100 for controlling a coverage characteristic of an opening 184 depending on the characteristic of a scene 190 monitored by the sensor system 100.
  • The opening 184 is, for example, an entrance into a building for persons or a garage entrance into a garage for motor vehicles.
  • the corresponding input structure is provided with the reference numeral 180.
  • The entrance structure 180 includes a stationary support structure 182, which represents a frame and a guide for two sliding doors (closing bodies) 186.
  • The sliding doors 186 can each be moved by means of a motor 187 along the directions indicated by the two thick double arrows.
  • the actuation of the motors 187 takes place, as explained below, by means of the sensor system 100 described in this document.
  • the sensor system 100 has a TOF sensor 110, a control unit 140, a data processing device 150 and a database 160.
  • the TOF sensor 110 contains all the optical components of the sensor system 100
  • The sensor system 100 is preferably (in contrast to the representation of FIG. 1) constructed as a module which, in addition to the TOF sensor 110, also contains the control unit 140, the data processing device 150 and the database 160 within a compact design.
  • The control unit 140 has an interface 142 via which an external control signal 142a can be received.
  • The external control signal may originate from an attached system (not shown), for example a monitoring system, which controls the operation of the sensor device 100 depending on external parameters.
  • an external parameter may, for example, be a previously known object property.
  • Signaling data transmitted via the interface 142 may also include information about the detected and evaluated scene 190.
  • Such information may be, for example, that a wanted license plate was recognized, that a parking lot was illegally occupied, or that a suspicious object is in the monitored scene 190.
  • A corresponding flow of information takes place from the sensor system 100, more specifically from the control unit 140, to the attached system.
  • the data processing device 150 is also provided with an interface 152, which can also receive an external control signal, which is provided with the reference numeral 152a.
  • This control signal 152a can be used to make the operation of the data processing device 150 at least partially dependent on external information.
  • an a priori knowledge about an object 195 can be transmitted via the control signal 152a for an improved evaluation of the detected scene 190.
  • The scene detection is done by a "two-stage detection" with different photon accumulations, as described above.
  • The TOF sensor 110 comprises an illumination device 130, for example an array of light-emitting diodes, which illuminates the scene 190, and thus also the object 195 in the scene 190, with a pulsed and temporally modulated illumination light 131. Further, the TOF sensor 110 includes a light receiver 120, which receives the illumination light 131 backscattered from the object 195; this backscattered light is referred to in this document as measurement light 196.
  • The spatial detection of the scene 190 takes place on the basis of the principles of transit-time measurement explained in detail above, which is also referred to as time-of-flight (TOF) measurement.
  • the corresponding TOF data is transferred to the data processing device 150. This can be done directly or indirectly via the control unit 140.
  • The illumination device 130 can, in addition to the illumination units shown in Figure 1, also have further lighting units that illuminate the scene 190 from a different angle. Likewise, the lighting units can also be arranged outside the housing of the TOF sensor 110 and thus be spaced further from the light receiver 120.
  • The detected optical scene 190 is evaluated by means of suitable methods of image evaluation. In doing so, first data are obtained from the above-described first scene capture with the first photon accumulation and/or second data from the above-described second scene capture with the second photon accumulation. Within certain time intervals, an image of the scene 190 is repeatedly captured with the first and/or the second photon accumulation.
  • Based on respective positional shifts of the object 195, the data processing device 150 is capable of determining not only its velocity as an absolute value but as a motion vector (with direction information).
  • A knowledge of the exact position and/or the movement profile of the object 195 can then advantageously be used by the data processing device 150 to control both motors 187 in a suitable manner.
  • The sliding doors 186 are only opened when the object 195 actually moves in the direction of the opening 184. If the object 195 is a vehicle of so-called cross-traffic, which essentially moves past the opening 184, the corresponding motion vector is detected by the data processing device 150 and the sliding doors 186 are not opened; a sketch of such a direction gate follows below.
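  • A minimal sketch of such a direction gate (hypothetical 2D ground-plane coordinates and threshold; the application does not specify the algorithm):

```python
import math

def approach_angle(pos_prev, pos_now, door_center):
    """Angle between the object's motion vector and the direction from the
    object to the door; small angles mean the object heads for the opening."""
    vx, vy = pos_now[0] - pos_prev[0], pos_now[1] - pos_prev[1]
    dx, dy = door_center[0] - pos_now[0], door_center[1] - pos_now[1]
    norm = math.hypot(vx, vy) * math.hypot(dx, dy)
    if norm == 0.0:
        return math.pi  # no motion or already at the door: do not trigger
    cos_a = max(-1.0, min(1.0, (vx * dx + vy * dy) / norm))
    return math.acos(cos_a)

def should_open(pos_prev, pos_now, door_center, max_angle=math.radians(30)):
    return approach_angle(pos_prev, pos_now, door_center) <= max_angle

print(should_open((0, 8), (0, 6), (0, 0)))    # True: moving toward the door
print(should_open((-1, 5), (0, 5), (0, 0)))   # False: cross-traffic
```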
  • The sensor system 100 is also capable of performing an object recognition. For this purpose, the data processing device 150 accesses a data record of reference objects stored in the database 160, which correspond to selected objects that are authorized to pass through the opening 184.
  • the sliding doors 186 are opened only when the detected object 195 at least approximately coincides with one of the stored reference objects.
  • This means that the coverage characteristic of the opening 184 depends not only on the motion profile of the object 195; an object-based access control also takes place.
  • FIG. 2 shows a further use of the sensor system described in this document.
  • the TOF sensor 110 detects a traffic flow of (different) objects 295a, 295b, 295c, 295d, and 295e that are on a conveyor belt 298 and move through a scene 290 along the direction of movement represented by an arrow.
  • Reliable knowledge of the number and/or type of the objects 295a to 295e may be used in the field of logistics for traffic flow control. Merely one example of such a control of a traffic flow is the control of the luggage transport in an airport. By means of labels on the relevant objects 295a to 295e, the type of the respective object can also be determined. It should be noted, however, that use in an airport is just one example of a variety of other traffic control applications.
  • FIGS. 3a and 3b illustrate a combination of individual pixels of a light receiver 320a designed as a semiconductor or CCD chip. The light receiver 320a has a plurality of light-sensitive pixels 322a. The pixels 322a are associated with the full spatial resolution of the light receiver 320a, which resolution is predetermined by the semiconductor architecture of the chip 320a.
  • In the light receiver 320b, four of the light-sensitive pixels (of the full resolution) are respectively combined into one superordinate pixel 322b (of a reduced resolution). Within the same time, such a pixel 322b accumulates four times as much light as a single pixel 322a.
  • Such "binning" reduces the required (minimum) intensity of the detected measurement light which is needed to evaluate the corresponding image area of the scene. Since the intensity of the measurement light depends directly on the intensity of the illumination light, binning allows the intensity of the illumination light to be reduced, and thus the energy consumption of the sensor system as well.
  • the described "binning" can also be realized dynamically by a corresponding control of one and the same light receiver 320a or 320b.
  • Depending on the control, the light receiver is operated either in a first operating mode (with full resolution) or in a second operating mode (with reduced resolution and "binning"). Switching between the different operating modes may be controlled by external control signals. Alternatively or in combination, such switching may also depend on the result of a scene evaluation, so that the "binning" operating mode is regulated for a next scene capture.
  • In this way, portions of the scene can be recorded with a higher spatial resolution (and lower photon accumulation) and other portions of the scene with a lower spatial resolution (and higher photon accumulation).
  • The described local and differently strong grouping of pixels can be performed dynamically or adaptively in exactly those subregions in which a specific object is currently located.
  • Figures 4a to 4c show different beam cross-sections of an illumination light. A first illumination light 431a, illustrated in FIG. 4a, has a substantially circular beam cross-section and is preferably suitable for "round scenes". For most applications, however, which do not detect (and evaluate) a "round scene", a beam cross-section deviating from the circular shape is suitable.
  • FIG. 4b shows an illumination light 431b with an elliptical beam cross section.
  • FIG. 4c shows an illumination light 431c with a rectangular beam cross-section.
  • The beam cross-section can be suitably adapted to the scene to be detected by optical components such as mirrors and refractive optical elements (e.g. a lens system). Diffractive optical elements (DOEs) can also be used, which optionally even allow a dynamic and/or scene-dependent shaping of the beam cross-section.
  • Reference signs: 322b superordinate pixel / combined pixel; 431a illumination light with round cross-section; 431b illumination light with elliptical cross-section; 431c illumination light with rectangular cross-section.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a sensor system (100) and a method for the three-dimensional optical capture of a scene (190). The sensor system (100) comprises (a) a sensor (110) for measuring distances based on a light transit time; (b) a control unit (140) for controlling the operation of the sensor (110) such that, upon a first capture of the scene (190), a first number of photons is accumulated per pixel (322a) of the sensor (110) and, upon a second capture of the scene (190), a second number of photons is accumulated per pixel (322b) of the sensor (110), the second number being greater than the first number; and (c) a data processing device (150) downstream of the sensor (110), which is configured to evaluate the scene (190) based on a first result of the first capture of the scene (190) and/or a second result of the second capture of the scene (190). The invention further relates to various uses of such a sensor system (100).
EP18825924.6A 2017-12-12 2018-12-11 Sensor system for the three-dimensional capture of a scene by means of multiple photon accumulations Withdrawn EP3724676A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102017129626.3A DE102017129626A1 (de) 2017-12-12 2017-12-12 Sensor system for the three-dimensional capture of a scene with different photon accumulations
PCT/EP2018/084424 WO2019115561A1 (fr) 2017-12-12 2018-12-11 Sensor system for the three-dimensional capture of a scene by means of multiple photon accumulations

Publications (1)

Publication Number Publication Date
EP3724676A1 (fr) 2020-10-21

Family

ID=64870434

Family Applications (1)

Application Number Title Priority Date Filing Date
EP18825924.6A 2017-12-12 2018-12-11 Sensor system for the three-dimensional capture of a scene by means of multiple photon accumulations Withdrawn EP3724676A1 (fr)

Country Status (3)

Country Link
EP (1) EP3724676A1 (fr)
DE (1) DE102017129626A1 (fr)
WO (1) WO2019115561A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220397674A1 (en) * 2019-07-05 2022-12-15 Sony Semiconductor Solutions Corporation Time-of-flight sensing circuitry with different imaging modes and method for operating such a time-of-flight sensing circuitry
DE102020203626B4 2020-03-20 2024-03-07 Geze Gmbh Automatic window or door system

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10351714B4 (de) * 2003-11-05 2005-09-29 Eads Astrium Gmbh Vorrichtung zur optischen Erfassung eines entfernten Objekts
WO2011128408A1 (fr) * 2010-04-15 2011-10-20 Iee International Electronics & Engineering S.A. Dispositif de détection de commande d'accès configurable
DE102010033818A1 (de) * 2010-08-09 2012-02-09 Dorma Gmbh + Co. Kg Sensor
EP2453252B1 (fr) 2010-11-15 2015-06-10 Cedes AG Capteur 3D à économie d'énergie
EP2469301A1 (fr) * 2010-12-23 2012-06-27 André Borowski Procédés et dispositifs pour générer une représentation d'une scène 3D à très haute vitesse
DE202012010014U1 (de) * 2012-10-19 2014-01-20 Sick Ag Laserscanner
US10132928B2 (en) 2013-05-09 2018-11-20 Quanergy Systems, Inc. Solid state optical phased array lidar and method of using same
DE102015112656A1 (de) * 2015-07-31 2017-02-02 Sick Ag Distanzsensor
DE102015220798A1 (de) * 2015-10-23 2017-04-27 Designa Verkehrsleittechnik Gmbh Zugangskontrollsystem für einen Lagerbereich sowie Verfahren zur Zugangskontrolle
US10761196B2 (en) * 2016-02-18 2020-09-01 Aeye, Inc. Adaptive ladar receiving method
DE102016013861A1 (de) * 2016-11-22 2017-05-18 Daimler Ag Optischer Sensor für ein Kraftfahrzeug und Kraftfahrzeug

Also Published As

Publication number Publication date
DE102017129626A1 (de) 2019-06-13
WO2019115561A1 (fr) 2019-06-20

Similar Documents

Publication Publication Date Title
EP2667218B1 Energy-saving 3D sensor
EP2946227B1 Sensor arrangement for detecting control gestures in vehicles
EP2946226B1 Universal sensor arrangement for detecting control gestures in vehicles
EP1159636A1 Spatially resolved distance measuring system
EP3033251A1 Sensor arrangement for detecting operating gestures on vehicles
EP2314427A2 Method and control device for controlling electrical devices by detecting movements
WO2020229186A1 3D sensor system capable of operating in different operating modes depending on an operating state of a closing body
DE102004047022A1 Device for monitoring spatial areas
EP2145289B1 Motor vehicle
EP3724676A1 Sensor system for the three-dimensional capture of a scene by means of multiple photon accumulations
DE102018115274A1 Monitoring device and method for monitoring a door area of a vehicle door for a vehicle, and door system with a monitoring device
DE102005011116B4 Device for controlling and/or monitoring a door or window leaf
WO2017041915A1 Sensor system of a detection device of a motor vehicle
DE102014205282B4 Device for opening or closing an opening of a vehicle
WO2020229190A1 Identification of an object based on recognition of a part of the object and description data of a reference object
EP3724674A1 3D sensor system with free-form optics
EP3724601B1 Distance determination based on different depths of field with different focus settings of an objective lens
WO2020229189A1 TOF sensor system with an illumination device comprising an array of individual light sources
DE102017129654A1 3D scene capture based on light transit times and a characteristic feature of detected electromagnetic waves
WO2019115559A1 3D sensor system with solid-angle-dependent scene illumination
DE102017129663A1 Sensor system with a light-field camera for determining the characteristic of a scene
DE102013018800A1 Method and device for optically determining distances to objects in a monitored area, in particular in a monitored area of automatic doors
DE102017220397A1 Distance measuring unit
DE102021115280A1 Automatic door arrangement with a sensor device and method for operating such an automatic door arrangement
DE102011086026A1 System and method for object recognition

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20200604

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

RIN1 Information on inventor provided before grant (corrected)

Inventor name: ECKSTEIN, JOHANNES

Inventor name: HUNZIKER, URS

Inventor name: SEILER, CHRISTIAN

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20210623

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20220104