WO2020229186A1 - 3D sensor system operable in different operating modes depending on an operating state of a closure body - Google Patents

3D sensor system operable in different operating modes depending on an operating state of a closure body

Info

Publication number
WO2020229186A1
WO2020229186A1 (PCT/EP2020/061901)
Authority
WO
WIPO (PCT)
Prior art keywords
scene
sensor system
light
operating mode
area
Prior art date
Application number
PCT/EP2020/061901
Other languages
German (de)
English (en)
Inventor
Heinz Macher
Urs Hunziker
Original Assignee
Bircher Reglomat Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bircher Reglomat Ag
Publication of WO2020229186A1


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48 Details of systems according to group G01S 17/00
    • G01S 7/4802 Using analysis of echo signal for target characterisation; target signature; target cross-section
    • G01S 7/481 Constructional features, e.g. arrangements of optical elements
    • G01S 7/4814 Constructional features of transmitters alone
    • G01S 7/4815 Constructional features of transmitters alone using multiple transmitters
    • G01S 7/4816 Constructional features of receivers alone
    • G01S 7/4817 Constructional features relating to scanning
    • G01S 7/483 Details of pulse systems
    • G01S 7/484 Transmitters
    • G01S 7/486 Receivers
    • G01S 7/4861 Circuits for detection, sampling, integration or read-out
    • G01S 7/4863 Detector arrays, e.g. charge-transfer gates
    • G01S 7/4868 Controlling received signal intensity or exposure of sensor
    • G01S 7/491 Details of non-pulse systems
    • G01S 7/4911 Transmitters
    • G01S 7/4912 Receivers
    • G01S 7/4913 Circuits for detection, sampling, integration or read-out
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/04 Systems determining the presence of a target
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems for mapping or imaging
    • G01S 17/894 3D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar

Definitions

  • The 3D sensor system can be operated in different operating modes depending on the operating state of a closure body.
  • the present invention relates to a sensor system and a method for three-dimensional detection of a scene based on time of flight measurements.
  • the present invention also relates to a use of such a sensor system.
  • To close such openings, closure bodies operated by means of actuators are often used; these facilitate the handling of the relevant closure body for operators, or are operated automatically without any operator action, for example when an object to be passed through the opening comes into the area of the opening.
  • Such an opening can be a passage in a building, for example.
  • a closing body can be a door, for example.
  • Such a sensor system can automatically open and/or close the closure body. For this purpose, light signals from one or more illumination sources are emitted and detected by a light receiver after an at least partial reflection or 180° backscatter.
  • Such sensor systems are generally referred to as “time-of-flight” (TOF) sensor systems.
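The time-of-flight principle referred to above can be sketched in a few lines: the distance follows directly from the round-trip transit time of the light. The function and constant names below are illustrative, not taken from the patent.

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance between sensor and object from the measured round-trip
    time of flight of an illumination pulse (illustrative sketch)."""
    # The pulse travels to the object and back, hence the factor 1/2.
    return C * round_trip_seconds / 2.0

# A round trip of 20 ns corresponds to roughly 3 m:
d = tof_distance(20e-9)
```

In a real TOF camera the transit time is measured per pixel, so the same formula yields one distance value for every pixel of the scene capture.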
  • TOF sensor systems have the disadvantage that the intensity of the measurement light to be detected (after backscatter) by the light receiver of the TOF sensor is weakened in two ways. In the case of a point illumination light source without special focusing, the intensity of the illuminating light already scales with 1/d², d being the distance to the object. Since the illuminating light is scattered at the object at least approximately isotropically, the object in turn acts as a point light source, so that overall this leads to a 1/d⁴ scaling of the intensity of the measurement light. With focused illumination, the intensity attenuation is correspondingly lower, but nevertheless contributes to a significant loss of light output. This in turn leads to a correspondingly high energy consumption of TOF sensor systems.
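The 1/d⁴ attenuation for an unfocused point source with isotropic backscatter can be illustrated numerically; the proportionality constant `k` below lumps together reflectivity, receiver aperture and geometry factors and is an assumption for illustration.

```python
def received_intensity(p_emit: float, d: float, k: float = 1.0) -> float:
    """Relative measurement-light intensity at the receiver for an
    unfocused point illumination source and isotropic backscatter.
    Each leg of the light path contributes a 1/d^2 falloff, giving
    1/d^4 overall (k is an illustrative lumped constant)."""
    return k * p_emit / d**4

# Doubling the distance reduces the received intensity by a factor of 16:
ratio = received_intensity(1.0, 1.0) / received_intensity(1.0, 2.0)
```

This steep falloff is why, without countermeasures such as binning or mode switching, either the illumination power or the pixel integration time must grow rapidly with range.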
  • A system for automatically monitoring an opening or passage which can be closed by two sliding doors is known from WO 2018/064745 A1. In this system, 3D cameras are used which are based on the TOF principle. The movement area around the sliding doors is divided into two sub-areas that are evaluated separately from each other. In a first sub-area, attention is paid to safety in order to prevent an object from being trapped by the sliding doors. In a second sub-area, an image evaluation takes place with regard to another purpose, for example for activating the sliding doors or for counting objects passing through this second sub-area.
  • The present invention is based on the object of enabling, in the field of automatic monitoring of openings which can be closed by means of at least one closure body, a three-dimensional detection of a scene that is efficient from an energetic point of view and nevertheless reliable.
  • Described is a sensor system for three-dimensional detection of a scene in a predetermined spatial area which is assigned to a closure body for a passage.
  • The sensor system described has (a) an illumination device for illuminating the scene with illuminating light; (b) a measuring device (b1) for receiving measurement light, which is illuminating light at least partially backscattered from at least one object contained in the scene, and (b2) for measuring distances between the sensor system and the at least one object based on the light transit time of the illuminating light and the measurement light; (c) a data processing and control device connected downstream of the measuring device for determining a three-dimensional characteristic of the scene based on the measured distances; and (d) a device by means of which the illumination device and/or the measuring device can be operated in a first operating mode with a first energy consumption and in a second operating mode with a second energy consumption which is smaller than the first energy consumption.
  • The described sensor system is based on the knowledge that the energy consumption of a 3D sensor (system), which is typically significantly greater than that of a 2D sensor, can easily be reduced if the sensor is operated in an "energy-saving mode" whenever the requirements for scene monitoring are not so demanding. For a 3D sensor that is used to monitor a passage closable by a closure body, this is the case, for example, when the closure body is at rest or when it is moved in the direction of an opening position. In these situations the danger that an object, in particular a person, located in the range of motion of the closure body is trapped by the closure body does not exist or is at least significantly reduced.
  • The operating state of the closure body can be a state of movement of the closure body and can in particular be determined by whether the closure body moves in the direction of its opening position or its closing position. The speed at which the closure body moves can also characterize the state of movement. Furthermore, the operating state can also depend on the current position of the closure body, this position being a rest position or a position over which the closure body is currently moving.
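The mapping from the closure body's operating state to an operating mode of the sensor system can be sketched as follows; the state and mode names are illustrative assumptions, not terminology fixed by the patent.

```python
from enum import Enum

class DoorState(Enum):
    """Illustrative movement states of the closure body."""
    AT_REST = "at_rest"
    OPENING = "opening"
    CLOSING = "closing"

class Mode(Enum):
    FIRST = "first"    # higher energy consumption, full monitoring
    SECOND = "second"  # lower energy consumption, energy-saving mode

def select_mode(state: DoorState) -> Mode:
    # Sketch: only a closing movement creates a trapping hazard, so only
    # then is the energy-intensive first operating mode required.
    return Mode.FIRST if state is DoorState.CLOSING else Mode.SECOND
```

A real implementation would likely also consider the closure body's speed and current position, as the surrounding text notes.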
  • the operating mode of the lighting device and / or the measuring device can be any type of operating mode that (also) determines the energy consumption of the entire sensor system.
  • the energy consumption can relate to the lighting device and / or the measuring device.
  • The term "scene" can in particular be understood to mean that spatial area which is recorded and evaluated by the sensor system using image processing methods; objects in the scene are recognized by suitable image evaluation. Known methods for image evaluation and/or image analysis can be used for this purpose. The data processing and control device can accordingly be or comprise a special image processing processor that is configured to apply known methods for image evaluation and/or image processing.
  • the term “illuminating light” is to be understood as meaning those electromagnetic waves which are emitted by a light source of the illuminating device and strike the relevant object in the scene.
  • The "measurement light" comprises those electromagnetic waves which are scattered back from or at the object, received by a light receiver of the measuring device, and used for the three-dimensional evaluation of the scene.
  • The terms "optical" and/or "light" can refer to electromagnetic waves that have a specific wavelength or frequency or a specific spectrum of wavelengths or frequencies.
  • the electromagnetic waves used can be assigned to the spectral range that is visible to the human eye.
  • electromagnetic waves assigned to the ultraviolet (UV) or the infrared (IR) spectral range can also be used.
  • The IR spectral range can extend into the long-wave IR range with wavelengths between 3.5 µm and 15 µm, which can be detected by means of the light receiver of the sensor system.
  • The term "object" can be understood to mean any three-dimensional physical structure which has a surface quality that leads to at least partial reflection or scattering of illuminating light, and which is thus visible to the measuring device through the resulting measurement light.
  • the object can be an object such as a motor vehicle or a living being such as a person.
  • The object can be one that moves relative to the sensor system. The movement of the object can then be determined by repeated scene detection (by comparing the different spatial positions determined in different scene detections). In addition to the absolute value of the speed, the motion vector, i.e. the direction of movement, can also be determined.
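Determining speed and direction from two successive scene detections amounts to a finite difference of the object's position over time. The following sketch assumes positions are given as 3D coordinate tuples in metres and timestamps in seconds; all names are illustrative.

```python
def motion_vector(p1, p2, t1, t2):
    """Velocity vector (m/s) of an object from two spatial positions
    determined in two successive scene detections (illustrative)."""
    dt = t2 - t1
    return tuple((b - a) / dt for a, b in zip(p1, p2))

def speed(v):
    """Absolute value (magnitude) of the velocity vector."""
    return sum(c * c for c in v) ** 0.5

# Object moved 0.3 m in x and 0.4 m in y within one second:
v = motion_vector((0.0, 0.0, 0.0), (0.3, 0.4, 0.0), t1=0.0, t2=1.0)
# speed(v) is 0.5 m/s; the direction of movement is given by v itself.
```

With more than two captures, averaging over several such differences would reduce the influence of measurement noise on the estimated motion.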
  • The spatial area which is assigned to the closure body can depend on the type of closure body and its movement behavior. It can also depend on the expected speed at which objects approach the passage and possibly want to pass through it. The size and/or shape of the spatial area may likewise depend on the characteristics of the objects to be expected.
  • The closure body can be any desired barrier element by means of which an opening can be temporarily at least partially closed.
  • The closure body can be, for example, (i) a sliding door, (ii) a revolving door, (iii) a turnstile, (iv) a gate, in particular a garage door, or (v) a barrier, in particular at a level crossing or a vehicle entrance. The type of closure body can in particular depend on the type of passage and/or on the type of objects for which the passage is to be opened or closed.
  • The danger posed by a closure body can in particular be determined by its closing edges when the closure body moves, in particular when it closes. Closing edges can be divided into main closing edges, secondary closing edges and counter closing edges.
  • A primary edge of the closure body is referred to as the main closing edge. This is usually the closing edge which, when the closure body moves, covers the greatest distance and thus (at the greatest speed) sweeps over the largest area. The main closing edge therefore determines the main hazard zone of the closure body.
  • Secondary edges of a closure body are referred to as secondary closing edges; these typically cover a smaller distance (at a lower speed) when the closure body moves, and accordingly sweep over a smaller area. Counter closing edges are edges which typically run opposite and/or parallel to the main closing edge.
  • the data processing and control device can be any type of data processing and control device.
  • The data processing and control device can thus be implemented by means of software and/or by means of one or more special electronic circuits, i.e. in hardware. The data processing and control device can be implemented logically and/or in terms of equipment by means of a common processor or function block or by means of several processors or function blocks. A combination of the two is also possible.
  • a closing body can be a door or a gate, for example.
  • the following types of doors or gates can be monitored with the described sensor system. The following list is not exhaustive.
  • Swing doors/gates: closure bodies with one or two leaves that rotate about a vertical axis at one leaf edge.
  • Sliding doors/gates: closure bodies with one or more horizontally moving door leaves that move in their own plane across an opening or passage.
  • Folding wing doors/gates: closure bodies with two or more wings that are hinged to one another and where one wing edge is connected to a frame.
  • Roller doors: closure bodies with a flexible leaf that is moved vertically and, when opened, is wound onto a winding shaft.
  • Sectional doors: closure bodies with a non-rigid leaf made up of a number of sections that are typically connected to one another horizontally. The way in which the leaf is stowed in the upper opening position depends on the type (e.g. horizontal, vertical, folded).
  • Up-and-over doors: closure bodies with a rigid leaf which, when actuated, executes a tilting movement and remains in an upper horizontal position when fully opened.
  • In the first operating mode, at least a part of the scene can be detected with greater spatial accuracy than in the second operating mode. The accuracy can depend on the intensity of the illumination by the illuminating light and/or on the spatial resolution of the measuring device.
  • a greater illumination intensity ensures better illumination so that objects can be better recognized.
  • A higher spatial resolution typically requires a higher energy consumption, because a certain minimum number of photons must be accumulated per pixel of a sensor chip in order to produce a usable signal.
  • The sensor system is configured to record the scene with a time sequence of scene recordings, the number of scene recordings within a predetermined time period being greater in the first operating mode than in the second operating mode. This means that a (mean) time resolution or a (mean) repetition rate is greater in the first operating mode than in the second operating mode.
  • the idle state can be a so-called standby state or "sleep mode".
  • In the idle state, the sensor system can be deactivated apart from monitoring for an event that causes a transition to the first (active) operating mode. Such monitoring can also comprise the reception of a wake-up signal, which is received (by the data processing and control device) from an external unit.
  • the spatial area comprises a first partial area and a second partial area, the second partial area being at least partially different from the first partial area.
  • the two spatial sub-areas can be spatially separated from one another or have a certain spatial overlap.
  • the two spatial sub-areas are assigned different tasks or hazard potentials.
  • The first spatial sub-area can be, for example, the immediate movement area of the closure body, in which a collision between an object and the closure body is possible; this sub-area can also be referred to as a safety area. The second sub-area can be that area in which objects are recognized that may be moving in the direction of the passage. The detection of such objects can then, depending on the application, lead to the closure body being moved from its open position to its closed position or vice versa. Such a second sub-area can also be referred to as an activation area.
  • Known sensors used for such an activation are, for example, radar sensors, 3D sensors or the like.
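The division into a safety sub-area and an activation sub-area can be sketched as a simple point classification. For illustration the sub-areas are modelled as axis-aligned rectangles in the floor plane ((xmin, xmax), (ymin, ymax)); this geometry, like all names below, is an assumption, and the sub-areas may overlap, with the safety area taking precedence.

```python
def classify(point, safety_zone, activation_zone):
    """Assign a detected object position (x, y) to the safety sub-area
    (immediate movement area of the closure body) or the activation
    sub-area (illustrative sketch with rectangular zones)."""
    def inside(zone):
        (xmin, xmax), (ymin, ymax) = zone
        return xmin <= point[0] <= xmax and ymin <= point[1] <= ymax
    if inside(safety_zone):
        return "safety"      # collision with the closure body possible
    if inside(activation_zone):
        return "activation"  # object may be approaching the passage
    return "outside"

safety = ((0.0, 1.0), (0.0, 2.0))      # directly in front of the door
activation = ((0.0, 4.0), (0.0, 2.0))  # larger, overlapping approach area
```

An object classified as "safety" would block or reverse the closing movement, while "activation" might merely trigger opening of the closure body.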
  • The spatial extent and/or the spatial position of at least one of the two spatial sub-areas in the first operating mode can be different from the spatial extent and/or the spatial position in the second operating mode.
  • The dynamic division can go so far that in one of the two operating modes only one of the two sub-areas is detected. According to a further embodiment, the sensor system furthermore has an adaptive optical device for changing the spatial area.
  • The data processing and control device is set up in such a way that the spatial area depends on the current operating mode. In particular, the size, the shape and/or the spatial position of the spatial area can depend on the current operating mode.
  • By adapting the spatial area to the operating state of the closure body, a further energy saving can be realized. The adaptive optical device can also be used to divide the entire spatial area into the two spatial sub-areas described above, depending on the current operating mode.
  • the adaptive optical device can have an adaptive optics such as a deformable refractive lens or a diffractive optical element (DOE).
  • Such adaptive optics can be assigned to the lighting device and / or the measuring device.
  • a plurality of adaptive optics that can optionally be activated independently of one another can also be used.
  • Adaptive optics can also be combined with other, non-adaptive optics that cannot be changed over time.
  • diffraction in this context generally refers to the spatial deflection of an electromagnetic wave at structural obstacles.
  • Such obstacles can be an edge, a hole or a one-dimensional, a two-dimensional or even a three-dimensional grid.
  • The DOE can advantageously allow a dynamic adaptation of the lighting characteristics during the operation of the sensor system.
  • Refraction refers to the change in the direction of propagation of a wave due to a spatial change in its propagation speed, which for light waves is determined by the refractive index of the medium.
  • The sensor system also has a data output for outputting an output signal which is indicative for the control of the closure body. The output signal can in particular be generated by the data processing and control device, which controls the closure body in a suitable manner based on an automatic image evaluation. For example, certain objects can be denied passage through the closure body while other objects are allowed through the passage.
  • Such a distinction can be made by automatic image evaluation with regard to the type and / or the identity of an object.
  • The data processing and control device is configured such that at least two scene captures carried out at a certain time interval from one another can be combined with one another. In this way, the energy consumption of the sensor system can be further reduced. The number of scene captures that are processed together can depend on the required time resolution: for objects that only move comparatively slowly, the number of jointly evaluated scene captures can be larger than for faster objects.
  • The measuring device comprises (a) a light receiver with a plurality of pixels for receiving the measurement light and (b) a light receiver control device coupled to the light receiver, the light receiver control device and the light receiver being configured such that at least two pixels of the plurality of pixels are combined into a superordinate pixel. The plurality of pixels can be combined in such a way that in each case a certain number of pixels form one superordinate pixel; this number can be, for example, two, three, four, six, eight or nine, and an even stronger grouping of pixels is of course also possible. Such a grouping of pixels, which is also referred to as "binning", has the effect that, at the expense of spatial resolution, the number of photons of the measurement light that are collected or accumulated per pixel during a scene detection is increased according to the number of pixels combined into a superordinate pixel. As a result, especially with weak measurement light, the so-called statistical (photon shot) noise is reduced.
  • Binning can also be carried out locally, i.e. in only a partial area of the active area of the light receiver. This then leads to an inhomogeneous spatial resolution, which as such is not necessarily desirable; in many applications, however, this disadvantage is overcompensated by the increased photon accumulation.
  • A local "binning" can take place, at least with some known light receivers, without special electronic or apparatus elements, simply by a corresponding control of the light receiver, which control determines the "binning" and thus the operating mode of the sensor system.
  • A local "binning" can be carried out in such a way that precisely those areas of the light receiver which, as measured by the measuring device and/or learned by the data processing and control device, have received too little light energy in at least one previous scene detection are combined in a suitable manner into superordinate pixels (through a suitable control of the light receiver by the light receiver control device) during subsequent scene detections.
  • A dynamically controlled or regulated "binning" can be learned during normal operation of the sensor system and/or during the configuration of the sensor system, for example in the context of (initial) installation, maintenance, or cyclical or event-related reconfiguration.
  • When differently sized groups of pixels are combined along different directions, the spatial resolution of the light receiver is different along those directions even if the individual pixels have a square shape.
  • This can be used to advantage in some applications.
  • Such an application arises, for example, when a movement of an object in the scene along a previously known spatial direction is to be detected with higher accuracy than a movement along another spatial direction, preferably perpendicular thereto. In that case the number of (superordinate) pixels arranged along a line parallel to this previously known spatial direction can be larger than the number of pixels arranged along a line perpendicular thereto. The spatial resolution along the direction of movement is then greater than the spatial resolution perpendicular to it, and the movement profile of such a linearly moving object can be determined with particularly high accuracy even with comparatively weak measurement light.
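The anisotropic binning described above corresponds to a rectangular, rather than square, superordinate pixel. The sketch below uses 1 x 4 blocks: full resolution is kept along the known direction of movement (here: down the columns, one pixel row per superordinate pixel), while four neighbouring pixels perpendicular to it are accumulated. Names and block shape are illustrative.

```python
def bin_anisotropic(frame, n_rows=1, n_cols=4):
    """Bin with a rectangular n_rows x n_cols superordinate pixel,
    preserving resolution along one axis while accumulating photons
    along the other (illustrative sketch)."""
    rows, cols = len(frame), len(frame[0])
    return [[sum(frame[r + i][c + j] for i in range(n_rows) for j in range(n_cols))
             for c in range(0, cols, n_cols)]
            for r in range(0, rows, n_rows)]

frame = [[1, 1, 1, 1, 2, 2, 2, 2],
         [3, 3, 3, 3, 4, 4, 4, 4]]
binned = bin_anisotropic(frame)  # [[4, 8], [12, 16]]
```

Each superordinate pixel here accumulates four times the photons of a single pixel, yet the row index, and thus the position along the direction of movement, is resolved as finely as without binning.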
  • The described binning can also be activated (automatically) in an adaptive manner in response to at least one previously recorded (and evaluated) scene characteristic. This means that the "binning" is not only controlled by the light receiver control device but is regulated as a function of the results obtained through a scene evaluation. This enables a particularly reliable operation with a low intensity of the illuminating light, so that the sensor system can be operated in an energy-efficient manner.
  • The illumination device can have, among other things, an at least approximately point-shaped illumination light source for spatially scanning the scene with an illumination light beam, a plurality of illumination light sources which are in particular individually controllable and each assigned to a specific solid angle range of the scene, and/or a flat illumination light source, in particular with a light intensity that is not homogeneous over its area.
  • A laser beam that scans the scene can be directed in a known manner, via two rotatable mirrors with axes of rotation that are not parallel to one another and preferably oriented perpendicular to one another, to the point of the scene to be illuminated.
  • Alternatively, non-mechanical optical elements such as diffractive optical elements can be used for this purpose. The deflection can in particular be controlled by the above-described illumination light control device.
  • The at least approximately point-shaped illumination light source can be a (sufficiently strong) semiconductor diode, for example a laser diode or a light-emitting diode. For beam shaping, in particular lens systems can be used.
  • Suitable optical elements for beam deflection, beam splitting and/or beam merging can be used. DOEs can also be used to advantage.
  • The plurality of illumination light sources, which are likewise in particular lasers or light-emitting diodes, can be controlled (in particular individually) by the illumination light control device described above. This advantageously allows an adaptively controlled or even regulated setting of the characteristics of the illuminating light.
  • A flat light source can also be the source of an intensity distribution that is not homogeneous as a function of the solid angle. If the source itself is a spatially homogeneously illuminated surface, suitable optical elements for beam deflection, beam splitting, beam merging and/or beam shaping can be used in order to achieve the described non-uniform illumination of the scene as a function of the solid angle.
  • the data processing and control device is further configured in such a way that a coverage characteristic of the passage can be controlled by at least one closure body.
  • a coverage characteristic of the passage can be controlled by at least one closure body.
  • The sensor system can be coupled to the controller of a known control system for a closure body.
  • A method for the three-dimensional detection of a scene in a predetermined spatial area which is assigned to a closure body for a passage is described. The described method comprises (a) illuminating the scene with illuminating light by means of a lighting device; and (b) receiving measurement light, which is illuminating light at least partially backscattered by at least one object contained in the scene, by means of a measuring device.
  • the second operating mode can deliver (second) scene acquisitions which, compared to (first) scene acquisitions, have, for example, a lower temporal resolution, a lower spatial resolution and / or a lower signal-to-noise ratio.
  • The method further comprises (a) detecting an object located in the scene; (b) comparing the detected object with at least one comparison object stored in a database; and (c) if the object matches a comparison object within predetermined permissible deviations, identifying the object as an object permitted for a particular action.
  • The permitted action can be, for example, a permitted passage through an opening in a building, which opening is closed by a closure body before the identification as a permitted object and is only opened after a successful identification by a corresponding movement of the closure body.
  • The objects to be identified can preferably be people and/or vehicles. A successful identification can be used for controlling or activating a locking mechanism for a closure body in front of a passage or opening of a building.
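The identification step described above can be sketched in a few lines. This is a minimal illustration only: the feature-vector representation of an object, the tolerance value and all names are hypothetical assumptions, not taken from the patent.

```python
# Minimal sketch of the identification step: a detected object is compared
# with reference objects stored in a database and counts as permitted if it
# matches one of them within predetermined permissible deviations.
# Feature vectors and the tolerance are simplifying assumptions.

def matches(detected, reference, tolerance):
    """True if every feature deviates from the reference by at most `tolerance`."""
    return all(abs(d - r) <= tolerance for d, r in zip(detected, reference))

def is_permitted(detected, reference_db, tolerance=0.15):
    """Identify the object as permitted if it matches any stored reference object."""
    return any(matches(detected, ref, tolerance) for ref in reference_db.values())

reference_db = {
    "person": (1.75, 0.50),   # e.g. height and width in metres
    "vehicle": (1.50, 1.80),
}
```

On a positive match, the controller would then issue the opening signal for the closure body; on a miss, the doors stay closed.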
  • a use of a sensor system of the type described above for controlling a coverage characteristic of a passage to be passed by an object by at least one closure body is described.
  • the described use is based on the knowledge that an energetically efficient detection and evaluation of an optical scene can be used in an advantageous manner for passages which can be closed by a closure body. This applies in particular to passages which have a closure or a covering characteristic that is controlled or at least also controlled by the described sensor system.
  • With the described sensor system, even larger distances can be monitored in an energetically efficient manner, which naturally leads to an earlier detection of an opening request for the closure body; this can be of great advantage, especially with fast-moving objects. Furthermore, the scene can be captured with a wider capture angle, which, for example, leads to an early detection of people moving transversely to the opening.
  • the opening is an entrance or an exit, in particular an emergency exit in a building.
  • The detection of a present but possibly not moving object in a passage area makes it possible to monitor an entrance or an exit.
  • the object is a person or a vehicle.
  • the building can in particular be a house or a garage.
  • TOF-based sensor systems can generally be divided, both with respect to the illumination light and with respect to the measurement light, into two fundamentally different classes, which can be combined with one another as required.
  • B1: The first alternative (B1) for the lighting is characterized in that the scene is scanned sequentially by means of a single illuminating light beam with high focusing and low divergence, i.e. high collimation. For each position of the illuminating light beam in the scene, a measurement of the transit time of the illuminating light and the measuring light is made.
  • The scanning can be implemented using movable optical components, in particular mirrors. Alternatively or in combination, a solid-state device without mechanically moving parts, having integrated photonic structures or circuits, can be used for the sequential scanning of the scene with the illuminating light beam. With a suitable control of these structures, the illuminating light beam is then directed to the desired location in the scene. Such a solid-state device is known, for example, from US 2015/293224 A1.
  • B2: The second alternative (B2) for the lighting is characterized by the fact that the entire scene is illuminated all at once and over an area. If necessary, the intensity of the illuminating light can be increased selectively in selected sub-areas of the scene in order to enable an improved 3D object detection at these points. Such a spatially uneven distribution of the intensity of the illuminating light can be achieved without moving optical components, for example by means of a diffractive optical element (DOE).
  • M1: A first alternative (M1) for the measurement is based on pulsed illuminating light beams. The "travel time" of a light pulse is determined on the receiver side for each pixel within a time window, and the distance is derived from this.
  • M2: The second alternative (M2) for the measurement is based on a temporal, preferably sinusoidal, modulation of the illuminating light with a known modulation frequency. The phase difference is measured for each pixel and the distance information is derived from it.
  • Both measuring principles M1 and M2 are based on an integration of the number of photons or the photoelectrons generated in the light receiver, which arrive at each pixel to be measured.
  • light or photon noise that is always present depends on the number of photons accumulated in a pixel.
  • the distance information obtained from the TOF measurement is more accurate the higher the number of accumulated photons.
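The distance relations underlying the two measuring principles M1 and M2 can be sketched as follows. This is a generic illustration of the standard time-of-flight formulas; the function names and the numerical values are assumptions, not taken from the patent.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def distance_from_pulse(transit_time_s):
    """M1: pulsed illumination. The pulse travels to the object and back,
    so the measured transit time corresponds to twice the distance."""
    return C * transit_time_s / 2.0

def distance_from_phase(phase_rad, mod_freq_hz):
    """M2: sinusoidally modulated illumination. The round trip shifts the
    phase by delta_phi = 4*pi*f*d/c, hence d = c*delta_phi/(4*pi*f).
    The result is unambiguous only for distances below c/(2*f)."""
    return C * phase_rad / (4.0 * math.pi * mod_freq_hz)
```

For example, a pulse transit time of 20 ns corresponds to roughly 3 m; with a 10 MHz modulation, a phase shift of pi/2 corresponds to roughly 3.75 m.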
  • Figure 1 shows the use of a sensor system for controlling a coverage characteristic of a passage.
  • FIG. 2 illustrates a variation in the size of a spatial region in which a scene is recorded by means of an adaptive optical device.
  • Figures 3a and 3b illustrate a combination of individual pixels for an increased collection of photons.
  • FIG. 1 shows the use of a sensor system 100 for controlling a coverage characteristic of an opening or passage 184 depending on the characteristic of a scene 190 monitored by the sensor system 100.
  • The opening 184 is an opening in an entrance structure 180 of a building. The entrance structure 180 comprises a stationary holding structure 182, which represents a frame and a guide for two closure bodies 186 designed as sliding doors.
  • The sliding doors 186 can each be moved by means of a motor 187 along the displacement directions shown by two thick double arrows.
  • The motors 187 are controlled, as set out below, by means of a data processing and control device 150 of the sensor system 100 described in this document.
  • An operating state of the two sliding doors 186 is recorded by means of one encoder 188 each, which represents a separate unit with respect to the respective motor 187.
  • the encoder can also be integrated in the respective motor 187.
  • The function of the encoder can also be taken over by the data processing and control device 150.
  • the operating state of the sliding doors 186 can be the current position of the sliding doors 186 and / or the current speed at which the sliding doors 186 move from an open position to a closed position or vice versa from the closed position to the open position.
  • The sensor system 100 has a time-of-flight (TOF) measuring device 110, the data processing and control device 150 and a database 160.
  • The TOF measuring device 110 in turn has an illumination device 130 and a light receiver 120. According to the exemplary embodiment shown here, the TOF measuring device 110 has, or is assigned, (i) an illumination light control device 135 for controlling the operation of the illumination device 130, and (ii) a measuring unit 125 connected downstream of the light receiver 120 for measuring the light transit time.
  • the entire sensor system 100 (in contrast to the illustration in FIG. 1) is preferably constructed as a module which, within a compact design, not only has the TOF measuring device 110, but also the data processing and control device 150 and the database 160.
  • According to the exemplary embodiment shown here, the data processing and control device 150 controls the two motors 187. The electric power required for actuating the doors 186 is provided in each case by an output stage or an amplifier which is integrated in the housing of the respective motor 187. The two output stages are each controlled via a signal 152a, which is output at a data output 152 by the data processing and control device 150 and is therefore referred to in this document as output signal 152a.
  • The operating state of the doors 186 has at least a certain influence on the operation of the TOF measuring device 110. Specifically, this operating state co-determines the operation of the lighting device 130 and the operation of the light receiver 120. This influence manifests itself in two operating modes of the TOF measuring device 110: in a first operating mode, the lighting device 130 and/or the light receiver 120 have a higher energy consumption than in a second operating mode.
  • The operating state of the doors 186 is transmitted by one of the two encoders 188 by means of a data signal or a sequence of data signals. From the point of view of the data processing and control device 150, this data signal is also referred to in this document as input signal 151a.
  • The data processing and control device 150 processes this input signal 151a in a suitable manner and controls, inter alia, depending on this input signal 151a, the illumination light control device 135 and the light receiver control device 140 in such a way that an operating mode is established which (also) determines the energy consumption of the illumination device 130 and/or the light receiver 120.
  • the first operating mode with the higher energy consumption is activated when the two doors 186 are moved, in particular from their open position to their closed position.
  • the second operating mode with the lower energy consumption is activated when the two doors 186 are at rest.
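The mode switching just described can be sketched in a few lines. Modelling the encoder signal as a plain door-speed value and the intensity scaling factor are simplifying assumptions; the names are illustrative, not taken from the patent.

```python
# Sketch of the operating-mode selection: the first operating mode (higher
# energy consumption, high accuracy) is active while the doors move, the
# second (lower energy consumption) while they are at rest.

def select_operating_mode(door_speed_m_s):
    """Return 1 while the doors are moving, otherwise 2."""
    return 1 if abs(door_speed_m_s) > 0.0 else 2

def illumination_intensity(mode, full_intensity=1.0, reduced_factor=0.25):
    """Reduced illumination intensity in the energy-saving second mode."""
    return full_intensity if mode == 1 else full_intensity * reduced_factor
```

A real controller would derive the speed from the encoder data signal (input signal 151a) and feed the resulting mode into the illumination light control device 135 and the light receiver control device 140.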
  • the scene 190 is captured in a spatial area which comprises a first partial area 191 and a second partial area 192.
  • the first sub-area 191 is a so-called hazard area. If there is an object in this hazard area 191, there is basically the risk that it will be trapped and possibly injured when the doors 186 are closed.
  • The sensor system 100 is therefore configured in such a way as to basically detect this hazard area 191 with a very high degree of accuracy. For this purpose, in a first operating mode, a correspondingly bright illumination of the hazard area 191 as well as a detection with a high degree of accuracy are required. In this context, it is obvious that this requires a relatively large amount of energy. However, if the doors 186 do not move, then there is no risk of an object being trapped or injured. It is therefore completely sufficient in a second operating mode if the hazard area 191 is illuminated by the lighting device 130 with a lower intensity and/or if the light receiver 120 detects the hazard area 191 with a reduced accuracy.
  • the second partial area 192 is an area in which objects are detected so that the doors 186 can be moved in a suitable manner.
  • the second sub-area 192 is therefore referred to as the activation area 192 in this document.
  • The activation area 192 is generally not safety-relevant (with regard to a collision of a door 186 with an object). Therefore, in the embodiment described here, the activation area 192 is, regardless of the operating state of the doors 186, only ever monitored in such a way that the sensor system 100 is able to carry out an object detection, even at the expense of detection accuracy.
  • For this object detection, the data processing and control device 150 has access to a data record of reference objects stored in the database 160, which correspond to selected objects that are authorized to pass through the passage or opening 184. This means that, when the object 195 appropriately approaches the entrance 184, the sliding doors 186 are only opened when the detected object 195 at least approximately matches one of the stored reference objects.
  • FIG. 2 illustrates an embodiment in which the size of the spatial area detected by the light receiver 120 depends on the operating state of the closure body. According to the embodiment shown here, this is implemented by means of an adaptive optical device which can assume two spatial configurations 221a and 221b.
  • the adaptive optical device is shown schematically by means of a deformable optical lens.
  • In a first operating mode, a comparatively small sub-area 291a of the scene is recorded with a particularly high level of accuracy (high spatial and/or temporal resolution, large signal-to-noise ratio, etc.). For this purpose, the adaptive optical device assumes the first configuration 221a (with a small focal length).
  • In a second operating mode, a comparatively large sub-area 291b of the scene is recorded with reduced accuracy. For this purpose, the adaptive optical device assumes the second configuration 221b (with a large focal length).
  • the adaptive optical device can be realized with any desired element with which imaging properties can be changed dynamically.
  • the adaptive optical device can have a plurality of lenses with different focal lengths that can be displaced along an axis or one objective with a variable focal length.
  • the use of at least one DOE is also possible in order to enable a corresponding variation of the optical imaging of the sub-area 291a / 291b on a light-sensitive chip of the light receiver 120.
  • The light quantity emitted by the illumination device is of course best used when the area of the scene which is detected by the light receiver 120 is exactly the same area as that illuminated by the illumination device.
  • FIGS. 3a and 3b illustrate a combination of individual pixels of a light receiver 320a or 320b designed as a semiconductor or CCD chip.
  • The light receiver 320a has a large number of light-sensitive or photon-collecting pixels 322a. According to the exemplary embodiment shown here, the pixels 322a are assigned to a full spatial resolution of the light receiver 320a, which resolution is predetermined by the semiconductor architecture of the chip 320a.
  • In the light receiver 320b, four of the light-sensitive pixels (for a full resolution) are combined into one higher-level pixel 322b (for an increased collection of photons). A pixel 322b therefore collects four times the amount of light compared to a single pixel 322a. Such a combination ("binning") reduces the (minimum) intensity of the recorded measurement light which is required to evaluate the corresponding image area of the scene. Since the intensity of the measuring light depends directly on the intensity of the illuminating light, the "binning" makes it possible to reduce the intensity of the illuminating light and thus the energy consumption of the sensor system.
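The 2x2 pixel combination (binning) of FIGS. 3a and 3b can be sketched as follows. Representing the sensor frame as plain nested lists of photon counts, rather than a real sensor interface, is an illustrative simplification.

```python
# Sketch of 2x2 "binning": four neighbouring photon counts are accumulated
# into one higher-level pixel, so each super-pixel collects four times the
# light of a single pixel at half the spatial resolution in each direction.

def bin_2x2(frame):
    """Sum each 2x2 block of photon counts into one super-pixel."""
    height, width = len(frame), len(frame[0])
    return [
        [frame[y][x] + frame[y][x + 1] + frame[y + 1][x] + frame[y + 1][x + 1]
         for x in range(0, width, 2)]
        for y in range(0, height, 2)
    ]
```

A 4x4 frame thus becomes a 2x2 frame whose entries each hold the accumulated photons of one four-pixel block, which lowers the measurement-light intensity needed per evaluated image area.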
  • According to the exemplary embodiment shown here, the two operating modes, which depend on the operating state of the closure body, are realized by a corresponding control of one and the same light receiver 320a or 320b. The light receiver is operated either in a first operating mode (with full resolution) or in a second operating mode (with combined photon-collecting pixels).

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a sensor system (100) and to a method for the three-dimensional detection of a scene (190) in a predetermined spatial area which is assigned to a closure body (186) for a passage (184). The sensor system (100) comprises: (a) a lighting device (130) for illuminating the scene (190) with illuminating light (131); (b) a measuring device (110) for receiving measurement light (196), which is illuminating light at least partially backscattered by at least one object (195) located in the scene (190), and for measuring distances between the sensor system (100) and the at least one object (195) on the basis of a transit time of the illuminating light (131) and the measurement light (196); (c) a data processing and control device (150) connected downstream of the measuring device (110) for determining a three-dimensional characteristic of the scene (190) on the basis of the measured distances; and (d) a data input (151) for receiving an input signal (151a) which indicates a current operating state of the closure body (186). The lighting device (130) and/or the measuring device (110) can be operated in a first operating mode with a first energy consumption and in a second operating mode with a second energy consumption which is lower than the first energy consumption. The invention further relates to a use of such a sensor system (100).
PCT/EP2020/061901 2019-05-10 2020-04-29 Système de détection 3d apte à fonctionner dans différents modes de fonctionnement en fonction d'un état de fonctionnement d'un corps de fermeture WO2020229186A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102019112338.0 2019-05-10
DE102019112338.0A DE102019112338A1 (de) 2019-05-10 2019-05-10 3D Sensorsystem, betreibbar in verschiedenen Betriebsmodi in Abhängigkeit eines Betriebszustandes eines Verschließkörpers

Publications (1)

Publication Number Publication Date
WO2020229186A1 true WO2020229186A1 (fr) 2020-11-19

Family

ID=70682817

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2020/061901 WO2020229186A1 (fr) 2019-05-10 2020-04-29 Système de détection 3d apte à fonctionner dans différents modes de fonctionnement en fonction d'un état de fonctionnement d'un corps de fermeture

Country Status (2)

Country Link
DE (1) DE102019112338A1 (fr)
WO (1) WO2020229186A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021108349A1 (de) 2021-04-01 2022-10-06 OSRAM Opto Semiconductors Gesellschaft mit beschränkter Haftung Licht emittierende vorrichtung und lidar-system
DE102021115280A1 (de) 2021-06-14 2022-12-15 Agtatec Ag Automatische Türanordnung mit Sensorvorrichtung und Verfahren zum Betreiben einer solchen automatischen Türanordnung
DE102022123564A1 (de) 2022-09-15 2024-03-21 Gu Automatic Gmbh "Automatiktür sowie Verfahren zum Betrieb einer Automatiktür"

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2418517A2 (fr) * 2010-08-09 2012-02-15 Dorma GmbH&Co. Kg Capteur optoélectronique
EP2453252B1 (fr) 2010-11-15 2015-06-10 Cedes AG Capteur 3D à économie d'énergie
US20150293224A1 (en) 2013-05-09 2015-10-15 Quanergy Systems, Inc. Solid state optical phased array lidar and method of using same
US20180038991A1 (en) * 2016-08-05 2018-02-08 Blackberry Limited Determining a load status of a platform
DE102016119343A1 (de) * 2016-10-11 2018-04-12 Bircher Reglomat Ag Objektüberwachung mit Infrarotbildaufnahme und Infrarotpulsbeleuchtung
WO2018064745A1 (fr) 2016-10-03 2018-04-12 Sensotech Inc. Système de détection basé sur le temps de vol (tof) pour une porte automatique

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5661799B2 (ja) * 2010-12-03 2015-01-28 ナブテスコ株式会社 自動ドア用センサ
US20150260830A1 (en) * 2013-07-12 2015-09-17 Princeton Optronics Inc. 2-D Planar VCSEL Source for 3-D Imaging


Also Published As

Publication number Publication date
DE102019112338A1 (de) 2020-11-12

Similar Documents

Publication Publication Date Title
WO2020229186A1 (fr) Système de détection 3d apte à fonctionner dans différents modes de fonctionnement en fonction d'un état de fonctionnement d'un corps de fermeture
EP0344404B1 (fr) Procédé et dispositif pour commander la position d'une porte automatique
DE2425466C2 (de) Einrichtung zur Überwachung von Räumen durch optisch-elektronische Meßmittel
EP1345444B1 (fr) Systeme de sureillance video en trois dimensions avec une source de rayonnements infrarouges
DE19522760C2 (de) Automatische Tür und Verfahren zum Betrieb einer automatischen Tür
EP2453252B1 (fr) Capteur 3D à économie d'énergie
WO2004084556A1 (fr) Surveillance au niveau d'un ascenseur au moyen d'un capteur 3d
EP1159636A1 (fr) Systeme de mesure de distance a resolution locale
EP2946226B1 (fr) Ensemble de détection universel conçu pour détecter des gestes de commande dans des véhicules
DE102004047022A1 (de) Vorrichtung zur Überwachung von Raumbereichen
WO2020002332A1 (fr) Dispositif de surveillance et procédé de surveillance d'une zone de porte d'un système de porte de véhicule et porte dotée d'un dispositif de surveillance
DE102005011116B4 (de) Vorrichtung zur Ansteuerung und/oder Überwachung eines Flügels
EP3311190A1 (fr) Système de capteur d'un dispositif de détection d'un véhicule automobile
DE10055689B4 (de) Verfahren zum Betrieb eines optischen Triangulationslichtgitters
EP3724676A1 (fr) Système de capteur pour la capture tridimensionnelle d'une scène au moyen de plusieurs accumulations de photons
WO2020229190A1 (fr) Identification d'un objet sur la base d'une reconnaissance d'une partie de l'objet et de données de description d'un objet de référence
WO2020229189A1 (fr) Système de détection tof doté d'un dispositif d'éclairage comprenant un réseau de sources lumineuses individuelles
DE102006010990B4 (de) Sicherheitssystem
EP3724601B1 (fr) Détermination de la distance basée sur différentes profondeurs de champ avec différents réglages de mise au point d'une lentille d'objectif
WO2019115558A1 (fr) Système de capteur 3d avec optique de forme libre
DE102017129654A1 (de) 3D Szenenerfassung basierend auf Lichtlaufzeiten und einem charakteristischen Merkmal von erfassten elektromagnetischen Wellen
DE102014205282A1 (de) Vorrichtung zum Öffnen oder Schließen einer Öffnung eines Fahrzeugs
WO2019115559A1 (fr) Système de détection 3d muni d'un éclairage de scène fonction d'un angle solide
DE102013018800A1 (de) Verfahren und Vorrichtung zum optischen Bestimmen von Abständen zu Objekten in einem Überwachungsbereich, insbesondere in einem Überwachungsbereich von automatischen Türen
DE102013013778B4 (de) Tor mit einer Detektionseinrichtung sowie ein Verfahren zur Anwendung der Detektionseinrichtung bei einem Tor

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20725637

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20725637

Country of ref document: EP

Kind code of ref document: A1