EP4265877A1 - Method for checking the accuracy of the data of a sensor unit of a door system

Method for checking the accuracy of the data of a sensor unit of a door system

Info

Publication number
EP4265877A1
EP4265877A1
Authority
EP
European Patent Office
Prior art keywords
sensor
door system
unit
sensor unit
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22168690.0A
Other languages
German (de)
English (en)
Inventor
Dennis Meiering
Sven Busch
Patrick Winkelmann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dormakaba Deutschland GmbH
Original Assignee
Dormakaba Deutschland GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dormakaba Deutschland GmbH
Priority to EP22168690.0A
Publication of EP4265877A1
Legal status: Pending

Classifications

    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05FDEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05F15/00Power-operated mechanisms for wings
    • E05F15/70Power-operated mechanisms for wings with automatic actuation
    • E05F15/73Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05FDEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05F15/00Power-operated mechanisms for wings
    • E05F15/70Power-operated mechanisms for wings with automatic actuation
    • E05F15/73Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects
    • E05F2015/767Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects using cameras
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05YINDEXING SCHEME ASSOCIATED WITH SUBCLASSES E05D AND E05F, RELATING TO CONSTRUCTION ELEMENTS, ELECTRIC CONTROL, POWER SUPPLY, POWER SIGNAL OR TRANSMISSION, USER INTERFACES, MOUNTING OR COUPLING, DETAILS, ACCESSORIES, AUXILIARY OPERATIONS NOT OTHERWISE PROVIDED FOR, APPLICATION THEREOF
    • E05Y2400/00Electronic control; Electrical power; Power supply; Power or signal transmission; User interfaces
    • E05Y2400/10Electronic control
    • E05Y2400/50Fault detection
    • E05Y2400/508Fault detection of detection
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05YINDEXING SCHEME ASSOCIATED WITH SUBCLASSES E05D AND E05F, RELATING TO CONSTRUCTION ELEMENTS, ELECTRIC CONTROL, POWER SUPPLY, POWER SIGNAL OR TRANSMISSION, USER INTERFACES, MOUNTING OR COUPLING, DETAILS, ACCESSORIES, AUXILIARY OPERATIONS NOT OTHERWISE PROVIDED FOR, APPLICATION THEREOF
    • E05Y2900/00Application of doors, windows, wings or fittings thereof
    • E05Y2900/10Application of doors, windows, wings or fittings thereof for buildings or parts thereof
    • E05Y2900/13Type of wing
    • E05Y2900/132Doors

Definitions

  • the present invention relates to a method for operating a sensor unit of a door system with at least one movable wing element, in particular a sliding door system, wherein the wing element is moved based on sensor data from the sensor unit when an object is detected in a sensor detection area of the sensor unit.
  • the invention further relates to a door system with a control unit for carrying out a method.
  • Known from the prior art are door drives which are connected via a control unit to sensors designed to detect objects, particularly in the form of people.
  • For example, DE 203 20 497 U1 shows a door system with a door drive and a sensor unit, the sensor unit serving as a presence sensor with which the presence of people in a detection zone can be detected.
  • When a person is detected, the door drive opens the leaf element of the door system.
  • Radar sensors, for example, can be used as the sensor unit.
  • Furthermore, a method for operating a sensor unit of a door system with at least one movable wing element is known, the door system being designed as a sliding door system.
  • the wing element can be moved based on sensor data from the sensor unit when an object is detected in a sensor detection area of the sensor unit.
  • the sensor unit includes a so-called TOF (Time of Flight) camera with which 3D data can be recorded within the sensor detection area, for example to detect the position of the object within the sensor detection area.
  • According to the invention, the method for operating a sensor unit of a door system provides that the sensor unit is supplied with a test event, the authenticity of the sensor data being checked by subsequently analyzing the sensor data for the test event.
  • the core idea of the invention is to check the accuracy of the sensor data that is generated with the sensor unit, the test event being output, for example, with a control unit, in particular with a control unit for controlling the door system and therefore the leaf elements.
  • the control unit referred to repeatedly below can, in each context mentioned, also form or contain a regulating (closed-loop control) unit.
  • the control unit enables a closed control loop with which the door drive, and thus the leaf element, can be continuously and dynamically regulated in its position between a closed position and an open position as a function of the movement and/or the contour of the object, especially a person.
  • the control unit is therefore aware of the test event and subsequently checks whether the test event can also be found or reflected in the sensor data of the sensor unit, so that the authenticity of the sensor data is evaluated positively only if the output test event is also recognized by the sensor unit.
  • If the authenticity is confirmed, the door system can continue to be operated; if the authenticity of the sensor data is not recognized, even only temporarily, the movement of the wing element can, for example, be stopped or the wing element moved into the open position, or future movements of the wing element can be omitted entirely and the door system switched off or transferred to a safe state and/or a message issued.
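  • As a rough illustration of this authenticity check, the following minimal Python sketch shows one possible flow under assumed interfaces; the trigger_test_event, read and enter_safe_state calls and the data layout are hypothetical and not taken from the patent.

```python
# Minimal sketch of the authenticity check described above.
# All sensor/control-unit method names and the data layout are assumptions.
import random
import time

def issue_test_event(sensor_unit):
    """Command the sensor unit to produce a test event and return its expected signature."""
    signature = {"kind": "shutter", "duration_s": random.uniform(0.05, 0.2)}
    sensor_unit.trigger_test_event(signature)       # hypothetical sensor-unit API
    return signature

def sensor_data_contains(sensor_data, signature, tolerance_s=0.05):
    """Check whether the recorded sensor data reflects the issued test event."""
    observed = sensor_data.get("test_event")        # hypothetical data layout
    if observed is None or observed["kind"] != signature["kind"]:
        return False
    return abs(observed["duration_s"] - signature["duration_s"]) <= tolerance_s

def verify_authenticity(control_unit, sensor_unit):
    signature = issue_test_event(sensor_unit)
    time.sleep(0.5)                                 # allow the event to appear in the data stream
    sensor_data = sensor_unit.read()                # hypothetical sensor-unit API
    if sensor_data_contains(sensor_data, signature):
        return True                                 # sensor data judged authentic, keep operating
    control_unit.enter_safe_state()                 # e.g. stop the wing or move it to the open position
    return False
```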
  • the test event is preferably fed to the sensor unit repeatedly after regular, random or characteristic time periods.
  • the test event can be fed to the sensor unit regularly, i.e. after fixed periods of time, or preferably irregularly. This means that the test event can be repeated periodically, so that the accuracy of the sensor data is ensured throughout the entire service life of the door system. It is also conceivable that the test event is repeated randomly, after a maximum time or according to a characteristic. It is even better to trigger the test event irregularly so that it does not form a regular, predictable pattern.
  • the test event must be carried out at least once within a specified time, which has been agreed, for example, with a testing authority.
  • the exact time is preferably random so that no historical data can represent or predict the event.
  • An alternative to random timing is a characteristic in the test event, e.g. the length or speed at which the event occurs.
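  • A compact sketch of such a schedule is given below; the maximum interval and the range of random delays are purely illustrative assumptions, not values from the patent.

```python
# Sketch of an irregular test-event schedule with an agreed upper bound.
# MAX_INTERVAL_S is an assumed value (e.g. an interval agreed with a testing authority).
import random

MAX_INTERVAL_S = 3600.0   # at least one test event per hour (illustrative)

def next_test_event_delay():
    """Random delay so the schedule cannot be predicted from historical data,
    while never exceeding the agreed maximum interval."""
    return random.uniform(0.1 * MAX_INTERVAL_S, MAX_INTERVAL_S)
```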
  • the door system preferably has a control unit, with the test event being fed to the sensor unit by means of the control unit.
  • the control unit is preferably designed in such a way that the sensor data of the sensor unit are received with the control unit, so that the authenticity of the sensor data is also checked in or with the control unit or at least in connection with the control unit.
  • the door system particularly advantageously has the control unit with which the wing elements are tracked in direct dependence on the movement of the object and/or the contour of the object.
  • the tracking takes place continuously and dynamically.
  • the test event is supplied to the sensor unit spontaneously, i.e. without the test event being communicated to the sensor unit in advance.
  • This is the only way to ensure that the sensor signal returned by the sensor unit is authentic, i.e. that, for example, no merely periodically repeated reproduction of an image of the sensor detection area is transferred to the control unit.
  • the test event can vary from event to event, with the variation being generated by the control unit and in this respect also being known to the control unit. Consequently, the sensor data received by the control unit must also reflect the corresponding variation of the test event.
  • the control unit preferably analyzes the sensor data of the sensor unit for the detection of the test event by the sensor unit.
  • At least one image evaluation unit is set up in conjunction with the at least one sensor unit, with the control unit evaluating the correctness of the test event output as sensor data by the image evaluation unit.
  • In particular, the image evaluation unit can also be part of the control of the door system, and the image evaluation unit can also be part of the sensor unit.
  • The control can also have a regulating unit, which sets up and operates a continuous control loop between the behavior and contour of the object and the movement of the wing element.
  • the generated test event can have a non-random characteristic in the sensor data, so that the generated test event can be reliably recognized as such and can be reliably distinguished from a malfunction caused by, for example, a system error.
  • Such features are, for example: the duration of the generated test event, which is characteristic; or, if a diaphragm closes off the field of vision from one side as a test event, the movement of the diaphragm through the sensor field, which is traceable and characteristic, or the diaphragm leaving a small part of the sensor image field open, so that this partial coverage is characteristic; and/or, if an additional light source interferes with the sensor image, the light source can have a specific color, brightness, duration, flashing frequency, etc., which is characteristic.
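  • Purely as an illustration, a comparison of such commanded and observed characteristics could look like the following sketch; the keys and the relative tolerance are assumptions.

```python
# Illustrative check that the observed event carries the commanded, non-random
# characteristics (duration, partial coverage, flashing frequency).
def matches_characteristics(commanded, observed, rel_tol=0.1):
    """Return True only if every commanded characteristic is present and close enough."""
    for key in ("duration_s", "covered_fraction", "flash_hz"):
        if key in commanded:
            if key not in observed:
                return False
            if abs(observed[key] - commanded[key]) > rel_tol * abs(commanded[key]):
                return False
    return True
```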
  • the sensor unit is designed, for example, as a 3D camera, in particular a stereo camera or a LIDAR system, and the image evaluation unit is connected downstream of the 3D camera and can deliver the sensor data relating to the object data to the control unit.
  • the image evaluation unit can also be part of the control unit, whereby the data output by the image evaluation unit must include the test event that is evaluated by the control unit.
  • the sensor unit has at least one sensor camera, in particular where the test event is generated as a time-limited change in the image that is recorded by the sensor camera within the sensor detection area.
  • the sensor camera or the sensor unit is pivoted and/or displaced for a limited time in order to generate the test event, in particular a temporary change in the image that is recorded by the sensor camera.
  • The sensor camera can be pivoted within the sensor unit, just as the entire sensor unit can be pivoted in its arrangement on the door system. It is also conceivable, when two sensor cameras form a stereo camera of the sensor unit, that the orientation of the sensor cameras relative to one another is changed briefly, so that a test event can also be generated in this way. By pivoting the sensor camera or the sensor unit, not only can the timeliness of the camera image provided be verified, but the correctness of the distance measurement itself can also be checked.
  • The swivel angle can be random and can vary, so that it cannot be predicted, and it is transmitted to the sensor unit by a transmission unit.
  • The swivel angle of the sensor unit can be calculated by the control unit from the sensor image data and compared with the predetermined swivel angle from the transmitted data, so that a chance cause of the test event can be excluded and the correct operation of the sensor unit can be proven; in particular, correct functioning can be demonstrated across the entire sensor field.
  • If the sensor were not working correctly, the panning event would not be reproduced correctly in all pixels. Since the test event caused by the pivoting produces a characteristic change in virtually all image points, this process can be used to analyze the correct functioning of the sensor over the entire field of view. This makes the method more comprehensive than, for example, using a shutter, which only provides a "live signal" but does not allow any statement about correct functioning, i.e. the accuracy of the measured values cannot be verified with it. This is advantageous because the swivel method can be used to meet the requirement for proof of a "safe" electronic condition. In this respect, sensors can also be used whose safe electronic properties cannot be proven on their own because of the operating system used (e.g. LINUX) or because of unknown electronic components.
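  • The following sketch shows one way the swivel angle could be estimated from the image shift between the frames before and after the pivot and compared with the commanded angle; the small-angle approximation, the assumed horizontal field of view and the tolerance are illustrative assumptions.

```python
# Sketch: estimate the pan angle from the column shift between two frames and
# compare it with the commanded swivel angle (hypothetical parameter values).
import numpy as np

def estimate_pan_angle_deg(img_before, img_after, hfov_deg=90.0):
    """Estimate the horizontal pan angle from the shift that best aligns the two frames."""
    cols_before = img_before.mean(axis=0)   # collapse rows -> 1D brightness profile
    cols_after = img_after.mean(axis=0)
    corr = np.correlate(cols_after - cols_after.mean(),
                        cols_before - cols_before.mean(), mode="full")
    shift_px = corr.argmax() - (len(cols_before) - 1)
    return shift_px * hfov_deg / img_before.shape[1]

def swivel_check(img_before, img_after, commanded_deg, tol_deg=1.0):
    """True if the angle recovered from the images matches the commanded angle."""
    return abs(estimate_pan_angle_deg(img_before, img_after) - commanded_deg) <= tol_deg
```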
  • a reference distance measuring means is set up, with the reference distance measuring means being used to carry out a reference distance measurement of an object within the sensor detection range, the value of the reference distance measurement being compared with a measured camera distance value.
  • the idea of using a reference distance measuring device is based on the fact that the at least one object moves within the sensor detection range, so that the current position can be compared, which is detected by the sensor unit on the one hand and by the reference distance measuring device on the other hand.
  • An object does not have to be measured; a point on the ground can also be measured as a reference, for example, as long as this point can be assigned to the corresponding point measured by the sensor. This allows a distance comparison to be carried out continuously. Only if the distance information from both measuring systems essentially agrees can the authenticity of the sensor data be recognized; if the distance measurement values clearly do not match, an at least temporary disruption of the accuracy of the sensor data of the sensor unit must be assumed.
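  • In its simplest form, this continuous comparison reduces to a tolerance check like the sketch below; the tolerance value is an assumption for illustration.

```python
# Sketch of the continuous distance comparison between the reference distance
# measuring means and the camera-derived distance (tolerance is illustrative).
def distances_agree(camera_distance_m, reference_distance_m, tol_m=0.05):
    """Authenticity criterion: both measuring systems must essentially agree."""
    return abs(camera_distance_m - reference_distance_m) <= tol_m

def sensor_data_authentic(paired_measurements, tol_m=0.05):
    """paired_measurements: iterable of (camera_distance_m, reference_distance_m) pairs."""
    return all(distances_agree(c, r, tol_m) for c, r in paired_measurements)
```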
  • A mechanical shutter can be set up, the mechanical shutter being briefly moved in front of the sensor camera, whereby the image capture of the sensor camera is interrupted and/or an, in particular temporary, change in the image is generated, and the interruption and/or change is evaluated by the control unit.
  • the shutter can also be provided as a mechanical shutter within the housing of the sensor unit, whereby it may be sufficient to dim only one sensor camera with the shutter if two sensor cameras are set up to form a stereo camera of the sensor unit.
  • The interruption relates only to the recording of the spatial image, so that while the shutter is temporarily in front of the camera, for example, a black or contentless image is transmitted at the same time.
  • As a further test event, it can be implemented that the exposure time of the sensor camera is temporarily minimized and/or that an interference light source is provided and temporarily switched on, whereby the image quality of the sensor camera is deteriorated and the deterioration is evaluated by the control unit.
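  • One way the control unit could confirm such a shutter or minimum-exposure event is sketched below; the brightness threshold is an assumed value and the frame representation is hypothetical.

```python
# Sketch: during the announced test window, at least one frame should be essentially
# black (shutter) or strongly darkened (minimized exposure time).
import numpy as np

def frame_is_blanked(frame, max_mean_brightness=10):
    """True if the frame is dark enough to count as a shutter / minimum-exposure frame."""
    return np.asarray(frame, dtype=float).mean() <= max_mean_brightness

def shutter_event_detected(frames_in_window):
    """frames_in_window: frames captured during the expected test-event window."""
    return any(frame_is_blanked(f) for f in frames_in_window)
```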
  • the camera image is generated with a time stamp using the stereo camera of the sensor unit, the time stamp being evaluated by the control unit.
  • The timestamp can include a time and a date; it is also conceivable that the timestamp includes generated graphics or images that are reproduced accordingly in the camera image, must be mapped in the sensor data, and are evaluated by the control unit.
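  • A trivial freshness check on such a timestamp might look like the sketch below; the permitted age is an assumed value.

```python
# Sketch: the time stamp embedded in the camera image must still be current
# when the control unit evaluates it (max_age_s is illustrative).
import time

def timestamp_is_current(image_timestamp_s, max_age_s=0.5):
    """image_timestamp_s: UNIX time encoded in the camera image."""
    age = time.time() - image_timestamp_s
    return 0.0 <= age <= max_age_s
```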
  • the invention is further directed to a door system with a control unit for carrying out the method described above.
  • the control unit can in particular be set up to continuously and dynamically regulate the position of a wing element based on the sensor data depending on the movement and/or the contour of at least one object within the sensor detection area.
  • the closing edge protection of the at least one wing element is carried out with the at least one sensor unit and with the associated sensor detection area by means of the evaluation of the sensor data of the sensor unit via the control unit.
  • the respective sensor detection areas can, for example, overlap in the plane of movement of the wing elements of the door system or only form a small detection gap. This ensures that an object that passes through the door system is tracked throughout its entire movement from the time it approaches the door system until it leaves the door system.
  • the camera image from the sensor cameras can be examined for the presence of objects, with two at least partially overlapping sensor detection areas being able to provide redundant protection, particularly in the movement plane of the wing elements. This redundant protection can of course be superimposed on the checking of the authenticity of the sensor data using the test event according to the invention.
  • the sensor unit preferably has at least one, in particular a single camera, preferably two cameras and/or the sensor unit has at least one light source with which a light grid can be projected into the sensor detection area and/or the sensor unit has a LIDAR sensor.
  • the sensor unit preferably has at least one and preferably two cameras and/or the sensor unit has at least one light source with which a light grid can be projected into the sensor detection area and/or the sensor unit has a LIDAR sensor.
  • Sensor units are preferably or exclusively used which are suitable for measuring distances between the sensor and surfaces, the surfaces being formed by the objects to be detected and/or by objects in the environment such as floors, doors, frames and walls.
  • Such sensor units determine the distance between the sensor and the surface either by the triangulation method and/or by measuring the transit time of radiation from a transmission source belonging to the sensor unit.
  • In the triangulation method, the different directional angles to a defined surface point are determined from at least two spaced reference points, which consist of two or more wave-sensitive sensors, e.g. line sensors, single-point sensors or cameras. This is preferably a stereo camera.
  • the triangulation method uses a wave-sensitive sensor, in particular a camera, and a point-shaped reference light source, for example a point grid light source.
  • When measuring the transit time of radiation, one or more transmission sources belonging to the sensor unit are used, which generate radiation in the form of electromagnetic waves, in particular light, radar, radio, X-rays or microwaves, and/or sound waves, emit it and project it onto the surfaces of the objects to be detected.
  • A receiving system in the sensor unit which is sensitive to the respective type of radiation collects the rays reflected from the surfaces.
  • The transit time that the radiation requires from emission until reception in the receiving system is determined directly in the form of a time measurement and/or indirectly, in particular by measuring interference, phase shifts and/or frequency shifts relative to the emitted radiation.
  • Examples are FMCW (frequency-modulated continuous wave) radar systems and LiDAR (light detection and ranging) systems, e.g. with a laser array measuring one point after the other, or combinations thereof.
  • a method can be used which combines both the triangulation method and the measurement of the transit time of radiation from a transmission source belonging to the sensor unit.
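  • As a worked example of the two distance principles named above, the sketch below combines the standard round-trip time-of-flight relation and classic stereo triangulation; the numerical example is illustrative only.

```python
# Time-of-flight: distance = c * transit_time / 2 (round trip).
# Stereo triangulation: distance Z = focal_length * baseline / disparity.
C = 299_792_458.0   # speed of light in m/s

def tof_distance_m(transit_time_s):
    """Distance to the reflecting surface from the measured round-trip transit time."""
    return C * transit_time_s / 2.0

def stereo_distance_m(focal_length_px, baseline_m, disparity_px):
    """Distance from the disparity between two spaced cameras of a stereo pair."""
    return focal_length_px * baseline_m / disparity_px

# Example: a round trip of 10 ns corresponds to roughly 1.5 m.
assert abs(tof_distance_m(10e-9) - 1.499) < 0.01
```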
  • From several individual distance measurement points, a distance image can be provided which covers the, in particular complete, sensor detection area.
  • A LIDAR sensor is particularly preferably used in conjunction with a method based on distance measurement. This combination in particular, but also the others mentioned, represents a particularly efficient option, especially with regard to safety and/or complexity.
  • FIG. 1 shows a schematic view of a door system 100 with the essential components of the invention.
  • the door system 100 has two wing elements 11, which can be moved independently of one another between an open position and a closed position with associated door drives 21, the illustration showing the wing elements 11 in a position shortly before the closed position.
  • the door system 100 is designed as a sliding door system, so that the wing elements can be moved in their plane of extension.
  • Three objects 14 in the form of people are shown by way of example; one person, indicated by an arrow, can move into a sensor detection area 13 of the sensor unit 10 shown.
  • the side of the door system 100 on which the objects 14 are shown forms the approach side, and the side facing away from it forms the exit side of the door system 100.
  • a sensor unit 10 is shown both on the approach side and on the exit side of the door system 100, which spans an assigned sensor detection area 13 within which the objects 14 can be recognized.
  • The sensor units 10 are formed, for example, by 3D cameras, which in turn each have two cameras to form a stereo camera; furthermore, the sensor units 10 can have a light source 22 with which a light grid 23 can be generated within the sensor detection area 13.
  • Also shown is an image evaluation unit 17, the sensor units 10 being connected to the image evaluation unit 17 and sensor data 12 being transmitted from the sensor units 10 to the image evaluation unit 17.
  • Also shown is a control unit 16, which is connected to the image evaluation unit 17 for receiving sensor data 12 prepared or enriched according to the example; the control unit 16 is additionally connected to the sensor units 10, so that a test event 20 can be issued to the sensor units 10 via this return connection.
  • the control unit 16 first sends the symbolically represented test event 20 to the sensor units 10. These generate a camera image which is transmitted to the image evaluation unit 17.
  • The captured camera image must include the test event 20, again symbolically represented, essentially in real time; this image is then processed by the image evaluation unit 17 and transmitted to the control unit 16. The control unit 16 can consequently detect whether the camera image actually obtained corresponds to the real image recorded within the sensor detection area 13.
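  • The data path of Figure 1 can be summarized in the following sketch; all class and method names are hypothetical placeholders for the units described above.

```python
# Sketch of the Figure 1 data path: control unit 16 -> sensor unit 10 ->
# image evaluation unit 17 -> control unit 16 (authenticity decision).
def check_real_time_path(control_unit, sensor_unit, image_evaluation_unit):
    event = control_unit.issue_test_event(sensor_unit)       # test event 20
    camera_image = sensor_unit.capture()                      # image including the event
    evaluated = image_evaluation_unit.process(camera_image)   # prepared/enriched sensor data 12
    return control_unit.event_reflected(evaluated, event)     # authenticity decision
```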
  • Figures 2a to 2c show how the movement of the wing elements 11 can take place in response to the movement of the object 14.
  • Figure 2a represents the object 14 within the sensor detection area 13, so that the object 14 is detected by the sensor unit 10 on the approach side of the door system 100.
  • the object 14 has a distance from the wing elements 11 that is still too large for the wing elements 11 to be moved from the closed position shown into the open position, as indicated by the arrows.
  • Figure 2c represents the object 14 in the sensor detection area 13 on the exit side of the door system 100, and the object 14 has a distance from the wing elements 11, which causes the wing elements 11 to move again.
  • The wing elements 11 are continuously and dynamically regulated in direct response to the current position of the object 14. If, for example, the object 14 approaches the wing elements 11 from the position of Figure 2a towards the position of Figure 2b but then turns around, the wing elements 11 would immediately close again as a result of the active control loop, without the wing elements 11 first moving into a predetermined opening position. The same applies to the transition from Figure 2b to Figure 2c; and if the object 14 were to turn around again after passing through the door system 100, the wing elements 11 would move from the closed position, or on the way there, back to the open position.
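  • A minimal sketch of such continuous regulation is given below: the opening width follows the object's current distance instead of jumping to a fixed open position. The thresholds and the linear mapping are illustrative assumptions.

```python
# Sketch: map the current object distance to a target opening fraction
# (0.0 = closed, 1.0 = fully open); values are illustrative.
def target_opening_fraction(object_distance_m, open_below_m=1.0, closed_above_m=3.0):
    """Continuous, dynamic regulation: interpolate between closed and open."""
    if object_distance_m <= open_below_m:
        return 1.0
    if object_distance_m >= closed_above_m:
        return 0.0
    return (closed_above_m - object_distance_m) / (closed_above_m - open_below_m)
```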
  • The sensor detection areas 13 can also extend into the movement range of the wing elements 11, so that closing edge protection can also be carried out by means of the sensor units 10, which serve to detect and track the objects 14 with regard to position, speed of movement and direction of movement as well as the contour of the objects 14.
  • A check of the authenticity and correctness of the sensor data 12 is therefore important; this is checked using the method according to the invention by forming a test event 20.
  • Figure 3 shows a sensor unit 10 arranged on a door system 100, with the sensor unit 10 forming a component of the door system 100 in the sense of the present invention.
  • the sensor unit 10 itself has one and preferably two sensor cameras 15, which are set up as stereo cameras and can generate a 3D camera image.
  • the sensor unit 10 has, for example, a reference distance measuring means 18, with which the distance, for example of the object 14, within the sensor detection area 13 can be determined in order to check the correctness of the function of the sensor unit 10.
  • the test event is generated in a modified form so that the reference distance measuring means 18 possibly outputs a different distance value than the sensor unit 10, which is recognized by the control unit.
  • This distance signal is output to the control unit 16, and at the same time the distance of the object 14 detected by the sensor cameras 15 is measured. If the distance values agree with one another, taking into account a tolerable deviation, the sensor data 12 can be classified as authentic, so that the door system 100 can continue to be operated. If the distances were to deviate from one another, it must be assumed that the camera image of the sensor camera 15 is no longer current, so that the sensor data is no longer correct, and operation of the door system 100 must be temporarily interrupted because otherwise a collision could occur, for example between the object 14 and the wing elements 11.
  • the method according to the invention of supplying the sensor unit with a test event, with which the authenticity of the sensor data is checked by subsequently analyzing the test event from the sensor data using a control unit, in particular in conjunction with an image evaluation unit, can also be used for activating an escape route door.

EP22168690.0A 2022-04-18 2022-04-18 Method for checking the accuracy of the data of a sensor unit of a door system Pending EP4265877A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP22168690.0A EP4265877A1 (fr) 2022-04-18 2022-04-18 Method for checking the accuracy of the data of a sensor unit of a door system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP22168690.0A EP4265877A1 (fr) 2022-04-18 2022-04-18 Method for checking the accuracy of the data of a sensor unit of a door system

Publications (1)

Publication Number Publication Date
EP4265877A1 2023-10-25

Family

ID=81327609

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22168690.0A Pending EP4265877A1 (fr) 2022-04-18 2022-04-18 Procédé de vérification de l'exactitude des données d'une unité de capteur d'une installation de porte

Country Status (1)

Country Link
EP (1) EP4265877A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE20320497U1 (de) 2003-06-04 2004-11-11 Dorma Gmbh + Co. Kg Door control with a presence sensor
EP2698649B1 (fr) * 2010-11-15 2015-07-08 Cedes AG Monitoring sensor with automatic checking
DE102015211913A1 (de) * 2015-06-26 2016-12-29 Siemens Aktiengesellschaft Vehicle door device
WO2018064745A1 (fr) 2016-10-03 2018-04-12 Sensotech Inc. Time-of-flight (TOF) based detection system for an automatic door
DE102018104202A1 (de) * 2018-02-23 2019-08-29 Marantec Antriebs- Und Steuerungstechnik Gmbh & Co. Kg Method for operating a gate system and gate system

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE