EP1170715A2 - Verfahren zur Bodenraumüberwachung - Google Patents
Verfahren zur Bodenraumüberwachung (Method for ground space monitoring)
- Publication number
- EP1170715A2 EP1170715A2 EP01116027A EP01116027A EP1170715A2 EP 1170715 A2 EP1170715 A2 EP 1170715A2 EP 01116027 A EP01116027 A EP 01116027A EP 01116027 A EP01116027 A EP 01116027A EP 1170715 A2 EP1170715 A2 EP 1170715A2
- Authority
- EP
- European Patent Office
- Prior art keywords
- state vectors
- data
- traffic situation
- traffic
- state
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/20—Arrangements for acquiring, generating, sharing or displaying traffic information
- G08G5/22—Arrangements for acquiring, generating, sharing or displaying traffic information located on the ground
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft
- G08G5/70—Arrangements for monitoring traffic-related situations or conditions
- G08G5/72—Arrangements for monitoring traffic-related situations or conditions for monitoring traffic
- G08G5/727—Arrangements for monitoring traffic-related situations or conditions for monitoring traffic from a ground station
Definitions
- The present invention relates to a method for monitoring a spatial region, in particular the ground space at airports.
- Air and ground surveillance is carried out as part of air traffic control and serves to keep air traffic smooth, fast and safe.
- The tasks of air traffic control include the avoidance of collisions between vehicles in the air and on the runways, the tarmac and the parking areas of controlled airports.
- Air traffic capacity is essentially limited by two bottlenecks. On the one hand, the usable airspace is severely restricted by fixed airways and corridors. On the other hand, traffic throughput on the ground is often severely limited, because inexpensive and intelligent ground space monitoring systems are missing and because, besides aircraft, a variety of service vehicles take part in ground traffic.
- The known ground space monitoring systems are very expensive because they require a large number of sensors, such as airport radar, secondary radar and GPS.
- The object of the present invention is therefore to provide a method that allows inexpensive and effective monitoring of a spatial region, in particular of the air and ground space of airports.
- A traffic location can be understood as, for example, any place on the airfield. This can be the take-off or landing runway, any access road on the apron, or a gate exit. Even though the method according to the invention can already be used with advantage if only one traffic location is monitored with at least one video sensor, preferably a whole series of traffic locations or traffic hubs are each monitored with at least one video sensor.
- The video sensor can be any sensor that reacts to light. Digital video cameras are particularly preferred because they deliver image data that can be processed especially easily.
- This image data can be in bitmap format, for example, with each pixel entered individually in the file. However, other, less memory-intensive data formats can also be used.
- The video sensor provides a snapshot of its field of view in the form of an image data file, which is expediently read into a storage unit. This image data is evaluated to recognize one or more events, if any are present. An event is understood as any irregularity in the field of view of the video sensor, in particular the movement of objects in the field of view of the camera or a foreground object in front of a specific, stored, fixed background.
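The event-detection step described above — comparing a snapshot against the stored fixed background — can be sketched as simple image differencing. The function name, threshold and minimum-pixel values below are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def detect_events(frame, background, threshold=30, min_pixels=20):
    """Flag an event wherever the snapshot deviates from the stored
    fixed background by more than `threshold` gray values; very small
    deviations (fewer than `min_pixels` pixels) are treated as noise."""
    diff = np.abs(frame.astype(int) - background.astype(int))
    mask = diff > threshold
    return mask if mask.sum() >= min_pixels else np.zeros_like(mask)

# A dark, empty background; the snapshot contains a bright 8x8 "object".
background = np.zeros((64, 64), dtype=np.uint8)
frame = background.copy()
frame[10:18, 20:28] = 200
event_mask = detect_events(frame, background)
```

The resulting pixel group (the `True` region of the mask) is what the text calls an event; its centroid could serve as the position entry of a state vector.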
- A state vector is calculated from the image data for each recognized event.
- The state vector can be defined arbitrarily, e.g. as the difference between two sets of image pixels or data. One possibility is for it to represent the position of the observed event.
- The method described thus enables one or more traffic locations to be monitored efficiently.
- The object data can be passed either directly to the air traffic control personnel or to a suitable processing facility.
- The air traffic control personnel are thereby informed at all times whether an aircraft is at one of the observed traffic locations, or where the aircraft are located.
- A method is particularly preferred in which image data of at least two temporally spaced snapshots of at least one video sensor are read in.
- From two temporally staggered snapshots of the same video sensor, and thus of the same traffic location, a speed component of the object can be calculated from the change in position of the pixel group that represents the event.
- With a suitable installation of the video sensor relative to the corresponding route of the aircraft, this speed component corresponds to the absolute speed. If the route is not straight, it can be advantageous to monitor at least one traffic location with several, preferably at least three, video sensors. In this case, the relative speed component of the object can be calculated separately for each video sensor. Knowing the locations of the video sensors, the absolute speed and the direction of movement of the object can then be determined.
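The speed component derived from two temporally spaced snapshots can be written down directly as a difference quotient; the positions in metres and the helper name are assumptions for illustration:

```python
import numpy as np

def velocity_from_snapshots(p1, p2, dt):
    """Velocity vector, absolute speed and heading from two positions
    of the same pixel group taken `dt` seconds apart."""
    v = (np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)) / dt
    speed = float(np.linalg.norm(v))
    heading_deg = float(np.degrees(np.arctan2(v[1], v[0])))
    return v, speed, heading_deg

# An object moves 30 m east and 40 m north between two snapshots 2 s apart.
v, speed, heading_deg = velocity_from_snapshots((0.0, 0.0), (30.0, 40.0), 2.0)
```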
- A particularly preferred embodiment of the present method therefore provides that the state vector also represents the speed and/or the direction of movement of the object.
- Furthermore, the state vector can represent the acceleration of an object.
- To calculate the acceleration, at least three snapshots of a video sensor are required. From these three snapshots, the acceleration of the object can be calculated, which significantly improves the accuracy of the position prediction of the method. This is of great advantage particularly in areas in which the speed of the aircraft changes significantly, e.g. due to braking. If the acceleration can be taken from the state vector of the object, it can be taken into account when extrapolating the position of the object over time.
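The role of the third snapshot can be made concrete with finite differences: two snapshots yield a speed, three yield an acceleration that improves the extrapolation. The names and numbers below are illustrative assumptions:

```python
def accel_from_three(x0, x1, x2, dt):
    """Acceleration from three equally spaced one-dimensional positions."""
    v01 = (x1 - x0) / dt   # speed between first and second snapshot
    v12 = (x2 - x1) / dt   # speed between second and third snapshot
    return (v12 - v01) / dt

def predict(x, v, a, t):
    """Extrapolate a position ahead by t seconds using speed and acceleration."""
    return x + v * t + 0.5 * a * t * t

# A braking aircraft: positions 0 m, 50 m, 80 m at 1 s spacing.
x0, x1, x2 = 0.0, 50.0, 80.0
a = accel_from_three(x0, x1, x2, 1.0)
next_pos = predict(x2, (x2 - x1) / 1.0, a, 1.0)
```

Without the acceleration term the naive prediction would be 110 m; accounting for the braking puts the aircraft near 100 m one second later.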
- A value can be assigned to the state vector that is a measure of the statistical probability of the object parameters.
- These values can also be integrated into the state vector. This measure can significantly reduce the error rate in event detection. For example, it is possible that the image data of the video sensor do not show the object to be observed due to external influences, e.g. heavy rainfall or birds in flight. It can then happen that either an object is recognized which is actually not at the traffic location, or that an aircraft at the traffic location is not recognized. Therefore, a state vector of an object that was recognized in many consecutive snapshots and by different video sensors is assigned a high statistical probability, while a state vector of an object that was recognized in only a single snapshot is assigned a very low statistical probability. The statistical probability is therefore a measure of the quality (i.e. the accuracy and reliability) of the detected event. It is then possible, for example, to define a limit value and to disregard all state vectors whose statistical probability falls below this limit.
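The limit-value idea can be sketched as a simple filter over state vectors carrying a probability entry. The confidence model below — growing with consecutive detections and with the number of sensors — is an assumed heuristic, not the patent's formula:

```python
import math

def confidence(n_snapshots, n_sensors, lam=0.5):
    """Assumed heuristic: confidence grows with the number of consecutive
    snapshots and independent video sensors in which the object was seen."""
    return 1.0 - math.exp(-lam * n_snapshots * n_sensors)

def filter_tracks(tracks, limit=0.5):
    """Keep only state vectors whose statistical probability reaches the limit."""
    return [t for t in tracks if t["p"] >= limit]

tracks = [
    {"id": "LH123", "p": confidence(10, 3)},  # seen often, by three sensors
    {"id": "ghost", "p": confidence(1, 1)},   # seen once -> likely rain/birds
]
kept = filter_tracks(tracks)
```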
- A particularly preferred embodiment of the method according to the invention provides that the assignment of the state vectors to an object takes place with access to traffic situation data that was not obtained by the video sensors.
- This traffic situation data can be obtained in various ways.
- For example, the air traffic control personnel assign the individual state vectors to the arriving aircraft at a suitable terminal.
- The state vector then also contains an object identification number or an object name. This makes it possible to use the method not only to monitor where an aircraft is located, but even to monitor where each individual aircraft is located.
- Alternatively, the existing air traffic control system can be coupled with a system that implements the method according to the invention, so that the ground space monitoring system, whenever an aircraft lands on the runway, accesses the flight plan data of the existing air traffic control system and identifies the aircraft. This identification is then maintained during further ground space monitoring. In the same way, an object that moves away from a parking position can be identified using the occupancy plan of the parking positions.
- Radar data can also be used for this purpose.
- All other position sensors can also provide additional traffic situation data. Mode S radar sensors, near-range radar networks or GPS receivers and servers, for example, come into question here.
- A particularly expedient embodiment provides that the image data are filtered digitally, preferably using discrete Kalman filtering.
- The traffic situation data can also be digitally filtered, with discrete Kalman filtering preferably also being used here.
- Kalman filtering enables real-time estimation and smoothing of the measured values.
- Other suitable estimation techniques, for example MLS filtering (minimum least squares), can also be used.
- The quality of the method according to the invention can thereby be significantly increased.
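A minimal discrete Kalman filter of the kind mentioned can be sketched for a one-dimensional constant-velocity state; the process and measurement noise values, and the motion model itself, are assumptions for illustration:

```python
import numpy as np

def kalman_step(x, P, z, dt=1.0, q=0.01, r=1.0):
    """One predict/update cycle for state x = [position, speed] with a
    noisy position measurement z; q and r are assumed noise levels."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    H = np.array([[1.0, 0.0]])              # only position is measured
    # predict
    x = F @ x
    P = F @ P @ F.T + q * np.eye(2)
    # update
    y = z - (H @ x)                         # innovation
    S = H @ P @ H.T + r                     # innovation covariance
    K = P @ H.T / S                         # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Smooth noisy positions of an object moving at roughly 2 m/s.
x, P = np.array([0.0, 0.0]), np.eye(2) * 10.0
for z in [2.1, 3.9, 6.2, 8.0, 9.9]:
    x, P = kalman_step(x, P, z)
```

After a few cycles the filter estimates both the position and, although only positions were measured, the speed of the object.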
- Preferably, the state vectors of the same object that were determined by different sensors are combined. The same applies to the state vectors received from the sensors and the state vectors obtained from the other traffic situation data. This makes the ground space monitoring clearer, since fewer state vectors have to be followed.
- The state vectors of the same object can furthermore be correlated with one another. If, for example, the radar system delivers reliable position data of a certain object at large time intervals, the state vector of this object can be correlated with the radar data. In other words, whenever the reliable radar data is available, the state vector of this object determined from the video sensors, and extrapolated in time if necessary, is adapted to it. This correlation allows the state vectors to be corrected even if the object is not at a traffic location monitored by video sensors.
- A method is particularly preferred in which the assignment of the state vectors to objects is followed by the step of generating a traffic situation representation.
- This traffic situation representation can be displayed in any coordinate system.
- The state vectors of the recognized objects are converted into positions in the traffic situation display and entered into the traffic situation display at these positions. It goes without saying that it is possible to visualize the traffic situation in such a way that the air traffic control personnel can see at a glance where the individual aircraft are and whether a collision is imminent anywhere.
- Preferably, the state vectors are evaluated with regard to possible collisions.
- The state vectors are then not only used to generate a traffic situation representation in real time; rather, the state vectors are also extrapolated into the future in order to be able to predict possible collisions.
- The method therefore provides that an alarm signal is activated depending on the result of the collision evaluation.
- This alarm signal can be, for example, an optical or acoustic signal for the air traffic control personnel.
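Extrapolating two state vectors and checking their minimum separation over a time horizon is one way such a collision prediction could look; the horizon, step size and separation limit below are assumptions:

```python
import numpy as np

def min_separation(p1, v1, p2, v2, horizon=60.0, step=1.0):
    """Smallest distance between two linearly extrapolated state vectors
    (position + velocity) within the next `horizon` seconds."""
    times = np.arange(0.0, horizon + step, step)
    return min(
        float(np.linalg.norm((p1 + v1 * t) - (p2 + v2 * t))) for t in times
    )

# Two vehicles converging on the same taxiway intersection.
p1, v1 = np.array([0.0, 0.0]), np.array([10.0, 0.0])       # eastbound, 10 m/s
p2, v2 = np.array([300.0, -300.0]), np.array([0.0, 10.0])  # northbound, 10 m/s
sep = min_separation(p1, v1, p2, v2)
alarm = sep < 50.0   # assumed separation limit triggers the alarm signal
```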
- The single figure shows the basic structure of a ground space monitoring system.
- The individual components of the system are interconnected via a communication network 9 (e.g. a TCP/IP LAN).
- In the figure, the individual clients 10, 11 and 12 are shown above the line 9, while the three servers 2, 3 and 5 are shown below the line 9.
- Server 2 is responsible for merging the data of the video sensors.
- The individual video sensors 7, which recognize the objects 8, are connected to the server 2 via the control and read-out unit 1.
- The server 2 collects the sensor data, filters them with time-variant discrete Kalman filtering and forms the state vectors of the objects 8.
- The state vectors are passed on to the radar server 3 for further processing.
- The radar server 3 filters the data of one or several airspace surveillance radars with a time-invariant discrete Kalman filter to generate flight tracks. This server carries out a correlation between the video-based trajectories and the radar trajectories, together with a correlation with the flight plan data 5.
- The correlated information is provided to the clients 10, 11 and 12.
- The flight plan server 5 delivers the flight plan data to the radar server 3 for the correlation of the radar tracks with the flight plan and for the time-based identification of aircraft taking off.
- The flight plan server 5 receives flight plan changes from its clients 6.
- The method according to the invention thus combines video sensor data with other traffic situation data to produce a traffic situation representation that guarantees safe and reliable monitoring of the air and ground space.
- A system that realizes the method according to the invention consists of sensor hardware 1, 7, 8, computer hardware 2, 3, 5, network infrastructure 9, as well as sensor software, communication software, processing software and situation display software.
- The video sensor data are recorded with the help of the video sensors 7.
- The pixel data comprise so-called image frames of one or more high-frequency video sensors (typically greater than 27 Hz).
- Image frames are at least two-dimensional vectors of pixels (gray or color values), which reflect the recording of the digital camera at a finite resolution.
- From the image frames, the state data or state vectors are calculated.
- This information includes: object identification number, object classification, position, speed, direction of movement, acceleration and statistical probability/quality (accuracy, reliability).
- The other traffic situation data can be, for example, radar data provided by the radar server 3. This is digitized data from, for example, long-range radars and airport radars; these radars are symbolized by the box with the reference number 4.
- The flight plan data provided by the flight plan data server 5 can come, for example, from existing flight plan systems that are connected to a system realizing the method according to the invention.
- The video sensor data are combined with the other traffic situation data by image data processing, digital filtering, statistical inference, sensor data fusion, track correlation and time window correlation.
- Image data processing includes the transmission of the pixel data of a selected digital camera to the situation display system as well as the recognition and classification of objects in the pixel data.
- Digital filtering of the measured values is carried out using discrete Kalman filtering. With its help, all state vectors extracted from the event detection of the video sensor data are filtered.
- In the same way, the radar data or the data from other sensors can be filtered.
- Statistical inference means that the calculation of future state vectors, or the extrapolation of state vectors, takes into account that the source data carry a corresponding inaccuracy and unreliability, so that the statistical probability of the new state vector is calculated from the propagation of the inaccuracy and unreliability of the original state vectors.
- Sensor data fusion is the correct merging of object information for objects that were detected by several sensors. If, for example, two different video sensors have received complementary motion information about one and the same object, it is entered in a common state vector. With sensor data fusion, however, care must be taken to ensure that different objects, which may come very close to each other, can be correctly distinguished at all times.
- Track correlation is understood to mean that the flight or movement tracks calculated on the basis of different sensors are checked for the same objects. In other words, the flight tracks determined from the video sensors are correlated, if available, with the flight tracks that were determined from the radar data.
- Time window correlation is understood to mean that the tracks determined on the basis of the sensor data are matched, within a certain time tolerance window, to the movement schedule of the objects in accordance with the available flight plan data.
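A common way to merge complementary measurements of the same object into one state vector entry is inverse-variance weighting; this standard combination rule stands in here for the patent's unspecified fusion step:

```python
def fuse(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two estimates of the same
    quantity, e.g. a position seen by two different sensors."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)  # fused estimate is more certain than either
    return fused, fused_var

# Video sensor: 102 m with variance 9; radar: 98 m with variance 1.
pos, var = fuse(102.0, 9.0, 98.0, 1.0)
```

The fused position (98.4 m) lies much closer to the more reliable radar value, and its variance (0.9) is smaller than either input's.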
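The time window correlation can be sketched as matching an observed time against flight plan entries within a tolerance window; the callsigns, times and tolerance value are invented for illustration:

```python
def correlate_with_flightplan(observed_time, plan, tolerance=120.0):
    """Return the callsign whose planned time (seconds since midnight)
    is closest to the observed time within the tolerance window, else None."""
    best = None
    for callsign, planned in plan.items():
        dt = abs(observed_time - planned)
        if dt <= tolerance and (best is None or dt < best[1]):
            best = (callsign, dt)
    return best[0] if best else None

plan = {"LH440": 36000.0, "BA917": 36900.0}   # planned landing times
ident = correlate_with_flightplan(36050.0, plan)
```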
- The state vectors determined in this way are entered in a situation display. This is done in sequential cycles.
- The individual position data are passed from the server 2 to the radar server 3.
- The servers 2 and/or 3 also take over the conflict determination. That means that pairs of state vectors are checked to determine whether there is a risk of collision. Furthermore, it is checked whether blocked traffic protection areas are violated by intruding objects.
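The check for violated traffic protection areas reduces, in the simplest case, to a point-in-region test on each state vector's position; the rectangular area and the object records below are illustrative assumptions:

```python
def violates_protected_area(track, area):
    """True if a state vector's position lies inside a blocked,
    axis-aligned rectangular traffic protection area."""
    x, y = track["pos"]
    (x0, y0), (x1, y1) = area
    return x0 <= x <= x1 and y0 <= y <= y1

runway_strip = ((0.0, -75.0), (3000.0, 75.0))   # blocked while in use
intruder = {"id": "FOLLOWME1", "pos": (1200.0, 10.0)}
bus = {"id": "BUS2", "pos": (1200.0, 200.0)}
alerts = [t["id"] for t in (intruder, bus)
          if violates_protected_area(t, runway_strip)]
```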
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
- Monitoring at least one traffic location with at least one video sensor,
- reading the image data of at least one snapshot of the at least one video sensor into a storage unit,
- recognizing events,
- calculating state vectors of the events from the image data, and
- assigning the state vectors each to an object.
- Figure 1
- a schematic structure of a ground space monitoring system.
Claims (22)
- A method for monitoring a spatial region, comprising the steps of: monitoring at least one traffic location with at least one video sensor; reading the image data of at least one snapshot of the at least one video sensor into a storage unit; recognizing events; calculating state vectors of the events from the image data; assigning the state vectors each to an object; and outputting object data.
- Method according to claim 1, characterized in that the monitored spatial region forms at least part of an airfield or airport site.
- Method according to claim 1 or 2, characterized in that the state vector represents the position of an object.
- Method according to one of claims 1 to 3, characterized in that image data of at least two temporally spaced snapshots of the at least one video sensor are read in.
- Method according to one of claims 1 to 4, characterized in that the at least one traffic location is monitored with several, preferably with at least three, video sensors.
- Method according to one of claims 1 to 5, characterized in that the state vector represents the speed and/or the direction of movement of an object.
- Method according to one of claims 1 to 6, characterized in that the state vector represents the acceleration of an object.
- Method according to one of claims 1 to 7, characterized in that a value is assigned to the state vector which is a measure of the statistical probability of the object or of the accuracy and reliability of the object parameters.
- Method according to one of claims 1 to 8, characterized in that the assignment of the state vectors to an object takes place with access to traffic situation data that was not obtained by the video sensors.
- Method according to claim 9, characterized in that the assignment of the state vectors to an object takes place with access to radar data.
- Method according to claim 9 or 10, characterized in that the assignment of the state vectors to an object takes place with access to flight plan data.
- Method according to one of claims 9 to 11, characterized in that the assignment of the state vectors to an object takes place with access to position sensors.
- Method according to one of claims 1 to 12, characterized in that the image data are filtered digitally, preferably using discrete Kalman filtering or MLS filtering.
- Method according to one of claims 9 to 13, characterized in that at least part of the traffic situation data is filtered digitally, preferably using discrete Kalman filtering or MLS filtering.
- Method according to one of claims 1 to 14, characterized in that the state vectors are computed onward time-dependently in real time.
- Method according to one of claims 1 to 15, characterized in that the state vectors of the same object that were determined by different sensors or traffic situation data are combined.
- Method according to one of claims 1 to 16, characterized in that the state vectors of the same object that were determined from different traffic situation data are correlated.
- Method according to one of claims 1 to 17, characterized in that the assignment of the state vectors to objects is followed by the step: generation of a traffic situation representation.
- Method according to claim 18, characterized in that the state vectors are converted into positions in the traffic situation representation and entered into the traffic situation representation at these positions.
- Method according to claim 18 or 19, characterized in that the state vectors are evaluated with regard to possible collisions.
- Method according to claim 20, characterized in that an alarm signal is activated depending on the result of the collision evaluation.
- Method according to one of claims 18 to 21, characterized in that the traffic situation representation is updated at preferably fixed, equidistant time intervals.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| DE10032433 | 2000-07-04 | ||
| DE10032433A DE10032433A1 (de) | 2000-07-04 | 2000-07-04 | Verfahren zur Bodenraumüberwachung |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| EP1170715A2 true EP1170715A2 (de) | 2002-01-09 |
| EP1170715A3 EP1170715A3 (de) | 2003-01-29 |
Family
ID=7647721
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP01116027A Withdrawn EP1170715A3 (de) | 2000-07-04 | 2001-07-02 | Verfahren zur Bodenraumüberwachung |
Country Status (2)
| Country | Link |
|---|---|
| EP (1) | EP1170715A3 (de) |
| DE (1) | DE10032433A1 (de) |
Families Citing this family (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE102005041705A1 (de) | 2005-09-02 | 2007-03-15 | Oerlikon Contraves Ag | Verfahren zur Raum-/Luftraumüberwachung |
| DE102007014599A1 (de) * | 2007-03-23 | 2008-09-25 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Verfahren zur Überwachung eines Rollverkehrs-Management-Systems für Flughäfen |
| DE102008018880A1 (de) | 2008-04-14 | 2009-10-15 | Carl Zeiss Optronics Gmbh | Überwachungsverfahren und -vorrichtung für Windkraftanlagen, Gebäude mit transparenten Bereichen, Start- und Landebahnen und/oder Flugkorridore von Flughäfen |
| DE102008054203A1 (de) * | 2008-10-31 | 2010-06-10 | Adb N.V | Vorrichtung zur Flugfeldbefeuerung eines Flughafens |
| DE102016223094A1 (de) * | 2016-11-23 | 2018-05-24 | Robert Bosch Gmbh | Verfahren und System zum Detektieren eines sich innerhalb eines Parkplatzes befindenden erhabenen Objekts |
| DE102019106461A1 (de) * | 2019-03-14 | 2020-09-17 | Bayerische Motoren Werke Aktiengesellschaft | Verfahren, Vorrichtung, Computerprogramm und Computerprogrammprodukt zur Verarbeitung von Messdatensätzen |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| DE4140406C2 (de) * | 1991-12-07 | 1998-09-03 | Daimler Benz Aerospace Ag | Verfahren zur Orientierung, Navigation, Führung und Überwachung von Flugzeugen |
| US5519618A (en) * | 1993-08-02 | 1996-05-21 | Massachusetts Institute Of Technology | Airport surface safety logic |
| DE4332753C2 (de) * | 1993-09-25 | 1997-01-30 | Bosch Gmbh Robert | Verfahren zur Erkennung bewegter Objekte |
| JP3605667B2 (ja) * | 1996-01-30 | 2004-12-22 | 株式会社ナックイメージテクノロジー | 航空機の離着陸性能計測装置 |
| AUPN903296A0 (en) * | 1996-03-29 | 1996-04-26 | Commonwealth Scientific And Industrial Research Organisation | An aircraft detection system |
| DE19640938A1 (de) * | 1996-10-04 | 1998-04-09 | Bosch Gmbh Robert | Anordnung und Verfahren zur Überwachung von Verkehrsflächen |
| EP0939946A1 (de) * | 1996-11-15 | 1999-09-08 | Siemens Aktiengesellschaft | Terminalkoordinationssystem für flughäfen |
-
2000
- 2000-07-04 DE DE10032433A patent/DE10032433A1/de not_active Withdrawn
-
2001
- 2001-07-02 EP EP01116027A patent/EP1170715A3/de not_active Withdrawn
Cited By (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6099517A (en) * | 1986-08-19 | 2000-08-08 | Genentech, Inc. | Intrapulmonary delivery of polypeptide growth factors and cytokines |
| EP1363140A1 (de) * | 2002-05-13 | 2003-11-19 | Joval N.V. | Verfahren und Anlage zum Feststellen eines Objektes auf einer Startbahn |
| BE1014829A3 (nl) * | 2002-05-13 | 2004-05-04 | Joval N V | Werkwijze en inrichting voor het bepalen van de |
| WO2004008403A3 (en) * | 2002-07-15 | 2004-03-11 | Magna B S P Ltd | Method and apparatus for implementing multipurpose monitoring system |
| US8111289B2 (en) | 2002-07-15 | 2012-02-07 | Magna B.S.P. Ltd. | Method and apparatus for implementing multipurpose monitoring system |
| EP1561190A4 (de) * | 2002-10-28 | 2006-07-26 | Xsight Systems Ltd | System und verfahren zur erkennung von fremdkörpern |
| US7253748B2 (en) | 2002-10-28 | 2007-08-07 | Xsight Systems Ltd | Foreign object detection system and method |
| EP1995707A3 (de) * | 2002-10-28 | 2009-03-25 | Xsight Systems Ltd. | System zur Erkennung von Fremdkörpern |
| US7663507B2 (en) | 2002-10-28 | 2010-02-16 | Xsight Systems Ltd. | Foreign object detection system and method |
| EP1936583A1 (de) * | 2006-12-20 | 2008-06-25 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Flughafenverkehrsinformations-Anzeigesystem |
| US8022841B2 (en) | 2008-03-31 | 2011-09-20 | Xsight Systems Ltd. | System and method for ascription of foreign object debris detected on airport travel surfaces to foreign object sources |
| US9135830B2 (en) | 2010-02-18 | 2015-09-15 | Xsight Systems Ltd. | Airport travel surface edge lighting and foreign object detection system and method |
Also Published As
| Publication number | Publication date |
|---|---|
| EP1170715A3 (de) | 2003-01-29 |
| DE10032433A1 (de) | 2002-01-17 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| EP0886847B1 (de) | Verfahren zur erkennung eines kollisionsrisikos und zur vermeidung von kollisionen in der luftfahrt | |
| EP1505556B1 (de) | Verfahren und Vorrichtung zur Erkennung eines Flughindernisses | |
| EP0883873B1 (de) | Flughafen-leitsystem, insbesondere flughafen-bodenverkehrsleitsystem | |
| DE69731009T2 (de) | System zum Erkennen von Hindernissen | |
| EP1475764B1 (de) | Verfahren und Vorrichtung zur Ermittlung der Wahrscheinlichkeit für eine Kollision eines Fahrzeugs mit einem Gegenstand | |
| DE60006550T2 (de) | Luftverkehrsüberwachungssystem | |
| EP1170715A2 (de) | Verfahren zur Bodenraumüberwachung | |
| EP2266314B9 (de) | Überwachungsverfahren und -vorrichtung für flugkorridore oder teile von flugkorridoren von flughäfen | |
| DE102017217056A1 (de) | Verfahren und Einrichtung zum Betreiben eines Fahrerassistenzsystems sowie Fahrerassistenzsystem und Kraftfahrzeug | |
| WO2020200792A1 (de) | Verfahren zur überprüfung eines umfelderfassungssensors eines fahrzeugs und verfahren zum betrieb eines fahrzeugs | |
| DE102019116380A1 (de) | Vorrichtung und Verfahren zum Steuern eines Fahrens eines Fahrzeugs | |
| DE102019212842B4 (de) | Steuerung eines Kraftfahrzeugs unter Verwendung einer unbemannten Flugvorrichtung | |
| EP0740280A2 (de) | Verfahren zur Störungserkennung im Strassenverkehr | |
| WO2019162794A1 (de) | Verfahren und system zur erkennung von für ein fahrzeug geeigneten parklücken | |
| DE102014219691A1 (de) | Verfahren zur Überwachung einer Umgebung einer Schienenfahrbahn und Überwachungssystem | |
| EP2521070A2 (de) | Verfahren und System zum Erfassen einer statischen oder dynamischen Szene, zum Bestimmen von Rohereignissen und zum Erkennen von freien Flächen in einem Beobachtungsgebiet | |
| DE102019114354A1 (de) | Verfahren und System zur Vermeidung von Kollisionen zwischen Fluggeräten und anderen fliegenden Objekten | |
| DE19621612A1 (de) | Verfahren und Vorrichtung zur optischen Freiraumüberwachung | |
| EP1770595B1 (de) | Verfahren zur Unterstützung von Tiefflügen zur Erkennung von Hindernissen | |
| EP3472818B1 (de) | Konzept zum steuern eines verkehrs innerhalb eines parkplatzes | |
| DE102017212513A1 (de) | Verfahren und System zum Detektieren eines freien Bereiches innerhalb eines Parkplatzes | |
| EP4350657B1 (de) | Alarmsystem zur warnung vulnerabler verkehrsteilnehmer in einem vorgegebenen strassenabschnitt | |
| DE112013003958T5 (de) | Geräuschbeobachtungsvorrichtung und Geräuschbeobachtungsverfahren | |
| EP2254104B1 (de) | Verfahren zum automatischen Erkennen einer Situationsänderung | |
| DE10049366A1 (de) | Verfahren zum Überwachen eines Sicherheitsbereichs und entsprechendes System |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
| AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
| AX | Request for extension of the european patent |
Free format text: AL;LT;LV;MK;RO;SI |
|
| PUAL | Search report despatched |
Free format text: ORIGINAL CODE: 0009013 |
|
| AK | Designated contracting states |
Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LI LU MC NL PT SE TR |
|
| AX | Request for extension of the european patent |
Extension state: AL LT LV MK RO SI |
|
| 19U | Interruption of proceedings before grant |
Effective date: 20020920 |
|
| 19W | Proceedings resumed before grant after interruption of proceedings |
Effective date: 20031116 |
|
| AKX | Designation fees paid | ||
| STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
| REG | Reference to a national code |
Ref country code: DE Ref legal event code: 8566 |
|
| 18D | Application deemed to be withdrawn |
Effective date: 20031117 |