EP1936583A1 - Airport traffic information display system - Google Patents

Airport traffic information display system

Info

Publication number
EP1936583A1
EP1936583A1 (application EP07024748A)
Authority
EP
European Patent Office
Prior art keywords
image processing
airport
processing unit
characterized
panoramic
Prior art date
Legal status
Granted
Application number
EP07024748A
Other languages
German (de)
French (fr)
Other versions
EP1936583B1 (en)
Inventor
Norbert Dr. Fürstenau
Markus Schmidt
Michael Rudolph
Bernd Dr. Werther
Current Assignee
Deutsches Zentrum fur Luft- und Raumfahrt eV
Original Assignee
Deutsches Zentrum fur Luft- und Raumfahrt eV
Priority date
Filing date
Publication date
Priority to DE200610060904 (patent DE102006060904B4)
Application filed by Deutsches Zentrum fur Luft- und Raumfahrt eV
Publication of EP1936583A1
Application granted
Publication of EP1936583B1
Status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08G: TRAFFIC CONTROL SYSTEMS
    • G08G 5/00: Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0073: Surveillance aids
    • G08G 5/0082: Surveillance aids for monitoring traffic from a ground station
    • G08G 5/0017: Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
    • G08G 5/0026: Arrangements for implementing traffic-related aircraft activities located on the ground

Abstract

An airport traffic information display system (1) is described, having at least one camera (2) that can be aimed at an observable airport area, an image processing unit (9) connected to the at least one camera (2), and at least one display unit (11) communicating with the image processing unit (9). The image processing unit (9) is set up to select and display a foveal panoramic segment corresponding to the attention focus of a selected line of sight onto the observed airport area, and to simultaneously present information about moving objects in the peripheral, extra-foveal panoramic segment, which is not itself displayed.

Description

  • The invention relates to an airport traffic information display system having at least one camera that can be aimed at an observable airport area, an image processing unit connected to the at least one camera, and at least one display unit in communication with the image processing unit.
  • There is a need to control air traffic in the airport area without a direct outside view, using a purely sensor-based control center. This has the advantage that, at large airport facilities, towers for monitoring outlying areas that are not visible from the main tower no longer have to be staffed; the entire airport can instead be controlled from the main tower. The effort for constructing tower structures to monitor the outlying areas is also significantly reduced.
  • There is also a need to reduce the burden of controlling low-traffic airports by relocating the control functions to the controllers of a busier airport. For this purpose, sensor-based monitoring of the airport area with transfer of the image data to a display unit for the controllers who carry out the control is required.
  • The working conditions demand from the controller, in addition to communication by radio and telephone, a frequent change of attention between the outside view, supplemented now and then by reaching for the binoculars, and various monitors and radar displays. In poor visibility, the possibility of visual traffic monitoring by looking out of the windows of the control center is naturally much reduced, and the traffic must be correspondingly reduced for safety reasons.
  • In N. Fürstenau, M. Rudolph, M. Schmidt, B. Werther: Virtual Tower, in: Competition of Visions 2001 to 2004, German Aerospace Center, 2004, pages 16 to 21, the project of developing a virtual control tower, with proposals for human-machine interaction, is described.
  • N. Fürstenau: Perspectives of Virtual Reality for Integration, in: 12th Scientific Seminar, DLR, Institute of Flight Guidance, October 2002, discloses the use of virtual reality technologies to monitor and control air traffic at airports.
  • C. D. Wickens: Multiple Resources and Performance Prediction, in: Theor. Issues in Ergon. Sci., 2002, Vol. 3, No. 2, pages 159 to 177, describes a four-dimensional multiple-resource model in which a distinction is made between visual processing in a person's focal attention area and in the surrounding field, to which qualitatively different brain structures and information-processing mechanisms are assigned. The focal view of, for example, a pilot is almost always confined to the foveal area of the retina and is required for pattern recognition tasks. The background/environment view is strongly coupled to peripheral vision and is responsible for the perception of orientation and of (self-)movement of the observer.
  • DE 10 2005 005 879 discloses a method of reconstructing a real tower exterior view by means of a high-resolution panoramic projection on monitors, or by means of a wide-angle projection, to display the images of at least four high-resolution digital video cameras as a 180° panorama. The digitized signals are provided as a four-segment panorama, with one computer per segment decompressing the compressed image data. One camera and one projection or screen are provided per panoramic segment. Simultaneous real-time image processing for object recognition, motion detection and/or traffic parameter determination (position, speed, etc.) is possible.
  • Wide-angle projections suitable for high-resolution viewing of a video panorama are known in the virtual reality field. These are tiled projection systems with one high-resolution digital projector per tile, or two digital projectors for stereo projection. In simulators for the (virtual) airport tower exterior view, more than four high-resolution fields have so far been joined by pixel-precise overlaying at the vertical edges (stitching) into a cylindrical or hollow-spherical segment projection, yielding a virtual panoramic view (200° to 300°) of the airport surface. Because of their size, such systems are not suitable for a single workstation in the tower environment.
  • For a single workstation with a wide-angle projection (e.g. 180°), a spherical projection can also be used. However, the maximum available resolution of 1600 x 1200 pixels does not meet the requirements for a 180° panorama that could replace the controller's direct outside view from the tower.
  • The object of the present invention is therefore to provide an improved airport traffic information display system that enables high-resolution viewing of the relevant airport areas with reduced technical complexity and simultaneously provides the controller with the necessary overview information.
  • The object is achieved with an airport traffic information display system of the type mentioned, in that the image processing unit is set up to select and display a foveal panoramic segment corresponding to the attention focus of a selected line of sight onto the observed airport area, and to simultaneously display information about moving objects in the peripheral, extra-foveal panoramic segment.
  • This exploits the fact that, when viewing with the naked eye (without magnification by binoculars), only the object area imaged in the foveal area (<5°) of the retina lies within the region of conscious attention and visual processing. Accordingly, a segment typically represented by a single monitor (e.g. 45°) already covers a much larger panoramic segment. For a panoramic projection at a single workstation it is therefore not necessary to display the peripheral area of, e.g., a 180° panorama completely at the same time, since in a real outside view only striking movements are perceived in these extra-foveal sectors anyway. The representation of these extra-foveal sectors is thus, according to the invention, reduced to essential movement information, while the foveal panoramic segment is displayed in high resolution and in detail.
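The selection of a foveal window from a wider panorama can be sketched as follows; this is an illustrative sketch, not from the patent text, and the 45° segment width, pixel counts and function names are assumptions.

```python
# Illustrative sketch: selecting the pixel window of a foveal segment from a
# cylindrical 180-degree panorama, given a viewing direction (gaze azimuth).

PANORAMA_DEG = 180.0

def foveal_window(gaze_deg, seg_deg=45.0, panorama_px=6400):
    """Return (left, right) pixel bounds of the displayed foveal segment.

    gaze_deg: viewing direction in [0, 180), 0 = left edge of the panorama.
    The window is clamped so it never leaves the panorama.
    """
    px_per_deg = panorama_px / PANORAMA_DEG
    half = seg_deg / 2.0
    # Clamp the segment centre so the window stays inside the panorama.
    center = min(max(gaze_deg, half), PANORAMA_DEG - half)
    left = int(round((center - half) * px_per_deg))
    right = int(round((center + half) * px_per_deg))
    return left, right
```

Only the pixels inside this window would be rendered at full resolution; the rest of the panorama is reduced to movement cues.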
  • Preferably, a remotely controllable pan-tilt-zoom camera is provided, which is adapted to select a zoom section of the panoramic segment and to display the zoom section on one of the display units.
  • With this remote-controlled pan-tilt-zoom camera it is thus possible to simulate the controller's use of binoculars and to display an enlarged view of a selected cutout area. In order to facilitate orientation, as in reality, the detailed representation of the foveal panoramic view is preserved.
  • A pan-tilt-zoom camera is a pan-tilt camera with zoom capability. For this purpose, the zoom camera is mounted on a pan-tilt panoramic head, which allows pivoting about the nodal point as well as tilting. The nodal point describes the location of the principal planes of the zoom camera's lens.
  • The image processing unit is connected to the at least one camera either directly or, preferably, indirectly, for example via the transmission system described in DE 10 2005 005 879.
  • Preferably, four cameras are provided for recording a high-resolution 180° panorama. To minimize costs, the number of high-resolution panoramic cameras can optionally be reduced to a single panoramic camera, which can optionally be pivoted via remote control and can thereby cover the entire panorama. The 180° panorama of an observable airport area is thus divided into four 45° segments, each of which is observed by one panoramic camera. Only the current foveal observation range, a section of one of the 45° segments, is reproduced in high resolution and corresponding detail.
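The mapping from a viewing azimuth to the responsible camera segment can be sketched as follows; the four 45° segments follow the description above, while the 1600-pixel frame width and the function names are illustrative assumptions.

```python
# Sketch: mapping a viewing azimuth to one of four 45-degree panoramic
# cameras and the horizontal pixel offset within that camera's frame.

N_CAMERAS = 4
SEG_DEG = 45.0          # each camera covers 45 degrees
FRAME_W = 1600          # assumed pixels per camera frame

def camera_for_azimuth(az_deg):
    """Return (camera_index, pixel_x) for an azimuth in [0, 180)."""
    if not 0.0 <= az_deg < N_CAMERAS * SEG_DEG:
        raise ValueError("azimuth outside the observed 180-degree panorama")
    idx = int(az_deg // SEG_DEG)
    within = az_deg - idx * SEG_DEG            # degrees into this camera
    pixel_x = int(within / SEG_DEG * FRAME_W)  # horizontal pixel column
    return idx, pixel_x
```

A foveal segment that straddles a 45° boundary would draw pixels from two adjacent cameras, as the description of the segment toolbar later notes.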
  • It is particularly advantageous if at least one sensor connected to the image processing unit automatically selects the segment to be observed (the foveal attention area) on the basis of object recognition and/or position determination (e.g. by means of automatic image processing, approach radar, a multilateration system, or satellite navigation with ADS-B). The image processing unit is then set up to select the foveal panoramic segment automatically as a function of the foveal attention sector. This allows the controller's attention, i.e. his foveal information intake, to be directed automatically to areas of interest within the complete panorama, at a speed corresponding to a head rotation, while the non-foveal sectors remain reduced to essential motion information. Manual selection of the high-resolution foveal attention sector by the controller is then not required.
  • The foveal panoramic segment preferably covers less than 45° of the 180° panorama detectable by the cameras.
  • Furthermore, it is advantageous if the image processing unit is set up for the simultaneous, permanent display of the selected panoramic segment (corresponding to the current viewing direction). This ensures that the controller retains orientation.
  • The image processing unit is furthermore preferably set up to select the foveal panoramic segment automatically as a function of characteristic movements detected in the recorded panorama, flight movement information from an approach radar, position information about aircraft and/or vehicles in the observed airport area from a multilateration device, and/or satellite navigation position information transmitted by aircraft. In this way, the controller's attention can be directed automatically to the detected moving objects as a function of the sensed movements in the observed airport area. For example, the approach of an aircraft is detected by means of the approach radar or by satellite navigation position information transmitted by the aircraft, and the high-resolution representation of the foveal panoramic segment is directed at the approaching aircraft. After the controller has grasped the situation, he can manually select another foveal panoramic segment, or a new foveal panoramic segment can be selected automatically by coupling the automatic, sensor-based object recognition and position determination to the video panorama reconstruction.
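Automatic segment selection from heterogeneous sensor reports can be sketched as follows; the priority ordering, record layout and names are illustrative assumptions, not specified by the patent.

```python
# Sketch: choosing the foveal segment automatically from sensor reports
# (image-processing motion detection, approach radar, multilateration, ADS-B).

PRIORITY = {"approach_radar": 3, "adsb": 2, "multilateration": 2, "motion": 1}

def select_segment(reports):
    """Pick the azimuth of the most relevant report, or None if empty.

    Each report is a dict: {"source": ..., "azimuth_deg": ..., "speed": ...}.
    Higher source priority wins; ties are broken by object speed.
    """
    if not reports:
        return None
    best = max(reports, key=lambda r: (PRIORITY.get(r["source"], 0), r["speed"]))
    return best["azimuth_deg"]
```

The returned azimuth would then drive the foveal window, e.g. centering the high-resolution display on an approaching aircraft.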
  • The image processing unit is preferably equipped with a touch-sensitive input device, in particular a touch-sensitive monitor, in order to select a panorama segment as a function of a finger movement on the input device. Such a "touch-screen" monitor has the advantage of quick and easy command input by the operator.
  • The information about moving objects outside the selected panoramic segment is preferably displayed in horizontal and vertical image bars adjoining the representation of the foveal panoramic segment. The representation of the movement information should then depend on the spatial position of the objects to which it belongs. This allows the operator to keep track of objects in the vicinity of the foveal panorama that still have to be taken into account in his control task.
  • The image processing unit is preferably connected to a further display unit for displaying the panorama captured by the cameras. The entire, e.g. 180°, panoramic view is then presented (not in high resolution) on the further display unit.
  • The object is further achieved by the method having the features of claims 13 to 15 and the computer program configured for carrying out the method.
  • The method maximizes the achievable frame rate and minimizes the required transmission bandwidth for a given hardware constellation by displaying only the panorama area corresponding to the attention focus. A virtual panorama is generated in an image memory by a computer from a real-time video data stream by seamlessly sequencing the frames from the individual panoramic cameras, with selection and representation of an arbitrary region by shifting an image section through the choice of a virtual viewing position.
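The assembly of a virtual panorama and the cutting out of a virtual view can be sketched as follows; pixel rows are modeled as plain lists for brevity, and all names are illustrative.

```python
# Sketch: building a virtual panorama in an image buffer by seamless
# juxtaposition of per-camera frames, then cutting out an arbitrary section
# by choosing a virtual viewing position.

def build_panorama(frames):
    """Concatenate same-height camera frames left to right.

    frames: list of 2-D lists (rows of pixel values), one per camera.
    """
    rows = len(frames[0])
    pano = [[] for _ in range(rows)]
    for frame in frames:
        for y in range(rows):
            pano[y].extend(frame[y])
    return pano

def viewport(pano, left_px, width_px):
    """Cut a section out of the virtual panorama (the 'virtual view')."""
    return [row[left_px:left_px + width_px] for row in pano]
```

Shifting `left_px` corresponds to moving the virtual viewing position across the seamlessly joined camera frames.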
  • Data reduction is accomplished by the exclusive transmission of portions (the attention focus) of the compressed video data streams to an image processing computer. The transmission capacity gained in this way is used to dynamically adapt the compression parameters in order to optimize image quality. For example, compression can automatically be reduced if the image quality deteriorates in the dark.
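A minimal sketch of such dynamic compression adaptation follows; the luminance thresholds, step size and quality range are illustrative assumptions, not values from the patent.

```python
# Sketch: adapting a compression quality parameter to measured image quality,
# e.g. when low light degrades the picture. Higher quality = less compression.

def adapt_quality(current_q, mean_luma, q_min=30, q_max=95, step=5):
    """Raise JPEG-style quality (i.e. compress less) when the scene darkens.

    mean_luma: average luminance of the frame, 0 (black) .. 255 (white).
    Returns the new quality setting, clamped to [q_min, q_max].
    """
    if mean_luma < 60:          # dark scene: reduce compression
        current_q += step
    elif mean_luma > 120:       # bright scene: bandwidth can be saved
        current_q -= step
    return max(q_min, min(q_max, current_q))
```

Called once per frame or per interval, this spends the bandwidth freed by transmitting only the attention focus on better image quality.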
  • The invention will be explained in more detail with reference to the accompanying drawings, which show exemplary embodiments:
  • Fig. 1 -
    Block diagram of an airport traffic information display system according to the invention;
    Fig. 2 -
    Sketch of an airport traffic information display system with a display unit for displaying a foveal panoramic segment and information about moving objects in the peripheral, extra-foveal panoramic segment (which is not itself displayed), and a second display unit for displaying a zoom section of the foveal panoramic segment;
    Fig. 3 -
    Sketch of an airport traffic information display system with a display unit for displaying the foveal panoramic segment and horizontal and vertical display bars for displaying information about peripheral moving objects.
  • Fig. 1 shows a block diagram of an airport traffic information display system 1 having at least one camera 2 that can be aimed at an area of the airport to be observed. By way of example, four cameras are illustrated, each connected to an image processing computer 3a, 3b, 3c, 3d. The image processing computers 3a, 3b, 3c, 3d are provided for image data compression and real-time image processing. The panoramic cameras 2 are connected to the image processing computers 3a, 3b, 3c, 3d via fast, broadband data lines 4a, 4b, 4c, 4d. The output of each image processing computer 3a, 3b, 3c, 3d is connected via fast, broadband data lines 5a, 5b, 5c, 5d to a switch and transmitter/converter 6, via which the image data are forwarded, for example optically over a broadband fiber-optic connection 7, to a switch and receiver 8. An image processing unit in the form of a further image processing computer 9 is connected to the switch and receiver 8 and is set up to decompress the video signals of the panoramic cameras 2 according to the panoramic segment selected by an operator 10. The image processing unit 9 is further configured to display the moving-object information detected and transmitted by the image processing computers 3a, 3b, 3c, 3d through real-time image processing. The display of at least the foveal panoramic segment corresponding to the attention focus of a selected viewing direction on the observed airport area, and of the information about moving objects in the peripheral, extra-foveal panoramic segment (which is not itself displayed), takes place, for example, on the two display units 11a, 11b shown.
  • In the illustrated embodiment, a further pan-tilt-zoom camera 12 is connected via a fast, broadband data line 4e to a further image processing computer 3e, which is likewise set up for real-time image processing and for extracting information about moving objects from the recorded zoom video of a section of the observed airport area. The pan-tilt-zoom camera 12 is remotely controllable with control signals that are sent from the image processing unit 9 via a control data line 13 and the image processing computer 3e to the pan-tilt-zoom camera 12.
  • The airport traffic information display system is controlled by the operator 10 by means of keyboard 14, mouse, touch pad 15, etc.; the commands can be routed via the image processing unit 9 and the data links to the image processing computers 3a to 3d.
  • In this way, for example, camera parameters, such as aperture, focal length, orientation of the pan-tilt-zoom camera, etc. can be changed.
  • Optionally, it is also conceivable to generate control signals automatically by means of real-time processing of image and ASMGCS information by the image processing unit.
  • The illustrated airport traffic information display system 1 is thus a panorama system with a plurality of high-resolution cameras 2, which together depict the area normally monitored by direct view from the tower. The separation of cognitive processing (foveal vs. peripheral) is exploited in that, of the complete high-resolution panorama (for example a 180° covered angular range), only the segment corresponding to the attention focus of the selected (virtual) viewing direction onto the airport (<45° for a 180° panorama with four or more cameras 2) is displayed to the controller 10 at maximum resolution (e.g. 1600 x 1200 pixels per camera).
  • Useful for the manual selection and visualization of the currently selected panoramic segment is the simultaneous, permanent display of the viewing direction (the currently visualized camera segments) on a scaled-down segment toolbar, e.g. at the bottom of one of the two monitors, or using a symbolic representation separate from the screen. For quick selection of a panoramic segment of interest, manual or automatic methods can alternatively be used by the operator 10. The active, manual working mode has an override function over the automatic mode. The automatic mode selects the displayed segment based on the current traffic situation. Here, the selection is bound to the current task and is alternatively controlled by
  a) results of a parallel-running automatic image processing for motion detection;
  b) approach radar information, which is available in any case;
  c) position data of a multilateration system (transit-time measurement), if available; or
  d) satellite navigation data transmitted from the aircraft via automatic dependent surveillance broadcast (ADS-B).
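The override of the automatic mode by the active manual working mode can be sketched as follows; the class and method names are illustrative, not taken from the patent.

```python
# Sketch: manual segment selection overriding automatic, sensor-driven
# selection, with a release back to automatic mode.

class SegmentSelector:
    """Manual selection overrides automatic, sensor-driven selection."""

    def __init__(self):
        self.manual = None          # azimuth chosen by the controller, or None

    def set_manual(self, azimuth):
        """Controller actively picks a segment (manual working mode)."""
        self.manual = azimuth

    def release_manual(self):
        """Return control to the automatic mode."""
        self.manual = None

    def current(self, auto_azimuth):
        """Segment actually displayed: manual wins when present."""
        return self.manual if self.manual is not None else auto_azimuth
```

`auto_azimuth` here stands for the output of whichever source a) to d) the automatic mode is currently using.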
  • For example, when an aircraft on landing approach is detected, whose position was determined by image processing in the corresponding peripheral segment, the currently active section is aligned to that peripheral segment, and simultaneously the orientation angles (azimuth, elevation) and focus (focal length) of the pan-tilt-zoom camera 12 are automatically aligned to the fixed approach area. The region selected in the panoramic segment and enlarged by means of the separate, remotely controllable pan-tilt-zoom camera 12 is marked in the panorama with a suitable rectangular box and displayed as a zoom image on an adjacent monitor 11b, as sketched in Fig. 2.
  • Fig. 3 shows the two monitors 11a, 11b of the controller's 10 workstation, with the selected panoramic segment on the left monitor 11a and the zoom section on the right monitor 11b. At the bottom of the panoramic-segment monitor there is a narrow information image bar 16, which contains information on the position of the selected segment and on moving objects in the panoramic areas (periphery) that are not displayed. Markings in the lower bar indicate the segmentation of the entire panorama into equal sections corresponding to the orientation of, e.g., four cameras 2. The position of the selected segment is marked by a colored (e.g. red) bar. The selected segment is generally composed of portions from two adjacent panoramic cameras. A small, lighter (e.g. yellow) rectangle within this mark indicates the zoom range shown on the right monitor 11b. The zoom range can also be moved independently into peripheral panoramic segments that are not currently visualized; a marker, in the example shown the rectangle in the lower information image bar, then indicates its position in the overall panorama. Another symbol, a red cross with an arrow sketched in the right-hand area of the information image bar 16, indicates a detected moving object and its direction of movement. The length of the arrow can be chosen as a measure of the speed. Different symbols, e.g. coupled to the segment position, can be selected for the different phases of landing or take-off (taking off/landing on runways, rolling on a taxiway, crossing of taxiways, entering the apron area, etc.). The information about the detected moving object can come from different sources, such as the real-time motion-detection image processing implemented in parallel with the decompression algorithms in the image processing computers 3a to 3e. Also conceivable is the extraction of such information by multilateration systems or satellite navigation with on-board data links to the ground (or vehicle control center).
Passive, immobile objects detected by the image processing, e.g. new obstacles, can likewise be indicated by a further icon.
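The placement of a peripheral-object symbol in the information bar, with the arrow length chosen as a measure of speed, can be sketched as follows; the bar width, scaling constants and names are illustrative assumptions.

```python
# Sketch: rendering peripheral moving-object cues in the lower information
# image bar: symbol position from azimuth, arrow length from speed.

def bar_marker(azimuth_deg, bar_px=1600, pano_deg=180.0):
    """Horizontal pixel position of an object's symbol in the info bar."""
    return int(azimuth_deg / pano_deg * bar_px)

def arrow_length(speed_mps, px_per_mps=0.5, max_px=60):
    """Arrow length in pixels, proportional to speed but capped."""
    return min(int(speed_mps * px_per_mps), max_px)
```

An object at azimuth 90° with a ground speed of 40 m/s would thus get a symbol at the middle of the bar with a 20-pixel arrow.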
  • Alternatively and in addition to the representation of the peripheral information on the lower information image bar 16, similar narrow information image bars 17a, 17b can be displayed at the vertical display margins. This is possible when using high-resolution displays in HDTV format (1920 x 1080 pixels) without significantly reducing the resolution set by the cameras 2 (1600 x 1200 pixels). The simultaneous representation of a detected moving object in the peripheral area on one of the vertical information image bars 17a, 17b can then be used to represent the height above the airport surface and, where appropriate, the identification of the flying object (call sign). For the quantitative graphic display of the altitude, knowledge of the distance of the object in the direction of the respective panoramic segment is required, which is taken from any available satellite navigation data (transmitted via ADS-B) or multilateration data (transit-time measurement). The graphical altitude display then requires the insertion of a reference line in the side bar, corresponding to the base point vertically below the aircraft. Arrows assigned to the symbols (e.g. cross for landing or star for take-off) in the vertical information bars indicate descending or ascending flight by a downward or upward arrow direction. The arrow lengths again represent speeds; if these are not known, this is indicated on the arrow by a marking (e.g. a dash). This variant is shown in Fig. 2. In addition to the symbols in the lateral information image bars 17a, 17b for peripheral objects, the call signs of the aircraft are displayed, which are supplied together with the position information by ASMGCS sensors (secondary radar, multilateration system, satellite navigation system).
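One way to realize the quantitative altitude display from height and distance is via the elevation angle seen from the camera position; this is a sketch under assumed bar geometry and field of view, none of which are specified in the patent.

```python
# Sketch: placing an aircraft symbol on a vertical information bar using its
# height above the airport surface and its distance (from ADS-B or
# multilateration data), via the elevation angle.

import math

def bar_position(height_m, distance_m, bar_height_px=1080, fov_vert_deg=30.0):
    """Vertical pixel offset above the reference line for the symbol.

    The reference line corresponds to the base point vertically below the
    aircraft (elevation 0); the symbol is clipped to the top of the bar.
    """
    elev_deg = math.degrees(math.atan2(height_m, distance_m))
    frac = min(elev_deg / fov_vert_deg, 1.0)
    return int(frac * bar_height_px)
```

This makes explicit why the distance is needed: height alone does not determine where the symbol should sit relative to the reference line.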
As input devices for the interaction of the controller 10 with the panorama system, mouse and keyboard 14 are available in the simplest case, in order, among other things, to select the currently required panoramic segment on the lower information image bar 16, e.g. by clicking or pressing an arrow key.
  • An alternative input device is based on a touch pad 15 or touch screen. The selected panoramic segment is moved by finger movement on the touch pad/touch screen and displayed accordingly. The zoom control is performed with three fingers as follows:
    • moving the fingers together in one direction moves the zoom camera to a new position;
    • pulling the fingertips apart or pushing them together controls the enlargement or reduction of the zoom section, whose correspondingly shifting marking elements are simultaneously displayed in the panoramic display.
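The two three-finger gestures can be distinguished by comparing the fingertip centroid motion against their spread, as sketched below; the threshold value and helper names are illustrative assumptions.

```python
# Sketch: interpreting the described three-finger gestures. A joint shift of
# all fingertips repositions the zoom camera ("pan"); spreading or pinching
# the fingertips changes the zoom factor ("zoom").

def classify_gesture(start_pts, end_pts, move_thresh=20.0):
    """start_pts/end_pts: three (x, y) fingertip positions before and after.

    Returns ("pan", (dx, dy)) or ("zoom", scale).
    """
    def centroid(pts):
        return (sum(p[0] for p in pts) / 3.0, sum(p[1] for p in pts) / 3.0)

    def spread(pts):
        cx, cy = centroid(pts)
        return sum(((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 for x, y in pts) / 3.0

    (sx, sy), (ex, ey) = centroid(start_pts), centroid(end_pts)
    dx, dy = ex - sx, ey - sy
    if (dx * dx + dy * dy) ** 0.5 > move_thresh:
        return "pan", (dx, dy)                      # move zoom camera
    return "zoom", spread(end_pts) / spread(start_pts)  # enlarge/reduce
```

The returned pan offset or zoom scale would then also shift or resize the marking rectangle shown in the panoramic display.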
  • In an optional implementation, the segment selection may be based on the signals of a brain-computer interface, which, by evaluating EEG signals, converts the mental images "move right arm" / "move left arm" into a corresponding shift of the red segment bar in the lower information image bar 16 and thus of the visualized area. Furthermore, segment control by means of eye movement registration is possible: focusing on a specific point on the segment display bar causes the center of the displayed panorama section to jump to that point. Alternatively, the current segment position may move to the right or to the left if the line of sight (direction of attention) is oriented correspondingly to the right or left.
  • In an optional embodiment (not shown), instead of the complete panoramic camera system consisting of several cameras 2, only a single high-resolution camera 2 mounted on a turntable is used in addition to the pan-tilt-zoom camera 12; it is aligned in each case to the panoramic segment corresponding to the attention focus. The manual or automatic selection of the segment to be displayed in this case controls the horizontal camera position (azimuth angle about the vertical axis of rotation) via the orientation of the turntable. The rotational speed should correspond to that of an operator's 10 head when he turns his attention to a new segment in the real tower. The separately controllable pan-tilt-zoom camera 12, which in contrast to the panoramic camera 2 can also be tilted vertically about a horizontal axis of rotation, can optionally be coupled to the panoramic camera 2 with respect to the horizontal coarse alignment. Since parallel motion detection by means of real-time image processing is not available for peripheral areas outside the single panoramic camera's field of view, information about peripheral moving objects can in this case only be delivered via ASMGCS sensors (ground radar, multilateration, satellite navigation with ADS-B on-board data link) and displayed in the information bars 16, 17 at the monitor edges.
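Limiting the turntable slew rate to that of a head turn can be sketched as follows; the assumed rate of 100°/s and the function names are illustrative, not values from the patent.

```python
# Sketch: slewing the single turntable-mounted camera toward a newly selected
# segment at a rotation rate matching a controller's head turn.

HEAD_TURN_DEG_PER_S = 100.0   # assumed head-turn rate, illustrative only

def slew(current_deg, target_deg, dt_s, rate=HEAD_TURN_DEG_PER_S):
    """Move the turntable azimuth toward target, limited to the given rate.

    dt_s: elapsed time since the last control step, in seconds.
    """
    max_step = rate * dt_s
    delta = target_deg - current_deg
    if abs(delta) <= max_step:
        return target_deg
    return current_deg + max_step * (1 if delta > 0 else -1)
```

Called once per control cycle, this converges on the newly selected segment without an abrupt jump of the displayed image.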
  • This airport traffic information display system 1 can be further extended by adding inexpensive standard video cameras to provide a complete panorama at only normal resolution (e.g. 768 x 576 pixels). The standard video cameras serve not for visualization, but only for automatic motion detection using real-time image processing, as in the embodiment described with reference to Fig. 1. The high-resolution camera 2 for visualization, together with the non-visualized, low-resolution background images, corresponds to the human visual system with its high-resolution fovea and low-resolution peripheral retinal area. The realization is also conceivable with only a single standard camera for the peripheral information, which must then be equipped with a fisheye lens for a viewing angle of, e.g., 180°.
  • Claims (16)

    1. Airport traffic information display system (1) having at least one camera (2) which can be aligned with an observable airport area, an image processing unit (9) connected to the at least one camera (2) and at least one display unit (11) connected to the image processing unit (9), characterized in that the image processing unit (9) is arranged to select and display a foveal panoramic segment corresponding to the attention focus of a selected line of sight on the observed airport area and to simultaneously display information about moving objects in the peripheral, extra-foveal panoramic segment, which is not displayed.
    2. The airport traffic information display system (1) according to claim 1, characterized by a remotely controllable pan-tilt-zoom camera (12) adapted to select a zoom section of the panoramic segment in order to display the zoom section on one of the display units (11).
    3. Airport traffic information display system (1) according to claim 1 or 2, characterized in that the image processing unit (9) is directly connected to at least one camera (2).
    4. Airport traffic information display system (1) according to claim 1 or 2, characterized in that the image processing unit (9) is indirectly connected to at least one camera (2).
    5. Airport traffic information display system (1) according to any one of the preceding claims, characterized in that four cameras (2) are provided for recording a high-resolution panorama of preferably 180°.
    6. Airport traffic information display system (1) according to one of the preceding claims, characterized by at least one sensor connected to the image processing unit (9) for detecting the foveal attention sector of an operator, wherein the image processing unit (9) is arranged to select the foveal panoramic segment in dependence on the foveal attention sector.
    7. Airport traffic information display system (1) according to any one of the preceding claims, characterized in that the foveal panoramic segment is smaller than 45° of the panorama, preferably 180°, detectable by the cameras (2).
    8. Airport traffic information display system (1) according to any one of the preceding claims, characterized in that the image processing unit (9) is adapted for the simultaneous permanent display of the selected panoramic segment.
    9. The airport traffic information display system (1) according to any one of the preceding claims, characterized in that the image processing unit (9) is set up to automatically select the foveal panoramic segment in response to characteristic movements detected in the captured panorama, flight movement information from an approach radar, position information about aircraft and/or vehicles in the observed airport area from a multilateration device, and/or satellite navigation position information transmitted by aircraft.
    10. Airport traffic information display system (1) according to any one of the preceding claims, characterized in that the image processing unit (9) is connected to a touch-sensitive input device, in particular a touch-sensitive monitor, for selecting a panoramic segment in response to a finger or pen movement on the input device.
    11. Airport traffic information display system (1) according to one of the preceding claims, characterized in that the image processing unit (9) is set up to display the information on moving objects in image bars adjoining the representation of the panoramic segment horizontally and vertically.
    12. Airport traffic information display system according to one of the preceding claims, characterized in that the image processing unit (9) is connected to a further display unit (11) for displaying the panorama detected by the cameras (2).
    13. Method for displaying an observable airport area on a display unit, characterized by:
      - recording a real-time video data stream of at least one area of an airport,
      - selecting and displaying a foveal panoramic segment corresponding to the attention focus of a selected line of sight on the observed airport area, and
      - simultaneously displaying information about moving objects in the peripheral, non-foveal panoramic segment that is not displayed.
    14. The method of claim 13, characterized by generating a virtual panorama of an airport area from a real-time video data stream by juxtaposing images from individual panoramic cameras and selecting and displaying the foveal panoramic segment by shifting an image section of the virtual panorama by selecting a virtual viewing position.
    15. A method according to claim 13 or 14, characterized by continuously shifting a selected and displayed high-resolution panoramic segment over a panorama captured by a number of cameras, so that the panoramic segment contains portions of the video sequences of adjacent cameras.
    16. Computer program with program code means for carrying out the method according to one of claims 13 to 15, when the computer program is executed on a computer.
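The virtual-panorama generation and foveal-segment selection of claims 13 to 15 can be sketched as follows. This is a minimal illustration only, not the patented implementation: the function names, the four equal-width cameras, the 40° segment width, and the NumPy-based concatenation are assumptions made for the sketch.

```python
import numpy as np

def build_virtual_panorama(frames):
    """Juxtapose same-height frames from adjacent panorama cameras
    (claim 14) by simple horizontal concatenation; lens distortion
    and overlap blending are ignored for brevity."""
    return np.concatenate(frames, axis=1)

def select_foveal_segment(panorama, view_center_deg,
                          segment_deg=40.0, panorama_deg=180.0):
    """Cut the foveal segment around a virtual viewing direction
    (claims 7 and 13 to 15).  Because the panorama is already
    stitched, the segment may span parts of several cameras."""
    width = panorama.shape[1]
    px_per_deg = width / panorama_deg
    seg_px = int(segment_deg * px_per_deg)
    center_px = int(view_center_deg * px_per_deg)
    # Clamp the window so it stays inside the panorama.
    left = max(0, min(width - seg_px, center_px - seg_px // 2))
    return panorama[:, left:left + seg_px]

# Four cameras covering 45 degrees each (claim 5): 180 degrees total.
# Each synthetic frame is filled with its camera index for inspection.
frames = [np.full((480, 640, 3), i, dtype=np.uint8) for i in range(4)]
pano = build_virtual_panorama(frames)            # 480 x 2560 panorama
fovea = select_foveal_segment(pano, view_center_deg=90.0)
```

Shifting `view_center_deg` continuously corresponds to the sliding segment of claim 15: around a 90° viewing direction the extracted window already contains pixels from two adjacent cameras.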
    EP20070024748 2006-12-20 2007-12-20 Airport traffic information display system Active EP1936583B1 (en)

    Priority Applications (1)

    Application Number Priority Date Filing Date Title
    DE200610060904 DE102006060904B4 (en) 2006-12-20 2006-12-20 Airport traffic information display system

    Publications (2)

    Publication Number Publication Date
    EP1936583A1 true EP1936583A1 (en) 2008-06-25
    EP1936583B1 EP1936583B1 (en) 2012-01-18

    Family

    ID=39271425

    Family Applications (1)

    Application Number Title Priority Date Filing Date
    EP20070024748 Active EP1936583B1 (en) 2006-12-20 2007-12-20 Airport traffic information display system

    Country Status (4)

    Country Link
    EP (1) EP1936583B1 (en)
    AT (1) AT542208T (en)
    DE (1) DE102006060904B4 (en)
    ES (1) ES2379644T3 (en)

    Families Citing this family (1)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    DE102011084524A1 (en) 2011-10-14 2013-04-18 Robert Bosch Gmbh Method for displaying a vehicle environment

    Citations (6)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    US5659475A (en) * 1994-03-17 1997-08-19 Brown; Daniel M. Electronic air traffic control system for use in airport towers
    EP1170715A2 (en) * 2000-07-04 2002-01-09 H.A.N.D. GmbH Method for surface surveillance
    US6628321B1 (en) * 1998-12-30 2003-09-30 Honeywell, Inc. Method and apparatus for simulating views from a window
    US20050231419A1 (en) * 2004-04-15 2005-10-20 Lockheed Martin Ms2 Augmented reality traffic control center
    DE102005005879A1 (en) 2005-02-09 2006-08-17 Schill + Seilacher "Struktol" Ag Nitrogen-containing bridged derivatives of 6H-dibenz [c, e] [1,2] oxaphosphorine-6-oxides, processes for their preparation and their use
    EP1791364A1 (en) * 2005-11-23 2007-05-30 Deutsche Forschungsanstalt für Luft- und Raumfahrt e.V. Air traffic guidance device

    Family Cites Families (1)

    * Cited by examiner, † Cited by third party
    Publication number Priority date Publication date Assignee Title
    US7889232B2 (en) * 2004-06-22 2011-02-15 Stratech Systems Limited Method and system for surveillance of vessels

    Non-Patent Citations (2)

    * Cited by examiner, † Cited by third party
    Title
    C.D. WICKENS: "Multiple Resources and Performance Prediction", THEOR. ISSUES IN ERGON. SCI., vol. 3, no. 2, 2002, pages 159 - 177
    N. FÜRSTENAU: "Perspectives of Virtual Reality for Integration", 12TH SCIENTIFIC SEMINAR, DLR, INST. OF FLIGHT GUIDANCE, October 2002 (2002-10-01)

    Also Published As

    Publication number Publication date
    AT542208T (en) 2012-02-15
    ES2379644T3 (en) 2012-04-30
    DE102006060904A1 (en) 2008-06-26
    EP1936583B1 (en) 2012-01-18
    DE102006060904B4 (en) 2011-05-12

    Similar Documents

    Publication Publication Date Title
    US7230632B2 (en) Airport display method including changing zoom scales
    US7769475B2 (en) Apparatus for automatically introducing celestial object, terminal device and control system for astronomical telescope
    EP2000778B1 (en) Airport taxiway navigation system
    US8116526B2 (en) Methods and system for communication and displaying points-of-interest
    US7737867B2 (en) Multi-modal cockpit interface for improved airport surface operations
    US4805015A (en) Airborne stereoscopic imaging system
    US8040361B2 (en) Systems and methods for combining virtual and real-time physical environments
    JP4900741B2 (en) Image recognition apparatus, operation determination method, and program
    US6903752B2 (en) Method to view unseen atmospheric phenomenon using augmented reality
    US7061401B2 (en) Method and apparatus for detecting a flight obstacle
    US9253453B2 (en) Automatic video surveillance system and method
    US5182641A (en) Composite video and graphics display for camera viewing systems in robotics and teleoperation
    US8713215B2 (en) Systems and methods for image stream processing
    CN104205175B (en) Information processor, information processing system and information processing method
    US7486291B2 (en) Systems and methods using enhanced vision to provide out-the-window displays for a device
    JP2007043225A (en) Picked-up processing apparatus and picked-up processing method
    EP0537945A1 (en) Computer generated images with real view overlay
    JP2008217775A (en) Obstacle avoidance situation display generator
    CA2691375C (en) Aircraft landing assistance
    CN102343980B (en) For strengthening the method and system of the discrimination of the posture in visual pattern
    JP2017509520A (en) How to provide flight information
    US7312725B2 (en) Display system for operating a device with reduced out-the-window visibility
    WO2014077046A1 (en) Image display device and image display method, mobile body device, image display system, and computer program
    US8446509B2 (en) Methods of creating a virtual window
    US20170322551A1 (en) Systems and methods for target tracking

    Legal Events

    AX   Request for extension of the European patent to: AL BA HR MK RS
    AK   Designated contracting states (kind code of ref document: A1): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR
    17P  Request for examination filed, effective 20080930
    17Q  First examination report despatched, effective 20081112
    AKX  Designation fees paid: AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR
    R17C First examination report despatched (corrected), effective 20100126
    AK   Designated contracting states (kind code of ref document: B1): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR
    REG  GB: FG4D (not English)
    REG  CH: EP
    REG  AT: REF, ref document 542208, kind code T, effective 20120215; IE: FG4D (language of EP document: German)
    REG  DE: R096, ref document 502007009108, effective 20120315
    REG  NL: T3
    REG  ES: FG2A, ref document 2379644, kind code T3, effective 20120430
    REG  SE: TRGR
    LTIE LT: invalidation of European patent or patent extension, effective 20120118
    REG  NL: T3
    PG25 Lapsed in a contracting state because of failure to submit a translation of the description or to pay the fee within the prescribed time limit: BG 20120418, IS 20120518, LT 20120118
    REG  IE: FD4D
    PG25 Lapsed (translation/fee not submitted in time): GR 20120419, PL 20120118, FI 20120118, LV 20120118, PT 20120518
    PG25 Lapsed (translation/fee not submitted in time): CY 20120118
    PG25 Lapsed (translation/fee not submitted in time): IE 20120118, DK 20120118, SI 20120118, RO 20120118, CZ 20120118, EE 20120118
    PG25 Lapsed (translation/fee not submitted in time): SK 20120118
    26N  No opposition filed, effective 20121019
    REG  DE: R097, ref document 502007009108, effective 20121019
    BERE BE: lapsed; owner: DEUTSCHES ZENTRUM FUR LUFT- UND RAUMFAHRT E.V., effective 20121231
    REG  CH: PL
    PG25 Lapsed (non-payment of due fees): MC 20121231
    PG25 Lapsed (non-payment of due fees): BE 20121231
    PG25 Lapsed (non-payment of due fees): CH 20121231, LI 20121231
    PG25 Lapsed (translation/fee not submitted in time): MT 20120118
    PG25 Lapsed (translation/fee not submitted in time): TR 20120118
    PG25 Lapsed (non-payment of due fees): LU 20121220
    PG25 Lapsed (translation/fee not submitted in time): HU 20071220
    REG  DE: R082, ref document 502007009108, representative: GRAMM, LINS & PARTNER PATENT- UND RECHTSANWAEL, DE
    REG  FR: PLFP, year of fee payment 9
    REG  FR: PLFP, year of fee payment 10
    REG  FR: PLFP, year of fee payment 11
    PGFP Annual fee paid to national office: NL 20171212 (year 11), FR 20171120 (year 11)
    PGFP Annual fee paid to national office: AT 20171128 (year 11), SE 20171208 (year 11), IT 20171219 (year 11), GB 20171128 (year 11)
    PGFP Annual fee paid to national office: ES 20180105 (year 11)
    REG  DE: R084, ref document 502007009108
    PGFP Annual fee paid to national office: DE 20181114 (year 12)
    REG  SE: EUG
    PG25 Lapsed (non-payment of due fees): SE 20181221
    REG  NL: MM, effective 20190101
    REG  AT: MM01, ref document 542208, kind code T, effective 20181220
    GBPC GB: European patent ceased through non-payment of renewal fee, effective 20181220
    PG25 Lapsed (non-payment of due fees): NL 20190101
    PG25 Lapsed (non-payment of due fees): IT 20181220, FR 20181231
    PG25 Lapsed (non-payment of due fees): GB 20181220, AT 20181220