WO2008077105A1 - Method and apparatus to facilitate providing a synthetic view of terrain for use in a moving vehicle - Google Patents


Info

Publication number
WO2008077105A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
synthetic view
person
terrain
particular image
Prior art date
Application number
PCT/US2007/088148
Other languages
French (fr)
Inventor
Ovidiu Gabriel Vlad
Lawrence Carl Spaete, Jr.
Alfred Robert Zantow
Janusz Biegaj
Original Assignee
Embedded Control Systems
Priority date
Filing date
Publication date
Application filed by Embedded Control Systems filed Critical Embedded Control Systems
Publication of WO2008077105A1 publication Critical patent/WO2008077105A1/en


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 23/00 Combined instruments indicating more than one navigational value, e.g. for aircraft; Combined measuring devices for measuring two or more variables of movement, e.g. distance, speed or acceleration
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/04 Anti-collision systems
    • G08G 5/045 Navigation or guidance aids, e.g. determination of anti-collision manoeuvers

Definitions

  • When the aircraft and the ground below it are both roughly level, there will tend to be relatively little change in the distances between the ground points to which adjacent pixels map.
  • In that case, an average based upon using adjacent points in a same row will normally tend to be smaller than an average based on points taken from the same column for a given point of interest, and this difference can influence which scaled database to use.
  • Using a scaled database with a smaller average distance, while offering greater detail at times, can also yield increased scintillation; the opposite is also often true.
  • If desired, this process 100 can further optionally provide for determining 106 whether to process the retrieved image(s) to significantly reduce scintillation in the synthetic view that might otherwise be provided to the viewer in the vehicle. This determination can be based, for example, upon determining whether there is a large enough difference in the distance between pixels in the X and Y directions of the database content such that a different scaling of the database would be applied. If no difference in database scaling is needed, then this determination step 106 can conclude that such processing is not required and the process 100 can be diverted as appropriate to meet the needs of a given application setting.
  • Otherwise, this process 100 can then provide for processing 107 the particular image to significantly reduce scintillation in the synthetic view that is provided to the person in the vehicle.
  • By one approach, this can comprise low pass filtering the particular image as a function of a ratio of a distance between pixels in an X direction and a distance between pixels in a Y direction.
  • More particularly, this can comprise using a weighted average of the average distances associated with both the rows and columns for a given point. A ratio reflecting use of 90 percent of the row average and 10 percent of the column average works well in many application settings.
  • If desired, this processing can also comprise low pass filtering the edges of the particular image to thereby reduce informational content at those edges as versus a central portion of the particular image. This is often acceptable as the main object of interest will often be in the center of the display 203. Reducing the level of detail somewhat at the edges of the display will therefore often not produce noticeable issues with respect to resolution while also serving to reduce the amount of perceived scintillation.
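The filtering steps just described can be sketched in software for illustration. The patent's actual implementation is hardware-based and softwareless; the box-filter kernel, the way the kernel radius follows the Y-to-X spacing ratio, and the edge-band width below are all illustrative assumptions rather than the disclosed design:

```python
def blended_spacing(row_avg, col_avg, row_weight=0.9):
    # Weighted average of the row (X) and column (Y) pixel spacings,
    # using the 90 percent / 10 percent ratio described above.
    return row_weight * row_avg + (1.0 - row_weight) * col_avg

def box_filter_row(row, radius):
    # Simple 1-D box (low pass) filter with clamped edges.
    n = len(row)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def filter_image(img, x_spacing, y_spacing, edge_fraction=0.1):
    # The kernel radius grows with the ratio of Y to X pixel spacing,
    # and pixels near the left/right edges receive a second, wider
    # pass so that informational content is reduced there as versus
    # the central portion of the image.
    radius = max(1, round(y_spacing / x_spacing))
    width = len(img[0])
    edge = int(width * edge_fraction)
    out = []
    for row in img:
        filtered = box_filter_row(row, radius)
        wide = box_filter_row(filtered, 2 * radius)
        out.append([wide[i] if i < edge or i >= width - edge else filtered[i]
                    for i in range(width)])
    return out
```

A constant image passes through unchanged, while high-frequency detail (the source of perceived scintillation) is attenuated, and attenuated more strongly at the display edges.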
  • If desired, the synthetic view can comprise, at least in part, photographic imagery of the terrain (as captured, for example, by satellite and/or other aircraft).
  • In lieu thereof, or in combination therewith, the synthetic view can comprise, at least in part, elements of an aviation navigation chart.
  • Examples in this regard might include, but are not limited to, specific roads of navigational interest and value, generally denoted urban areas (using, for example, a color such as yellow), radio frequencies employed by different airports and airstrips for their air-to-ground communications, airport names and spheres of coverage, and so forth.
  • Such aviation navigation chart content can be employed alone (thereby making the synthetic view a kind of dynamic moving chart that moves in accordance with the movement of the aircraft such that the presentation of the scale, field of view, and orientation of the chart accords with the pilot's own view of the external terrain through the front windshield) or in combination with other informational content (using, for example, an overlay approach).
  • By one approach, the enabling apparatus 402 can comprise a position determination unit 403 that is configured and arranged to automatically determine a position of the moving vehicle 401 with respect to terrain past which the moving vehicle 401 is traveling and an orientation determination unit 404 that is configured and arranged to automatically determine an orientation attitude of the moving vehicle 401 with respect to the aforementioned terrain.
  • These two units 403 and 404 operably couple to a hardware-based processor which further operably couples to a display 406 (either directly or via certain databases as described below, depending upon the needs and/or opportunities as tend to characterize a given application setting).
  • In this illustrative embodiment, the processor comprises a hardware-based processor 405.
  • As used herein, this will be understood to refer to a processing platform having logic elements that are each comprised of dedicated corresponding hardware components.
  • More particularly, this reference to a hardware-based processor specifically refers to a processing platform that lacks executable program instructions (where the latter are understood to comprise software-based instructions as versus hard-wired components).
  • This approach, though counterintuitive to many, has been determined by the applicant to provide a number of advantages. These include, but are not necessarily limited to, simplicity and reliability in operation.
  • The challenges of designing such a platform are largely overcome in this particular instance by taking into account and relying upon the various teachings set forth herein, as these teachings greatly simplify the computational requirements of selecting and then employing high quality synthetic view images at a high refresh rate.
  • By one approach, these teachings can employ one or more databases comprising, if desired, scaled databases. Such databases are shown in FIG. 4 as a first database 407 through an Nth database 408 (where "N" will be understood to comprise an integer greater than one) that operably couple to the aforementioned hardware-based processor 405. Also if desired, this apparatus can comprise a real-time local features detector 409 and/or a TCAS receiver 410 as mentioned earlier. Such additional components can also be operably coupled to the hardware-based processor 405 to thereby further inform its functionality.
  • So configured, the hardware-based processor 405 can be configured and arranged to carry out one or more of the steps, actions, or functionality as has been set forth herein. This can specifically comprise, for example, using the position and orientation attitude information from the position determination unit 403 and the orientation determination unit 404 to determine a synthetic view (as described above) to provide to a person in the vehicle 401 via the display 406.
  • Those skilled in the art will recognize and understand that such an apparatus 402 may be comprised of a plurality of physically distinct elements as is suggested by the illustration shown in FIG. 4. It is also possible, however, to view this illustration as comprising a logical view, in which case one or more of these elements can be enabled and realized via a shared platform. It will also be understood that such a shared platform may comprise a wholly or at least partially programmable platform as are known in the art.
  • So configured, these teachings permit a very high resolution display of highly relevant external information to be presented via a synthetic display in a manner that renders identification, interpretation, and use of that information highly intuitive for the viewer.

Abstract

Specific determinations are made in a moving vehicle (901) and with respect to a person in the vehicle who has an ordinary expected gaze directionality while in the moving vehicle. These determinations can comprise automatically determining a position (101) of the moving vehicle with respect to terrain past which the moving vehicle is traveling, and automatically determining an orientation attitude (102) of the moving vehicle with respect to the terrain, and then automatically using (103) this position and orientation attitude to determine (in the absence of executable program instructions) a synthetic view to provide to the person in the vehicle. By one approach this synthetic view comprises a view of the terrain that comports with the ordinary expected gaze directionality of the person in the vehicle.

Description

METHOD AND APPARATUS TO FACILITATE PROVIDING A SYNTHETIC VIEW OF TERRAIN FOR USE IN A MOVING VEHICLE
Related Application(s)
[0001] This application claims the benefit of U.S. Provisional Application
Number 60/870,703, filed December 19, 2006, which is incorporated by reference in its entirety herein.
Technical Field
[0002] This invention relates generally to synthetic vision.
Background
[0003] Synthetic vision systems of various kinds are known in the art. Synthetic vision typically comprises a set of technologies that provide drivers of vehicles (including but not limited to aircraft) with images that assist the driver with understanding their operating environment. Such systems tend to use information regarding position and location to make selective use of stored information regarding local terrain and obstacles available to the driver via a corresponding graphic display.
[0004] Synthetic vision continues to hold great promise while also frequently falling short of hoped-for benefits, pricing, usability, and value. Problems range from issues regarding the relative utility of the information provided to annoyance with display flickering due to relatively low refresh rates (which is in turn owing, in many cases, to compromises regarding cost and complexity versus capability and performance that are made with respect to the computational platform selected to support the synthetic vision processing).
Brief Description of the Drawings
[0005] The above needs are at least partially met through provision of the method and apparatus to facilitate providing a synthetic view of terrain for use in a moving vehicle described in the following detailed description, particularly when studied in conjunction with the drawings, wherein:
[0006] FIG. 1 comprises a flow diagram as configured in accordance with various embodiments of the invention;
[0007] FIG. 2 comprises a schematic diagram as configured in accordance with various embodiments of the invention;
[0008] FIG. 3 comprises a schematic diagram as configured in accordance with various embodiments of the invention; and
[0009] FIG. 4 comprises a block diagram as configured in accordance with various embodiments of the invention.
[0010] Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions and/or relative positioning of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present invention. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present invention. It will further be appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required. It will also be understood that the terms and expressions used herein have the ordinary technical meaning as is accorded to such terms and expressions by persons skilled in the technical field as set forth above except where different specific meanings have otherwise been set forth herein.
Detailed Description
[0011] Generally speaking, pursuant to these various embodiments, specific determinations are made in a moving vehicle and with respect to a person in the vehicle who has an ordinary expected gaze directionality while in the moving vehicle. These determinations can comprise automatically determining a position of the moving vehicle with respect to terrain past which the moving vehicle is traveling and automatically determining an orientation attitude of the moving vehicle with respect to the terrain, and then automatically using this position and orientation attitude to determine (in the absence of executable program instructions) a synthetic view to provide to the person in the vehicle. By one approach this synthetic view comprises a view of the terrain that comports with the ordinary expected gaze directionality of the person in the vehicle.
[0012] By one approach, determining this synthetic view can comprise identifying a particular image that is stored in a database, wherein the database contains a variety of candidate terrain images. This can further comprise, if desired, using a plurality of databases that each contain candidate terrain images. By one approach, these databases can comprise scaled databases.
[0013] When using a stored image in this manner, these teachings will also accommodate processing the selected particular image to thereby significantly reduce scintillation in the synthetic view. Such processing can comprise, for example, low pass filtering the particular image as a function of a ratio of a distance between pixels in an X direction and a distance between pixels in a Y direction. This can also comprise further low pass filtering of the edges of the particular image to thereby reduce informational content at those edges as versus a central portion of the particular image.
[0014] These teachings will also accommodate detecting real-time local features external to the moving vehicle and using those detected local features to contribute to the content of the synthetic view. Somewhat similarly, these teachings can also accommodate receiving real-time Traffic Collision Avoidance System (TCAS) information regarding local aircraft and using that information as well to further contribute to the content of the synthetic view.
[0015] So configured, a powerful yet economic synthetic vision solution provides vehicle occupants with a clear and intuitive view of terrain features that might otherwise be obscured or unappreciated due to environmental conditions or other reasons. These teachings will support, in a softwareless operating context, a relatively high display refresh rate that avoids prior flickering issues. These teachings will also support the use of extremely high resolution images notwithstanding this high refresh rate, thereby yielding a very clear display capable of usefully providing a large quantity of fine detail of potential use to the vehicle occupants.
[0016] These and other benefits may become clearer upon making a thorough review and study of the following detailed description. Referring now to the drawings, and in particular to FIG. 1, an illustrative process that is compatible with many of these teachings will now be presented. The described process 100 can be carried out in a moving vehicle of choice including any of a wide variety of terrestrial vehicles as well as waterborne or airborne vehicles. For the purposes of illustration and with no intention of suggesting limitations in this regard, the examples provided herein will presume the vehicle to comprise an aircraft.
[0017] It will also be understood that the described process 100 can be carried out with respect to a person in the vehicle (such as a pilot, co-pilot, navigator, passenger, or the like) who has an ordinary expected gaze directionality while in the moving vehicle. In many cases this expected gaze directionality comprises a function, at least in substantial part, of the person's role while the vehicle moves. When the person comprises a pilot, for example, their expected gaze directionality will ordinarily comprise a forward-looking view out through a windshield. As used herein, this reference to an ordinary expected gaze directionality does not require that the actual person always gaze only in this particular direction, nor even that a given person will ever gaze in this particular direction; rather, this reference to an ordinary expected gaze directionality specifies an anticipated direction of view as would accord with an ordinary and typical person's likely role during movement of the vehicle.
[0018] This process 100 provides for the automatic determination 101 of a position of the moving vehicle with respect to terrain past which the moving vehicle is traveling. This can comprise, for example, determining longitude, latitude, and altitude information for the vehicle. Other substitutes for these particular metrics exist and can be used instead if desired. There are various ways by which such information can be automatically determined. By one approach, for example, this can comprise the use of a Global Positioning System (GPS) receiver to receive GPS signals to determine all three of these parameters. By another approach, or in combination therewith, this can comprise using dead reckoning techniques as are also known in the art.
[0019] This process 100 also provides for automatically determining 102 an orientation attitude of the moving vehicle with respect to the terrain. The particular parameters utilized in this regard can vary to some extent with the application setting. When the vehicle comprises an aircraft, this orientation attitude information can comprise one or more of pitch, roll, and yaw information for the vehicle. These parameters are very well known in the art and require no further explanation here.
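Paragraphs [0018] and [0019] can be illustrated with a short sketch of the six parameters and a dead-reckoning position update between GPS fixes. The class, the names, and the small-step spherical-earth formulas below are illustrative assumptions, not the disclosed (hardware-based) implementation:

```python
import math
from dataclasses import dataclass, replace

@dataclass
class VehicleState:
    # The six parameters used by the process: position 101 (latitude,
    # longitude, altitude) and orientation attitude 102 (pitch, roll, yaw).
    lat_deg: float
    lon_deg: float
    alt_m: float
    pitch_deg: float
    roll_deg: float
    yaw_deg: float

EARTH_RADIUS_M = 6_371_000.0

def dead_reckon(state, ground_speed_mps, track_deg, dt_s):
    # Advance the position estimate by dead reckoning: project the
    # distance travelled along the ground track onto north (latitude)
    # and east (longitude) components over a short time step.
    dist = ground_speed_mps * dt_s
    dlat_rad = dist * math.cos(math.radians(track_deg)) / EARTH_RADIUS_M
    dlon_rad = (dist * math.sin(math.radians(track_deg))
                / (EARTH_RADIUS_M * math.cos(math.radians(state.lat_deg))))
    return replace(state,
                   lat_deg=state.lat_deg + math.degrees(dlat_rad),
                   lon_deg=state.lon_deg + math.degrees(dlon_rad))
```

Flying due north (track 0) moves only latitude; altitude and attitude carry through unchanged until updated by their own sensors.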
[0020] This process 100 then provides for automatically using 103 this position and orientation attitude information to determine a synthetic view to provide to the person in the vehicle. By one approach, this step occurs in the absence of executable program instructions. This can be supported, for example, through use of a hardware-configured platform and a stored-image approach that avoids the need to reconstruct, generate, or create the desired synthetic view. Further elaboration on this point will appear below as appropriate.
[0021] If desired, this process 100 will further comprise detecting 104 real-time local features that are external to the moving vehicle such that these detected local features can be used to contribute to the content of the synthetic view. These local features can comprise, for example, man-made objects such as buildings, radio transmission towers, and the like as may comprise a part of the terrain past which the vehicle is moving. Local features can also comprise temporary and/or mobile features such as, for example, another aircraft or vehicle on a runway. Detection of such local features can be accomplished using any of a variety of technologies and methodologies. Examples include, but are not limited to, radar, infrared, sonar, and so forth.
[0022] Also if desired, this process 100 will also accommodate receiving 105 real-time Traffic Collision Avoidance System (TCAS) information regarding local aircraft such that this TCAS information can also be used to contribute as well to the synthetic view. TCAS receivers are known in the art and comprise a computerized avionics device that monitors the airspace around an aircraft, independent of air traffic control, and warns pilots of the presence of other aircraft that may present a threat of an airborne collision. These devices are required of all aircraft that exceed 5700 kg or that are authorized to carry more than 19 passengers.
[0023] By one approach, this synthetic view comprises a view of the terrain that comports with the ordinary expected gaze directionality of the person in the vehicle. By then providing this synthetic view via an appropriately located display, the person will have the benefit of a synthetic view of passing terrain that offers information in a highly intuitive manner as the synthetic view accords with the ordinary view of this person.
[0024] There are certain pieces of background information that may be useful to understand before providing further elaboration in this regard. For the sake of example and illustration and not by way of limitation, it will be presumed that the vehicle's display always scans from the top left pixel going across the upper row of pixels and then proceeds with the next row of pixels down and that there is no interlacing used with the display. Additionally, there are 1920 pixels in the X direction and 1200 pixels in the Y direction of the display.
[0025] In many application settings it can be assumed that there are several parameters that are constants depending upon the installation of the display in the aircraft. Some of these constants should be the same from one type of aircraft to the next though others may differ. An example of this would be the distance from the pilot's eye to the center of the top-most row of pixels on the screen. Referring now to FIG. 2, it may be assumed for this example that a line 201 from the pilot's eye 202 to the center pixel in the top-most row is perpendicular with the screen 203 in a case where, as shown, the intention is to present the viewer with a view of the outside world (and particularly the ground) as though the display were, in fact, transparent. In this view, θSX represents the pilot's viewing angle going across the display 203. Likewise, θSY represents the pilot's viewing angle going up and down on the display 203.
[0026] Referring now generally to FIG. 3, calculations can be made to determine, for each pixel on the display 203, a corresponding location on the ground (or in the sky). (In this particular simple illustrative example, the ground is assumed to be flat.) Using the six parameters already noted (latitude, longitude, altitude, pitch, yaw, and roll) and applicable geometry, one can readily determine such information, as will be well understood by those skilled in the art. By one approach, this can comprise mapping each pixel on the display 203 to a point on the ground (or in the sky) and then finding the appropriate display image for each particular point on the ground.

[0027] With this in mind, it may be well noted at this point that the aforementioned step of determining 103 a synthetic view to provide to the person in the vehicle can itself comprise identifying a particular image that is stored in a database. This particular image can comprise, for example, one of many candidate terrain images. By one approach, there can be a plurality of databases available for this purpose, where each of the databases has stored therein a plurality of candidate terrain images.
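While the described embodiment favors a hardware-based processor rather than executable program instructions, the underlying flat-ground geometry can be illustrated in software. The following is a minimal Python sketch of the pixel-to-ground mapping; the viewing angles, the NED-style axis conventions, and the function name are illustrative assumptions rather than details taken from the disclosure.

```python
import math

def pixel_to_ground(px, py, alt_m, pitch, yaw, roll,
                    width=1920, height=1200,
                    theta_sx=math.radians(60), theta_sy=math.radians(40)):
    """Map a display pixel to a flat-ground point below the aircraft.

    Returns (north_m, east_m) offsets from the point directly beneath the
    aircraft, or None if the pixel's ray points at or above the horizon.
    Angles are radians; theta_sx/theta_sy are assumed total viewing angles
    across and down the screen (constants of the installation).
    """
    # Angular offset of this pixel from the screen center.
    ax = (px / (width - 1) - 0.5) * theta_sx      # positive to the right
    ay = (0.5 - py / (height - 1)) * theta_sy     # positive up (row 0 is top)

    # Unit ray in the aircraft body frame (x forward, y right, z down).
    x = math.cos(ay) * math.cos(ax)
    y = math.cos(ay) * math.sin(ax)
    z = -math.sin(ay)

    # Rotate body -> local level frame: roll, then pitch, then yaw.
    y, z = (y * math.cos(roll) - z * math.sin(roll),
            y * math.sin(roll) + z * math.cos(roll))
    x, z = (x * math.cos(pitch) + z * math.sin(pitch),
            -x * math.sin(pitch) + z * math.cos(pitch))
    x, y = (x * math.cos(yaw) - y * math.sin(yaw),
            x * math.sin(yaw) + y * math.cos(yaw))

    if z <= 0:                # ray does not descend toward the ground
        return None
    t = alt_m / z             # scale the ray until it reaches the ground
    return (x * t, y * t)     # (north, east) offsets in metres
```

For example, with the aircraft level at 1000 m and pitched 45 degrees nose-down, the center pixel maps to a point 1000 m ahead; a center pixel with zero pitch looks at the horizon and returns None.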
[0028] If desired, these databases can comprise so-called scaled databases. By one approach, this can comprise scaling each database to contain the same essential imagery but with only one half the resolution of the next higher-resolution database. Using this approach, one can provision as many scaled databases as may be usefully required, as the storage requirements for all of the databases together will only amount to 4/3 of the storage space of the highest-resolution database being utilized. Using scaled databases provides a number of benefits, amongst them being an ability to provide for perspective of distance in the resultant synthetic view with little or no so-called jagged lines.
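The 4/3 figure follows from a geometric series: halving the resolution in both directions gives each level one quarter the pixels of the level above it, and 1 + 1/4 + 1/16 + ... converges to 4/3. A small Python sketch of this accounting (the function name and level count are illustrative, not from the disclosure):

```python
def scaled_database_sizes(base_pixels, levels):
    """Pixel counts for a chain of scaled terrain databases.

    Each level holds 1/4 the pixels of the level above it (half the
    resolution in X and half in Y), so the total storage for the whole
    chain stays below 4/3 of the base database's size.
    """
    sizes = [base_pixels // (4 ** i) for i in range(levels)]
    return sizes, sum(sizes)

# Example: a 16-megapixel base database with eight scaled levels.
sizes, total = scaled_database_sizes(1 << 24, 8)
```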
[0029] Viewed another way, these different databases contain information on how to display the desired terrain view when looking at that location from different distances. In other words, a pixel on the display that is to represent a location in the database that is very close to the current position would represent a very small area of land on the Earth. However, when that same location on the Earth is displayed in a single pixel from a distance that is very far away then this same pixel will represent a much greater area of the planet's surface. Consequently, a different database can be used where the averaging of the different pixels for the desired location has already taken place offline.
[0030] Such an approach saves on the number of computations that must take place in real-time.
[0031] By one approach, one can determine the distance between adjacent points by averaging with respect to points in a row on the display 203. As both the aircraft and the ground below will generally tend to be level, there will tend to be relatively small changes in the distances between these points. It may also be noted that an average based upon adjacent points in a same row will normally tend to be smaller than an average based on points taken from the same column for a given point of interest. As the average distance changes in the row versus the column, this can influence which scaled database to use. It may be noted, however, that using a scaled database with a smaller average distance, while offering greater detail at times, can also yield increased scintillation. The opposite is also often true.
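One illustrative reading of this selection step, sketched in Python, follows; the log-ratio heuristic and the base texel spacing parameter are assumptions for illustration rather than a disclosed algorithm.

```python
import math

def pick_scale_level(row_points, base_spacing_m, num_levels):
    """Choose which scaled database to sample from.

    row_points: ground points (north_m, east_m) mapped from one display row.
    base_spacing_m: ground distance between adjacent texels in the
    highest-resolution database (an assumed installation constant).
    Returns a level index: 0 = full resolution; each step halves resolution.
    """
    # Average ground distance between horizontally adjacent pixels.
    dists = [math.dist(a, b) for a, b in zip(row_points, row_points[1:])]
    avg = sum(dists) / len(dists)
    # Each level doubles the texel spacing, so the level follows from
    # the base-2 log of the spacing ratio, clamped to the available range.
    level = int(math.log2(max(avg / base_spacing_m, 1.0)))
    return min(max(level, 0), num_levels - 1)
```

With pixels 4 m apart on the ground and 1 m texels at full resolution, this picks level 2 (texels 4 m apart, pre-averaged offline); spacing finer than the base texel keeps level 0.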
[0032] Referring again to FIG. 1, when using images as suggested earlier as are retrieved from one or more databases (and particularly when working with scaled databases), this process 100 can further optionally provide for determining 106 whether to process the retrieved image(s) to significantly reduce scintillation in the synthetic view that might otherwise be provided to the viewer in the vehicle. This determination can be based, for example, upon determining whether there is a large enough difference in the distance between pixels in the X and Y directions of the database content such that a different scaling of the database would be applied. If no difference in database scaling is needed, then this determination step 106 can conclude that such processing is not required and the process 100 can be diverted as appropriate to meet the needs of a given application setting.
[0033] When this determination is positive, however, this process 100 can then provide for processing 107 the particular image to significantly reduce scintillation in the synthetic view that is provided to the person in the vehicle. By one approach, this can comprise low pass filtering the particular image as a function of a ratio of a distance between pixels in an X direction and a distance between pixels in a Y direction. For example, this can comprise using a weighted average of the average distances associated with both the row and the column for a given point. A ratio reflecting use of 90 percent of the row average and 10 percent of the column average works well in many application settings.
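A software sketch of the suggested 90/10 weighting might look as follows; the simple box filter and the mapping from blended distance to kernel width are illustrative assumptions, not details from the disclosure.

```python
def filtered_scanline(samples, row_avg_m, col_avg_m, texel_m):
    """Low-pass filter one scanline of retrieved terrain texels.

    The filter strength is set from a 90/10 weighted blend of the average
    ground distances between adjacent pixels along the row and down the
    column (the weighting suggested in the text); the box-filter half-width
    grows as that blended distance exceeds the database texel spacing.
    """
    blended = 0.9 * row_avg_m + 0.1 * col_avg_m
    half = max(int(blended / texel_m) // 2, 0)   # box-filter half-width
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        window = samples[lo:hi]
        out.append(sum(window) / len(window))
    return out
```

When the blended distance is at or below the texel spacing the scanline passes through unchanged; as it grows, neighboring texels are averaged, suppressing the frame-to-frame shimmer that causes scintillation.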
[0034] By another approach, applied alone or in conjunction with the foregoing, this processing can also comprise low pass filtering the edges of the particular image to thereby reduce informational content at those edges as versus a central portion of the particular image. This is often acceptable as the main object of interest will often be in the center of the display 203. Reducing the level of detail somewhat at the edges of the display will therefore often not produce noticeable issues with respect to resolution while also serving to reduce the amount of perceived scintillation.

[0035] Those skilled in the art will recognize and appreciate that these teachings are highly flexible and readily scaled to accommodate a wealth of informational content. For example, by one approach, the synthetic view can comprise, at least in part, photographic imagery of the terrain (as captured, for example, by satellite and/or other aircraft). Such photographic content can be used alone or in combination with other imagery of choice. As another example in this regard, if desired, the synthetic view can comprise, at least in part, elements of an aviation navigation chart. Examples in this regard might include, but are not limited to, specific roads of navigational interest and value, generally denoted urban areas (using, for example, a color such as yellow), radio frequencies employed by different airports and airstrips for their air-to-ground communications, airport names and spheres of coverage, and so forth. Again, such aviation navigation chart content can be employed alone (thereby making the synthetic view a kind of dynamic moving chart that moves in accordance with the movement of the aircraft such that the presentation of the scale, field of view, and orientation of the chart accords with the pilot's own view of the external terrain through the front windshield) or in combination with other informational content (using, for example, an overlay approach).
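The edge treatment can likewise be pictured as a filter whose strength ramps up away from the display center. In the Python sketch below, the linear ramp and the `max_half` constant are illustrative assumptions; the disclosure specifies only that detail is reduced at the edges relative to the center.

```python
def edge_softened_row(samples, max_half=3):
    """Apply stronger low-pass filtering toward the ends of a display row.

    The center of the row (where the main object of interest usually sits)
    is left at full detail, and the box-filter half-width ramps up linearly
    toward either edge, trading resolution there for reduced scintillation.
    """
    n = len(samples)
    out = []
    for i in range(n):
        # 0 at the center of the row, 1 at either edge.
        edge_frac = abs(i - (n - 1) / 2) / ((n - 1) / 2)
        half = int(round(edge_frac * max_half))
        lo, hi = max(0, i - half), min(n, i + half + 1)
        window = samples[lo:hi]
        out.append(sum(window) / len(window))
    return out
```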
[0036] Those skilled in the art will appreciate that the above-described processes are readily enabled using any of a wide variety of available and/or readily configured platforms, including partially or wholly programmable platforms as are known in the art or dedicated purpose platforms as may be desired for some applications. Referring now to FIG. 4, an illustrative approach to such a platform will now be provided.
[0037] As noted earlier, these teachings may be employed in conjunction with a moving vehicle 401 of choice. The enabling apparatus 402 can comprise a position determination unit 403 that is configured and arranged to automatically determine a position of the moving vehicle 401 with respect to terrain past which the moving vehicle 401 is traveling and an orientation determination unit 404 that is configured and arranged to automatically determine an orientation attitude of the moving vehicle 401 with respect to the aforementioned terrain. These two units 403 and 404 operably couple to a hardware-based processor which further operably couples to a display 406 (either directly or via certain databases as described below, depending upon the needs and/or opportunities as tend to characterize a given application setting).
[0038] As noted, in this illustrative embodiment the processor comprises a hardware-based processor 405. As used herein, this will be understood to refer to a processing platform having logic elements that are each comprised of dedicated corresponding hardware components. In particular, it will be understood that this reference to a hardware-based processor specifically refers to a processing platform that lacks executable program instructions (where the latter are understood to comprise software-based instructions as versus hard-wired components). This approach, though counterintuitive to many, has been determined by the applicant to provide a number of advantages. These include, but are not necessarily limited to, simplicity and reliability in operation. The challenges of designing such a platform are largely overcome in this particular instance by taking into account and relying upon the various teachings set forth herein, as these teachings greatly simplify the computational requirements of selecting and then employing high quality synthetic view images at a high refresh rate.
[0039] As noted earlier, these teachings can employ one or more databases comprising, if desired, scaled databases. Such databases are shown in FIG. 4 as a first database 407 through an Nth database 408 (where "N" will be understood to comprise an integer greater than one) that operably couple to the aforementioned hardware-based processor 405. Also if desired, this apparatus can comprise a real-time local features detector 409 and/or a TCAS receiver 410 as mentioned earlier. Such additional components can also be operably coupled to the hardware-based processor 405 to thereby further inform its functionality.
[0040] So configured and arranged, the hardware-based processor 405 can be configured and arranged to carry out one or more of the steps, actions, or functionality as has been set forth herein. This can specifically comprise, for example, using the position and orientation attitude information from the position determination unit 403 and the orientation determination unit 404 to determine a synthetic view (as described above) to provide to a person in the vehicle 401 via the display 406.

[0041] Those skilled in the art will recognize and understand that such an apparatus 402 may be comprised of a plurality of physically distinct elements as is suggested by the illustration shown in FIG. 4. It is also possible, however, to view this illustration as comprising a logical view, in which case one or more of these elements can be enabled and realized via a shared platform. It will also be understood that such a shared platform may comprise a wholly or at least partially programmable platform as are known in the art.
[0042] So configured and arranged, these teachings permit a very high resolution display of highly relevant external information to be presented via a synthetic display in a manner that renders identification, interpretation, and use of that information in a highly intuitive manner by the viewer. The real-time accord between the presentation of such information, along with the position and orientation-based presentation of this information in a manner that comports with the real world view that the observer otherwise has of the same field of view, for example, contributes significantly in this regard.
[0043] Those skilled in the art will recognize that a wide variety of modifications, alterations, and combinations can be made with respect to the above described embodiments without departing from the spirit and scope of the invention, and that such modifications, alterations, and combinations are to be viewed as being within the ambit of the inventive concept.

Claims

We claim:
1. A method comprising: in a moving vehicle and with respect to a person in the vehicle having an ordinary expected gaze directionality while in the moving vehicle: automatically determining a position of the moving vehicle with respect to terrain past which the moving vehicle is traveling; automatically determining an orientation attitude of the moving vehicle with respect to the terrain; automatically using the position and the orientation attitude to determine, in the absence of executable program instructions, a synthetic view to provide to the person in the vehicle, wherein the synthetic view comprises a view of the terrain that comports with the ordinary expected gaze directionality of the person in the vehicle.
2. The method of claim 1 wherein the position comprises longitude, latitude, and altitude information for the vehicle.
3. The method of claim 1 wherein the orientation attitude comprises pitch, roll, and yaw information for the vehicle.
4. The method of claim 1 wherein determining a synthetic view to provide to the person in the vehicle comprises identifying a particular image that is stored in a database.
5. The method of claim 4 wherein identifying a particular image that is stored in a database comprises identifying a particular image as is stored on one of a plurality of databases that each contain candidate terrain images.
6. The method of claim 5 wherein the plurality of databases comprise scaled databases.
7. The method of claim 6 further comprising: determining whether to process the particular image to significantly reduce scintillation in the synthetic view that is provided to the person in the vehicle.
8. The method of claim 7 further comprising: upon determining to process the particular image to significantly reduce scintillation in the synthetic view that is provided to the person in the vehicle, processing the particular image to significantly reduce scintillation in the synthetic view that is provided to the person in the vehicle.
9. The method of claim 8 wherein processing the particular image to significantly reduce scintillation in the synthetic view comprises: low pass filtering the particular image as a function of a ratio of a distance between pixels in an X direction and a distance between pixels in a Y direction.
10. The method of claim 9 wherein processing the particular image to significantly reduce scintillation in the synthetic view further comprises: further low pass filtering edges of the particular image to thereby reduce informational content at the edges as versus a central portion of the particular image.
11. The method of claim 1 wherein the synthetic view comprises, at least in part, an aviation navigation chart.
12. The method of claim 11 wherein the synthetic view further comprises, at least in part, photographic imagery of the terrain.
13. The method of claim 1 further comprising: detecting real-time local features external to the moving vehicle to provide detected local features information; using the detected local features to contribute to content of the synthetic view.
14. The method of claim 1 further comprising: receiving real-time Traffic Collision Avoidance System (TCAS) information regarding local aircraft; using the Traffic Collision Avoidance System (TCAS) information to contribute to content of the synthetic view.
15. An apparatus for use in a moving vehicle and with respect to a person in the vehicle having an ordinary expected gaze directionality while in the moving vehicle comprising: a position determination unit that is configured and arranged to automatically determine a position of the moving vehicle with respect to terrain past which the moving vehicle is traveling; an orientation determination unit that is configured and arranged to automatically determine an orientation attitude of the moving vehicle with respect to the terrain; a display; a hardware-based processor operably coupled to the position determination unit, the orientation determination unit, and the display, and being configured and arranged to automatically use the position and the orientation attitude to determine, in the absence of executable program instructions, a synthetic view to provide to the person in the vehicle via the display, wherein the synthetic view comprises a view of the terrain that comports with the ordinary expected gaze directionality of the person in the vehicle.
16. The apparatus of claim 15 wherein the position determination unit is configured and arranged to determine longitude, latitude, and altitude information for the vehicle.
17. The apparatus of claim 15 wherein the orientation determination unit is configured and arranged to determine pitch, roll, and yaw information for the vehicle.
18. The apparatus of claim 15 further comprising: a database that is operably coupled to the hardware-based processor and the display; and wherein the hardware-based processor is configured and arranged to determine a synthetic view to provide to the person in the vehicle by identifying a particular image that is stored in the database.
19. The apparatus of claim 18 wherein: the database comprises a plurality of databases that each contain candidate terrain images; and the hardware-based processor is further configured and arranged to identify a particular image that is stored in the database by identifying a particular image as is stored on one of the plurality of databases.
20. The apparatus of claim 19 wherein the plurality of databases comprise scaled databases.
PCT/US2007/088148 2006-12-19 2007-12-19 Method and apparatus to facilitate providing a synthetic view of terrain for use in a moving vehicle WO2008077105A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US87070306P 2006-12-19 2006-12-19
US60/870,703 2006-12-19

Publications (1)

Publication Number Publication Date
WO2008077105A1 true WO2008077105A1 (en) 2008-06-26

Family

ID=39253932

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2007/088148 WO2008077105A1 (en) 2006-12-19 2007-12-19 Method and apparatus to facilitate providing a synthetic view of terrain for use in a moving vehicle

Country Status (1)

Country Link
WO (1) WO2008077105A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6496760B1 (en) * 1999-07-21 2002-12-17 Honeywell International Inc. Flight information display with plane of flight view
US20030193411A1 (en) * 1999-04-01 2003-10-16 Price Ricardo A. Electronic flight instrument displays
EP1462767A1 (en) * 2003-02-27 2004-09-29 The Boeing Company Flight guidance system and symbology and control system providing perspective flight guidance


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHAKRABARTY J ET AL: "Multi - view synthetic vision display system for general aviation", 6 March 2004, AEROSPACE CONFERENCE, 2004. PROCEEDINGS. 2004 IEEE BIG SKY, MT, USA 6-13 MARCH 2004, PISCATAWAY, NJ, USA,IEEE, US, PAGE(S) 1618-1627, ISBN: 0-7803-8155-6, XP010748280 *

Similar Documents

Publication Publication Date Title
EP2778617B1 (en) Aircraft flight deck displays and systems and methods for enhanced display of obstacles in a combined vision display
US8160755B2 (en) Displaying air traffic symbology based on relative importance
US8406466B2 (en) Converting aircraft enhanced vision system video to simulated real time video
US8026834B2 (en) Method and system for operating a display device
US8892357B2 (en) Ground navigational display, system and method displaying buildings in three-dimensions
US8903655B2 (en) Method and system for displaying emphasized aircraft taxi landmarks
US20100253546A1 (en) Enhanced situational awareness system and method
EP3438614B1 (en) Aircraft systems and methods for adjusting a displayed sensor image field of view
EP2037216B1 (en) System and method for displaying a digital terrain
EP2613125B1 (en) System and method for indicating a perspective cockpit field-of-view on a vertical situation display
EP2317366A2 (en) System for providing a pilot of an aircraft with a visual depiction of a terrain
US10332413B2 (en) System and method for adjusting the correlation between a visual display perspective and a flight path of an aircraft
US8314719B2 (en) Method and system for managing traffic advisory information
EP1972897B1 (en) System and method for indicating the field of view of a three dimensional display on a two dimensional display
US20090322753A1 (en) Method of automatically selecting degree of zoom when switching from one map to another
US8170789B2 (en) Method for providing search area coverage information
US8649915B2 (en) Method and apparatus to facilitate providing a synthetic view of terrain for use in a moving vehicle
WO2008077105A1 (en) Method and apparatus to facilitate providing a synthetic view of terrain for use in a moving vehicle
EP3933805A1 (en) Augmented reality vision system for vehicular crew resource management
EP3748303A1 (en) Aircraft, enhanced flight vision system, and method for displaying an approaching runway area
EP2565668A1 (en) Method and apparatus for providing motion cues in compressed displays

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07865866

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: "NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC", F1205A

122 Ep: pct application non-entry in european phase

Ref document number: 07865866

Country of ref document: EP

Kind code of ref document: A1