GB2254214A - Imaging systems - Google Patents
Imaging systems
- Publication number
- GB2254214A GB2254214A GB8704012A GB8704012A GB2254214A GB 2254214 A GB2254214 A GB 2254214A GB 8704012 A GB8704012 A GB 8704012A GB 8704012 A GB8704012 A GB 8704012A GB 2254214 A GB2254214 A GB 2254214A
- Authority
- GB
- United Kingdom
- Prior art keywords
- sensor
- imaging
- image
- aircraft
- imaging system
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
Description
Imaging System

This invention relates to an imaging system and in particular to airborne imaging systems for discriminating between objects of different sizes and shapes seen at different ranges.
Current imaging algorithms, particularly for use in air-to-air and air-to-surface scenarios, make use of the information present in a discrete series of images produced by, for instance, a forward-looking infra-red (FLIR) or TV-type imager. However, several types of object will be detected by such a system, at all ranges and orientations.
The images will be diffuse and could be confused since a cow, a house and a vehicle for instance may produce images of the same apparent size albeit at different ranges. Current algorithms are unable to distinguish between such objects. This could be disadvantageous if one particular object is being searched for or tracked or if the imaging system forms part of a navigation process.
If range information were also available to the imaging system, then objects could be discriminated according to their range and relative size within the image; accordingly the consistency of recognition performance could be greatly increased. However, directly obtaining forward range information from an aircraft is not simple and in certain circumstances active systems such as radar or laser range finders cannot be used.
According to the present invention there is provided an imaging system including; an imaging sensor for viewing a ground region and obtaining an image thereof; one or more further sensors for measuring chosen parameters related to the relative position of the imaging sensor with respect to the ground region; and means for using the output of the or each further sensor to ascribe range to portions of the image at least partly in dependence upon their location within the field of view of the imager.
The imaging system may be adapted to view a ground region from an aircraft.
Preferably, the further sensors include means to measure the ground clearance, barometric height and attitude (roll, pitch and yaw) of the aircraft. By using such data with the image data from the sensor a range may be ascribed to each portion of the image and hence to each individual object detected within the image. Objects can then be classified according to their positions and relative sizes within the image.
In a second aspect the invention provides an imaging process including the steps of obtaining an image of a ground region with an airborne imaging sensor having a field of view; measuring the height H and ground clearance h of the sensor over a period of time and comparing the values of H and h to give an indication of terrain undulations; and applying the acquired undulation data to the image from the imaging sensor in order to ascribe range to portions of the image at least partly in dependence upon their location within the field of view of the imager.
Figure 1 of the accompanying drawings shows how a knowledge of the barometric height and ground clearance of an aircraft can help to ascertain more accurately a range map of the area surveyed. A typical electro-optical (EO) sensor, which is generally an infra-red sensor, is mounted on an aircraft at a depression d from the horizontal and having a vertical field of view 2a. Figure 1a represents an idealised case where the ground is perfectly flat and the aircraft is flying straight and level at a height h. It is clear from the diagram that if an object is seen on the centre line of the image, then its horizontal range R from the aircraft is equal to h/tan d, and its slant range h/sin d. From a knowledge of the field of view 2a in the vertical direction, together with the number of TV lines in the overall image, each line may have a range ascribed to it. This will of course not be a linear relationship because of the oblique viewing angle.
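The flat-ground geometry of figure 1a can be sketched in code. The function below is an illustration only, not part of the patent: it ascribes a horizontal range to each TV line given the ground clearance h, the depression angle d of the centre line and the vertical field of view 2a. The function name and the assumption that line angles are spaced linearly across the field of view are mine; in practice the mapping depends on the sensor optics.

```python
import math

def range_per_line(h, d_deg, fov_deg, n_lines):
    """Ascribe a horizontal range to each TV line for the flat-ground
    case of figure 1a (illustrative sketch, names assumed).

    h       -- ground clearance of the sensor (metres)
    d_deg   -- depression angle of the sensor centre line (degrees)
    fov_deg -- total vertical field of view, 2a (degrees)
    n_lines -- number of TV lines in the image (must be > 1)
    """
    half_fov = fov_deg / 2.0
    ranges = []
    for line in range(n_lines):
        # Top line looks furthest (depression d - a), bottom line
        # nearest (depression d + a); spacing assumed linear.
        frac = line / (n_lines - 1)
        line_d = math.radians(d_deg - half_fov + frac * fov_deg)
        if line_d <= 0:
            ranges.append(float("inf"))  # line looks at or above horizon
        else:
            ranges.append(h / math.tan(line_d))  # R = h / tan d
    return ranges
```

The centre line reproduces the patent's R = h/tan d, and the non-linear fall-off of range with line number reflects the oblique viewing angle noted in the text.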
Clearly, the situation shown in figure 1a is simplistic and in reality flight will not be straight and level and the terrain will not be flat. An example of this is shown in figure 1b where, although the aircraft is shown as flying straight and level, the ground is seen to be undulating. At the point shown where the object is detected, the aircraft is measured as being at a height h above the ground. Using the same mathematics as for figure 1a would in this case suggest that the object is at a range R', which is obviously false. In this example therefore the barometric height, shown as H, should be used to obtain a correct value for the range R. However, if the object lies on a piece of ground that is undulating and above the base barometric level then a mere reading of the barometric height will also lead to inaccuracies in the range.
To overcome these problems the two heights which can be measured by the aircraft, i.e. the barometric height H and the ground clearance h, can be continually measured and compared. It is the comparison of the two values on a continuous basis which can separate generally the undulations of the ground from the undulations of the aircraft and hence which can be processed to obtain an estimate of the ground level of the object. From such measurements, an estimate of its true position can be made. The following table 1 gives some typical comparisons of h and H on a continuous basis and the appropriate analysis in each case:
| H | h | Diagnosis |
|---|---|---|
| Steady | Varying | Constant height, terrain undulating |
| Steady | Steady | Constant height, flat terrain |
| Varying | Steady | Varying height, terrain undulating (i.e. terrain following) |
| Varying | Varying | Varying height, varying ground clearance |

Figure 1b does of course greatly exaggerate the problem, although the problem is becoming more pronounced since modern-day aircraft can fly at lower ground clearances than before. The method of the present invention therefore provides an improved process for estimating the correct range of any object within the image, and at any point of the aircraft's flight a range map can be produced upon which individual objects may be placed. The range map need not be a real image in any sense and could be merely represented by electronically stored data within a memory. From a knowledge of the relative sizes of the objects on this range map, obviously irrelevant data can be eliminated and thus the user has a greater chance of detecting those objects in which he is interested.
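The diagnosis of table 1 can be expressed as a small classification routine. The sketch below is illustrative only: the `tol` threshold for deciding whether a channel is "steady" is an invented parameter, since the patent does not quantify steadiness.

```python
def diagnose(H_samples, h_samples, tol=5.0):
    """Classify flight/terrain state from recent barometric height (H)
    and ground clearance (h) samples, following table 1.

    A channel counts as 'steady' if its spread stays within tol metres;
    the tolerance is an assumed value, not taken from the patent.
    """
    def steady(xs):
        return max(xs) - min(xs) <= tol

    H_steady, h_steady = steady(H_samples), steady(h_samples)
    if H_steady and h_steady:
        return "Constant height, flat terrain"
    if H_steady:
        return "Constant height, terrain undulating"
    if h_steady:
        return "Varying height, terrain undulating (terrain following)"
    return "Varying height, varying ground clearance"
```

For example, a steady barometric height with a fluctuating radar altimeter reading indicates level flight over undulating ground, which is exactly the case of figure 1b.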
Embodiments of the present invention will now be described by way of example only with reference to the accompanying drawings in which:
Figure 1 shows schematically how terrain undulation can affect the perceived range of a target; Figure 2 shows schematically apparatus adapted for use in the present invention; and Figure 3 shows schematically a typical range map.
Referring to figure 2, an imaging system according to one embodiment of the present invention uses four sensing elements, each measuring a different parameter of the aircraft's flight. The scene of interest is viewed by an electro-optical (EO) sensor 1 which is affixed to an aircraft and views a field of view as shown in figure 1 to produce an image including, in diffuse form, all the possible objects within the imager's field of view. Image information from EO sensor 1 is processed by normal and well-known algorithms to yield the x, y co-ordinates 2 of all possible objects. This will of course include such things as cows, trees, etc. which may or may not be of interest. An inertial navigation (IN) system 3 of any type commonly found in an aircraft is used to obtain attitude data 4 so that at any particular time the aircraft's roll, pitch and yaw values are known. IN systems are well known and can be based on gyroscopic principles.
Height information is obtained by two sensors, a radar altimeter 5 and a barometric altimeter 6. At any point in time the radar altimeter 5 can be used to determine the ground clearance of the aircraft by means of radar pulses, and from this can be deduced the value h of figure 1. The barometric altimeter 6 measures changes in air pressure caused mainly by the varying altitude of the aircraft above a datum point at which the altimeter 6 was last set, generally the height of the runway from which the aircraft last ascended. These units are generally accurate to within about 50 metres or so and give an absolute value of height, independent of terrain fluctuations.

The varying values of h and H are filtered and smoothed respectively at units 7 and 8, which are again of known form. It is found useful to smooth the outputs of these sensors so that local anomalies such as hedges and buildings can be ignored during further processing of the data.

Three "fusion points" are shown on a time scale in figure 2. At each of these points various pieces of data are fused together in order that the final output of the system takes into account the data received from all four sensing units.

The filtered and smoothed outputs from the radar and barometric altimeters, h and H respectively, are compared on a continuous basis in a unit 9. Simple circuitry well known to those skilled in the art can be used for this purpose, and a continuous comparison of the two values can give an indication of the variation both of the ground level and of the aircraft height, and hence a closer approximation to the range of a chosen target object.

An analysis similar to that of table 1 may be used to give an indication of terrain variation, although continuous monitoring of the values of H, h and H-h provides for quantification of the process and allows for a degree of intelligence in the system, leading to the position shown in the box labelled 10 of figure 2 wherein at each point in time an estimate of the true height of the aircraft and of the terrain undulations is formed.
The values thus obtained will be dependent upon the attitude of the aircraft and hence are corrected in a geometric processor 11 with data from the IN system 3. From this process a range map 12 is derived. The geometric processor 11 essentially performs hardware matrix multiplications of an assumed perfect range map by another matrix derived from the height, attitude and altitude values from units 4 and 10 which determine the sensor's view point. It produces a third, image-size matrix 12 which is a range map relevant to the image being produced at that instant.
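One way to picture the geometric processor's output is as an image-sized matrix of ranges. The sketch below is a simplified illustration, not the patent's hardware implementation: it folds only the pitch value from the IN system into the per-row depression angle, gives every pixel in a row that row's range, and omits the roll and yaw corrections that unit 11 would also apply. The function name, sign convention and row-wise approximation are all assumptions.

```python
import math

def range_map(h_eff, d_deg, pitch_deg, fov_deg, rows, cols):
    """Build an image-sized range map (rows x cols), in the spirit of
    the matrix 12 produced by the geometric processor.

    h_eff     -- estimated terrain-relative height from unit 10 (metres)
    d_deg     -- nominal sensor depression angle (degrees)
    pitch_deg -- aircraft pitch from the IN system; assumed sign
                 convention: positive pitch adds to the depression
    fov_deg   -- vertical field of view, 2a (degrees)
    """
    half = fov_deg / 2.0
    grid = []
    for r in range(rows):
        frac = r / (rows - 1)
        line_d = math.radians(d_deg + pitch_deg - half + frac * fov_deg)
        rng = h_eff / math.tan(line_d) if line_d > 0 else float("inf")
        grid.append([rng] * cols)  # row-wise approximation: one range per row
    return grid
```

A real geometric processor would address every memory element, as the description notes, which is why the operation is computationally intensive despite being geometrically straightforward.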
Concurrently derived image information from the EO sensor is processed as described above to yield the x and y co-ordinates of all the objects within the image. The range map 12 is then overlaid with the x, y data at 13 in order that each object or potential target now has x, y and range r associated with it. A typical final range map produced in this way is shown in figure 3. This overlaid range map may be displayed to an observer but would generally be held within a memory array and only the specific object details relayed. The range map derived by unit 12 is only approximate for reasons described above and hence bands of ranges are formed as shown in figure 3 rather than a series of discrete individual ranges. However, this level of accuracy is generally sufficient to be able to discriminate between different types of target object. From the relative position and size of objects within the image, further apparatus, not forming part of the present invention but shown as block 14, can be used automatically to discriminate between different objects and to eliminate those in which the operator is not interested, depending upon the use he wishes to make of the data.
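The overlay step at 13 amounts to indexing the range map with each object's image co-ordinates. A minimal sketch, with an invented `band_width` parameter to mimic the coarse range bands of figure 3 (the patent notes only that bands rather than discrete ranges are formed):

```python
def overlay_objects(range_map, objects, band_width=250.0):
    """Attach a range and a coarse range band to each detected object.

    range_map  -- image-sized grid of ranges, indexed [row][column]
    objects    -- (x, y) pixel co-ordinates from the EO processing chain
    band_width -- illustrative band size in metres (assumed value)
    """
    tagged = []
    for x, y in objects:
        r = range_map[y][x]          # row y, column x of the range map
        band = int(r // band_width)  # quantise into a range band
        tagged.append({"x": x, "y": y, "range": r, "band": band})
    return tagged
```

Downstream apparatus such as block 14 could then reject objects whose apparent size is inconsistent with their band, which is the discrimination the description envisages.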
The three fusion blocks of the system, i.e. blocks 9, 11 and 13, all involve quite straightforward operations and can be easily designed. The geometric processor 11 operates from straightforward geometric principles but is computationally intensive since each individual memory element must be addressed. However, its function is similar to that of many computer graphics processors which are currently available and known to those in the art.
It is envisaged that eventually a terrain data base will form a standard part of the avionics on most, if not all, aircraft. This will contain in effect a digitised map of the area over which the aircraft will be flying and the details shown on the map will include contour and undulation data. Hence, the output from a digital terrain data base at a particular instant can be used to pinpoint more accurately the position of objects. Referring to figure 2, the digital output from a terrain data base could be used instead of the output from the barometric altimeter 6. Information from a terrain data base could alternatively be used in addition to the barometric altimeter 6 to provide a further level of fusion.
An alternative method of measuring undulations and contours in terrain is the use of data streaming methods, whereby the rates at which lines of the image diverge as the aircraft moves towards and over them can be used to determine slope values. Data streaming techniques currently involve extremely complicated processing arrangements and would greatly increase the complexity and cost of a data fusion system according to the invention. However, if the complexity of data streaming systems is reduced to a reasonable level then a data streaming processing unit could also be used as an additional fusing step in the present apparatus.
Claims (9)
1. An imaging system including: an imaging sensor for viewing a ground region and obtaining an image thereof; one or more further sensors for measuring chosen parameters related to the relative position of the imaging sensor with respect to the ground region; and means for using the output of the or each further sensor to ascribe range to portions of the image at least partly in dependence upon their location within the field of view of the imager.
2. An imaging system as claimed in claim 1 adapted to view the ground region from an aircraft.
3. An imaging system as claimed in claim 2 wherein the or a further sensor is a sensor for measuring the ground clearance of the aircraft.
4. An imaging system as claimed in claim 2 or claim 3 wherein the or a further sensor is a sensor for measuring the attitude of the aircraft.
5. An imaging system as claimed in any of claims 2 to 4 wherein the or a further sensor is a barometric altimeter.

6. An imaging system as claimed in any of claims 2 to 5 further including a digital terrain data base and wherein an output from the data base is usable to further ascribe range to portions of the image.

7. An imaging process including the steps of obtaining an image of a ground region with an airborne imaging sensor having a field of view; measuring the height H and ground clearance h of the sensor over a period of time and comparing the values of H and h to give an indication of terrain and flightpath undulations; and applying the acquired undulation data to the image from the imaging sensor in order to ascribe range to portions of the image at least partly in dependence upon their location within the field of view of the imager.

8. An imaging process as claimed in claim 7 including a further step of measuring the attitude of the aircraft and applying the attitude data to the image data from the imaging sensor.

9. An imaging process substantially as hereinbefore described with reference to the accompanying figures.

10. An imaging system substantially as hereinbefore described with reference to figure 2 of the accompanying drawings.
Amendments to the claims have been filed as follows:

1. An imaging system carried by an aircraft including: an imaging sensor for viewing a ground region forward of the aircraft and for obtaining an image thereof; means for determining the ground clearance, h, of the imaging sensor; means for determining changes in the altitude H of the sensor relative to a fixed datum point; first means for using outputs from the ground clearance determining means, change in altitude determining means and angle of depression of the imaging sensor to provide a value related to an estimated vertical displacement of the imaging sensor from the ground region being viewed; and second means for using at least the related value to ascribe range to portions of the image at least partly in dependence upon their location within the field of view of the imaging sensor.
2. An imaging system as claimed in claim 1 wherein the means for determining the angle of depression comprises a sensor for measuring the attitude of the aircraft.
3. An imaging system as claimed in claim 1 or claim 2 wherein the means for determining changes in altitude comprises a barometric altimeter.
4. An imaging system as claimed in any preceding claim including a digital terrain data base and wherein an output from the data base is usable to further ascribe range to portions of the image.
5. An imaging system as claimed in any preceding claim wherein the first means for using includes means for continuously comparing the output signals from the ground clearance determining means and the change in altitude determining means.
6. An imaging process including the steps of obtaining an image of a ground region with an airborne imaging sensor having a field of view; determining changes in altitude H of the sensor, relative to a fixed datum point; using the determined values and angle of depression of the sensor to obtain a value related to an estimated vertical displacement of the imaging sensor from the ground region being viewed; and using the related value to ascribe range to portions of the image at least partly in dependence upon their location within the field of view of the imaging sensor.

7. An imaging process as claimed in claim 6 including a further step of determining the attitude of the aircraft and applying the attitude data to the image data from the imaging sensor to align the image.
8. An imaging process substantially as hereinbefore described with reference to the accompanying figures.
9. An imaging system substantially as hereinbefore described with reference to Figure 2 of the accompanying drawings.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB8704012A GB2254214B (en) | 1987-02-20 | 1987-02-20 | Imaging systems |
NL8800105A NL8800105A (en) | 1987-02-20 | 1988-01-19 | IMAGING SYSTEM. |
FR8801922A FR2678127A1 (en) | 1987-02-20 | 1988-02-18 | IMAGE FORMATION SYSTEM. |
DE19883805252 DE3805252A1 (en) | 1987-02-20 | 1988-02-19 | IMAGE SYSTEM |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB8704012A GB2254214B (en) | 1987-02-20 | 1987-02-20 | Imaging systems |
Publications (2)
Publication Number | Publication Date |
---|---|
GB2254214A true GB2254214A (en) | 1992-09-30 |
GB2254214B GB2254214B (en) | 1993-03-10 |
Family
ID=10612649
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB8704012A Expired - Fee Related GB2254214B (en) | 1987-02-20 | 1987-02-20 | Imaging systems |
Country Status (4)
Country | Link |
---|---|
DE (1) | DE3805252A1 (en) |
FR (1) | FR2678127A1 (en) |
GB (1) | GB2254214B (en) |
NL (1) | NL8800105A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1599771A2 (en) * | 2003-03-02 | 2005-11-30 | Tomer Malchi | Passive target data acquisition method and system |
WO2007080589A2 (en) | 2006-01-15 | 2007-07-19 | Tomer Malchi | True azimuth and north finding method and system |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2115633A (en) * | 1982-02-22 | 1983-09-07 | Secr Defence | Low level flying aids (referring to Figure 1 of the drawings) |
-
1987
- 1987-02-20 GB GB8704012A patent/GB2254214B/en not_active Expired - Fee Related
-
1988
- 1988-01-19 NL NL8800105A patent/NL8800105A/en not_active Application Discontinuation
- 1988-02-18 FR FR8801922A patent/FR2678127A1/en not_active Withdrawn
- 1988-02-19 DE DE19883805252 patent/DE3805252A1/en not_active Withdrawn
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2115633A (en) * | 1982-02-22 | 1983-09-07 | Secr Defence | Low level flying aids (referring to Figure 1 of the drawings) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1599771A2 (en) * | 2003-03-02 | 2005-11-30 | Tomer Malchi | Passive target data acquisition method and system |
EP1599771A4 (en) * | 2003-03-02 | 2006-06-14 | Tomer Malchi | Passive target data acquisition method and system |
US7107179B2 (en) | 2003-03-02 | 2006-09-12 | Tomer Malchi | Passive target data acquisition method and system |
US7451059B2 (en) | 2003-03-02 | 2008-11-11 | Tomer Malchi | True azimuth and north finding method and system |
WO2007080589A2 (en) | 2006-01-15 | 2007-07-19 | Tomer Malchi | True azimuth and north finding method and system |
Also Published As
Publication number | Publication date |
---|---|
NL8800105A (en) | 1992-10-01 |
GB2254214B (en) | 1993-03-10 |
DE3805252A1 (en) | 1992-12-10 |
FR2678127A1 (en) | 1992-12-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2764135C (en) | Device and method for detecting a plant | |
EP0436213B1 (en) | Obstacle detection system | |
US8077913B2 (en) | Method and device for determining the actual position of a geodetic instrument | |
US11465743B2 (en) | System and method for selecting an operation mode of a mobile platform | |
JP2009294214A (en) | Method and system for navigation based on topographic structure | |
US20200393246A1 (en) | System and method for measuring a displacement of a mobile platform | |
CN116625354B (en) | High-precision topographic map generation method and system based on multi-source mapping data | |
CN112601928A (en) | Position coordinate estimation device, position coordinate estimation method, and program | |
JP7386136B2 (en) | Cloud height measurement device, measurement point determination method, and cloud type determination method | |
CN116719037A (en) | Environment sensing method and system for intelligent mowing robot | |
CN114820793A (en) | Target detection and target point positioning method and system based on unmanned aerial vehicle | |
US9885569B2 (en) | Passive altimeter | |
CN105021573A (en) | Method and device for tracking-based visibility range estimation | |
US7440591B1 (en) | Validation of terrain and obstacle databases | |
GB2041689A (en) | Vehicle movement sensing | |
US20220404170A1 (en) | Apparatus, method, and computer program for updating map | |
GB2254214A (en) | Imaging systems | |
Hebel et al. | LiDAR-supported navigation of UAVs over urban areas | |
KR100874425B1 (en) | System for measuring size of signboard and method for measuring size of signboard using the same | |
JP6867465B2 (en) | Systems and methods for selecting mobile platform operating modes | |
Valerievich et al. | Experimental assessment of the distance measurement accuracy using the active-pulse television measuring system and a digital terrain model | |
CN110617800A (en) | Emergency remote sensing monitoring method, system and storage medium based on civil aircraft | |
JP2806187B2 (en) | Absolute bearing orientation device | |
Haala et al. | Sensor fusion for airborne 3D data capture | |
CN114322943B (en) | Target distance measuring method and device based on forward-looking image of unmanned aerial vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 19930610 |