GB2041689A - Vehicle movement sensing - Google Patents

Vehicle movement sensing

Info

Publication number
GB2041689A
GB2041689A GB8002038A
Authority
GB
United Kingdom
Prior art keywords
signals
image
view
imaging means
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB8002038A
Other versions
GB2041689B (en)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Smiths Group PLC
Original Assignee
Smiths Group PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smiths Group PLC filed Critical Smiths Group PLC
Priority to GB8002038A priority Critical patent/GB2041689B/en
Publication of GB2041689A publication Critical patent/GB2041689A/en
Application granted granted Critical
Publication of GB2041689B publication Critical patent/GB2041689B/en
Expired legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01PMEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P3/00Measuring linear or angular speed; Measuring differences of linear or angular speeds
    • G01P3/64Devices characterised by the determination of the time taken to traverse a fixed distance
    • G01P3/80Devices characterised by the determination of the time taken to traverse a fixed distance using auto-correlation or cross-correlation detection means
    • G01P3/806Devices characterised by the determination of the time taken to traverse a fixed distance using auto-correlation or cross-correlation detection means in devices of the type to be classified in G01P3/68

Abstract

A system for measuring the speed, direction or distance travelled by an aircraft 1 uses a television camera 4 mounted to view the ground beneath the aircraft. The system has a store 12 which receives from the camera signals representing a small area Ao of ground within its field of view during one scan of the camera. The system also includes a correlation unit 14 which correlates the contents of the store with the signals from the camera during a later scan so as to identify the same area of ground displaced within the field of view owing to the aircraft movement. By measuring the displacement within the field of view and the distance of the aircraft from the ground, the system obtains a measure of the distance travelled relative to the ground. The speed of the aircraft is calculated from knowledge of the time between scans. <IMAGE>

Description

SPECIFICATION
Vehicle movement sensing
This invention relates to systems for providing a representation of the movement of a vehicle relative to a surface.
The invention is especially, but not exclusively, concerned with systems for measuring the speed or the distance travelled by an aircraft, such as a helicopter or fixed-wing aircraft, relative to the ground.
Whilst the speed of an aircraft relative to the air within which it is moving can be measured relatively easily, such as, for example, by the use of a pitot static probe, difficulties have been experienced in obtaining accurate measurements of aircraft speed relative to the ground.
Some aircraft, however, are now equipped with television cameras or other forms of electro-optical sensors directed to view the ground underneath the aircraft, for purposes such as observation or surveying. The output of the camera is supplied to a television screen or recorder which may be mounted in the aircraft or at a ground station.
It is an object of the present invention to overcome the above-mentioned difficulties by using information provided by such sensors to obtain an indication of aircraft movement relative to the ground.
According to one aspect of the present invention there is provided a system for providing a representation of the movement of a vehicle relative to a surface, wherein the system includes imaging means mounted with the vehicle to view said surface and thereby provide signals representative of an image of a part of said surface, means for providing a measure of the displacement, relative to the field of view of said imaging means, of a part at least of said image caused by movement of the vehicle relative to said surface, and means for providing a representation of the movement of the vehicle relative to said surface in accordance with the displacement of the part at least of said image and the distance of the vehicle from said surface.
The imaging means may be mounted on a platform that is stabilised in azimuth and attitude. The imaging means may be a television camera and the system may be arranged to measure the displacement of a part at least of the image between repeated scans of the camera. The system may include a store means that is arranged to store signals representative of the part at least of the image, and correlation means that is arranged to correlate signals stored in the store means at one time with signals from the imaging means at a later time. The correlation means may be arranged to restrict correlation of signals in the store means with substantially only those signals from the imaging means representative of areas of the surface compatible with movement of the vehicle.
According to another aspect of the present invention there is provided a method of providing a representation of the movement of a vehicle relative to a surface including the steps of deriving from imaging means mounted with said vehicle to view said surface, signals representative of an image of a part of said surface, obtaining a measure of the displacement, relative to the field of view of said means, of a part at least of said image caused by movement of the vehicle relative to said surface, and providing a representation of the movement of the vehicle relative to said surface in accordance with the displacement of the part at least of said image and the distance of the vehicle from said surface.
A system for measuring the speed and direction of an aircraft relative to the ground, in accordance with the present invention, will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 shows schematically an aircraft flying above ground;
Figures 2, 3 and 5 show the field of view of a television camera mounted in the aircraft, at different times;
Figure 4 is a schematic diagram of the system.
With reference to Figure 1, an unmanned aircraft 1 is shown as flying in a direction indicated by the arrow 2 over ground 3. The aircraft 1 has a television camera 4 mounted on a platform 5 which is stabilised in azimuth and attitude such that the camera is always directed vertically downwards at the ground 3 beneath the aircraft and such that the edges of the field of view of the camera are always aligned in the same direction regardless of the aircraft's movement. The camera 4 supplies signals via a radio data link unit 6 (Figure 4) to a television screen 7 at a ground station. The field of view during one raster scan of the camera 4 is represented by Figure 2, whereas the field of view during a subsequent raster scan - at a time t later, after the aircraft 1 has travelled a distance d relative to the ground 3 - is represented by Figure 3. The system identifies an area Ao of the ground within the field of view during one raster scan and then locates the same area Ao during the subsequent raster scan. The system measures how far this area Ao has been displaced within the field of view and, knowing the height h of the aircraft 1 above ground 3, derives an indication of the distance travelled. The aircraft's speed relative to ground 3 can then be derived from knowledge of the time interval t between raster scans.
The system will now be described in greater detail with reference to Figure 4. The television camera 4 views the ground 3 beneath the aircraft 1 and supplies video signals along line 8 to the data link unit 6 and a computing unit 10 in the form of a microprocessor. The video signals are of analogue form, giving information regarding brightness level, and including the usual line and frame sync pulses.
The computing unit 10 includes a video processor unit 11 which converts into a digital form those video signals within a small area B (as represented by the vertical strip within the unbroken lines in Figures 2 and 3) of the field of view of the camera 4. The computing unit 10 then passes those digital signals to a first set of storage locations 12. The area B is fixed with respect to the field of view of the television camera 4 and, during the first raster scan, corresponds to the area Ao (as represented by the vertical strip within the broken lines in Figure 2) of the ground 3.
Each horizontal line of the television camera raster may be regarded as divided into a number of small elements the brightness levels of which are defined by digital numbers derived by the video processor 11 in accordance with the video signals. The area B, in this example, is in the form of a narrow vertical strip having a thickness equal to several elements along a raster line, and a length less than that of the field of view. The area B may, however, take different shapes and sizes and may be oriented at any angle to the direction of travel of the aircraft. The area B may, for example, be in the form of a vertical line one raster element thick, or could be of cruciform shape. Alternatively, several discrete areas spaced over the field of view could be used.
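As an illustration of this digitising and storage step, the following sketch models one digitised raster scan as a two-dimensional array of brightness levels (rows corresponding to raster lines) and extracts the fixed vertical strip B for storage. The frame size, strip position and all function names are illustrative assumptions and are not taken from the patent.

```python
import numpy as np

def extract_area_b(frame, col_start=240, col_width=8, row_start=64, row_count=384):
    """Return the digitised brightness values of the vertical strip B:
    several elements wide and shorter than the full field of view."""
    return frame[row_start:row_start + row_count,
                 col_start:col_start + col_width].copy()

# Two successive scans: the first extraction would go to the first set of
# storage locations (12), the later one to the second set (13).
rng = np.random.default_rng(0)
first_scan = rng.integers(0, 256, size=(512, 512), dtype=np.uint8)
later_scan = rng.integers(0, 256, size=(512, 512), dtype=np.uint8)
store_12 = extract_area_b(first_scan)
store_13 = extract_area_b(later_scan)
```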
Upon a subsequent raster scanning, after time t, video signals representative of the same area B within the field of view, are supplied to a second set of storage locations 13 within the computing unit 10.
Since the aircraft 1 has moved a distance d between these raster scans, the area B in the field of view of the camera 4 now corresponds to a different area At of ground, the original area Ao having been displaced within the field of view, as shown in Figure 3.
The area Ao of ground now corresponds to an area C (as represented by the area within the unbroken lines in Figure 3) in the field of view of the camera 4.
In order to identify the location of this area C, the video signals for this second scan are correlated with the stored signals representative of the area B in the first scan. The correlation process is shown in Figure 4 as being performed by a correlation unit 14 within the computing unit 10, for ease of understanding. It will be appreciated, however, that this process would not usually be performed by a discrete unit but would be achieved by suitable programming of the computing unit 10. The correlation process will be described in greater detail hereinafter.
When a match (or the best match possible) has been found, the co-ordinates (as defined by the line and frame sync pulses) of the area C are used in a computation process for deriving the separation S between the two areas B and C within the field of view, together with the direction in which C is displaced relative to B. Again, this computation process is shown in Figure 4 as being carried out by a discrete computation unit 15 although in reality it would be carried out as a step in the programmed operation of the computing unit 10. The computing unit 10 also receives signals, on line 20 from the aircraft's radar altimeter 21, representative of the height h of the aircraft above ground. Since both S and h are known, the distance d travelled by the aircraft 1 between raster scans can be calculated. With knowledge of the time interval t between the scans, the longitudinal and lateral components V0 and Va respectively of the speed of the aircraft 1 relative to the ground 3 can also be calculated. Signals representative of the speed components V0 and Va are supplied via line 30 to on-board equipment 31 such as a navigation or flight control system; the signals could also be supplied to the ground station via a radio data link.
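The computation attributed here to unit 15 might be sketched as follows, assuming a simple pinhole camera pointed vertically downwards so that one picture element corresponds to a ground distance proportional to the height h. The field-of-view figure, the assignment of V0 to the along-line axis, and all names are illustrative assumptions rather than details given in the patent.

```python
import math

def movement_from_displacement(dx_px, dy_px, height_m, fov_deg, frame_width_px, t_s):
    """Derive S, d and the speed components from the measured displacement of
    area C relative to area B (dx_px, dy_px in picture elements), the height h
    from the radar altimeter and the interval t between raster scans."""
    metres_per_px = 2.0 * height_m * math.tan(math.radians(fov_deg) / 2.0) / frame_width_px
    s_px = math.hypot(dx_px, dy_px)          # separation S in the image
    d_m = s_px * metres_per_px               # distance d travelled between scans
    v_long = dx_px * metres_per_px / t_s     # component taken here as "V0"
    v_lat = dy_px * metres_per_px / t_s      # component taken here as "Va"
    direction_deg = math.degrees(math.atan2(dy_px, dx_px))
    return d_m, v_long, v_lat, direction_deg

# Example: C displaced 40 elements along and 10 across the raster lines,
# h = 100 m, 30 degree field of view, 512-element lines, t = 0.5 s.
print(movement_from_displacement(40, 10, 100.0, 30.0, 512, 0.5))
```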
After completion of the process of correlation of the signals from the television camera 4 with the signals stored in the first set of storage locations 12 another correlation is made. This subsequent correlation is of the signals from the television camera 4 at a time 2t after the first scan, with the signals stored in the second set of storage locations 13 (after the time t).
Signals from the video processor 11 during succeeding scans, at intervals of t seconds, are supplied again to the first and second sets of storage locations 12 and 13 in rotation, thereby erasing their previous contents.
In order to reduce the time taken by the correlation process of identifying the area C, the search of the raster scan is restricted to only that area H (Figure 3) in which the area C is likely to occur. The area H is larger than the areas B and C, and its disposition within the field of view is determined in accordance with a rough measure of the aircraft's speed and direction of travel as calculated from previous scans or by other methods. Additionally, signals V' on line 40 from an accelerometer 41 may be supplied to the computing unit 10 so as to improve the location of the area H by providing compensation for change in speed of the aircraft 1 between raster scans. The computing unit 10 may also be supplied with signals representative of the aircraft's heading, on line 50 from the aircraft compass 51.
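One way of placing the search area H, consistent with the description above, is sketched below: the window is centred where area C is expected from the displacement measured on earlier scans, shifted by a correction derived from the accelerometer signal V', and padded by a margin. The margin, the pixel-domain form of the correction, and the frame and strip dimensions are assumptions for illustration.

```python
def search_window_h(b_row, b_col, prev_drow, prev_dcol,
                    accel_drow=0, accel_dcol=0, margin=16,
                    frame_h=512, frame_w=512, b_h=384, b_w=8):
    """Return (top, bottom, left, right) bounds of area H within the field of
    view: larger than B and C, and placed where C is expected to lie."""
    centre_row = b_row + prev_drow + accel_drow
    centre_col = b_col + prev_dcol + accel_dcol
    top    = max(0, centre_row - margin)
    bottom = min(frame_h, centre_row + b_h + margin)
    left   = max(0, centre_col - margin)
    right  = min(frame_w, centre_col + b_w + margin)
    return top, bottom, left, right
```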
The correlation process by which the area C, within the area H, is identified may take several forms. In one form, a number of areas B' within the area H, of the same size and shape as the area B, are compared with the signals in the storage location 12.
In this comparison the brightness of each of the elements along each raster line is compared with the stored brightness values of corresponding elements of the area B. A number representative of the degree of similarity between the brightness of each of the corresponding elements is supplied to an accumulator. The accumulator adds these numbers to provide a total representative of the degree of correlation between the area B' and the area B.
Similar totals are calculated for each of the areas B', and that area having the highest degree of correlation is held to be the area C. There are many alternative ways in which the correlation process could be carried out. For example, the individual elements of the area B could be grouped together to form a number of groups each comprising several elements and each having an associated brightness value. The values of these groups could then be compared with the values of similar groups within the area B' of the area H and a degree of correlation for each area B' derived accordingly. The degree of brightness used in the correlation process could range in number from simply two levels, that is, bright or dark, to a multi-level grey scale.
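A minimal sketch of this search follows, taking the negated sum of absolute brightness differences as the "degree of similarity"; the patent leaves the metric open, noting only that anything from a two-level bright/dark comparison to a multi-level grey scale could be used, so this particular choice and all names are assumptions.

```python
import numpy as np

def locate_area_c(stored_b, frame, window):
    """Slide a template the size of area B over the search area H and, for
    each candidate position B', accumulate a total representing how well the
    element brightnesses match the stored values; the position with the
    highest total is taken to be area C."""
    top, bottom, left, right = window
    b_h, b_w = stored_b.shape
    template = stored_b.astype(np.int32)
    best_score, best_pos = None, None
    for row in range(top, bottom - b_h + 1):
        for col in range(left, right - b_w + 1):
            candidate = frame[row:row + b_h, col:col + b_w].astype(np.int32)
            score = -np.abs(candidate - template).sum()   # higher = better match
            if best_score is None or score > best_score:
                best_score, best_pos = score, (row, col)
    return best_pos   # top-left corner of area C within the field of view
```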
In an alternative correlation process the system operates initially as above to identify the area C corresponding to the original area B. On a following scan (as shown in Figure 5), a time 2t after the original scan, the system extends the correlation process so as also to identify an area D in the field of view corresponding to the original area Ao on the ground, rather than merely identifying an area in the field of view corresponding to At. This correlation process may be repeated for any number of time intervals t during which the area Ao remains within the field of view.
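The alternative process might be sketched as below, re-using locate_area_c from the sketch above: the originally stored area Ao is correlated against each later scan (times t, 2t, 3t, ...) for as long as it remains in view, so distance and speed are measured over a longer baseline. The whole-frame search, the scale factor and all names are assumptions.

```python
def track_area_ao(stored_ao, later_frames, t_s, metres_per_px, start_pos):
    """Correlate the originally stored area Ao with successive later scans and
    return a mean-speed estimate for each interval kt since the first scan."""
    row0, col0 = start_pos
    speeds = []
    for k, frame in enumerate(later_frames, start=1):
        h, w = frame.shape
        pos = locate_area_c(stored_ao, frame, (0, h, 0, w))   # search whole frame
        if pos is None:
            break                        # Ao has left the field of view
        drow, dcol = pos[0] - row0, pos[1] - col0
        d_m = (drow ** 2 + dcol ** 2) ** 0.5 * metres_per_px
        speeds.append(d_m / (k * t_s))   # mean speed since the original scan
    return speeds
```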
A television camera used for remote viewing will normally be scanned many times each second. For this application, however, it is not necessary to use the information from every scan except under exceptional circumstances. It may be found, for example, that where the aircraft 1 is moving at high speeds close to the ground 3, the field of view and the speed of scanning of the camera 4 are such that the area Ao is only within the field of view of one scan and that the system therefore fails to operate successfully. In this case it may be found advantageous to direct the camera 4 forwardly, rearwardly or sideways, rather than vertically downwards, and to make suitable correction in the calculation of aircraft speed. The camera 4 need not necessarily be mounted on a stabilised platform; it would also be possible to supply the computing unit 10 with information regarding the pitch, roll and yaw of the aircraft 1 so as thereby to provide the compensation for changes in aircraft attitude.
In the application described above, to unmanned aircraft, the aircraft might, for example, normally be flown by remote radio control from the ground station. The system according to the present invention could be used to provide autonomous control of navigation upon failure of the radio link with the ground station.
Problems might be encountered if operation of the system were initiated whilst the aircraft was moving, since the system would not normally have any information regarding aircraft speed that it could use to enable it to locate the area H. In these circumstances, the system could either be arranged initially to make a correlation with the entire field of view, or the system could utilise velocity information from another source such as an integrated accelerometer output or a pitot static probe.
Whilst the television camera 4 referred to above is of the standard optical form operating at visible wavelengths, it would be possible to use an infra-red camera or image-intensifier. It would also be possible to use alternative imaging systems having a scanned radiation beam (such as an optical laser or X-ray beam, or an acoustic sonar beam) and a sensor arranged to receive radiation reflected by the ground.

Claims (13)

1. A system for providing a representation of the movement of a vehicle relative to a surface, wherein the system includes imaging means mounted with the vehicle to view said surface and thereby provide signals representative of an image of a part of said surface, means for providing a measure of the displacement, relative to the field of view of said imaging means, of a part at least of said image caused by movement of the vehicle relative to said surface, and means for providing a representation of the movement of the vehicle relative to said surface in accordance with the displacement of the part at least of said image and the distance of the vehicle from said surface.
2. A system according to Claim 1, wherein the imaging means is mounted with a platform that is stabilised in azimuth.
3. A system according to Claim 1 or 2, wherein the imaging means is mounted with a platform that is stabilised in attitude.
4. A system according to any one of the preceding claims, wherein the imaging means is a television camera.
5. A system according to Claim 4, wherein the system is arranged to measure the displacement of the part at least of said image between repeated scans of the camera.
6. A system according to Claim 4 or 5, wherein the part at least of said image is a strip that extends substantially at right angles to the raster lines of the television camera, the strip being shorter than the field of view of the camera.
7. A system according to any one of the preceding claims, wherein the system includes store means that is arranged to store signals representative of the part at least of said image, and correlation means that is arranged to correlate signals stored in the store means at one time with signals from the imaging means at a later time.
8. A system according to Claim 7, wherein the correlation means is arranged to restrict correlation of signals in the store means with substantially only those signals from the imaging means representative of areas of the surface compatible with movement of the vehicle.
9. A system according to Claim 7 or 8, wherein the correlation means is arranged to perform repeated correlations of the signals stored in the store means at one time with signals from the imaging means at successively later times, so long as the part at least of said image of the part of the surface remains within the field of view of the imaging means.
10. A method of providing a representation of the movement of a vehicle relative to a surface including the steps of deriving from imaging means mounted with said vehicle to view said surface, signals representative of an image of a part of said surface, obtaining a measure of the displacement, relative to the field of view of said imaging means, of a part at least of said image caused by movement of the vehicle relative to said surface, and providing a representation of the movement of the vehicle relative to said surface in accordance with the displacement of the part at least of said image and the distance of the vehicle from said surface.
11. A method according to Claim 10 including the steps of storing signals representative of an image of a part of said surface, and correlating the signals stored at one time with signals from the imaging means at a later time.
12. A system substantially as hereinbefore described with reference to the accompanying drawings.
13. A method substantially as hereinbefore described with reference to the accompanying drawings.
GB8002038A 1979-01-22 1980-01-22 Vehicle movement sensing Expired GB2041689B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB8002038A GB2041689B (en) 1979-01-22 1980-01-22 Vehicle movement sensing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB7902384 1979-01-22
GB8002038A GB2041689B (en) 1979-01-22 1980-01-22 Vehicle movement sensing

Publications (2)

Publication Number Publication Date
GB2041689A true GB2041689A (en) 1980-09-10
GB2041689B GB2041689B (en) 1983-01-26

Family

ID=26270309

Family Applications (1)

Application Number Title Priority Date Filing Date
GB8002038A Expired GB2041689B (en) 1979-01-22 1980-01-22 Vehicle movement sensing

Country Status (1)

Country Link
GB (1) GB2041689B (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2129641A (en) * 1982-11-09 1984-05-16 Marconi Co Ltd A passive target detector
US4614426A (en) * 1982-11-09 1986-09-30 The Marconi Company Limited Passive target detector
US4612441A (en) * 1984-08-09 1986-09-16 The United States Of America As Represented By The Secretary Of The Army Moving object detection system using infrared scanning
FR2583882A1 (en) * 1985-06-25 1986-12-26 Renault Device for measuring the speed and position of a moving object with respect to the ground
FR2609174A1 (en) * 1986-12-26 1988-07-01 Renault Device for measuring displacement, speed or acceleration
GB2212687A (en) * 1987-11-17 1989-07-26 Gen Electric Co Plc Vehicle navigation
US4951213A (en) * 1987-11-17 1990-08-21 The General Electric Company, Plc Vehicle navigation
WO1998053327A1 (en) * 1997-05-22 1998-11-26 Optronic Consult Ab Method and device for contactless measuring of movement
GB2362213A (en) * 1997-07-14 2001-11-14 British Aerospace Inertial navigation accuracy enhancement
GB2362213B (en) * 1997-07-14 2002-02-27 British Aerospace Inertial navigation accuracy enhancement
US6912464B1 (en) 1997-07-14 2005-06-28 Bae Systems Plc Inertial navigation accuracy enhancement
US7400950B2 (en) 2002-09-23 2008-07-15 Stefan Reich Optical sensing system and system for stabilizing machine-controllable vehicles
EP2241896A1 (en) * 2002-09-23 2010-10-20 Stefan Reich Stabilising system for missiles
WO2008024772A1 (en) * 2006-08-21 2008-02-28 University Of Florida Research Foundation, Inc. Image-based system and method for vehicle guidance and navigation
US20090285450A1 (en) * 2006-08-21 2009-11-19 University Of Florida Research Foundation, Inc Image-based system and methods for vehicle guidance and navigation
US8320616B2 (en) 2006-08-21 2012-11-27 University Of Florida Research Foundation, Inc. Image-based system and methods for vehicle guidance and navigation
DE102009060901A1 (en) * 2009-12-31 2011-07-07 Deutsches Zentrum für Luft- und Raumfahrt e.V., 51147 Landing aid device for vertical landing aircrafts, particularly helicopter, has image capturing unit for optical acquisition of spaces directly below a vertical landing aircraft, and evaluation unit

Also Published As

Publication number Publication date
GB2041689B (en) 1983-01-26

Similar Documents

Publication Publication Date Title
US4802757A (en) System for determining the attitude of a moving imaging sensor platform or the like
US4671650A (en) Apparatus and method for determining aircraft position and velocity
JP3345113B2 (en) Target object recognition method and target identification method
US5104217A (en) System for determining and controlling the attitude of a moving airborne or spaceborne platform or the like
US5034812A (en) Image processing utilizing an object data store to determine information about a viewed object
US8554395B2 (en) Method and system for facilitating autonomous landing of aerial vehicles on a surface
AU664393B2 (en) Method and system for point by point measurement of spatial coordinates
RU2613735C2 (en) Method for detecting placement and location by virtual reference images
NL194282C (en) Correlation processor circuit.
GB2041689A (en) Vehicle movement sensing
DE112019001542T5 (en) POSITION ESTIMATE DEVICE
IL193335A (en) Method for geolocalization of one or more targets
CN112381856A (en) Low-slow small target tracking device and method suitable for urban complex background
US3700801A (en) Image analysis and correlation system and method
US20190120965A1 (en) Method and system of digital light processing and light detection and ranging for guided autonomous vehicles
US3641484A (en) Contour-mapping system
US20220191467A1 (en) Calibration unit for a monitoring device, monitoring device for man-overboard monitoring, and method for calibration
US20200150228A1 (en) Method of Providing Interference Reduction and a Dynamic Region of Interest in a LIDAR System
RU2234739C1 (en) Method of prevention of collision of flying vehicle with earth
US5116118A (en) Geometric fidelity of imaging systems employing sensor arrays
US3307177A (en) Navigation method and apparatus
JP2746487B2 (en) Aircraft position measurement method for vertical take-off and landing aircraft
EP0498416B1 (en) Inter-car distance detecting device
JP2662583B2 (en) On-vehicle distance detection device
EP0993658B1 (en) Method and apparatus for providing vision using virtual image

Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 19940122