GB2228642A - Object location using TV camera - Google Patents

Object location using TV camera

Info

Publication number
GB2228642A
GB2228642A GB8925181A GB8925181A GB2228642A GB 2228642 A GB2228642 A GB 2228642A GB 8925181 A GB8925181 A GB 8925181A GB 8925181 A GB8925181 A GB 8925181A GB 2228642 A GB2228642 A GB 2228642A
Authority
GB
United Kingdom
Prior art keywords
information
sensor means
location
area
viewed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB8925181A
Other versions
GB8925181D0 (en
GB2228642B (en
Inventor
Keith Charles Rawlings
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Smiths Group PLC
Original Assignee
Smiths Group PLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smiths Group PLC filed Critical Smiths Group PLC
Publication of GB8925181D0 publication Critical patent/GB8925181D0/en
Publication of GB2228642A publication Critical patent/GB2228642A/en
Application granted granted Critical
Publication of GB2228642B publication Critical patent/GB2228642B/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S11/00Systems for determining distance or velocity not using reflection or reradiation
    • G01S11/12Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves

Description

IMAGE PROCESSING APPARATUS AND METHODS

This invention relates to image processing apparatus and methods.
The invention is more particularly concerned with image processing apparatus and methods for use in determining the range and/or the size of an object.
Conventional techniques of measuring the range of an object involve the use of a laser or radar rangefinder. Whilst these can be accurate, they have the disadvantage that they must be precisely aimed and can make the presence of the observer apparent. It is common practice for tanks and military aircraft or vessels to be equipped with laser and radar detectors so that an alarm is given when they are under observation by laser or radar radiation. The detector can be used to direct countermeasures at the laser or radar source, which can be used to provide a homing beacon for a missile or the like.
It would, therefore, be a considerable advantage to be able to measure the range and size of an object remotely in a passive way without increasing the risk of detection.
It is an object of the present invention to provide apparatus and a method that can be used to overcome the above-mentioned disadvantages.
According to one aspect of the present invention there is provided image processing apparatus, including store means arranged to store topographical mapping information about an area to be viewed, sensor means arranged to view a part at least of the area in perspective and to derive information as to the location of an object in the field of view of the sensor means, means for determining where the line of sight of the object from the sensor means intercepts the stored topographical mapping, the apparatus being arranged to determine the range of the object from its location relative to a corresponding part at least of the stored topographical mapping.
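For illustration only (this sketch is not part of the specification), the following Python fragment shows one way a line-of-sight intercept with stored topographical mapping might be computed, assuming the mapping is held as a regular grid of terrain heights; the function name, step size and grid layout are assumptions made for the example, not details taken from the patent.

    import math

    def range_by_terrain_intercept(sensor_pos, unit_los, height_grid, cell_size,
                                   step=5.0, max_range=20000.0):
        """Walk along the line of sight from the sensor until it passes below
        the stored terrain surface; return the slant range to that intercept.

        sensor_pos  : (x, y, z) position of the sensor in grid coordinates (metres)
        unit_los    : (dx, dy, dz) unit vector of the line of sight to the object
        height_grid : 2-D array of terrain heights (metres), indexed [iy][ix]
        cell_size   : grid spacing in metres
        """
        x0, y0, z0 = sensor_pos
        dx, dy, dz = unit_los
        r = 0.0
        while r < max_range:
            r += step
            x, y, z = x0 + r * dx, y0 + r * dy, z0 + r * dz
            ix, iy = int(x / cell_size), int(y / cell_size)
            if not (0 <= iy < len(height_grid) and 0 <= ix < len(height_grid[0])):
                break                      # line of sight leaves the mapped area
            if z <= height_grid[iy][ix]:   # line of sight has met the ground
                return r                   # slant range to the intercept
        return None                        # no intercept within max_range

    # Example: sensor 500 m up, looking 10 degrees below the horizontal over flat ground
    flat = [[0.0] * 200 for _ in range(200)]
    los = (math.cos(math.radians(-10)), 0.0, math.sin(math.radians(-10)))
    print(range_by_terrain_intercept((0.0, 5000.0, 500.0), los, flat, 50.0))   # ~2880 m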
The apparatus may include object data store means containing information about the objects in the area to be viewed, means for comparing an output from the sensor means with the object data store means to identify an object viewed and to provide an output of the information in the store means in accordance therewith. The means for comparing may be arranged to provide an output in respect of any new object viewed but not present in the data store means. The information in the object data store means may include information about the location of objects in the area, the apparatus including means for providing information in respect of the location of the sensor means, and the apparatus being arranged to provide an indication of the range of the identified object by trigonometrical calculation from the information about the location of the object and the location of the sensor means.
The object data store means preferably contains information about the size of the objects in the area, the apparatus being arranged to derive an indication of the angle subtended by the identified object at the sensor means, and the apparatus being arranged to provide an indication of the range of the object from information regarding its size and the angle subtended.
The apparatus may include means for deriving an indication of the angle subtended by the object at the sensor means, and the apparatus being arranged to provide an indication of the size of the object from information about the angle subtended and the range of the object from the sensor means. The apparatus may include display means, the display means being arranged to provide a display representation of the range of the object.
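As a hedged illustration of the subtended-angle relationships described above (again, not part of the specification), the short Python sketch below computes range from a known object size and, conversely, size from a known range; the function names and example values are assumptions for the illustration only.

    import math

    def range_from_size(true_size_m, subtended_angle_rad):
        """Range to an object of known size from the angle it subtends at the sensor."""
        return true_size_m / (2.0 * math.tan(subtended_angle_rad / 2.0))

    def size_from_range(range_m, subtended_angle_rad):
        """Size of an object from its known range and the angle it subtends."""
        return 2.0 * range_m * math.tan(subtended_angle_rad / 2.0)

    # A 10 m long vehicle subtending 2 milliradians lies at roughly 5 km.
    print(range_from_size(10.0, 0.002))    # ~5000 m
    print(size_from_range(5000.0, 0.002))  # ~10 m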
According to another aspect of the present invention there is provided image processing apparatus including object data store means containing information about the appearance and location of objects in an area to be viewed, sensor means arranged to view a part at least of the area, means for comparing an output of the sensor means with information in the data store means to identify an object viewed and to read out information about its location, means for providing information about the location of the sensor means, and means for calculating the range of the object from the information about the location of the object and the location of the sensor means.
According to a further aspect of the present invention there is provided image processing apparatus including object data store means containing information about the appearance and size of objects in an area to be viewed, sensor means arranged to view a part at least of the area, means for comparing an output of the sensor means with information in the data store means to identify an object viewed and to read out information about its size, means for deriving an indication of the angle subtended by the object at the sensor means, and means for calculating the range of the object from the information about its size and the angle subtended.
According to yet another aspect of the present invention there is provided image processing apparatus including store means arranged to store mapping information about an area to be viewed, sensor means arranged to view a part at least of the area, and means for comparing the output of the sensor means with the stored mapping information in respect of said part at least of the area such as to identify the presence of an object viewed by the sensor means from its absence from the stored mapping.
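A minimal sketch (not the patented implementation), assuming detected and stored objects are reduced to ground positions, of how an object might be flagged as new because it is absent from the stored mapping; the tolerance value and names are illustrative.

    import math

    def new_objects(detected, stored, tolerance_m=25.0):
        """Return detected objects that have no counterpart in the stored mapping.

        detected, stored : lists of (x, y) ground positions in metres
        tolerance_m      : how close a detection must be to a stored object to
                           count as the same object
        """
        unexplained = []
        for d in detected:
            if not any(math.hypot(d[0] - s[0], d[1] - s[1]) <= tolerance_m for s in stored):
                unexplained.append(d)
        return unexplained

    stored_map = [(100.0, 200.0), (400.0, 120.0)]
    seen_now = [(102.0, 198.0), (950.0, 610.0)]
    print(new_objects(seen_now, stored_map))   # [(950.0, 610.0)] - absent from the stored mapping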
The apparatus may include means for receiving aircraft navigation information, the apparatus being arranged to utilize said aircraft navigation information in determining the location of the object relative to the corresponding part at least of the stored mapping information.
The sensor means may include a television camera and may be infra-red sensor means.
According to another aspect of the present invention there is provided a method of image processing comprising the steps of: storing topographical mapping information about an area to be viewed; viewing with sensor means a part at least of the area in perspective; determining the line of sight of an object in the field of view; determining where the line of sight intercepts the stored topographical mapping; and determining the range of the object from its location relative to a corresponding part at least of the stored topographical mapping.
The method may include the steps of comparing an output from the sensor means with data about objects in the area to be viewed so as to identify objects viewed, and providing an output of information about the identified objects in accordance with said data. The method may include the step of providing an output in respect of any new object viewed but not present in the data.
The stored information preferably includes information about the location of the objects in the area, information being provided in respect of the location of the sensor means, and an indication being provided of the range of the object identified by trigonometrical calculation from the information about the location of the object and the location of the sensor means.
The stored information may include information about the size of the object, and including the steps of deriving an indication of the angle subtended by the object at the sensor means, and providing an indication of the range of the object from the information regarding its size and the angle subtended.
The method may include the steps of deriving an indication of the angle subtended by the object at the sensor means, and providing an indication of the size of the object from the angle subtended and the range of the object from the sensor means.
According to another aspect of the present invention there is provided a method of image processing comprising the steps of: providing a store of information about the appearance and location of objects in an area to be viewed; viewing with sensor means at least a part of the area; comparing an output of the sensor means with the stored information about the object to identify an object viewed; reading out information about the location of the object; providing information about the location of the sensor means; and calculating the range of the object from the information about the location of the object and the location of the sensor means.
According to another aspect of the present invention there is provided a method of image processing comprising the steps of: providing a store of information about the appearance and size of objects in an area to be viewed; viewing with sensor means at least a part of the area; comparing an output of the sensor means with the stored information about the objects to identify an object viewed; reading out information about the size of the object; deriving an indication of the angle subtended by the object at the sensor means; and calculating the range of the object from the information about its size and the angle subtended.
According to another aspect of the present invention there is provided a method of image processing comprising the steps of: storing mapping information about an area to be viewed; viewing with sensor means at least a part of the area; comparing the output of the sensor means with the stored mapping information in respect of said part at least of the area; and identifying the presence of an object viewed by the sensor means from its absence from the stored mapping.
The sensor means may be mounted on an aircraft and the method include the step of utilizing aircraft navigation information in determining the location of the object relative to the corresponding part at least of the stored mapping information.
Image processing apparatus and its method of use, for an aircraft, according to the present invention, will now be described, by way of example, with reference to the accompanying drawings, in which:
Figure 1 is a side elevation view of the aircraft flying over ground;

Figure 2 is a schematic diagram of the apparatus; and

Figure 3 is a flow diagram illustrating operation of the apparatus.

With reference first to Figure 1, the aircraft P is represented as flying at a height H above ground G and at an altitude A above mean sea level. The ground G is represented as having an uneven surface or topology. Image processing apparatus on the aircraft P is directed towards an object O on the ground G in front of the aircraft. The object is at a range R from the aircraft and subtends an angle θ.
With reference now also to Figure 2, the image processing apparatus includes an infra-red or other television camera 1 mounted on the aircraft structure 2 and the viewing angle of which is controlled by an actuator 3. Control of the viewing angle and scanning of the camera 1 is effected by a processing unit 4 which also receives the image output signals from the camera via line 5. The actuator 3 is driven by a camera drive unit 6 via a mechanical scan controller 7 and amplifier 8. Feedback is provided from the actuator 3 via line 9 to the drive unit 6. Electronic scanning is also controlled by the drive unit 6 via an electronic scan controller 10 with feedback via line 11.
The signals on line 5 are supplied to an image velocity smear compensator 40 which is controlled by the output of a platform motion reference system 41. The platform motion reference system 41 also provides an output to a platform motion compensation unit 12 which itself provides an output to the camera drive unit 6 so that the camera boresight is corrected for movement of the aircraft. The camera movement controls are of a high resolution and high frequency and may be mounted, with the camera 1, in a wing-mounted imaging pod. After velocity smear compensation, the image signals are passed to an optical error compensation unit 42 which compensates for lens defects and the like in accordance with data regarding these optical corrections contained in a store 43.
Following those two compensation steps, the signals are supplied to a unit 44 which acts as an expected objects pre-filter. This pre-filter 44 receives inputs for comparison from object data stores 73, 74 and 75 via an object data store navigator 45. The stores 73, 74 and 75 contain data regarding the appearance of expected objects in plan, their size and location. The object data store navigator 45 receives information from an aircraft attitude sensor 47, position sensor 48 and height sensor 49 together with camera boresight feedback information via line 50 from the camera drive unit 6. The navigator 45 uses this information to identify the appropriate locations within the stores 73 to 75 likely to contain data on objects within the field of view of the camera 1. The attitude sensor 47 and position sensor 48 could be parts of an inertial navigation system. This data is supplied to the pre-filter 44 via an image perspective generator 46 which transforms the data store information into the same perspective as seen by the camera 1. The image generator 46 also supplies information about the stored image in perspective to another store 51.
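By way of illustration of the kind of perspective transformation performed by the image perspective generator 46 (a sketch only, not the patented implementation), the following Python function projects a stored ground point into camera pixel coordinates for an assumed pinhole camera; the NED coordinate convention, focal length, image centre and all names are assumptions made for the example.

    import math

    def project_ground_point(point_ned, camera_ned, yaw_rad, pitch_rad, focal_px,
                             cx=320.0, cy=240.0):
        """Project a world point into pixel coordinates for a simple pinhole camera.

        point_ned, camera_ned : (north, east, down) positions in metres
        yaw_rad, pitch_rad    : camera boresight direction (roll ignored here)
        focal_px              : focal length expressed in pixels
        Returns (u, v) pixel coordinates, or None if the point is behind the camera.
        """
        # Vector from camera to point in the world frame
        n = point_ned[0] - camera_ned[0]
        e = point_ned[1] - camera_ned[1]
        d = point_ned[2] - camera_ned[2]
        # Rotate into the camera frame: yaw about the down axis, then pitch down
        cy_, sy = math.cos(yaw_rad), math.sin(yaw_rad)
        cp, sp = math.cos(pitch_rad), math.sin(pitch_rad)
        x1 = cy_ * n + sy * e            # forward after yaw
        y1 = -sy * n + cy_ * e           # right after yaw
        fwd = cp * x1 + sp * d           # forward along the camera boresight
        dwn = -sp * x1 + cp * d          # downward component in the camera frame
        if fwd <= 0.0:
            return None                  # behind the camera
        u = cx + focal_px * (y1 / fwd)
        v = cy + focal_px * (dwn / fwd)
        return (u, v)

    # Camera 300 m above a point 1000 m ahead, pitched about 16.7 degrees down towards it:
    # the point projects close to the image centre, as expected.
    cam = (0.0, 0.0, -300.0)
    pt = (1000.0, 0.0, 0.0)
    print(project_ground_point(pt, cam, 0.0, math.radians(16.7), 800.0))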
After pre-filtering, an image comparator/object detector 52 makes further comparison of the image information with the perspective transformed image in the store 51. The detector 52 may employ conventional pattern recognition and matching techniques well known in image recognition. Image information about objects identified with sufficient certainty by the pre-filter 44 can be supplied directly to the output of the detector 52, via line 53. Information about a new object, that is, one which is present in the field of view of the camera 1 but absent from the object data stores 73 to 75, is supplied by the detector 52 to a new object characterizer 54. The characterizer 54 provides an output via a formatter 55 to a display 56, or other utilization device, by which information regarding the characteristics of the new object is presented. This may indicate the presence of a likely threat to the aircraft because of its unknown nature.
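The matching itself could use any conventional technique; as a rough, hedged example (one possibility, not the patent's own method), the sketch below scores candidate positions of a stored template against the camera image using normalised cross-correlation with numpy; the function name and array sizes are illustrative.

    import numpy as np

    def best_match(image, template):
        """Slide a template over an image and return the offset with the highest
        normalised cross-correlation score (a conventional matching measure)."""
        ih, iw = image.shape
        th, tw = template.shape
        t = template - template.mean()
        t_norm = np.sqrt((t * t).sum())
        best, best_score = None, -2.0
        for y in range(ih - th + 1):
            for x in range(iw - tw + 1):
                w = image[y:y + th, x:x + tw]
                wz = w - w.mean()
                denom = np.sqrt((wz * wz).sum()) * t_norm
                if denom == 0:
                    continue
                score = float((wz * t).sum() / denom)
                if score > best_score:
                    best, best_score = (x, y), score
        return best, best_score

    rng = np.random.default_rng(0)
    scene = rng.random((40, 40))
    patch = scene[12:20, 25:33].copy()   # the "stored" appearance of the object
    print(best_match(scene, patch))      # ((25, 12), ~1.0)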
The detector 52 provides outputs in respect of identified objects to a trigonometric range and size processor 61, an iconometric range and size processor 62 and a kinematic range processor 63. An output is also provided to a subtended image extractor 64 which calculates the angle subtended at the camera 1 by the object viewed and supplies this information to the processors 61 and 62.
The processors 61 and 62 also receive inputs from a topographical map 70, a pre-loaded intelligence map 71 and a new object and update map 72 which are associated with respective ones of the object data stores 73 to 75. The topographical map 70 contains information about the topography, that is, ground contours and permanent features on the ground, on which is superimposed the information about the location of other objects which may be more transient. The intelligence map 71 contains additional information about the location of objects within the topographical map which may be gathered and loaded just prior to the flight to bring the map information up to date. The new object and update map 72 and its associated data store 75 are supplied with information from external sources (not shown) such as data links, crew inputs or the new object characterizer 54. The crew input could, for example, include a helmet-mounted sight and a speech recognizer so that the sight could be aimed at a target which is then vocally named by the pilot. Information about the appearance of the named object would then be read out of the store 75 for use in subsequent target tracking.
Each of the maps 70 to 72 provides an output to a search director unit 80 which directs the camera 1 towards a region most likely to contain an object of interest. The search director 80 also receives inputs from the attitude sensor 47, position sensor 48 and height sensor 49, together with the output from a unit 81 by which new object search patterns can be defined by the user. The search director 80 provides an output via line 82 to the camera drive unit 6.
The trigonometric range/size processor 61 receives from the detector 52 information about the viewing angle of the object, that is, the camera boresight alignment, and the present aircraft position and height. The object is assumed to be at the point where the boresight intercepts the topography as contained in the map 70. From this information, the processor 61 calculates, by trigonometry, the range of the aircraft to the object and supplies this information to a display of range at 76. If the object subtends a measurable angle, as determined by the subtended image extractor 64, the processor 61 calculates trigonometrically the size of the object from the range information and the subtended angle. The size information is supplied to a display of size at 77. This information about range and size can be provided even if the nature of the object is unknown. It will be appreciated that there is a degree of ambiguity because the assumption must be made that the object is located on the ground, whereas this might not be the case. The processor 61 also provides an output to a position display 78 to indicate, to the user, the location of the observed object.
Where the nature of the object is known, because it is matched with an object in the data stores 73 and 75, a similar trigonometrical calculation is made by the processor 61. Further calculations, however, are also made. If the map position of the object is unambiguous, that is, there is only one location of the identified object in the data store, then the location of the object is known from the stored information about the object and without the need for information about the boresight attitude. The trigonometric processor 61 calculates the range to the object by trigonometry from this information. If the position of the object is ambiguous, the range is determined using knowledge of the boresight attitude as previously.
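For the unambiguous case just described, the calculation reduces to the distance between the sensor position and the stored object location; a trivial sketch (with illustrative coordinates, not from the specification) follows.

    import math

    def range_from_known_location(sensor_xyz, object_xyz):
        """Slant range between the sensor and an object whose map location is known."""
        return math.dist(sensor_xyz, object_xyz)

    # Aircraft 600 m above the local origin; identified object 4 km north, 1 km east, on the ground.
    print(range_from_known_location((0.0, 0.0, 600.0), (4000.0, 1000.0, 0.0)))  # ~4166 m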
The iconometric processor 62 also provides an indication of object range where the nature of the object, and hence its real size, has been identified and where the angle subtended by the object is measurable. This information enables the range of the object to be calculated by simple iconometric trigonometry. The stored information about the size of a known object can be compared with the size determined by the trigonometric processor 61 for checking purposes.
The system can also be used for determining the size and range of airborne targets. If these are identified, and therefore have a known size, their range can be calculated by the iconometric processor 62 from knowledge of the angle subtended by the object at the camera 1. Where the real size of the object is not known, the kinematic range processor 63 tracks the airborne object and calculates an approximate range from knowledge of the movement of the viewing aircraft and the change in bearing of the target from the viewing aircraft. This calculation can only be probabilistic by virtue of the mathematics involved. The iconometric processor 62 can also provide an output to a decoy display 79 if the camera 1 sees an object which resembles in appearance a stored object but has a size that is less than that of the stored object. For example, an airborne object which is smaller than a conventional manned aircraft would be indicated as a decoy.
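As a hedged sketch of the kind of kinematic ranging described (assuming the target moves little between the two observations, which is part of why the result is only approximate), the Python example below triangulates range from two bearings taken at two positions of the viewing aircraft, and adds a simple decoy test comparing deduced size with stored size; all names, thresholds and values are illustrative assumptions.

    import math

    def kinematic_range(obs1, brg1_rad, obs2, brg2_rad):
        """Approximate range to a target from two bearings taken at two observer
        positions - a simple bearings-only triangulation in two dimensions.

        obs1, obs2         : (x, y) observer positions in metres
        brg1_rad, brg2_rad : bearings to the target measured from the +x axis, radians
        Returns the range from obs2, or None if the bearing lines are parallel.
        """
        d1 = (math.cos(brg1_rad), math.sin(brg1_rad))
        d2 = (math.cos(brg2_rad), math.sin(brg2_rad))
        denom = d1[0] * d2[1] - d1[1] * d2[0]
        if abs(denom) < 1e-9:
            return None                   # no bearing change: range is unobservable
        bx, by = obs2[0] - obs1[0], obs2[1] - obs1[1]
        t1 = (bx * d2[1] - by * d2[0]) / denom
        tx, ty = obs1[0] + t1 * d1[0], obs1[1] + t1 * d1[1]
        return math.hypot(tx - obs2[0], ty - obs2[1])

    def looks_like_decoy(apparent_size_m, stored_size_m, ratio=0.8):
        """Flag an identified object whose deduced size is well below the stored size."""
        return apparent_size_m < ratio * stored_size_m

    # Observer flies 2 km along +x; a target at (10000, 5000) changes bearing slightly.
    tgt = (10000.0, 5000.0)
    b1 = math.atan2(tgt[1] - 0.0, tgt[0] - 0.0)
    b2 = math.atan2(tgt[1] - 0.0, tgt[0] - 2000.0)
    print(kinematic_range((0.0, 0.0), b1, (2000.0, 0.0), b2))   # ~9434 m
    print(looks_like_decoy(7.0, 15.0))                           # True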
The camera drive 6 is coupled to a mode selector 90 by which input from a weapons system 91, a navigation system 92 or a manual control 93 is selected. The weapons system 91 may be a weapons homing head to which the camera 1 is slaved.
The method of use of the apparatus is further clarified by the flow chart of Figure 3 which is self explanatory.
The apparatus has the advantage of enabling the range and size of an object to be determined solely by passive means and without making the presence of the aircraft apparent to the object.

Claims (1)

  1. Image processing apparatus including store means arranged to store topographical mapping information about an area to be viewed, sensor means arranged to view a part at least of the area in perspective and to derive information as to the location of an object in the field of view of the sensor means, means for determining where the line of sight of the object from the sensor means intercepts the stored topographical mapping, and wherein the apparatus is arranged to determine the range of the object from its location relative to a corresponding part at least of the stored topographical mapping.
2. Image processing apparatus according to Claim 1, including object data store means containing information about the objects in the area to be viewed, means for comparing an output from the sensor means with the object data store means to identify an object viewed and to provide an output of the information in the store means in accordance therewith.

3. Image processing apparatus according to Claim 2, wherein the means for comparing is arranged to provide an output in respect of any new object viewed but not present in the data store means.

4. Image processing apparatus according to Claim 2 or 3, wherein the information in the object data store means includes information about the location of the objects in the area, wherein the apparatus includes means for providing information in respect of the location of the sensor means, and wherein the apparatus is arranged to provide an indication of the range of the identified object by trigonometrical calculation from the information about the location of the object and the location of the sensor means.

5. Image processing apparatus according to any one of Claims 2 to 4, wherein the object data store means contains information about the size of the objects in the area, wherein the apparatus is arranged to derive an indication of the angle subtended by the identified object at the sensor means, and wherein the apparatus is arranged to provide an indication of the range of the object from information regarding its size and the angle subtended.

6. Image processing apparatus according to any one of the preceding claims, wherein the apparatus includes means for deriving an indication of the angle subtended by the object at the sensor means, and wherein the apparatus is arranged to provide an indication of the size of the object from information about the angle subtended and the range of the object from the sensor means.

7. Image processing apparatus according to any one of the preceding claims, including display means, and wherein the display means is arranged to provide a display representation of the range of the object.

8. Image processing apparatus including object data store means containing information about the appearance and location of objects in an area to be viewed, sensor means arranged to view a part at least of the area, means for comparing an output of the sensor means with information in the data store means to identify an object viewed and to read out information about its location, means for providing information about the location of the sensor means, and means for calculating the range of the object from the information about the location of the object and the location of the sensor means.
9. Image processing apparatus including object data store means containing information about the appearance and size of objects in an area to be viewed, sensor means arranged to view a part at least of the area, means for comparing an output of the sensor means with information in the data store means to identify an object viewed and to read out information about its size, means for deriving an indication of the angle subtended by the object at the sensor means, and means for calculating the range of the object from the information about its size and the angle subtended.

10. Image processing apparatus including store means arranged to store mapping information about an area to be viewed, sensor means arranged to view a part at least of the area, and means for comparing the output of the sensor means with the stored mapping information in respect of said part at least of the area such as to identify the presence of an object viewed by the sensor means from its absence from the stored mapping.
11. Image processing apparatus for an aircraft according to any one of the preceding claims including means for receiving aircraft navigation information, and wherein the apparatus is arranged to utilize said aircraft navigation information in determining the location of the object relative to the corresponding part at least of the stored mapping information.

12. Image processing apparatus according to any one of the preceding claims, wherein the sensor means includes a television camera.

13. Image processing apparatus according to any one of the preceding claims, wherein the sensor means is infra-red sensor means.

14. A method of image processing comprising the steps of: storing topographical mapping information about an area to be viewed; viewing with sensor means a part at least of the area in perspective; determining the line of sight of an object in the field of view; determining where the line of sight intercepts the stored topographical mapping; and determining the range of the object from its location relative to a corresponding part at least of the stored topographical mapping.
15. A method according to Claim 14, including the steps of comparing an output from the sensor means with data about objects in the area to be viewed so as to identify objects viewed, and providing an output of information about the identified objects in accordance with said data.

16. A method according to Claim 15, including the step of providing an output in respect of any new object viewed but not present in the data.

17. A method according to Claim 15 or 16, wherein the said stored information includes information about the location of the objects in the area, wherein information is provided in respect of the location of the sensor means, and wherein an indication is provided of the range of the object identified by trigonometrical calculation from the information about the location of the object and the location of the sensor means.

18. A method according to any one of Claims 15 to 17, wherein the said stored information includes information about the size of the object, and including the steps of deriving an indication of the angle subtended by the object at the sensor means, and providing an indication of the range of the object from said information regarding its size and the angle subtended.
19. A method of image processing according to any one of Claims 14 to 18, including the steps of deriving an indication of the angle subtended by the object at the sensor means, and providing an indication of the size of the object from the angle subtended and the range of the object from the sensor means.

20. A method of image processing comprising the steps of: providing a store of information about the appearance and location of objects in an area to be viewed; viewing with sensor means at least a part of the area; comparing an output of the sensor means with the stored information about the object to identify an object viewed; reading out information about the location of the object; providing information about the location of the sensor means; and calculating the range of the object from the information about the location of the object and the location of the sensor means.
21. A method of image processing comprising the steps of: providing a store of information about the appearance and size of objects in an area to be viewed; viewing with sensor means at least a part of the area; comparing an output of the sensor means with the stored information about the objects to identify an object viewed; reading out information about the size of the object; deriving an indication of the angle subtended by the object at the sensor means; and calculating the range of the object from the information about its size and the angle subtended.

22. A method of image processing comprising the steps of: storing mapping information about an area to be viewed; viewing with sensor means at least a part of the area; comparing the output of the sensor means with the stored mapping information in respect of said part at least of the area; and identifying the presence of an object viewed by the sensor means from its absence from the stored mapping.
23. A method according to any one of Claims 14 to 22, wherein the said sensor means is mounted on an aircraft, and including the step of utilizing aircraft navigation information in determining the location of the object relative to the corresponding part at least of the stored mapping information.

24. Image processing apparatus substantially as hereinbefore described with reference to the accompanying drawings.

25. A method of image processing substantially as hereinbefore described with reference to the accompanying drawings.

26. Any novel feature or combination of features as hereinbefore described.
GB8925181A 1988-11-14 1989-11-08 Image processing apparatus and methods Expired - Lifetime GB2228642B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB888826550A GB8826550D0 (en) 1988-11-14 1988-11-14 Image processing apparatus and methods

Publications (3)

Publication Number Publication Date
GB8925181D0 GB8925181D0 (en) 1990-04-25
GB2228642A true GB2228642A (en) 1990-08-29
GB2228642B GB2228642B (en) 1993-06-23

Family

ID=10646806

Family Applications (6)

Application Number Title Priority Date Filing Date
GB888826550A Pending GB8826550D0 (en) 1988-11-14 1988-11-14 Image processing apparatus and methods
GB8925181A Expired - Lifetime GB2228642B (en) 1988-11-14 1989-11-08 Image processing apparatus and methods
GB919103281A Pending GB9103281D0 (en) 1988-11-14 1991-02-15 Image processing apparatus and methods
GB919103282A Pending GB9103282D0 (en) 1988-11-14 1991-02-15 Image processing apparatus and methods
GB9105682A Expired - Fee Related GB2243741B (en) 1988-11-14 1991-03-18 Image processing apparatus and methods
GB9105681A Expired - Fee Related GB2243740B (en) 1988-11-14 1991-03-18 Image processing apparatus and methods

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB888826550A Pending GB8826550D0 (en) 1988-11-14 1988-11-14 Image processing apparatus and methods

Family Applications After (4)

Application Number Title Priority Date Filing Date
GB919103281A Pending GB9103281D0 (en) 1988-11-14 1991-02-15 Image processing apparatus and methods
GB919103282A Pending GB9103282D0 (en) 1988-11-14 1991-02-15 Image processing apparatus and methods
GB9105682A Expired - Fee Related GB2243741B (en) 1988-11-14 1991-03-18 Image processing apparatus and methods
GB9105681A Expired - Fee Related GB2243740B (en) 1988-11-14 1991-03-18 Image processing apparatus and methods

Country Status (4)

Country Link
US (1) US5034812A (en)
DE (1) DE3937427A1 (en)
FR (1) FR2639127B1 (en)
GB (6) GB8826550D0 (en)

Families Citing this family (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03196372A (en) * 1989-12-26 1991-08-27 Toshiba Corp Image positioning device for image
US5247356A (en) * 1992-02-14 1993-09-21 Ciampa John A Method and apparatus for mapping and measuring land
US5815411A (en) * 1993-09-10 1998-09-29 Criticom Corporation Electro-optic vision system which exploits position and attitude
US6037936A (en) 1993-09-10 2000-03-14 Criticom Corp. Computer vision system with a graphic user interface and remote camera control
US7301536B2 (en) 1993-09-10 2007-11-27 Geovector Corporation Electro-optic vision systems
US6064398A (en) * 1993-09-10 2000-05-16 Geovector Corporation Electro-optic vision systems
GB2300082B (en) * 1995-04-21 1999-09-22 British Aerospace Altitude measuring methods
US6535210B1 (en) * 1995-06-07 2003-03-18 Geovector Corp. Vision system computer modeling apparatus including interaction with real scenes with respect to perspective and spatial relationship as measured in real-time
JP3622094B2 (en) * 1995-10-05 2005-02-23 株式会社日立製作所 Map update support apparatus and map information editing method
DE19546506A1 (en) * 1995-12-13 1997-06-19 Daimler Benz Ag Vehicle navigation system and signal processing method for such a navigation system
DE19546507A1 (en) * 1995-12-13 1997-06-19 Daimler Benz Ag Vehicle navigation and radar system
JP2000153476A (en) * 1998-09-14 2000-06-06 Honda Motor Co Ltd Leg type movable robot
US6173239B1 (en) 1998-09-30 2001-01-09 Geo Vector Corporation Apparatus and methods for presentation of information relating to objects being addressed
US6396475B1 (en) 1999-08-27 2002-05-28 Geo Vector Corp. Apparatus and methods of the remote address of objects
US6356227B1 (en) * 1999-09-16 2002-03-12 Honeywell International, Inc. Smearing compensation apparatus for a radar system
US6522292B1 (en) 2000-02-23 2003-02-18 Geovector Corp. Information systems having position measuring capacity
US8224078B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Image capture and identification system and process
US7899243B2 (en) * 2000-11-06 2011-03-01 Evryx Technologies, Inc. Image capture and identification system and process
US7680324B2 (en) 2000-11-06 2010-03-16 Evryx Technologies, Inc. Use of image-derived information as search criteria for internet and other search engines
US9310892B2 (en) 2000-11-06 2016-04-12 Nant Holdings Ip, Llc Object information derived from object images
US7565008B2 (en) 2000-11-06 2009-07-21 Evryx Technologies, Inc. Data capture and identification system and process
US7031875B2 (en) 2001-01-24 2006-04-18 Geo Vector Corporation Pointing systems for addressing objects
US20030184594A1 (en) * 2002-03-25 2003-10-02 John Ellenby Apparatus and methods for interfacing with remote addressing systems
US7424133B2 (en) 2002-11-08 2008-09-09 Pictometry International Corporation Method and apparatus for capturing, geolocating and measuring oblique images
DE10259123B3 (en) * 2002-12-18 2004-09-09 Dräger Safety AG & Co. KGaA Device and method for monitoring the use of respiratory protective device wearers
US20040219961A1 (en) * 2003-04-08 2004-11-04 Ellenby Thomas William Computer games having variable execution dependence with respect to spatial properties of a mobile unit.
US20060190812A1 (en) * 2005-02-22 2006-08-24 Geovector Corporation Imaging systems including hyperlink associations
US7991242B2 (en) * 2005-05-11 2011-08-02 Optosecurity Inc. Apparatus, method and system for screening receptacles and persons, having image distortion correction functionality
CA2608119A1 (en) 2005-05-11 2006-11-16 Optosecurity Inc. Method and system for screening luggage items, cargo containers or persons
US7899232B2 (en) 2006-05-11 2011-03-01 Optosecurity Inc. Method and apparatus for providing threat image projection (TIP) in a luggage screening system, and luggage screening system implementing same
US8494210B2 (en) 2007-03-30 2013-07-23 Optosecurity Inc. User interface for use in security screening providing image enhancement capabilities and apparatus for implementing same
US7873238B2 (en) * 2006-08-30 2011-01-18 Pictometry International Corporation Mosaic oblique images and methods of making and using same
DE102007002813A1 (en) * 2007-01-18 2008-07-24 Robert Bosch Gmbh A method for estimating a distance of a vehicle to an object and driver assistance system
US8593518B2 (en) * 2007-02-01 2013-11-26 Pictometry International Corp. Computer system for continuous oblique panning
US8520079B2 (en) * 2007-02-15 2013-08-27 Pictometry International Corp. Event multiplexer for managing the capture of images
US9262818B2 (en) 2007-05-01 2016-02-16 Pictometry International Corp. System for detecting image abnormalities
US8385672B2 (en) * 2007-05-01 2013-02-26 Pictometry International Corp. System for detecting image abnormalities
US7991226B2 (en) 2007-10-12 2011-08-02 Pictometry International Corporation System and process for color-balancing a series of oblique images
US8531472B2 (en) * 2007-12-03 2013-09-10 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real façade texture
US8588547B2 (en) 2008-08-05 2013-11-19 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area
US8401222B2 (en) * 2009-05-22 2013-03-19 Pictometry International Corp. System and process for roof measurement using aerial imagery
US9330494B2 (en) 2009-10-26 2016-05-03 Pictometry International Corp. Method for the automatic material classification and texture simulation for 3D models
US8477190B2 (en) 2010-07-07 2013-07-02 Pictometry International Corp. Real-time moving platform management system
US8823732B2 (en) 2010-12-17 2014-09-02 Pictometry International Corp. Systems and methods for processing images with edge detection and snap-to feature
CA2835290C (en) 2011-06-10 2020-09-08 Pictometry International Corp. System and method for forming a video stream containing gis data in real-time
GB2508565B (en) 2011-09-07 2016-10-05 Rapiscan Systems Inc X-ray inspection system that integrates manifest data with imaging/detection processing
US9183538B2 (en) 2012-03-19 2015-11-10 Pictometry International Corp. Method and system for quick square roof reporting
US9881163B2 (en) 2013-03-12 2018-01-30 Pictometry International Corp. System and method for performing sensitive geo-spatial processing in non-sensitive operator environments
US9244272B2 (en) 2013-03-12 2016-01-26 Pictometry International Corp. Lidar system producing multiple scan paths and method of making and using same
US9753950B2 (en) 2013-03-15 2017-09-05 Pictometry International Corp. Virtual property reporting for automatic structure detection
US9275080B2 (en) 2013-03-15 2016-03-01 Pictometry International Corp. System and method for early access to captured images
SE537279C2 (en) 2013-07-12 2015-03-24 BAE Systems Hägglunds AB System and procedure for handling tactical information in combat vehicles
AU2015204838B2 (en) 2014-01-10 2020-01-02 Pictometry International Corp. Unmanned aircraft structure evaluation system and method
US9292913B2 (en) 2014-01-31 2016-03-22 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
CA2938973A1 (en) 2014-02-08 2015-08-13 Pictometry International Corp. Method and system for displaying room interiors on a floor plan
AU2017221222B2 (en) 2016-02-15 2022-04-21 Pictometry International Corp. Automated system and methodology for feature extraction
EP3772702A3 (en) 2016-02-22 2021-05-19 Rapiscan Systems, Inc. Methods for processing radiographic images
US10671648B2 (en) 2016-02-22 2020-06-02 Eagle View Technologies, Inc. Integrated centralized property database systems and methods
US20220016773A1 (en) * 2018-11-27 2022-01-20 Sony Group Corporation Control apparatus, control method, and program

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2165714A (en) * 1984-09-07 1986-04-16 Messerschmitt Boelkow Blohm Electro-optical aiming device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1605171A (en) * 1975-12-05 1982-10-13 Secr Defence Instruments for aircraft
CA1116286A (en) * 1979-02-20 1982-01-12 Control Data Canada, Ltd. Perimeter surveillance system
DE2938853A1 (en) * 1979-09-26 1981-04-09 Vereinigte Flugtechnische Werke Gmbh, 2800 Bremen AREA NAVIGATION SYSTEM FOR AIRCRAFT
US4391514A (en) * 1981-02-13 1983-07-05 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Sidelooking laser altimeter for a flight simulator
US4591987A (en) * 1983-07-27 1986-05-27 Kollmorgen Technologies Corp. Video rangefinder
US4679077A (en) * 1984-11-10 1987-07-07 Matsushita Electric Works, Ltd. Visual Image sensor system
US4739401A (en) * 1985-01-25 1988-04-19 Hughes Aircraft Company Target acquisition system and method
JPS61281915A (en) * 1985-06-07 1986-12-12 Kokusai Kogyo Kk Vehicle device for measuring properties of road surface
JPS6278979A (en) * 1985-10-02 1987-04-11 Toshiba Corp Picture processor
GB2210487B (en) * 1987-09-11 1991-07-10 Gen Electric Co Plc Object recognition
US4872051A (en) * 1987-10-01 1989-10-03 Environmental Research Institute Of Michigan Collision avoidance alarm system
JPH0695008B2 (en) * 1987-12-11 1994-11-24 株式会社東芝 Monitoring device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2165714A (en) * 1984-09-07 1986-04-16 Messerschmitt Boelkow Blohm Electro-optical aiming device

Also Published As

Publication number Publication date
GB2243740A (en) 1991-11-06
GB8826550D0 (en) 1989-05-17
US5034812A (en) 1991-07-23
GB2243740B (en) 1993-06-23
GB2243741A (en) 1991-11-06
GB8925181D0 (en) 1990-04-25
GB9105682D0 (en) 1991-05-01
GB9103282D0 (en) 1991-04-03
FR2639127A1 (en) 1990-05-18
DE3937427A1 (en) 1990-05-17
GB2228642B (en) 1993-06-23
GB9105681D0 (en) 1991-05-01
GB9103281D0 (en) 1991-04-03
GB2243741B (en) 1993-06-23
FR2639127B1 (en) 1993-04-16

Similar Documents

Publication Publication Date Title
US5034812A (en) Image processing utilizing an object data store to determine information about a viewed object
US5822713A (en) Guided fire control system
US5379676A (en) Fire control system
US5086396A (en) Apparatus and method for an aircraft navigation system having improved mission management and survivability capabilities
US4439755A (en) Head-up infinity display and pilot's sight
US6836320B2 (en) Method and apparatus for active boresight correction
US6377401B1 (en) Head tracker system
US5483865A (en) Aircraft sighting system
US6072571A (en) Computer controlled optical tracking system
US4274609A (en) Target and missile angle tracking method and system for guiding missiles on to targets
JPH03213498A (en) Optoelectronics system to support air attach and air navigation assignment
JPH0124275B2 (en)
GB2143948A (en) Apparatus for determining the direction of a line of sight
JP2001518627A (en) Targeting system based on radio frequency interferometer and laser rangefinder / indicator
GB1578136A (en) Helmet-mounted sights
US6202535B1 (en) Device capable of determining the direction of a target in a defined frame of reference
US5726747A (en) Computer controlled optical tracking system
US5274236A (en) Method and apparatus for registering two images from different sensors
US6249589B1 (en) Device for passive friend-or-foe discrimination
CN113885312A (en) Photoelectric tracking system and method
RU2247921C2 (en) Method for finding one's bearings on the ground and device for its realization
US3772516A (en) Magnifier scanner tracker
Downey et al. Electro-optical tracking systems considerations
US5373318A (en) Apparent size passive range method
RU2242019C2 (en) Method for determination of co-ordinates of distant object on terrain and device for its realization

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
PE20 Patent expired after termination of 20 years

Expiry date: 20091107