GB2323231A - Imaging systems - Google Patents

Imaging systems

Info

Publication number
GB2323231A
GB2323231A GB9011075A GB2323231B
Authority
GB
United Kingdom
Prior art keywords
eye position
imaging system
screen
scene
high definition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB9011075A
Other versions
GB9011075D0 (en)
GB2323231B (en)
Inventor
Stephen Phillip Braim
Martin William Thomas
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
UK Secretary of State for Defence
Original Assignee
UK Secretary of State for Defence
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by UK Secretary of State for Defence filed Critical UK Secretary of State for Defence
Publication of GB9011075D0 publication Critical patent/GB9011075D0/en
Publication of GB2323231A publication Critical patent/GB2323231A/en
Application granted granted Critical
Publication of GB2323231B publication Critical patent/GB2323231B/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02 Viewing or reading apparatus
    • G02B27/022 Viewing apparatus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 WEAPONS
    • F41G WEAPON SIGHTS; AIMING
    • F41G3/00 Aiming or laying means
    • F41G3/22 Aiming or laying means for vehicle-borne armament, e.g. on aircraft
    • F41G3/225 Helmet sighting systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/20 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only
    • H04N23/21 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from infrared radiation only from near infrared [NIR] radiation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Health & Medical Sciences (AREA)
  • Toxicology (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Engineering & Computer Science (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

An imaging system is formed by a combination of a narrow angle image 12 and a wide angle image displayed together on a screen 11 (Figure 2). Separate detector arrays 8, 9 and optical systems 1, 2 are associated with each image. An eye position monitor 15 (Figure 3) detects the angular position of an operator's 16 eyes, and the resulting signal is used to steer 3 (Figure 4) the high resolution optical system in sympathy with the operator's 16 direction of gaze. Thus a high resolution image, which may be variable in size, follows the operator's inspection of the whole screen 11. Eye position is monitored (Figure 3) by pairs of light emitting diodes (23, 24) and detectors (25, 26). The eye position monitor is calibrated, and a computer generated look-up table is formed and used to control beam steering motors (5, 7) carrying mirrors. The arrays (8, 9) may be x-y matrix arrays of CdHgTe thermal detectors.

Description

IMAGING SYSTEMS

This invention relates to imaging systems in which a scene is detected and displayed on a screen for observation.
One type of imaging system detects thermal (infra-red) radiation using very sensitive CdHgTe detectors cooled to liquid nitrogen temperatures. Such an imager may employ a scanning system of rotating mirrors to sweep out a whole scene with a single detector or a relatively small number of detectors, to provide a visible display of typically 800 x 500 pixels. Alternatively, an array of detectors may be made to detect the thermal scene without any form of scanning; such a system is termed a staring array system. It has the advantage of no scanning mirrors etc., with their attendant costs in complexity and power consumption. However, it is difficult to make large scale arrays. Typically the largest CdHgTe array at present is a 128 x 128 element array. Thus a staring array cannot produce a high definition image unless the angle of view is extremely narrow; for a more reasonable angle of view the definition is very coarse.
According to this invention the above problems are overcome by employing two imaging systems, one a low definition wide angle system and the other a high definition narrow angle system, together with an eye position indicator whose output is used to direct the high definition system to an area of interest within the wide angle of view; in this way an observer sees only high definition images over the whole field of the wide angle system.
The invention relies on the facts that the resolution of a human eye is high at the centre of the fovea and much lower elsewhere, and that the high definition system is slaved to follow an operator's direction of gaze. Subjectively, an operator observes the whole scene as a high definition scene.
According to this invention an imaging system comprises a detector array sensitive to the wavelength of interest, an optical system for directing radiation from a scene onto the detector array, a display screen for observing an image of the scene, and means for displaying the detector array output as a visible display on the screen, characterised by: two optical systems, one a wide angle low definition system and the other a narrow angle high definition system; two detector arrays, one associated with each optical system; a beam steering system for pointing the narrow angle system over the field of the wide angle system; an eye position monitor for detecting the angular position of an operator's eyes; a feedback loop for controlling the beam steering system in synchronism with the operator's angular eye position; and means for combining the outputs of the two detector arrays on the screen, the arrangement being such that the whole scene appears to an observer as being of high definition.
The detector arrays may be thermal detectors, e.g. of CdHgTe material, and may be cooled with liquid nitrogen, expanded high pressure air, or a Peltier cooler, or may operate at ambient temperatures. Typical arrays may be 50 x 50 to 128 x 128 or more element arrays.
The narrow angle optical system may provide 4x to 12x (typically 6x) resolution relative to that of the wide angle system.
The invention will now be described, by way of example only, with reference to the accompanying drawings, of which:

Figure 1 is a schematic view showing a scene being detected and an operator observing a display screen;
Figure 2 shows a display screen with a high definition patch within a low definition background;
Figure 3 shows an eye position monitor;
Figure 4 shows a part of Figure 1 to an enlarged scale;
Figure 5 is a graph of human eye resolution;
Figure 6 is a block diagram of the electrical circuit;
Figure 7 is a block diagram of a part of Figure 6;
Figures 8(a), (b) are eye movement traces as measured and as corrected respectively;
Figures 9, 10, 11 are flow diagrams for a computer program.
The imaging system shown in Figure 1 is used to examine a scene S with a wide angle camera 1 having a focal length of 30mm and a high definition narrow angle camera 2 having a focal length of 180mm.
This narrow angle camera 2, see Figure 4, collects light over about 1/6 th the angle of the wide angle camera 1 and is pointed 3 in azimuth over the scene S by horizontal scan mirror 4 and motor 5, and pointed in elevation by a vertical scan mirror 6 and motor 7.
Motors 5, 7 may be torque motors, e.g. General Scanning Inc G300PD galvanometers. At the rear of the cameras 1, 2 are two similar detector arrays 8, 9 (one only shown in Figure 4), e.g. a 128 x 128 array of CdHgTe elements cooled by liquid nitrogen. The detectors used in Figure 1 are 50 x 50 uncooled arrays, such as the RETICON RA50z50 Si photodiode array, sensitive from the near U.V. to the near I.R.; these arrays incorporate on-chip shift registers and multiplexers providing serial output video data when supplied with external clock and timing pulses.
Output from the arrays 8, 9 is via system electronics 10, shown in more detail in Figure 6, to a display 11 such as a cathode ray tube (CRT) or television monitor. The output from the wide angle camera 1 completely fills the display with a coarse resolution image. The output from the narrow angle, high definition, camera 2 forms a patch 12 covering 1/6th of the height and width of the display 11; as explained later this patch 12 can be placed anywhere on the display 11. An observer 16 views this display 11 at a fixed distance set by a head rest 13 and chin rest 14. For a display 100 mm square an observer's eye is set at 260-270 mm from the display, giving an angle of view of about 23° for the whole display and about 4° for the high definition patch 12.
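The viewing geometry above follows from simple trigonometry. The sketch below (function names are ours, not the patent's) computes the full angle subtended by a flat display of a given width at a given eye distance:

```python
import math

def angle_of_view_deg(size_mm, distance_mm):
    """Full angle subtended by a flat display of the given width
    seen from the given eye distance."""
    return math.degrees(2 * math.atan(size_mm / (2 * distance_mm)))

# A 100 mm display at roughly 260 mm subtends a little over 20 degrees;
# the 1/6th-width high definition patch subtends roughly 4 degrees.
whole = angle_of_view_deg(100, 260)
patch = angle_of_view_deg(100 / 6, 260)
```

For small patches the angle is close to the small-angle approximation size/distance in radians.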
An eye position monitor 15 detects the positions of an observer's eyes and its output is fed, after processing by a computer 17, to control the scan mirrors 4, 6 and hence the position of the high definition patch 12 in the whole scene S.
Figure 2 shows the relative size of the high definition patch 12 on a background of lower definition formed on the display monitor 11.
For the particular implementation shown the magnification of the patch 12 is 6x, thus the patch occupies 1/6th of the height and width of the display 11, and therefore contains 6x more detail.
The patch 12 is formed by the individual outputs of the 50 x 50 element array 9 and is therefore a 50 x 50 picture point array not a 6 x 6 array as shown. The wide angle camera 1 provides a 50 x 50 array of picture points covering the whole display. The size of each picture point in the background is six times that of each picture point in the high definition patch 12. The size of the CRT electron beam spot is much smaller than needed to represent each picture point in the background display. Thus the electron beam is scanned in a 4 x 4 matrix to provide the required size of background picture point representing the output from a single detector element. A larger number of matrix points could be used, e.g. 6 x 6, within the same area, but the 4 x 4 matrix was found to give an adequately uniform background picture point.
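The enlargement of each background picture point can be sketched as a nearest-neighbour replication, a simplified stand-in for the electron-beam matrix fill described above (the function name and list-of-lists representation are ours):

```python
def enlarge(points, factor):
    """Replicate each picture point over a factor x factor block,
    mimicking the beam-matrix fill used for background pixels."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in points for _ in range(factor)]
```

With `factor=6` a 50 x 50 wide-angle frame would fill the same display area as a 300 x 300 grid of high-definition picture points.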
For normal viewing the display is presented as a small area of high definition on a low definition background.
However, the observer's eye can only resolve as indicated by Figure 5, which shows relative visual acuity against angle in degrees from the fovea centre. Resolution falls rapidly either side of the centre of the fovea. The high resolution patch display covers an angle of about 4°, which is seen to include the most useful range of foveal detection. The background is at a 6x lower definition, which goes relatively unnoticed by the observer.
Thus an observer, placed in the correct position relative to the display, will see only a high definition display, provided the observer looks directly at the centre of the patch.
To maintain the subjective appearance of a whole display of high resolution the patch 12 is moved over the display 11 in sympathy with an observer's eye movements.
The eye position monitor 15, Figure 3, is mounted 250 mm away from the display 11, i.e. 10-20 mm from the observer's eye. This position monitor 15 comprises two sets of sensors 21, 22, one for each eye. Each sensor 21, 22 comprises a pair of light emitting diodes (LEDs) 23, 24 and a pair of photodiodes 25, 26 forming detectors. For the right eye the LEDs 23, 24 and diodes 25, 26 are arranged either side of a vertical axis to monitor horizontal eye movement by detecting the difference in reflectance between the cornea and sclera as the eye moves in a left-right direction. The LEDs 23, 24 emit near I.R. light, e.g. about 930 nm, which is unnoticed by the observer. For the left eye the LEDs 23, 24 and diodes 25, 26 are arranged in pairs, one above the other, to detect vertical eye movement by detecting reflectance changes between the eye and eyelid. Any movement of the eye produces a change in the detected reflected radiation and hence a change in the photodiode 25, 26 output. These outputs are used to control the motors 5, 7, giving a movement of the scanning mirrors 4, 6 corresponding to the eye movement. Using an Applied Research Development eye position monitor, catalogue ref EM130, a ±0.5 volt analogue output is obtainable for an eye movement of ±10°. The net result is that the high definition patch 12 always appears on the display at the position the observer 16 is looking at. The effect is a complete display of high definition, because the observer can only appreciate high definition over a small angle.
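The differential-reflectance principle and the quoted analogue characteristic can be sketched as follows; the normalisation and function names are illustrative assumptions, not taken from the patent:

```python
# Illustrative constant from the EM130 figure quoted above:
# +/-0.5 V of output corresponds to +/-10 degrees of eye movement.
VOLTS_PER_DEGREE = 0.5 / 10.0

def differential_signal(d_near, d_far):
    """Normalised difference of the two photodiode outputs of one
    sensor pair; the sign gives the direction of eye movement."""
    total = d_near + d_far
    return 0.0 if total == 0 else (d_near - d_far) / total

def eye_angle_degrees(volts):
    """Convert the monitor's analogue output to an eye angle,
    assuming a linear characteristic over the working range."""
    return volts / VOLTS_PER_DEGREE
```

Normalising by the sum makes the direction signal less sensitive to overall illumination changes, at the cost of assuming both diodes drift together.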
The ability to beamsteer the narrow angle camera 2, Figure 4, by the scan mirrors 4, 6 allows the camera 2 to detect any required part of the scene S. Each motor 5, 7 receives a signal from the eye position monitor 15, so the camera 2 scans as the observer's eye scans the display 11. In practice calibration and some processing of the eye position monitor 15 output is necessary to obtain smooth scanning of the camera 2.
Although each of the two sets of sensors 21, 22 is arranged to detect eye movement in one specific plane there is some cross talk between the sets. The vertical movement sensor 21 is affected by horizontal eye movement; the horizontal movement sensor 22 is even more affected by vertical eye movement. The effect of this is shown in Figure 8 which plots the outputs from the sensors 21, 22 whilst an observer looks along the perimeter of the display (four sides), then along imaginary cross wires at the display centre.
Clearly some processing of this raw data is beneficial. For calibrating the sensors, four visible light LEDs 27, 28, 29, 30 are mounted at the corners of the display 11 and switched on in sequence. An observer 16 stares at each LED whilst it is illuminated, then moves to stare at the next LED illuminated.
Several readings are taken for each corner eye position and averaged by the computer 17. Three circuits of the display 11 are made whilst sensor readings are taken. From the measured readings a look-up table is constructed by the computer 17. Thus the computer 17 outputs two eight bit words representing corrected or processed sensor 21, 22 data for use by the vertical and horizontal scan motors 5, 7. The result of computer processing the sensor diode 25, 26 outputs of Figure 8(a) is shown in Figure 8(b); note that an eye blink present in Figure 8(a) at the bottom left quadrant is removed in Figure 8(b). The corrected eye movement data, e.g. Figure 8(b), is used by the scanning motors 5, 7. Figure 11 shows a flow chart for the sensor calibration routine within the computer 17.
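The corner-LED calibration can be sketched as averaging the repeated raw readings per corner and then mapping raw sensor values to normalised screen coordinates. The patent builds a full correction mesh; this per-axis linear version (names and data layout are ours) only illustrates the idea:

```python
def average_corners(readings):
    """Average the repeated raw (x, y) sensor readings taken while the
    observer stares at each corner LED."""
    return {corner: (sum(x for x, _ in rs) / len(rs),
                     sum(y for _, y in rs) / len(rs))
            for corner, rs in readings.items()}

def make_correction(corners):
    """Build a per-axis linear map from averaged corner readings to
    normalised screen coordinates (0..1).  A real implementation would
    use a mesh look-up table to handle the cross-talk skew."""
    x_lo = (corners['TL'][0] + corners['BL'][0]) / 2
    x_hi = (corners['TR'][0] + corners['BR'][0]) / 2
    y_lo = (corners['TL'][1] + corners['TR'][1]) / 2
    y_hi = (corners['BL'][1] + corners['BR'][1]) / 2
    def correct(x, y):
        return ((x - x_lo) / (x_hi - x_lo), (y - y_lo) / (y_hi - y_lo))
    return correct
```

A mesh table would interpolate between many such locally fitted cells rather than assume one global linear fit.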
This calibration routine is performed within the computer routine shown in Figures 9, 10. The following functions are performed:
1. Collect eye position data at high speed;
2. Perform eye calibration;
3. Calculate deskewing matrix;
4. Correct incoming eye data;
5. Determine whether eye gaze position has changed;
6. Trap eye blinks;
7. Provide co-ordinates for scan mirrors 4, 6.
As shown in the flow chart of Figure 9, after start-up the parameters (ranges of eye movements, delays to damp out involuntary eye movements, etc.) are input either manually or from a file.
Normally the computer 17 carries on and performs the calibration routine. However, if an operator 16 has switched off the automatic movement sensors for a while, as explained below, or has temporarily left the display, previously determined calibration data can be read directly from a stored file. Knowing the correct positions of the display corners, and hence the positions of the four corner LEDs 27, 28, 29, 30, the calibration data can be processed: a correction mesh or grid matrix is calculated, and a correction mesh or grid look-up table formed. A dummy calibration and data retrieval loop is included for use with a joystick manual control, explained later; this is used when the eye monitor 15 is temporarily disabled.
The calibration data, Figure 9, is received as data input, Figure 10. Output from the eye sensors 21, 22 is fed as input x, y eye data, Figure 10. The computer 17 checks that the values are within limits and represent apparently deliberate eye movements, waits for them to stabilise at new values, then forms new x, y positions. These new eye positions are corrected using the look-up table, and the corrected values are fed to the vertical and horizontal scan motors 5, 7, causing scanning of the camera 2 in sympathy with the operator's eye movements.
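The within-limits check, blink trapping and settling described above can be sketched as a small per-sample filter. All thresholds here are illustrative assumptions, not values from the patent:

```python
def process_sample(history, sample, limit=10.0, jump=5.0, settle=3):
    """Accept a new raw (x, y) eye sample only if it is within limits
    and has stabilised (`settle` consecutive nearby samples).
    Returns the new gaze position, or None while unsettled or blinking."""
    x, y = sample
    if abs(x) > limit or abs(y) > limit:        # out of range, e.g. a blink
        history.clear()
        return None
    if history and (abs(x - history[-1][0]) > jump
                    or abs(y - history[-1][1]) > jump):
        history.clear()                          # sudden jump: restart settling
    history.append(sample)
    if len(history) >= settle:
        recent = history[-settle:]
        return (sum(p[0] for p in recent) / settle,
                sum(p[1] for p in recent) / settle)
    return None
```

Requiring several consistent samples before moving the mirrors damps out involuntary micro-movements at the cost of a short response delay.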
This continues as required. An additional feature of the programming is the storing and recording of the data, both eye movements and camera 1, 2 outputs. This allows re-runs of previous experiments by taking data from file instead of inputs from the eye monitor 15.
Overall electrical control of the above is explained by Figures 6, 7. The computer 17 initiates a synchronisation signal to a timing/blanking circuit 31 which provides clocking pulses cp to clock/driver circuits 32, 33 at the detector arrays 8, 9. These clocking pulses cause the arrays to read out the information on each of the 50 x 50 detector elements through amplifiers 34, 35 to a 2:1 multiplexer 36. Each amplifier 34, 35 has an integral automatic gain control which monitors the signal level over each frame period and adjusts the gain to maintain the average signal at a desired level. This assists in maintaining a more uniform output signal when ambient light conditions change. Additionally the output of the wide angle array 8 is taken to a video digitiser 37 and then on to the computer 17. From the multiplexer 36 the signals pass into a video input 38 to the display 11. The timing circuit 31 supplies blanking signals to the display 11 and timing signals to a timebase generator 39 which generates x and y sweep signals for the display 11. The eye position monitors 15 have their analogue output amplified, then converted in an A/D converter 42 to digital signals for use by the computer 17. The raw eye position information is corrected in the computer 17 and passed through a selector 43 before being converted to analogue signals in a D/A converter 44 for use by the scan mirror motors 5, 7. Output from the selector 43 is also fed to the timing/blanking circuit 31 for positioning the display patch 12 where required. An analogue joystick control 45 is fed into the selector 43 for manual control of the high resolution camera 2 and display patch 12 position.
In operation the corrected eye position is fed from the computer 17 to the scan mirrors 4, 6 and to the timing circuit 31. This both points the high definition camera 2 and forms a blanking patch on the display 11 at the position required to receive the high definition patch 12. The two arrays 8, 9 are read alternately through the multiplexer 36 to the display 11. The low resolution array 8 is read onto the display 11 to fill the screen except at the blanked patch. The high resolution array 9 is then read onto the display 11 at the patch 12 position.
The frame rate is sufficiently high that a flicker-free display is seen by the observer 16. Each detector element in the low resolution array 8 receives light from 6x the area detected by each element in the other array 9. Thus the information from each low resolution detector element is enlarged by 6x in a vertical raster scan sequence on the display 11.
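The alternate read-out into a single frame amounts to a simple composite: fill the frame with the enlarged low-definition image, then write the high-definition patch over the blanked area. This sketch (names and the list-of-lists frame representation are ours) shows the idea:

```python
def composite(background, patch, top, left):
    """Overlay the high-definition patch on the (already enlarged)
    low-definition background at the gaze position, as the multiplexed
    read-out does frame by frame."""
    out = [row[:] for row in background]
    for dy, patch_row in enumerate(patch):
        for dx, value in enumerate(patch_row):
            out[top + dy][left + dx] = value
    return out
```

In the hardware the "overlay" is achieved by blanking the background at the patch position rather than by copying frames in memory.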
Additional features are also provided so the automatic eye position control can be temporarily disabled. A manual input by the joystick 45 can be made to the computer 17 to move the relative position of the high resolution detection and consequent patch 12 position on the display 11. This allows detailed checking of any area by two or more operators at the display 11. Also the digitised video signal can be searched by the computer 17 to locate positions of brightest, and hence (for an IR sensor) hottest, signal. The high resolution camera 2 can then be directed to examine such hot spots automatically. Another useful feature is movement detection within the scene S. A frame of digitised scene can be compared with an earlier frame to indicate areas where movement has occurred.
The high resolution camera 2 can then be directed to these areas.
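Both the hot-spot search and the movement detection described above reduce to simple scans over the digitised frames; a minimal sketch, with frames as lists of rows (names are ours):

```python
def hottest_point(frame):
    """Row, column of the brightest pixel; for an IR sensor, the hottest."""
    _, r, c = max((v, r, c) for r, row in enumerate(frame)
                  for c, v in enumerate(row))
    return r, c

def movement_areas(prev, curr, threshold):
    """Pixel positions whose change between two digitised frames
    exceeds a threshold, indicating movement in the scene."""
    return [(r, c) for r in range(len(curr)) for c in range(len(curr[r]))
            if abs(curr[r][c] - prev[r][c]) > threshold]
```

Either result gives the co-ordinates to which the high resolution camera would be steered.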
A further feature is pattern recognition where a frame of digitised information is compared with stored images of known features such as cars or tanks. When the computer 17 finds a correlation of a known feature with that from the low resolution camera 1, the high resolution camera 2 can be directed onto this feature for closer study.
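The correlation of a stored template against the low resolution frame can be sketched as a sliding-window search; this uses an unnormalised score, whereas a practical system would normalise and threshold it (names are ours):

```python
def correlate_at(frame, template, top, left):
    """Unnormalised correlation: sum of products of the template
    and the frame region under it."""
    return sum(template[r][c] * frame[top + r][left + c]
               for r in range(len(template)) for c in range(len(template[0])))

def best_match(frame, template):
    """Slide the template over the frame and return the offset with
    the highest correlation score, i.e. where the high resolution
    camera would be directed for closer study."""
    th, tw = len(template), len(template[0])
    scored = [(correlate_at(frame, template, r, c), r, c)
              for r in range(len(frame) - th + 1)
              for c in range(len(frame[0]) - tw + 1)]
    _, r, c = max(scored)
    return r, c
```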
Resolution may be improved by introducing the technique known as microscan or dither. In this the display is displaced by a fraction of a pixel. For example four successive displacements may be half a pixel to the left, down, right, and up to the start position.
This is repeated. The resulting four frames of information are stored in a single video memory, of four times the capacity of a non-scanned system, and then read out to provide a single frame of display information at four times the resolution. The dither scanning may be achieved by a rotating swash plate reflector in front of the wide angle camera 1. Alternatively a mirror may be mounted on piezo electric transducers, energised in sequence to microscan the scene.
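The reconstruction step of the microscan can be sketched as interleaving the four half-pixel-shifted frames into one frame of doubled resolution per axis; the offset convention below is an illustrative assumption:

```python
def microscan(frames_with_offsets, n):
    """Interleave four half-pixel-shifted n x n frames into one
    2n x 2n frame.  Offsets are in half-pixel units, e.g.
    (0, 0), (1, 0), (0, 1), (1, 1)."""
    out = [[0] * (2 * n) for _ in range(2 * n)]
    for (dx, dy), frame in frames_with_offsets:
        for r in range(n):
            for c in range(n):
                out[2 * r + dy][2 * c + dx] = frame[r][c]
    return out
```

This is why the video memory needs four times the capacity of the unscanned system: all four sub-frames are held before read-out.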
In a modification the high definition camera may have zoom optics to vary the amount of magnification of the patch. This requires adjustment to the patch size by the blanking circuit 31.

Claims (8)

Claims:
1. An imaging system comprising a detector array sensitive to the wavelength of interest, an optical system for directing radiation from a scene onto the detector array, a display screen for observing an image of the scene, means for displaying the detector array as a visible display on the screen, characterised by: two optical systems, one a wide angle low definition system, and the other a narrow angle high definition system, two detector arrays one associated with each optical system, a beam steering system for pointing the narrow angle system over the field of the wide angle system, an eye position monitor for detecting the angular position of an operator's eyes, a feedback loop for controlling the scanning system in synchronism with the operator's angular eye position, and means for combining the outputs of the two detector arrays on the screen, the arrangement being such that the whole scene appears to an observer as being of high definition.
2. The imaging system of claim 1 wherein the detectors are thermal detectors.
3. The imaging system of claim 1 wherein the feedback loop includes means for taking the output of the eye position monitor, processing such output to produce a corrected eye position signal, and using this signal to control the beam steering system.
4. The imaging system of claim 1 wherein the means for combining the outputs of the two detector arrays comprises means for displaying a low definition scene on the whole of the screen except at a blanked out area of the screen, and displaying a high definition scene within the blanked out area and further including means for moving the blanked out area of the screen in sympathy with an operator's angular eye position.
5. The imaging system of claim 1 wherein the resolution of the narrow angle optical system relative to that of the wide angle optical system is between 4 and 12 times.
6. The imaging system of claim 1 wherein the beam steering system includes mirrors and electric motors.
7. The imaging system of claim 1 wherein the eye position monitor comprises a pair of light emitters and a pair of light detectors.
8. The imaging system as claimed in claim 1 constructed, arranged and adapted to operate substantially as hereinbefore described with reference to the accompanying drawings.
Amendments to the claims have been filed as follows:

1. An imaging system comprising a detector array sensitive to the wavelength of interest, an optical system for directing radiation from a scene onto the detector array, a display screen for observing an image of the scene, means for displaying the detector array as a visible display on the screen, characterised by: two optical systems, one a wide angle low definition system, and the other a narrow angle high definition system, two detector arrays one associated with each optical system, a beam steering system for pointing the narrow angle system over the field of the wide angle system, an eye position monitor for detecting the angular position of an operator's eyes, a feedback loop for controlling the beam steering system in synchronism with the operator's angular eye position, and means for combining the outputs of the two detector arrays on the screen, the arrangement being such that the whole scene appears to an operator as being of high definition.
2. The imaging system of claim 1 wherein the detectors are thermal detectors.
3. The imaging system of claim 1 wherein the feedback loop includes means for taking the output of the eye position monitor, processing such output to produce a corrected eye position signal, and using this signal to control the beam steering system.
4. The imaging system of claim 1 wherein the means for combining the outputs of the two detector arrays comprises means for displaying a low definition scene on the whole of the screen except at a blanked out area of the screen, and displaying a high definition scene within the blanked out area and further including means for moving the blanked out area of the screen in sympathy with an operator's angular eye position.
5. The imaging system of claim 1 wherein the resolution of the narrow angle optical system relative to that of the wide angle optical system is between 4 and 12 times.
6. The imaging system of claim 1 wherein the beam steering system includes mirrors and electric motors.
7. The imaging system of claim 1 wherein the eye position monitor comprises a pair of light emitters and a pair of light detectors.
GB9011075A 1989-05-18 1990-05-17 Imaging systems Expired - Fee Related GB2323231B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GBGB8911389.8A GB8911389D0 (en) 1989-05-18 1989-05-18 Imaging systems

Publications (3)

Publication Number Publication Date
GB9011075D0 GB9011075D0 (en) 1998-05-20
GB2323231A true GB2323231A (en) 1998-09-16
GB2323231B GB2323231B (en) 1998-12-23

Family

ID=10656937

Family Applications (2)

Application Number Title Priority Date Filing Date
GBGB8911389.8A Pending GB8911389D0 (en) 1989-05-18 1989-05-18 Imaging systems
GB9011075A Expired - Fee Related GB2323231B (en) 1989-05-18 1990-05-17 Imaging systems

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GBGB8911389.8A Pending GB8911389D0 (en) 1989-05-18 1989-05-18 Imaging systems

Country Status (1)

Country Link
GB (2) GB8911389D0 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0996283A1 (en) * 1998-10-23 2000-04-26 Alcatel Remote control device for camera
WO2004041078A2 (en) * 2002-11-08 2004-05-21 Ludwig-Maximilians-Universität Housing device for head-worn image recording and method for control of the housing device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4421486A (en) * 1982-03-29 1983-12-20 The United States Of America As Represented By The Secretary Of The Navy Field of view test apparatus
US4479784A (en) * 1981-03-03 1984-10-30 The Singer Company Eye line-of-sight responsive wide angle visual system
GB2143397A (en) * 1983-06-30 1985-02-06 Davin Optical Limited Low light viewing apparatus
US4513317A (en) * 1982-09-28 1985-04-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Retinally stabilized differential resolution television display


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0996283A1 (en) * 1998-10-23 2000-04-26 Alcatel Remote control device for camera
WO2004041078A2 (en) * 2002-11-08 2004-05-21 Ludwig-Maximilians-Universität Housing device for head-worn image recording and method for control of the housing device
WO2004041078A3 (en) * 2002-11-08 2004-07-01 Univ Muenchen L Maximilians Housing device for head-worn image recording and method for control of the housing device

Also Published As

Publication number Publication date
GB9011075D0 (en) 1998-05-20
GB8911389D0 (en) 1998-05-20
GB2323231B (en) 1998-12-23

Similar Documents

Publication Publication Date Title
US5192864A (en) Two dimensional display produced by one dimensional scanned emitters
EP0601985B1 (en) Arrangement for displaying a three-dimensional image
US5317394A (en) Distributed aperture imaging and tracking system
US7864432B2 (en) Fusion night vision system
EP1668894B1 (en) Infra-red (ir) sensor with controllable sensitivity
US20070228259A1 (en) System and method for fusing an image
US5117446A (en) X-ray diagnostic apparatus comprising means for the enlarged visual display of a selectable detail of the overall image
US4118733A (en) Surveillance arrangement including a television system and infrared detector means
JPS63107294A (en) Three-dimensional visual projector
EP0775414B1 (en) Apparatus and method for converting an optical image of an object into a digital representation
US7583293B2 (en) Apparatus and method for generating multi-image scenes with a camera
US6471355B1 (en) Image control system
US5107117A (en) Optoelectronic viewing system
ES2239809T3 (en) IMAGES IN THREE DIMENSIONS WITH LINEAR SWEEP.
US4245240A (en) Color camera having linear scanning arrays and vertical scanning mirror
US4475039A (en) Infrared viewing apparatus
US4040087A (en) Electronic motion compensation for the pyroelectric vidicon
GB2323231A (en) Imaging systems
US5391873A (en) Imaging apparatus compensating for nonuniformity in detector outputs
JP4574758B2 (en) Microscope image observation device
US5444235A (en) Scanning light valve sensor system employing fiber optics
US4109149A (en) Shade reducing aperture stop for thermal imaging systems
CA2140681C (en) Wide area coverage infrared search system
JP3219071B2 (en) Infrared laser imaging device
JP2001242070A (en) Photographing device

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20040517